Science.gov

Sample records for electronic design automation

  1. Electronic Design Automation: Integrating the Design and Manufacturing Functions

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic; Salkowski, Charles

    1997-01-01

    As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.

  2. Electronic Design Automation (EDA) Roadmap Taskforce Report, Design of Microprocessors

    NASA Astrophysics Data System (ADS)

    1999-04-01

    The goal of this project was to support the establishment of tool interoperability standards for the semiconductor industry. Accomplishments include the publication of the 'EDA Industry Standards Roadmap - 1996' and the 'EDA Roadmap Taskforce Report - Design of Microprocessors.'

  3. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris

    2000-01-01

    Parallelized versions of genetic algorithms (GAs) are popular primarily for three reasons: the GA is an inherently parallel algorithm, typical GA applications are very compute intensive, and powerful computing platforms, especially Beowulf-style computing clusters, are becoming more affordable and easier to implement. In addition, the low communication bandwidth required allows the use of inexpensive networking hardware such as standard office ethernet. In this paper we describe a parallel GA and its use in automated high-level circuit design. Genetic algorithms are a type of trial-and-error search technique guided by principles of Darwinian evolution. Just as the genetic material of two living organisms can intermix to produce offspring that are better adapted to their environment, GAs expose genetic material, frequently strings of 1s and 0s, to the forces of artificial evolution: selection, mutation, recombination, etc. GAs start with a pool of randomly generated candidate solutions which are then tested and scored with respect to their utility. Solutions are then bred by probabilistically selecting high-quality parents and recombining their genetic representations to produce offspring solutions. Offspring are typically subjected to a small amount of random mutation. After a pool of offspring is produced, this process iterates until a satisfactory solution is found or an iteration limit is reached. Genetic algorithms have been applied to a wide variety of problems in many fields, including chemistry, biology, and many engineering disciplines. There are many styles of parallelism used in implementing parallel GAs. One such method is called the master-slave or processor farm approach. In this technique, slave nodes are used solely to compute fitness evaluations (the most time-consuming part). The master processor collects fitness scores from the nodes and performs the genetic operators (selection, reproduction, variation, etc.). Because of dependency
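
    The master-slave (processor farm) scheme described in this abstract can be illustrated with a short, generic program. The sketch below is a toy under stated assumptions: the bit-string genome, placeholder fitness function, and operator choices are illustrative and are not the authors' circuit representation; only the structure (slaves evaluate fitness, the master applies selection, recombination, and mutation) follows the description above.

        import random
        from multiprocessing import Pool

        GENOME_LEN, POP_SIZE, GENERATIONS = 32, 40, 25

        def fitness(genome):
            # Placeholder objective (count of 1-bits); in a real system each
            # slave node would run a time-consuming circuit simulation here.
            return sum(genome)

        def random_genome():
            return [random.randint(0, 1) for _ in range(GENOME_LEN)]

        def mutate(genome, rate=0.02):
            return [1 - g if random.random() < rate else g for g in genome]

        def crossover(a, b):
            cut = random.randrange(1, GENOME_LEN)
            return a[:cut] + b[cut:]

        def select(population, scores):
            # Tournament selection: keep the better of two random individuals.
            i, j = random.randrange(POP_SIZE), random.randrange(POP_SIZE)
            return population[i] if scores[i] >= scores[j] else population[j]

        if __name__ == "__main__":
            population = [random_genome() for _ in range(POP_SIZE)]
            with Pool() as farm:                              # the slave processes
                for _ in range(GENERATIONS):
                    scores = farm.map(fitness, population)    # farmed-out evaluations
                    # The master performs selection, recombination, and mutation.
                    population = [mutate(crossover(select(population, scores),
                                                   select(population, scores)))
                                  for _ in range(POP_SIZE)]
                print("best fitness:", max(farm.map(fitness, population)))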

  4. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris; Norvig, Peter (Technical Monitor)

    2000-01-01

    We describe a parallel genetic algorithm (GA) that automatically generates circuit designs using evolutionary search. A circuit-construction programming language is introduced and we show how evolution can generate practical analog circuit designs. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. We present experimental results as applied to analog filter and amplifier design tasks.

  5. Design automation for integrated circuits

    NASA Astrophysics Data System (ADS)

    Newell, S. B.; de Geus, A. J.; Rohrer, R. A.

    1983-04-01

    Consideration is given to the development status of the use of computers in automated integrated circuit design methods, which promise the minimization of both design time and design error incidence. Integrated circuit design encompasses two major tasks: logic specification, in which the goal is a logic diagram that accurately represents the desired electronic function, and physical specification, in which the goal is an exact description of the physical locations of all circuit elements and their interconnections on the chip. Design automation not only saves money by reducing design and fabrication time, but also helps the community of systems and logic designers to work more innovatively. Attention is given to established design automation methodologies, programmable logic arrays, and design shortcuts.

  6. Design automation for integrated optics

    NASA Astrophysics Data System (ADS)

    Condrat, Christopher

    Recent breakthroughs in silicon photonics technology are enabling the integration of optical devices into silicon-based semiconductor processes. Photonics technology enables high-speed, high-bandwidth, and high-fidelity communications on the chip-scale---an important development in an increasingly communications-oriented semiconductor world. Significant developments in silicon photonic manufacturing and integration are also enabling investigations into applications beyond that of traditional telecom: sensing, filtering, signal processing, quantum technology---and even optical computing. In effect, we are now seeing a convergence of communications and computation, where the traditional roles of optics and microelectronics are becoming blurred. As the applications for opto-electronic integrated circuits (OEICs) are developed, and manufacturing capabilities expand, design support is necessary to fully exploit the potential of this optics technology. Such design support for moving beyond custom-design to automated synthesis and optimization is not well developed. Scalability requires abstractions, which in turn enables and requires the use of optimization algorithms and design methodology flows. Design automation represents an opportunity to take OEIC design to a larger scale, facilitating design-space exploration, and laying the foundation for current and future optical applications---thus fully realizing the potential of this technology. This dissertation proposes design automation for integrated optic system design. Using a building-block model for optical devices, we provide an EDA-inspired design flow and methodologies for optical design automation. Underlying these flows and methodologies are new supporting techniques in behavioral and physical synthesis, as well as device-resynthesis techniques for thermal-aware system integration. We also provide modeling for optical devices and determine optimization and constraint parameters that guide the automation

  7. Genetic circuit design automation.

    PubMed

    Nielsen, Alec A K; Der, Bryan S; Shin, Jonghyeon; Vaidyanathan, Prashant; Paralanov, Vanya; Strychalski, Elizabeth A; Ross, David; Densmore, Douglas; Voigt, Christopher A

    2016-04-01

    Computation can be performed in living cells by DNA-encoded circuits that process sensory information and control biological functions. Their construction is time-intensive, requiring manual part assembly and balancing of regulator expression. We describe a design environment, Cello, in which a user writes Verilog code that is automatically transformed into a DNA sequence. Algorithms build a circuit diagram, assign and connect gates, and simulate performance. Reliable circuit design requires the insulation of gates from genetic context, so that they function identically when used in different circuits. We used Cello to design 60 circuits for Escherichia coli (880,000 base pairs of DNA), for which each DNA sequence was built as predicted by the software with no additional tuning. Of these, 45 circuits performed correctly in every output state (up to 10 regulators and 55 parts), and across all circuits 92% of the output states functioned as predicted. Design automation simplifies the incorporation of genetic circuits into biotechnology projects that require decision-making, control, sensing, or spatial organization. PMID:27034378
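
    The flow described above (a logic specification in, gate assignments and simulated performance out) can be caricatured with a toy example. The sketch below is not Cello or its gate library: it models a single hypothetical NOR repressor gate as a Hill function with made-up parameters and checks the analog output against the digital truth table, in the spirit of the fraction-of-correct-output-states figure reported above.

        # Hypothetical repressor response: output promoter activity as a Hill
        # function of total input activity (illustrative parameters only).
        def repressor_gate(x, ymax=5.0, ymin=0.05, k=1.0, n=2.0):
            return ymin + (ymax - ymin) * k**n / (k**n + x**n)

        LOW, HIGH = 0.05, 5.0          # assumed input promoter activities

        def simulate_nor(a, b):
            # A single repressor gate driven by the sum of two inputs acts as a NOR.
            return repressor_gate(a + b)

        def score_circuit(threshold=1.0):
            # Fraction of truth-table rows where the analog output lands on the
            # correct side of a digital threshold.
            correct = 0
            for a_bit in (0, 1):
                for b_bit in (0, 1):
                    out = simulate_nor(HIGH if a_bit else LOW, HIGH if b_bit else LOW)
                    expected = not (a_bit or b_bit)
                    correct += (out > threshold) == expected
            return correct / 4

        print("output states correct:", score_circuit())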

  8. Automated design of aerospace structures

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Mccomb, H. G.

    1974-01-01

    The current state-of-the-art in structural analysis of aerospace vehicles is characterized, automated design technology is discussed, and an indication is given of the future direction of research in analysis and automated design. Representative computer programs for analysis typical of those in routine use in vehicle design activities are described, and results are shown for some selected analysis problems. Recent and planned advances in analysis capability are indicated. Techniques used to automate the more routine aspects of structural design are discussed, and some recently developed automated design computer programs are described. Finally, discussion is presented of early accomplishments in interdisciplinary automated design systems, and some indication of the future thrust of research in this field is given.

  9. Automated Hardware Design via Evolutionary Search

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Colombano, Silvano P.

    2000-01-01

    The goal of this research is to investigate the application of evolutionary search to the process of automated engineering design. Evolutionary search techniques involve the simulation of Darwinian mechanisms by computer algorithms. In recent years, such techniques have attracted much attention because they are able to tackle a wide variety of difficult problems and frequently produce acceptable solutions. The results obtained are usually functional, often surprising, and typically "messy" because the algorithms are told to concentrate on the overriding objective and not elegance or simplicity. Automated design techniques offer several advantages. First, faster design cycles translate into time and, hence, cost savings. Second, automated design techniques can be made to scale well and hence better deal with increasing amounts of design complexity. Third, design quality can increase because design properties can be specified a priori. For example, size and weight specifications of a device, smaller and lighter than the best known design, might be optimized by the automated design technique. The domain of electronic circuit design is an advantageous platform in which to study automated design techniques because it is a rich design space that is well understood, permitting human-created designs to be compared to machine-generated designs. The goal of the system developed for circuit design was to automatically produce high-level integrated electronic circuit designs whose properties permit physical implementation in silicon. This process entailed designing an effective evolutionary algorithm and solving a difficult multiobjective optimization problem. FY 99 saw many accomplishments in this effort.

  10. Airborne electronics for automated flight systems

    NASA Technical Reports Server (NTRS)

    Graves, G. B., Jr.

    1975-01-01

    The increasing importance of airborne electronics for use in automated flight systems is briefly reviewed with attention to both basic aircraft control functions and flight management systems for operational use. The requirements for high levels of systems reliability are recognized. Design techniques are discussed and the areas of control systems, computing and communications are considered in terms of key technical problems and trends for their solution.

  11. Intelligent design system for design automation

    NASA Astrophysics Data System (ADS)

    Shakeri, Cirrus; Deif, Ismail; Katragadda, Prasanna; Knutson, Stanley

    2000-10-01

    In order to succeed in today's global, competitive market, companies need continuous improvements in their product development processes. These improvements should result in expending fewer resources on the design process while achieving better quality. Automating the design process reduces resources needed and allows designers to spend more time on creative aspects that improve the quality of design. For the last three decades, engineers and designers have been searching for better ways to automate the product development process. For certain classes of design problems, which cover a large portion of real world design situations, the process can be automated using knowledge-based systems. These are design problems in which the knowledge sources are known in advance. Using techniques from Knowledge-Based Engineering, knowledge is codified and inserted into a knowledge-based system. The system activates the design knowledge, automatically generating designs that satisfy the design constraints. To increase the return on investment of building automated design systems, knowledge management methodologies and techniques are required for capturing, formalizing, storing, and searching design knowledge.

  12. Automated cleaning of electronic components

    SciTech Connect

    Drotning, W.

    1994-03-01

    Environmental and operator safety concerns are leading to the elimination of trichloroethylene (TCE) and chlorofluorocarbon (CFC) solvents in electronic component cleaning processes that remove rosin flux, organic and inorganic contamination, and particulates. Present processes depend heavily on these solvents for manual spray cleaning of small components and subassemblies. Use of alternative solvent systems can lead to longer processing times and reduced quality. Automated spray cleaning can improve the quality of the cleaning process, thus enabling the productive use of environmentally conscious materials, while minimizing personnel exposure to hazardous materials. In addition, the use of robotic and automated systems can reduce the manual handling of parts that necessitates additional cleaning. We describe the development of a prototype robotic system for cleaning electronic components in a spray cleaning workcell. An important feature of the prototype system is the capability to generate the robot paths and motions automatically from the CAD models of the part to be cleaned, and to embed cleaning process knowledge into the automatically programmed operations.

  13. Automated cleaning of electronic components

    SciTech Connect

    Drotning, W.; Meirans, L.; Wapman, W.; Hwang, Y.; Koenig, L.; Petterson, B.

    1994-07-01

    Environmental and operator safety concerns are leading to the elimination of trichloroethylene and chlorofluorocarbon solvents in cleaning processes that remove rosin flux, organic and inorganic contamination, and particulates from electronic components. Present processes depend heavily on these solvents for manual spray cleaning of small components and subassemblies. Use of alternative solvent systems can lead to longer processing times and reduced quality. Automated spray cleaning can improve the quality of the cleaning process, thus enabling the productive use of environmentally conscious materials, while minimizing personnel exposure to hazardous materials. We describe the development of a prototype robotic system for cleaning electronic components in a spray cleaning workcell. An important feature of the prototype system is the capability to generate the robot paths and motions automatically from the CAD models of the part to be cleaned, and to embed cleaning process knowledge into the automatically programmed operations.

  14. Automated Core Design

    SciTech Connect

    Kobayashi, Yoko; Aiyoshi, Eitaro

    2005-07-15

    Multistate searching methods are a subfield of distributed artificial intelligence that aims to provide both principles for construction of complex systems involving multiple states and mechanisms for coordination of independent agents' actions. This paper proposes a multistate searching algorithm with reinforcement learning for the automatic core design of a boiling water reactor. The characteristics of this algorithm are that the coupling structure and the coupling operation suitable for the assigned problem are assumed and an optimal solution is obtained by mutual interference in multistate transitions using multiagents. Calculations in an actual plant confirmed that the proposed algorithm increased the convergence ability of the optimization process.

  15. Computer automated design and computer automated manufacture.

    PubMed

    Brncick, M

    2000-08-01

    The introduction of computer aided design and computer aided manufacturing into the field of prosthetics and orthotics did not arrive without concern. Many prosthetists feared that the computer would provide other allied health practitioners who had little or no experience in prosthetics the ability to fit and manage amputees. Technicians in the field felt their jobs may be jeopardized by automated fabrication techniques. This has not turned out to be the case. Prosthetists who use CAD-CAM techniques are finding they have more time for patient care and clinical assessment. CAD-CAM is another tool for them to provide better care for the patients/clients they serve. One of the factors that deterred the acceptance of CAD-CAM techniques in its early stages was that of cost. It took a significant investment in software and hardware for the prosthetists to begin to use the new systems. This new technique was not reimbursed by insurance coverage. Practitioners did not have enough information about this new technique to make a sound decision on their investment of time and money. Ironically, it is the need to hold health care costs down that may prove to be the catalyst for the increased use of CAD-CAM in the field. Providing orthoses and prostheses to patients who require them is a very labor intensive process. Practitioners are looking for better, faster, and more economical ways in which to provide their services under the pressure of managed care. CAD-CAM may be the answer. The author foresees shape sensing departments in hospitals where patients would be sent to be digitized, similar to someone going for radiograph or ultrasound. Afterwards, an orthosis or prosthesis could be provided from a central fabrication facility at a remote site, most likely on the same day. Not long ago, highly skilled practitioners with extensive technical ability would custom make almost every orthosis. One now practices in an atmosphere where off-the-shelf orthoses are the standard. This

  16. Automated Design of Quantum Circuits

    NASA Technical Reports Server (NTRS)

    Williams, Colin P.; Gray, Alexander G.

    2000-01-01

    In order to design a quantum circuit that performs a desired quantum computation, it is necessary to find a decomposition of the unitary matrix that represents that computation in terms of a sequence of quantum gate operations. To date, such designs have either been found by hand or by exhaustive enumeration of all possible circuit topologies. In this paper we propose an automated approach to quantum circuit design using search heuristics based on principles abstracted from evolutionary genetics, i.e. using a genetic programming algorithm adapted specially for this problem. We demonstrate the method on the task of discovering quantum circuit designs for quantum teleportation. We show that to find a given known circuit design (one which was hand-crafted by a human), the method considers roughly an order of magnitude fewer designs than naive enumeration. In addition, the method finds novel circuit designs superior to those previously known.
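
    The scoring step implied above can be sketched in a few lines of numpy: a candidate gate sequence is rated by how closely its composed unitary reproduces a target unitary. The two-qubit gate set and the Bell-state target below are illustrative assumptions, not the paper's teleportation benchmark or its genetic programming machinery.

        import numpy as np

        # Elementary two-qubit gate unitaries (4x4), acting on qubits (0, 1).
        I = np.eye(2)
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        H0 = np.kron(H, I)                                  # Hadamard on qubit 0
        CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                         [0, 0, 0, 1], [0, 0, 1, 0]], dtype=float)
        GATES = {"H0": H0, "CNOT": CNOT}

        def compose(sequence):
            u = np.eye(4)
            for name in sequence:
                u = GATES[name] @ u          # later gates multiply on the left
            return u

        def fitness(sequence, target):
            # Fidelity-style score in [0, 1]: 1 means an exact decomposition.
            d = target.shape[0]
            return abs(np.trace(target.conj().T @ compose(sequence))) / d

        target = CNOT @ H0                   # Bell-state preparation circuit
        print(fitness(["H0", "CNOT"], target))    # 1.0 -- correct decomposition
        print(fitness(["CNOT", "H0"], target))    # < 1 -- wrong gate order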

  17. Automated design of flexible linkers.

    PubMed

    Manion, Charles; Arlitt, Ryan; Campbell, Matthew I; Tumer, Irem; Stone, Rob; Greaney, P Alex

    2016-03-14

    This paper presents a method for the systematic and automated design of flexible organic linkers for construction of metal-organic frameworks (MOFs) in which flexibility, compliance, or other mechanically exotic properties originate at the linker level rather than from the framework kinematics. Our method couples a graph grammar method for systematically generating linker-like molecules with molecular dynamics modeling of linkers' mechanical response. Using this approach we have generated a candidate pool of >59,000 hypothetical linkers. We screen linker candidates according to their mechanical behaviors under large deformation, and extract fragments common to the most performant candidate materials. To demonstrate the general approach to MOF design we apply our system to designing linkers for pressure-switching MOFs: MOFs that undergo reversible structural collapse after a stress threshold is exceeded. PMID:26687337

  18. Automated Engineering Design (AED); An approach to automated documentation

    NASA Technical Reports Server (NTRS)

    Mcclure, C. W.

    1970-01-01

    The automated engineering design (AED) is reviewed, consisting of a high level systems programming language, a series of modular precoded subroutines, and a set of powerful software machine tools that effectively automate the production and design of new languages. AED is used primarily for development of problem and user-oriented languages. Software production phases are diagramed, and factors which inhibit effective documentation are evaluated.

  19. Automation of Design Engineering Processes

    NASA Technical Reports Server (NTRS)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. It cannot be emphasized strongly enough that "all the written records," as used here, is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive while eliminating the need for voluminous paper files stored in different places. Another important purpose served by the method is to facilitate the work of engineers who are charged with sustaining the system and were not involved in the original design decisions. The method helps the sustaining engineers to retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.
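
    The core idea above, every document keyed to one tracking number in a single relational database, can be sketched with any relational store. The schema, field names, and sample rows below are hypothetical, and SQLite stands in purely for illustration; the abstract states that the actual implementation is written in Microsoft Access.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE design_item (
                tracking_no  INTEGER PRIMARY KEY,   -- single key relating all records
                title        TEXT NOT NULL
            );
            CREATE TABLE document (
                doc_id       INTEGER PRIMARY KEY,
                tracking_no  INTEGER REFERENCES design_item(tracking_no),
                doc_type     TEXT,   -- drawing, memo, meeting minutes, test report...
                filed_on     TEXT,
                body         TEXT
            );
        """)
        conn.execute("INSERT INTO design_item VALUES (1042, 'Valve actuator redesign')")
        conn.executemany(
            "INSERT INTO document (tracking_no, doc_type, filed_on, body) VALUES (?,?,?,?)",
            [(1042, "meeting minutes", "2003-05-02", "Agreed to drop dual-coil option."),
             (1042, "design change request", "2003-06-11", "Increase stroke to 12 mm.")])

        # A sustaining engineer retraces the history of item 1042 in filing order.
        for row in conn.execute("""SELECT doc_type, filed_on, body FROM document
                                   WHERE tracking_no = 1042 ORDER BY filed_on"""):
            print(row)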

  20. Computer automation for feedback system design

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Mathematical techniques and explanations of various steps used by an automated computer program to design feedback systems are summarized. Special attention was given to refining the automatic evaluation of suboptimal loop transmission and the translation of time-domain to frequency-domain specifications.

  1. Automated CPX support system preliminary design phase

    NASA Technical Reports Server (NTRS)

    Bordeaux, T. A.; Carson, E. T.; Hepburn, C. D.; Shinnick, F. M.

    1984-01-01

    The development of the Distributed Command and Control System (DCCS) is discussed. The development of an automated C2 system stimulated the development of an automated command post exercise (CPX) support system to provide a more realistic stimulus to DCCS than could be achieved with the existing manual system. An automated CPX system to support corps-level exercises was designed. The effort comprised four tasks: (1) collecting and documenting user requirements; (2) developing a preliminary system design; (3) defining a program plan; and (4) evaluating the suitability of the TRASANA FOURCE computer model.

  2. Automated Simulation For Analysis And Design

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, Tim; Robinson, Peter; Upadhye, R.

    1992-01-01

    Design Assistant Workstation (DAWN) software is being developed to facilitate simulation of qualitative and quantitative aspects of the behavior of a life-support system in a spacecraft, a chemical-processing plant, the heating and cooling system of a large building, or any of a variety of systems that include interacting process streams and processes. It is used to analyze alternative design scenarios or specific designs of such systems. An expert system will automate part of the design analysis: it will reason independently by simulating design scenarios and return to the designer with overall evaluations and recommendations.

  3. Automated CAD design for sculptured airfoil surfaces

    NASA Astrophysics Data System (ADS)

    Murphy, S. D.; Yeagley, S. R.

    1990-11-01

    The design of tightly tolerated sculptured surfaces such as those for airfoils requires a significant design effort in order to machine the tools to create these surfaces. Because of the quantity of numerical data required to describe the airfoil surfaces, a CAD approach is required. Although this approach will result in productivity gains, much larger gains can be achieved by automating the design process. This paper discusses an application which resulted in an eightfold improvement in productivity by automating the design process on the CAD system.

  4. Design considerations for automated packaging operations

    SciTech Connect

    Fahrenholtz, J.; Jones, J.; Kincy, M.

    1993-12-31

    The paper is based on work performed at Sandia National Laboratories to automate DOE packaging operations. It is a general summary of work from several projects which may be applicable to other packaging operations. Examples are provided of robotic operations which have been demonstrated as well as operations that are currently being developed. General design considerations for packages and for automated handling systems are described.

  5. An Automated Approach to Instructional Design Guidance.

    ERIC Educational Resources Information Center

    Spector, J. Michael; And Others

    This paper describes the Guided Approach to Instructional Design Advising (GAIDA), an automated instructional design tool that incorporates techniques of artificial intelligence. GAIDA was developed by the U.S. Air Force Armstrong Laboratory to facilitate the planning and production of interactive courseware and computer-based training materials.…

  6. Automated database design technology and tools

    NASA Technical Reports Server (NTRS)

    Shen, Stewart N. T.

    1988-01-01

    The Automated Database Design Technology and Tools research project results are summarized in this final report. Comments on the state of the art in various aspects of database design are provided, and recommendations made for further research for SNAP and NAVMASSO future database applications.

  7. Automating Relational Database Design for Microcomputer Users.

    ERIC Educational Resources Information Center

    Pu, Hao-Che

    1991-01-01

    Discusses issues involved in automating the relational database design process for microcomputer users and presents a prototype of a microcomputer-based system (RA, Relation Assistant) that is based on expert systems technology and helps avoid database maintenance problems. Relational database design is explained and the importance of easy input…

  8. Automated solar collector installation design

    DOEpatents

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2014-08-26

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, and to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties of these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives.

  9. Automated ILA design for synchronous sequential circuits

    NASA Technical Reports Server (NTRS)

    Liu, M. N.; Liu, K. Z.; Maki, G. K.; Whitaker, S. R.

    1991-01-01

    An iterative logic array (ILA) architecture for synchronous sequential circuits is presented. This technique utilizes linear algebra to produce the design equations. The ILA realization of synchronous sequential logic can be fully automated with a computer program. A programmable design procedure is proposed to fulfill the design task and layout generation. A software algorithm in the C language has been developed and tested to generate 1 micron CMOS layouts using the Hewlett-Packard FUNGEN module generator shell.

  10. Automated Design of Quantum Circuits

    NASA Technical Reports Server (NTRS)

    Williams, C.; Gray, G.

    1998-01-01

    In order to design a quantum circuit that performs a desired quantum computation, it is necessary to find a decomposition of the unitary matrix that represents that computation in terms of a sequence of quantum gate operations.

  11. Improved automated electronic balance calibration program

    SciTech Connect

    Clark, J.P.; Frickey, E.M.

    1995-11-01

    An improved automated electronic balance calibration and record system has been developed using a spread sheet to consolidate information required to calibrate electronic balances and satisfy requirements for traceability, validation and documentation. Several improvements have been made over an Epson HX-20™ notebook computer-based balance calibration system, which was developed at the Savannah River Site in 1986 and used continuously since to annually calibrate electronic balances. These improvements included: built-in tables of balance models' performance test limits and calibration standards' apparent masses & uncertainties; calculated ratios of balance to test weight uncertainties; bar-code data input; enhanced graphs and tables; and permanent electronic records. The software and hardware were thoroughly tested by calibrating 30 balances in another department. Hardware for importing data from balances through an RS-232 interface and bar code reader into a portable computer's spread sheet was evaluated and found to add little value to the calibration process. Computerized data collection minimizes record handling and reduces paper work costs by >50%. Databases are established for each organization's electronic balances that contain records for each balance that are identified by model, property identification number and location. In addition, each record contains calibration and expiration dates, performance testing information, etc. Details of equipment, statistical testing, spread sheet features and examples of the program are described.
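
    The "calculated ratios of balance to test weight uncertainties" mentioned above reduce to a simple test-uncertainty-ratio check. The sketch below assumes a conventional 4:1 acceptance criterion and example values; both are illustrative assumptions, not the Savannah River procedure or its limits.

        def uncertainty_ratio(balance_uncertainty_mg, standard_uncertainty_mg):
            # Ratio of the balance's performance-test limit to the calibration
            # standard's apparent-mass uncertainty (both in milligrams).
            return balance_uncertainty_mg / standard_uncertainty_mg

        def standard_is_adequate(balance_uncertainty_mg, standard_uncertainty_mg,
                                 required_ratio=4.0):   # assumed 4:1 rule of thumb
            return uncertainty_ratio(balance_uncertainty_mg,
                                     standard_uncertainty_mg) >= required_ratio

        # Example: a 0.1 mg test limit checked against a standard known to 0.02 mg.
        print(standard_is_adequate(0.1, 0.02))   # True (ratio 5:1)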

  12. Design of Inhouse Automated Library Systems.

    ERIC Educational Resources Information Center

    Cortez, Edwin M.

    1984-01-01

    Examines six steps inherent to development of in-house automated library system: (1) problem definition, (2) requirement specifications, (3) analysis of alternatives and solutions, (4, 5) design and implementation of hardware and software, and (6) evaluation. Practical method for comparing and weighting options is illustrated and explained. A…

  13. Banning design automation software implementation

    NASA Technical Reports Server (NTRS)

    Kuehlthau, R. L.

    1975-01-01

    Research is reported on the development of a system of computer programs to aid engineers in the design, fabrication, and testing of large scale integrated circuits, hybrid circuits, and printed circuit boards. The automatic layout programs, analysis programs, and interface programs are discussed.

  14. Automated mixed traffic vehicle design AMTV 2

    NASA Technical Reports Server (NTRS)

    Johnston, A. R.; Marks, R. A.; Cassell, P. L.

    1982-01-01

    The design of an improved and enclosed Automated Mixed Traffic Transit (AMTT) vehicle is described. AMTT is an innovative concept for low-speed tram-type transit in which suitable vehicles are equipped with sensors and controls to permit them to operate in an automated mode on existing road or walkway surfaces. The vehicle chassis and body design are presented in terms of sketches and photographs. The functional design of the sensing and control system is presented, and modifications which could be made to the baseline design for improved performance, in particular to incorporate a 20-mph capability, are also discussed. The vehicle system is described at the block-diagram level of detail. Specifications and parameter values are given where available.

  15. j5 DNA assembly design automation.

    PubMed

    Hillson, Nathan J

    2014-01-01

    Modern standardized methodologies, described in detail in the previous chapters of this book, have enabled the software-automated design of optimized DNA construction protocols. This chapter describes how to design (combinatorial) scar-less DNA assembly protocols using the web-based software j5. j5 assists biomedical and biotechnological researchers in constructing DNA by automating the design of optimized protocols for flanking homology sequence as well as type IIS endonuclease-mediated DNA assembly methodologies. Unlike any other software tool available today, j5 designs scar-less combinatorial DNA assembly protocols, performs a cost-benefit analysis to identify which portions of an assembly process would be less expensive to outsource to a DNA synthesis service provider, and designs hierarchical DNA assembly strategies to mitigate anticipated poor assembly junction sequence performance. Software integrated with j5 adds significant value to the j5 design process through graphical user-interface enhancement and downstream liquid-handling robotic laboratory automation. PMID:24395369

  16. Automating software design system DESTA

    NASA Technical Reports Server (NTRS)

    Lovitsky, Vladimir A.; Pearce, Patricia D.

    1992-01-01

    'DESTA' is the acronym for the Dialogue Evolutionary Synthesizer of Turnkey Algorithms by means of a natural language (Russian or English) functional specification of algorithms or software being developed. DESTA represents the computer-aided and/or automatic artificial intelligence 'forgiving' system which provides users with software tools support for algorithm and/or structured program development. The DESTA system is intended to provide support for the higher levels and earlier stages of engineering design of software in contrast to conventional Computer Aided Design (CAD) systems which provide low level tools for use at a stage when the major planning and structuring decisions have already been taken. DESTA is a knowledge-intensive system. The main features of the knowledge are procedures, functions, modules, operating system commands, batch files, their natural language specifications, and their interlinks. The specific domain for the DESTA system is a high level programming language like Turbo Pascal 6.0. The DESTA system is operational and runs on an IBM PC computer.

  17. Design automation techniques for custom LSI arrays

    NASA Technical Reports Server (NTRS)

    Feller, A.

    1975-01-01

    The standard cell design automation technique is described as an approach for generating random logic PMOS, CMOS or CMOS/SOS custom large scale integration arrays with low initial nonrecurring costs and quick turnaround time or design cycle. The system is composed of predesigned circuit functions or cells and computer programs capable of automatic placement and interconnection of the cells in accordance with an input data net list. The program generates a set of instructions to drive an automatic precision artwork generator. A series of support design automation and simulation programs are described, including programs for verifying correctness of the logic on the arrays, performing dc and dynamic analysis of MOS devices, and generating test sequences.

  18. j5 DNA assembly design automation software.

    PubMed

    Hillson, Nathan J; Rosengarten, Rafael D; Keasling, Jay D

    2012-01-20

    Recent advances in Synthetic Biology have yielded standardized and automatable DNA assembly protocols that enable a broad range of biotechnological research and development. Unfortunately, the experimental design required for modern scar-less multipart DNA assembly methods is frequently laborious, time-consuming, and error-prone. Here, we report the development and deployment of a web-based software tool, j5, which automates the design of scar-less multipart DNA assembly protocols including SLIC, Gibson, CPEC, and Golden Gate. The key innovations of the j5 design process include cost optimization, leveraging DNA synthesis when cost-effective to do so, the enforcement of design specification rules, hierarchical assembly strategies to mitigate likely assembly errors, and the instruction of manual or automated construction of scar-less combinatorial DNA libraries. Using a GFP expression testbed, we demonstrate that j5 designs can be executed with the SLIC, Gibson, or CPEC assembly methods, used to build combinatorial libraries with the Golden Gate assembly method, and applied to the preparation of linear gene deletion cassettes for E. coli. The DNA assembly design algorithms reported here are generally applicable to broad classes of DNA construction methodologies and could be implemented to supplement other DNA assembly design tools. Taken together, these innovations save researchers time and effort, reduce the frequency of user design errors and off-target assembly products, decrease research costs, and enable scar-less multipart and combinatorial DNA construction at scales unfeasible without computer-aided design. PMID:23651006
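
    One step described above, the cost-benefit analysis that decides which portions of an assembly are cheaper to outsource to a DNA synthesis provider, can be sketched as a per-fragment cost comparison. The prices and the cost model below are made-up placeholders, not j5's actual algorithm or any vendor's pricing.

        # Hypothetical costs (USD); real values depend on vendor and method.
        SYNTHESIS_PER_BP = 0.09     # direct gene synthesis
        OLIGO_PAIR_COST = 14.00     # PCR primers to amplify an existing template
        JUNCTION_COST = 6.00        # per assembly junction (enzymes, screening)

        def cheaper_to_synthesize(fragment_len_bp, has_template=True):
            synthesis = SYNTHESIS_PER_BP * fragment_len_bp
            assembly = (OLIGO_PAIR_COST if has_template else float("inf")) + JUNCTION_COST
            return synthesis < assembly

        for length in (150, 400, 2000):
            route = "synthesize" if cheaper_to_synthesize(length) else "assemble"
            print(f"{length:5d} bp fragment -> {route}")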

  19. Handbook: Design of automated redundancy verification

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1971-01-01

    The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.

  20. Fully Mechanically Controlled Automated Electron Microscopic Tomography

    NASA Astrophysics Data System (ADS)

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-07-01

    Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential in understanding those proteins’ functions, but their structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single and unique biological object from a series of tilted angles, but it is challenging to image a single protein for three-dimensional (3D) reconstruction due to the imperfect mechanical control capability of the specimen goniometer under both a medium to high magnification (approximately 50,000–160,000×) and an optimized beam coherence condition. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method could reduce the accumulation of the beam tilt/shift that is conventionally used to compensate for mechanical-control errors but degrades beam coherence. Our method was developed by minimizing the error of the target object center during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validation by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggests that this method has a capability comparable to other ET methods in tracking target proteins while maintaining optimized beam coherence conditions for imaging.
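
    The closed-loop proportional-integral idea described above can be sketched in a few lines: the measured drift of the target's image center feeds a PI law whose output is a purely mechanical corrective stage move, so no beam tilt/shift needs to accumulate. The gains, drift magnitudes, and noise model below are illustrative assumptions, not the authors' calibration.

        import random

        KP, KI = 0.6, 0.15            # illustrative PI gains
        DRIFT_PER_TILT = 40.0         # assumed nm of stage drift added per tilt step

        def measure_center_error(true_offset):
            # Stand-in for locating the target in the new image (e.g. by
            # cross-correlation); adds 2 nm of measurement noise.
            return true_offset + random.gauss(0.0, 2.0)

        offset, integral = 0.0, 0.0
        for tilt in range(1, 41):                     # e.g. a 40-image tilt series
            offset += DRIFT_PER_TILT * random.uniform(0.5, 1.5)  # goniometer imperfection
            error = measure_center_error(offset)      # centering error seen in the image
            integral += error
            offset -= KP * error + KI * integral      # purely mechanical stage correction
            print(f"tilt {tilt:2d}: measured centering error {error:7.1f} nm")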

  1. Fully Mechanically Controlled Automated Electron Microscopic Tomography

    PubMed Central

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-01-01

    Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential in understanding those proteins’ functions, but their structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single and unique biological object from a series of tilted angles, but it is challenging to image a single protein for three-dimensional (3D) reconstruction due to the imperfect mechanical control capability of the specimen goniometer under both a medium to high magnification (approximately 50,000–160,000×) and an optimized beam coherence condition. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method could reduce the accumulation of the beam tilt/shift that is conventionally used to compensate for mechanical-control errors but degrades beam coherence. Our method was developed by minimizing the error of the target object center during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validation by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggests that this method has a capability comparable to other ET methods in tracking target proteins while maintaining optimized beam coherence conditions for imaging. PMID:27403922

  2. Fully Mechanically Controlled Automated Electron Microscopic Tomography.

    PubMed

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-01-01

    Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential in understanding those proteins' functions, but their structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single and unique biological object from a series of tilted angles, but it is challenging to image a single protein for three-dimensional (3D) reconstruction due to the imperfect mechanical control capability of the specimen goniometer under both a medium to high magnification (approximately 50,000-160,000×) and an optimized beam coherence condition. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method could reduce the accumulation of the beam tilt/shift that is conventionally used to compensate for mechanical-control errors but degrades beam coherence. Our method was developed by minimizing the error of the target object center during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validation by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggests that this method has a capability comparable to other ET methods in tracking target proteins while maintaining optimized beam coherence conditions for imaging. PMID:27403922

  3. Design of the hybrid automated reliability predictor

    NASA Technical Reports Server (NTRS)

    Geist, R.; Trivedi, K.; Dugan, J. B.; Smotherman, M.

    1983-01-01

    The design of the Hybrid Automated Reliability Predictor (HARP), now under development at Duke University, is presented. The HARP approach to reliability prediction is characterized by a decomposition of the overall model into fault-occurrence and fault-handling sub-models. The fault-occurrence model is a non-homogeneous Markov chain which is solved analytically, while the fault-handling model is a Petri Net which is simulated. HARP provides automated analysis of sensitivity to uncertainties in the input parameters and in the initial state specifications. It then produces a predicted reliability band as a function of mission time, as well as estimates of the improvement (narrowing of the band) to be gained by a specified amount of reduction in uncertainty.
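
    A toy version of the decomposition described above: an analytically solved fault-occurrence model combined with a fault-handling coverage parameter whose uncertainty interval is swept to produce a predicted reliability band. The duplex-system model, failure rate, and coverage interval below are illustrative assumptions, not HARP's models or inputs.

        import math

        LAMBDA = 1e-4                  # assumed per-unit failure rate (per hour)
        COVERAGE = (0.95, 0.99)        # assumed uncertainty interval on fault handling

        def reliability(t, coverage):
            # Duplex (two-unit) system: operational if neither unit has failed, or
            # if exactly one has failed and that fault was successfully covered.
            e1, e2 = math.exp(-LAMBDA * t), math.exp(-2 * LAMBDA * t)
            return e2 + coverage * (2 * e1 - 2 * e2)

        for t in (100, 1000, 10000):
            lo, hi = (reliability(t, c) for c in COVERAGE)
            print(f"t = {t:6d} h   predicted reliability band [{lo:.6f}, {hi:.6f}]")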

  4. Automated Antenna Design with Evolutionary Algorithms

    NASA Technical Reports Server (NTRS)

    Linden, Derek; Hornby, Greg; Lohn, Jason; Globus, Al; Krishnakumar, K.

    2006-01-01

    Current methods of designing and optimizing antennas by hand are time and labor intensive, and limit complexity. Evolutionary design techniques can overcome these limitations by searching the design space and automatically finding effective solutions. In recent years, evolutionary algorithms have shown great promise in finding practical solutions in large, poorly understood design spaces. In particular, spacecraft antenna design has proven tractable to evolutionary design techniques. Researchers have been investigating evolutionary antenna design and optimization since the early 1990s, and the field has grown in recent years as computer speed has increased and electromagnetic simulators have improved. Two requirements-compliant antennas, one for ST5 and another for TDRS-C, have been automatically designed by evolutionary algorithms. The ST5 antenna is slated to fly this year, and a TDRS-C phased array element has been fabricated and tested. Such automated evolutionary design is enabled by medium-to-high quality simulators and fast modern computers to evaluate computer-generated designs. Evolutionary algorithms automate cut-and-try engineering, substituting automated search through millions of potential designs for intelligent search by engineers through a much smaller number of designs. For evolutionary design, the engineer chooses the evolutionary technique, parameters and the basic form of the antenna, e.g., single wire for ST5 and crossed-element Yagi for TDRS-C. Evolutionary algorithms then search for optimal configurations in the space defined by the engineer. NASA's Space Technology 5 (ST5) mission will launch three small spacecraft to test innovative concepts and technologies. Advanced evolutionary algorithms were used to automatically design antennas for ST5. The combination of wide beamwidth for a circularly-polarized wave and wide impedance bandwidth made for a challenging antenna design problem. From past experience in designing wire antennas, we chose to

  5. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
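
    The two data structures named above can be caricatured in a few lines: a data flow diagram reduced to attributed flows, and rules that pair a matching predicate (identification) with a suggested countermeasure (mitigation). The diagram, rules, and threat names below are simplified placeholders, not AutSEC's trees or Microsoft's threat taxonomy.

        # A data flow diagram reduced to a list of flows with a few attributes.
        flows = [
            {"src": "browser", "dst": "web_api", "crosses_boundary": True,  "encrypted": False},
            {"src": "web_api", "dst": "database", "crosses_boundary": False, "encrypted": False},
        ]

        # A (much simplified) rule set: predicate -> threat, paired with a
        # mitigation suggestion, in the spirit of the paper's two trees.
        RULES = [
            (lambda f: f["crosses_boundary"] and not f["encrypted"],
             "information disclosure on untrusted link",
             "use TLS on this flow"),
            (lambda f: f["dst"] == "database",
             "tampering via injection at the data store",
             "parameterize queries / validate inputs"),
        ]

        def analyze(flows):
            findings = []
            for flow in flows:
                for predicate, threat, mitigation in RULES:
                    if predicate(flow):
                        findings.append((f"{flow['src']} -> {flow['dst']}", threat, mitigation))
            return findings

        for edge, threat, mitigation in analyze(flows):
            print(f"{edge}: {threat}  [mitigation: {mitigation}]")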

  6. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  7. Designing Electronic Books.

    ERIC Educational Resources Information Center

    Barker, Philip; Manji, Karim

    1991-01-01

    Discussion of the design of interactive environments focuses on three types of electronic book metaphors that use optical discs and can facilitate computer-based learning: (1) static picture books, (2) moving picture books, and (3) multimedia books. Guidelines for designing electronic books are presented, and future directions are discussed. (17…

  8. Automated Procedure for Roll Pass Design

    NASA Astrophysics Data System (ADS)

    Lambiase, F.; Langella, A.

    2009-04-01

    The aim of this work has been to develop an automatic roll pass design method, capable of minimizing the number of roll passes. The adoption of artificial intelligence technologies, particularly expert systems, and a hybrid model for the surface profile evaluation of rolled bars, has allowed us to model the search for the minimal sequence with a tree path search. This approach permitted a geometrical optimization of roll passes while allowing automation of the roll pass design process. Moreover, the heuristic nature of the inferential engine contributes a great deal toward reducing search time, thus allowing such a system to be employed for industrial purposes. Finally, this new approach was compared with other recently developed automatic systems to validate and measure possible improvements among them.
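
    The tree path search for a minimal pass sequence can be sketched as a breadth-first search over discretized per-pass area reductions. The reduction limits, candidate set, and target below are illustrative assumptions; the actual system also reasons about groove geometry and surface profiles, which this sketch ignores.

        from collections import deque

        # Assumed per-pass area reductions available to the mill (max ~30 %).
        CANDIDATE_REDUCTIONS = (0.30, 0.25, 0.20, 0.15)

        def minimal_pass_sequence(start_area, target_area):
            # Breadth-first search over pass sequences, so the first sequence that
            # reaches the target area uses the minimal number of passes.
            queue = deque([(start_area, [])])
            seen = {round(start_area, 2)}
            while queue:
                area, passes = queue.popleft()
                if area <= target_area:
                    return passes
                for r in CANDIDATE_REDUCTIONS:
                    nxt = round(area * (1.0 - r), 2)
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, passes + [r]))
            return None

        # Example: roll a 2500 mm^2 billet section down to at most 900 mm^2.
        sequence = minimal_pass_sequence(2500.0, 900.0)
        print(len(sequence), "passes:", sequence)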

  9. Automating analog design: Taming the shrew

    NASA Technical Reports Server (NTRS)

    Barlow, A.

    1990-01-01

    The pace of progress in the design of integrated circuits continues to amaze observers inside and outside of the industry. Three decades ago, a 50 transistor chip was a technological wonder. Fifteen years later, a 5000 transistor device would 'wow' the crowds. Today, 50,000 transistor chips will earn a 'not too bad' assessment, but it takes 500,000 to really leave an impression. In 1975 a typical ASIC device had 1000 transistors, took one year to first samples (and two years to production) and sold for about 5 cents per transistor. Today's 50,000 transistor gate array takes about 4 months from spec to silicon, works the first time, and sells for about 0.02 cents per transistor. Fifteen years ago, the single most laborious and error-prone step in IC design was the physical layout. Today, most ICs never see the hand of a layout designer: an automatic place and route tool converts the engineer's computer-captured schematic to a complete physical design using a gate array or a library of standard cells also created by software rather than by designers. CAD has also been a generous benefactor to the digital design process. The architect of today's digital systems creates the design using an RTL or other high level simulator. Then the designer pushes a button to invoke the logic synthesizer-optimizer tool. A fault analyzer checks the result for testability and suggests where scan based cells will improve test coverage. One obstinate holdout amidst this parade of progress is the automation of analog design and its reduction to semi-custom techniques. This paper investigates the application of CAD techniques to analog design.

  10. Team-Centered Perspective for Adaptive Automation Design

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III

    2003-01-01

    Automation represents a very active area of human factors research. The journal, Human Factors, published a special issue on automation in 1985. Since then, hundreds of scientific studies have been published examining the nature of automation and its interaction with human performance. However, despite a dramatic increase in research investigating human factors issues in aviation automation, there remain areas that need further exploration. This NASA Technical Memorandum describes a new area of automation design and research, called adaptive automation. It discusses the concepts and outlines the human factors issues associated with the new method of adaptive function allocation. The primary focus is on human-centered design, and specifically on ensuring that adaptive automation is from a team-centered perspective. The document shows that adaptive automation has many human factors issues common to traditional automation design. Much like the introduction of other new technologies and paradigm shifts, adaptive automation presents an opportunity to remediate current problems but poses new ones for human-automation interaction in aerospace operations. The review here is intended to communicate the philosophical perspective and direction of adaptive automation research conducted under the Aerospace Operations Systems (AOS), Physiological and Psychological Stressors and Factors (PPSF) project.

  11. Automated Design Space Exploration with Aspen

    DOE PAGES

    Spafford, Kyle L.; Vetter, Jeffrey S.

    2015-01-01

    Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation) language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.
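
    The formulation described above, a performance model with parameter ranges handed to a nonlinear optimization solver, can be sketched with a one-parameter analytic model. The model, constants, and bounds below are illustrative assumptions and are not written in Aspen's notation; scipy's L-BFGS-B solver stands in for whichever solver is paired with Aspen.

        from scipy.optimize import minimize

        # Illustrative analytic performance model (seconds): a strong-scaling
        # compute term plus a communication term that grows with node count.
        TOTAL_FLOP = 5e13
        NODE_FLOPS = 1e11            # assumed sustained flop/s per node
        LINK_COST = 2e-3             # assumed per-node communication cost (s)

        def modeled_runtime(x):
            nodes = x[0]
            return TOTAL_FLOP / (nodes * NODE_FLOPS) + LINK_COST * nodes

        # The bounds play the role of Aspen's parameter-range construct;
        # node count is relaxed to a continuous variable for the NLP solver.
        result = minimize(modeled_runtime, x0=[64.0], bounds=[(1, 4096)],
                          method="L-BFGS-B")
        print(f"minimum modeled runtime {result.fun:.2f} s at ~{result.x[0]:.0f} nodes")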

  12. Analyzing Automated Instructional Systems: Metaphors from Related Design Professions.

    ERIC Educational Resources Information Center

    Jonassen, David H.; Wilson, Brent G.

    Noting that automation has had an impact on virtually every manufacturing and information operation in the world, including instructional design (ID), this paper suggests three basic metaphors for automating instructional design activities: (1) computer-aided design and manufacturing (CAD/CAM) systems; (2) expert system advisor systems; and (3)…

  13. Automated design tools for biophotonic systems

    NASA Astrophysics Data System (ADS)

    Vacca, Giacomo; Lehtimäki, Hannu; Karras, Tapio; Murphy, Sean

    2014-03-01

    Traditional design methods for flow cytometers and other complex biophotonic systems are increasingly recognized as a major bottleneck in instrumentation development. The many manual steps involved in the analysis and translation of the design, from optical layout to a detailed mechanical model and ultimately to a fully functional instrument, are labor-intensive and prone to wasteful trial-and-error iterations. We have developed two complementary, linked technologies that address this problem: one design tool (LiveIdeas™) provides an intuitive environment for interactive, real-time simulations of system-level performance; the other tool (BeamWise™) automates the generation of mechanical 3D CAD models based on those simulations. The strength of our approach lies in a parametric modeling strategy that breaks boundaries between engineering subsystems (e.g., optics and fluidics) to predict critical behavior of the instrument as a whole. The result: a 70 percent reduction in early-stage project effort and a significantly enhanced probability of success, by virtue of a more efficient exploration of the design space.

  14. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    Automated validation of flight-critical embedded systems is being done at ARC Dryden Flight Research Facility. The automated testing techniques are being used to perform closed-loop validation of man-rated flight control systems. The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 High Alpha Research Vehicle (HARV) automated test systems are discussed. Operationally applying automated testing techniques has accentuated flight control system features that either help or hinder the application of these techniques. The paper also discusses flight control system features which foster the use of automated testing techniques.

  15. Large scale oil lease automation and electronic custody transfer

    SciTech Connect

    Price, C.R.; Elmer, D.C.

    1995-12-31

    Typically, oil field production operations have only been automated at fields with long term production profiles and enhanced recovery. The automation generally consists of monitoring and control at the wellhead and centralized facilities. However, Union Pacific Resources Co. (UPRC) has successfully implemented a large scale automation program for rapid-decline primary recovery Austin Chalk wells where purchasers buy and transport oil from each individual wellsite. This project has resulted in two significant benefits. First, operators are using the system to re-engineer their work processes. Second, an inter-company team created a new electronic custody transfer method. This paper will describe: the progression of the company's automation objectives in the area; the field operator's interaction with the system, and the related benefits; and the research and development of the new electronic custody transfer method.

  16. Rapid iterative reanalysis for automated design

    NASA Technical Reports Server (NTRS)

    Bhatia, K. G.

    1973-01-01

    A method for iterative reanalysis in automated structural design is presented for a finite-element analysis using the direct stiffness approach. A basic feature of the method is that the generalized stiffness and inertia matrices are expressed as functions of structural design parameters, and these generalized matrices are expanded in Taylor series about the initial design. Only the linear terms are retained in the expansions. The method is approximate because it uses static condensation, modal reduction, and the linear Taylor series expansions. The exact linear representation of the expansions of the generalized matrices is also described, and a basis for the present method is established. Results of applications of the present method to the recalculation of the natural frequencies of two simple platelike structural models are presented and compared with results obtained using a commonly applied analysis procedure that served as a reference. In general, the results are in good agreement. A comparison of the computer times required by the present method and the reference method indicated that the present method required substantially less time for reanalysis. Although the results presented are for relatively small-order problems, the present method will become more efficient relative to the reference method as the problem size increases. An extension of the present method to static reanalysis is described, and a basis for unifying the static and dynamic reanalysis procedures is presented.
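
    A minimal numerical sketch of the reanalysis idea described above follows, assuming small generalized stiffness and mass matrices and a single design parameter; the matrices and sensitivities are invented for illustration and are not taken from the report. The matrices are updated by their first-order Taylor terms and the natural frequencies are recomputed from a generalized eigenvalue solve.

      # First-order Taylor reanalysis of natural frequencies (illustrative matrices).
      import numpy as np
      from scipy.linalg import eigh

      K0 = np.array([[2.0, -1.0], [-1.0, 2.0]])      # generalized stiffness at the initial design
      M0 = np.eye(2)                                  # generalized mass at the initial design
      dK_dp = [np.array([[0.5, 0.0], [0.0, 0.1]])]    # sensitivity of K to one design parameter
      dM_dp = [np.array([[0.02, 0.0], [0.0, 0.02]])]  # sensitivity of M to the same parameter

      def reanalyze(dp):
          """Approximate natural frequencies (rad/s) after a design change dp."""
          K = K0 + sum(dKi * dpi for dKi, dpi in zip(dK_dp, dp))
          M = M0 + sum(dMi * dpi for dMi, dpi in zip(dM_dp, dp))
          return np.sqrt(eigh(K, M, eigvals_only=True))

      print(reanalyze([0.0]))   # frequencies at the initial design
      print(reanalyze([0.3]))   # approximate frequencies after a parameter change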

  17. Designing the Electronic Classroom.

    ERIC Educational Resources Information Center

    Adams, Laural L.

    In an increasingly technological environment, traditional teaching presentation methods such as the podium, overhead, and transparencies are no longer sufficient. This document serves as a guide to designing and planning an electronic classroom for "bidirectional" communication between teacher and student. Topics include: (1) determining whether…

  18. DASLL. Printed Circuit Board Design Automation

    SciTech Connect

    Magnuson, W.G.Jr.; Willett, G.W.

    1983-06-03

    DASLL (Design Automation System at Lawrence Livermore) is a set of computer programs for printed circuit board (PCB) layout. The DASLL system can process a number of PCB trimlines, including: DEC 1, 2, 4, and 6 high configurations, CLI, Augat, Varian, and several rectangular geometries; others can be added. Over 800 components and generic package types are available in DASLLDB, the system reference library. Two-layer boards with non-gridded (structured) power and ground busses are supported, and PCB densities of approximately 1.2 square inches per equivalent IC (or less dense) are best accommodated by DASLL. The system has been used to make etch artwork and drill tapes (starting with a schematic drawing) for a six IC CLI board in less than two working days. Initial processing will produce reports and computer printer-plots which can be used to verify the input. Final output can include silkscreen photo-artwork, PCB etch photo-artwork, punched paper tapes for the SLO-SYN and Pratt-Whitney N/C drill machines, and computer listings of signal strings, parts lists, etc.

  19. Design automation for integrated nonlinear logic circuits (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Van Vaerenbergh, Thomas; Pelc, Jason; Santori, Charles; Bose, Ranojoy; Kielpinski, Dave; Beausoleil, Raymond G.

    2016-05-01

    A key enabler of the IT revolution of the late 20th century was the development of electronic design automation (EDA) tools allowing engineers to manage the complexity of electronic circuits with transistor counts now reaching into the billions. Recently, we have been developing large-scale nonlinear photonic integrated logic circuits for next generation all-optical information processing. At this time, a sufficiently powerful EDA-style software tool chain for designing this type of complex circuit does not yet exist. Here we describe a hierarchical approach to automating the design and validation of photonic integrated circuits, which can scale to several orders of magnitude higher complexity than the state of the art. Most photonic integrated circuits developed today consist of a small number of components, and only limited hierarchy. For example, a simple photonic transceiver may contain on the order of 10 building-block components, consisting of grating couplers for photonic I/O, modulators, and signal splitters/combiners. Because this is relatively easy to lay out by hand (or with a simple script), existing photonic design tools have relatively little automation in comparison to electronics tools. But demonstrating all-optical logic will require significantly more complex photonic circuits containing up to 1,000 components, hence becoming infeasible to design manually. Our design framework is based on Python-based software from Luceda Photonics which provides an environment to describe components, simulate their behavior, and export design files (GDS) to foundries for fabrication. At a fundamental level, a photonic component is described as a parametric cell (PCell) similarly to electronics design. PCells are described by geometric characteristics of their layout. A critical part of the design framework is the implementation of PCells as Python objects. PCell objects can then use inheritance to simplify design, and hierarchical designs can be made by creating composite
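
    The sketch below illustrates the PCell idea in plain Python: components as parametric classes whose geometry is derived from layout parameters, with inheritance and composition providing hierarchy. The class and method names are hypothetical and do not reproduce the Luceda Photonics API.

      # Hypothetical PCell-style hierarchy; geometry is just lists of (layer, polygon) tuples.
      class PCell:
          def __init__(self, name):
              self.name = name

          def layout(self):
              """Return a list of (layer, polygon) primitives; overridden by subclasses."""
              raise NotImplementedError

      class Waveguide(PCell):
          def __init__(self, name, length, width=0.45):
              super().__init__(name)
              self.length, self.width = length, width

          def layout(self):
              half = self.width / 2.0
              return [("WG", [(0, -half), (self.length, -half),
                              (self.length, half), (0, half)])]

      class Splitter(PCell):
          """Composite cell: hierarchy by instantiating child PCells."""
          def __init__(self, name, arm_length):
              super().__init__(name)
              self.children = [Waveguide(name + "_in", arm_length),
                               Waveguide(name + "_out_a", arm_length),
                               Waveguide(name + "_out_b", arm_length)]

          def layout(self):
              return [prim for child in self.children for prim in child.layout()]

      print(len(Splitter("sp1", 10.0).layout()))  # 3 child waveguides -> 3 polygons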

  20. Design of automated system for management of arrival traffic

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; Nedell, William

    1989-01-01

    The design of an automated air traffic control system based on a hierarchy of advisory tools for controllers is described. Compatibility of the tools with the human controller, a key objective of the design, is achieved by a judicious selection of tasks to be automated and careful attention to the design of the controller system interface. The design comprises three interconnected subsystems referred to as the Traffic Management Advisor, the Descent Advisor, and the Final Approach Spacing Tool. Each of these subsystems provides a collection of tools for specific controller positions and tasks. The focus here is on the design of two of these tools: the Descent Advisor, which provides automation tools for managing descent traffic, and the Traffic Management Advisor, which generates optimum landing schedules. The algorithms, automation modes, and graphical interfaces incorporated in the design are described.

  1. EXADS - EXPERT SYSTEM FOR AUTOMATED DESIGN SYNTHESIS

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1994-01-01

    The expert system called EXADS was developed to aid users of the Automated Design Synthesis (ADS) general purpose optimization program. Because of the general purpose nature of ADS, it is difficult for a nonexpert to select the best choice of strategy, optimizer, and one-dimensional search options from the one hundred or so combinations that are available. EXADS aids engineers in determining the best combination based on their knowledge of the problem and the expert knowledge previously stored by the experts who developed ADS. EXADS is a customized application of the AESOP artificial intelligence program (the general version of AESOP is available separately from COSMIC; the ADS program is also available from COSMIC). The expert system consists of two main components. The knowledge base contains about 200 rules and is divided into three categories: constrained, unconstrained, and constrained treated as unconstrained. The EXADS inference engine is rule-based and makes decisions about a particular situation using hypotheses (potential solutions), rules, and answers to questions drawn from the rule base. EXADS is backward-chaining; that is, it works from hypotheses to facts. The rule base was compiled from sources such as literature searches, ADS documentation, and engineer surveys. EXADS will accept answers such as yes, no, maybe, likely, and don't know, or a certainty factor ranging from 0 to 10. When any hypothesis reaches a confidence level of 90% or more, it is deemed the best choice and displayed to the user. If no hypothesis is confirmed, the user can examine explanations of why the hypotheses failed to reach the 90% level. The IBM PC version of EXADS is written in IQ-LISP for execution under DOS 2.0 or higher with a central memory requirement of approximately 512K of 8-bit bytes. This program was developed in 1986.
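
    A simplified sketch of the certainty-factor reasoning described above follows; it is not EXADS itself. User answers are mapped to confidence values (normalized here to a 0-1 scale rather than 0-10), rules accumulate evidence for a hypothesis, and a hypothesis is accepted once its confidence reaches the 90% level. The questions and weights are hypothetical.

      # Toy certainty-factor accumulation with a 90% acceptance threshold.
      ANSWER_CONFIDENCE = {"no": 0.0, "maybe": 0.3, "likely": 0.7, "yes": 1.0}

      def combine(cf_old, cf_new):
          # classic combination rule for two positive pieces of evidence
          return cf_old + cf_new * (1.0 - cf_old)

      def evaluate(hypothesis, rules, answers, threshold=0.9):
          """rules: list of (question, weight) pairs supporting the hypothesis."""
          cf = 0.0
          for question, weight in rules:
              cf = combine(cf, weight * ANSWER_CONFIDENCE.get(answers.get(question, "no"), 0.0))
          return hypothesis, round(cf, 3), cf >= threshold

      rules = [("is the problem constrained?", 0.7),
               ("are gradients available?", 0.6),
               ("is the objective smooth?", 0.5)]
      answers = {"is the problem constrained?": "yes",
                 "are gradients available?": "likely",
                 "is the objective smooth?": "yes"}
      print(evaluate("use a feasible-directions strategy", rules, answers))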

  2. Automated electronic system for measuring thermophysical properties

    NASA Technical Reports Server (NTRS)

    Creel, T. R., Jr.; Jones, R. A.; Corwin, R. R.; Kramer, J. S.

    1975-01-01

    Phase-change coatings are used to measure surface temperature accurately under transient heating conditions. The coating melts when the surface reaches the calibrated phase-change temperature. Temperature is monitored by an infrared thermometer, and the corresponding elapsed time is recorded by an electronic data-handling system.

  3. An Intelligent Automation Platform for Rapid Bioprocess Design

    PubMed Central

    Wu, Tianyi

    2014-01-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579

  4. Designing Domain-Specific HUMS Architectures: An Automated Approach

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Agarwal, Neha; Kumar, Pramod; Sundaram, Parthiban

    2004-01-01

    The HUMS automation system automates the design of HUMS architectures. The automated design process involves selection of solutions from a large space of designs as well as pure synthesis of designs. Hence the whole objective is to efficiently search for or synthesize designs or parts of designs in the database and to integrate them to form the entire system design. The automation system adopts two approaches in order to produce the designs: (a) a bottom-up approach and (b) a top-down approach. Both approaches are endowed with a suite of qualitative and quantitative techniques that enable a) the selection of matching component instances, b) the determination of design parameters, c) the evaluation of candidate designs at component-level and at system-level, d) the performance of cost-benefit analyses, e) the performance of trade-off analyses, etc. In short, the automation system attempts to capitalize on the knowledge developed from years of experience in engineering, system design, and operation of HUMS systems in order to economically produce the most optimal and domain-specific designs.

  5. TARDIS: An Automation Framework for JPL Mission Design and Navigation

    NASA Technical Reports Server (NTRS)

    Roundhill, Ian M.; Kelly, Richard M.

    2014-01-01

    Mission Design and Navigation at the Jet Propulsion Laboratory has implemented an automation framework tool to assist in orbit determination and maneuver design analysis. This paper describes the lessons learned from previous automation tools and how they have been implemented in this tool. In addition this tool has revealed challenges in software implementation, testing, and user education. This paper describes some of these challenges and invites others to share their experiences.

  6. The Electronic Nose Training Automation Development

    NASA Technical Reports Server (NTRS)

    Schattke, Nathan

    2002-01-01

    The electronic nose is a method of using several sensors in conjunction to identify an unknown gas. Statistical analysis has shown that a large number of training exposures need to be performed in order to get a model that can be depended on. The number of training exposures needed is on the order of 1000. Data acquisition from the noses is generally automatic and built in. The gas generation equipment consists of a Miller-Nelson (MN) flow/temperature/humidity controller and a Kin-Tek (KT) trace gas generator. This equipment has been controlled in the past by an old data acquisition and control system. The new system will use new control boards and an easy graphical user interface. The programming for this is in the LabVIEW G programming language, a language in which it is easy for the user to make modifications. This paper details some of the issues in selecting the components and programming the connections. It is not a primer on LabVIEW programming; a separate CD with website files is being delivered to teach that.

  7. Delivery validation of an automated modulated electron radiotherapy plan

    SciTech Connect

    Connell, T. Papaconstadopoulos, P.; Alexander, A.; Serban, M.; Devic, S.; Seuntjens, J.

    2014-06-15

    Purpose: Modulated electron radiation therapy (MERT) represents an active area of interest that offers the potential to improve healthy tissue sparing in treatment of certain cancer cases. Challenges remain however in accurate beamlet dose calculation, plan optimization, collimation method, and delivery accuracy. In this work, the authors investigate the accuracy and efficiency of an end-to-end MERT plan and automated delivery method. Methods: Treatment planning was initiated on a previously treated whole breast irradiation case including an electron boost. All dose calculations were performed using Monte Carlo methods and beam weights were determined using a research-based treatment planning system capable of inverse optimization. The plan was delivered to radiochromic film placed in a water equivalent phantom for verification, using an automated motorized tertiary collimator. Results: The automated delivery, which covered four electron energies, 196 subfields, and 6183 total MU was completed in 25.8 min, including 6.2 min of beam-on time. The remainder of the delivery time was spent on collimator leaf motion and the automated interfacing with the accelerator in service mode. Comparison of the planned and delivered film dose gave 3%/3 mm gamma pass rates of 62.1%, 99.8%, 97.8%, 98.3%, and 98.7% for the 9, 12, 16, and 20 MeV, and combined energy deliveries, respectively. Delivery was also performed with a MapCHECK device and resulted in 3%/3 mm gamma pass rates of 88.8%, 86.1%, 89.4%, and 94.8% for the 9, 12, 16, and 20 MeV energies, respectively. Conclusions: Results of the authors’ study showed that an accurate delivery utilizing an add-on tertiary electron collimator is possible using Monte Carlo calculated plans and inverse optimization, which brings MERT closer to becoming a viable option for physicians in treating superficial malignancies.
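
    For readers unfamiliar with the 3%/3 mm criterion quoted above, the sketch below illustrates a simplified one-dimensional gamma-index calculation; it is not the authors' analysis code, and the dose profiles are synthetic.

      # Simplified 1-D gamma index: a measured point passes when some reference point
      # lies within the combined dose-difference / distance-to-agreement ellipse.
      import numpy as np

      def gamma_1d(ref_pos, ref_dose, meas_pos, meas_dose, dose_tol=0.03, dist_tol=3.0):
          """dose_tol is fractional (global normalization), dist_tol in mm."""
          norm = dose_tol * ref_dose.max()
          gammas = []
          for x_m, d_m in zip(meas_pos, meas_dose):
              terms = ((d_m - ref_dose) / norm) ** 2 + ((x_m - ref_pos) / dist_tol) ** 2
              gammas.append(np.sqrt(terms.min()))
          return np.array(gammas)

      x = np.linspace(0, 50, 101)                      # positions in mm
      ref = np.exp(-((x - 25.0) / 10.0) ** 2)          # synthetic planned profile
      meas = 1.02 * np.exp(-((x - 25.5) / 10.0) ** 2)  # synthetic measured profile
      g = gamma_1d(x, ref, x, meas)
      print("3%%/3 mm pass rate: %.1f%%" % (100 * (g <= 1).mean()))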

  8. Software design for automated assembly of truss structures

    NASA Technical Reports Server (NTRS)

    Herstrom, Catherine L.; Grantham, Carolyn; Allen, Cheryl L.; Doggett, William R.; Will, Ralph W.

    1992-01-01

    Concern over the limited intravehicular activity time has increased the interest in performing in-space assembly and construction operations with automated robotic systems. A technique being considered at LaRC is a supervised-autonomy approach, which can be monitored by an Earth-based supervisor that intervenes only when the automated system encounters a problem. A test-bed to support evaluation of the hardware and software requirements for supervised-autonomy assembly methods was developed. This report describes the design of the software system necessary to support the assembly process. The software is hierarchical and supports both automated assembly operations and supervisor error-recovery procedures, including the capability to pause and reverse any operation. The software design serves as a model for the development of software for more sophisticated automated systems and as a test-bed for evaluation of new concepts and hardware components.

  9. A Toolset for Supporting Iterative Human-Automation Interaction in Design

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.

    2010-01-01

    The addition of automation has greatly extended humans' capability to accomplish tasks, including those that are difficult, complex, and safety critical. The majority of Human-Automation Interaction (HAI) results in more efficient and safe operations; however, certain unexpected automation behaviors or "automation surprises" can be frustrating and, in certain safety-critical operations (e.g., transportation, manufacturing control, medicine), may result in injuries or the loss of life (Mellor, 1994; Leveson, 1995; FAA, 1995; BASI, 1998; Sheridan, 2002). This paper describes the development of a design tool that enables the rapid development and evaluation of automation prototypes. The ultimate goal of the work is to provide a design platform upon which automation surprise vulnerability analyses can be integrated.

  10. Design of electronic composites

    SciTech Connect

    Taya, M.

    1995-12-31

    This report describes the requirements of selected electronic composites with application to electronic packaging, then focuses on the modeling for the microstructure-macrobehavior relation of electronic composites. The modeling depends on the microstructure, percolative and non-percolative.

  11. How the new optoelectronic design automation industry is taking advantage of preexisting EDA standards

    NASA Astrophysics Data System (ADS)

    Nesmith, Kevin A.; Carver, Susan

    2014-05-01

    With the advancement of design processes down to sub-7 nm levels, the Electronic Design Automation industry appears to be coming to an end of advancements, as the size of the silicon atom becomes the limiting factor. Or is it? The commercial viability of mass-producing silicon photonics is bringing about the Optoelectronic Design Automation (OEDA) industry. With the science of photonics in its infancy, adding these circuits to increasingly complex electronic designs will allow for new generations of advancements. Learning from the past 50 years of the EDA industry's mistakes and missed opportunities, the photonics industry is starting with electronic standards and extending them to become photonically aware. Adapting pre-existing standards to this relatively new industry will allow for easier integration into the present infrastructure and faster time to market.

  12. Semi-Automated Diagnosis, Repair, and Rework of Spacecraft Electronics

    NASA Technical Reports Server (NTRS)

    Struk, Peter M.; Oeftering, Richard C.; Easton, John W.; Anderson, Eric E.

    2008-01-01

    NASA's Constellation Program for Exploration of the Moon and Mars places human crews in extreme isolation in resource-scarce environments. Near Earth, the discontinuation of Space Shuttle flights after 2010 will alter the up- and down-mass capacity for the International Space Station (ISS). NASA is considering new options for logistics support strategies for future missions. Aerospace systems are often composed of replaceable modular blocks that minimize the need for complex service operations in the field. Such a strategy, however, implies a robust and responsive logistics infrastructure with relatively low transportation costs. The modular Orbital Replacement Units (ORUs) used for the ISS require relatively large blocks of replacement hardware even though the actual failed component may be three orders of magnitude smaller. The ability to perform in-situ repair of electronic circuits at the component level can dramatically reduce the scale of spares and related logistics cost. This ability also reduces mission risk, increases crew independence, and improves the overall supportability of the program. The Component-Level Electronics Assembly Repair (CLEAR) task under the NASA Supportability program was established to demonstrate the practicality of repair by first investigating widely used soldering materials and processes (M&P) performed by modest manual means. The work will result in program guidelines for performing manual repairs along with design guidance for circuit reparability. The next phase of CLEAR recognizes that manual repair has its limitations and that some highly integrated devices are extremely difficult to handle and demand semi-automated equipment. Further, electronics repairs require a broad range of diagnostic capability to isolate the faulty components. Finally, repairs must pass functional tests to determine that the repairs are successful and the circuit can be returned to service. To prevent equipment demands from exceeding spacecraft volume

  13. Design of Center-TRACON Automation System

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; Davis, Thomas J.; Green, Steven

    1993-01-01

    A system for the automated management and control of terminal area traffic, referred to as the Center-TRACON Automation System (CTAS), is being developed at NASA Ames Research Center. In a cooperative program, NASA and the FAA have efforts underway to install and evaluate the system at the Denver area and Dallas/Ft. Worth area air traffic control facilities. This paper reviews the CTAS architecture and automation functions as well as the integration of CTAS into the existing operational system. CTAS consists of three types of integrated tools that provide computer-generated advisories for both en-route and terminal area controllers to guide them in managing and controlling arrival traffic efficiently. One tool, the Traffic Management Advisor (TMA), generates runway assignments, landing sequences, and landing times for all arriving aircraft, including those originating from nearby feeder airports. TMA also assists in runway configuration control and flow management. Another tool, the Descent Advisor (DA), generates clearances for the en-route controllers handling arrival flows to metering gates. The DA's clearances ensure fuel-efficient and conflict-free descents to the metering gates at specified crossing times. In the terminal area, the Final Approach Spacing Tool (FAST) provides heading and speed advisories that help controllers produce an accurately spaced flow of aircraft on the final approach course. Databases consisting of several hundred aircraft performance models, airline preferred operational procedures, and a three-dimensional wind model support the operation of CTAS. The first component of CTAS, the Traffic Management Advisor, is being evaluated at the Denver TRACON and the Denver Air Route Traffic Control Center. The second component, the Final Approach Spacing Tool, will be evaluated in several stages at the Dallas/Fort Worth Airport beginning in October 1993. An initial stage of the Descent Advisor tool is being prepared for testing at the Denver Center.

  14. Automating the design process - Progress, problems, prospects, potential.

    NASA Technical Reports Server (NTRS)

    Heldenfels, R. R.

    1973-01-01

    The design process for large aerospace vehicles is discussed, with particular emphasis on structural design. Problems with current procedures are identified. Then, the contributions possible from automating the design process (defined as the best combination of men and computers) are considered. Progress toward automated design in the aerospace and other communities is reviewed, including NASA studies of the potential development of Integrated Programs for Aerospace-Vehicle Design (IPAD). The need for and suggested directions of future research on the design process, both technical and social, are discussed. Although much progress has been made to exploit the computer in design, it is concluded that technology is available to begin using the computer to speed communications and management as well as calculations in the design process and thus build man-computer teams that can design better, faster and cheaper.

  15. Design of the Automated Rendezvous and Capture Docking System

    NASA Technical Reports Server (NTRS)

    Cruzen, Craig A.; Lomas, James J.

    1999-01-01

    This paper describes the Automated Rendezvous and Capture (AR&C) system that was designed and is being tested at NASA's Marshall Space Flight Center (MSFC). The AR&C system incorporates some of the latest innovations in Global Positioning System (GPS), laser sensor technologies and automated mission sequencing algorithms as well as the capability for ground and crew monitoring and commanding. This paper summarizes the variety of mission scenarios supported by the AR&C system. It also describes the major components of the AR&C system including the Guidance, Navigation and Control system, GPS receivers, relative navigation filter and the Video Guidance Sensor. A discussion of the safety and reliability issues confronted during the design follows. By designing a safe and robust automated system, space mission operations cost can be reduced by decreasing the number of ground personnel required for the extensive mission design, preflight planning and training typically required for rendezvous and docking missions.

  16. Design automation for complex CMOS/SOS LSI hybrid substrates

    NASA Technical Reports Server (NTRS)

    Ramondetta, P. W.; Smiley, J. W.

    1976-01-01

    A design automation approach used to develop thick-film hybrid packages is described. The hybrid packages produced combine thick-film and silicon-on-sapphire (SOS) large-scale integration (LSI) technologies to bring the on-chip performance level of SOS to the subsystem level. Packing densities are improved by a factor of eight over ceramic dual in-line packaging; interchip wiring capacitance is low. Due to significant time savings, the design automation approach presented can be expected to yield a 3:1 reduction in cost over the use of manual methods for the initial design of a hybrid.

  17. Automated AI-based designer of electrical distribution systems

    NASA Astrophysics Data System (ADS)

    Sumic, Zarko

    1992-03-01

    Designing the electrical supply system for new residential developments (plat design) is an everyday task for electric utility engineers. Presently this task is carried out manually, resulting in an overdesigned, costly, and nonstandardized solution. As an ill-structured and open-ended problem, plat design is difficult to automate with conventional approaches such as operational research or CAD. Additional complexity in automating plat design is imposed by the need to process spatial data such as circuit maps, records, and construction plans. The intelligent decision support system for automated electrical plat design (IDSS for AEPD) is an engineering tool aimed at automating plat design. IDSS for AEPD combines the functionality of geographic information systems (GIS), a geographically referenced database, with the sophistication of artificial intelligence (AI) to deal with the complexity inherent in design problems. A blackboard problem-solving architecture, concentrated around the INGRES relational database and the NEXPERT Object expert system shell, has been chosen to accommodate the diverse knowledge sources and data models. The GIS's principal task is to create, structure, and formalize the real world representation required by the rule-based reasoning portion of the AEPD. The IDSS's capability to support and enhance the engineer's design, rather than only automate the design process through a prescribed computation, makes it a preferred choice among the possible techniques for AEPD. This paper presents the results of the knowledge acquisition and knowledge engineering process, together with AEPD tool conceptual design issues. To verify the proposed concept, a comparison of results obtained by the AEPD tool with the design obtained by an experienced human designer is given.

  18. Advanced characterization of twins using automated electron backscatter diffraction

    SciTech Connect

    Wright, S. I.; Bingert, J. F.; Mason, T. A.; Larson, R. J.

    2002-01-01

    This paper describes results obtained using an automated, crystallographically-based technique for twin identification. The technique is based on the automated collection of spatially specific orientation measurements by electron backscatter diffraction (EBSD) in the scanning electron microscope (SEM). The key features of the analysis are identification of potential twin boundaries by their misorientation character, identification of the distinct boundary planes among the symmetrically equivalent candidates, and validation of these boundaries through comparison with the boundary and twin plane traces in the sample cross section. Results on the application of this technique to deformation twins in zirconium are analyzed for the effect of twin type and amount and sense of uniaxial deformation. The accumulation of strain tends to increase the misorientation deviation at least to the degree of the trace deviation compared with recrystallization twins in nickel. In addition to the results on characterizing the twin character, results on extending the twin analysis to automated identification of parent and daughter material for structures exhibiting twin deformation are reported as well.
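
    A heavily simplified sketch of the misorientation check described above follows: given two grain orientations as rotation matrices, compute the misorientation angle and flag a potential twin boundary when it falls near an ideal twin misorientation (roughly 85 degrees for {10-12} tensile twins in zirconium). Crystal-symmetry operators and the boundary-trace validation step described in the paper are omitted, and the orientations used here are synthetic.

      # Misorientation-angle screen for candidate twin boundaries (symmetry ignored).
      import numpy as np

      def rotation_about_axis(axis, angle_deg):
          axis = np.asarray(axis, dtype=float)
          axis /= np.linalg.norm(axis)
          a = np.radians(angle_deg)
          K = np.array([[0, -axis[2], axis[1]],
                        [axis[2], 0, -axis[0]],
                        [-axis[1], axis[0], 0]])
          return np.eye(3) + np.sin(a) * K + (1 - np.cos(a)) * (K @ K)

      def misorientation_angle(g1, g2):
          m = g1 @ g2.T
          c = np.clip((np.trace(m) - 1.0) / 2.0, -1.0, 1.0)
          return np.degrees(np.arccos(c))

      IDEAL_TWIN_ANGLE = 85.0   # approximate value for {10-12} tensile twins in Zr
      TOLERANCE = 5.0

      g_parent = np.eye(3)
      g_neighbor = rotation_about_axis([1, 0, 0], 86.0)   # synthetic neighbor orientation
      angle = misorientation_angle(g_parent, g_neighbor)
      print(angle, abs(angle - IDEAL_TWIN_ANGLE) <= TOLERANCE)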

  19. Automated determination of electron density from electric field measurements on the Van Allen Probes spacecraft

    NASA Astrophysics Data System (ADS)

    Zhelavskaya, Irina; Kurth, William; Spasojevic, Maria; Shprits, Yuri

    2016-07-01

    We present the Neural-network-based Upper-hybrid Resonance Determination (NURD) algorithm for automatic inference of the electron number density from plasma wave measurements made onboard NASA's Van Allen Probes mission. A feedforward neural network is developed to determine the upper hybrid resonance frequency, f_{uhr}, from electric field measurements, which is then used to calculate the electron number density. In previous missions, the plasma resonance bands were manually identified, and there have been few attempts to do robust, routine automated detections. We describe the design and implementation of the algorithm and perform an initial analysis of the resulting electron number density distribution obtained by applying NURD to 2.5 years of data collected with the EMFISIS instrumentation suite of the Van Allen Probes mission. Densities obtained by NURD are compared to those obtained by another recently developed automated technique and also to an existing empirical plasmasphere and trough density model.
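
    The density step described above can be illustrated with the standard cold-plasma relations; the neural-network identification of f_{uhr} is omitted here, and the example frequencies are representative magnitudes only, not EMFISIS data.

      # Electron number density from the upper-hybrid and cyclotron frequencies.
      import math

      EPS0 = 8.8541878128e-12   # F/m
      M_E  = 9.1093837015e-31   # kg
      Q_E  = 1.602176634e-19    # C

      def electron_density(f_uhr_hz, f_ce_hz):
          """n_e in cm^-3 from f_uhr^2 = f_pe^2 + f_ce^2 and f_pe^2 = n_e e^2 / (4 pi^2 eps0 m_e)."""
          f_pe_sq = f_uhr_hz**2 - f_ce_hz**2
          n_e_m3 = 4.0 * math.pi**2 * f_pe_sq * EPS0 * M_E / Q_E**2
          return n_e_m3 * 1e-6   # convert m^-3 to cm^-3

      # Example: f_uhr = 300 kHz, f_ce = 50 kHz (illustrative magnitudes)
      print("%.1f cm^-3" % electron_density(300e3, 50e3))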

  20. Automation of the electron-beam welding process

    NASA Astrophysics Data System (ADS)

    Koleva, E.; Dzharov, V.; Kardjiev, M.; Mladenov, G.

    2016-03-01

    In this work, the automatic control of the vacuum and cooling systems of the equipment for electron-beam welding, evaporation, and surface modification located at the IE-BAS is considered. A project for control and management was elaborated, based on the development of an engineering support system using existing and additional technical means of automation. Optimization of the indicators which are critical for the duration of reaching the working regime and stopping the operation of the installation can be made using experimentally obtained transient characteristics. The automation of the available equipment, aimed at improving its efficiency and the repeatability of the obtained results, as well as at stabilizing the process parameters, should be integrated into an Engineering Support System which, besides the operator supervision, consists of several subsystems for equipment control, data acquisition, information analysis, system management, and decision-making support.

  1. Automated Theorem Proving in High-Quality Software Design

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e., a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the area of aviation, (nuclear) power plants, or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the area of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in this area are not only subject to safety considerations; security issues are important as well. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in software engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like the illegal "cloning" of smart cards for D2 GSM mobile phones, or the extraction of (secret) passwords from German T-Online users, show that serious flaws can happen in this area as well. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high quality of the software and lead to truly safe and secure systems. In this paper, we examine to what extent automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  2. Automated IDEF3 and IDEF4 systems design specification document

    NASA Technical Reports Server (NTRS)

    Friel, Patricia Griffith; Blinn, Thomas M.

    1989-01-01

    The current design is presented for the automated IDEF3 and IDEF4 tools. The philosophy behind the tool designs is described, as well as the conceptual view of the interacting components of the two tools. Finally, a detailed description is presented of the existing designs for the tools, using IDEF3 process descriptions and IDEF4 diagrams. In the preparation of these designs, the IDEF3 and IDEF4 methodologies were very effective in defining the structure and operation of the tools. The experience of designing systems in this fashion was very valuable and led to subsequent systems being designed in the same way. However, the number of IDEF3 and IDEF4 diagrams that were produced using a Macintosh for this document attests to the need for an automated tool to simplify the design process.

  3. Automating the design of scientific computing software

    NASA Technical Reports Server (NTRS)

    Kant, Elaine

    1992-01-01

    SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.

  4. Conceptual design of an aircraft automated coating removal system

    SciTech Connect

    Baker, J.E.; Draper, J.V.; Pin, F.G.; Primm, A.H.; Shekhar, S.

    1996-05-01

    Paint stripping of the U.S. Air Force's large transport aircraft is currently a labor-intensive, manual process. Significant reductions in costs, personnel, and turnaround time can be accomplished by the judicious use of automation in some process tasks. This paper presents the conceptual design of a coating removal system for the tail surfaces of the C-5 plane. Emphasis is placed on the technology selection to optimize human-automation synergy with respect to overall costs, throughput, quality, safety, and reliability. Trade-offs between field-proven and research-requiring technologies, and between expected gain and cost and complexity, have led to a conceptual design which is semi-autonomous (relying on the human for task specification and disturbance handling) yet incorporates sensor-based automation (for sweep path generation and tracking, surface following, stripping quality control and tape/breach handling).

  5. Automating expert role to determine design concept in Kansei Engineering

    NASA Astrophysics Data System (ADS)

    Lokman, Anitawati Mohd; Haron, Mohammad Bakri Che; Abidin, Siti Zaleha Zainal; Khalid, Noor Elaiza Abd

    2016-02-01

    Affect has become imperative in product quality. In the affective design field, Kansei Engineering (KE) has been recognized as a technology that enables the discovery of consumers' emotions and the formulation of guides to design products that win consumers in the competitive market. Albeit a powerful technology, it offers no rule of thumb for its analysis and interpretation process. KE expertise is required to determine sets of related Kansei and the significant concept of emotion. Many research endeavors are handicapped by the limited number of available and accessible KE experts. This work is performed to simulate the role of experts with the use of the Natphoric algorithm, thus providing a sound solution to the complexity and flexibility in KE. The algorithm is designed to learn the process by implementing training datasets taken from previous KE research works. A framework for automated KE is then designed to realize the development of an automated KE system. A comparative analysis is performed to determine the feasibility of the developed prototype to automate the process. The result shows that the significant Kansei determined by manual KE implementation and by the automated process are highly similar. KE research advocates will benefit from this system to automatically determine significant design concepts.

  6. A Case Study in CAD Design Automation

    ERIC Educational Resources Information Center

    Lowe, Andrew G.; Hartman, Nathan W.

    2011-01-01

    Computer-aided design (CAD) software and other product life-cycle management (PLM) tools have become ubiquitous in industry during the past 20 years. Over this time they have continuously evolved, becoming programs with enormous capabilities, but the companies that use them have not evolved their design practices at the same rate. Due to the…

  7. Generative Representations for Computer-Automated Design Systems

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2004-01-01

    With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design programs is the representation with which they encode designs. If the representation cannot encode a certain design, then the design program cannot produce it. Similarly, a poor representation makes some types of designs extremely unlikely to be created. Here we define generative representations as those representations which can create and reuse organizational units within a design and argue that reuse is necessary for design systems to scale to more complex and interesting designs. To support our argument we describe GENRE, an evolutionary design program that uses both a generative and a non-generative representation, and compare the results of evolving designs with both types of representations.

  8. Automated Design of the Europa Orbiter Tour

    NASA Technical Reports Server (NTRS)

    Heaton, Andrew F.; Strange, Nathan J.; Longuski, James M.; Bonfiglio, Eugene P.; Taylor, Irene (Technical Monitor)

    2000-01-01

    In this paper we investigate tours of the Jovian satellites Europa, Ganymede, and Callisto for the Europa Orbiter Mission. The principal goal of the tour design is to lower the arrival V(sub infinity) for the final Europa encounter while meeting all of the design constraints. Key constraints arise from considering the total time of the tour and the radiation dosage of a tour. These tours may employ 14 or more encounters with the Jovian satellites, hence there is an enormous number of possible sequences of these satellites to investigate. We develop a graphical method that greatly aids the design process.

  9. Automated Design of the Europa Orbiter Tour

    NASA Technical Reports Server (NTRS)

    Heaton, Andrew F.; Strange, Nathan J.; Longusaki, James M.; Bonfiglio, Eugene P.

    2000-01-01

    In this paper we investigate tours of the Jovian satellites Europa, Ganymede, and Callisto for the Europa Orbiter Mission. The principal goal of the tour design is to lower arrival V(sub infinity) for the final Europa encounter while meeting all of the design constraints. Key constraints arise from considering the total time of the tour and the radiation dosage of a tour. These tours may employ 14 or more encounters with the Jovian satellites, hence there is an enormous number of possible sequences of these satellites to investigate. We develop a graphical method that greatly aids the design process.

  10. Stressing Design in Electronics Teaching

    ERIC Educational Resources Information Center

    Cuthbert, L. G.

    1976-01-01

    Advocates a strong emphasis on the teaching of the design of electronic circuits in undergraduate courses. An instructional paradigm involving the design and construction of a single-transistor amplifier is provided. (CP)

  11. Automated radiation hard ASIC design tool

    NASA Technical Reports Server (NTRS)

    White, Mike; Bartholet, Bill; Baze, Mark

    1993-01-01

    A commercially based, foundry-independent compiler design tool (ChipCrafter) with custom radiation-hardened library cells is described. A unique analysis approach allows low hardness risk for Application-Specific ICs (ASICs). Accomplishments, radiation test results, and applications are described.

  12. CMOS-array design-automation techniques

    NASA Technical Reports Server (NTRS)

    Feller, A.; Lombardt, T.

    1979-01-01

    This thirty-four-page report discusses the design of a 4,096-bit complementary metal oxide semiconductor (CMOS) read-only memory (ROM). The CMOS ROM is either mask- or laser-programmable. The report is divided into six sections: section one describes the background of ROM chips; section two presents the design goals for the chip; section three discusses the chip implementation and chip statistics; and conclusions and recommendations are given in sections four through six.

  13. Generative Representations for Computer-Automated Evolutionary Design

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2006-01-01

    With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design systems is the representation with which they encode designs. If the representation cannot encode a certain design, then the design system cannot produce it. To be able to produce new types of designs, and not just optimize pre-defined parameterizations, evolutionary design systems must use generative representations. Generative representations are assembly procedures, or algorithms, for constructing a design thereby allowing for truly novel design solutions to be encoded. In addition, by enabling modularity, regularity and hierarchy, the level of sophistication that can be evolved is increased. We demonstrate the advantages of generative representations on two different design domains: the evolution of spacecraft antennas and the evolution of 3D objects.
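
    A minimal illustration of a generative representation, in the spirit described above but not the GENRE system itself, is sketched below: the genotype is an assembly procedure whose subroutines can be reused, so a small encoding constructs a larger, modular design (here just a list of 2-D line segments drawn by turtle-style commands).

      # Tiny generative encoding: named procedures with reusable subroutine calls.
      import math

      def interpret(procedures, name, x=0.0, y=0.0, heading=0.0, depth=3):
          """'F' draws a unit segment, 'L'/'R' turn, 'CALL name' reuses a procedure."""
          segments = []
          for op in procedures[name]:
              if op == "F":
                  nx, ny = x + math.cos(heading), y + math.sin(heading)
                  segments.append(((x, y), (nx, ny)))
                  x, y = nx, ny
              elif op == "L":
                  heading += math.pi / 2
              elif op == "R":
                  heading -= math.pi / 2
              elif op.startswith("CALL ") and depth > 0:
                  sub, x, y, heading = interpret(procedures, op[5:], x, y, heading, depth - 1)
                  segments.extend(sub)
          return segments, x, y, heading

      # One reusable "arm" module referenced twice by the main procedure.
      procedures = {"arm": ["F", "L", "F"], "main": ["CALL arm", "R", "CALL arm", "F"]}
      segments, *_ = interpret(procedures, "main")
      print(len(segments))   # 5 segments produced from a 4-symbol main procedure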

  14. DESIGN OF SMALL AUTOMATION WORK CELL SYSTEM DEMONSTRATIONS

    SciTech Connect

    C. TURNER; J. PEHL; ET AL

    2000-12-01

    The introduction of automation systems into many of the facilities dealing with the production, use and disposition of nuclear materials has been an ongoing objective. Many previous attempts have been made, using a variety of monolithic and, in some cases, modular technologies. Many of these attempts were less than successful, owing to the difficulty of the problem, the lack of maturity of the technology, and over optimism about the capabilities of a particular system. Consequently, it is not surprising that suggestions that automation can reduce worker Occupational Radiation Exposure (ORE) levels are often met with skepticism and caution. The development of effective demonstrations of these technologies is of vital importance if automation is to become an acceptable option for nuclear material processing environments. The University of Texas Robotics Research Group (UTRRG) has been pursuing the development of technologies to support modular small automation systems (each of less than 5 degrees-of-freedom) and the design of those systems for more than two decades. Properly designed and implemented, these technologies have a potential to reduce the worker ORE associated with work in nuclear materials processing facilities. Successful development of systems for these applications requires the development of technologies that meet the requirements of the applications. These application requirements form a general set of rules that applicable technologies and approaches need to adhere to, but in and of themselves are generally insufficient for the design of a specific automation system. For the design of an appropriate system, the associated task specifications and relationships need to be defined. These task specifications also provide a means by which appropriate technology demonstrations can be defined. Based on the requirements and specifications of the operations of the Advanced Recovery and Integrated Extraction System (ARIES) pilot line at Los Alamos National

  15. AI and workflow automation: The prototype electronic purchase request system

    NASA Technical Reports Server (NTRS)

    Compton, Michael M.; Wolfe, Shawn R.

    1994-01-01

    Automating 'paper' workflow processes with electronic forms and email can dramatically improve the efficiency of those processes. However, applications that involve complex forms that are used for a variety of purposes or that require numerous and varied approvals often require additional software tools to ensure that (1) the electronic form is correctly and completely filled out, and (2) the form is routed to the proper individuals and organizations for approval. The prototype electronic purchase request (PEPR) system, which has been in pilot use at NASA Ames Research Center since December 1993, seamlessly links a commercial electronics forms package and a CLIPS-based knowledge system that first ensures that electronic forms are correct and complete, and then generates an 'electronic routing slip' that is used to route the form to the people who must sign it. The PEPR validation module is context-sensitive, and can apply different validation rules at each step in the approval process. The PEPR system is form-independent, and has been applied to several different types of forms. The system employs a version of CLIPS that has been extended to support AppleScript, a recently-released scripting language for the Macintosh. This 'scriptability' provides both a transparent, flexible interface between the two programs and a means by which a single copy of the knowledge base can be utilized by numerous remote users.
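
    An illustrative sketch of the two PEPR functions described above, context-sensitive validation and routing-slip generation, is given below in plain Python rather than CLIPS; the field names, thresholds, and approval roles are hypothetical.

      # Hypothetical form validation and routing-slip generation (not the PEPR rule base).
      def validate(form, step):
          errors = []
          if not form.get("description"):
              errors.append("description is required")
          if step == "submission" and form.get("amount", 0) <= 0:
              errors.append("amount must be positive")
          if step == "funding-review" and not form.get("account_code"):
              errors.append("account code required before funding review")
          return errors

      def routing_slip(form):
          approvers = ["supervisor"]
          if form.get("amount", 0) > 2500:
              approvers.append("branch chief")        # higher-value purchases escalate
          if form.get("hazardous"):
              approvers.append("safety office")
          return approvers

      form = {"description": "oscilloscope", "amount": 4800, "account_code": "A-123"}
      print(validate(form, "submission"), routing_slip(form))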

  16. Automated database design from natural language input

    NASA Technical Reports Server (NTRS)

    Gomez, Fernando; Segami, Carlos; Delaune, Carl

    1995-01-01

    Users and programmers of small systems typically do not have the skills needed to design a database schema from an English description of a problem. This paper describes a system that automatically designs databases for such small applications from English descriptions provided by end-users. Although the system has been motivated by the space applications at Kennedy Space Center, and portions of it have been designed with that idea in mind, it can be applied to different situations. The system consists of two major components: a natural language understander and a problem-solver. The paper describes briefly the knowledge representation structures constructed by the natural language understander, and, then, explains the problem-solver in detail.

  17. Theoretical considerations in designing operator interfaces for automated systems

    NASA Technical Reports Server (NTRS)

    Norman, Susan D.

    1987-01-01

    The domains most amenable to techniques based on artificial intelligence (AI) are those that are systematic or for which a systematic domain can be generated. In aerospace systems, many operational tasks are systematic owing to the highly procedural nature of the applications. However, aerospace applications can also be nonprocedural, particularly in the event of a failure or an unexpected event. Several techniques are discussed for designing automated systems for real-time, dynamic environments, particularly when a 'breakdown' occurs. A breakdown is defined as operation of an automated system outside its predetermined, conceptual domain.

  18. Towards automation of user interface design

    NASA Technical Reports Server (NTRS)

    Gastner, Rainer; Kraetzschmar, Gerhard K.; Lutz, Ernst

    1992-01-01

    This paper suggests an approach to automatic software design in the domain of graphical user interfaces. There are still some drawbacks in existing user interface management systems (UIMS's) which basically offer only quantitative layout specifications via direct manipulation. Our approach suggests a convenient way to get a default graphical user interface which may be customized and redesigned easily in further prototyping cycles.

  19. Library Automation Design for Visually Impaired People

    ERIC Educational Resources Information Center

    Yurtay, Nilufer; Bicil, Yucel; Celebi, Sait; Cit, Guluzar; Dural, Deniz

    2011-01-01

    Speech synthesis is a technology used in many different areas in computer science. This technology can bring a solution to reading activity of visually impaired people due to its text to speech conversion. Based on this problem, in this study, a system is designed needed for a visually impaired person to make use of all the library facilities in…

  20. Automated Design of Complex Dynamic Systems

    PubMed Central

    Hermans, Michiel; Schrauwen, Benjamin; Bienstman, Peter; Dambre, Joni

    2014-01-01

    Several fields of study are concerned with uniting the concept of computation with that of the design of physical systems. For example, a recent trend in robotics is to design robots in such a way that they require a minimal control effort. Another example is found in the domain of photonics, where recent efforts try to benefit directly from the complex nonlinear dynamics to achieve more efficient signal processing. The underlying goal of these and similar research efforts is to internalize a large part of the necessary computations within the physical system itself by exploiting its inherent non-linear dynamics. This, however, often requires the optimization of large numbers of system parameters, related to both the system's structure and its material properties. In addition, many of these parameters are subject to fabrication variability or to variations through time. In this paper we apply a machine learning algorithm to optimize physical dynamic systems. We show that such algorithms, which are normally applied on abstract computational entities, can be extended to the field of differential equations and used to optimize an associated set of parameters which determine their behavior. We show that machine learning training methodologies are highly useful in designing robust systems, and we provide a set of both simple and complex examples using models of physical dynamical systems. Interestingly, the derived optimization method is intimately related to direct collocation, a method known in the field of optimal control. Our work suggests that the application domains of both machine learning and optimal control have a largely unexplored overlapping area which envelops a novel design methodology of smart and highly complex physical systems. PMID:24497969
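
    A toy sketch in the spirit of the paper follows: the parameters of a differential equation (a damped oscillator) are treated as trainable quantities and optimized so that the system's response matches a desired target signal. The model, target, and optimizer choice are invented for illustration and do not reproduce the authors' method.

      # Fitting ODE parameters so the simulated response tracks a target signal.
      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import minimize

      t_eval = np.linspace(0, 10, 200)
      target = np.exp(-0.5 * t_eval) * np.cos(2.0 * t_eval)   # desired response

      def simulate(params):
          damping, stiffness = params
          def rhs(t, y):
              return [y[1], -damping * y[1] - stiffness * y[0]]
          sol = solve_ivp(rhs, (0, 10), [1.0, 0.0], t_eval=t_eval)
          return sol.y[0]

      def loss(params):
          if min(params) <= 0:          # keep the search in a physically meaningful region
              return 1.0e6
          return float(np.mean((simulate(params) - target) ** 2))

      result = minimize(loss, x0=[0.1, 1.0], method="Nelder-Mead")
      print("fitted damping, stiffness:", result.x, "loss:", result.fun)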

  1. An automated approach to the design of decision tree classifiers

    NASA Technical Reports Server (NTRS)

    Argentiero, P.; Chin, R.; Beaudet, P.

    1982-01-01

    An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
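
    The paper's construction is not given in the abstract; purely to indicate the main ingredients of one such node, a linear feature extraction followed by a Bayes decision rule evaluated from a priori class statistics, a hedged sketch might look like the following. The two-class Gaussian statistics, the Fisher-style projection, and the posterior comparison are assumptions for the example, not the paper's exact procedure.

    ```python
    import numpy as np

    # A priori class statistics (illustrative): means, covariances, and priors of two classes.
    mu = [np.array([0.0, 0.0]), np.array([2.0, 1.0])]
    cov = [np.eye(2), np.array([[1.5, 0.3], [0.3, 0.8]])]
    prior = [0.6, 0.4]

    # Linear feature extraction: project onto a Fisher-style discriminant direction.
    w = np.linalg.solve(cov[0] + cov[1], mu[1] - mu[0])

    def gauss1d(z, m, s2):
        return np.exp(-(z - m) ** 2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)

    # Bayes decision rule on the extracted 1-D feature (one "node" of a tree).
    def node_decision(x):
        z = w @ x
        post = [prior[c] * gauss1d(z, w @ mu[c], w @ cov[c] @ w) for c in range(2)]
        return int(np.argmax(post))

    samples = np.random.default_rng(1).multivariate_normal(mu[1], cov[1], size=5)
    print([node_decision(x) for x in samples])
    ```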

  2. Design of automation tools for management of descent traffic

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; Nedell, William

    1988-01-01

    The design of an automated air traffic control system based on a hierarchy of advisory tools for controllers is described. Compatibility of the tools with the human controller, a key objective of the design, is achieved by a judicious selection of tasks to be automated and careful attention to the design of the controller system interface. The design comprises three interconnected subsystems referred to as the Traffic Management Advisor, the Descent Advisor, and the Final Approach Spacing Tool. Each of these subsystems provides a collection of tools for specific controller positions and tasks. This paper focuses primarily on the Descent Advisor, which provides automation tools for managing descent traffic. The algorithms, automation modes, and graphical interfaces incorporated in the design are described. Information generated by the Descent Advisor tools is integrated into a plan view traffic display consisting of a high-resolution color monitor. Estimated arrival times of aircraft are presented graphically on a time line, which is also used interactively in combination with a mouse input device to select and schedule arrival times. Other graphical markers indicate the location of the fuel-optimum top-of-descent point and the predicted separation distances of aircraft at a designated time-control point. Computer-generated advisories provide speed and descent clearances which the controller can issue to aircraft to help them arrive at the feeder gate at the scheduled times or with specified separation distances. Two types of horizontal guidance modes, selectable by the controller, provide markers for managing the horizontal flightpaths of aircraft under various conditions. The entire system, consisting of the Descent Advisor algorithm, a library of aircraft performance models, national airspace system databases, and interactive display software, has been implemented on a workstation made by Sun Microsystems, Inc. It is planned to use this configuration in operational

  3. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development effort. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real time, and one-to-one correlation between the object states and real-world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem database directly to enhance code efficiency by, e.g., eliminating unused subroutines, and provided for automated report generation in addition to functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  4. Automated design of multiple encounter gravity-assist trajectories

    NASA Technical Reports Server (NTRS)

    Longuski, James M.; Williams, Steve N.

    1990-01-01

    Given a range of initial launch dates and a set of target planets, a new approach to planetary mission design is developed, using an automated method for finding all conic solutions. Each point on the diagrams reproduced represents a single ballistic trajectory and is computed by modeling the trajectory as a conic section and solving the corresponding Lambert problem for each set of launch and arrival dates. An example which prescribes a launch period of 1975-2050 and target planets Uranus, Saturn, Jupiter, Neptune and Pluto is described whereby all possible grand tour missions of this class are found, including the Voyager II trajectory. It is determined that this automated design tool may be applied to a variety of multiple encounter gravity-assist trajectories that are being considered for future missions.

  5. Implementing and Improving Automated Electronic Tumor Molecular Profiling.

    PubMed

    Rioth, Matthew J; Staggs, David B; Hackett, Lauren; Haberman, Erich; Tod, Mike; Levy, Mia; Warner, Jeremy

    2016-03-01

    Oncology practice increasingly requires the use of molecular profiling of tumors to inform the use of targeted therapeutics. However, many oncologists use third-party laboratories to perform tumor genomic testing, and these laboratories may not have electronic interfaces with the provider's electronic medical record (EMR) system. The resultant reporting mechanisms, such as plain-paper faxing, can reduce report fidelity, slow down reporting procedures for a physician's practice, and make reports less accessible. Vanderbilt University Medical Center and its genomic laboratory testing partner have collaborated to create an automated electronic reporting system that incorporates genetic testing results directly into the clinical EMR. This system was iteratively tested, and causes of failure were discovered and addressed. Most errors were attributable to data entry or typographical errors that made reports unable to be linked to the correct patient in the EMR. By providing direct feedback to providers, we were able to significantly decrease the rate of transmission errors (from 6.29% to 3.84%; P < .001). The results and lessons of 1 year of using the system and transmitting 832 tumor genomic testing reports are reported. PMID:26813927

  6. Universal electronics for miniature and automated chemical assays.

    PubMed

    Urban, Pawel L

    2015-02-21

    This minireview discusses universal electronic modules (generic programmable units) and their use by analytical chemists to construct inexpensive, miniature or automated devices. Recently, open-source platforms have gained considerable popularity among tech-savvy chemists because their implementation often does not require expert knowledge and investment of funds. Thus, chemistry students and researchers can easily start implementing them after a few hours of reading tutorials and trial-and-error. Single-board microcontrollers and micro-computers such as Arduino, Teensy, Raspberry Pi or BeagleBone enable collecting experimental data with high precision as well as efficient control of electric potentials and actuation of mechanical systems. They are readily programmed using high-level languages, such as C, C++, JavaScript or Python. They can also be coupled with mobile consumer electronics, including smartphones as well as teleinformatic networks. More demanding analytical tasks require fast signal processing. Field-programmable gate arrays enable efficient and inexpensive prototyping of high-performance analytical platforms, thus becoming increasingly popular among analytical chemists. This minireview discusses the advantages and drawbacks of universal electronic modules, considering their application in prototyping and manufacture of intelligent analytical instrumentation. PMID:25535820

  7. Automating Vector Autoregression on Electronic Patient Diary Data.

    PubMed

    Emerencia, Ando Celino; van der Krieke, Lian; Bos, Elisabeth H; de Jonge, Peter; Petkov, Nicolai; Aiello, Marco

    2016-03-01

    Finding the best vector autoregression model for any dataset, medical or otherwise, is a process that, to this day, is frequently performed manually in an iterative manner requiring statistical expertise and time. Very few software solutions for automating this process exist, and they still require statistical expertise to operate. We propose a new application, called Autovar, for the automation of finding vector autoregression models for time series data. The approach closely resembles the way in which experts work manually. Our proposal offers improvements over the manual approach by leveraging computing power, e.g., by considering multiple alternatives instead of choosing just one. In this paper, we describe the design and implementation of Autovar, we compare its performance against experts working manually, and we compare its features to those of the most widely used commercial solution available today. The main contribution of Autovar is to show that vector autoregression on a large scale is feasible. We show that an exhaustive approach for model selection can be relatively safe to use. This study forms an important step toward making adaptive, personalized treatment available and affordable for all branches of healthcare. PMID:25680221
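
    Autovar's actual procedure is not shown here; as a minimal sketch of what automated VAR model selection involves, fitting candidate lag orders by least squares and ranking them with an information criterion, one might write the following. The OLS fit, the simplified AIC formula, and the simulated data are assumptions for illustration only.

    ```python
    import numpy as np

    def fit_var(data, p):
        """Ordinary least-squares fit of a VAR(p) model. data: (T, k) array."""
        T, k = data.shape
        Y = data[p:]                                             # (T-p, k) targets
        X = np.hstack([data[p - i - 1:T - i - 1] for i in range(p)])  # lagged regressors
        X = np.hstack([np.ones((T - p, 1)), X])                  # intercept column
        B, *_ = np.linalg.lstsq(X, Y, rcond=None)
        resid = Y - X @ B
        sigma = resid.T @ resid / (T - p)                        # residual covariance
        return B, sigma

    def aic(sigma, t_eff, n_params):
        _, logdet = np.linalg.slogdet(sigma)
        return logdet + 2.0 * n_params / t_eff                   # simplified multivariate AIC

    def select_order(data, max_lag=5):
        """Return the lag order with the lowest AIC, mimicking an automated first pass."""
        T, k = data.shape
        scores = {p: aic(fit_var(data, p)[1], T - p, k * (1 + k * p))
                  for p in range(1, max_lag + 1)}
        return min(scores, key=scores.get), scores

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        x = np.zeros((300, 2))
        for t in range(1, 300):                                  # simulate a simple VAR(1) process
            x[t] = 0.5 * x[t - 1] + rng.normal(size=2)
        print(select_order(x)[0])
    ```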

  8. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 high alpha research vehicle (HARV) automated test systems are discussed. It is noted that operational experiences in developing and using these automated testing techniques have highlighted the need for incorporating target system features to improve testability. Improved target system testability can be accomplished with the addition of nonreal-time and real-time features. Online access to target system implementation details, unobtrusive real-time access to internal user-selectable variables, and proper software instrumentation are all desirable features of the target system. Also, test system and target system design issues must be addressed during the early stages of the target system development. Processing speeds of up to 20 million instructions/s and the development of high-bandwidth reflective memory systems have improved the ability to integrate the target system and test system for the application of automated testing techniques. It is concluded that new methods of designing testability into the target systems are required.

  9. Automated Electron Microscopy for Evaluating Two-dimensional Crystallization of Membrane Proteins

    PubMed Central

    Hu, Minghui; Vink, Martin; Kim, Changki; Derr, KD; Koss, John; D'Amico, Kevin; Cheng, Anchi; Pulokas, James; Ubarretxena-Belandia, Iban; Stokes, David

    2010-01-01

    Membrane proteins fulfill many important roles in the cell and represent the target for a large number of therapeutic drugs. Although structure determination of membrane proteins has become a major priority, it has proven to be technically challenging. Electron microscopy of two-dimensional (2D) crystals has the advantage of visualizing membrane proteins in their natural lipidic environment, but has been underutilized in recent structural genomics efforts. To improve the general applicability of electron crystallography, high-throughput methods are needed for screening large numbers of conditions for 2D crystallization, thereby increasing the chances of obtaining well ordered crystals and thus achieving atomic resolution. Previous reports describe devices for growing 2D crystals on a 96-well format. The current report describes a system for automated imaging of these screens with an electron microscope. Samples are inserted with a two-part robot: a SCARA robot for loading samples into the microscope holder, and a Cartesian robot for placing the holder into the electron microscope. A standard JEOL 1230 electron microscope was used, though a new tip was designed for the holder and a toggle switch controlling the airlock was rewired to allow robot control. A computer program for controlling the robots was integrated with the Leginon program, which provides a module for automated imaging of individual samples. The resulting images are uploaded into the Sesame laboratory information management system database where they are associated with other data relevant to the crystallization screen. PMID:20197095

  10. Crew aiding and automation: A system concept for terminal area operations, and guidelines for automation design

    NASA Technical Reports Server (NTRS)

    Dwyer, John P.

    1994-01-01

    This research and development program comprised two efforts: the development of guidelines for the design of automated systems, with particular emphasis on automation design that takes advantage of contextual information, and the concept-level design of a crew aiding system, the Terminal Area Navigation Decision Aiding Mediator (TANDAM). This concept outlines a system capable of organizing navigation and communication information and assisting the crew in executing the operations required in descent and approach. In service of this endeavor, problem definition activities and operational familiarization exercises were conducted to characterize the terminal area navigation problem. Both airborne and ground-based (ATC) elements of aircraft control were extensively researched. The TANDAM system concept was then specified, and the crew interface and associated systems described. Additionally, three descent and approach scenarios were devised in order to illustrate the principal functions of the TANDAM system concept in relation to the crew, the aircraft, and ATC. A plan for the evaluation of the TANDAM system was established. The guidelines were developed based on reviews of relevant literature, and on experience gained in the design effort.

  11. Designing a Hypertext Electronic Encyclopedia.

    ERIC Educational Resources Information Center

    Glushko, Robert J.

    1990-01-01

    Identifies five design requirements for electronic encyclopedias, based on the content and structure of printed encyclopedias and the uses that people make of them. Three existing hypertext programs (Owl International's Guide, HyperCard, and HyperTIES) are compared in terms of their ability to satisfy these design requirements. (CLB)

  12. Designing Electronic Collaborative Learning Environments

    ERIC Educational Resources Information Center

    Kirschner, Paul; Strijbos, Jan-Willem; Kreijns, Karel; Beers, Pieter Jelle

    2004-01-01

    Electronic collaborative learning environments for learning and working are in vogue. Designers design them according to their own constructivist interpretations of what collaborative learning is and what it should achieve. Educators employ them with different educational approaches and in diverse situations to achieve different ends. Students use…

  13. A new design concept for an automated peanut processing facility

    SciTech Connect

    Ertas, A.; Tanju, B.T.; Fair, W.T.; Butts, C.

    1996-12-31

    Peanut quality is a major concern in all phases of the peanut industry from production to manufacturing. Postharvest processing of peanuts can have profound effects on the quality and safety of peanut food products. Curing is a key step in postharvest processing. Curing peanuts improperly can significantly reduce quality and result in substantial losses to both farmers and processors. The conventional drying system designed in the 1960s is still being used in the processing of peanuts today. The objective of this paper is to design and develop a new automated peanut drying system for dry climates capable of handling approximately 20 million lbm of peanuts per harvest season.

  14. Automating Assessment of Lifestyle Counseling in Electronic Health Records

    PubMed Central

    Hazlehurst, Brian L.; Lawrence, Jean M.; Donahoo, William T.; Sherwood, Nancy E; Kurtz, Stephen E; Xu, Stan; Steiner, John F

    2015-01-01

    Background Numerous population-based surveys indicate that overweight and obese patients can benefit from lifestyle counseling during routine clinical care. Purpose To determine if natural language processing (NLP) could be applied to information in the electronic health record (EHR) to automatically assess delivery of counseling related to weight management in clinical health care encounters. Methods The MediClass system with NLP capabilities was used to identify weight management counseling in EHR encounter records. Knowledge for the NLP application was derived from the 5As framework for behavior counseling: Ask (evaluate weight and related disease), Advise at-risk patients to lose weight, Assess patients’ readiness to change behavior, Assist through discussion of weight loss methods and programs, and Arrange follow-up efforts including referral. Using samples of EHR data from two health systems covering the period 1/1/2007-3/31/2011, the accuracy of the MediClass processor for identifying these counseling elements was evaluated in post-partum visits of 600 women with gestational diabetes mellitus (GDM), with manual chart review as the gold standard. Data were analyzed in 2013. Results Mean sensitivity and specificity for each of the 5As compared to the gold standard were at or above 85%, with the exception of sensitivity for Assist, which was 40% and 60%, respectively, in the two health systems. The automated method identified many valid cases of Assist not identified in the gold standard. Conclusions The MediClass processor has performance capability sufficiently similar to human abstractors to permit automated assessment of counseling for weight loss in post-partum encounter records. PMID:24745635
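
    MediClass is a full NLP system with a curated knowledge base; purely to illustrate the shape of the task, mapping encounter text to the 5As categories, here is a toy keyword-based sketch. The keyword patterns and the example note are invented for illustration and are far simpler than the actual application.

    ```python
    import re

    # Toy keyword rules for the 5As; the real knowledge base is far richer than this.
    FIVE_AS = {
        "Ask":     [r"\bBMI\b", r"\bweight\b", r"\bobese|obesity\b"],
        "Advise":  [r"advised .* (lose|losing) weight", r"recommended weight loss"],
        "Assess":  [r"readiness to change", r"willing to (change|lose weight)"],
        "Assist":  [r"weight loss program", r"diet plan", r"nutrition referral discussed"],
        "Arrange": [r"follow[- ]up", r"referr(al|ed) to"],
    }

    def detect_5as(note_text):
        """Return the set of 5As elements whose keyword patterns appear in the note."""
        found = set()
        for label, patterns in FIVE_AS.items():
            if any(re.search(p, note_text, flags=re.IGNORECASE) for p in patterns):
                found.add(label)
        return found

    note = ("Patient's BMI is 32. Advised patient on losing weight and discussed a "
            "weight loss program. Follow-up visit scheduled in 3 months.")
    print(sorted(detect_5as(note)))
    ```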

  15. An automated quality assessor for Ada object-oriented designs

    NASA Technical Reports Server (NTRS)

    Bailin, Sidney C.

    1988-01-01

    A tool for evaluating object-oriented designs (OODs) for Ada software is described. The tool assumes a design expressed as a hierarchy of object diagrams. A design of this type identifies the objects of a system, an interface to each object, and the usage relationships between objects. When such a design is implemented in Ada, objects become packages, interfaces become package specifications, and usage relationships become Ada 'with' clauses and package references. An automated quality assessor has been developed that is based on flagging undesirable design constructs. For convenience, distinctions are made among three levels of severity: questionable, undesirable, and hazardous. A questionable construct is one that may well be appropriate. An undesirable construct is one that should be changed because it is potentially harmful to the reliability, maintainability, or reusability of the software. A hazardous construct is one that is undesirable and that introduces a high level of risk.
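
    The tool's actual rule base operates on object diagrams and is not listed in the abstract; as a schematic illustration of flagging constructs at three severity levels, a rule-driven scanner over Ada-like source text can be sketched as follows. The specific rules and the sample package are invented examples, not the tool's real checks.

    ```python
    import re

    # Illustrative rules: (severity, description, pattern applied to each source line).
    RULES = [
        ("questionable", "use clause widens visibility",
                         re.compile(r"^\s*use\s+\w+", re.I)),
        ("undesirable",  "package body depends on an unrelated package (extra 'with')",
                         re.compile(r"^\s*with\s+Text_IO", re.I)),
        ("hazardous",    "unchecked conversion bypasses type safety",
                         re.compile(r"Unchecked_Conversion", re.I)),
    ]

    def assess(ada_source):
        """Return (severity, line number, description) for every flagged construct."""
        findings = []
        for lineno, line in enumerate(ada_source.splitlines(), start=1):
            for severity, description, pattern in RULES:
                if pattern.search(line):
                    findings.append((severity, lineno, description))
        return findings

    sample = """with Text_IO;
    package body Sensor is
       use Text_IO;
       function Raw is new Unchecked_Conversion (Integer, Float);
    end Sensor;"""
    for severity, lineno, msg in assess(sample):
        print(f"{severity:12s} line {lineno}: {msg}")
    ```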

  16. Interchange of electronic design through VHDL and EIS

    NASA Technical Reports Server (NTRS)

    Wallace, Richard M.

    1987-01-01

    The need for both robust and unambiguous electronic designs is a direct requirement of the astonishing growth in design and manufacturing capability during recent years. In order to manage the plethora of designs, and have the design data both interchangeable and interoperable, the Very High Speed Integrated Circuits (VHSIC) program is developing two major standards for the electronic design community. The VHSIC Hardware Description Language (VHDL) is designed to be the lingua franca for transmission of design data between designers and their environments. The Engineering Information System (EIS) is designed to ease the integration of data between diverse design automation systems. This paper describes the rationale for these two standards and how they provide a synergistic expressive capability across the macrocosm of design environments.

  17. The CADSS design automation system. [computerized design language for small digital systems

    NASA Technical Reports Server (NTRS)

    Franke, E. A.

    1973-01-01

    This research was designed to implement and extend a previously defined design automation system for the design of small digital structures. A description is included of the higher level language developed to describe systems as a sequence of register transfer operations. The system simulator which is used to determine if the original description is correct is also discussed. The design automation system produces tables describing the state transitions of the system and the operation of all registers. In addition, all Boolean equations specifying system operation are minimized and converted to NAND gate structures. Suggestions for further extensions to the system are also given.
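
    As an illustration of the last step the abstract mentions, converting minimized Boolean equations into NAND-only structures, the standard De Morgan rewriting can be sketched as follows. The tuple-based expression representation is an assumption chosen for the example, not the CADSS internal form.

    ```python
    # Boolean expressions as nested tuples: ('AND', a, b), ('OR', a, b), ('NOT', a), or a variable name.
    def to_nand(expr):
        """Rewrite an AND/OR/NOT expression into an equivalent NAND-only structure."""
        if isinstance(expr, str):
            return expr
        op = expr[0]
        if op == "NOT":
            a = to_nand(expr[1])
            return ("NAND", a, a)                                # NOT a  ==  a NAND a
        if op == "AND":
            a, b = to_nand(expr[1]), to_nand(expr[2])
            n = ("NAND", a, b)
            return ("NAND", n, n)                                # a AND b  ==  NOT(a NAND b)
        if op == "OR":
            a, b = to_nand(expr[1]), to_nand(expr[2])
            return ("NAND", ("NAND", a, a), ("NAND", b, b))      # De Morgan's law
        raise ValueError(f"unknown operator: {op}")

    # Example: the minimized equation  Q = (A AND B) OR (NOT C)
    print(to_nand(("OR", ("AND", "A", "B"), ("NOT", "C"))))
    ```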

  18. Automated Processing of ISIS Topside Ionograms into Electron Density Profiles

    NASA Technical Reports Server (NTRS)

    Reinisch, Bodo W.; Huang, Xueqin; Bilitza, Dieter; Hills, H. Kent

    2004-01-01

    Modeling of the topside ionosphere has for the most part relied on just a few years of data from topside sounder satellites. The widely used Bent et al. (1972) model, for example, is based on only 50,000 Alouette 1 profiles. The International Reference Ionosphere (IRI) (Bilitza, 1990, 2001) uses an analytical description of the graphs and tables provided by Bent et al. (1972). The Alouette 1, 2 and ISIS 1, 2 topside sounder satellites of the sixties and seventies were ahead of their time in terms of the sheer volume of data obtained and in terms of the computer and software requirements for data analysis. As a result, only a small percentage of the collected topside ionograms was converted into electron density profiles. Recently, a NASA-funded data restoration project has undertaken and is continuing the process of digitizing the Alouette/ISIS ionograms from the analog 7-track tapes. Our project involves the automated processing of these digital ionograms into electron density profiles. The project accomplished a set of important goals that will have a major impact on understanding and modeling of the topside ionosphere: (1) The TOPside Ionogram Scaling and True height inversion (TOPIST) software was developed for the automated scaling and inversion of topside ionograms. (2) The TOPIST software was applied to the over 300,000 ISIS-2 topside ionograms that had been digitized in the framework of a separate AISRP project (PI: R.F. Benson). (3) The new TOPIST-produced database of global electron density profiles for the topside ionosphere was made publicly available through NASA's National Space Science Data Center (NSSDC) ftp archive. (4) Earlier Alouette 1, 2 and ISIS 1, 2 data sets of electron density profiles from manual scaling of selected sets of ionograms were converted from a highly compressed binary format into a user-friendly ASCII format and made publicly available through nssdcftp.gsfc.nasa.gov. The new database for the topside

  19. An hierarchical system architecture for automated design, fabrication, and repair

    NASA Technical Reports Server (NTRS)

    Cliff, R. A.

    1981-01-01

    The architecture of an automated system which has the following properties is described: (1) if it is presented with a final product specification (within its capabilities) it will do the detailed design (all the way down to the raw materials if need be) and then produce that product; (2) if a faulty final product is presented to the system, it will repair it. Interesting extensions of this architecture would be the ability to add fabricator nodes when required and the ability to add entire ranks when required. This sort of system would be a useful component of a self-replicating system (used in space exploration).

  20. An automated approach to magnetic divertor configuration design

    NASA Astrophysics Data System (ADS)

    Blommaert, M.; Dekeyser, W.; Baelmans, M.; Gauger, N. R.; Reiter, D.

    2015-01-01

    Automated methods based on optimization can greatly assist computational engineering design in many areas. In this paper an optimization approach to the magnetic design of a nuclear fusion reactor divertor is proposed and applied to a tokamak edge magnetic configuration in a first feasibility study. The approach is based on reduced models for magnetic field and plasma edge, which are integrated with a grid generator into one sensitivity code. The design objective chosen here for demonstrative purposes is to spread the divertor target heat load as much as possible over the entire target area. Constraints on the separatrix position are introduced to eliminate physically irrelevant magnetic field configurations during the optimization cycle. A gradient projection method is used to ensure stable cost function evaluations during optimization. The concept is applied to a configuration with typical Joint European Torus (JET) parameters and it automatically provides plausible configurations with reduced heat load.

  1. Semi-Automated Neuron Boundary Detection and Nonbranching Process Segmentation in Electron Microscopy Images

    SciTech Connect

    Jurrus, Elizabeth R.; Watanabe, Shigeki; Giuly, Richard J.; Paiva, Antonio R.; Ellisman, Mark H.; Jorgensen, Erik M.; Tasdizen, Tolga

    2013-01-01

    Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of this data makes human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images, and for visualizing the results in three dimensions. It combines automated segmentation techniques with a graphical user interface for correction of mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes.
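
    The full pipeline (membrane classifiers plus interactive correction) is beyond a short example; the 3D linking step alone, connecting 2D regions between adjacent sections by how strongly they overlap, can be indicated roughly as below. The toy label images and the maximum-overlap criterion are illustrative assumptions standing in for the paper's correlation of regions between sections.

    ```python
    import numpy as np

    def link_sections(labels_a, labels_b):
        """Map each labeled region in section A to its maximum-overlap region in section B.

        labels_a, labels_b: 2D integer arrays of the same shape, 0 = background.
        """
        links = {}
        for region in np.unique(labels_a):
            if region == 0:
                continue
            overlap = labels_b[labels_a == region]          # labels under this region's footprint
            overlap = overlap[overlap != 0]
            if overlap.size:
                counts = np.bincount(overlap)
                links[int(region)] = int(np.argmax(counts)) # best-overlapping region in B
        return links

    # Two toy 2D sections with hand-drawn regions.
    a = np.zeros((6, 6), int); a[1:3, 1:4] = 1; a[4:6, 2:5] = 2
    b = np.zeros((6, 6), int); b[1:4, 2:5] = 7; b[4:6, 1:4] = 9
    print(link_sections(a, b))   # {1: 7, 2: 9}
    ```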

  2. Semi-automated Neuron Boundary Detection and Nonbranching Process Segmentation in Electron Microscopy Images

    PubMed Central

    Jurrus, Elizabeth; Watanabe, Shigeki; Giuly, Richard J.; Paiva, Antonio R. C.; Ellisman, Mark H.; Jorgensen, Erik M.; Tasdizen, Tolga

    2013-01-01

    Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of this data makes human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images, and for visualizing the results in three dimensions. It combines automated segmentation techniques with a graphical user interface for correction of mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes. PMID:22644867

  3. 21 CFR 111.30 - What requirements apply to automated, mechanical, or electronic equipment?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false What requirements apply to automated, mechanical..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CURRENT GOOD MANUFACTURING... Utensils § 111.30 What requirements apply to automated, mechanical, or electronic equipment? For...

  4. 21 CFR 111.30 - What requirements apply to automated, mechanical, or electronic equipment?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false What requirements apply to automated, mechanical..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CURRENT GOOD MANUFACTURING... Utensils § 111.30 What requirements apply to automated, mechanical, or electronic equipment? For...

  5. 21 CFR 111.30 - What requirements apply to automated, mechanical, or electronic equipment?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false What requirements apply to automated, mechanical..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CURRENT GOOD MANUFACTURING... Utensils § 111.30 What requirements apply to automated, mechanical, or electronic equipment? For...

  6. 21 CFR 111.30 - What requirements apply to automated, mechanical, or electronic equipment?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false What requirements apply to automated, mechanical..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CURRENT GOOD MANUFACTURING... Utensils § 111.30 What requirements apply to automated, mechanical, or electronic equipment? For...

  7. 21 CFR 111.30 - What requirements apply to automated, mechanical, or electronic equipment?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What requirements apply to automated, mechanical..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CURRENT GOOD MANUFACTURING... Utensils § 111.30 What requirements apply to automated, mechanical, or electronic equipment? For...

  8. Design, Development, and Commissioning of a Substation Automation Laboratory to Enhance Learning

    ERIC Educational Resources Information Center

    Thomas, M. S.; Kothari, D. P.; Prakash, A.

    2011-01-01

    Automation of power systems is gaining momentum across the world, and there is a need to expose graduate and undergraduate students to the latest developments in hardware, software, and related protocols for power automation. This paper presents the design, development, and commissioning of an automation lab to facilitate the understanding of…

  9. FRACSAT: Automated design synthesis for future space architectures

    NASA Astrophysics Data System (ADS)

    Mackey, R.; Uckun, S.; Do, Minh; Shah, J.

    This paper describes the algorithmic basis and development of FRACSAT (FRACtionated Spacecraft Architecture Toolkit), a new approach to conceptual design, cost-benefit analysis, and detailed trade studies for space systems. It provides an automated capability for exploration of candidate spacecraft architectures, leading users to near-optimal solutions with respect to user-defined requirements, risks, and program uncertainties. FRACSAT utilizes a sophisticated planning algorithm (PlanVisioner) to perform a quasi-exhaustive search for candidate architectures, constructing candidates from an extensible model-based representation of space system components and functions. These candidates are then evaluated with emphasis on the business case, computing the expected design utility and system costs as well as risk, presenting the user with a greatly reduced selection of candidates. The user may further refine the search according to cost or benefit uncertainty, adaptability, or other performance metrics as needed.

  10. Modular implementation of a digital hardware design automation system

    NASA Astrophysics Data System (ADS)

    Masud, M.

    An automation system based on AHPL (A Hardware Programming Language) was developed. The project may be divided into three distinct phases: (1) Upgrading of AHPL to make it more universally applicable; (2) Implementation of a compiler for the language; and (3) Illustration of how the compiler may be used to support several phases of design activities. Several new features were added to AHPL. These include: application-dependent parameters, multiple clocks, asynchronous results, functional registers and primitive functions. The new language, called Universal AHPL, has been defined rigorously. The compiler design is modular. The parsing is done by an automatic parser generated from the SLR(1) BNF grammar of the language. The compiler produces two databases from the AHPL description of a circuit. The first one is a tabular representation of the circuit, and the second one is a detailed interconnection linked list. The two databases provide a means to interface the compiler to application-dependent CAD systems.

  11. Automating the packing heuristic design process with genetic programming.

    PubMed

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John

    2012-01-01

    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains. PMID:21609273
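
    The paper evolves complete heuristics with genetic programming; a much-reduced sketch of the same idea, searching over a parameterized bin-scoring rule with a plain genetic algorithm and evaluating each candidate by how many bins it uses, might look like the following. The linear scoring form, GA settings, and random one-dimensional instances are all simplifying assumptions and are not the authors' GP system.

    ```python
    import random

    random.seed(0)
    CAPACITY = 1.0
    INSTANCES = [[random.uniform(0.1, 0.7) for _ in range(40)] for _ in range(5)]

    def pack(items, weights):
        """Place each item in the open bin minimizing a weighted score; open a new bin if none fits."""
        w_residual, w_fill = weights
        bins = []
        for item in items:
            candidates = [(w_residual * (CAPACITY - level - item) + w_fill * level, i)
                          for i, level in enumerate(bins) if level + item <= CAPACITY]
            if candidates:
                bins[min(candidates)[1]] += item
            else:
                bins.append(item)
        return len(bins)

    def fitness(weights):
        return sum(pack(inst, weights) for inst in INSTANCES)    # fewer bins is better

    def evolve(pop_size=20, generations=30):
        pop = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness)
            parents = pop[: pop_size // 2]
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)                 # blend crossover plus mutation
                children.append(tuple((x + y) / 2 + random.gauss(0, 0.1) for x, y in zip(a, b)))
            pop = parents + children
        return min(pop, key=fitness)

    best = evolve()
    print("evolved scoring weights:", best, "total bins:", fitness(best))
    ```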

  12. The design of an automated electrolytic enrichment apparatus for tritium

    SciTech Connect

    Myers, J.L.

    1994-12-01

    The Radiation Analytical Sciences Section at Lawrence Livermore National Laboratory performs analysis of low-level tritium concentrations in various natural water samples from the Tri-Valley Area, DOE Nevada Test Site, Site 300 in Tracy, CA, and various other places around the world. Low levels of tritium, a radioactive isotope of hydrogen, are pre-concentrated in the RAS laboratory using an electrolytic enrichment apparatus. Later these enriched waters are analyzed by liquid scintillation counting to determine the activity of tritium. The enrichment procedure and the subsequent purification process by vacuum distillation are currently undertaken manually, hence being highly labor-intensive. The whole process typically takes about 2 to 3 weeks to complete a batch of 30 samples, with dedicated personnel operating the process. The goal is to automate the entire process, specifically placing the operation under PC-based LabVIEW control with real-time monitoring capability. My involvement was in the design and fabrication of a prototypical automated electrolytic enrichment cell. Work will be done on optimizing the electrolytic process by assessing the different parameters of the enrichment procedure. Hardware and software development have also been an integral component of this project.

  13. Automated Design of Restraint Layer of an Inflatable Vessel

    NASA Technical Reports Server (NTRS)

    Spexarth, Gary

    2007-01-01

    A Mathcad computer program largely automates the design and analysis of the restraint layer (the primary load-bearing layer) of an inflatable vessel that consists of one or more sections having cylindrical, toroidal, and/or spherical shape(s). A restraint layer typically comprises webbing in the form of multiple straps. The design task includes choosing indexing locations along the straps, computing the load at every location in each strap, computing the resulting stretch at each location, and computing the amount of undersizing required of each strap so that, once the vessel is inflated and the straps thus stretched, the vessel can be expected to assume the desired shape. Prior to the development of this program, the design task was performed by use of a difficult-to-use spreadsheet program that required manual addition of rows and columns depending on the numbers of strap rows and columns of a given design. In contrast, this program is completely parametric and includes logic that automatically adds or deletes rows and columns as needed. With minimal input from the user, this program automatically computes indexing locations, strap lengths, undersizing requirements, and all design data required to produce detailed drawings and assembly procedures. It also generates textual comments that help the user understand the calculations.
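
    The Mathcad program itself is specific to the vessel geometry it was written for; the core per-strap bookkeeping it automates (load, stretch, and undersizing) can be indicated with a very simplified sketch. The hoop-load formula, linear strap stiffness, and all numbers below are illustrative assumptions, not values or methods from the actual tool.

    ```python
    import math

    # Illustrative cylindrical section: internal pressure, radius, and strap layout (all assumed).
    pressure = 70.0e3          # Pa, hypothetical operating pressure
    radius = 1.5               # m
    length = 4.0               # m
    n_hoop_straps = 24         # straps assumed to share the hoop load over the cylinder length
    strap_stiffness = 2.0e6    # N per unit strain, assumed linear webbing stiffness

    # Simplified load path: hoop load shared equally by the hoop straps.
    load_per_strap = pressure * radius * length / n_hoop_straps   # N
    strain = load_per_strap / strap_stiffness                     # stretch per unit length
    strap_length_inflated = 2.0 * math.pi * radius                # target circumference when inflated
    undersize = strap_length_inflated * strain                    # cut each strap this much short

    print(f"load per strap : {load_per_strap / 1e3:.1f} kN")
    print(f"strain         : {strain * 100:.2f} %")
    print(f"undersizing    : {undersize * 1000:.1f} mm on a {strap_length_inflated:.2f} m strap")
    ```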

  14. Automated optimum design of wing structures. Deterministic and probabilistic approaches

    NASA Technical Reports Server (NTRS)

    Rao, S. S.

    1982-01-01

    The automated optimum design of airplane wing structures subjected to multiple behavior constraints is described. The structural mass of the wing is considered the objective function. The maximum stress, wing tip deflection, root angle of attack, and flutter velocity during the pull up maneuver (static load), the natural frequencies of the wing structure, and the stresses induced in the wing structure due to landing and gust loads are suitably constrained. Both deterministic and probabilistic approaches are used for finding the stresses induced in the airplane wing structure due to landing and gust loads. A wing design is represented by a uniform beam with a cross section in the form of a hollow symmetric double wedge. The airfoil thickness and chord length are the design variables, and a graphical procedure is used to find the optimum solutions. A supersonic wing design is represented by finite elements. The thicknesses of the skin and the web and the cross sectional areas of the flanges are the design variables, and nonlinear programming techniques are used to find the optimum solution.
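
    The original study uses graphical and nonlinear programming procedures tied to the specific wing models described; to indicate the general form of such an automated optimum design, minimizing structural mass subject to stress and deflection constraints, here is a hedged sketch using a simple cantilever-beam stand-in. The beam model, loads, limits, and the SLSQP optimizer are assumptions for illustration, not the paper's formulation.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Stand-in wing: a rectangular-section cantilever of length L under a tip load P.
    L, P, E, rho = 5.0, 2.0e4, 70e9, 2700.0          # m, N, Pa, kg/m^3 (aluminium-like values)
    SIGMA_MAX, DEFLECTION_MAX = 250e6, 0.10          # allowable stress (Pa) and tip deflection (m)

    def mass(x):
        b, h = x                                      # design variables: section width and depth (m)
        return rho * b * h * L

    def stress(x):
        b, h = x
        I = b * h**3 / 12.0
        return P * L * (h / 2.0) / I                  # maximum bending stress at the root

    def deflection(x):
        b, h = x
        I = b * h**3 / 12.0
        return P * L**3 / (3.0 * E * I)               # tip deflection of a cantilever

    constraints = [
        {"type": "ineq", "fun": lambda x: SIGMA_MAX - stress(x)},
        {"type": "ineq", "fun": lambda x: DEFLECTION_MAX - deflection(x)},
    ]
    result = minimize(mass, x0=[0.05, 0.20], bounds=[(0.01, 0.5), (0.05, 0.6)],
                      method="SLSQP", constraints=constraints)
    print("optimum (b, h):", result.x, "mass:", mass(result.x), "kg")
    ```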

  15. Relocatable, Automated Cost-Benefit Analysis for Marine Sensor Network Design

    PubMed Central

    D’Este, Claire; de Souza, Paulo; Sharman, Chris; Allen, Simon

    2012-01-01

    When designing sensor networks, we need to ensure they produce representative and relevant data, but this must be offset by the financial cost of placing sensors. We describe a novel automated method for generating and combining cost and benefit values to decide on the best sensor locations using information about the specific constraints available in most coastal locations. Costs in maintenance, negotiation, equipment, exposure and communication are estimated using hydrodynamic models and Electronic Navigation Charts. Benefits in maximum coverage and reducing overall error are also determined using model output. This method demonstrates equivalent accuracy at predicting the whole system to expert-chosen locations, whilst significantly reducing the estimated costs. PMID:22736982
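
    The published method derives its costs from hydrodynamic models and Electronic Navigation Charts; the selection step itself, trading estimated cost against coverage benefit, can be indicated with a small greedy sketch. The candidate sites, cost and benefit numbers, and the greedy rule are illustrative assumptions, not the authors' algorithm.

    ```python
    # Candidate sensor sites: estimated total cost and the set of regions each covers (all assumed).
    CANDIDATES = {
        "reef_east":  (4.0, {"R1", "R2"}),
        "channel":    (7.5, {"R2", "R3", "R4"}),
        "jetty":      (2.5, {"R1"}),
        "offshore_a": (9.0, {"R4", "R5", "R6"}),
        "bay_mouth":  (3.5, {"R3", "R5"}),
    }
    BENEFIT_PER_REGION = 3.0   # value assigned to covering one region (assumed)

    def select_sites(candidates, budget):
        """Greedy cost-benefit selection: repeatedly add the site with the best marginal net benefit."""
        chosen, covered, spent = [], set(), 0.0
        while True:
            best = None
            for name, (cost, regions) in candidates.items():
                if name in chosen or spent + cost > budget:
                    continue
                gain = BENEFIT_PER_REGION * len(regions - covered) - cost
                if best is None or gain > best[0]:
                    best = (gain, name, cost, regions)
            if best is None or best[0] <= 0:
                break
            _, name, cost, regions = best
            chosen.append(name); covered |= regions; spent += cost
        return chosen, covered, spent

    print(select_sites(CANDIDATES, budget=15.0))
    ```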

  16. [Design of an Incremental and Open Laboratory Automation System].

    PubMed

    Xie, Chuanfen; Chen, Yueping; Wang, Zhihong

    2015-07-01

    Recent years have witnessed great development of TLA (Total Laboratory Automation) technology; however, its application has been limited by high cost and by poor openness to other vendors' instruments. Specifically, the initial purchase of the equipment requires a large capital investment, and a new system can rarely be made compatible with existing instruments. This paper proposes a new approach to system implementation: through incremental upgrades, the initial capital investment can be reduced, and through an open architecture and open interfaces, seamless connection of different devices can be achieved. The paper elaborates on the standards that an open architecture design should follow with respect to mechanics, electrical communication, and information interaction, as well as the key technical points of system implementation. PMID:26665947

  17. Embedded design based virtual instrument program for positron beam automation

    NASA Astrophysics Data System (ADS)

    Jayapandian, J.; Gururaj, K.; Abhaya, S.; Parimala, J.; Amarendra, G.

    2008-10-01

    Automation of a positron beam experiment with a single-chip embedded design using a programmable system on chip (PSoC), which provides easy interfacing of the high-voltage DC power supply, is reported. A Virtual Instrument (VI) control program written in Visual Basic 6.0 provides the following functions: (i) adjustment of the sample high voltage by interacting with the programmed PSoC hardware, (ii) control of a personal computer (PC) based multichannel analyzer (MCA) card for energy spectroscopy, (iii) analysis of the obtained spectrum to extract the relevant line shape parameters, (iv) plotting of relevant parameters, and (v) saving the file in the appropriate format. The present study highlights the features of the PSoC hardware module as well as the control of the MCA and other units through programming in Visual Basic.

  18. An Innovative Method of Teaching Electronic System Design with PSoC

    ERIC Educational Resources Information Center

    Ye, Zhaohui; Hua, Chengying

    2012-01-01

    Programmable system-on-chip (PSoC), which provides a microprocessor and programmable analog and digital peripheral functions in a single chip, is very convenient for mixed-signal electronic system design. This paper presents the experience of teaching contemporary mixed-signal electronic system design with PSoC in the Department of Automation,…

  19. Automated design of multiphase space missions using hybrid optimal control

    NASA Astrophysics Data System (ADS)

    Chilan, Christian Miguel

    A modern space mission is assembled from multiple phases or events such as impulsive maneuvers, coast arcs, thrust arcs and planetary flybys. Traditionally, a mission planner would resort to intuition and experience to develop a sequence of events for the multiphase mission and to find the space trajectory that minimizes propellant use by solving the associated continuous optimal control problem. This strategy, however, will most likely yield a sub-optimal solution, as the problem is sophisticated for several reasons. For example, the number of events in the optimal mission structure is not known a priori and the system equations of motion change depending on what event is current. In this work a framework for the automated design of multiphase space missions is presented using hybrid optimal control (HOC). The method developed uses two nested loops: an outer-loop that handles the discrete dynamics and finds the optimal mission structure in terms of the categorical variables, and an inner-loop that performs the optimization of the corresponding continuous-time dynamical system and obtains the required control history. Genetic algorithms (GA) and direct transcription with nonlinear programming (NLP) are introduced as methods of solution for the outer-loop and inner-loop problems, respectively. Automation of the inner-loop, continuous optimal control problem solver, required two new technologies. The first is a method for the automated construction of the NLP problems resulting from the use of a direct solver for systems with different structures, including different numbers of categorical events. The method assembles modules, consisting of parameters and constraints appropriate to each event, sequentially according to the given mission structure. The other new technology is for a robust initial guess generator required by the inner-loop NLP problem solver. Two new methods were developed for cases including low-thrust trajectories. The first method, based on GA

  20. Electronic Alerts with Automated Consultations Promote Appropriate Antimicrobial Prescriptions

    PubMed Central

    Kim, Moonsuk; Kim, Chung-Jong; Song, Minkyo; Choe, Pyoeng Gyun; Park, Wan Beom; Bang, Ji Hwan; Hwang, Hee; Kim, Eu Suk; Park, Sang-Won; Kim, Nam Joong; Oh, Myoung-don; Kim, Hong Bin

    2016-01-01

    Background To promote appropriate antimicrobial use in bloodstream infections (BSIs), we initiated an intervention program consisting of electronic alerts and automated infectious diseases consultations in which the identification and antimicrobial susceptibility test (ID/AST) results were reported. Methods We compared the appropriateness of antimicrobial prescriptions and clinical outcomes in BSIs before and after initiation of the program. Appropriateness was assessed in terms of effective therapy, optimal therapy, de-escalation therapy, and intravenous to oral switch therapy. Results There were 648 BSI episodes in the pre-program period and 678 in the program period. The proportion of effective, optimal, and de-escalation therapies assessed 24 hours after the reporting of the ID/AST results increased from 87.8% (95% confidence interval [CI] 85.5–90.5), 64.4% (95% CI 60.8–68.1), and 10.0% (95% CI 7.5–12.6) in the pre-program period, respectively, to 94.4% (95% CI 92.7–96.1), 81.4% (95% CI 78.4–84.3), and 18.6% (95% CI 15.3–21.9) in the program period, respectively. Kaplan-Meier analyses and log-rank tests revealed that the time to effective (p<0.001), optimal (p<0.001), and de-escalation (p = 0.017) therapies were significantly different in the two periods. Segmented linear regression analysis showed the increase in the proportion of effective (p = 0.015), optimal (p<0.001), and de-escalation (p = 0.010) therapies at 24 hours after reporting, immediately after program initiation. No significant baseline trends or changes in trends were identified. There were no significant differences in time to intravenous to oral switch therapy, length of stay, and 30-day mortality rate. Conclusion This novel form of stewardship program based on intervention by infectious disease specialists and information technology improved antimicrobial prescriptions in BSIs. PMID:27532125

  1. Software Would Largely Automate Design of Kalman Filter

    NASA Technical Reports Server (NTRS)

    Chuang, Jason C. H.; Negast, William J.

    2005-01-01

    Embedded Navigation Filter Automatic Designer (ENFAD) is a computer program being developed to automate the most difficult tasks in designing embedded software to implement a Kalman filter in a navigation system. The most difficult tasks are selection of error states of the filter and tuning of filter parameters, which are time-consuming trial-and-error tasks that require expertise and rarely yield optimum results. An optimum selection of error states and filter parameters depends on navigation-sensor and vehicle characteristics, and on filter processing time. ENFAD would include a simulation module that would incorporate all possible error states with respect to a given set of vehicle and sensor characteristics. The first of two iterative optimization loops would vary the selection of error states until the best filter performance was achieved in Monte Carlo simulations. For a fixed selection of error states, the second loop would vary the filter parameter values until an optimal performance value was obtained. Design constraints would be satisfied in the optimization loops. Users would supply vehicle and sensor test data that would be used to refine digital models in ENFAD. Filter processing time and filter accuracy would be computed by ENFAD.
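
    ENFAD's optimization loops are described only at a high level; the underlying idea of the parameter-tuning loop, scoring candidate filter parameters in Monte Carlo simulation and keeping the best, can be sketched on a one-dimensional constant-velocity filter as follows. The truth model, noise levels, and grid search over the process-noise parameter are assumptions for illustration, not ENFAD's design.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    DT, N, R_MEAS, Q_TRUE = 1.0, 200, 4.0, 0.05

    def run_filter(q, zs, truth):
        """1-D constant-velocity Kalman filter; returns position RMSE against the truth."""
        F = np.array([[1.0, DT], [0.0, 1.0]])
        H = np.array([[1.0, 0.0]])
        Q = q * np.array([[DT**3 / 3, DT**2 / 2], [DT**2 / 2, DT]])
        x, P = np.zeros(2), np.eye(2) * 10.0
        errs = []
        for z, true_pos in zip(zs, truth):
            x, P = F @ x, F @ P @ F.T + Q                     # predict
            S = H @ P @ H.T + R_MEAS
            K = P @ H.T / S                                   # Kalman gain (scalar innovation)
            x = x + (K * (z - H @ x)).ravel()
            P = (np.eye(2) - K @ H) @ P                       # update
            errs.append(x[0] - true_pos)
        return float(np.sqrt(np.mean(np.square(errs))))

    def monte_carlo_rmse(q, runs=20):
        total = 0.0
        for _ in range(runs):
            accel = rng.normal(0.0, np.sqrt(Q_TRUE), N)        # random accelerations (truth dynamics)
            vel = np.cumsum(accel) * DT
            pos = np.cumsum(vel) * DT
            zs = pos + rng.normal(0.0, np.sqrt(R_MEAS), N)     # noisy position measurements
            total += run_filter(q, zs, pos)
        return total / runs

    # Outer loop: coarse grid search over the process-noise tuning parameter.
    candidates = [0.001, 0.01, 0.05, 0.1, 0.5]
    print("selected process-noise parameter q =", min(candidates, key=monte_carlo_rmse))
    ```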

  2. An automated approach to the design of decision tree classifiers

    NASA Technical Reports Server (NTRS)

    Argentiero, P.; Chin, P.; Beaudet, P.

    1980-01-01

    The classification of large dimensional data sets arising from the merging of remote sensing data with more traditional forms of ancillary data is considered. Decision tree classification, a popular approach to the problem, is characterized by the property that samples are subjected to a sequence of decision rules before they are assigned to a unique class. An automated technique for effective decision tree design which relies only on a priori statistics is presented. This procedure utilizes a set of two dimensional canonical transforms and Bayes table look-up decision rules. An optimal design at each node is derived based on the associated decision table. A procedure for computing the global probability of correct classification is also provided. An example is given in which class statistics obtained from an actual LANDSAT scene are used as input to the program. The resulting decision tree design has an associated probability of correct classification of 0.76, compared to the theoretically optimum 0.79 probability of correct classification associated with a full dimensional Bayes classifier. Recommendations for future research are included.

  3. The Automated Bicron Tester: Automated electronic instrument diagnostic, testing, and alignment system with records generation

    SciTech Connect

    Rao, G.S.; Maddox, S.R.; Turner, G.W.; Vandermolen, R.I.

    1995-11-01

    The Bicron Surveyor MX is a portable radiation monitoring instrument used by the Office of Radiation Protection at Oak Ridge National Laboratory. This instrument must be calibrated in order to assure reliable operation. A manual calibration procedure was developed, but it was time-consuming and repetitive. Therefore, an automated tester station that would allow the technicians to calibrate the instruments faster and more reliably was developed. With the automated tester station, calibration records and accountability could be generated and maintained automatically. This allows the technicians to concentrate on repairing defective units. The Automated Bicron Tester consists of an operator interface, an analog board, and a digital controller board. The operator interface panel allows the technician to communicate with the tester. The analog board has an analog-to-digital converter (ADC) that converts the signals from the instrument into digital data that the tester can manipulate. The digital controller board contains the circuitry to perform the test and to communicate the results to the host personal computer (PC). The tester station is connected to the unit under test through a special test harness that attaches to a header on the Bicron. The tester sends pulse trains to the Bicron and measures the resulting meter output. This is done to determine if the unit is functioning properly. The testers are connected to the host PC through an RS-485 serial line. The host PC polls all the tester stations that are connected to it and collects data from those that have completed a calibration. It logs these data and stores the record in a format ready for export to the Maintenance, Accountability, Jobs, and Inventory Control (MAJIC) database. It also prints a report. The programs for the Automated Bicron Tester and the host are written in the C language.

  4. Toward new design-rule-check of silicon photonics for automated layout physical verifications

    NASA Astrophysics Data System (ADS)

    Ismail, Mohamed; El Shamy, Raghi S.; Madkour, Kareem; Hammouda, Sherif; Swillam, Mohamed A.

    2015-02-01

    A simple analytical model is developed to estimate the power loss and time delay in photonic integrated circuits fabricated using standard SOI wafers. The model can be utilized in physical verification of the circuit layout to verify its feasibility for fabrication against given foundry specifications, and it allows new design rules to be added to the layout physical verification process in any electronic design automation (EDA) tool. The model is accurate when compared with a finite-element-based full-wave electromagnetic (EM) solver. Because the model is closed form, it circumvents the need for any EM solver during verification; as such it dramatically reduces the time of the verification process and enables a fast design-rule check.
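
    The paper's fitted model is not reproduced in the abstract; the kind of closed-form check it enables inside an EDA rule deck can be indicated with a rough sketch. The loss coefficients, group index, and budget numbers below are placeholders chosen for illustration, not the paper's values.

    ```python
    # Closed-form estimates for a routed SOI waveguide net (all coefficients are illustrative).
    C_VACUUM = 2.998e8          # m/s

    def waveguide_loss_db(length_cm, n_bends, n_crossings,
                          prop_loss_db_per_cm=2.0, bend_loss_db=0.01, crossing_loss_db=0.15):
        return prop_loss_db_per_cm * length_cm + bend_loss_db * n_bends + crossing_loss_db * n_crossings

    def waveguide_delay_ps(length_cm, group_index=4.2):
        return group_index * (length_cm * 1e-2) / C_VACUUM * 1e12

    def drc_check(net, loss_budget_db=3.0, delay_budget_ps=60.0):
        """Flag a routed net whose estimated loss or delay exceeds the rule-deck budget."""
        loss = waveguide_loss_db(net["length_cm"], net["bends"], net["crossings"])
        delay = waveguide_delay_ps(net["length_cm"])
        violations = []
        if loss > loss_budget_db:
            violations.append(f"loss {loss:.2f} dB > {loss_budget_db} dB")
        if delay > delay_budget_ps:
            violations.append(f"delay {delay:.1f} ps > {delay_budget_ps} ps")
        return violations

    net = {"length_cm": 1.2, "bends": 14, "crossings": 3}
    print(drc_check(net) or "net passes the photonic design-rule check")
    ```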

  5. Reliability-Based Electronics Shielding Design Tools

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; O'Neill, P. J.; Zang, T. A.; Pandolf, J. E.; Tripathi, R. K.; Koontz, Steven L.; Boeder, P.; Reddell, B.; Pankop, C.

    2007-01-01

    Shielding design on large human-rated systems allows minimization of radiation impact on electronic systems. Shielding design tools require adequate methods for evaluation of design layouts, guiding qualification testing, and adequate follow-up on final design evaluation.

  6. The automated design of materials far from equilibrium

    NASA Astrophysics Data System (ADS)

    Miskin, Marc Z.

    Automated design is emerging as a powerful concept in materials science. By combining computer algorithms, simulations, and experimental data, new techniques are being developed that start with high level functional requirements and identify the ideal materials that achieve them. This represents a radically different picture of how materials become functional in which technological demand drives material discovery, rather than the other way around. At the frontiers of this field, materials systems previously considered too complicated can start to be controlled and understood. Particularly promising are materials far from equilibrium. Material robustness, high strength, self-healing and memory are properties displayed by several materials systems that are intrinsically out of equilibrium. These and other properties could be revolutionary, provided they can first be controlled. This thesis conceptualizes and implements a framework for designing materials that are far from equilibrium. We show how, even in the absence of a complete physical theory, design from the top down is possible and lends itself to producing physical insight. As a prototype system, we work with granular materials: collections of athermal, macroscopic identical objects, since these materials function both as an essential component of industrial processes as well as a model system for many non-equilibrium states of matter. We show that by placing granular materials in the context of design, benefits emerge simultaneously for fundamental and applied interests. As first steps, we use our framework to design granular aggregates with extreme properties like high stiffness, and softness. We demonstrate control over nonlinear effects by producing exotic aggregates that stiffen under compression. Expanding on our framework, we conceptualize new ways of thinking about material design when automatic discovery is possible. We show how to build rules that link particle shapes to arbitrary granular packing

  7. Automated database design for large-scale scientific applications

    NASA Astrophysics Data System (ADS)

    Papadomanolakis, Stratos

    The need for large-scale scientific data management is today more pressing than ever, as modern sciences need to store and process terabyte-scale data volumes. Traditional systems, relying on filesystems and custom data access and processing code, do not scale for multi-terabyte datasets. Therefore, supporting today's data-driven sciences requires the development of new data management capabilities. This Ph.D. dissertation develops techniques that allow modern Database Management Systems (DBMS) to efficiently handle large scientific datasets. Several recent successful DBMS deployments target applications like astronomy, that manage collections of objects or observations (e.g. galaxies, spectra) and can easily store their data in a commercial relational DBMS. Query performance for such systems critically depends on the database physical design, the organization of database structures such as indexes and tables. This dissertation develops algorithms and tools for automating the physical design process. Our tools allow databases to tune themselves, providing efficient query execution in the presence of large data volumes and complex query workloads. For more complex applications dealing with multidimensional and time-varying data, standard relational DBMS are inadequate. Efficiently supporting such applications requires the development of novel indexing and query processing techniques. This dissertation develops an indexing technique for unstructured tetrahedral meshes, a multidimensional data organization used in finite element analysis applications. Our technique outperforms existing multidimensional indexing techniques and has the advantage that it can easily be integrated with standard DBMS, providing existing systems with the ability to handle spatial data with minor modifications.
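
    The dissertation's algorithms are not detailed in the abstract; the flavor of automated physical design, choosing which indexes to build given estimated per-query benefits and a storage budget, can be indicated with a small greedy sketch. The candidate indexes, cost estimates, and greedy rule are invented for illustration.

    ```python
    # Candidate indexes with an estimated storage cost (GB) and per-query benefit (seconds saved).
    CANDIDATES = {
        "idx_obj_ra_dec":  {"size_gb": 12.0, "benefit": {"cone_search": 45.0, "crossmatch": 20.0}},
        "idx_spectra_mjd": {"size_gb": 5.0,  "benefit": {"time_slice": 30.0}},
        "idx_obj_mag":     {"size_gb": 8.0,  "benefit": {"mag_cut": 25.0, "cone_search": 5.0}},
    }
    WORKLOAD = {"cone_search": 100, "crossmatch": 20, "time_slice": 50, "mag_cut": 40}  # query frequencies

    def total_benefit(index):
        return sum(WORKLOAD.get(q, 0) * saved for q, saved in CANDIDATES[index]["benefit"].items())

    def choose_indexes(storage_budget_gb):
        """Greedily pick indexes by benefit density (saved seconds per GB) within the budget."""
        chosen, used = [], 0.0
        ranked = sorted(CANDIDATES,
                        key=lambda i: total_benefit(i) / CANDIDATES[i]["size_gb"], reverse=True)
        for index in ranked:
            size = CANDIDATES[index]["size_gb"]
            if used + size <= storage_budget_gb:
                chosen.append(index)
                used += size
        return chosen, used

    print(choose_indexes(storage_budget_gb=15.0))
    ```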

  8. Electronic design with integrated circuits

    NASA Astrophysics Data System (ADS)

    Comer, D. J.

    The book is concerned with the application of integrated circuits and presents the material actually needed by the system designer to do an effective job. The operational amplifier (op amp) is discussed, taking into account the electronic amplifier, the basic op amp, the practical op amp, analog applications, and digital applications. Digital components are considered along with combinational logic, digital subsystems, the microprocessor, special circuits, communications, and integrated circuit building blocks. Attention is given to logic gates, logic families, multivibrators, the digital computer, digital methods, communicating with a computer, computer organization, register and timing circuits for data transfer, arithmetic circuits, memories, the microprocessor chip, the control unit, communicating with the microprocessor, examples of microprocessor architecture, programming a microprocessor, the voltage-controlled oscillator, the phase-locked loop, analog-to-digital conversion, amplitude modulation, frequency modulation, pulse and digital transmission, the semiconductor diode, the bipolar transistor, and the field-effect transistor.

  9. Automated Design of Noise-Minimal, Safe Rotorcraft Trajectories

    NASA Technical Reports Server (NTRS)

    Morris, Robert A.; Venable, K. Brent; Lindsay, James

    2012-01-01

    NASA and the international community are investing in the development of a commercial transportation infrastructure that includes the increased use of rotorcraft, specifically helicopters and aircraft such as a 40-passenger civil tiltrotor. Rotorcraft have a number of advantages over fixed wing aircraft, primarily in not requiring direct access to the primary fixed wing runways. As such they can operate at an airport without directly interfering with major air carrier and commuter aircraft operations. However, there is significant concern over the impact of noise on the communities surrounding the transportation facilities. In this paper we propose to address the rotorcraft noise problem by exploiting powerful search techniques coming from artificial intelligence, coupled with simulation and field tests, to design trajectories that are expected to reduce the amount of ground noise generated. This paper investigates the use of simulation based on predictive physical models to facilitate the search for low-noise trajectories using a class of automated search algorithms called local search. A novel feature of this approach is the ability to incorporate constraints into the problem formulation that address passenger safety and comfort.
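
    The abstract names local search as the automated search method; a generic hill-climbing loop of that kind is sketched below. The noise simulator, trajectory parameterization, and feasibility test are placeholders, not the paper's predictive physical models.

        # Illustrative local search for a low-noise trajectory under safety constraints.
        def local_search(initial, neighbors, noise, feasible, max_steps=1000):
            """Hill-climb toward lower simulated ground noise among feasible trajectories.

            neighbors(x) -> candidate perturbations of trajectory x
            noise(x)     -> simulated ground-noise metric (lower is better)
            feasible(x)  -> True if safety/comfort constraints hold
            """
            best, best_noise = initial, noise(initial)
            for _ in range(max_steps):
                candidates = [x for x in neighbors(best) if feasible(x)]
                if not candidates:
                    break
                x = min(candidates, key=noise)
                x_noise = noise(x)
                if x_noise < best_noise:
                    best, best_noise = x, x_noise
                else:
                    break  # local minimum reached
            return best, best_noise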

  10. DESIGN HANDBOOK FOR AUTOMATION OF ACTIVATED SLUDGE WASTEWATER TREATMENT PLANTS

    EPA Science Inventory

    This report is a systems engineering handbook for the automation of activated sludge wastewater treatment processes. Process control theory and application are discussed to acquaint the reader with terminology and fundamentals. Successful unit process control strategies currently...

  11. Electronic aids to conceptual design

    NASA Technical Reports Server (NTRS)

    Bouchard, Eugene E.

    1990-01-01

    Presented in viewgraph form are techniques to improve the conceptual design of complex systems. The paper discusses theory of design, flexible software tools for computer aided design, and methods for enhancing communication among design teams.

  12. Xyce parallel electronic simulator design.

    SciTech Connect

    Thornquist, Heidi K.; Rankin, Eric Lamont; Mei, Ting; Schiek, Richard Louis; Keiter, Eric Richard; Russo, Thomas V.

    2010-09-01

    This document is the Xyce Circuit Simulator developer guide. Xyce has been designed from the 'ground up' to be a SPICE-compatible, distributed memory parallel circuit simulator. While it is in many respects a research code, Xyce is intended to be a production simulator. As such, having software quality engineering (SQE) procedures in place to ensure a high level of code quality and robustness is essential. Version control, issue tracking, customer support, C++ style guidelines and the Xyce release process are all described. The Xyce Parallel Electronic Simulator has been under development at Sandia since 1999. Historically, Xyce has mostly been funded by ASC, and the original focus of Xyce development has primarily been circuits for nuclear weapons. However, this has not been the only focus and it is expected that the project will diversify. Like many ASC projects, Xyce is a group development effort, which involves a number of researchers, engineers, scientists, mathematicians and computer scientists. In addition to diversity of background, it is to be expected on long-term projects for there to be a certain amount of staff turnover, as people move on to different projects. As a result, it is very important that the project maintain high software quality standards. The point of this document is to formally document a number of the software quality practices followed by the Xyce team in one place. Also, it is hoped that this document will be a good source of information for new developers.

  13. Automated design synthesis of robotic/human workcells for improved manufacturing system design in hazardous environments

    SciTech Connect

    Williams, Joshua M.

    2012-06-12

    Manufacturing tasks that are deemed too hazardous for workers require the use of automation, robotics, and/or other remote handling tools. The associated hazards may be radiological or nonradiological, and based on the characteristics of the environment and processing, a design may necessitate robotic labor, human labor, or both. There are other factors, such as cost, ergonomics, maintenance, and efficiency, that also affect task allocation and other design choices. Handling the tradeoffs of these factors can be complex, and lack of experience can be an issue when trying to determine if and what feasible automation/robotics options exist. To address this problem, we utilize common engineering design approaches adapted more for manufacturing system design in hazardous environments. We limit our scope to the conceptual and embodiment design stages, specifically a computational algorithm for concept generation and early design evaluation. In regard to concept generation, we first develop the functional model or function structure for the process, using the common 'verb-noun' format for describing function. A common language or functional basis for manufacturing was developed and utilized to formalize function descriptions and guide rules for function decomposition. Potential components for embodiment are also grouped in terms of this functional language and are stored in a database. The properties of each component are given as quantitative and qualitative criteria. Operators are also rated for task-relevant criteria which are used to address task compatibility. Through the gathering of process requirements/constraints, construction of the component database, and development of the manufacturing basis and rule set, design knowledge is stored and available for computer use. Thus, once the higher level process functions are defined, the computer can automate the synthesis of new design concepts through alternating steps of embodiment and function structure updates
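
    The concept-generation step described above, matching 'verb-noun' functions against a component database subject to hazard compatibility, can be made concrete with a small enumeration sketch. The database layout, example functions, and component attributes below are illustrative assumptions, not the report's actual functional basis or rule set.

        from itertools import product

        # Hypothetical component database keyed by verb-noun functions.
        component_db = {
            ("transfer", "part"): [{"name": "6-axis robot", "cost": 3, "hazard_ok": True},
                                   {"name": "human operator", "cost": 1, "hazard_ok": False}],
            ("inspect", "weld"):  [{"name": "vision system", "cost": 2, "hazard_ok": True}],
        }

        def generate_concepts(function_structure, hazardous=True):
            """Enumerate component combinations that cover every function in the structure."""
            options = []
            for fn in function_structure:
                feasible = [c for c in component_db.get(fn, [])
                            if c["hazard_ok"] or not hazardous]
                if not feasible:
                    return []  # no available embodiment covers this function
                options.append(feasible)
            return [list(combo) for combo in product(*options)]

        concepts = generate_concepts([("transfer", "part"), ("inspect", "weld")])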

  14. INTEGRATING THE STORED GRAIN ADVISOR PRO EXPERT SYSTEM WITH AN AUTOMATED ELECTRONIC GRAIN PROBE TRAPPING SYSTEM

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Automation of grain sampling should help to increase the adoption of stored-grain integrated pest management. A new commercial electronic grain probe trap (OPI Insector™) has recently been marketed. To make accurate insect management decisions, managers need to know both the insect species and numbe...

  15. Managing Selection for Electronic Resources: Kent State University Develops a New System to Automate Selection

    ERIC Educational Resources Information Center

    Downey, Kay

    2012-01-01

    Kent State University has developed a centralized system that manages the communication and work related to the review and selection of commercially available electronic resources. It is an automated system that tracks the review process, provides selectors with price and trial information, and compiles reviewers' feedback about the resource. It…

  16. INTEGRATING THE STORED GRAIN ADVISOR PRO EXPERT SYSTEM WITH AN AUTOMATED ELECTRONIC GRAIN PROBE TRAPPING SYSTEM

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Automation of grain sampling should help to increase the adoption of stored-grain integrated pest management. A new commercial electronic grain probe trap (OPI Insector) has recently been marketed. To make accurate insect management decisions, managers need to know both the insect species and number...

  17. Electronic Resource Management and Design

    ERIC Educational Resources Information Center

    Abrams, Kimberly R.

    2015-01-01

    We have now reached a tipping point at which electronic resources comprise more than half of academic library budgets. Because of the increasing work associated with the ever-increasing number of e-resources, there is a trend to distribute work throughout the library even in the presence of an electronic resources department. In 2013, the author…

  18. Design and Implementation of an Open, Interoperable AutomatedDemand Response Infrastructure

    SciTech Connect

    Piette, Mary Ann; Kiliccote, Sila; Ghatikar, Girish

    2007-10-01

    This paper describes the concept for and lessons from the development and field-testing of an open, interoperable communications infrastructure to support automating demand response (DR). Automating DR allows greater levels of participation and improved reliability and repeatability of the demand response and customer facilities. Automated DR systems have been deployed for critical peak pricing and demand bidding and are being designed for real time pricing. The system is designed to generate, manage, and track DR signals between utilities and Independent System Operators (ISOs) to aggregators and end-use customers and their control systems.

  19. Model-Based Design of Air Traffic Controller-Automation Interaction

    NASA Technical Reports Server (NTRS)

    Romahn, Stephan; Callantine, Todd J.; Palmer, Everett A.; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    A model of controller and automation activities was used to design the controller-automation interactions necessary to implement a new terminal area air traffic management concept. The model was then used to design a controller interface that provides the requisite information and functionality. Using data from a preliminary study, the Crew Activity Tracking System (CATS) was used to help validate the model as a computational tool for describing controller performance.

  20. Investigation of expert system design approaches for electronic design environments

    NASA Astrophysics Data System (ADS)

    Poppens, Susan A.

    1987-12-01

    Various schemes available for the design of electronic systems were investigated. The information is to be incorporated into a knowledge base to determine approaches for a particular design. Various design methodologies are to be investigated for their appropriateness and application in this design environment. The second phase is to focus on the knowledge base gathered in the design effort for electronic design. This knowledge base is to be incorporated into a rule-based expert system which can be utilized by the design engineer in the design/development of functional specifications.

  1. Spacecraft electronics design for radiation tolerance

    SciTech Connect

    Rasmussen, R.D.

    1988-11-01

    Spacecraft electronics design for radiation tolerance is a complex subject, involving a detailed understanding of the environment, component hardening, and design susceptibility. This paper describes current design practices and discusses future trends in spacecraft electronics which are likely to alter traditional approaches. A summary of radiation effects and radiation tolerance requirements typically levied on spacecraft designs is provided. Methods of dealing with radiation are then described, followed by a discussion of testability issues.

  2. Preparing Electronic Clinical Data for Quality Improvement and Comparative Effectiveness Research: The SCOAP CERTAIN Automation and Validation Project

    PubMed Central

    Devine, Emily Beth; Capurro, Daniel; van Eaton, Erik; Alfonso-Cristancho, Rafael; Devlin, Allison; Yanez, N. David; Yetisgen-Yildiz, Meliha; Flum, David R.; Tarczy-Hornoch, Peter

    2013-01-01

    Background: The field of clinical research informatics includes creation of clinical data repositories (CDRs) used to conduct quality improvement (QI) activities and comparative effectiveness research (CER). Ideally, CDR data are accurately and directly abstracted from disparate electronic health records (EHRs), across diverse health-systems. Objective: Investigators from Washington State’s Surgical Care Outcomes and Assessment Program (SCOAP) Comparative Effectiveness Research Translation Network (CERTAIN) are creating such a CDR. This manuscript describes the automation and validation methods used to create this digital infrastructure. Methods: SCOAP is a QI benchmarking initiative. Data are manually abstracted from EHRs and entered into a data management system. CERTAIN investigators are now deploying Caradigm’s Amalga™ tool to facilitate automated abstraction of data from multiple, disparate EHRs. Concordance is calculated to compare automatically abstracted data to manually abstracted data. Performance measures are calculated between Amalga and each parent EHR. Validation takes place in repeated loops, with improvements made over time. When automated abstraction reaches the current benchmark for abstraction accuracy (95%), it will 'go live' at each site. Progress to Date: A technical analysis was completed at 14 sites. Five sites are contributing; the remaining sites prioritized meeting Meaningful Use criteria. Participating sites are contributing 15–18 unique data feeds, totaling 13 surgical registry use cases. Common feeds are registration, laboratory, transcription/dictation, radiology, and medications. Approximately 50% of 1,320 designated data elements are being automatically abstracted: 25% from structured data, 25% from text mining. Conclusion: By semi-automating data abstraction and conducting a rigorous validation, CERTAIN investigators will be able to conduct QI and CER while advancing the Learning Healthcare System. PMID:25848565
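
    The concordance and go-live checks described in the Methods can be expressed as a simple per-field agreement computation. The record layout and field names below are assumptions; only the 95% benchmark comes from the abstract.

        # Illustrative concordance between automatically and manually abstracted records,
        # both assumed to be dicts keyed by a case identifier.
        def concordance(auto_records, manual_records, fields):
            """Return per-field agreement rates between the two abstraction sources."""
            rates = {}
            for field in fields:
                total = agree = 0
                for case_id, manual in manual_records.items():
                    if case_id not in auto_records:
                        continue
                    total += 1
                    if auto_records[case_id].get(field) == manual.get(field):
                        agree += 1
                rates[field] = agree / total if total else float("nan")
            return rates

        def ready_to_go_live(rates, benchmark=0.95):
            """True when every field meets the abstraction-accuracy benchmark."""
            return all(rate >= benchmark for rate in rates.values())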

  3. Collaborative Electronic Notebooks as Electronic Records: Design Issues for the Secure Electronic Laboratory Notebook (ELN)

    SciTech Connect

    Myers, James D.

    2003-01-24

    Current electronic notebooks (EN) can be grouped roughly into two general classes - personal/group productivity tools and enterprise records/knowledge management systems. Personal/group productivity-oriented ENs extend the notebook metaphor in terms of supporting multimedia annotations, automating workflow and data processing, supporting simultaneous use by distributed researchers, providing displays on personal digital assistants

  4. Robotic design for an automated uranium solution enrichment system

    SciTech Connect

    Horley, E.C.; Beugelsdijk, T.; Biddle, R.S.; Bronisz, L.E.; Hansen, W.J.; Li, T.K.; Sampson, T.E.; Walton, G.

    1990-01-01

    A method to automate solution enrichment analysis by gamma-ray spectroscopy is being developed at Los Alamos National Laboratory. Both passive and x-ray fluorescence (XRF) analyses will be remotely performed to determine the amounts of {sup 235}U and total uranium in sample containers. A commercial laboratory robot will be used to process up to 40 batch and 8 priority samples in an unattended mode. Samples will be read by a bar-code reader to determine measurement requirements, then assayed by either or both of the gamma-ray and XRF instruments. The robot will be responsible for moving the sample containers and operating all shield doors and shutters. In addition to reducing hardware complexity, this feature will also allow manual operation of the instruments if the robot fails. This automated system will reduce personnel radiation exposure and increase the reliability and repeatability of the measurements.

  5. Development of automated power system management techniques. [spacecraft design

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Moser, R. L.; Skelly, L. A.; Weiner, H.

    1978-01-01

    The basic approach in the automated power system management (APSM) implementation is to use one central microprocessor for the overall power system supervision and several local microprocessors dedicated to one or more major subassemblies to perform simple monitoring and control functions. Communication between the central and each local processor is through a dedicated two-wire network employing serial data transfer. The block diagrams of the processors, the data bus characteristics, and the software functions and organization are presented.

  6. Designing of smart home automation system based on Raspberry Pi

    NASA Astrophysics Data System (ADS)

    Saini, Ravi Prakash; Singh, Bhanu Pratap; Sharma, Mahesh Kumar; Wattanawisuth, Nattapol; Leeprechanon, Nopbhorn

    2016-03-01

    Locally networked or remotely controlled home automation systems have become a popular paradigm because of their numerous advantages and are well suited to academic research. This paper proposes a method for the implementation of a Raspberry Pi based home automation system presented with an Android phone access interface. The power consumption profile across the connected load is measured accurately through programming. Users can access the graph of total power consumption with respect to time worldwide using their Dropbox account. An Android application has been developed to channelize the monitoring and controlling operation of home appliances remotely. This application facilitates control of the operating pins of the Raspberry Pi by pressing the corresponding key for turning any desired appliance "on" or "off". Systems can range from simple room lighting control to smart microcontroller based hybrid systems incorporating several other additional features. Smart home automation systems are being adopted to achieve flexibility, scalability, security in the sense of data protection through the cloud-based data storage protocol, reliability, energy efficiency, etc.
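
    Switching an appliance from a Raspberry Pi comes down to driving a GPIO pin that controls a relay, as in the minimal sketch below. The pin number and relay wiring are assumptions, and the paper's Android interface, power metering, and Dropbox logging are not reproduced here.

        import RPi.GPIO as GPIO

        RELAY_PIN = 17  # hypothetical BCM pin wired to the appliance relay

        GPIO.setmode(GPIO.BCM)
        GPIO.setup(RELAY_PIN, GPIO.OUT, initial=GPIO.LOW)

        def set_appliance(on):
            """Drive the relay high to turn the appliance on, low to turn it off."""
            GPIO.output(RELAY_PIN, GPIO.HIGH if on else GPIO.LOW)

        set_appliance(True)   # appliance on
        set_appliance(False)  # appliance off
        GPIO.cleanup()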

  7. Automated detection of a prostate Ni-Ti stent in electronic portal images

    SciTech Connect

    Carl, Jesper; Nielsen, Henning; Nielsen, Jane; Lund, Bente; Larsen, Erik Hoejkjaer

    2006-12-15

    Planning target volumes (PTV) in fractionated radiotherapy still have to be outlined with wide margins to the clinical target volume due to uncertainties arising from daily shifts of the prostate position. A recently proposed new method of visualization of the prostate is based on insertion of a thermo-expandable Ni-Ti stent. The current study proposes a new algorithm for automated detection of the Ni-Ti stent in electronic portal images. The algorithm exploits the fact that the Ni-Ti stent has a cylindrical shape with a fixed diameter. The automated method uses enhancement of lines combined with a grayscale morphology operation that looks for enhanced pixels separated by a distance similar to the diameter of the stent. The images in this study are all from prostate cancer patients treated with radiotherapy in a previous study. Images of a stent inserted in a humanoid phantom demonstrated a localization accuracy of 0.4-0.7 mm, which equals the pixel size in the image. The automated detection of the stent was compared to manual detection in 71 pairs of orthogonal images taken in nine patients. The algorithm was successful in 67 of 71 pairs of images. The method is fast, has a high success rate and good accuracy, and has the potential for unsupervised localization of the prostate before radiotherapy, which would enable automated repositioning before treatment and allow for the use of very tight PTV margins.
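
    The two ingredients named in the abstract, line enhancement followed by a grayscale morphology step keyed to the stent diameter, can be sketched as below. The filters, parameters, and threshold are illustrative stand-ins, not the published algorithm.

        import numpy as np
        from scipy import ndimage

        def detect_stent(portal_image, diameter_px=20, sigma=2.0):
            """Rough stent mask: enhance line-like structure, then require enhanced
            pixels roughly one stent diameter apart."""
            # 1) Line enhancement via a negative Laplacian of Gaussian.
            enhanced = -ndimage.gaussian_laplace(portal_image.astype(float), sigma=sigma)

            # 2) Grayscale erosion with a two-point footprint: the response stays high
            #    only where enhanced pixels exist at +/- half a diameter on either side.
            half = diameter_px // 2
            footprint = np.zeros((1, 2 * half + 1), dtype=bool)
            footprint[0, 0] = footprint[0, -1] = True
            paired = ndimage.grey_erosion(enhanced, footprint=footprint)

            # 3) Threshold the response to obtain a candidate stent mask.
            return paired > paired.mean() + 2 * paired.std()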

  8. Electronics Shielding and Reliability Design Tools

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; ONeill, P. M.; Zang, Thomas A., Jr.; Pandolf, John E.; Koontz, Steven L.; Boeder, P.; Reddell, B.; Pankop, C.

    2006-01-01

    It is well known that electronics placement in large-scale human-rated systems provides an opportunity to optimize electronics shielding through materials choice and geometric arrangement. For example, several hundred single event upsets (SEUs) occur within the Shuttle avionic computers during a typical mission. An order of magnitude larger SEU rate would occur without careful placement in the Shuttle design. These results used basic physics models (linear energy transfer (LET), track structure, Auger recombination) combined with limited SEU cross section measurements, allowing accurate evaluation of target fragment contributions to Shuttle avionics memory upsets. Electronics shielding design on human-rated systems provides an opportunity to minimize radiation impact on critical and non-critical electronic systems. Implementation of shielding design tools requires adequate methods for evaluation of design layouts, guiding qualification testing, and an adequate follow-up on final design evaluation, including results from a systems/device testing program tailored to meet design requirements.

  9. Some Challenges in the Design of Human-Automation Interaction for Safety-Critical Systems

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.; Roth, Emilie

    2014-01-01

    Increasing amounts of automation are being introduced to safety-critical domains. While the introduction of automation has led to an overall increase in reliability and improved safety, it has also introduced a class of failure modes, and new challenges in risk assessment for the new systems, particularly in the assessment of rare events resulting from complex inter-related factors. Designing successful human-automation systems is challenging, and the challenges go beyond good interface development (e.g., Roth, Malin, & Schreckenghost 1997; Christoffersen & Woods, 2002). Human-automation design is particularly challenging when the underlying automation technology generates behavior that is difficult for the user to anticipate or understand. These challenges have been recognized in several safety-critical domains, and have resulted in increased efforts to develop training, procedures, regulations and guidance material (CAST, 2008, IAEA, 2001, FAA, 2013, ICAO, 2012). This paper points to the continuing need for new methods to describe and characterize the operational environment within which new automation concepts are being presented. We will describe challenges to the successful development and evaluation of human-automation systems in safety-critical domains, and describe some approaches that could be used to address these challenges. We will draw from experience with the aviation, spaceflight and nuclear power domains.

  10. Laser Opto-Electronic Correlator for Robotic Vision Automated Pattern Recognition

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville

    1995-01-01

    A compact laser opto-electronic correlator for pattern recognition has been designed, fabricated, and tested. Specifically, it is a translation sensitivity adjustable compact optical correlator (TSACOC) utilizing convergent laser beams for the holographic filter. Its properties and performance, including the location of the correlation peak and the effects of lateral and longitudinal displacements for both filters and input images, are systematically analyzed based on the nonparaxial approximation for the reference beam. The theoretical analyses have been verified in experiments. In applying the TSACOC to important practical problems, including fingerprint identification, we have found that the tolerance of the system to input lateral displacement can be conveniently increased by changing a geometric factor of the system. The system can be compactly packaged using miniature laser diode sources and can be used in space by the National Aeronautics and Space Administration (NASA) as well as in ground commercial applications, which include robotic vision and industrial inspection for automated quality control operations. The personnel of Standard International will work closely with the Jet Propulsion Laboratory (JPL) to transfer the technology to the commercial market. Prototype systems will be fabricated to test the market and perfect the product. Large-scale production will follow after successful results are achieved.

  11. Automated Stitching of Microtubule Centerlines across Serial Electron Tomograms

    PubMed Central

    Weber, Britta; Tranfield, Erin M.; Höög, Johanna L.; Baum, Daniel; Antony, Claude; Hyman, Tony; Verbavatz, Jean-Marc; Prohaska, Steffen

    2014-01-01

    Tracing microtubule centerlines in serial section electron tomography requires microtubules to be stitched across sections; that is, lines from different sections need to be aligned, endpoints need to be matched at section boundaries to establish a correspondence between neighboring sections, and corresponding lines need to be connected across multiple sections. We present computational methods for these tasks: 1) An initial alignment is computed using a distance compatibility graph. 2) A fine alignment is then computed with a probabilistic variant of the iterative closest points algorithm, which we extended to handle the orientation of lines by introducing a periodic random variable to the probabilistic formulation. 3) Endpoint correspondence is established by formulating a matching problem in terms of a Markov random field and computing the best matching with belief propagation. Belief propagation is not generally guaranteed to converge to a minimum. We show how convergence can be achieved, nonetheless, with minimal manual input. In addition to stitching microtubule centerlines, the correspondence is also applied to transform and merge the electron tomograms. We applied the proposed methods to samples from the mitotic spindle in C. elegans, the meiotic spindle in X. laevis, and sub-pellicular microtubule arrays in T. brucei. The methods were able to stitch microtubules across section boundaries in good agreement with experts’ opinions for the spindle samples. Results, however, were not satisfactory for the microtubule arrays. For certain experiments, such as an analysis of the spindle, the proposed methods can replace manual expert tracing and thus enable the analysis of microtubules over long distances with reasonable manual effort. PMID:25438148
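
    Step 2 above extends iterative closest points; the plain, non-probabilistic version it builds on is sketched below for centerline endpoints treated as point sets. The orientation handling and belief-propagation matching from the paper are not reproduced.

        import numpy as np
        from scipy.spatial import cKDTree

        def rigid_fit(src, dst):
            """Least-squares rotation R and translation t mapping src onto dst (Kabsch)."""
            cs, cd = src.mean(axis=0), dst.mean(axis=0)
            H = (src - cs).T @ (dst - cd)
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))
            D = np.diag([1.0] * (src.shape[1] - 1) + [d])  # guard against reflections
            R = Vt.T @ D @ U.T
            return R, cd - R @ cs

        def icp(src, dst, iterations=20):
            """Repeatedly match each source endpoint to its nearest target and refit."""
            cur = src.copy()
            tree = cKDTree(dst)
            for _ in range(iterations):
                _, idx = tree.query(cur)
                R, t = rigid_fit(cur, dst[idx])
                cur = cur @ R.T + t
            return cur  # source endpoints aligned to the neighboring section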

  12. The design of an automated verification of redundant systems

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1972-01-01

    Handbook describes design processes, presents design considerations and techniques, gives tutorial material on implementation and methodology, shows design aids, illustrates use of design aids and application samples, and identifies general practices to be adhered to or avoided.

  13. An automated tool for the design and assessment of space systems

    NASA Technical Reports Server (NTRS)

    Dalcambre, Lois M. L.; Landry, Steve P.

    1990-01-01

    Space systems can be characterized as both large and complex but they often rely on reusable subcomponents. One problem in the design of such systems is the representation and validation of the system, particularly at the higher levels of management. An automated tool is described for the representation, refinement, and validation of such complex systems based on a formal design theory, the Theory of Plausible Design. In particular, the steps necessary to automate the tool and make it a competent, usable assistant, are described.

  14. Automated Monte Carlo biasing for photon-generated electrons near surfaces.

    SciTech Connect

    Franke, Brian Claude; Crawford, Martin James; Kensek, Ronald Patrick

    2009-09-01

    This report describes efforts to automate the biasing of coupled electron-photon Monte Carlo particle transport calculations. The approach was based on weight-windows biasing. Weight-window settings were determined using adjoint-flux Monte Carlo calculations. A variety of algorithms were investigated for adaptivity of the Monte Carlo tallies. Tree data structures were used to investigate spatial partitioning. Functional-expansion tallies were used to investigate higher-order spatial representations.
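
    Weight-window biasing itself reduces to splitting heavy particles and playing Russian roulette on light ones, as in the sketch below; the window bounds would come from the adjoint-flux calculations mentioned in the abstract, not from this code.

        import random

        def apply_weight_window(weight, w_low, w_high, survival_weight=None):
            """Return the list of particle weights that continue after the window check."""
            if survival_weight is None:
                survival_weight = 0.5 * (w_low + w_high)
            if weight > w_high:                      # split into roughly equal-weight copies
                n = int(weight / w_high) + 1
                return [weight / n] * n
            if weight < w_low:                       # Russian roulette toward the survival weight
                if random.random() < weight / survival_weight:
                    return [survival_weight]
                return []
            return [weight]                          # inside the window: unchanged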

  15. Designing Microcomputer Networks (and) LANS: A New Technology to Improve Library Automation.

    ERIC Educational Resources Information Center

    Ivie, Evan L.; Farr, Rick C.

    1984-01-01

    Two articles address the design of microcomputer networks and the use of local area computer networks (LAN) to improve library automation. Topics discussed include network design criteria, media for local networks, transmission mode, typical communication protocols, user interface, basic local network architectures, and examples of microcomputer…

  16. Reliability-Based Design of a Safety-Critical Automation System: A Case Study

    NASA Technical Reports Server (NTRS)

    Carroll, Carol W.; Dunn, W.; Doty, L.; Frank, M. V.; Hulet, M.; Alvarez, Teresa (Technical Monitor)

    1994-01-01

    In 1986, NASA funded a project to modernize the NASA Ames Research Center Unitary Plan Wind Tunnels, including the replacement of obsolescent controls with a modern, automated distributed control system (DCS). The project effort on this system included an independent safety analysis (ISA) of the automation system. The purpose of the ISA was to evaluate the completeness of the hazard analyses which had already been performed on the Modernization Project. The ISA approach followed a tailoring of the risk assessment approach widely used on existing nuclear power plants. The tailoring of the nuclear industry oriented risk assessment approach to the automation system and its role in reliability-based design of the automation system is the subject of this paper.

  17. Radiation design criteria handbook. [design criteria for electronic parts applications

    NASA Technical Reports Server (NTRS)

    Stanley, A. G.; Martin, K. E.; Douglas, S.

    1976-01-01

    Radiation design criteria for electronic parts applications in space environments are provided. The data were compiled from the Mariner/Jupiter Saturn 1977 electronic parts radiation test program. Radiation sensitive device types were exposed to radiation environments compatible with the MJS'77 requirements under suitable bias conditions. A total of 189 integrated circuits, transistors, and other semiconductor device types were tested.

  18. Automation and Accountability in Decision Support System Interface Design

    ERIC Educational Resources Information Center

    Cummings, Mary L.

    2006-01-01

    When the human element is introduced into decision support system design, entirely new layers of social and ethical issues emerge but are not always recognized as such. This paper discusses those ethical and social impact issues specific to decision support systems and highlights areas that interface designers should consider during design with an…

  19. Tevatron Electron Lenses: Design and Operation

    SciTech Connect

    Shiltsev, Vladimir; Bishofberger, Kip; Kamerdzhiev, Vsevolod; Kozub, Sergei; Kufer, Matthew; Kuznetsov, Gennady; Martinez, Alexander; Olson, Marvin; Pfeffer, Howard; Saewert, Greg; Scarpine, Vic; /Fermilab /SLAC /Fermilab /Serpukhov, IHEP /Novosibirsk, IYF /Serpukhov, IHEP /Fermilab

    2008-08-01

    The beam-beam effects have been the dominating sources of beam loss and lifetime limitations in the Tevatron proton-antiproton collider [1]. Electron lenses were originally proposed for compensation of electromagnetic long-range and head-on beam-beam interactions of proton and antiproton beams [2]. Results of successful employment of two electron lenses built and installed in the Tevatron are reported in [3,4,5]. In this paper we present design features of the Tevatron electron lenses (TELs), discuss the generation of electron beams, describe different modes of operation and outline the technical parameters of various subsystems.

  20. D-Zero muon readout electronics design

    SciTech Connect

    Baldin, B.; Hansen, S.; Los, S.; Matveev, M.; Vaniev, V.

    1996-11-01

    The readout electronics designed for the DØ Muon Upgrade are described. These electronics serve three detector subsystems and one trigger system. The front-ends and readout hardware are synchronized by means of timing signals broadcast from the DØ Trigger Framework. The front-end electronics have continuously running digitizers and two levels of buffering resulting in nearly deadtimeless operation. The raw data is corrected and formatted by 16-bit fixed-point DSP processors. These processors also perform control of the data buffering. The data transfer from the front-end electronics located on the detector platform is performed by serial links running at 160 Mbit/s. The design and test results of the subsystem readout electronics and system interface are discussed.

  1. Data base systems in electronic design engineering

    NASA Technical Reports Server (NTRS)

    Williams, D.

    1980-01-01

    The concepts of an integrated design data base system (DBMS) as it might apply to an electronic design company are discussed. Data elements of documentation, project specifications, project tracking, firmware, software, electronic and mechanical design can be integrated and managed through a single DBMS. Combining the attributes of a DBMS data handler with specialized systems and functional data can provide users with maximum flexibility, reduced redundancy, and increased overall systems performance. Although some system overhead is lost due to redundancy in transitory data, it is believed the combination of the two data types is advisable rather than trying to do all data handling through a single DBMS.

  2. Model-based automated segmentation of kinetochore microtubule from electron tomography.

    PubMed

    Jiang, Ming; Ji, Qiang; McEwen, Bruce

    2004-01-01

    The segmentation of kinetochore microtubules from electron tomography is challenging due to the poor quality of the acquired data and the cluttered cellular surroundings. We propose to automate the microtubule segmentation by extending the active shape model (ASM) in two aspects. First, we develop a higher order boundary model obtained by 3-D local surface estimation that characterizes the microtubule boundary better than the gray level appearance model in the 2-D microtubule cross section. We then incorporate this model into the weight matrix of the fitting error measurement to increase the influence of salient features. Second, we integrate the ASM with Kalman filtering to utilize the shape information along the longitudinal direction of the microtubules. The ASM modified in this way is robust against missing data and outliers frequently present in the kinetochore tomography volume. Experimental results demonstrate that our automated method outperforms the manual process while using only a fraction of the time. PMID:17272020
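
    The Kalman-filtering idea in the second extension, propagating the cross-section fit from slice to slice along the microtubule axis, is sketched below with a toy constant-velocity model. All matrices and noise levels are illustrative assumptions, not the paper's formulation.

        import numpy as np

        F = np.array([[1, 0, 1, 0],   # state [x, y, vx, vy]; unit step between slices
                      [0, 1, 0, 1],
                      [0, 0, 1, 0],
                      [0, 0, 0, 1]], dtype=float)
        H = np.array([[1, 0, 0, 0],
                      [0, 1, 0, 0]], dtype=float)
        Q = 0.01 * np.eye(4)          # assumed process noise
        R = 1.0 * np.eye(2)           # assumed measurement noise

        def kalman_track(measurements, x0):
            """Track the centerline position; `measurements` holds the fitted cross-section
            centre in each slice, or None where the fit is missing."""
            x, P = np.asarray(x0, dtype=float), np.eye(4)
            track = []
            for z in measurements:
                x, P = F @ x, F @ P @ F.T + Q                    # predict
                if z is not None:                                # update when a fit exists
                    S = H @ P @ H.T + R
                    K = P @ H.T @ np.linalg.inv(S)
                    x = x + K @ (np.asarray(z, dtype=float) - H @ x)
                    P = (np.eye(4) - K @ H) @ P
                track.append(x[:2].copy())
            return track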

  3. Automated integration of continuous glucose monitor data in the electronic health record using consumer technology.

    PubMed

    Kumar, Rajiv B; Goren, Nira D; Stark, David E; Wall, Dennis P; Longhurst, Christopher A

    2016-05-01

    The diabetes healthcare provider plays a key role in interpreting blood glucose trends, but few institutions have successfully integrated patient home glucose data in the electronic health record (EHR). Published implementations to date have required custom interfaces, which limit wide-scale replication. We piloted automated integration of continuous glucose monitor data in the EHR using widely available consumer technology for 10 pediatric patients with insulin-dependent diabetes. Establishment of a passive data communication bridge via a patient's/parent's smartphone enabled automated integration and analytics of patient device data within the EHR between scheduled clinic visits. It is feasible to utilize available consumer technology to assess and triage home diabetes device data within the EHR, and to engage patients/parents and improve healthcare provider workflow. PMID:27018263

  4. Automating software design and configuration for a small spacecraft

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-06-01

    The Open Prototype for Educational NanoSats (OPEN) is a framework for the development of low-cost spacecraft. It will allow users to build a 1-U (10 cm x 10 cm x 11 cm, 1.33 kg) CubeSat-class spacecraft with a parts budget of approximately $5,000. Work is underway to develop software to assist users in configuring the spacecraft and validating its compliance with integration and launch standards. Each prospective configuration requires a unique software configuration, combining pre-built modules for controlling base components, custom control software for custom developed and payload components and overall mission management and control software (which, itself will be a combination of standard components and mission specific control logic). This paper presents a system for automating standard component configuration and creating templates to facilitate the creation and integration of components that must be (or which the developer desires to be) custom-developed for the particular mission or spacecraft.

  5. Design of microcontroller based system for automation of streak camera

    SciTech Connect

    Joshi, M. J.; Upadhyay, J.; Deshpande, P. P.; Sharma, M. L.; Navathe, C. P.

    2010-08-15

    A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8 bit MCS family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for various electrodes of the tubes are generated using dc-to-dc converters. A high voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW-based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  6. Design of microcontroller based system for automation of streak camera

    NASA Astrophysics Data System (ADS)

    Joshi, M. J.; Upadhyay, J.; Deshpande, P. P.; Sharma, M. L.; Navathe, C. P.

    2010-08-01

    A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8 bit MCS family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for various electrodes of the tubes are generated using dc-to-dc converters. A high voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW-based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  7. Using IDE in Instructional Design: Encouraging Reflective Instruction Design Through Automated Design Tools.

    ERIC Educational Resources Information Center

    Russell, Daniel M.; Kelley, Loretta

    The Instructional Design Environment (IDE), a computer assisted instruction tool for instructional design, has been incorporated into the curriculum and instructional development in mathematics instruction in the Stanford Teacher Education Program (STEP). (STEP is a 12-month program leading to an M.A. in education which emphasizes content focus…

  8. Automated design of a uniform distribution using faceted reflectors

    NASA Astrophysics Data System (ADS)

    Cassarly, William J.; David, Stuart R.; Jenkins, David G.; Riser, Andrew; Davenport, Thomas L.

    2000-07-01

    Faceted reflectors are a ubiquitous means for providing uniform illumination in many commercial lighting products, examples being newer flashlights, department-store display lighting, and the faceted reflectors found in overhead projectors. However, the design of faceted reflectors using software has often been more limited by the tools available to design them than by the imagination of the designers. One of the keys to enabling a broader range of design options has been to allow more complex surfaces using constructive solid geometry (CSG). CSG uses Boolean operations on basic geometric primitives to define shapes to create individual facets. In this paper, we describe an improved faceted reflector design algorithm and use it to create a wide range of CSG-based reflectors. The performance of various reflectors is compared using a Monte Carlo ray-trace method.

  9. One window on the world of design automation

    SciTech Connect

    Stamm, D.A.

    1983-01-01

    A discussion of these factors is used as an introduction to a description of the Daisy Logician workstation. Rapidly increasing VLSI design complexity is most often cited as the primary reason for the emergence of the engineering workstation. Many other factors, however, play an even more crucial role in the development of this new technology. These include the advent of custom devices, the standardisation of logic design methodology, and advances in memory density, display technology and disc technology.

  10. Automated design of the surface positions of protein helices.

    PubMed Central

    Dahiyat, B. I.; Gordon, D. B.; Mayo, S. L.

    1997-01-01

    Using a protein design algorithm that quantitatively considers side-chain interactions, the design of surface residues of alpha helices was examined. Three scoring functions were tested: a hydrogen-bond potential, a hydrogen-bond potential in conjunction with a penalty for uncompensated burial of polar hydrogens, and a hydrogen-bond potential in combination with helix propensity. The solvent exposed residues of a homodimeric coiled coil based on GCN4-p1 were designed by using the Dead-End Elimination Theorem to find the optimal amino acid sequence for each scoring function. The corresponding peptides were synthesized and characterized by circular dichroism spectroscopy and size exclusion chromatography. The designed peptides were dimeric and nearly 100% helical at 1 degree C, with melting temperatures from 69-72 degrees C, over 12 degrees C higher than GCN4-p1, whereas a random hydrophilic sequence at the surface positions produced a peptide that melted at 15 degrees C. Analysis of the designed sequences suggests that helix propensity is the key factor in sequence design for surface helical positions. PMID:9194194
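
    The Dead-End Elimination Theorem cited above prunes rotamers that provably cannot appear in the optimal sequence. The classic elimination criterion is, in LaTeX (the abstract does not state which variant the authors used):

        \[
          E(i_r) + \sum_{j \neq i} \min_{s} E(i_r, j_s) \;>\;
          E(i_t) + \sum_{j \neq i} \max_{s} E(i_t, j_s)
        \]

    Here E(i_r) is the self-energy of rotamer r at position i and E(i_r, j_s) its pairwise energy with rotamer s at position j; rotamer i_r can be discarded whenever some competing rotamer i_t at the same position satisfies the inequality, since the best case for i_r is still worse than the worst case for i_t.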

  11. Automation and Schema Acquisition in Learning Elementary Computer Programming: Implications for the Design of Practice.

    ERIC Educational Resources Information Center

    Van Merrienboer, Jeroen J. G.; Paas, Fred G. W. C.

    1990-01-01

    Discussion of computer programing at the secondary level focuses on automation and schema acquisition as two processes important in learning cognitive skills such as programing. Their effects on learning outcomes and transfer of training are examined, the importance of worked examples is highlighted, and instructional design principles are…

  12. Design and development of a semi-automated module for the preparation of metallic PET radionuclides

    NASA Astrophysics Data System (ADS)

    Trejo-Ballado, F.; Lopez-Rodriguez, V.; Gaspar-Carcamo, R. E.; Hurtado-Chong, G.; Avila-Rodriguez, Miguel A.

    2012-12-01

    The method for the production of metallic radionuclides has been widely reported, and most of them share a common ion chromatography purification technique. The aim of this work is to design and develop a semi-automated remotely controlled module for the purification of metallic PET radionuclides via cation exchange chromatography.

  13. Automated Tetrahedral Mesh Generation for CFD Analysis of Aircraft in Conceptual Design

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Li, Wu; Campbell, Richard L.

    2014-01-01

    The paper introduces an automation process of generating a tetrahedral mesh for computational fluid dynamics (CFD) analysis of aircraft configurations in early conceptual design. The method was developed for CFD-based sonic boom analysis of supersonic configurations, but can be applied to aerodynamic analysis of aircraft configurations in any flight regime.

  14. Design, development, test, and evaluation of an automated analytical electrophoresis apparatus

    NASA Technical Reports Server (NTRS)

    Bartels, P. A.; Bier, M.

    1977-01-01

    An Automated Analytical Electrophoresis Apparatus (AAEA) was designed, developed, assembled, and preliminarily tested. The AAEA was demonstrated to be a feasible apparatus for automatically acquiring, displaying, and storing (and eventually analyzing) electrophoresis mobility data from living blood cells. The apparatus and the operation of its major assemblies are described in detail.

  15. Graphic design principles for automated document segmentation and understanding

    NASA Astrophysics Data System (ADS)

    Vega-Riveros, J. Fernando; Santos Villalobos, Hector J.

    2006-01-01

    When designers develop a document layout, their objective is to convey a specific message and provoke a specific response from the audience. Design principles provide the foundation for identifying document components and relations among them to extract implicit knowledge from the layout. Variable Data Printing enables the production of personalized printing jobs for which traditional proofing of all the job instances could prove unfeasible. This paper explains a rule-based system that uses design principles to segment and understand document context. The system uses the design principles of repetition, proximity, alignment, similarity, and contrast as the foundation for its document segmentation and understanding strategy, which is closely tied to the recognition of artifacts produced by the infringement of the constraints articulated in the document layout. There are two main modules in the tool: the geometric analysis module and the design rule engine. The geometric analysis module extracts explicit knowledge from the data provided in the document. The design rule module uses the information provided by the geometric analysis to establish logical units inside the document. We used a subset of XSL-FO, sufficient for designing documents with an adequate amount of complexity. The system identifies components such as headers, paragraphs, lists, and images, and determines the relations between them, such as header-paragraph, header-list, etc. The system provides accurate information about the geometric properties of the components, detects the elements of the documents, and identifies corresponding components between a proofed instance and the rest of the instances in a Variable Data Printing job.
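
    The proximity principle mentioned above can be turned into a simple grouping rule: boxes whose vertical gap is small belong to the same logical unit. The box representation and gap threshold in the sketch below are assumptions, not the paper's rule engine.

        def group_by_proximity(boxes, max_gap=12):
            """Group (x, y, width, height) boxes, sorted top to bottom, into logical blocks
            whenever the gap to the previous box is at most max_gap."""
            if not boxes:
                return []
            boxes = sorted(boxes, key=lambda b: b[1])
            groups, current = [], [boxes[0]]
            for box in boxes[1:]:
                prev = current[-1]
                gap = box[1] - (prev[1] + prev[3])   # top of this box minus bottom of previous
                if gap <= max_gap:
                    current.append(box)
                else:
                    groups.append(current)
                    current = [box]
            groups.append(current)
            return groups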

  16. Designs for a quantum electron microscope.

    PubMed

    Kruit, P; Hobbs, R G; Kim, C-S; Yang, Y; Manfrinato, V R; Hammer, J; Thomas, S; Weber, P; Klopfer, B; Kohstall, C; Juffmann, T; Kasevich, M A; Hommelhoff, P; Berggren, K K

    2016-05-01

    One of the astounding consequences of quantum mechanics is that it allows the detection of a target using an incident probe, with only a low probability of interaction of the probe and the target. This 'quantum weirdness' could be applied in the field of electron microscopy to generate images of beam-sensitive specimens with substantially reduced damage to the specimen. A reduction of beam-induced damage to specimens is especially of great importance if it can enable imaging of biological specimens with atomic resolution. Following a recent suggestion that interaction-free measurements are possible with electrons, we now analyze the difficulties of actually building an atomic resolution interaction-free electron microscope, or "quantum electron microscope". A quantum electron microscope would require a number of unique components not found in conventional transmission electron microscopes. These components include a coherent electron beam-splitter or two-state-coupler, and a resonator structure to allow each electron to interrogate the specimen multiple times, thus supporting high success probabilities for interaction-free detection of the specimen. Different system designs are presented here, which are based on four different choices of two-state-couplers: a thin crystal, a grating mirror, a standing light wave and an electro-dynamical pseudopotential. Challenges for the detailed electron optical design are identified as future directions for development. While it is concluded that it should be possible to build an atomic resolution quantum electron microscope, we have also identified a number of hurdles to the development of such a microscope and further theoretical investigations that will be required to enable a complete interpretation of the images produced by such a microscope. PMID:26998703

  17. Automation for pattern library creation and in-design optimization

    NASA Astrophysics Data System (ADS)

    Deng, Rock; Zou, Elain; Hong, Sid; Wang, Jinyan; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua; Huang, Jason

    2015-03-01

    Semiconductor manufacturing technologies are becoming increasingly complex with every passing node. Newer technology nodes are pushing the limits of optical lithography and requiring multiple exposures with exotic material stacks for each critical layer. All of this added complexity usually amounts to further restrictions in what can be designed. Furthermore, the designs must be checked against all these restrictions in verification and sign-off stages. Design rules are intended to capture all the manufacturing limitations such that yield can be maximized for any given design adhering to all the rules. Most manufacturing steps employ some sort of model based simulation which characterizes the behavior of each step. The lithography models play a very big part of the overall yield and design restrictions in patterning. However, lithography models are not practical to run during design creation due to their slow and prohibitive run times. Furthermore, the models are not usually given to foundry customers because of the confidential and sensitive nature of every foundry's processes. The design layout locations where a model flags unacceptable simulated results can be used to define pattern rules which can be shared with customers. With advanced technology nodes we see a large growth of pattern based rules. This is due to the fact that pattern matching is very fast and the rules themselves can be very complex to describe in a standard DRC language. Therefore, the patterns are left as either pattern layout clips or abstracted into pattern-like syntax which a pattern matcher can use directly. The patterns themselves can be multi-layered with "fuzzy" designations such that groups of similar patterns can be found using one description. The pattern matcher is often integrated with a DRC tool such that verification and signoff can be done in one step. The patterns can be layout constructs that are "forbidden", "waived", or simply low-yielding in nature. The patterns can also

  18. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, Elizabeth; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

    A development project for a design workstation for advanced life-support systems (called the DAWN Project, for Design Assistant Workstation), incorporating qualitative simulation, required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulation such that simulation capabilities are maximized without duplication. The reason is that to produce design solutions to a system goal, the behavior of the system in both a steady and a perturbed state must be represented. The Qualitative Simulation Tool (QST), an expert-system-like model building and simulation interface tool called ScratchPad (SP), and the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components are discussed.

  19. Scar-less multi-part DNA assembly design automation

    DOEpatents

    Hillson, Nathan J.

    2016-06-07

    The present invention provides a method of designing an implementation of a DNA assembly. In an exemplary embodiment, the method includes (1) receiving a list of DNA sequence fragments to be assembled together and an order in which to assemble the DNA sequence fragments, (2) designing DNA oligonucleotides (oligos) for each of the DNA sequence fragments, and (3) creating a plan for adding flanking homology sequences to each of the DNA oligos. In another exemplary embodiment, the method includes (1) receiving a list of DNA sequence fragments to be assembled together and an order in which to assemble the DNA sequence fragments, (2) designing DNA oligonucleotides (oligos) for each of the DNA sequence fragments, and (3) creating a plan for adding optimized overhang sequences to each of the DNA oligos.
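
    Step (3) of the first embodiment, adding flanking homology to each fragment's oligos, can be sketched as below. The primer and homology lengths, the circular-assembly assumption, and the absence of any melting-temperature optimization are simplifications for illustration, not the patented method.

        COMPLEMENT = str.maketrans("ACGT", "TGCA")

        def revcomp(seq):
            """Reverse complement of a DNA sequence."""
            return seq.translate(COMPLEMENT)[::-1]

        def design_oligos(fragments, primer_len=20, homology_len=20):
            """Return (forward, reverse) oligos per fragment, each carrying homology
            to the neighboring fragment in the assembly order (assumed circular)."""
            oligos = []
            n = len(fragments)
            for i, frag in enumerate(fragments):
                upstream, downstream = fragments[(i - 1) % n], fragments[(i + 1) % n]
                fwd = upstream[-homology_len:] + frag[:primer_len]
                rev = revcomp(downstream[:homology_len]) + revcomp(frag[-primer_len:])
                oligos.append((fwd, rev))
            return oligos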

  20. Automated Design and Optimization of Pebble-bed Reactor Cores

    SciTech Connect

    Hans D. Gougar; Abderrafi M. Ougouag; William K. Terry

    2010-07-01

    We present a conceptual design approach for high-temperature gas-cooled reactors using recirculating pebble-bed cores. The design approach employs PEBBED, a reactor physics code specifically designed to solve for and analyze the asymptotic burnup state of pebble-bed reactors, in conjunction with a genetic algorithm to obtain a core that maximizes a fitness value that is a function of user-specified parameters. The uniqueness of the asymptotic core state and the small number of independent parameters that define it suggest that core geometry and fuel cycle can be efficiently optimized toward a specified objective. PEBBED exploits a novel representation of the distribution of pebbles that enables efficient coupling of the burnup and neutron diffusion solvers. With this method, even complex pebble recirculation schemes can be expressed in terms of a few parameters that are amenable to modern optimization techniques. With PEBBED, the user chooses the type and range of core physics parameters that represent the design space. A set of traits, each with acceptable and preferred values expressed by a simple fitness function, is used to evaluate the candidate reactor cores. The stochastic search algorithm automatically drives the generation of core parameters toward the optimal core as defined by the user. The optimized design can then be modeled and analyzed in greater detail using higher resolution and more computationally demanding tools to confirm the desired characteristics. For this study, the design of pebble-bed high temperature reactor concepts subjected to demanding physical constraints demonstrated the efficacy of the PEBBED algorithm.
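
    The stochastic search named above is a genetic algorithm over a small set of core parameters; a generic loop of that shape is sketched below. The fitness function stands in for a PEBBED core evaluation, and the population size, mutation scheme, and elitism are illustrative choices only.

        import random

        def genetic_search(fitness, bounds, pop_size=40, generations=100,
                           mutation_rate=0.1, elite=2):
            """Maximize `fitness` over real-valued parameters constrained by `bounds`."""
            def random_individual():
                return [random.uniform(lo, hi) for lo, hi in bounds]

            def mutate(ind):
                return [min(hi, max(lo, g + random.gauss(0, 0.1 * (hi - lo))))
                        if random.random() < mutation_rate else g
                        for g, (lo, hi) in zip(ind, bounds)]

            def crossover(a, b):
                return [ga if random.random() < 0.5 else gb for ga, gb in zip(a, b)]

            pop = [random_individual() for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                parents = pop[:pop_size // 2]
                children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                            for _ in range(pop_size - elite)]
                pop = pop[:elite] + children
            return max(pop, key=fitness)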

  1. Design preferences and cognitive styles: experimentation by automated website synthesis

    PubMed Central

    2012-01-01

    Background This article aims to demonstrate computational synthesis of Web-based experiments in undertaking experimentation on relationships among the participants' design preference, rationale, and cognitive test performance. The exemplified experiments were computationally synthesised, including the websites as materials, experiment protocols as methods, and cognitive tests as protocol modules. This work also exemplifies the use of a website synthesiser as an essential instrument enabling the participants to explore different possible designs, which were generated on the fly, before selection of preferred designs. Methods The participants were given interactive tree and table generators so that they could explore some different ways of presenting causality information in tables and trees as the visualisation formats. The participants gave their preference ratings for the available designs, as well as their rationale (criteria) for their design decisions. The participants were also asked to take four cognitive tests, which focus on the aspects of visualisation and analogy-making. The relationships among preference ratings, rationale, and the results of cognitive tests were analysed by conservative non-parametric statistics including the Wilcoxon test, the Kruskal-Wallis test, and Kendall correlation. Results In the test, 41 of the total 64 participants preferred graphical (tree-form) to tabular presentation. Despite the popular preference for graphical presentation, the given tabular presentation was generally rated to be easier than graphical presentation to interpret, especially by those who scored lower in the visualisation and analogy-making tests. Conclusions This piece of evidence helps generate a hypothesis that design preferences are related to specific cognitive abilities. Without the use of computational synthesis, the experiment setup and scientific results would be impractical to obtain. PMID:22748000

  2. Automated Verification of Design Patterns with LePUS3

    NASA Technical Reports Server (NTRS)

    Nicholson, Jonathan; Gasparis, Epameinondas; Eden, Ammon H.; Kazman, Rick

    2009-01-01

    Specification and [visual] modelling languages are expected to combine strong abstraction mechanisms with rigour, scalability, and parsimony. LePUS3 is a visual, object-oriented design description language axiomatized in a decidable subset of first-order predicate logic. We demonstrate how LePUS3 is used to formally specify a structural design pattern and prove (verify) whether any Java 1.4 program satisfies that specification. We also show how LePUS3 specifications (charts) are composed and how they are verified fully automatically in the Two-Tier Programming Toolkit.

  3. Parameter Studies, time-dependent simulations and design with automated Cartesian methods

    NASA Technical Reports Server (NTRS)

    Aftosmis, Michael

    2005-01-01

    Over the past decade, NASA has made a substantial investment in developing adaptive Cartesian grid methods for aerodynamic simulation. Cartesian-based methods played a key role in both the Space Shuttle Accident Investigation and in NASA's return-to-flight activities. The talk will provide an overview of recent technological developments focusing on the generation of large-scale aerodynamic databases, automated CAD-based design, and time-dependent simulations of bodies in relative motion. Automation, scalability and robustness underlie all of these applications, and research in each of these topics will be presented.

  4. CMOS array design automation techniques. [metal oxide semiconductors

    NASA Technical Reports Server (NTRS)

    Ramondetta, P.; Feller, A.; Noto, R.; Lombardi, T.

    1975-01-01

    A low-cost, quick-turnaround technique for generating custom metal oxide semiconductor arrays using the standard cell approach was developed, implemented, tested and validated. Basic cell design topology and guidelines are defined based on an extensive analysis that includes circuit, layout, process, array topology and required performance considerations, particularly high circuit speed.

  5. Automated design of image operators that detect interest points.

    PubMed

    Trujillo, Leonardo; Olague, Gustavo

    2008-01-01

    This work describes how evolutionary computation can be used to synthesize low-level image operators that detect interesting points on digital images. Interest point detection is an essential part of many modern computer vision systems that solve tasks such as object recognition, stereo correspondence, and image indexing, to name but a few. The design of the specialized operators is posed as an optimization/search problem that is solved with genetic programming (GP), a strategy still mostly unexplored by the computer vision community. The proposed approach automatically synthesizes operators that are competitive with state-of-the-art designs, taking into account an operator's geometric stability and the global separability of detected points during fitness evaluation. The GP search space is defined using simple primitive operations that are commonly found in point detectors proposed by the vision community. The experiments described in this paper extend previous results (Trujillo and Olague, 2006a,b) by presenting 15 new operators that were synthesized through the GP-based search. Some of the synthesized operators can be regarded as improved manmade designs because they employ well-known image processing techniques and achieve highly competitive performance. On the other hand, since the GP search also generates what can be considered as unconventional operators for point detection, these results provide a new perspective to feature extraction research. PMID:19053496

  6. A Multi-Agent Design for Power Distribution Systems Automation

    NASA Astrophysics Data System (ADS)

    Ghorbani, M. Jawad

    A new Multi-Agent System (MAS) design for fault location, isolation and restoration in power distribution systems is presented. In the proposed approach, when there is a fault in the Power Distribution System (PDS), the MAS quickly isolates the fault and restores service to fault-free zones. A hierarchical coordination strategy is introduced to manage the agents, integrating the advantages of both centralized and decentralized coordination strategies. In this framework, Zone Agents (ZAs) locate and isolate the fault based on locally available information and assist the Feeder Agent (FA) with reconfiguration and restoration. The FA can solve the restoration problem using existing algorithms for the 0-1 Knapsack problem. A novel Q-learning mechanism is also introduced to support the FAs in decision making for restoration. A distributed MAS-based Load Shedding (LS) technique is also used to supply as many higher-priority customers as possible when demand exceeds generation. The design is illustrated by simulation case studies of fault location, isolation and restoration on the West Virginia Super Circuit (WVSC) and by a hardware implementation of fault location and isolation on a laboratory platform. The results from the case studies demonstrate the performance of the proposed MAS designs.
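
    The abstract notes that the feeder agent can cast restoration as a 0-1 Knapsack problem: fault-free zones are items whose loads must fit within the spare capacity of a backup feeder while the restored, priority-weighted load is maximized. A minimal dynamic-programming sketch of that formulation is shown below; the zone loads, priority weights, and capacity value are illustrative assumptions rather than data from the cited study.

      def restore_zones(zones, spare_capacity_kw):
          """0-1 knapsack: pick zones to restore so their total load fits the spare
          capacity while maximizing priority-weighted restored load.
          zones: list of (name, load_kw, priority_weight)."""
          cap = int(spare_capacity_kw)
          # best[c] = (value, chosen zone names) achievable with capacity c
          best = [(0, [])] * (cap + 1)
          for name, load, weight in zones:
              load = int(load)
              for c in range(cap, load - 1, -1):  # descending c keeps items 0/1
                  cand_value = best[c - load][0] + weight * load
                  if cand_value > best[c][0]:
                      best[c] = (cand_value, best[c - load][1] + [name])
          return best[cap]

      # Illustrative data only; not from the cited study.
      zones = [("Z1", 120, 3), ("Z2", 80, 1), ("Z3", 60, 2), ("Z4", 150, 2)]
      print(restore_zones(zones, spare_capacity_kw=250))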

  7. Enhancing Creative Thinking through Designing Electronic Slides

    ERIC Educational Resources Information Center

    Mokaram, Al-Ali Khaled; Al-Shabatat, Ahmad Mohammad; Fong, Fook Soon; Abdallah, Andaleeb Ahmad

    2011-01-01

    During the shifting of teaching and learning methods using computer technologies, much emphasis was paid on the knowledge content more than the thinking skills. Thus, this study investigated the effects of a computer application, namely, designing electronic slides on the development of creative thinking skills of a sample of undergraduate…

  8. An Electronics Course Emphasizing Circuit Design

    ERIC Educational Resources Information Center

    Bergeson, Haven E.

    1975-01-01

    Describes a one-quarter introductory electronics course in which the students use a variety of inexpensive integrated circuits to design and construct a large number of useful circuits. Presents the subject matter of the course in three parts: linear circuits, digital circuits, and more complex circuits. (GS)

  9. Design principles and algorithms for automated air traffic management

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    1995-01-01

    This paper presents design principles and algorithms for building a real-time scheduler. The primary objective of the scheduler is to assign arrival aircraft to a favorable landing runway and schedule them to land at times that minimize delays. A further objective of the scheduler is to allocate delays between high-altitude airspace far from the airport and low-altitude airspace near the airport. A method of delay allocation is described that minimizes the average operating cost in the presence of errors in controlling aircraft to a specified landing time.
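
    As a simplified illustration of the scheduling problem described above, the sketch below assigns each arriving aircraft to whichever runway lets it land earliest, subject to a minimum separation time, and reports the resulting delay. The separation value, ETAs, and runway names are hypothetical, and the sketch does not model the allocation of delay between high- and low-altitude airspace that the paper addresses.

      def schedule_arrivals(etas, runways, separation_s=90):
          """Greedy first-come-first-served scheduler: each aircraft gets the runway
          that lets it land earliest, no earlier than its ETA and no closer than
          separation_s seconds to the previous landing on that runway."""
          next_free = {r: 0.0 for r in runways}
          schedule = []
          for flight, eta in sorted(etas.items(), key=lambda kv: kv[1]):
              runway = min(runways, key=lambda r: max(eta, next_free[r]))
              landing = max(eta, next_free[runway])
              next_free[runway] = landing + separation_s
              schedule.append((flight, runway, landing, landing - eta))  # delay in seconds
          return schedule

      # Hypothetical ETAs (seconds from now) and runway names.
      etas = {"AAL12": 0, "UAL7": 30, "DAL3": 45, "SWA9": 50}
      for flight, rwy, t, delay in schedule_arrivals(etas, ["27L", "27R"]):
          print(f"{flight}: runway {rwy}, lands at t={t:.0f}s, delay {delay:.0f}s")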

  10. Automated extraction of fine features of kinetochore microtubules and plus-ends from electron tomography volume.

    PubMed

    Jiang, Ming; Ji, Qiang; McEwen, Bruce F

    2006-07-01

    Kinetochore microtubules (KMTs) and the associated plus-ends have been areas of intense investigation in both cell biology and molecular medicine. Though electron tomography opens up new possibilities in understanding their function by imaging their high-resolution structures, the interpretation of the acquired data remains an obstacle because of the complex and cluttered cellular environment. As a result, practical segmentation of the electron tomography data has been dominated by manual operation, which is time consuming and subjective. In this paper, we propose a model-based automated approach to extracting KMTs and the associated plus-ends with a coarse-to-fine scale scheme consisting of volume preprocessing, microtubule segmentation and plus-end tracing. In volume preprocessing, we first apply an anisotropic invariant wavelet transform and a tube-enhancing filter to enhance the microtubules at coarse level for localization. This is followed with a surface-enhancing filter to accentuate the fine microtubule boundary features. The microtubule body is then segmented using a modified active shape model method. Starting from the segmented microtubule body, the plus-ends are extracted with a probabilistic tracing method improved with rectangular window based feature detection and the integration of multiple cues. Experimental results demonstrate that our automated method produces results comparable to manual segmentation but using only a fraction of the manual segmentation time. PMID:16830922

  11. A diagnostic electronic reporting framework proposal using preassigned automated coded phrases.

    PubMed

    Karpouzou, Lamprini; Mylonakis, John; Evripiotis, Michalis; Mainta, Evgenia; Vasileiou, Panayiotis

    2013-03-01

    Radiologists diagnose a large number of chest X-rays daily, and it is crucial that these reports are appropriately recorded, meaningfully indexed, carefully stored, easily retrieved, shared and printed. The absence of organized report storage does not permit direct and easy retrieval, and after roughly a year a handwritten or typed report deteriorates and may no longer be readable. The scope of this paper is to evaluate and propose the use of preassigned automated coded phrases for chest X-ray electronic reporting in a Radiology Department. The research included 9,252 reports typed using the proposed method and 949 handwritten reports (later typed or not), which were used to compare the time spent on reporting with either method. The results showed that, even where the method could not be applied fully, there was a 90% reduction in the time spent by radiologists and secretarial staff in a Radiology Department, thereby facilitating the typing and management of the electronic archives. In addition, reprinting due to addendums or discrepancies was reduced fourfold with the proposed method compared with the previously used methods. In conclusion, the consistent application of preassigned automated coded reporting can be time saving, cost effective and environmentally friendly, saving paper and ink. PMID:23445706

  12. An automated method of quantifying ferrite microstructures using electron backscatter diffraction (EBSD) data.

    PubMed

    Shrestha, Sachin L; Breen, Andrew J; Trimby, Patrick; Proust, Gwénaëlle; Ringer, Simon P; Cairney, Julie M

    2014-02-01

    The identification and quantification of the different ferrite microconstituents in steels has long been a major challenge for metallurgists. Manual point counting from images obtained by optical and scanning electron microscopy (SEM) is commonly used for this purpose. While classification systems exist, the complexity of steel microstructures means that identifying and quantifying these phases is still a great challenge. Moreover, point counting is extremely tedious, time consuming, and subject to operator bias. This paper presents a new automated identification and quantification technique for the characterisation of complex ferrite microstructures by electron backscatter diffraction (EBSD). This technique takes advantage of the fact that different classes of ferrite exhibit preferential grain boundary misorientations, aspect ratios and mean misorientation, all of which can be detected using current EBSD software. These characteristics are set as criteria for identification and linked to grain size to determine the area fractions. The results of this method were evaluated by comparing the new automated technique with point counting results. The technique could easily be applied to a range of other steel microstructures. PMID:24291695

  13. DESIGN AND PRELIMINARY VALIDATION OF A RAPID AUTOMATED BIODOSIMETRY TOOL FOR HIGH THROUGHPUT RADIOLOGICAL TRIAGE.

    PubMed

    Chen, Youhua; Zhang, Jian; Wang, Hongliang; Garty, Guy; Xu, Yanping; Lyulko, Oleksandra V; Turner, Helen C; Randers-Pehrson, Gerhard; Simaan, Nabil; Yao, Y Lawrence; Brenner, D J

    2009-01-01

    This paper presents design, hardware, software, and parameter optimization for a novel robotic automation system. RABiT is a Rapid Automated Biodosimetry Tool for high throughput radiological triage. The design considerations guiding the hardware and software architecture are presented with focus on methods of communication, ease of implementation, and need for real-time control versus soft time control cycles. The design and parameter determination for a non-contact PVC capillary laser cutting system is presented. A novel approach for lymphocyte concentration estimation based on computer vision is reported. Experimental evaluations of the system components validate the success of our prototype system in achieving a throughput of 6,000 samples in a period of 18 hours. PMID:21258614

  14. The Chandra automated processing system: challenges, design enhancements, and lessons learned

    NASA Astrophysics Data System (ADS)

    Plummer, David; Grier, John; Masters, Sreelatha

    2006-06-01

    Chandra standard data processing involves hundreds of different types of data products and pipelines. Pipelines are initiated by different types of events or notifications and may depend upon many other pipelines for input data. The Chandra automated processing system (AP) was designed to handle the various notifications and orchestrate the pipeline processing. Certain data sets may require "special" handling that deviates slightly from the standard processing thread. Also, bulk reprocessing of data often involves new processing requirements. Most recently, a new type of processing to produce source catalogs has introduced requirements not anticipated by the original AP design. Managing these complex dependencies and evolving processing requirements in an efficient, flexible, and automated fashion presents many challenges. This paper describes the most significant of these challenges, the AP design changes required to address these issues and the lessons learned along the way.

  15. DESIGN AND PRELIMINARY VALIDATION OF A RAPID AUTOMATED BIODOSIMETRY TOOL FOR HIGH THROUGHPUT RADIOLOGICAL TRIAGE

    PubMed Central

    Chen, Youhua; Zhang, Jian; Wang, Hongliang; Garty, Guy; Xu, Yanping; Lyulko, Oleksandra V.; Turner, Helen C.; Randers-Pehrson, Gerhard; Simaan, Nabil; Yao, Y. Lawrence; Brenner, D. J.

    2010-01-01

    This paper presents design, hardware, software, and parameter optimization for a novel robotic automation system. RABiT is a Rapid Automated Biodosimetry Tool for high throughput radiological triage. The design considerations guiding the hardware and software architecture are presented with focus on methods of communication, ease of implementation, and need for real-time control versus soft time control cycles. The design and parameter determination for a non-contact PVC capillary laser cutting system is presented. A novel approach for lymphocyte concentration estimation based on computer vision is reported. Experimental evaluations of the system components validate the success of our prototype system in achieving a throughput of 6,000 samples in a period of 18 hours. PMID:21258614

  16. SLC polarized beam source electron optics design

    SciTech Connect

    Eppley, K.R.; Lavine, T.L.; Early, R.A.; Herrmannsfeldt, W.B.; Miller, R.H.; Schultz, D.C.; Spencer, C.M.; Yeremian, A.D.

    1991-05-01

    This paper describes the design of the beam-line from the polarized electron gun to the linac injector in the Stanford Linear Collider (SLC). The polarized electron source is a GaAs photocathode, requiring 10⁻¹¹-Torr-range pressure for adequate quantum efficiency and longevity. The photocathode is illuminated by 3-nsec-long laser pulses. The quality of the optics for the 160-kV beam is crucial since electron-stimulated gas desorption from beam loss in excess of 0.1% of the 20-nC pulses may poison the photocathode. Our design for the transport line consists of a differential pumping region isolated by a pair of valves. Focusing is provided by a pair of Helmholtz coils and by several iron-encased solenoidal lenses. Our optics design is based on beam transport simulations using 2½-D particle-in-cell codes to model the gun and to solve the fully-relativistic time-dependent equations of motion in three dimensions for electrons in the presence of azimuthally symmetric electromagnetic fields. 6 refs., 6 figs.

  17. Engineering Design and Automation in the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory.

    SciTech Connect

    Wantuck, P. J.; Hollen, R. M.

    2002-01-01

    This paper provides an overview of some design and automation-related projects ongoing within the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory. AET uses a diverse set of technical capabilities to develop and apply processes and technologies to applications for a variety of customers both internal and external to the Laboratory. The Advanced Recovery and Integrated Extraction System (ARIES) represents a new paradigm for the processing of nuclear material from retired weapon systems in an environment that seeks to minimize the radiation dose to workers. To achieve this goal, ARIES relies upon automation-based features to handle and process the nuclear material. Our Chemical Process Development Team specializes in fuzzy logic and intelligent control systems. Neural network technology has been utilized in some advanced control systems developed by team members. Genetic algorithms and neural networks have often been applied for data analysis. Enterprise modeling, or discrete event simulation, as well as chemical process simulation has been employed for chemical process plant design. Fuel cell research and development has historically been an active effort within the AET organization. Under the principal sponsorship of the Department of Energy, the Fuel Cell Team is now focusing on technologies required to produce fuel cell compatible feed gas from reformation of a variety of conventional fuels (e.g., gasoline, natural gas), principally for automotive applications. This effort involves chemical reactor design and analysis, process modeling, catalyst analysis, as well as full scale system characterization and testing. The group's Automation and Robotics team has at its foundation many years of experience delivering automated and robotic systems for nuclear, analytical chemistry, and bioengineering applications. As an integrator of commercial systems and a developer of unique custom-made systems, the team currently supports the automation

  18. Design of a small automated telescope for Indian universities

    NASA Astrophysics Data System (ADS)

    Anandaram, Mandayam N.; Kagali, B. A.

    We have constructed a computer-controlled telescope using a 0.36-m f/11 Celestron optical tube assembly for teaching and research applications. We have constructed a heavy-duty fork-type equatorial mount fitted with precision-machined 24-inch drive disks for both axes. These are friction driven by stepper motors through one-inch rollers. We have used an open-loop control system triggerable by an ST-4 CCD camera to acquire and track any target object. Our telescope can home in on any target within a range of two arc-minutes. We have employed a commercial stepper motor controller card, for which we have written user-friendly, PC-based telescope control software in C. Photometry using a solid-state photometer and imaging with an ST-6 CCD camera are possible. We consider this project suitable for those wishing to construct some parts of a telescope and understand the principles of operation. A simpler model of this telescope could use DC motors instead of stepper motors. We shall be happy to send our design diagrams and details to those interested. This project was funded by the DST, and was assisted by IUCAA, Pune.

  19. Imaging Electron Spectrometer (IES) Electron Preprocessor (EPP) Design

    NASA Technical Reports Server (NTRS)

    Fennell, J. F.; Osborn, J. V.; Christensen, John L. (Technical Monitor)

    2001-01-01

    The Aerospace Corporation developed the Electron PreProcessor (EPP) to support the Imaging Electron Spectrometer (IES) that is part of the RAPID experiment on the ESA/NASA CLUSTER mission. The purpose of the EPP is to collect raw data from the IES and perform processing and data compression on it before transferring it to the RAPID microprocessor system for formatting and transmission to the CLUSTER satellite data system. The report provides a short history of the RAPID and CLUSTER programs and describes the EPP design. Four EPP units were fabricated, tested, and delivered for the original CLUSTER program. These were destroyed during a launch failure. Four more EPP units were delivered for the CLUSTER II program. These were successfully launched and are operating nominally on orbit.

  20. Technologies and Designs for Electronic Nanocomputers

    NASA Technical Reports Server (NTRS)

    Montemerlo, Michael S.; Love, J. Christopher; Opiteck, Gregory J.; Goldhaber, David J.; Ellenbogen, James C.

    1995-01-01

    Diverse space-related applications have been proposed for microscopic and sub-microscopic structures, mechanisms, and 'organisms'. To govern their functions, many of these tiny systems will require even smaller, nanometer-scale programmable computers, i.e. 'nanocomputers' on-board. This paper provides an overview of the results of a nearly two-year study of the technologies and designs that presently are in development for electronic nanocomputers. Strengths and weaknesses of the various technologies and designs are discussed, as well as promising directions for remedying some of the present research issues in this area. The presentation is a synopsis of a longer MITRE review article on the same subject.

  1. Program Calculates Power Demands Of Electronic Designs

    NASA Technical Reports Server (NTRS)

    Cox, Brian

    1995-01-01

    The CURRENT computer program calculates the power requirements of electronic designs. For a given design, CURRENT reads in the applicable parts-list file and a file containing the current required by each part. The program then calculates the power required by the circuit at supply potentials of 5.5, 5.0, and 4.5 volts. CURRENT was written using the AWK utility for Sun4-series computers running SunOS 4.x and for IBM PC-series and compatible computers running MS-DOS. The Sun version of the program is NPO-19590; the PC version is NPO-19111.
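
    A rough Python sketch of the calculation the abstract describes is given below (the original program was written with the AWK utility). The assumed file formats (one reference-designator/part-number pair per line in the parts list, and part-number/current pairs in milliamps in the current file) are illustrative guesses, not the actual CURRENT input formats.

      import csv

      def load_currents(path):
          """Current file assumed to hold 'part_number,current_mA' rows."""
          with open(path, newline="") as f:
              return {part: float(ma) for part, ma in csv.reader(f)}

      def total_current_ma(parts_list_path, currents):
          """Parts list assumed to hold 'ref_designator,part_number' rows."""
          total = 0.0
          with open(parts_list_path, newline="") as f:
              for _ref, part in csv.reader(f):
                  total += currents.get(part, 0.0)
          return total

      def power_report(parts_list_path, current_file_path, supplies=(5.5, 5.0, 4.5)):
          i_ma = total_current_ma(parts_list_path, load_currents(current_file_path))
          return {v: v * i_ma / 1000.0 for v in supplies}  # watts at each supply voltage

      # Example: power_report("design.parts", "currents.csv")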

  2. On Abstractions and Simplifications in the Design of Human-Automation Interfaces

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Degani, Asaf; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, that is, with the given interface the user will be able to perform the specified tasks correctly. Second, the interface should be succinct. The report discusses the underlying concepts and the formal methods for this approach. Two examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.

  3. On Abstractions and Simplifications in the Design of Human-Automation Interfaces

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Degani, Asaf; Shafto, Michael; Meyer, George; Clancy, Daniel (Technical Monitor)

    2001-01-01

    This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, i.e., with the given interface the user will be able to perform the specified tasks correctly. Second, the interface should be as succinct as possible. The report discusses the underlying concepts and the formal methods for this approach. Several examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.

  4. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE PAGESBeta

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina; Bi, Changhao; Elsbree, Nick; Jiao, Hong; Kim, Jungkyu; Mathies, Richard; Keasling, Jay D.; Hillson, Nathan J.

    2016-02-02

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform’s capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Finally, taken together, we demonstrate the microfluidic platform’s potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  5. Automated Design of Propellant-Optimal, End-to-End, Low-Thrust Trajectories for Trojan Asteroid Tours

    NASA Technical Reports Server (NTRS)

    Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby

    2013-01-01

    The Sun-Jupiter Trojan asteroids are celestial bodies of great scientific interest as well as potential resources offering water and other mineral resources for longterm human exploration of the solar system. Previous investigations under this project have addressed the automated design of tours within the asteroid swarm. This investigation expands the current automation scheme by incorporating options for a complete trajectory design approach to the Trojan asteroids. Computational aspects of the design procedure are automated such that end-to-end trajectories are generated with a minimum of human interaction after key elements and constraints associated with a proposed mission concept are specified.

  6. Model-based automated extraction of microtubules from electron tomography volume.

    PubMed

    Jiang, Ming; Ji, Qiang; McEwen, Bruce F

    2006-07-01

    We propose a model-based automated approach to extracting microtubules from noisy electron tomography volume. Our approach consists of volume enhancement, microtubule localization, and boundary segmentation to exploit the unique geometric and photometric properties of microtubules. The enhancement starts with an anisotropic invariant wavelet transform to enhance the microtubules globally, followed by a three-dimensional (3-D) tube-enhancing filter based on the Weingarten matrix to further accentuate the tubular structures locally. The enhancement ends with a modified coherence-enhancing diffusion to complete the interruptions along the microtubules. The microtubules are then localized with a centerline extraction algorithm adapted for tubular objects. To perform segmentation, we modify and extend the active shape model method in a novel way. We first use 3-D local surface enhancement to characterize the microtubule boundary and improve shape searching by relating the boundary strength with the weight matrix of the searching error. We then integrate the active shape model with Kalman filtering to utilize the longitudinal smoothness along the microtubules. The segmentation improved in this way is robust against missing boundaries and outliers that are often present in the tomography volume. Experimental results demonstrate that our automated method produces results close to those of the manual process and uses only a fraction of the time of the latter. PMID:16871731

  7. Revisiting the Fully Automated Double-ring Infiltrometer using Open-source Electronics

    NASA Astrophysics Data System (ADS)

    Ong, J.; Werkema, D., Jr.; Lane, J. W.

    2012-12-01

    The double-ring infiltrometer (DRI) is commonly used for measuring soil hydraulic conductivity. However, constant-head DRI tests typically involve the use of Mariotte tubes, which can be problematic to set up and time-consuming to maintain and monitor during infiltration tests. Maheshwari (1996, Australian Journal of Soil Research, v. 34, p. 709-714) developed a method for eliminating Mariotte tubes in constant-head tests using a computer-controlled combination of water-level indicators and solenoids to maintain a near-constant head in the DRI. A pressure transducer mounted on a depth-to-volume calibrated tank measures the water delivery rates during the test, and data are saved on a hard drive or floppy disk. Here we use an inexpensive combination of pressure transducers, a microcontroller, and open-source electronics that eliminate the need for Mariotte tubes. The system automates DRI water delivery and data recording for both constant- and falling-head infiltration tests. The user has the option of supplying water to the DRI through a pressurized water system, a pump, or gravity feed. An LCD screen provides the user interface and allows observation of data for quality analysis in the field. The digital data are stored on a micro-SD card in standard column format for future retrieval and easy importing into conventional processing and plotting software. We show the results of infiltrometer tests using the automated system and a conventional Mariotte tube system conducted over test beds of uniform soils.

  8. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    NASA Astrophysics Data System (ADS)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided with CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature. These routine design tasks are highly susceptible to automation. Design automation is usually implemented with API tools which allow original software to be built for aiding different engineering activities. In this paper, original software worked out in order to automate engineering tasks at the stage of a product's geometrical shape design is presented. The elaborated software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helicoidal involute gears. Moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in this way is its better representation of the involute curve in comparison to those drawn in the standard tools of specialized CAD systems. This comes from the fact that usually in CAD systems an involute curve is drawn through 3 points, corresponding to points located on the addendum circle, the reference diameter of the gear and the base circle respectively. In the Generator module the involute curve is drawn through 11 involute points located on and above the base and addendum circles, so the 3D gear wheel models are highly accurate. Application of the Generator module also makes the modelling process very rapid, reducing the gear wheel modelling time to several seconds. During the conducted research, an analysis of the differences between the standard 3-point and the 11-point involutes was made. The results and conclusions drawn from this analysis are presented in detail.
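
    The abstract contrasts a 3-point involute approximation with the Generator module's 11-point construction between the base and addendum circles. The sketch below generates n points of an involute from the base circle out to a given addendum radius using the standard parametric form x = r_b(cos t + t sin t), y = r_b(sin t - t cos t); the gear dimensions are illustrative and the code is not the NX SNAP implementation.

      import math

      def involute_points(base_radius, addendum_radius, n=11):
          """Return n (x, y) points on the involute of a circle of radius base_radius,
          from the base circle (t = 0) out to the addendum radius."""
          # Roll angle at which the involute reaches the addendum radius:
          # r(t) = r_b * sqrt(1 + t^2)  =>  t_max = sqrt((r_a / r_b)^2 - 1)
          t_max = math.sqrt((addendum_radius / base_radius) ** 2 - 1.0)
          pts = []
          for i in range(n):
              t = t_max * i / (n - 1)
              x = base_radius * (math.cos(t) + t * math.sin(t))
              y = base_radius * (math.sin(t) - t * math.cos(t))
              pts.append((x, y))
          return pts

      # Illustrative dimensions (mm): base circle 47 mm, addendum circle 52.5 mm.
      for x, y in involute_points(47.0, 52.5, n=11):
          print(f"{x:8.3f} {y:8.3f}")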

  9. An automated approach to calculating the daily dose of tacrolimus in electronic health records.

    PubMed

    Xu, Hua; Doan, Son; Birdwell, Kelly A; Cowan, James D; Vincz, Andrew J; Haas, David W; Basford, Melissa A; Denny, Joshua C

    2010-01-01

    Clinical research often requires extracting detailed drug information, such as medication names and dosages, from Electronic Health Records (EHR). Since medication information is often recorded as both structured and unstructured formats in the EHR, extracting all the relevant drug mentions and determining the daily dose of a medication for a selected patient at a given date can be a challenging and time-consuming task. In this paper, we present an automated approach using natural language processing to calculate daily doses of medications mentioned in clinical text, using tacrolimus as a test case. We evaluated this method using data sets from four different types of unstructured clinical data. Our results showed that the system achieved precisions of 0.90-1.00 and recalls of 0.81-1.00. PMID:21347153
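
    As an illustration of the kind of text-to-daily-dose computation the abstract describes, the toy sketch below pulls dose and frequency out of simple medication strings and sums them into a daily dose. The regular expression and the frequency vocabulary are assumptions for illustration only and are far simpler than the natural language processing system evaluated in the paper.

      import re

      # Toy frequency vocabulary (times per day). Real clinical text is far messier.
      FREQ_PER_DAY = {"once daily": 1, "qd": 1, "twice daily": 2, "bid": 2,
                      "three times daily": 3, "tid": 3, "qam": 1, "qpm": 1, "qhs": 1}

      DOSE_RE = re.compile(
          r"tacrolimus\s+(?P<dose>\d+(?:\.\d+)?)\s*mg\s+(?P<freq>" +
          "|".join(re.escape(f) for f in FREQ_PER_DAY) + r")",
          re.IGNORECASE)

      def daily_dose_mg(note_text):
          """Sum dose * frequency over every tacrolimus mention found in the note."""
          total = 0.0
          for m in DOSE_RE.finditer(note_text):
              total += float(m.group("dose")) * FREQ_PER_DAY[m.group("freq").lower()]
          return total

      print(daily_dose_mg("Take tacrolimus 1.5 mg BID and tacrolimus 0.5 mg QHS."))
      # -> 3.5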

  10. Tevatron electron lenses: Design and operation

    NASA Astrophysics Data System (ADS)

    Shiltsev, Vladimir; Bishofberger, Kip; Kamerdzhiev, Vsevolod; Kozub, Sergei; Kufer, Matthew; Kuznetsov, Gennady; Martinez, Alexander; Olson, Marvin; Pfeffer, Howard; Saewert, Greg; Scarpine, Vic; Seryi, Andrey; Solyak, Nikolai; Sytnik, Veniamin; Tiunov, Mikhail; Tkachenko, Leonid; Wildman, David; Wolff, Daniel; Zhang, Xiao-Long

    2008-10-01

    The beam-beam effects have been the dominating sources of beam loss and lifetime limitations in the Tevatron proton-antiproton collider [V. Shiltsev, Phys. Rev. ST Accel. Beams 8, 101001 (2005)]. Electron lenses were originally proposed for compensation of electromagnetic long-range and head-on beam-beam interactions of proton and antiproton beams [V. Shiltsev, Phys. Rev. ST Accel. Beams 2, 071001 (1999)]. Results of successful employment of two electron lenses built and installed in the Tevatron are reported by Shiltsev et al. [Phys. Rev. Lett. 99, 244801 (2007); New J. Phys. 10, 043042 (2008)] and by Zhang et al. [Phys. Rev. ST Accel. Beams 11, 051002 (2008)]. In this paper we present design features of the Tevatron electron lenses (TELs), discuss the generation of electron beams, describe different modes of operation, and outline the technical parameters of various subsystems.

  11. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSEs) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification, derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance; it also supports the creation of usable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct and aid in finding obscure coding errors, but they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  12. A structural study of cyanotrichite from Dachang by conventional and automated electron diffraction

    NASA Astrophysics Data System (ADS)

    Ventruti, Gennaro; Mugnaioli, Enrico; Capitani, Giancarlo; Scordari, Fernando; Pinto, Daniela; Lausi, Andrea

    2015-09-01

    The crystal structure of cyanotrichite, with general formula Cu4Al2(SO4)(OH)12·2H2O, from the Dachang deposit (China) was studied by means of conventional transmission electron microscopy, automated electron diffraction tomography (ADT) and synchrotron X-ray powder diffraction (XRPD). ADT revealed the presence of two different cyanotrichite-like phases. The same phases were also recognized in the XRPD pattern, allowing all peaks to be indexed and leading, after refinement, to the following cell parameters: (1) a = 12.417(2) Å, b = 2.907(1) Å, c = 10.157(1) Å and β = 98.12(1)°; (2) a = 12.660(2) Å, b = 2.897(1) Å, c = 10.162(1) Å and β = 92.42(1)°. Only for the former phase, labeled cyanotrichite-98, was a partial structure, corresponding to the [Cu4Al2(OH)12]2+ cluster, obtained ab initio by direct methods in space group C2/m on the basis of the electron diffraction data. Geometric and charge-balance considerations allowed the complete structure model of the cyanotrichite-98 phase to be derived. The sulfate group and the water molecule are found to be statistically disordered over two possible positions, while keeping the average structure consistent with the C-centring symmetry, in agreement with the ADT results.

  13. Three-dimensional rotation electron diffraction: software RED for automated data collection and data processing.

    PubMed

    Wan, Wei; Sun, Junliang; Su, Jie; Hovmöller, Sven; Zou, Xiaodong

    2013-12-01

    Implementation of a computer program package for automated collection and processing of rotation electron diffraction (RED) data is described. The software package contains two computer programs: RED data collection and RED data processing. The RED data collection program controls the transmission electron microscope and the camera. Electron beam tilts at a fine step (0.05-0.20°) are combined with goniometer tilts at a coarse step (2.0-3.0°) around a common tilt axis, which allows a fine relative tilt to be achieved between the electron beam and the crystal in a large tilt range. An electron diffraction (ED) frame is collected at each combination of beam tilt and goniometer tilt. The RED data processing program processes three-dimensional ED data generated by the RED data collection program or by other approaches. It includes shift correction of the ED frames, peak hunting for diffraction spots in individual ED frames and identification of these diffraction spots as reflections in three dimensions. Unit-cell parameters are determined from the positions of reflections in three-dimensional reciprocal space. All reflections are indexed, and finally a list with hkl indices and intensities is output. The data processing program also includes a visualizer to view and analyse three-dimensional reciprocal lattices reconstructed from the ED frames. Details of the implementation are described. Data collection and data processing with the software RED are demonstrated using a calcined zeolite sample, silicalite-1. The structure of the calcined silicalite-1, with 72 unique atoms, could be solved from the RED data by routine direct methods. PMID:24282334
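
    To illustrate the tilt strategy described above (fine electron-beam tilts nested inside coarse goniometer tilts about a common axis), the sketch below enumerates one possible acquisition schedule. The step sizes and tilt range fall within the ranges quoted in the abstract, but the schedule itself and the frame-numbering convention are assumptions, not the RED implementation.

      def red_tilt_schedule(goni_min=-60.0, goni_max=60.0,
                            goni_step=2.0, beam_step=0.10):
          """Yield (frame, goniometer_tilt, beam_tilt, total_tilt) tuples so that
          the combined tilt covers the range in steps of beam_step degrees."""
          frame = 0
          n_beam = int(round(goni_step / beam_step))
          goni = goni_min
          while goni < goni_max - 1e-9:
              for j in range(n_beam):
                  beam = j * beam_step
                  yield frame, goni, beam, goni + beam
                  frame += 1
              goni += goni_step

      # First few frames of a hypothetical schedule:
      for rec in list(red_tilt_schedule())[:5]:
          print("frame %d: goniometer %+6.2f deg, beam %+5.2f deg, total %+6.2f deg" % rec)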

  14. Adaptive and Adaptable Automation Design: A Critical Review of the Literature and Recommendations for Future Research

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kaber, David B.

    2006-01-01

    This report presents a review of literature on approaches to adaptive and adaptable task/function allocation and adaptive interface technologies for effective human management of complex systems that are likely to be issues for the Next Generation Air Transportation System, and a focus of research under the Aviation Safety Program, Integrated Intelligent Flight Deck Project. Contemporary literature retrieved from an online database search is summarized and integrated. The major topics include the effects of delegation-type, adaptable automation on human performance, workload and situation awareness, the effectiveness of various automation invocation philosophies and strategies to function allocation in adaptive systems, and the role of user modeling in adaptive interface design and the performance implications of adaptive interface technology.

  15. Electronic engine 'mockup' shortens design time

    NASA Astrophysics Data System (ADS)

    Blaschke, J. C.; Jatzek, H. A., Jr.

    1985-01-01

    CAD systems approaches being used by one engine manufacturing company to display the engine and all accessories are described. The electronic mock-ups of engines under development permits examining the entire engine and its support equipment without resorting to a manufactured model. The Geomod program module allows definitions of sections of tubing in appropriate places and subsequent generations of a solid model, or cabling sections in a wire frame representation. Interferences are automatically identified, even when the system is accessed by several designers simultaneously. Storage of the data provides for representational printouts and plots for subcontractors.

  16. Provider acceptance of an automated electronic alert for acute kidney injury

    PubMed Central

    Oh, Janice; Bia, Joshua R.; Ubaid-Ullah, Muhamad; Testani, Jeffrey M.; Wilson, Francis Perry

    2016-01-01

    Background Clinical decision support systems, including electronic alerts, ideally provide immediate and relevant patient-specific information to improve clinical decision-making. Despite the growing capabilities of such alerts in conjunction with an expanding electronic medical record, there is a paucity of information regarding their perceived usefulness. We surveyed healthcare providers' opinions concerning the practicality and efficacy of a specific text-based automated electronic alert for acute kidney injury (AKI) in a single hospital during a randomized trial of AKI alerts. Methods Providers who had received at least one electronic AKI alert in the previous 6 months, as part of a separate randomized controlled trial (clinicaltrials.gov #01862419), were asked to complete a survey concerning their opinions about this specific AKI alert system. Individual approval of the alert system was defined by a provider's desire to continue receiving the alert after termination of the trial. Results A total of 98 individuals completed the survey, including 62 physicians, 27 pharmacists and 7 non-physician providers. Sixty-nine percent of responders approved the alert, with no significant difference among the various professions (P = 0.28). Alert approval was strongly correlated with the belief that the alerts improved patient care (P < 0.0001), and negatively correlated with the belief that alerts did not provide novel information (P = 0.0001). With each additional 30 days of trial duration, odds of approval decreased by 20% (3–35%) (P = 0.02). Conclusions The alert system was generally well received, although approval waned with time. Approval was correlated with the belief that this type of alert improved patient care. These findings suggest that perceived efficacy is critical to the success of future alert trials. PMID:27478598

  17. Using Tomoauto: A Protocol for High-throughput Automated Cryo-electron Tomography.

    PubMed

    Morado, Dustin R; Hu, Bo; Liu, Jun

    2016-01-01

    Cryo-electron tomography (Cryo-ET) is a powerful three-dimensional (3-D) imaging technique for visualizing macromolecular complexes in their native context at a molecular level. The technique involves initially preserving the sample in its native state by rapidly freezing the specimen in vitreous ice, then collecting a series of micrographs from different angles at high magnification, and finally computationally reconstructing a 3-D density map. The frozen-hydrated specimen is extremely sensitive to the electron beam and so micrographs are collected at very low electron doses to limit the radiation damage. As a result, the raw cryo-tomogram has a very low signal to noise ratio characterized by an intrinsically noisy image. To better visualize subjects of interest, conventional imaging analysis and sub-tomogram averaging in which sub-tomograms of the subject are extracted from the initial tomogram and aligned and averaged are utilized to improve both contrast and resolution. Large datasets of tilt-series are essential to understanding and resolving the complexes at different states, conditions, or mutations as well as obtaining a large enough collection of sub-tomograms for averaging and classification. Collecting and processing this data can be a major obstacle preventing further analysis. Here we describe a high-throughput cryo-ET protocol based on a computer-controlled 300kV cryo-electron microscope, a direct detection device (DDD) camera and a highly effective, semi-automated image-processing pipeline software wrapper library tomoauto developed in-house. This protocol has been effectively utilized to visualize the intact type III secretion system (T3SS) in Shigella flexneri minicells. It can be applicable to any project suitable for cryo-ET. PMID:26863591

  18. Evidence Report, Risk of Inadequate Design of Human and Automation/Robotic Integration

    NASA Technical Reports Server (NTRS)

    Zumbado, Jennifer Rochlis; Billman, Dorrit; Feary, Mike; Green, Collin

    2011-01-01

    The success of future exploration missions depends, even more than today, on effective integration of humans and technology (automation and robotics). This will not emerge by chance, but by design. Both crew and ground personnel will need to do more demanding tasks in more difficult conditions, amplifying the costs of poor design and the benefits of good design. This report has looked at the importance of good design and the risks from poor design from several perspectives: 1) If the relevant functions needed for a mission are not identified, then designs of technology and its use by humans are unlikely to be effective: critical functions will be missing and irrelevant functions will mislead or drain attention. 2) If functions are not distributed effectively among the (multiple) participating humans and automation/robotic systems, later design choices can do little to repair this: additional unnecessary coordination work may be introduced, workload may be redistributed to create problems, limited human attentional resources may be wasted, and the capabilities of both humans and technology underused. 3) If the design does not promote accurate understanding of the capabilities of the technology, the operators will not use the technology effectively: the system may be switched off in conditions where it would be effective, or used for tasks or in contexts where its effectiveness may be very limited. 4) If an ineffective interaction design is implemented and put into use, a wide range of problems can ensue. Many involve lack of transparency into the system: operators may be unable or find it very difficult to determine a) the current state and changes of state of the automation or robot, b) the current state and changes in state of the system being controlled or acted on, and c) what actions by human or by system had what effects. 5) If the human interfaces for operation and control of robotic agents are not designed to accommodate the unique points of view and

  19. Automated Detection and Segmentation of Synaptic Contacts in Nearly Isotropic Serial Electron Microscopy Images

    PubMed Central

    Kreshuk, Anna; Straehle, Christoph N.; Sommer, Christoph; Koethe, Ullrich; Cantoni, Marco; Knott, Graham; Hamprecht, Fred A.

    2011-01-01

    We describe a protocol for fully automated detection and segmentation of asymmetric, presumed excitatory, synapses in serial electron microscopy images of the adult mammalian cerebral cortex, taken with the focused ion beam, scanning electron microscope (FIB/SEM). The procedure is based on interactive machine learning and only requires a few labeled synapses for training. The statistical learning is performed on geometrical features of 3D neighborhoods of each voxel and can fully exploit the high z-resolution of the data. On a quantitative validation dataset of 111 synapses in 409 images of 1948×1342 pixels with manual annotations by three independent experts the error rate of the algorithm was found to be comparable to that of the experts (0.92 recall at 0.89 precision). Our software offers a convenient interface for labeling the training data and the possibility to visualize and proofread the results in 3D. The source code, the test dataset and the ground truth annotation are freely available on the website http://www.ilastik.org/synapse-detection. PMID:22031814
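
    The pipeline described above trains a classifier on features of 3D voxel neighborhoods using only a few labeled synapses. A schematic version of that step is sketched below with scikit-learn; the feature choice (smoothed intensity and gradient magnitude at a few scales) and the random-forest classifier are stand-ins for the interactive ilastik workflow, not its actual implementation.

      import numpy as np
      from scipy import ndimage
      from sklearn.ensemble import RandomForestClassifier

      def voxel_features(volume, sigmas=(1.0, 2.0, 4.0)):
          """Stack simple per-voxel features: smoothed intensity and gradient
          magnitude at several scales (result shape: voxels x features)."""
          feats = []
          for s in sigmas:
              feats.append(ndimage.gaussian_filter(volume, s))
              feats.append(ndimage.gaussian_gradient_magnitude(volume, s))
          return np.stack([f.ravel() for f in feats], axis=1)

      def train_and_predict(volume, sparse_labels):
          """sparse_labels: same shape as volume, 0 = unlabeled, 1 = background,
          2 = synapse. Returns a per-voxel probability map for the synapse class."""
          X = voxel_features(volume)
          y = sparse_labels.ravel()
          clf = RandomForestClassifier(n_estimators=100, n_jobs=-1)
          clf.fit(X[y > 0], y[y > 0])  # train on the few labeled voxels only
          proba = clf.predict_proba(X)[:, list(clf.classes_).index(2)]
          return proba.reshape(volume.shape)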

  20. Automated Detection of Synapses in Serial Section Transmission Electron Microscopy Image Stacks

    PubMed Central

    Kreshuk, Anna; Koethe, Ullrich; Pax, Elizabeth; Bock, Davi D.; Hamprecht, Fred A.

    2014-01-01

    We describe a method for fully automated detection of chemical synapses in serial electron microscopy images with highly anisotropic axial and lateral resolution, such as images taken on transmission electron microscopes. Our pipeline starts from classification of the pixels based on 3D pixel features, which is followed by segmentation with an Ising model MRF and another classification step, based on object-level features. Classifiers are learned on sparse user labels; a fully annotated data subvolume is not required for training. The algorithm was validated on a set of 238 synapses in 20 serial 7197×7351 pixel images (4.5×4.5×45 nm resolution) of mouse visual cortex, manually labeled by three independent human annotators and additionally re-verified by an expert neuroscientist. The error rate of the algorithm (12% false negative, 7% false positive detections) is better than state-of-the-art, even though, unlike the state-of-the-art method, our algorithm does not require a prior segmentation of the image volume into cells. The software is based on the ilastik learning and segmentation toolkit and the vigra image processing library and is freely available on our website, along with the test data and gold standard annotations (http://www.ilastik.org/synapse-detection/sstem). PMID:24516550

  1. Fast Model Adaptation for Automated Section Classification in Electronic Medical Records.

    PubMed

    Ni, Jian; Delaney, Brian; Florian, Radu

    2015-01-01

    Medical information extraction is the automatic extraction of structured information from electronic medical records, where such information can be used to improve healthcare processes and medical decision making. In this paper, we study one important medical information extraction task called section classification. The objective of section classification is to automatically identify sections in a medical document and classify them into one of the pre-defined section types. Training section classification models typically requires large amounts of human-labeled training data to achieve high accuracy. Annotating institution-specific data, however, can be both expensive and time-consuming, which poses a big hurdle for adapting a section classification model to new medical institutions. In this paper, we apply two advanced machine learning techniques, active learning and distant supervision, to reduce annotation cost and achieve fast model adaptation for automated section classification in electronic medical records. Our experiment results show that active learning reduces the annotation cost and time by more than 50%, and that distant supervision can achieve good model accuracy using weakly labeled training data only. PMID:26262005
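
    The active-learning component described above can be summarized as an uncertainty-sampling loop: train on the current labels, score the unlabeled sections, and ask the annotator to label the examples the model is least sure about. A generic sketch follows; the logistic-regression model over TF-IDF features is an assumption for illustration, not the model used in the paper.

      import numpy as np
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression

      def active_learning_loop(sections, labels, oracle, rounds=5, batch=20):
          """sections: list of section texts; labels: dict index -> section type for
          the initially labeled seed set; oracle(i) returns a human label on request."""
          X = TfidfVectorizer(max_features=20000).fit_transform(sections)
          for _ in range(rounds):
              idx = sorted(labels)
              clf = LogisticRegression(max_iter=1000)
              clf.fit(X[idx], [labels[i] for i in idx])
              unlabeled = [i for i in range(len(sections)) if i not in labels]
              if not unlabeled:
                  break
              # Uncertainty sampling: lowest top-class probability among unlabeled items.
              conf = clf.predict_proba(X[unlabeled]).max(axis=1)
              for i in np.array(unlabeled)[np.argsort(conf)[:batch]]:
                  labels[int(i)] = oracle(int(i))  # query the annotator
          return clf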

  2. Crystallographic Tool Box (CrysTBox): automated tools for transmission electron microscopists and crystallographers

    PubMed Central

    Klinger, Miloslav; Jäger, Aleš

    2015-01-01

    Three tools for an automated analysis of electron diffraction pattern and crystallographic visualization are presented. Firstly, diffractGUI determines the zone axis from selected area diffraction, convergent beam diffraction or nanodiffraction patterns and allows for indexing of individual reflections. Secondly, ringGUI identifies crystallographic planes corresponding to the depicted rings in the ring diffraction pattern and can select the sample material from a list of candidates. Both diffractGUI and ringGUI employ methods of computer vision for a fast, robust and accurate analysis. Thirdly, cellViewer is an intuitive visualization tool which is also helpful for crystallographic calculations or educational purposes. diffractGUI and cellViewer can be used together during a transmission electron microscopy session to determine the sample holder tilts required to reach a desired zone axis. All the tools offer a graphical user interface. The toolbox is distributed as a standalone application, so it can be installed on the microscope computer and launched directly from DigitalMicrograph (Gatan Inc.). PMID:26664349

  3. Electronic prototyping

    NASA Technical Reports Server (NTRS)

    Hopcroft, J.

    1987-01-01

    The potential benefits of automation in space are significant. The science base needed to support this automation not only will help control costs and reduce lead-time in the earth-based design and construction of space stations, but also will advance the nation's capability for computer design, simulation, testing, and debugging of sophisticated objects electronically. Progress in automation will require the ability to electronically represent, reason about, and manipulate objects. Discussed here is the development of representations, languages, editors, and model-driven simulation systems to support electronic prototyping. In particular, it identifies areas where basic research is needed before further progress can be made.

  4. Effects of an Advanced Reactor’s Design, Use of Automation, and Mission on Human Operators

    SciTech Connect

    Jeffrey C. Joe; Johanna H. Oxstrand

    2014-06-01

    The roles, functions, and tasks of the human operator in existing light-water nuclear power plants (NPPs) are based on sound nuclear and human factors engineering (HFE) principles, are well defined by the plant’s conduct of operations, and have been validated by years of operating experience. However, advanced NPPs whose engineering designs differ from existing light-water reactors (LWRs) will impose changes on the roles, functions, and tasks of the human operators. The plans to increase the use of automation, reduce staffing levels, and add to the mission of these advanced NPPs will also affect the operator’s roles, functions, and tasks. We assert that these factors, which appear to have received considerably less attention from the design engineers of advanced NPPs than the reactors' conceptual design, can have significant risk implications for the operators and overall plant safety if not mitigated appropriately. This paper presents a high-level analysis of a specific advanced NPP and how its engineered design, its plan to use greater levels of automation, and its expanded mission have risk-significant implications for operator performance and overall plant safety.

  5. Tevatron Electron Lenses: Design and Operation

    SciTech Connect

    Shiltsev, Vladimir; Bishofberger, Kip; Kamerdzhiev, Vsevolod; Kozub, Sergei; Kufer, Matthew; Kuznetsov, Gennady; Martinez, Alexander; Olson, Marvin; Pfeffer, Howard; Saewert, Greg; Scarpine, Vic; Seryi, Andrei; Solyak, Nikolai; Sytnik, Veniamin; Tiunov, Mikhail; Tkachenko, Leonid; Wildman, David; Wolff, Daniel; Zhang, Xiao-Long; /Fermilab

    2011-09-12

    Fermilab's Tevatron is currently the world's highest energy accelerator in which tightly focused beams of 980 GeV protons and antiprotons collide at two dedicated interaction points (IPs). Both beams share the same beam pipe and magnet aperture and, in order to avoid multiple detrimental head-on collisions, the beams are placed on separated orbits everywhere except the main IPs by using high-voltage (HV) electrostatic separators. The electromagnetic beam-beam interaction at the main IPs together with the long-range interactions between separated beams adversely affect the collider performance, reducing the luminosity integral per store (period of continuous collisions) by 10-30%. Tuning the collider operation for optimal performance becomes more and more cumbersome as the beam intensities and luminosity increase. The long-range effects which (besides being nonlinear) vary from bunch to bunch are particularly hard to mitigate. A comprehensive review of the beam-beam effects in the Tevatron Collider Run II can be found in Ref. [1]. The beam-beam effects have been the dominating sources of beam loss and lifetime limitations in the Tevatron proton-antiproton collider [1]. Electron lenses were originally proposed for compensation of electromagnetic long-range and head-on beam-beam interactions of proton and antiproton beams [2]. Results of successful employment of two electron lenses built and installed in the Tevatron are reported in [3,4,5]. In this paper we present design features of the Tevatron electron lenses (TELs), discuss the generation of electron beams, describe different modes of operation and outline the technical parameters of various subsystems.

  6. Automated Generation of Finite-Element Meshes for Aircraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Li, Wu; Robinson, Jay

    2016-01-01

    This paper presents a novel approach for automated generation of fully connected finite-element meshes for all internal structural components and skins of a given wing-body geometry model, controlled by a few conceptual-level structural layout parameters. Internal structural components include spars, ribs, frames, and bulkheads. Structural layout parameters include spar/rib locations in wing chordwise/spanwise direction and frame/bulkhead locations in longitudinal direction. A simple shell thickness optimization problem with two load conditions is used to verify versatility and robustness of the automated meshing process. The automation process is implemented in ModelCenter starting from an OpenVSP geometry and ending with a NASTRAN 200 solution. One subsonic configuration and one supersonic configuration are used for numerical verification. Two different structural layouts are constructed for each configuration and five finite-element meshes of different sizes are generated for each layout. The paper includes various comparisons of solutions of 20 thickness optimization problems, as well as discussions on how the optimal solutions are affected by the stress constraint bound and the initial guess of design variables.
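
    The shell-thickness verification problem above reduces, in its simplest form, to sizing panel thicknesses for minimum mass subject to stress constraints. The sketch below shows that reduced problem with SciPy's SLSQP solver; the panel areas, loads, material properties, and bounds are placeholder values, and this is not the ModelCenter/NASTRAN workflow described in the record.

        import numpy as np
        from scipy.optimize import minimize

        # Placeholder data: per-panel area (m^2), running load (N/m), material limits.
        areas = np.array([0.8, 1.2, 0.6, 1.0])
        loads = np.array([2.0e5, 3.5e5, 1.5e5, 2.5e5])
        rho, sigma_allow = 2800.0, 350.0e6          # kg/m^3, Pa
        t_min, t0 = 1.0e-3, 5.0e-3                  # thickness bound / initial guess, m

        mass = lambda t: float(np.sum(rho * areas * t))
        # Membrane stress in each panel is load / thickness; require a positive margin.
        stress_margin = lambda t: sigma_allow - loads / t

        res = minimize(mass, x0=np.full(4, t0), method="SLSQP",
                       bounds=[(t_min, 0.05)] * 4,
                       constraints=[{"type": "ineq", "fun": stress_margin}])
        print("thicknesses (mm):", 1e3 * res.x, " mass (kg):", mass(res.x))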

  7. ECO fill: automated fill modification to support late-stage design changes

    NASA Astrophysics Data System (ADS)

    Davis, Greg; Wilson, Jeff; Yu, J. J.; Chiu, Anderson; Chuang, Yao-Jen; Yang, Ricky

    2014-03-01

    One of the most critical factors in achieving a positive return for a design is ensuring the design not only meets performance specifications, but also produces sufficient yield to meet the market demand. The goal of design for manufacturability (DFM) technology is to enable designers to address manufacturing requirements during the design process. While new cell-based, DP-aware, and net-aware fill technologies have emerged to provide the designer with automated fill engines that support these new fill requirements, design changes that arrive late in the tapeout process (as engineering change orders, or ECOs) can have a disproportionate effect on tapeout schedules, due to the complexity of replacing fill. If not handled effectively, the impacts on file size, run time, and timing closure can significantly extend the tapeout process. In this paper, the authors examine changes to design flow methodology, supported by new fill technology, that enable efficient, fast, and accurate adjustments to metal fill late in the design process. We present an ECO fill methodology coupled with the support of advanced fill tools that can quickly locate the portion of the design affected by the change, remove and replace only the fill in that area, while maintaining the fill hierarchy. This new fill approach effectively reduces run time, contains fill file size, minimizes timing impact, and minimizes mask costs due to ECO-driven fill changes, all of which are critical factors to ensuring time-to-market schedules are maintained.

  8. Electronic bolus design impacts on administration.

    PubMed

    Hentz, F; Umstätter, C; Gilaverte, S; Prado, O R; Silva, C J A; Monteiro, A L G

    2014-06-01

    Electronic identification of animals has become increasingly important worldwide to improve and ensure traceability. In warm and hot climates, such as Brazil, boluses can have advantages over ear tags as the internal devices reduce the risks of ear tag losses, tissue damage, and lesions on the ear. Electronic boluses, however, are often perceived as having negative characteristics, including reported difficulties of administration in small ruminants. This paper describes the factors associated with bolus design that affect the swallowing of a bolus in sheep. Other factors that might influence bolus swallowing time have also been considered. In addition, the effect of bolus design on its performance was evaluated. A total of 56 Suffolk ewes were used to assess the ease of administration and retention of 3 types of electronic ruminal boluses (mini, 11.5 × 58.0 mm and 21.7 g; small, 14.8 × 48.5 mm and 29.5 g; standard, 19.3 × 69.8 mm and 74.4 g) during a whole productive year, including pregnancy and lamb suckling. Ewe age (5.6 ± 2.3 yr) and weight (85.07 ± 8.2 kg BW) were recorded, as well as time for bolus swallowing. The deglutition of the bolus and any resulting blockages in the esophagus were monitored by visual observations. Retention and readability of the boluses were regularly monitored for d 1, wk 1, mo 1, and every mo until 1 yr. Time for bolus swallowing differed substantially with bolus type and was greater (P < 0.05) for the standard bolus (32.8 ± 6.9 s) when compared to small and mini boluses, which did not differ (8.5 ± 2.0 vs. 9.2 ± 2.7 s; P > 0.05). The bolus o.d. and length were positively correlated with swallowing time (P < 0.01). The ewe weight was negatively correlated with swallowing time (P < 0.05). At 6 mo all electronic boluses showed 100% retention rate, and at 12 mo, bolus retention was 100%, 94.5%, and 100% for mini, small, and standard boluses, respectively (P > 0.05). At 12 mo, all boluses showed 100% readability, except for

  9. A clinical trial comparing physician prompting with an unprompted automated electronic checklist to reduce empirical antibiotic utilization

    PubMed Central

    Weiss, Curtis H.; DiBardino, David; Rho, Jason; Sung, Nina; Collander, Brett; Wunderink, Richard G.

    2013-01-01

    Objective To determine whether face-to-face prompting of critical care physicians reduces empirical antibiotic utilization compared to an unprompted electronic checklist embedded within the electronic health record (EHR). Design Random allocation design. Setting Medical intensive care unit (MICU) with high-intensity intensivist coverage at a tertiary care urban medical center. Patients Two hundred ninety-six critically ill patients treated with at least one day of empirical antibiotics. Interventions For one MICU team, face-to-face prompting of critical care physicians if they did not address empirical antibiotic utilization during a patient’s daily rounds. On a separate MICU team, attendings and fellows were trained once to complete an EHR-embedded checklist daily for each patient, including a question asking whether listed empirical antibiotics could be discontinued. Measurements and main results Prompting led to a more than 4-fold increase in discontinuing or narrowing of empirical antibiotics compared to use of the electronic checklist. Prompted group patients had a lower proportion of patient-days on which empirical antibiotics were administered compared to electronic checklist group patients (63.1% vs. 70.0%, P=0.002). Mean proportion of antibiotic-days on which empirical antibiotics were used was also lower in the prompted group, although not statistically significant (0.78 [0.27] vs. 0.83 [0.27], P=0.093). Each additional day of empirical antibiotics predicted higher risk-adjusted mortality (odds ratio 1.14, 95% CI 1.05–1.23). Risk-adjusted ICU length of stay and hospital mortality were not significantly different between the two groups. Conclusions Face-to-face prompting was superior to an unprompted EHR-based checklist at reducing empirical antibiotic utilization. Sustained culture change may have contributed to the electronic checklist having similar empirical antibiotic utilization to a prompted group in the same MICU two years prior. Future studies

  10. Electronic cigarettes: product characterisation and design considerations

    PubMed Central

    Brown, Christopher J; Cheng, James M

    2014-01-01

    Objective To review the available evidence regarding electronic cigarette (e-cigarette) product characterisation and design features in order to understand their potential impact on individual users and on public health. Methods Systematic literature searches in 10 reference databases were conducted through October 2013. A total of 14 articles and documents and 16 patents were included in this analysis. Results Numerous disposable and reusable e-cigarette product options exist, representing wide variation in product configuration and component functionality. Common e-cigarette components include an aerosol generator, a flow sensor, a battery and a nicotine-containing solution storage area. e-cigarettes currently include many interchangeable parts, enabling users to modify the character of the delivered aerosol and, therefore, the product's ‘effectiveness’ as a nicotine delivery product. Materials in e-cigarettes may include metals, rubber and ceramics. Some materials may be aerosolised and have adverse health effects. Several studies have described significant performance variability across and within e-cigarette brands. Patent applications include novel product features designed to influence aerosol properties and e-cigarette efficiency at delivering nicotine. Conclusions Although e-cigarettes share a basic design, engineering variations and user modifications result in differences in nicotine delivery and potential product risks. e-cigarette aerosols may include harmful and potentially harmful constituents. Battery explosions and the risks of exposure to the e-liquid (especially for children) are also concerns. Additional research will enhance the current understanding of basic e-cigarette design and operation, aerosol production and processing, and functionality. A standardised e-cigarette testing regime should be developed to allow product comparisons. PMID:24732162

  11. Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.

    1992-01-01

    The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.

  12. Using tomoauto – a protocol for high-throughput automated cryo-electron tomography

    PubMed Central

    Morado, Dustin R.; Hu, Bo; Liu, Jun

    2016-01-01

    We present a protocol on how to utilize high-throughput cryo-electron tomography to determine high-resolution in situ structures of molecular machines. The protocol permits large amounts of data to be processed, avoids common bottlenecks and reduces resource downtime, allowing the user to focus on important biological questions. Cryo-electron tomography (Cryo-ET) is a powerful three-dimensional (3-D) imaging technique for visualizing macromolecular complexes in their native context at a molecular level. The technique involves initially preserving the sample in its native state by rapidly freezing the specimen in vitreous ice, then collecting a series of micrographs from different angles at high magnification, and finally computationally reconstructing a 3-D density map. The frozen-hydrated specimen is extremely sensitive to the electron beam, so micrographs are collected at very low electron doses to limit the radiation damage. As a result, the raw cryo-tomogram has a very low signal-to-noise ratio characterized by an intrinsically noisy image. To better visualize subjects of interest, conventional image analysis and sub-tomogram averaging, in which sub-tomograms of the subject are extracted from the initial tomogram, aligned, and averaged, are used to improve both contrast and resolution. Large datasets of tilt-series are essential to understanding and resolving the complexes at different states, conditions, or mutations as well as obtaining a large enough collection of sub-tomograms for averaging and classification. Collecting and processing this data can be a major obstacle preventing further analysis. Here we describe a high-throughput cryo-ET protocol based on a computer-controlled 300 kV cryo-electron microscope, a direct detection device (DDD) camera, and tomoauto, a highly effective, semi-automated image-processing pipeline wrapper library developed in-house. This protocol has been effectively utilized to visualize the intact type III

  13. Optimizing RF gun cavity geometry within an automated injector design system

    SciTech Connect

    Alicia Hofler ,Pavel Evtushenko

    2011-03-28

    RF guns play an integral role in the success of several light sources around the world, and properly designed and optimized cw superconducting RF (SRF) guns can provide a path to higher average brightness. As the need for these guns grows, it is important to have automated optimization software tools that vary the geometry of the gun cavity as part of the injector design process. This will allow designers to improve existing designs for present installations, extend the utility of these guns to other applications, and develop new designs. An evolutionary algorithm (EA) based system can provide this capability because EAs can search a large, often nonlinear parameter space in parallel and, in a relatively short time, identify promising regions of that space for more careful consideration. The injector designer can then evaluate more cavity design parameters during the injector optimization process against the beam performance requirements of the injector. This paper will describe an extension to the APISA software that allows the cavity geometry to be modified as part of the injector optimization and provide examples of its application to existing RF and SRF gun designs.
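
    As a rough illustration of how an evolutionary algorithm explores a cavity-geometry parameter space, the sketch below evolves a small population of geometry vectors against a placeholder figure of merit. The parameter names, bounds, and fitness function are invented for illustration; in practice the fitness call would wrap a field solver and beam-dynamics simulation, as in the APISA-based system described above.

        import numpy as np

        rng = np.random.default_rng(0)
        # Hypothetical geometry parameters: [iris radius, cell length, wall angle] (arb. units).
        lo = np.array([10.0, 40.0, 2.0])
        hi = np.array([40.0, 120.0, 12.0])

        def fitness(g):
            # Stand-in for an expensive solver run returning a beam-quality figure of merit.
            target = np.array([25.0, 80.0, 7.0])
            return -np.sum(((g - target) / (hi - lo)) ** 2)

        pop = rng.uniform(lo, hi, size=(30, 3))
        for gen in range(50):
            scores = np.array([fitness(g) for g in pop])
            parents = pop[np.argsort(scores)[-10:]]          # keep the 10 best individuals
            # Recombine random parent pairs and apply Gaussian mutation.
            idx = rng.integers(0, 10, size=(30, 2))
            alpha = rng.random((30, 1))
            children = alpha * parents[idx[:, 0]] + (1 - alpha) * parents[idx[:, 1]]
            children += rng.normal(0.0, 0.02 * (hi - lo), size=children.shape)
            pop = np.clip(children, lo, hi)
        best = pop[np.argmax([fitness(g) for g in pop])]
        print("best geometry:", best)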

  14. SP-100 shield design automation process using expert system and heuristic search techniques

    NASA Astrophysics Data System (ADS)

    Marcille, Thomas F.; Protsik, Robert; Deane, Nelson A.; Hoover, Darryl G.

    1993-01-01

    The SP-100 shield subsystem design process has been modified to utilize the GE Corporate Research and Development program, ENGINEOUS (Tong 1990). ENGINEOUS is a software system that automates the use of Computer Aided Engineering (CAE) analysis programs in the engineering design process. The shield subsystem design process incorporates a nuclear subsystems design and performance code, a two-dimensional neutral particle transport code, several input processors, and two general-purpose neutronic output processors. Coupling these programs within ENGINEOUS provides automatic transition paths between applications, with no source code modifications. ENGINEOUS captures human design knowledge, as well as information about the specific CAE applications, and stores this information in knowledge base files. The knowledge base information is used by the ENGINEOUS expert system to drive knowledge-directed and knowledge-supplemented search modules to find an optimum shield design for a given reactor definition, ensuring that specified constraints are satisfied. Alternate designs, not accommodated in the optimization design rules, can readily be explored through the use of a parametric study capability.

  15. The accuracy of a designed software for automated localization of craniofacial landmarks on CBCT images

    PubMed Central

    2014-01-01

    Background Two-dimensional projection radiographs have been traditionally considered the modality of choice for cephalometric analysis. To overcome the shortcomings of two-dimensional images, three-dimensional computed tomography (CT) has been used to evaluate craniofacial structures. However, manual landmark detection depends on medical expertise, and the process is time-consuming. The present study was designed to produce software capable of automated localization of craniofacial landmarks on cone beam (CB) CT images based on image registration and to evaluate its accuracy. Methods The software was designed using MATLAB programming language. The technique was a combination of feature-based (principal axes registration) and voxel similarity-based methods for image registration. A total of 8 CBCT images were selected as our reference images for creating a head atlas. Then, 20 CBCT images were randomly selected as the test images for evaluating the method. Three experts twice located 14 landmarks in all 28 CBCT images during two examinations set 6 weeks apart. The differences in the distances of coordinates of each landmark on each image between manual and automated detection methods were calculated and reported as mean errors. Results The combined intraclass correlation coefficient for intraobserver reliability was 0.89 and for interobserver reliability 0.87 (95% confidence interval, 0.82 to 0.93). The mean errors of all 14 landmarks were <4 mm. Additionally, 63.57% of landmarks had a mean error of <3 mm compared with manual detection (gold standard method). Conclusion The accuracy of our approach for automated localization of craniofacial landmarks, which was based on combining feature-based and voxel similarity-based methods for image registration, was acceptable. Nevertheless we recommend repetition of this study using other techniques, such as intensity-based methods. PMID:25223399
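
    The feature-based part of the registration above (principal axes registration) amounts to aligning the centroids and the eigenvectors of the voxel-coordinate covariance of two volumes. The sketch below computes a rigid estimate from two binary head masks with NumPy; it is a simplified illustration rather than the authors' MATLAB implementation, and it ignores the axis sign and ordering ambiguities a full method must resolve before the voxel-similarity refinement step.

        import numpy as np

        def principal_axes(mask):
            """Centroid and principal axes of a binary 3-D mask."""
            coords = np.argwhere(mask)                 # (N, 3) voxel coordinates
            centroid = coords.mean(axis=0)
            cov = np.cov((coords - centroid).T)        # 3x3 covariance of coordinates
            evals, evecs = np.linalg.eigh(cov)         # columns of evecs are the axes
            order = np.argsort(evals)[::-1]            # sort by decreasing variance
            return centroid, evecs[:, order]

        def estimate_rigid(mask_ref, mask_test):
            """Rotation and translation mapping test-image coordinates onto the reference."""
            c_ref, A_ref = principal_axes(mask_ref)
            c_tst, A_tst = principal_axes(mask_test)
            R = A_ref @ A_tst.T                        # rotate test axes onto reference axes
            t = c_ref - R @ c_tst
            return R, t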

  16. New Insights into the Composition and Texture of Lunar Regolith Using Ultrafast Automated Electron-Beam Analysis

    NASA Technical Reports Server (NTRS)

    Rickman, Doug; Wentworth, Susan J.; Schrader, Christian M.; Stoeser, Doug; Botha, Pieter WSK; Butcher, Alan R.; Horsch, Hanna E.; Benedictus, Aukje; Gottlieb, Paul; McKay, David

    2008-01-01

    Sieved grain mounts of Apollo 16 drive tube samples have been examined using QEMSCAN - an innovative electron beam technology. By combining multiple energy-dispersive X-ray detectors, fully automated control, and off-line image processing, to produce digital mineral maps of particles exposed on polished surfaces, the result is an unprecedented quantity of mineralogical and petrographic data, on a particle-by-particle basis. Experimental analysis of four size fractions (500-250 microns, 150-90 microns, 75-45 microns and < 20 microns), prepared from two samples (64002,374 and 64002,262), has produced a robust and uniform dataset which allows for the quantification of mineralogy; texture; particle shape, size and density; and the digital classification of distinct particle types in each measured sample. These preliminary data show that there is a decrease in plagioclase modal content and an opposing increase in glass modal content, with decreasing particle size. These findings, together with data on trace phases (metals, sulphides, phosphates, and oxides), provide not only new insights into the make-up of lunar regolith at the Apollo 16 landing site, but also key physical parameters which can be used to design lunar simulants, and compute Figures of Merit for each material produced.

  17. Detection of Pharmacovigilance-Related Adverse Events Using Electronic Health Records and Automated Methods

    PubMed Central

    Haerian, K; Varn, D; Vaidya, S; Ena, L; Chase, HS; Friedman, C

    2013-01-01

    Electronic health records (EHRs) are an important source of data for detection of adverse drug reactions (ADRs). However, adverse events are frequently due not to medications but to the patients’ underlying conditions. Mining to detect ADRs from EHR data must account for confounders. We developed an automated method using natural-language processing (NLP) and a knowledge source to differentiate cases in which the patient’s disease is responsible for the event rather than a drug. Our method was applied to 199,920 hospitalization records, concentrating on two serious ADRs: rhabdomyolysis (n = 687) and agranulocytosis (n = 772). Our method automatically identified 75% of the cases, those with disease etiology. The sensitivity and specificity were 93.8% (confidence interval: 88.9-96.7%) and 91.8% (confidence interval: 84.0-96.2%), respectively. The method resulted in considerable saving of time: for every 1 h spent in development, there was a saving of at least 20 h in manual review. The review of the remaining 25% of the cases therefore became more feasible, allowing us to identify the medications that had caused the ADRs. PMID:22713699
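
    For reference, the sensitivity and specificity figures quoted above come directly from confusion-matrix counts against the manual review. The sketch below computes those rates, with a simple normal-approximation confidence interval, from hypothetical counts; it shows only the evaluation arithmetic, not the authors' NLP pipeline, and their reported intervals may use a different (e.g., exact binomial) method.

        import math

        def rate_with_ci(successes, total, z=1.96):
            """Proportion and a normal-approximation (Wald) 95% confidence interval."""
            p = successes / total
            half = z * math.sqrt(p * (1 - p) / total)
            return p, (max(0.0, p - half), min(1.0, p + half))

        # Hypothetical counts from a manual review sample.
        tp, fn = 120, 8    # disease-etiology cases correctly / incorrectly flagged
        tn, fp = 101, 9    # drug-etiology cases correctly / incorrectly left unflagged

        sens, sens_ci = rate_with_ci(tp, tp + fn)
        spec, spec_ci = rate_with_ci(tn, tn + fp)
        print(f"sensitivity {sens:.3f} CI {sens_ci}, specificity {spec:.3f} CI {spec_ci}")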

  18. Automated tracing of filaments in 3D electron tomography reconstructions using Sculptor and Situs.

    PubMed

    Rusu, Mirabela; Starosolski, Zbigniew; Wahle, Manuel; Rigort, Alexander; Wriggers, Willy

    2012-05-01

    The molecular graphics program Sculptor and the command-line suite Situs are software packages for the integration of biophysical data across spatial resolution scales. Herein, we provide an overview of recently developed tools relevant to cryo-electron tomography (cryo-ET), with an emphasis on functionality supported by Situs 2.7.1 and Sculptor 2.1.1. We describe a work flow for automatically segmenting filaments in cryo-ET maps including denoising, local normalization, feature detection, and tracing. Tomograms of cellular actin networks exhibit both cross-linked and bundled filament densities. Such filamentous regions in cryo-ET data sets can then be segmented using a stochastic template-based search, VolTrac. The approach combines a genetic algorithm and a bidirectional expansion with a tabu search strategy to localize and characterize filamentous regions. The automated filament segmentation by VolTrac compares well to a manual one performed by expert users, and it allows an efficient and reproducible analysis of large data sets. The software is free, open source, and can be used on Linux, Macintosh or Windows computers. PMID:22433493

  19. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles.

    PubMed

    Eom, Hwisoo; Lee, Sang Hun

    2015-01-01

    A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both of the models is/are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model. PMID:26076406

  20. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles

    PubMed Central

    Eom, Hwisoo; Lee, Sang Hun

    2015-01-01

    A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both of the models is/are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model. PMID:26076406

  1. Design and performance of an automated video-based laser beam alignment system

    SciTech Connect

    Rundle, W.J.; Kartz, M.W.; Bliss, E.S.; English, R.E. Jr.; Peterson, R.L.; Thompson, G.R.; Uhlich, D.M.

    1992-07-14

    This paper describes the design and performance of an automated, closed-loop, laser beam alignment system. Its function is to sense a beam alignment error in a laser beam transport system and automatically steer mirrors preceding the sensor location as required to maintain beam alignment. The laser beam is sampled by an optomechanical package which uses video cameras to sense pointing and centering errors. The camera outputs are fed to an image processing module, which includes video digitizers and uses image storage and software to sense the centroid of the image. Signals are sent through a VMEbus to an "optical device controller" (ODC), which drives stepper-motor actuators on mirror mounts preceding the beam-sampling location to return the beam alignment to the prescribed condition. Photodiodes are also used to extend the control bandwidth beyond that which is achievable with video cameras. This system has been operated at LLNL in the Atomic Vapor Laser Isotope Separation (AVLIS) program to maintain the alignment of copper and dye laser beams, the latter to within ±2 µrad in pointing and less than 1 mm in centering. The optomechanical design of the instrumented package, which includes lens, mirror, and video mounts in a rigid housing, the automated control system architecture, and the performance of this equipment are described.

  2. Design and performance of an automated video-based laser beam alignment system

    SciTech Connect

    Rundle, W.J.; Kartz, M.W.; Bliss, E.S.; English, R.E. Jr.; Peterson, R.L.; Thompson, G.R.; Uhlich, D.M.

    1992-07-14

    This paper describes the design and performance of an automated, closed-loop, laser beam alignment system. Its function is to sense a beam alignment error in a laser beam transport system and automatically steer mirrors preceding the sensor location as required to maintain beam alignment. The laser beam is sampled by an optomechanical package which uses video cameras to sense pointing and centering errors. The camera outputs are fed to an image processing module, which includes video digitizers and uses image storage and software to sense the centroid of the image. Signals are sent through a VMEbus to an "optical device controller" (ODC), which drives stepper-motor actuators on mirror mounts preceding the beam-sampling location to return the beam alignment to the prescribed condition. Photodiodes are also used to extend the control bandwidth beyond that which is achievable with video cameras. This system has been operated at LLNL in the Atomic Vapor Laser Isotope Separation (AVLIS) program to maintain the alignment of copper and dye laser beams, the latter to within ±2 µrad in pointing and less than 1 mm in centering. The optomechanical design of the instrumented package, which includes lens, mirror, and video mounts in a rigid housing, the automated control system architecture, and the performance of this equipment are described.
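
    The control loop described in both records above reduces to: measure an image centroid, compare it with a reference coordinate, and command mirror steps proportional to the error. A minimal sketch of one iteration is below, using an intensity-weighted centroid and an assumed steps-per-pixel calibration; camera and motor I/O are stubbed out, and the gain and calibration values are arbitrary.

        import numpy as np

        def centroid(frame, threshold=0.1):
            """Intensity-weighted centroid (row, col) of a video frame."""
            img = frame.astype(float)
            img[img < threshold * img.max()] = 0.0       # suppress background
            total = img.sum()
            rows, cols = np.indices(img.shape)
            return np.array([(rows * img).sum() / total, (cols * img).sum() / total])

        def alignment_step(frame, reference_px, steps_per_px, gain=0.5):
            """One closed-loop iteration: centroid error -> mirror step command."""
            error_px = centroid(frame) - np.asarray(reference_px)
            return -gain * steps_per_px * error_px       # (tip, tilt) motor steps

        # Example: a synthetic 480x640 frame with a Gaussian spot offset from center.
        y, x = np.indices((480, 640))
        frame = np.exp(-(((y - 250) ** 2) + ((x - 330) ** 2)) / (2 * 15.0 ** 2))
        print("commanded steps:", alignment_step(frame, (240, 320), steps_per_px=12.0))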

  3. Electronic Course Surveys: Does Automating Feedback and Reporting Give Better Results?

    ERIC Educational Resources Information Center

    Watt, Stuart; Simpson, Claire; McKillop, Chris; Nunn, Viv

    2002-01-01

    Describes an automated course evaluation process for a modular program in management at Britain's Open University. The system employs Web-based surveying but emphasizes the reporting process, allowing easy generation of word-processed reports. Discusses both expected and unanticipated implications arising from the automated evaluation system. (EV)

  4. An automated instrument for human STR identification: design, characterization, and experimental validation.

    PubMed

    Hurth, Cedric; Smith, Stanley D; Nordquist, Alan R; Lenigk, Ralf; Duane, Brett; Nguyen, David; Surve, Amol; Hopwood, Andrew J; Estes, Matthew D; Yang, Jianing; Cai, Zhi; Chen, Xiaojia; Lee-Edghill, John G; Moran, Nina; Elliott, Keith; Tully, Gillian; Zenhausern, Frederic

    2010-10-01

    The microfluidic integration of an entire DNA analysis workflow on a fully integrated miniaturized instrument is reported using lab-on-a-chip automation to perform DNA fingerprinting compatible with the CODIS standard relevant to the forensic community. The instrument aims to reduce the cost and duration, and to improve the ease of use, of a "sample-to-profile" analysis with no need for human intervention. The present publication describes the operation of the three major components of the system: the electronic control components, the microfluidic cartridge and CE microchip, and the optical excitation/detection module. Experimental details are given to characterize the level of performance, stability, reliability, accuracy, and sensitivity of the prototype system. A typical temperature profile from a PCR amplification process and an electropherogram of a commercial size standard (GeneScan 500™, Applied Biosystems) separation are shown to assess the relevance of the instrument to forensic applications. Finally, we present a profile from an automated integrated run where lysed cells from a buccal swab were introduced into the system and no further human intervention was required to complete the analysis. PMID:20931618

  5. Automated solar collector installation design including ability to define heterogeneous design preferences

    DOEpatents

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2013-01-08

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties to these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives. Embodiments may also include definition of one or more design apertures, each of which may correspond to boundaries in which solar collector layouts should comply with distinct sets of user-defined design preferences. Distinct apertures may provide heterogeneous regions of collector layout according to the user-defined design preferences.

  6. Automated solar collector installation design including ability to define heterogeneous design preferences

    DOEpatents

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2014-04-29

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties to these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives. Embodiments may also include definition of one or more design apertures, each of which may correspond to boundaries in which solar collector layouts should comply with distinct sets of user-defined design preferences. Distinct apertures may provide heterogeneous regions of collector layout according to the user-defined design preferences.

  7. Issues in the design of an executive controller shell for Space Station automation

    NASA Technical Reports Server (NTRS)

    Erickson, William K.; Cheeseman, Peter C.

    1986-01-01

    A major goal of NASA's Systems Autonomy Demonstration Project is to focus research in artificial intelligence, human factors, and dynamic control systems in support of Space Station automation. Another goal is to demonstrate the use of these technologies in real space systems, for both ground-based mission support and on-board operations. The design, construction, and evaluation of an intelligent autonomous system shell is recognized as an important part of the Systems Autonomy research program. This paper describes autonomous systems and executive controllers, outlines how these intelligent systems can be utilized within the Space Station, and discusses a number of key design issues that have been raised during some preliminary work to develop an autonomous executive controller shell at NASA Ames Research Center.

  8. ADS: A FORTRAN program for automated design synthesis: Version 1.10

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1985-01-01

    A new general-purpose optimization program for engineering design is described. ADS (Automated Design Synthesis - Version 1.10) is a FORTRAN program for solution of nonlinear constrained optimization problems. The program is segmented into three levels: strategy, optimizer, and one-dimensional search. At each level, several options are available so that a total of over 100 possible combinations can be created. Examples of available strategies are sequential unconstrained minimization, the Augmented Lagrange Multiplier method, and Sequential Linear Programming. Available optimizers include variable metric methods and the Method of Feasible Directions as examples, and one-dimensional search options include polynomial interpolation and the Golden Section method as examples. Emphasis is placed on ease of use of the program. All information is transferred via a single parameter list. Default values are provided for all internal program parameters such as convergence criteria, and the user is given a simple means to over-ride these, if desired.
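
    Of the one-dimensional search options named above, the Golden Section method is the simplest to illustrate. The sketch below is a standard textbook implementation in Python rather than the ADS FORTRAN code; it assumes the function is unimodal on the bracketing interval, and it re-evaluates the function each iteration for brevity.

        import math

        def golden_section(f, a, b, tol=1e-6):
            """Minimize a unimodal function f on [a, b] by golden-section search."""
            invphi = (math.sqrt(5.0) - 1.0) / 2.0        # 1/phi, about 0.618
            c, d = b - invphi * (b - a), a + invphi * (b - a)
            while (b - a) > tol:
                if f(c) < f(d):
                    b, d = d, c                          # minimum lies in [a, old d]
                    c = b - invphi * (b - a)
                else:
                    a, c = c, d                          # minimum lies in [old c, b]
                    d = a + invphi * (b - a)
            return 0.5 * (a + b)

        # Example: a one-dimensional step-length search along a descent direction.
        print(golden_section(lambda alpha: (alpha - 0.37) ** 2 + 1.0, 0.0, 1.0))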

  9. Automated design of gravity-assist trajectories to Mars and the outer planets

    NASA Technical Reports Server (NTRS)

    Longuski, James M.; Williams, Steve N.

    1991-01-01

    In this paper, a new approach to planetary mission design is described which automates the search for gravity-assist trajectories. This method finds all conic solutions given a range of launch dates, a range of launch energies and a set of target planets. The new design tool is applied to the problems of finding multiple encounter trajectories to the outer planets and Venus gravity-assist trajectories to Mars. The last four-planet grand tour opportunity (until the year 2153) is identified. It requires an earth launch in 1996 and encounters Jupiter, Uranus, Neptune, and Pluto. Venus gravity-assist trajectories to Mars for the 30 year period 1995-2024 are examined. It is shown that in many cases these trajectories require less launch energy to reach Mars than direct ballistic trajectories.

  10. SIMPLE DESIGN FOR AUTOMATION OF TUNGSTEN(VI) OXIDE TECHNIQUE FOR MEASUREMENT OF NH3, AND HNO3

    EPA Science Inventory

    The tungstic acid technique for collection and analysis of NH3 and HNO3 concentrations in the ambient air has been automated in a simple and cost-effective design. The design allows complete separation of HNO3 and NH3 during detection. Unattended operation in field trials has bee...

  11. Space Electronics: A Challenging World for Designers

    NASA Technical Reports Server (NTRS)

    Poivey, Christian; LaBel, Kenneth A.

    2004-01-01

    This viewgraph presentation provides an overview of: 1) The Space Radiation Environment; 2) The Effects on Electronics; 3) The Environment in Action; 4) Hardening Approaches to Commercial CMOS Electronics (including device vulnerabilities).

  12. Rapid and Semi-Automated Extraction of Neuronal Cell Bodies and Nuclei from Electron Microscopy Image Stacks

    PubMed Central

    Holcomb, Paul S.; Morehead, Michael; Doretto, Gianfranco; Chen, Peter; Berg, Stuart; Plaza, Stephen; Spirou, George

    2016-01-01

    Connectomics—the study of how neurons wire together in the brain—is at the forefront of modern neuroscience research. However, many connectomics studies are limited by the time and precision needed to correctly segment large volumes of electron microscopy (EM) image data. We present here a semi-automated segmentation pipeline using freely available software that can significantly decrease segmentation time for extracting both nuclei and cell bodies from EM image volumes. PMID:27259933

  13. EMC in power electronics and PCB design

    NASA Astrophysics Data System (ADS)

    Zhu, Chentian

    This dissertation consists of two parts. Part I is about Electromagnetic Compatibility (EMC) in power electronics and Part II is about the Maximum Radiated Electromagnetic Emissions Calculator (MREMC), which is a software tool for EMC in printed circuit board (PCB) design. Switched-mode power converters can be significant sources of electromagnetic fields that interfere with the proper operation of nearby circuits or distant radio receivers. Part I of this dissertation provides comprehensive and organized information on the latest EMC developments in power converters. It describes and evaluates different technologies to ensure that power converters meet electromagnetic compatibility requirements. Chapters 2 and 3 describe EMC noise sources and coupling mechanisms in power converters. Chapter 4 reviews the measurements used to characterize and troubleshoot EMC problems. Chapters 5 -- 8 cover passive filter solutions, active filter solutions, noise cancellation methods and reduced-noise driving schemes. Part II describes the methods used, calculations made, and implementation details of the MREMC, which is a software tool that allows the user to calculate the maximum possible radiated emissions that could occur due to specific source geometries on a PCB. Chapters 9 -- 13 cover the I/O coupling EMI algorithm, common-mode EMI algorithm, power bus EMI algorithm, and differential-mode EMI algorithm used in the MREMC.
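
    As context for the differential-mode and common-mode algorithms listed above, the sketch below evaluates the familiar textbook far-field estimates for the maximum radiated emission from a small differential-mode current loop and from a common-mode current on a cable, with constants as given in standard EMC references (e.g., Clayton Paul). These are generic estimates and example numbers, not necessarily the exact models implemented in the MREMC.

        def e_diff_max(i_amps, f_hz, loop_area_m2, d_m):
            """Max differential-mode E-field (V/m) from a small loop at distance d."""
            return 1.316e-14 * i_amps * f_hz ** 2 * loop_area_m2 / d_m

        def e_cm_max(i_amps, f_hz, cable_len_m, d_m):
            """Max common-mode E-field (V/m) from a cable current at distance d."""
            return 1.257e-6 * i_amps * f_hz * cable_len_m / d_m

        # Example: 20 mA of 100 MHz differential current in a 10 cm x 1 cm loop,
        # and 10 uA of common-mode current on a 1 m cable, both measured at 3 m.
        print(e_diff_max(0.02, 100e6, 0.10 * 0.01, 3.0))   # ~ 8.8e-4 V/m
        print(e_cm_max(10e-6, 100e6, 1.0, 3.0))            # ~ 4.2e-4 V/m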

  14. The Design of the Mercury Electronic Library.

    ERIC Educational Resources Information Center

    Arms, William Y; And Others

    1992-01-01

    Describes the Mercury Electronic Library, a project at Carnegie Mellon University that involved development of software for an electronic library, implementation of the software by the university libraries, and stimulation of the market for electronic publishing. The library information system interface and databases and the computing system…

  15. Application of the H-Mode, a Design and Interaction Concept for Highly Automated Vehicles, to Aircraft

    NASA Technical Reports Server (NTRS)

    Goodrich, Kenneth H.; Flemisch, Frank O.; Schutte, Paul C.; Williams, Ralph A.

    2006-01-01

    Driven by increased safety, efficiency, and airspace capacity, automation is playing an increasing role in aircraft operations. As aircraft become increasingly able to autonomously respond to a range of situations with performance surpassing human operators, we are compelled to look for new methods that help us understand their use and guide their design using new forms of automation and interaction. We propose a novel design metaphor to aid the conceptualization, design, and operation of highly-automated aircraft. Design metaphors transfer meaning from common experiences to less familiar applications or functions. A notable example is the "Desktop metaphor" for manipulating files on a computer. This paper describes a metaphor for highly automated vehicles known as the H-metaphor and a specific embodiment of the metaphor known as the H-mode as applied to aircraft. The fundamentals of the H-metaphor are reviewed followed by an overview of an exploratory usability study investigating human-automation interaction issues for a simple H-mode implementation. The envisioned application of the H-mode concept to aircraft is then described as are two planned evaluations.

  16. Espina: A Tool for the Automated Segmentation and Counting of Synapses in Large Stacks of Electron Microscopy Images

    PubMed Central

    Morales, Juan; Alonso-Nanclares, Lidia; Rodríguez, José-Rodrigo; DeFelipe, Javier; Rodríguez, Ángel; Merchán-Pérez, Ángel

    2011-01-01

    The synapses in the cerebral cortex can be classified into two main types, Gray's type I and type II, which correspond to asymmetric (mostly glutamatergic excitatory) and symmetric (inhibitory GABAergic) synapses, respectively. Hence, the quantification and identification of their different types and the proportions in which they are found, is extraordinarily important in terms of brain function. The ideal approach to calculate the number of synapses per unit volume is to analyze 3D samples reconstructed from serial sections. However, obtaining serial sections by transmission electron microscopy is an extremely time consuming and technically demanding task. Using focused ion beam/scanning electron microscope microscopy, we recently showed that virtually all synapses can be accurately identified as asymmetric or symmetric synapses when they are visualized, reconstructed, and quantified from large 3D tissue samples obtained in an automated manner. Nevertheless, the analysis, segmentation, and quantification of synapses is still a labor intensive procedure. Thus, novel solutions are currently necessary to deal with the large volume of data that is being generated by automated 3D electron microscopy. Accordingly, we have developed ESPINA, a software tool that performs the automated segmentation and counting of synapses in a reconstructed 3D volume of the cerebral cortex, and that greatly facilitates and accelerates these processes. PMID:21633491

  17. Design and implementation of an automated compound management system in support of lead optimization.

    PubMed

    Quintero, Catherine; Kariv, Ilona

    2009-06-01

    To meet the needs of the increasingly rapid and parallelized lead optimization process, a fully integrated local compound storage and liquid handling system was designed and implemented to automate the generation of assay-ready plates directly from newly submitted and cherry-picked compounds. A key feature of the system is the ability to create project- or assay-specific compound-handling methods, which provide flexibility for any combination of plate types, layouts, and plate bar-codes. Project-specific workflows can be created by linking methods for processing new and cherry-picked compounds and control additions to produce a complete compound set for both biological testing and local storage in one uninterrupted workflow. A flexible cherry-pick approach allows for multiple, user-defined strategies to select the most appropriate replicate of a compound for retesting. Examples of custom selection parameters include available volume, compound batch, and number of freeze/thaw cycles. This adaptable and integrated combination of software and hardware provides a basis for reducing cycle time, fully automating compound processing, and ultimately increasing the rate at which accurate, biologically relevant results can be produced for compounds of interest in the lead optimization process. PMID:19487770

  18. Designs and concept reliance of a fully automated high-content screening platform.

    PubMed

    Radu, Constantin; Adrar, Hosna Sana; Alamir, Ab; Hatherley, Ian; Trinh, Trung; Djaballah, Hakim

    2012-10-01

    High-content screening (HCS) is becoming an accepted platform in academic and industry screening labs and does require slightly different logistics for execution. To automate our stand-alone HCS microscopes, namely, an alpha IN Cell Analyzer 3000 (INCA3000), originally a Praelux unit hooked to a Hudson Plate Crane with a maximum capacity of 50 plates per run, and the IN Cell Analyzer 2000 (INCA2000), in which up to 320 plates could be fed per run using the Thermo Fisher Scientific Orbitor, we opted for a 4 m linear track system harboring both microscopes, plate washer, bulk dispensers, and a high-capacity incubator allowing us to perform both live and fixed cell-based assays while accessing both microscopes on deck. Considerations in design were given to the integration of the alpha INCA3000, a new gripper concept to access the onboard nest, and peripheral locations on deck to ensure a self-reliant system capable of achieving higher throughput. The resulting system, referred to as Hestia, has been fully operational since the new year, has an onboard capacity of 504 plates, and harbors the only fully automated alpha INCA3000 unit in the world. PMID:22797489

  19. Algorithm design for automated transportation photo enforcement camera image and video quality diagnostic check modules

    NASA Astrophysics Data System (ADS)

    Raghavan, Ajay; Saha, Bhaskar

    2013-03-01

    Photo enforcement devices for traffic rules such as red lights, tolls, stops, and speed limits are increasingly being deployed in cities and counties around the world to ensure smooth traffic flow and public safety. These are typically unattended fielded systems, and so it is important to periodically check them for potential image/video quality problems that might interfere with their intended functionality. There is interest in automating such checks to reduce the operational overhead and human error involved in manually checking large camera device fleets. Examples of problems affecting such camera devices include exposure issues, focus drifts, obstructions, misalignment, download errors, and motion blur. Furthermore, in some cases, in addition to the sub-algorithms for individual problems, one also has to carefully design the overall algorithm and logic to check for and accurately classify these individual problems. Some of these issues can occur in tandem or have the potential to be confused for each other by automated algorithms. Examples include camera misalignment that can cause some scene elements to go out of focus for wide-area scenes or download errors that can be misinterpreted as an obstruction. Therefore, the sequence in which the sub-algorithms are utilized is also important. This paper presents an overview of these problems along with no-reference and reduced-reference image and video quality solutions to detect and classify such faults.
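
    The ordering concern discussed above (for example, running a download-error check before an obstruction or focus check) can be made concrete with a small, sequential set of no-reference checks. The sketch below uses simple NumPy statistics as stand-ins for the paper's sub-algorithms; the thresholds and check names are arbitrary placeholders.

        import numpy as np

        def diagnose(frame):
            """Return the first detected fault, checking in a deliberate order."""
            img = frame.astype(float)
            # 1. Download/transfer or gross exposure error first: a flat or saturated
            #    frame would confuse the later obstruction and focus checks.
            if img.std() < 1.0 or np.mean(img >= 250) > 0.5:
                return "download_or_exposure_error"
            # 2. Obstruction: a large, very dark region of the scene.
            if np.mean(img < 10) > 0.4:
                return "possible_obstruction"
            # 3. Focus drift / motion blur: low high-frequency (gradient) energy.
            gy, gx = np.gradient(img)
            if np.mean(gx ** 2 + gy ** 2) < 5.0:
                return "blur_or_focus_drift"
            return "ok"

        # Example with a synthetic noisy frame.
        rng = np.random.default_rng(1)
        print(diagnose(rng.integers(0, 255, size=(480, 640))))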

  20. Designs and Concept-Reliance of a Fully Automated High Content Screening Platform

    PubMed Central

    Radu, Constantin; Adrar, Hosna Sana; Alamir, Ab; Hatherley, Ian; Trinh, Trung; Djaballah, Hakim

    2013-01-01

    High content screening (HCS) is becoming an accepted platform in academic and industry screening labs and does require slightly different logistics for execution. To automate our stand alone HCS microscopes, namely an alpha IN Cell Analyzer 3000 (INCA3000) originally a Praelux unit hooked to a Hudson Plate Crane with a maximum capacity of 50 plates per run; and the IN Cell Analyzer 2000 (INCA2000) where up to 320 plates could be fed per run using the Thermo Fisher Scientific Orbitor, we opted for a 4 meter linear track system harboring both microscopes, plate washer, bulk dispensers, and a high capacity incubator allowing us to perform both live and fixed cell based assays while accessing both microscopes on deck. Considerations in design were given to the integration of the alpha INCA3000, a new gripper concept to access the onboard nest, and peripheral locations on deck to ensure a self reliant system capable of achieving higher throughput. The resulting system, referred to as Hestia, has been fully operational since the New Year, has an onboard capacity of 504 plates, and harbors the only fully automated alpha INCA3000 unit in the World. PMID:22797489

  1. Electronic-Power-Transformer Design Guide

    NASA Technical Reports Server (NTRS)

    Schwarze, G. E.; Lagadinos, J. C.; Ahearn, J. F.

    1983-01-01

    Compilation of information on design procedures, electrical properties, and fabrication. The guide covers design procedures; magnetic and insulating material electrical properties; and impregnating, encapsulating, and processing techniques.

  2. Design of an automated algorithm for labeling cardiac blood pool in gated SPECT images of radiolabeled red blood cells

    SciTech Connect

    Hebert, T.J. |; Moore, W.H.; Dhekne, R.D.; Ford, P.V.; Wendt, J.A.; Murphy, P.H.; Ting, Y.

    1996-08-01

    The design of an automated computer algorithm for labeling the cardiac blood pool within gated 3-D reconstructions of radiolabeled red blood cells is investigated. Due to patient functional abnormalities, limited resolution, and noise, certain spatial and temporal features of the cardiac blood pool that one would anticipate finding in every study are not present in certain frames or with certain patients. The labeling of the cardiac blood pool therefore requires an algorithm that relies only upon features present in all patients. The authors investigate the design of a fully automated region-growing algorithm for this purpose.
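
    Region growing, as used above for the blood pool, starts from a seed voxel and repeatedly absorbs connected neighbors that satisfy an inclusion criterion. The sketch below is a generic 6-connected, intensity-threshold version for a single 3-D frame; it illustrates the technique only, not the authors' feature-driven design, and the seed, thresholds, and test volume are assumed inputs.

        import numpy as np
        from collections import deque

        def region_grow(volume, seed, low, high):
            """Grow a 6-connected region from `seed`, keeping voxels in [low, high]."""
            grown = np.zeros(volume.shape, dtype=bool)
            queue = deque([tuple(seed)])
            grown[tuple(seed)] = True
            offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
            while queue:
                z, y, x = queue.popleft()
                for dz, dy, dx in offsets:
                    nz, ny, nx = z + dz, y + dy, x + dx
                    if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                            and 0 <= nx < volume.shape[2] and not grown[nz, ny, nx]
                            and low <= volume[nz, ny, nx] <= high):
                        grown[nz, ny, nx] = True
                        queue.append((nz, ny, nx))
            return grown

        # Example: a bright synthetic blob in a noisy 64^3 volume, grown from a central seed.
        rng = np.random.default_rng(2)
        vol = rng.normal(100.0, 5.0, size=(64, 64, 64))
        vol[24:40, 24:40, 24:40] += 80.0
        mask = region_grow(vol, seed=(32, 32, 32), low=150.0, high=250.0)
        print("voxels in region:", int(mask.sum()))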

  3. Design and development of a microarray processing station (MPS) for automated miniaturized immunoassays.

    PubMed

    Pla-Roca, Mateu; Altay, Gizem; Giralt, Xavier; Casals, Alícia; Samitier, Josep

    2016-08-01

    Here we describe the design and evaluation of a fluidic device for the automatic processing of microarrays, called the microarray processing station, or MPS. The microarray processing station, once installed on a commercial microarrayer, automates the washing and drying steps, which are often performed manually. The substrate where the assay occurs remains in place during the microarray printing, incubation, and processing steps; therefore, nL volumes of the distinct immunoassay reagents, such as capture and detection antibodies and samples, can be addressed to the same coordinate of the substrate in perfect alignment without requiring any additional mechanical or optical re-alignment methods. This allows the performance of independent immunoassays in a single microarray spot. PMID:27405464

  4. Using Dynamic Simulations and Automated Decision Tools to Design Lunar Habitats

    NASA Technical Reports Server (NTRS)

    Bell, Scott; Rodriguez, Luis; Kortenkamp, David

    2005-01-01

    This paper describes the role of transient simulations, heuristic techniques, and closed-loop integrated control in designing and sizing habitat life support systems. The integration of these three elements allows more accurate requirements to be derived in advance of hardware choices. As a test case, we used a typical lunar surface habitat. Large numbers of habitat configurations were rapidly tested and evaluated using automated decision support tools. Through this process, preliminary sizing for habitat life support systems was derived. Our preliminary results show that by using transient simulations and closed-loop control, we substantially reduced the system mass required to meet mission goals. This has broader implications for general systems analyses and for life support systems. It is likely that transient models, real-time integrated control, and other analyses capable of capturing the uncertainties of systems can be useful for systems analyses much earlier in the system development life cycle than has previously been considered.

  5. An automated calibration laboratory for flight research instrumentation: Requirements and a proposed design approach

    NASA Technical Reports Server (NTRS)

    Oneill-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  6. Research Initiatives and Preliminary Results In Automation Design In Airspace Management in Free Flight

    NASA Technical Reports Server (NTRS)

    Corker, Kevin; Lebacqz, J. Victor (Technical Monitor)

    1997-01-01

    NASA and the FAA have entered into a joint venture to explore, define, design, and implement a new airspace management operating concept. The fundamental premise of that concept is that technologies and procedures need to be developed for flight deck and ground operations to improve the efficiency, predictability, flexibility, and safety of airspace management and operations. To that end, NASA Ames has undertaken an initial development and exploration of "key concepts" in free flight airspace management technology development. Work has been undertaken on human factors issues in automation aiding design, coupled aiding systems between air and ground, communication protocols in distributed decision making, and analytic techniques for defining concepts of airspace density and operator cognitive load. This paper reports the progress of these efforts, which are not intended to definitively solve the many evolving issues of design for future ATM systems, but to provide preliminary results that chart the parameters of performance and the topology of the analytic effort required. The preliminary research on cockpit display of traffic information, dynamic density definition, distributed decision making, situation awareness models, and human performance models is discussed as it bears on the theme of "design requirements".

  7. Pilot opinions on high level flight deck automation issues: Toward the development of a design philosophy

    NASA Technical Reports Server (NTRS)

    Tenney, Yvette J.; Rogers, William H.; Pew, Richard W.

    1995-01-01

    There has been much concern in recent years about the rapid increase in automation on commercial flight decks. To explore these issues, a survey of pilots was conducted; it was composed of three major sections. The first section asked pilots to rate different automation components that exist on the latest commercial aircraft regarding their obtrusiveness and the attention and effort required in using them. The second section addressed general 'automation philosophy' issues. The third section focused on issues related to levels and amount of automation. The results indicate that pilots of advanced aircraft like their automation, use it, and would welcome more automation. However, they also believe that automation has many disadvantages, especially fully autonomous automation. They want their automation to be simple and reliable and to produce predictable results. The biggest needs for higher levels of automation were in pre-flight, communication, systems management, and task management functions, in planning as well as response tasks, and in high-workload situations. There is an irony and a challenge in the implications of these findings: on the one hand, pilots would like new automation to be simple and reliable, yet they need it to support the most complex part of their job--managing and planning tasks in high-workload situations.

  8. An introduction to the BANNING design automation system for shuttle microelectronic hardware development

    NASA Technical Reports Server (NTRS)

    Mcgrady, W. J.

    1979-01-01

    The BANNING MOS design system is presented. It complements rather than supplants the normal design activities associated with the design and fabrication of low-power digital electronic equipment. BANNING is user-oriented and requires no programming experience to use effectively. It provides the user with a simulation capability to aid in circuit design and eliminates most of the manual operations involved in the layout and artwork generation of integrated circuits. An example of its operation is given and some additional background reading is provided.

  9. Numerical and experimental analysis of a ducted propeller designed by a fully automated optimization process under open water condition

    NASA Astrophysics Data System (ADS)

    Yu, Long; Druckenbrod, Markus; Greve, Martin; Wang, Ke-qi; Abdel-Maksoud, Moustafa

    2015-10-01

    A fully automated optimization process is presented for the design of ducted propellers under open water conditions, including 3D geometry modeling, meshing, optimization algorithms and CFD analysis techniques. The developed process allows the direct integration of a RANSE solver in the design stage. A practical ducted propeller design case study is carried out for validation. Numerical simulations and open water tests were carried out and confirmed that the optimized ducted propeller improves hydrodynamic performance as predicted.

  10. Structure and design of the electron lens for RHIC

    SciTech Connect

    Pikin, A.; Fischer, W.; Alessi, J.; Anerella, M.; Beebe, E.; Gassner, D.; Gu, X.; Gupta, R.; Hock, J.; Jain, A.; Lambiase, R.; Luo, Y.; Montag, C.; Okamura, M.; Tan, Y.; Tuozzolo, J.; Thieberger, P.; Zhang, W.

    2011-03-28

    Two electron lenses for head-on beam-beam compensation are being planned for RHIC, one for each circulating proton beam. The transverse profile of the electron beam will be Gaussian up to a maximum radius of r_e = 3σ. Simulations and design of the electron gun with a Gaussian radial emission current density profile and of the electron collector are presented. Ions of the residual gas generated in the interaction region by the electron and proton beams will be removed by an axial gradient of the electric field towards the electron collector. A method for the optical observation of the transverse profile of the electron beam is described.

  11. Designing a beam transport system for RHIC's electron lens

    SciTech Connect

    Gu, X.; Pikin, A.; Okamura, M.; Fischer, W.; Luo, Y.; Gupta, R.; Hock, J.; Raparia, D.

    2011-03-28

    We designed two electron lenses to apply head-on beam-beam compensation for RHIC; they will be installed near IP10. The electron-beam transport system is an important subsystem of the entire electron-lens system. Electrons are transported from the electron gun to the main solenoid and further to the collector. The system must allow for changes of the electron beam size inside the superconducting magnet, and for changes of the electron position by 5 mm in the horizontal and vertical planes.

  12. Aircraft wing structural design optimization based on automated finite element modelling and ground structure approach

    NASA Astrophysics Data System (ADS)

    Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan

    2016-01-01

    An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.

  13. ADS: A FORTRAN program for automated design synthesis, version 1.00

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1984-01-01

    A new general-purpose optimization program for engineering design is described. ADS-1 (Automated Design Synthesis - Version 1) is a FORTRAN program for the solution of nonlinear constrained optimization problems. The program is segmented into three levels: strategy, optimizer, and one-dimensional search. At each level, several options are available, so that a total of over 100 possible combinations can be created. Examples of available strategies are sequential unconstrained minimization, the Augmented Lagrange Multiplier method, and Sequential Linear Programming. Available optimizers include variable metric methods and the Method of Feasible Directions; one-dimensional search options include polynomial interpolation and the Golden Section method. Emphasis is placed on ease of use of the program. All information is transferred via a single parameter list. Default values are provided for all internal program parameters such as convergence criteria, and the user is given a simple means to override these if desired. The program is demonstrated with a simple structural design example.
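
    For illustration, the Golden Section one-dimensional search named above can be sketched as follows (ADS itself is a FORTRAN program; this Python sketch, with an arbitrary test function and tolerance, only shows the idea):

      # Golden Section search for a unimodal 1-D function on [a, b].
      # The test function and tolerance below are arbitrary illustrations.
      import math

      def golden_section(f, a, b, tol=1e-6):
          invphi = (math.sqrt(5) - 1) / 2          # 1/phi ~ 0.618
          c = b - invphi * (b - a)
          d = a + invphi * (b - a)
          while (b - a) > tol:
              if f(c) < f(d):
                  b, d = d, c                      # minimum lies in [a, d]
                  c = b - invphi * (b - a)
              else:
                  a, c = c, d                      # minimum lies in [c, b]
                  d = a + invphi * (b - a)
          return 0.5 * (a + b)

      x_min = golden_section(lambda x: (x - 2.0) ** 2 + 1.0, 0.0, 5.0)
      print(round(x_min, 4))   # ~2.0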

  14. Data fusion-based design for automated fingerprint identification systems (AFIS)

    NASA Astrophysics Data System (ADS)

    Reisman, James G.; Thomopoulos, Stelios C.

    1998-07-01

    This paper presents a data fusion-based approach to designing an Automated Fingerprint Identification System (AFIS). Fingerprint matching methods range from pattern matching, using ridge structure, orientation, or even the entire fingerprint itself, to localized feature matching, using features such as ridge discontinuities (e.g., minutiae) or pore structures. Localized matching methods, such as those based on minutiae, tend to yield more compact templates, in general, than pattern-based methods. However, the reliability of localized features may be an issue, since they are affected adversely by the quality of the captured fingerprint, i.e., the degree of noise. Minutiae-based matching methods tend to be slower, albeit more accurate, than pattern-based methods. The trade-off in designing a cost-effective AFIS in terms of processing power (CPU) used, matching speed, and accuracy lies in the choice of the matching methods that are selected to optimize performance by maximizing the matching accuracy while minimizing the search time. In this paper we present a systematic design and study of a fusion-based AFIS using a multiplicity of matching methods to optimize system performance and minimize required CPU cost.
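
    A minimal sketch of score-level fusion in the spirit described above: a fast pattern-style matcher prunes the gallery and a slower minutiae-style matcher rescores the survivors. The matcher functions, pruning fraction, and weights are hypothetical placeholders, not the paper's design.

      # Score-level fusion sketch: a fast (coarse) matcher prunes candidates,
      # then a slower (fine) matcher scores the survivors; the final score is a
      # weighted combination.  Matchers, weights, and thresholds are hypothetical.
      def fused_identification(probe, gallery, fast_match, fine_match,
                               prefilter_keep=0.1, w_fast=0.3, w_fine=0.7):
          # Stage 1: cheap pattern-based score for every gallery template.
          coarse = [(tid, fast_match(probe, tmpl)) for tid, tmpl in gallery.items()]
          coarse.sort(key=lambda t: t[1], reverse=True)
          survivors = coarse[:max(1, int(len(coarse) * prefilter_keep))]

          # Stage 2: expensive minutiae-based score only for the short list.
          fused = [(tid, w_fast * s_fast + w_fine * fine_match(probe, gallery[tid]))
                   for tid, s_fast in survivors]
          return max(fused, key=lambda t: t[1])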

  15. Checking design conformance and optimizing manufacturability using automated double patterning decomposition

    NASA Astrophysics Data System (ADS)

    Cork, Chris; Ward, Brian; Barnes, Levi; Painter, Ben; Lucas, Kevin; Luk-Pat, Gerry; Wiaux, Vincent; Verhaegen, Staf; Maenhoudt, Mireille

    2008-03-01

    This paper uses an internally developed automated double patterning decomposition tool to investigate design compliance and describes a number of classes of non-conforming layout. The tool's results then help the designer achieve robust, design-compliant layouts.
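
    For illustration, double patterning decomposition can be viewed as two-coloring a conflict graph whose edges join features spaced closer than the same-mask limit; an odd conflict cycle is one class of non-conforming layout. A minimal sketch follows (the geometry, spacing value, and use of networkx are assumptions, not the internal tool described above):

      # Minimal double-patterning compliance check: build a conflict graph of
      # features that are closer than the same-mask spacing limit and test whether
      # it is 2-colorable (bipartite).  Geometry and spacing values are invented.
      import itertools
      import networkx as nx

      def dp_decompose(features, min_same_mask_space):
          """features: dict name -> (x, y) centroid; distances are illustrative."""
          g = nx.Graph()
          g.add_nodes_from(features)
          for (a, pa), (b, pb) in itertools.combinations(features.items(), 2):
              if ((pa[0] - pb[0]) ** 2 + (pa[1] - pb[1]) ** 2) ** 0.5 < min_same_mask_space:
                  g.add_edge(a, b)               # a and b must go on different masks
          if not nx.is_bipartite(g):
              return None                        # odd conflict cycle: non-compliant layout
          return nx.bipartite.color(g)           # 0 -> mask A, 1 -> mask B

      layout = {"f1": (0, 0), "f2": (40, 0), "f3": (20, 30)}
      print(dp_decompose(layout, min_same_mask_space=50))  # None: 3-cycle is not 2-colorable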

  16. Designing Electronic Performance Support Systems: Models and Instructional Strategies Employed

    ERIC Educational Resources Information Center

    Nekvinda, Christopher D.

    2011-01-01

    The purpose of this qualitative study was to determine whether instructional designers and performance technologists utilize instructional design models when designing and developing electronic performance support systems (EPSS). The study also explored whether these same designers were utilizing instructional strategies within their EPSS to support…

  17. Logic Design Pathology and Space Flight Electronics

    NASA Technical Reports Server (NTRS)

    Katz, Richard; Barto, Rod L.; Erickson, K.

    1997-01-01

    Logic design errors have been observed in space flight missions and in the final stages of ground test. The technologies used by designers and their design/analysis methodologies will be analyzed, giving insight into the root causes of the failures. These technologies include discrete integrated-circuit-based systems, systems based on field- and mask-programmable logic, and the use of computer-aided engineering (CAE) systems. State-of-the-art (SOTA) design tools and methodologies will be analyzed with respect to high-reliability spacecraft design, and potential pitfalls are discussed. Case studies of faults from large expensive programs to "smaller, faster, cheaper" missions will be used to explore the fundamental reasons for logic design problems.

  18. D0 upgrade muon electronics design

    SciTech Connect

    Baldin, B.; Green, D.; Haggerty, H.; Hansen, S.

    1994-11-01

    The planned luminosity for the upgrade is ten times higher than at present (L ≈ 10^32 cm^-2 s^-1) and involves a time between collisions as small as 132 ns. To operate in this environment, completely new electronics is required for the 17,500 proportional drift tubes of the system. These electronics include a deadtimeless readout, a digital TDC with about 1 ns binning for the wire signals, fast charge integrators and pipelined ADCs for digitizing the pad electrode signals, a new wire signal triggering scheme and its associated trigger logic, and high-level DSP processing. Some test results of measurements performed on prototype channels and a comparison with the existing electronics are presented.

  19. A conceptual design for an electron beam

    SciTech Connect

    Garcia, M

    1999-02-15

    This report is a brief description of a model electron beam, which is meant to serve as a pulsed heat source that vaporizes a metal fleck into an "under-dense" cloud. See Reference 1. The envelope of the electron beam is calculated from the paraxial ray equation, as stated in Reference 2. The examples shown here are for 5 A, 200 keV beams that focus to waists of under 0.4 mm diameter, within a cylindrical volume of 10 cm radius and length. The magnetic fields assumed in the examples are moderate, 0.11 T and 0.35 T, and can probably be created by permanent magnets.
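
    A hedged sketch of the kind of envelope integration such a calculation involves, using a generic axisymmetric paraxial envelope equation with solenoidal focusing, space-charge, and emittance terms; the field, emittance, and initial radius below are assumptions loosely inspired by the quoted 5 A, 200 keV examples, not values taken from the report:

      # Integrate an axisymmetric beam-envelope (paraxial-ray-type) equation
      #   R'' = -kL**2 * R + K/R + eps**2 / R**3
      # through a uniform solenoid field.  Beam numbers are illustrative only.
      import numpy as np
      from scipy.integrate import solve_ivp

      e, m, c = 1.602e-19, 9.109e-31, 2.998e8
      E_kin = 200e3 * e                     # 200 keV
      gamma = 1.0 + E_kin / (m * c**2)
      beta = np.sqrt(1.0 - 1.0 / gamma**2)

      I, B, eps = 5.0, 0.11, 5e-6           # current [A], solenoid field [T], emittance [m rad]
      I_A = 4 * np.pi * 8.854e-12 * m * c**3 / e          # Alfven current, ~17 kA
      K = 2 * I / (I_A * beta**3 * gamma**3)              # generalized perveance
      kL = e * B / (2 * gamma * m * beta * c)             # Larmor wavenumber [1/m]

      def envelope(z, y):
          R, Rp = y
          return [Rp, -kL**2 * R + K / R + eps**2 / R**3]

      sol = solve_ivp(envelope, (0.0, 0.10), [2e-3, 0.0], max_step=1e-4)
      print(f"minimum envelope radius ~ {sol.y[0].min()*1e3:.2f} mm over 10 cm")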

  20. Logic Design Pathology and Space Flight Electronics

    NASA Technical Reports Server (NTRS)

    Katz, Richard B.; Barto, Rod L.; Erickson, Ken

    1999-01-01

    This paper presents a look at logic design from early in the US Space Program and examines faults in recent logic designs. Most examples are based on flight hardware failures and analysis of new tools and techniques. The paper is presented in viewgraph form.

  1. An Automated Electronic Tongue for In-Situ Quick Monitoring of Trace Heavy Metals in Water Environment

    NASA Astrophysics Data System (ADS)

    Cai, Wei; Li, Yi; Gao, Xiaoming; Guo, Hongsun; Zhao, Huixin; Wang, Ping

    2009-05-01

    An automated electronic tongue instrument has been developed for in-situ determination of trace heavy metal concentrations in the water environment. The electronic tongue contains two main parts. The sensor part consists of a silicon-based Hg-coated Au microelectrode array (MEA) for the detection of Zn(II), Cd(II), Pb(II) and Cu(II) and a multiple light-addressable potentiometric sensor (MLAPS) for the detection of Fe(III) and Cr(VI). The control part employs pumps, valves and tubes to enable the pick-up and pretreatment of the aqueous sample. The electronic tongue achieved detection of the six metals mentioned above at the part-per-billion (ppb) level without manual operation. This instrumentation will have wide application in the rapid monitoring and prediction of heavy metal pollution in lakes and oceans.

  2. Design of a Tool Integrating Force Sensing With Automated Insertion in Cochlear Implantation.

    PubMed

    Schurzig, Daniel; Labadie, Robert F; Hussong, Andreas; Rau, Thomas S; Webster, Robert J

    2012-04-01

    The quality of hearing restored to a deaf patient by a cochlear implant in hearing preservation cochlear implant surgery (and possibly also in routine cochlear implant surgery) is believed to depend on preserving delicate cochlear membranes while accurately inserting an electrode array deep into the spiral cochlea. Membrane rupture forces, and possibly other indicators of suboptimal placement, are below the threshold detectable by human hands, motivating a force-sensing insertion tool. Furthermore, recent studies have shown significant variability in manual insertion forces and velocities that may explain some instances of imperfect placement. Toward addressing this, an automated insertion tool was recently developed by Hussong et al. Following the same insertion tool concept, in this paper we present mechanical enhancements that improve the surgeon's interface with the device and make it smaller and lighter. We also present the electromechanical design of new components enabling integrated force sensing. The tool is designed to be sufficiently compact and light that it can be mounted to a microstereotactic frame for accurate image-guided preinsertion positioning. The new integrated force sensing system is capable of resolving forces as small as 0.005 N, and we provide an experimental illustration of using forces to detect errors in electrode insertion. PMID:23482414

  3. A Practical Approach for Integrating Automatically Designed Fixtures with Automated Assembly Planning

    SciTech Connect

    Calton, Terri L.; Peters, Ralph R.

    1999-07-20

    This paper presents a practical approach for integrating automatically designed fixtures with automated assembly planning. Product assembly problems vary widely; here the focus is on assemblies that are characterized by a single base part to which a number of smaller parts and subassemblies are attached. This method starts with three-dimensional CAD descriptions of an assembly whose assembly tasks require a fixture to hold the base part. It then combines algorithms that automatically design assembly pallets to hold the base part with algorithms that automatically generate assembly sequences. The designed fixtures rigidly constrain and locate the part, obey task constraints, are robust to part shape variations, are easy to load, and are economical to produce. The algorithm is guaranteed to find the global optimum solution that satisfies these and other pragmatic conditions. The assembly planner consists of four main elements: a user interface, a constraint system, a search engine, and an animation module. The planner expresses all constraints at a sequencing level, specifying orders and conditions on part mating operations in a number of ways. Fast replanning enables an interactive plan-view-constrain-replan cycle that aids in constraint discovery and documentation. The combined algorithms guarantee that the fixture will hold the base part without interfering with any of the assembly operations. This paper presents an overview of the planners, the integration approach, and the results of the integrated algorithms applied to several practical manufacturing problems. For these problems, initial high-quality fixture designs and assembly sequences are generated in a matter of minutes, with global optimum solutions identified in just over an hour.

  4. Design and Development of a Robot-Based Automation System for Cryogenic Crystal Sample Mounting at the Advanced Photon Source

    SciTech Connect

    Shu, D.; Preissner, C.; Nocher, D.; Han, Y.; Barraza, J.; Lee, P.; Lee, W.-K.; Cai, Z.; Ginell, S.; Alkire, R.; Lazarski, K.; Schuessler, R.; Joachimiak, A.

    2004-05-12

    X-ray crystallography is the primary method to determine the 3D structures of complex macromolecules at high resolution. In the years to come, the Advanced Photon Source (APS) and similar 3rd-generation synchrotron sources elsewhere will become the most powerful tools for studying atomic structures of biological molecules. One of the major bottlenecks in the x-ray data collection process is the constant need to change and realign the crystal sample. This is a very time- and manpower-consuming task. An automated sample mounting system will help to solve this bottleneck problem. We have developed a novel robot-based automation system for cryogenic crystal sample mounting at the APS. Design of the robot-based automation system, as well as its on-line test results at the Argonne Structural Biology Center (SBC) 19-BM experimental station, are presented in this paper.

  5. Crystallographic analysis of the solid-state dewetting of polycrystalline gold film using automated indexing in a transmission electron microscope

    SciTech Connect

    Jang, S. A.; Lee, H. J.; Oh, Y. J.; Thompson, C. V.; Ross, C. A.

    2015-12-01

    We analyzed the effect of crystallographic anisotropy on the morphological evolution of a 12-nm-thick gold film during solid-state dewetting at high temperatures, using an automated indexing tool in a transmission electron microscope. Dewetting initiated at grain-boundary triple junctions adjacent to large grains resulting from abnormal grain growth driven by (111) texture development. Voids at the junctions developed shapes with faceted edges bounded by low-index crystal planes. The kinetic mobility of the edges varied with the crystal orientation normal to the edges, with a predominance of specific edges with the slowest retraction rates as the annealing time was increased.

  6. Designing an electronic medication reconciliation system.

    PubMed

    Hamann, Claus; Poon, Eric; Smith, Sandra; Coley, Christopher; Graydon-Baker, Erin; Gandhi, Tejal; Chueh, Henry C; Poikonen, John; Hallisey, Robert K; Van Putten, Cheryl; Broverman, Carol; Blumenfeld, Barry; Middleton, Blackford

    2005-01-01

    Unintended medication discrepancies at hospital admission and discharge potentially harm patients. Explicit medication reconciliation (MR) can prevent unintended discrepancies among care settings and is mandated by JCAHO for 2005. Enterprise-wide, we are linking pre-admission and discharge medication lists in our outpatient electronic health records (EHR) with our inpatient order entry applications (OE) - currently not interoperable - to support MR and inform the development of comprehensive MR among hospitalized patients. PMID:16779263

  7. Designing an Electronic Medication Reconciliation System

    PubMed Central

    Hamann, Claus; Poon, Eric; Smith, Sandra; Coley, Christopher; Graydon-Baker, Erin; Gandhi, Tejal; Chueh, Henry C.; Poikonen, John; Hallisey, Robert K.; Van Putten, Cheryl; Broverman, Carol; Blumenfeld, Barry; Middleton, Blackford

    2005-01-01

    Unintended medication discrepancies at hospital admission and discharge potentially harm patients. Explicit medication reconciliation (MR) can prevent unintended discrepancies among care settings and is mandated by JCAHO for 2005. Enterprise-wide, we are linking pre-admission and discharge medication lists in our outpatient electronic health records (EHR) with our inpatient order entry applications (OE) - currently not interoperable - to support MR and inform the development of comprehensive MR among hospitalized patients. PMID:16779263

  8. Update on the MEIC electron collider ring design

    SciTech Connect

    Lin, Fangei; Derbenev, Yaroslav S.; Harwood, Leigh; Hutton, Andrew; Morozov, Vasiliy; Pilat, Fulvia; Zhang, Yuhong; Cai, Y.; Nosochkov, Y. M.; Sullivan, Michael; Wang, M.-H; Wienands, Uli

    2015-09-01

    The electron collider ring of the Medium-energy Electron-Ion Collider (MEIC) at Jefferson Lab is designed to accumulate and store a high-current polarized electron beam for collisions with an ion beam. We consider a design of the electron collider ring based on reusing PEP-II components, such as magnets, power supplies, vacuum system, etc. This has the potential to significantly reduce the cost and engineering effort needed to bring the project to fruition. This paper reports on an electron ring optics design considering the balance of PEP-II hardware parameters (such as dipole sagitta, magnet field strengths and acceptable synchrotron radiation power) and electron beam quality in terms of equilibrium emittances.

  9. Update on the MEIC electron collider ring design

    SciTech Connect

    Lin, F.; Derbenev, Ya. S.; Harwood, L.; Hutton, A.; Morozov, V. S.; Pilat, F.; Zhang, Y.; Cai, Y.; Nosochkov, Y. M.; Sullivan, M.; Wang, M-H; Wienands, U.

    2015-07-14

    The electron collider ring of the Medium-energy Electron-Ion Collider (MEIC) at Jefferson Lab is designed to accumulate and store a high-current polarized electron beam for collisions with an ion beam. We consider a design of the electron collider ring based on reusing PEP-II components, such as magnets, power supplies, vacuum system, etc. This has the potential to significantly reduce the cost and engineering effort needed to bring the project to fruition. This paper reports on an electron ring optics design considering the balance of PEP-II hardware parameters (such as dipole sagitta, magnet field strengths and acceptable synchrotron radiation power) and electron beam quality in terms of equilibrium emittances.

  10. Program user's manual for optimizing the design of a liquid or gaseous propellant rocket engine with the automated combustor design code AUTOCOM

    NASA Technical Reports Server (NTRS)

    Reichel, R. H.; Hague, D. S.; Jones, R. T.; Glatt, C. R.

    1973-01-01

    This computer program manual describes in two parts the automated combustor design optimization code AUTOCOM. The program code is written in the FORTRAN 4 language. The input data setup and the program outputs are described, and a sample engine case is discussed. The program structure and programming techniques are also described, along with AUTOCOM program analysis.

  11. Binding the Electronic Book: Design Features for Bibliophiles

    ERIC Educational Resources Information Center

    Ruecker, Stan; Uszkalo, Kirsten C.

    2007-01-01

    This paper proposes a design for the electronic book based on discussions with frequent book readers. We adopted a conceptual framework for this project consisting of a spectrum of possible designs, with the conventional bound book at one pole and the laptop computer at the other; the design activity then consisted of appropriately…

  12. Free electron laser designs for laser amplification

    DOEpatents

    Prosnitz, Donald; Szoke, Abraham

    1985-01-01

    Method for laser beam amplification by means of free electron laser techniques. With the wiggler magnetic field strength B_w and wavelength λ_w = 2π/k_w regarded as variable parameters, the method(s) impose conditions such as substantial constancy of B_w/k_w, or of k_w, or of B_w and k_w (alternating), coupled with a choice of either constant resonant phase angle or programmed phase-space "bucket" area.
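
    For context, the resonant (synchronous) wavelength that ties the optical beam to the wiggler parameters is, for a planar wiggler, approximately λ_s = (λ_w / 2γ²)(1 + a_w²/2) with a_w = e B_w λ_w / (2π m_e c). A short sketch evaluating this standard relation for illustrative beam and wiggler values (not values specified in the patent):

      # Resonance (synchronism) condition for a planar-wiggler FEL:
      #   lambda_s ~ (lambda_w / (2*gamma**2)) * (1 + a_w**2 / 2),
      #   a_w = e * B_w * lambda_w / (2*pi*m_e*c).
      # The beam energy and wiggler values below are illustrative.
      import math

      e, m_e, c = 1.602e-19, 9.109e-31, 2.998e8

      def resonant_wavelength(E_MeV, B_w, lambda_w):
          gamma = 1.0 + E_MeV * 1e6 * e / (m_e * c**2)
          a_w = e * B_w * lambda_w / (2 * math.pi * m_e * c)
          return lambda_w / (2 * gamma**2) * (1 + a_w**2 / 2)

      # e.g. a 50 MeV beam in a 3 cm period, 0.3 T wiggler:
      print(resonant_wavelength(50.0, 0.3, 0.03))   # optical wavelength in metres (~2 microns)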

  13. Human factors design of automated highway systems: First generation scenarios. Final report, 1 November 1992-1 May 1993

    SciTech Connect

    Tsao, H.S.J.; Hall, R.W.; Shladover, S.E.; Plocher, T.A.; Levitan, L.J.

    1994-12-01

    Attention to driver acceptance and performance issues during system design will be key to the success of the Automated Highway System (AHS). A first step in the process of defining driver roles and driver-system interface requirements for the AHS is the definition of system visions and operational scenarios. These scenarios then become the basis first for identifying driver functions and information requirements, and later for designing the driver's interface to the AHS. In addition, the scenarios provide a framework within which variables that potentially impact the driver can be explored systematically. Seven AHS operational scenarios, each describing a different AHS vision, were defined by varying three system dimensions with special significance for the driver. These three dimensions are: (1) the degree to which automated and manual traffic is separated, (2) the rules for vehicle following and spacing, and (3) the level of automation in traffic flow control. The seven scenarios vary in the complexity of the automated and manual driving maneuvers required, the physical space allowed for maneuvers, and the nature of the resulting demands placed on the driver. Each scenario describes the physical configuration of the system, operational events from entry to exit, and high-level driver functions.

  14. Small Volume Flow Probe for Automated Direct-Injection NMR Analysis: Design and Performance

    NASA Astrophysics Data System (ADS)

    Haner, Ronald L.; Llanos, William; Mueller, Luciano

    2000-03-01

    A detailed characterization of an NMR flow probe for use in direct-injection sample analysis is presented. A 600-MHz, indirect detection NMR flow probe with a 120-μl active volume is evaluated in two configurations: first as a stand-alone small volume probe for the analysis of static, nonflowing solutions, and second as a component in an integrated liquids-handling system used for high-throughput NMR analysis. In the stand-alone mode, 1H lineshape, sensitivity, radiofrequency (RF) homogeneity, and heat transfer characteristics are measured and compared to conventional-format NMR probes of related design. Commonly used descriptive terminology for the hardware, sample regions, and RF coils is reviewed or defined, and test procedures developed for flow probes are described. The flow probe displayed general performance that is competitive with standard probes. Key advantages of the flow probe include high molar sensitivity, ease of use in an automation setup, and superior reproducibility of magnetic field homogeneity which enables the practical implementation of 1D T2-edited analysis of protein-ligand interactions.

  15. Automated physics-based design of synthetic riboswitches from diverse RNA aptamers

    PubMed Central

    Espah Borujeni, Amin; Mishler, Dennis M.; Wang, Jingzhi; Huso, Walker; Salis, Howard M.

    2016-01-01

    Riboswitches are shape-changing regulatory RNAs that bind chemicals and regulate gene expression, directly coupling sensing to cellular actuation. However, it remains unclear how their sequence controls the physics of riboswitch switching and activation, particularly when changing the ligand-binding aptamer domain. We report the development of a statistical thermodynamic model that predicts the sequence-structure-function relationship for translation-regulating riboswitches that activate gene expression, characterized inside cells and within cell-free transcription–translation assays. Using the model, we carried out automated computational design of 62 synthetic riboswitches that used six different RNA aptamers to sense diverse chemicals (theophylline, tetramethylrosamine, fluoride, dopamine, thyroxine, 2,4-dinitrotoluene) and activated gene expression by up to 383-fold. The model explains how aptamer structure, ligand affinity, switching free energy and macromolecular crowding collectively control riboswitch activation. Our model-based approach for engineering riboswitches quantitatively confirms several physical mechanisms governing ligand-induced RNA shape-change and enables the development of cell-free and bacterial sensors for diverse applications. PMID:26621913
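
    A minimal two-state statistical-thermodynamic sketch of ligand-induced riboswitch activation, offered only as an illustration of the kind of free-energy bookkeeping involved; the switching free energy, dissociation constant, and ligand concentration are invented, and the authors' model is a far more detailed sequence-based calculation:

      # Two-state sketch (NOT the authors' full model): the RBS-exposing (ON)
      # state lies dG_switch above the OFF state, and ligand binding to the
      # ON-state aptamer (dissociation constant Kd) shifts the equilibrium.
      # All parameter values are invented.
      import math

      RT = 0.593   # kcal/mol at 25 C

      def fraction_on(ligand_conc, dG_switch, Kd):
          w_on = math.exp(-dG_switch / RT) * (1.0 + ligand_conc / Kd)
          return w_on / (1.0 + w_on)

      def activation_ratio(ligand_conc, dG_switch, Kd):
          return fraction_on(ligand_conc, dG_switch, Kd) / fraction_on(0.0, dG_switch, Kd)

      # e.g. a 3 kcal/mol switching penalty and 10 uM Kd, read out at 1 mM ligand:
      print(round(activation_ratio(1e-3, 3.0, 10e-6), 1))   # ~60-fold activation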

  16. Cloud-Based Automated Design and Additive Manufacturing: A Usage Data-Enabled Paradigm Shift

    PubMed Central

    Lehmhus, Dirk; Wuest, Thorsten; Wellsandt, Stefan; Bosse, Stefan; Kaihara, Toshiya; Thoben, Klaus-Dieter; Busse, Matthias

    2015-01-01

    Integration of sensors into various kinds of products and machines provides access to in-depth usage information as a basis for product optimization. Presently, this large potential for more user-friendly and efficient products is not being realized because (a) sensor integration and thus usage information is not available on a large scale and (b) product optimization requires considerable efforts in terms of manpower and adaptation of production equipment. However, with the advent of cloud-based services and highly flexible additive manufacturing techniques, these obstacles are currently crumbling away at a rapid pace. The present study explores the state of the art in gathering and evaluating product usage and life cycle data, additive manufacturing and sensor integration, automated design and cloud-based services in manufacturing. By joining and extrapolating development trends in these areas, it delimits the foundations of a manufacturing concept that will allow continuous and economically viable product optimization on a general, user group or individual user level. This projection is checked against three different application scenarios, each of which stresses different aspects of the underlying holistic concept. The following discussion identifies critical issues and research needs by adopting the relevant stakeholder perspectives. PMID:26703606

  17. A novel automated instrument designed to determine photosensitivity thresholds (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Aguilar, Mariela C.; Gonzalez, Alex; Rowaan, Cornelis; De Freitas, Carolina; Rosa, Potyra R.; Alawa, Karam; Lam, Byron L.; Parel, Jean-Marie A.

    2016-03-01

    As there is no clinically available instrument to systematically and reliably determine the photosensitivity thresholds of patients with dry eyes, blepharospasms, migraines, traumatic brain injuries, and genetic disorders such as Achromatopsia, retinitis pigmentosa and other retinal dysfunctions, a computer-controlled optoelectronic system was designed. The BPEI Photosensitivity System provides light stimuli emitted from a concave bi-cupola array of 210 white LEDs with intensity varying from 1 to 32,000 lux. The system can utilize either a normal or an enhanced testing mode for subjects with low light tolerance. The automated instrument adjusts the intensity of each light stimulus. The subject is instructed to indicate discomfort by pressing a hand-held button. Reliability of the responses is tracked during the test. The photosensitivity threshold is then calculated after 10 response reversals. In a preliminary study, we demonstrated that subjects suffering from Achromatopsia experienced lower photosensitivity thresholds than normal subjects. Hence, the system can safely and reliably determine the photosensitivity thresholds of healthy and light-sensitive subjects by detecting and quantifying the individual differences. Future studies will be performed with this system to determine the photosensitivity threshold differences between normal subjects and subjects suffering from other conditions that affect light sensitivity.

  18. Design and utilization of the drug-excipient chemical compatibility automated system.

    PubMed

    Thomas, V Hayden; Naath, Maryanne

    2008-07-01

    To accelerate clinical formulation development, an excipient compatibility screen should be conducted as early as possible, and it must be rapid, robust and resource-sparing. This, however, does not describe the traditional excipient compatibility testing approach, which requires many tedious and labor-intensive manual operations. This study focused on transforming traditional practices into a completely automated screening process to increase sample throughput and realign resources to more urgent areas, while maintaining quality. Using the developed system, a complete on-line performance study was conducted whereby drug-excipient mixtures were weighed, blended and subjected to accelerated stress stability for up to 1 month, followed by sample extraction and HPLC analysis. Compared to off-line traditional study protocols, the system provided similar relative rank-order results with equivalent precision and accuracy, while increasing sample throughput. The designed system offers a resource-sparing primary screen for drug-excipient chemical compatibility in solid dosage form development. This approach allows risk assessment analysis, based upon formulation complexity, to be conducted prior to the commitment of resources and candidate selection for clinical development. PMID:18486368

  19. Cloud-Based Automated Design and Additive Manufacturing: A Usage Data-Enabled Paradigm Shift.

    PubMed

    Lehmhus, Dirk; Wuest, Thorsten; Wellsandt, Stefan; Bosse, Stefan; Kaihara, Toshiya; Thoben, Klaus-Dieter; Busse, Matthias

    2015-01-01

    Integration of sensors into various kinds of products and machines provides access to in-depth usage information as a basis for product optimization. Presently, this large potential for more user-friendly and efficient products is not being realized because (a) sensor integration and thus usage information is not available on a large scale and (b) product optimization requires considerable efforts in terms of manpower and adaptation of production equipment. However, with the advent of cloud-based services and highly flexible additive manufacturing techniques, these obstacles are currently crumbling away at a rapid pace. The present study explores the state of the art in gathering and evaluating product usage and life cycle data, additive manufacturing and sensor integration, automated design and cloud-based services in manufacturing. By joining and extrapolating development trends in these areas, it delimits the foundations of a manufacturing concept that will allow continuous and economically viable product optimization on a general, user group or individual user level. This projection is checked against three different application scenarios, each of which stresses different aspects of the underlying holistic concept. The following discussion identifies critical issues and research needs by adopting the relevant stakeholder perspectives. PMID:26703606

  20. Automated physics-based design of synthetic riboswitches from diverse RNA aptamers.

    PubMed

    Espah Borujeni, Amin; Mishler, Dennis M; Wang, Jingzhi; Huso, Walker; Salis, Howard M

    2016-01-01

    Riboswitches are shape-changing regulatory RNAs that bind chemicals and regulate gene expression, directly coupling sensing to cellular actuation. However, it remains unclear how their sequence controls the physics of riboswitch switching and activation, particularly when changing the ligand-binding aptamer domain. We report the development of a statistical thermodynamic model that predicts the sequence-structure-function relationship for translation-regulating riboswitches that activate gene expression, characterized inside cells and within cell-free transcription-translation assays. Using the model, we carried out automated computational design of 62 synthetic riboswitches that used six different RNA aptamers to sense diverse chemicals (theophylline, tetramethylrosamine, fluoride, dopamine, thyroxine, 2,4-dinitrotoluene) and activated gene expression by up to 383-fold. The model explains how aptamer structure, ligand affinity, switching free energy and macromolecular crowding collectively control riboswitch activation. Our model-based approach for engineering riboswitches quantitatively confirms several physical mechanisms governing ligand-induced RNA shape-change and enables the development of cell-free and bacterial sensors for diverse applications. PMID:26621913

  1. Automated digital microfluidic platform for magnetic-particle-based immunoassays with optimization by design of experiments.

    PubMed

    Choi, Kihwan; Ng, Alphonsus H C; Fobel, Ryan; Chang-Yen, David A; Yarnell, Lyle E; Pearson, Elroy L; Oleksak, Carl M; Fischer, Andrew T; Luoma, Robert P; Robinson, John M; Audet, Julie; Wheeler, Aaron R

    2013-10-15

    We introduce an automated digital microfluidic (DMF) platform capable of performing immunoassays from sample to analysis with minimal manual intervention. This platform features (a) a 90 Pogo pin interface for digital microfluidic control, (b) an integrated (and motorized) photomultiplier tube for chemiluminescent detection, and (c) a magnetic lens assembly which focuses magnetic fields into a narrow region on the surface of the DMF device, facilitating up to eight simultaneous digital microfluidic magnetic separations. The new platform was used to implement a three-level full factorial design of experiments (DOE) optimization for thyroid-stimulating hormone immunoassays, varying (1) the analyte concentration, (2) the sample incubation time, and (3) the sample volume, resulting in an optimized protocol that reduced the detection limit and sample incubation time by up to 5-fold and 2-fold, respectively, relative to those from previous work. To our knowledge, this is the first report of a DOE optimization for immunoassays in a microfluidic system of any format. We propose that this new platform paves the way for a benchtop tool that is useful for implementing immunoassays in near-patient settings, including community hospitals, physicians' offices, and small clinical laboratories. PMID:23978190
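
    A minimal sketch of generating the three-level full factorial design described above over the three named factors; the specific level values are invented for illustration:

      # Three-level full factorial design over the three factors named in the
      # abstract; the level values are illustrative placeholders.
      import itertools

      levels = {
          "analyte_conc_uIU_per_mL": [0.1, 1.0, 10.0],
          "incubation_time_min":     [2, 5, 10],
          "sample_volume_uL":        [1.0, 2.0, 4.0],
      }

      runs = [dict(zip(levels, combo)) for combo in itertools.product(*levels.values())]
      print(len(runs))   # 3**3 = 27 experimental conditions
      print(runs[0])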

  2. Process automation

    SciTech Connect

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs.

  3. Design of an ultra-portable field transfer radiometer supporting automated vicarious calibration

    NASA Astrophysics Data System (ADS)

    Anderson, Nikolaus; Thome, Kurtis; Czapla-Myers, Jeffrey; Biggar, Stuart

    2015-09-01

    The University of Arizona Remote Sensing Group (RSG) began outfitting the radiometric calibration test site (RadCaTS) at Railroad Valley, Nevada, in 2004 for automated vicarious calibration of Earth-observing sensors. RadCaTS was upgraded to use RSG custom 8-band ground-viewing radiometers (GVRs) beginning in 2011, and currently four GVRs are deployed, providing an average reflectance for the test site. This measurement of ground reflectance is the most critical component of vicarious calibration using the reflectance-based method. In order to ensure the quality of these measurements, RSG has been exploring more efficient and accurate methods of on-site calibration evaluation. This work describes the design of, and initial results from, a small portable transfer radiometer for the purpose of GVR calibration validation on site. Prior to deployment, RSG uses high-accuracy laboratory calibration methods in order to provide radiance calibrations with low uncertainties for each GVR. After deployment, a solar-radiation-based calibration has typically been used. The method is highly dependent on a clear, stable atmosphere, requires at least two people to perform, is time-consuming in post-processing, and is dependent on several large pieces of equipment. In order to provide more regular and more accurate calibration monitoring, the small portable transfer radiometer is designed for quick, one-person operation and on-site field calibration comparison results. The radiometer is also suited for laboratory calibration use and thus could be used as a transfer radiometer calibration standard for ground-viewing radiometers of a RadCalNet site.

  4. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation

    PubMed Central

    2011-01-01

    Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner

  5. Computer programs: Electronic circuit design criteria: A compilation

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A Technology Utilization Program for the dissemination of information on technological developments which have potential utility outside the aerospace community is presented. The 21 items reported herein describe programs that are applicable to electronic circuit design procedures.

  6. Design Alternatives for a Free Electron Laser Facility

    SciTech Connect

    Jacobs, K; Bosch, R A; Eisert, D; Fisher, M V; Green, M A; Keil, R G; Kleman, K J; Kulpin, J G; Rogers, G C; Wehlitz, R; Chiang, T; Miller, T J; Lawler, J E; Yavuz, D; Legg, R A; York, R C

    2012-07-01

    The University of Wisconsin-Madison is continuing design efforts for a vacuum ultraviolet/X-ray Free Electron Laser facility. The design incorporates seeding the FEL to provide fully coherent photon output at energies up to ~1 keV. The focus of the present work is to minimize the cost of the facility while preserving its performance. To achieve this we are exploring variations in the electron beam driver for the FEL, in undulator design, and in the seeding mechanism. Design optimizations and trade-offs between the various technologies and how they affect the FEL scientific program will be presented.

  7. Development of Automated Image Analysis Tools for Verification of Radiotherapy Field Accuracy with an Electronic Portal Imaging Device.

    NASA Astrophysics Data System (ADS)

    Dong, Lei

    1995-01-01

    The successful management of cancer with radiation relies on the accurate deposition of a prescribed dose to a prescribed anatomical volume within the patient. Treatment set-up errors are inevitable because the alignment of field shaping devices with the patient must be repeated daily up to eighty times during the course of a fractionated radiotherapy treatment. With the invention of electronic portal imaging devices (EPIDs), a patient's portal images can be visualized daily in real time after only a small fraction of the radiation dose has been delivered to each treatment field. However, the accuracy of human visual evaluation of low-contrast portal images has been found to be inadequate. The goal of this research is to develop automated image analysis tools to detect both treatment field shape errors and patient anatomy placement errors with an EPID. A moments method has been developed to align treatment field images to compensate for lack of repositioning precision of the image detector. A figure of merit has also been established to verify the shape and rotation of the treatment fields. Following proper alignment of treatment field boundaries, a cross-correlation method has been developed to detect shifts of the patient's anatomy relative to the treatment field boundary. Phantom studies showed that the moments method aligned the radiation fields to within 0.5 mm of translation and 0.5° of rotation and that the cross-correlation method aligned anatomical structures inside the radiation field to within 1 mm of translation and 1° of rotation. A new procedure of generating and using digitally reconstructed radiographs (DRRs) at megavoltage energies as reference images was also investigated. The procedure allowed a direct comparison between a designed treatment portal and the actual patient setup positions detected by an EPID. Phantom studies confirmed the feasibility of the methodology. Both the moments method and the cross-correlation technique were
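
    A minimal sketch of the image-moment quantities such an alignment method can match between images, assuming a binary treatment-field mask (an illustration only, not the dissertation's implementation):

      # Centroid and principal-axis orientation of a (binary) treatment-field
      # image from its low-order moments: the quantities a moments-based
      # alignment can match between fractions.  Illustrative sketch only.
      import numpy as np

      def field_moments(mask):
          y, x = np.nonzero(mask)                           # pixels inside the field
          m00 = len(x)
          cx, cy = x.mean(), y.mean()                       # centroid
          mu20 = ((x - cx) ** 2).sum() / m00
          mu02 = ((y - cy) ** 2).sum() / m00
          mu11 = ((x - cx) * (y - cy)).sum() / m00
          theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)   # principal-axis angle
          return (cx, cy), theta

      # Aligning two field images then reduces to translating by the centroid
      # difference and rotating by the angle difference.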

  8. Design and Operation of an Open, Interoperable Automated Demand Response Infrastructure for Commercial Buildings

    SciTech Connect

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Watson, David; Koch, Ed; Hennage, Dan

    2009-05-01

    This paper describes the concept for, and lessons from, the development and field-testing of an open, interoperable communications infrastructure to support automated demand response (auto-DR). Automating DR allows greater levels of participation, improved reliability, and repeatability of the DR in participating facilities. This paper also presents the technical and architectural issues associated with auto-DR and a description of the demand response automation server (DRAS), the client/server middleware used to automate the interactions between utilities, or any DR-serving entity, and their customers for DR programs. Use case diagrams are presented to show the role of the DRAS between the utility/ISO and the clients at the facilities.
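
    A minimal sketch of the facility-side behavior such an infrastructure automates, assuming a hypothetical polling endpoint and JSON payload; the URL, fields, and load-shed action below are invented placeholders, not the DRAS interface described in the paper:

      # Hypothetical facility client that polls a DR automation server for
      # events and sheds load while an event is active.
      import time
      import requests

      DRAS_URL = "https://dras.example.org/api/dr-events"   # placeholder endpoint

      def shed_load(level):
          print(f"setting load-shed level to {level}")       # stand-in for an EMCS command

      def poll_forever(poll_interval_s=60):
          while True:
              try:
                  event = requests.get(DRAS_URL, timeout=10).json()
                  if event.get("active"):
                      shed_load(event.get("shed_level", "moderate"))
                  else:
                      shed_load("none")
              except requests.RequestException:
                  pass            # keep normal operation if the server is unreachable
              time.sleep(poll_interval_s)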

  9. Computer-automated multi-disciplinary analysis and design optimization of internally cooled turbine blades

    NASA Astrophysics Data System (ADS)

    Martin, Thomas Joseph

    This dissertation presents the theoretical methodology, organizational strategy, conceptual demonstration and validation of a fully automated computer program for the multi-disciplinary analysis, inverse design and optimization of convectively cooled axial gas turbine blades and vanes. Parametric computer models of the three-dimensional cooled turbine blades and vanes were developed, including the automatic generation of discretized computational grids. Several new analysis programs were written and incorporated with existing computational tools to provide computer models of the engine cycle, aero-thermodynamics, heat conduction and thermofluid physics of the internally cooled turbine blades and vanes. A generalized information transfer protocol was developed to provide the automatic mapping of geometric and boundary condition data between the parametric design tool and the numerical analysis programs. A constrained hybrid optimization algorithm controlled the overall operation of the system and guided the multi-disciplinary internal turbine cooling design process towards the objectives and constraints of engine cycle performance, aerodynamic efficiency, cooling effectiveness and turbine blade and vane durability. Several boundary element computer programs were written to solve the steady-state non-linear heat conduction equation inside the internally cooled and thermal barrier-coated turbine blades and vanes. The boundary element method (BEM) did not require grid generation inside the internally cooled turbine blades and vanes, so the parametric model was very robust. Implicit differentiations of the BEM thermal and thermo-elastic analyses were done to compute design sensitivity derivatives faster and more accurately than via explicit finite differencing. A factor of three savings of computer processing time was realized for two-dimensional thermal optimization problems, and a factor of twenty was obtained for three-dimensional thermal optimization problems
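
    A toy illustration of the sensitivity-derivative comparison described above, using a one-dimensional wall-conduction model with invented values rather than the dissertation's BEM formulation: the analytic derivative of heat flux with respect to conductivity is checked against explicit central differencing.

      # Analytic design-sensitivity derivative vs. explicit finite differencing
      # on a simple wall-conduction-plus-convection model (values are invented).
      def heat_flux(k, L=0.02, h=200.0, dT=800.0):
          return dT / (L / k + 1.0 / h)          # W/m^2 through wall + convective film

      def dq_dk_analytic(k, L=0.02, h=200.0, dT=800.0):
          R = L / k + 1.0 / h
          return dT * (L / k**2) / R**2

      k0, dk = 20.0, 1e-4
      fd = (heat_flux(k0 + dk) - heat_flux(k0 - dk)) / (2 * dk)
      print(dq_dk_analytic(k0), fd)              # the two estimates agree closely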

  10. Design study for electronic system for Jupiter Orbit Probe (JOP)

    NASA Technical Reports Server (NTRS)

    Elero, B. P., Jr.; Carignan, G. R.

    1978-01-01

    The conceptual design of the Jupiter probe spectrometer is presented. Block and circuit diagrams are presented along with tabulated parts lists. Problem areas are considered to be (1) the schedule, (2) weight limitations for the electronic systems, and (3) radiation hardness of the electronic devices.

  11. Two-dimensional optimization of free-electron-laser designs

    DOEpatents

    Prosnitz, D.; Haas, R.A.

    1982-05-04

    Off-axis, two-dimensional designs for free electron lasers are described that maintain correspondence of a light beam with a synchronous electron at an optimal transverse radius r > 0 to achieve increased beam trapping efficiency and enhanced laser beam wavefront control so as to decrease optical beam diffraction and other deleterious effects.

  12. Two-dimensional optimization of free electron laser designs

    DOEpatents

    Prosnitz, Donald; Haas, Roger A.

    1985-01-01

    Off-axis, two-dimensional designs for free electron lasers are described that maintain correspondence of a light beam with a "synchronous electron" at an optimal transverse radius r > 0 to achieve increased beam trapping efficiency and enhanced laser beam wavefront control so as to decrease optical beam diffraction and other deleterious effects.

  13. Design study report. Volume 2: Electronic unit

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The recording system discussed is required to record and reproduce wideband data from either of the two primary Earth Resources Technology Satellite sensors: the Return Beam Vidicon (RBV) camera or the Multi-Spectral Scanner (MSS). The camera input is an analog signal with a bandwidth from dc to 3.5 MHz; this signal is accommodated through FM recording techniques which provide a recorder signal-to-noise ratio in excess of 39 dB, black-to-white signal/rms noise, over the specified bandwidth. The MSS provides, as initial output, 26 narrowband channels. These channels are multiplexed prior to transmission, or recording, into a single 15 Megabit/second digital data stream. Within the recorder, the 15 Megabit/second NRZL signal is processed through the same FM electronics as the RBV signal, but the basic FM standards are modified to provide an internal, 10.5 MHz baseband response with a signal-to-noise ratio of about 25 dB. Following FM demodulation, however, the MSS signal is digitally re-shaped and re-clocked so that good bit stability and signal-to-noise exist at the recorder output.

  14. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  15. The design of electronic map displays

    NASA Technical Reports Server (NTRS)

    Aretz, Anthony J.

    1991-01-01

    This paper presents a cognitive analysis of a pilot's navigation task and describes an experiment comparing a new map display that employs the principle of visual momentum with the two traditional approaches, track-up and north-up. The data show that the advantage of a track-up alignment is its congruence with the egocentered forward view; however, the inconsistency of the rotating display hinders development of a cognitive map. The stability of a north-up alignment aids the acquisition of a cognitive map, but there is a cost associated with the mental rotation of the display to a track-up alignment for tasks involving the ego-centered forward view. The data also show that the visual momentum design captures the benefits and reduces the costs associated with the two traditional approaches.

  16. GEM: ANL 4-GeV CW electron microtron design

    SciTech Connect

    Kustom, R.L.

    1983-01-01

    A hexagonal (six-sided) microtron has been chosen as the accelerator to generate the beams required to pursue a national research program at a CW 4 GeV electron laboratory. This option has the advantages of superior beam quality, low capital and operating cost, and the promise of furnishing beams of several electron energies simultaneously. Only moderate rf power is required because of the basic feature of all microtron designs, recirculation of the electron beam through the same rf accelerating section many times. The hexatron design has the additional feature of compatibility with an existing accelerator complex at Argonne which is currently unoccupied and available.

  17. The Automated Instrumentation and Monitoring System (AIMS): Design and Architecture. 3.2

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Schmidt, Melisa; Schulbach, Cathy; Bailey, David (Technical Monitor)

    1997-01-01

    Whether a researcher is designing the 'next parallel programming paradigm', another 'scalable multiprocessor' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of such information can help computer and software architects to capture, and therefore exploit, behavioral variations among/within various parallel programs to take advantage of specific hardware characteristics. A software tool-set that facilitates performance evaluation of parallel applications on multiprocessors has been put together at NASA Ames Research Center under the sponsorship of NASA's High Performance Computing and Communications Program over the past five years. The Automated Instrumentation and Monitoring System (AIMS) has three major software components: a source code instrumentor which automatically inserts active event recorders into program source code before compilation; a run-time performance monitoring library which collects performance data; and a visualization tool-set which reconstructs program execution based on the data collected. Besides being used as a prototype for developing new techniques for instrumenting, monitoring and presenting parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Currently, the execution of FORTRAN and C programs on the Intel Paragon and PALM workstations can be automatically instrumented and monitored. Performance data thus collected can be displayed graphically on various workstations. The process of performance tuning with AIMS will be illustrated using various NAS Parallel Benchmarks. This report includes a description of the internal architecture of AIMS and a listing of the source code.
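
    The source-level instrumentation idea can be illustrated with a small analogy (the actual AIMS instrumentor rewrites FORTRAN and C source before compilation; the Python decorator and event names below are only hypothetical stand-ins for that mechanism):

```python
import functools
import time

TRACE = []  # stand-in for the buffer the run-time monitoring library would flush to a trace file

def instrument(fn):
    """Wrap a routine so that entry/exit events with timestamps are recorded,
    analogous in spirit to the 'active event recorders' inserted by a source instrumentor."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        TRACE.append(("enter", fn.__name__, time.perf_counter()))
        try:
            return fn(*args, **kwargs)
        finally:
            TRACE.append(("exit", fn.__name__, time.perf_counter()))
    return wrapper

@instrument
def exchange_boundaries():      # hypothetical phase of a parallel solver
    time.sleep(0.01)

exchange_boundaries()
for event, name, t in TRACE:
    print(f"{t:.6f}  {event:5s}  {name}")
```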

  18. Feasibility of fully automated detection of fiducial markers implanted into the prostate using electronic portal imaging: A comparison of methods

    SciTech Connect

    Harris, Emma J. E-mail: eharris@icr.ac.uk; McNair, Helen A.; Evans, Phillip M.

    2006-11-15

    Purpose: To investigate the feasibility of fully automated detection of fiducial markers implanted into the prostate using portal images acquired with an electronic portal imaging device. Methods and Materials: We have made a direct comparison of 4 different methods (2 template matching-based methods, a method incorporating attenuation and constellation analyses, and a cross-correlation method) that have been published in the literature for the automatic detection of fiducial markers. The cross-correlation technique requires a priori information from the portal images; therefore the technique is not fully automated for the first treatment fraction. Images of 7 patients implanted with gold fiducial markers (8 mm in length and 1 mm in diameter) were acquired before treatment (set-up images) and during treatment (movie images) using 1 MU and 15 MU per image, respectively. Images included: 75 anterior (AP) and 69 lateral (LAT) set-up images and 51 AP and 83 LAT movie images. Using the different methods described in the literature, marker positions were automatically identified. Results: The method based upon cross-correlation techniques gave the highest detection success rates, of 99% (AP) and 83% (LAT), for set-up (1 MU) images. The other methods gave detection success rates of less than 91% (AP) and 42% (LAT) for set-up images. The amount of a priori information used, and how it affects the way the techniques are implemented, is discussed. Conclusions: Fully automated marker detection in set-up images for the first treatment fraction is unachievable using these methods, and cross-correlation is the best technique for automatic detection on subsequent radiotherapy treatment fractions.
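
    For readers unfamiliar with the cross-correlation approach, the sketch below shows how a marker template can be matched against a portal-image region using normalized cross-correlation (a generic implementation, not the authors' code; the array sizes and synthetic template are hypothetical):

```python
import numpy as np

def normalized_cross_correlation(image: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Slide `template` over `image` and return the normalized cross-correlation map."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    out = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + th, j:j + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            out[i, j] = (p * t).sum() / denom if denom > 0 else 0.0
    return out

rng = np.random.default_rng(0)
portal = rng.normal(size=(64, 64))          # stand-in for a noisy portal image
portal[30:34, 40:41] += 5.0                 # synthetic marker signature
marker = portal[30:34, 40:41].copy()        # template (in practice taken from a reference image)
ncc = normalized_cross_correlation(portal, marker)
print("best match at", np.unravel_index(np.argmax(ncc), ncc.shape))
```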

  19. Using Process Visualizations to Validate Electronic Form Design

    PubMed Central

    Marquard, Jenna L.; Mei, Yi You

    2010-01-01

    Electronic reporting systems have the potential to support health care quality improvement initiatives across varied health care settings, specifically in low-technology settings such as long-term residential care facilities (LTRCFs). Yet, these organizations face financial barriers to implementing such systems and the LTRCF workforce is generally not as technology-ready as larger organizations’ workforces. Electronic reporting systems implemented in these settings must therefore be inexpensive and easy-to-use. This paper outlines a novel technique – process visualization – for systematically assessing the order in which users complete electronic forms, an inexpensively-developed patient falls reporting form in this case. These visualizations can help designers uncover usage patterns not evident via other usability methods. Based on this knowledge, designers can validate the design of the electronic forms, informing their subsequent redesign. PMID:21347028

  20. Laboratory design for high-performance electron microscopy

    SciTech Connect

    O'Keefe, Michael A.; Turner, John H.; Hetherington, Crispin J.D.; Cullis, A.G.; Carragher, Bridget; Jenkins, Ron; Milgrim, Julie; Milligan, Ronald A.; Potter, Clinton S.; Allard, Lawrence F.; Blom, Douglas A.; Degenhardt, Lynn; Sides, William H.

    2004-04-23

    Proliferation of electron microscopes with field emission guns, imaging filters, and hardware spherical aberration correctors (giving higher spatial and energy resolution) has resulted in the need to construct special laboratories. As resolutions improve, transmission electron microscopes (TEMs) and scanning transmission electron microscopes (STEMs) become more sensitive to ambient conditions. State-of-the-art electron microscopes require state-of-the-art environments, and this means careful design and implementation of microscope sites, from the microscope room to the building that surrounds it. Laboratories have been constructed to house highly sensitive instruments with resolutions ranging down to sub-Angstrom levels; we present the various design philosophies used for some of these laboratories and our experiences with them. Four facilities are described: the National Center for Electron Microscopy OAM Laboratory at LBNL; the FEGTEM Facility at the University of Sheffield; the Center for Integrative Molecular Biosciences at TSRI; and the Advanced Microscopy Laboratory at ORNL.

  1. Three Experiments Examining the Use of Electroencephalogram,Event-Related Potentials, and Heart-Rate Variability for Real-Time Human-Centered Adaptive Automation Design

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Parasuraman, Raja; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.

    2003-01-01

    Adaptive automation represents an advanced form of human-centered automation design. This approach to automation provides real-time and model-based assessments of human-automation interaction, determines whether the human has entered a hazardous state of awareness, and then modulates the task environment to keep the operator in the loop while maintaining an optimal state of task engagement and mental alertness. Because adaptive automation has not matured, numerous challenges remain, including what the criteria are for determining when adaptive aiding and adaptive function allocation should take place. Human factors experts in the area have suggested a number of measures, including the use of psychophysiology. This NASA Technical Paper reports on three experiments that examined the psychophysiological measures of event-related potentials, electroencephalogram, and heart-rate variability for real-time adaptive automation. The results of the experiments confirm the efficacy of these measures for use in both a developmental and an operational role for adaptive automation design. The implications of these results and future directions for psychophysiology and human-centered automation design are discussed.
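
    One psychophysiological quantity frequently cited in this line of work is an EEG engagement index of the form beta/(alpha+theta); the sketch below (illustrative only, using synthetic data and a hypothetical switching threshold, not the experiments' actual implementation) shows how such an index might be computed and used to modulate task allocation:

```python
import numpy as np
from scipy.signal import welch

def engagement_index(eeg: np.ndarray, fs: float) -> float:
    """Compute beta/(alpha+theta) from the band powers of a single EEG channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=min(len(eeg), 2 * int(fs)))
    def band_power(lo, hi):
        mask = (freqs >= lo) & (freqs < hi)
        return np.trapz(psd[mask], freqs[mask])
    theta = band_power(4, 8)
    alpha = band_power(8, 13)
    beta = band_power(13, 30)
    return beta / (alpha + theta)

fs = 256.0
eeg = np.random.default_rng(1).normal(size=int(20 * fs))  # 20 s of synthetic "EEG"
idx = engagement_index(eeg, fs)
# Hypothetical adaptive logic: hand the task back to the operator when engagement drops.
mode = "manual (re-engage operator)" if idx < 0.4 else "automated"
print(f"engagement index = {idx:.2f} -> {mode}")
```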

  2. RHIC electron lens beam transport system design considerations

    SciTech Connect

    Gu, X.; Pikin, A.; Okamura, M.; Fischer, W.; Luo, Y.; Gupta, R.; Hock, J.; Jain, A.; Raparia, D.

    2010-10-01

    To apply head-on beam-beam compensation in RHIC, two electron lenses have been designed and will be installed at IP10. The electron beam transport system is one of the important subsystems; it is used to transport the electron beam from the electron gun to the collector. This system should be able to change the beam size inside the superconducting magnet and control the beam position to within 5 mm in the horizontal and vertical planes. Some other design considerations for this beam transport system are also reported in this paper. The head-on beam-beam effect is one of the important nonlinear sources in storage rings and linear colliders, and it has limited the luminosity improvement of many colliders, such as the SppS, the Tevatron, and RHIC. In order to enhance the performance of colliders, beam-beam effects can be compensated with direct space charge compensation, indirect space charge compensation, or a betatron phase cancellation scheme. As at other colliders, an indirect space charge compensation scheme (electron lens) has been proposed for beam-beam compensation at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory. The two similar electron lenses are located in IR10 between the DX magnets. One RHIC electron lens consists of a DC electron gun, a superconducting magnet, an electron collector, and a beam transport system.

  3. Advanced Air Traffic Management Research (Human Factors and Automation): NASA Research Initiatives in Human-Centered Automation Design in Airspace Management

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Condon, Gregory W. (Technical Monitor)

    1996-01-01

    NASA has initiated a significant thrust of research and development focused on providing the flight crew and air traffic managers automation aids to increase capacity in en route and terminal area operations through the use of flexible, more fuel-efficient routing, while improving the level of safety in commercial carrier operations. In that system development, definition of cognitive requirements for integrated multi-operator dynamic aiding systems is fundamental. The core processes of control and the distribution of decision making in that control are undergoing extensive analysis. From our perspective, the human operators and the procedures by which they interact are the fundamental determinants of the safe, efficient, and flexible operation of the system. In that perspective, we have begun to explore what our experience has taught us will be the most challenging aspects of designing and integrating human-centered automation in the advanced system. We have performed a full mission simulation looking at the role shift to self-separation on board the aircraft, with the rules of the air guiding behavior and the provision of a cockpit display of traffic information and an on-board traffic alert system that seamlessly integrates into TCAS operations. We have performed an initial investigation of the operational impact of "Dynamic Density" metrics on the controller relinquishing and reestablishing full separation authority. (We follow the assumption that responsibility at all times resides with the controller.) This presentation describes those efforts as well as the process by which we will guide the development of error-tolerant systems that are sensitive to shifts in operator workload levels and dynamic shifts in the operating point of air traffic management.

  4. Big innovations in a small instrument: technical challenges in a new CCD system design for the Automated Patrol Telescope

    NASA Astrophysics Data System (ADS)

    Miziarski, Stan; Ashley, Michael C. B.; Smith, Greg; Barden, Sam; Dawson, John; Horton, Anthony; Saunders, Will; Brzeski, Jurek; Churilov, Vladimir; Klauser, Urs; Waller, Lew; Mayfield, Don; Correll, David; Phillips, Andre; Whittard, Denis

    2008-07-01

    We describe the design of a new CCD system delivered to the Automated Patrol Telescope at Siding Spring, NSW, Australia, operated by UNSW. A very fast beam (f/1), with a mosaic of two MITLL CCID-34 detectors placed only 1 mm behind the field flattener (which also serves as the dewar window), has called for innovative engineering solutions. This paper describes the design and mounting procedure of the field flattener, the differential-screw-adjustable detector mount, and the dewar suspension on the external ring providing tip/tilt and focus adjustment.

  5. Design and Prototype of an Automated Column-Switching HPLC System for Radiometabolite Analysis.

    PubMed

    Vasdev, Neil; Collier, Thomas Lee

    2016-01-01

    Column-switching high performance liquid chromatography (HPLC) is extensively used for the critical analysis of radiolabeled ligands and their metabolites in plasma. However, the lack of streamlined apparatus and consequently varying protocols remain as a challenge among positron emission tomography laboratories. We report here the prototype apparatus and implementation of a fully automated and simplified column-switching procedure to allow for the easy and automated determination of radioligands and their metabolites in up to 5 mL of plasma. The system has been used with conventional UV and coincidence radiation detectors, as well as with a single quadrupole mass spectrometer. PMID:27548189

  6. Wearable design issues for electronic vision enhancement systems

    NASA Astrophysics Data System (ADS)

    Dvorak, Joe

    2006-09-01

    As the baby boomer generation ages, visual impairment will overtake a significant portion of the US population. At the same time, more and more of our world is becoming digital. These two trends, coupled with the continuing advances in digital electronics, argue for a rethinking in the design of aids for the visually impaired. This paper discusses design issues for electronic vision enhancement systems (EVES) [R.C. Peterson, J.S. Wolffsohn, M. Rubinstein, et al., Am. J. Ophthalmol. 136 1129 (2003)] that will facilitate their wearability and continuous use. We briefly discuss the factors affecting a person's acceptance of wearable devices. We define the concept of operational inertia which plays an important role in our design of wearable devices and systems. We then discuss how design principles based upon operational inertia can be applied to the design of EVES.

  7. Stochastic Micro-Pattern for Automated Correlative Fluorescence - Scanning Electron Microscopy

    PubMed Central

    Begemann, Isabell; Viplav, Abhiyan; Rasch, Christiane; Galic, Milos

    2015-01-01

    Studies of cellular surface features gain from correlative approaches, where live cell information acquired by fluorescence light microscopy is complemented by ultrastructural information from scanning electron micrographs. Current approaches to spatially align fluorescence images with scanning electron micrographs are technically challenging and often cost or time-intensive. Relying exclusively on open-source software and equipment available in a standard lab, we have developed a method for rapid, software-assisted alignment of fluorescence images with the corresponding scanning electron micrographs via a stochastic gold micro-pattern. Here, we provide detailed instructions for micro-pattern production and image processing, troubleshooting for critical intermediate steps, and examples of membrane ultra-structures aligned with the fluorescence signal of proteins enriched at such sites. Together, the presented method for correlative fluorescence – scanning electron microscopy is versatile, robust and easily integrated into existing workflows, permitting image alignment with accuracy comparable to existing approaches with negligible investment of time or capital. PMID:26647824
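
    The registration step behind such a micro-pattern approach amounts to estimating a coordinate transform from matched pattern landmarks in the two modalities; a minimal least-squares affine fit (a generic sketch with hypothetical landmark coordinates, not the published workflow) looks like this:

```python
import numpy as np

def fit_affine(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Least-squares 2D affine transform mapping src -> dst (points as N x 2 arrays)."""
    n = src.shape[0]
    A = np.hstack([src, np.ones((n, 1))])          # rows [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return coeffs                                   # 3 x 2 matrix: dst = [x, y, 1] @ coeffs

# Hypothetical matched landmarks from the gold micro-pattern:
fm_points = np.array([[10.0, 12.0], [40.0, 15.0], [22.0, 48.0], [55.0, 60.0]])  # fluorescence image
sem_points = fm_points * 4.0 + np.array([100.0, 50.0])                          # SEM image (scaled + shifted)

T = fit_affine(fm_points, sem_points)
mapped = np.hstack([fm_points, np.ones((4, 1))]) @ T
print("max residual (px):", np.abs(mapped - sem_points).max())
```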

  8. Stochastic Micro-Pattern for Automated Correlative Fluorescence - Scanning Electron Microscopy.

    PubMed

    Begemann, Isabell; Viplav, Abhiyan; Rasch, Christiane; Galic, Milos

    2015-01-01

    Studies of cellular surface features gain from correlative approaches, where live cell information acquired by fluorescence light microscopy is complemented by ultrastructural information from scanning electron micrographs. Current approaches to spatially align fluorescence images with scanning electron micrographs are technically challenging and often cost or time-intensive. Relying exclusively on open-source software and equipment available in a standard lab, we have developed a method for rapid, software-assisted alignment of fluorescence images with the corresponding scanning electron micrographs via a stochastic gold micro-pattern. Here, we provide detailed instructions for micro-pattern production and image processing, troubleshooting for critical intermediate steps, and examples of membrane ultra-structures aligned with the fluorescence signal of proteins enriched at such sites. Together, the presented method for correlative fluorescence - scanning electron microscopy is versatile, robust and easily integrated into existing workflows, permitting image alignment with accuracy comparable to existing approaches with negligible investment of time or capital. PMID:26647824

  9. Design of an Automated Essay Grading (AEG) System in Indian Context

    ERIC Educational Resources Information Center

    Ghosh, Siddhartha; Fatima, Sameen S.

    2007-01-01

    Automated essay grading or scoring systems are no longer a myth; they are a reality. As of today, human-written (not handwritten) essays are corrected not only by examiners/teachers but also by machines. The TOEFL exam is one of the best examples of this application. The students' essays are evaluated both by human and web based automated…

  10. Design and operation of the electron beam ion trap

    SciTech Connect

    Vogel, D.

    1990-05-30

    This report describes the basic features and operating principles of the Electron Beam Ion Trap. The differences between EBIT and other sources of highly charged ions are outlined. Its features and operating parameters are discussed. The report also explains why certain design choices were necessary and the constraints involved in building an electron beam ion trap. EBIT's evaporation cooling system is described in detail. 13 refs., 8 figs.

  11. Designs for surge immunity in critical electronic facilities

    NASA Technical Reports Server (NTRS)

    Roberts, Edward F., Jr.

    1991-01-01

    In recent years, the Federal Aviation Administration (FAA) embarked on a program to replace older tube-type electronic equipment with newer solid state equipment. This replacement program dramatically increased the susceptibility of the FAA's facilities to lightning-related damage. Techniques are proposed that may be employed to lessen the susceptibility of new FAA electronic facility designs to failures resulting from lightning-related surges and transients as well as direct strikes. The general concept espoused is one of a consistent system approach employing both perimeter and internal protection. The technique presently employed to reduce electronic noise is compared with other techniques which reduce noise while lowering susceptibility to lightning-related damage. It is anticipated that these techniques will be employed in the design of an Air Traffic Control Tower in a high isokeraunic area. This facility would be subjected to rigorous monitoring over a multi-year period to provide quantitative data that, it is hoped, will support the advantage of this design.

  12. Automated discovery of drug treatment patterns for endocrine therapy of breast cancer within an electronic medical record

    PubMed Central

    Olson, Janet E; Murphy, Sean P; Cafourek, Victoria L; Couch, Fergus J; Goetz, Matthew P; Ingle, James N; Suman, Vera J; Chute, Christopher G; Weinshilboum, Richard M

    2011-01-01

    Objective To develop an algorithm for the discovery of drug treatment patterns for endocrine breast cancer therapy within an electronic medical record and to test the hypothesis that information extracted using it is comparable to the information found by traditional methods. Materials The electronic medical charts of 1507 patients diagnosed with histologically confirmed primary invasive breast cancer. Methods The automatic drug treatment classification tool consisted of components for: (1) extraction of drug treatment-relevant information from clinical narratives using natural language processing (clinical Text Analysis and Knowledge Extraction System); (2) extraction of drug treatment data from an electronic prescribing system; (3) merging information to create a patient treatment timeline; and (4) final classification logic. Results Agreement between results from the algorithm and from a nurse abstractor is measured for categories: (0) no tamoxifen or aromatase inhibitor (AI) treatment; (1) tamoxifen only; (2) AI only; (3) tamoxifen before AI; (4) AI before tamoxifen; (5) multiple AIs and tamoxifen cycles in no specific order; and (6) no specific treatment dates. Specificity (all categories): 96.14%–100%; sensitivity (categories (0)–(4)): 90.27%–99.83%; sensitivity (categories (5)–(6)): 0–23.53%; positive predictive values: 80%–97.38%; negative predictive values: 96.91%–99.93%. Discussion Our approach illustrates a secondary use of the electronic medical record. The main challenge is event temporality. Conclusion We present an algorithm for automated treatment classification within an electronic medical record to combine information extracted through natural language processing with that extracted from structured databases. The algorithm has high specificity for all categories, high sensitivity for five categories, and low sensitivity for two categories. PMID:22140207
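
    As an illustration of the final classification step (a toy sketch only; the published algorithm's handling of dates, overlapping cycles, and records without specific treatment dates is more involved), one could map first-exposure timelines to the category scheme above:

```python
from datetime import date

def classify_endocrine_therapy(tamoxifen_dates, ai_dates):
    """Toy version of the category logic: 0 none, 1 tamoxifen only, 2 AI only,
    3 tamoxifen before AI, 4 AI before tamoxifen, 5 multiple alternating cycles.
    (Category 6, no specific treatment dates, is omitted in this sketch.)"""
    if not tamoxifen_dates and not ai_dates:
        return 0
    if tamoxifen_dates and not ai_dates:
        return 1
    if ai_dates and not tamoxifen_dates:
        return 2
    # Both drugs present: look at the order of exposures on a merged timeline.
    timeline = sorted([(d, "TAM") for d in tamoxifen_dates] + [(d, "AI") for d in ai_dates])
    drugs_in_order = [drug for _, drug in timeline]
    switches = sum(1 for a, b in zip(drugs_in_order, drugs_in_order[1:]) if a != b)
    if switches > 1:
        return 5
    return 3 if drugs_in_order[0] == "TAM" else 4

# Hypothetical patient: tamoxifen in 2004-2005, then an AI from 2006 onward -> category 3.
print(classify_endocrine_therapy([date(2004, 3, 1), date(2005, 3, 1)], [date(2006, 5, 1)]))
```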

  13. Design studies for the next generation electron ion colliders

    SciTech Connect

    Sayed, Hisham Kamal; Bogacz, Slawomir A.; Krafft, Geoffrey A.

    2014-04-01

    The next-generation Electron Ion Collider (EIC) at the Thomas Jefferson National Accelerator Facility (JLAB) utilizes figure-8 shaped ion and electron rings. The EIC is able to preserve the ion polarization during acceleration, and the electron ring matches the footprint of the figure-8 ion ring. The electron ring is designed to deliver a highly polarized, high-luminosity electron beam at the interaction point (IP). The main challenges of the electron ring design are chromaticity compensation and maintaining a high beam polarization of 70% at all energies (3–11 GeV) without introducing transverse orbital coupling before the IP. The very demanding detector design limits the minimum distance between the final focus quadrupole and the interaction point to 3.5 m, which results in a large β function inside the final focus quadrupoles, leading to increased beam chromaticity. In this paper, we present a novel chromaticity compensation scheme that mitigates IP chromaticity with a compact chromaticity compensation section containing multipole magnet components. In addition, a set of spin rotators is utilized to manipulate the polarization vector of the electron beam in order to preserve the beam polarization. The spin rotator solenoids introduce undesired coupling between the horizontal and vertical betatron motion of the beam. We introduce a compact and modular orbit decoupling insert that can fit in the limited space of the straight section in the figure-8 ring. We show a numerical study of the figure-8 ring design with the compact straight section, which includes the interaction region, the chromaticity compensation section, and the spin rotators; the performance of the figure-8 design is evaluated with particle tracking.

  14. Design studies for the next generation electron ion colliders

    NASA Astrophysics Data System (ADS)

    Sayed, Hisham Kamal; Bogacz, S. A.; Krafft, G.

    2014-04-01

    The next-generation Electron Ion Collider (EIC) at the Thomas Jefferson National Accelerator Facility (JLAB) utilizes figure-8 shaped ion and electron rings. The EIC is able to preserve the ion polarization during acceleration, and the electron ring matches the footprint of the figure-8 ion ring. The electron ring is designed to deliver a highly polarized, high-luminosity electron beam at the interaction point (IP). The main challenges of the electron ring design are chromaticity compensation and maintaining a high beam polarization of 70% at all energies (3-11 GeV) without introducing transverse orbital coupling before the IP. The very demanding detector design limits the minimum distance between the final focus quadrupole and the interaction point to 3.5 m, which results in a large β function inside the final focus quadrupoles, leading to increased beam chromaticity. In this paper, we present a novel chromaticity compensation scheme that mitigates IP chromaticity with a compact chromaticity compensation section containing multipole magnet components. In addition, a set of spin rotators is utilized to manipulate the polarization vector of the electron beam in order to preserve the beam polarization. The spin rotator solenoids introduce undesired coupling between the horizontal and vertical betatron motion of the beam. We introduce a compact and modular orbit decoupling insert that can fit in the limited space of the straight section in the figure-8 ring. We show a numerical study of the figure-8 ring design with the compact straight section, which includes the interaction region, the chromaticity compensation section, and the spin rotators; the performance of the figure-8 design is evaluated with particle tracking.

  15. Automated Analysis of the Digitized Second Palomar Sky Survey: System Design, Implementation, and Initial Results

    NASA Astrophysics Data System (ADS)

    Weir, Nicholas

    1995-01-01

    We describe the design, implementation, and initial scientific results of a system for analyzing the Digitized Second Palomar Observatory Sky Survey (DPOSS). The system (SKICAT) facilitates and largely automates the pipeline processing of DPOSS from raw pixel data into calibrated, classified object catalog form. A fundamental constraint limiting the scientific usefulness of optical imaging surveys is the level at which objects may be reliably distinguished as stars, galaxies, or artifacts. The classifier implemented within SKICAT was created using a new machine learning technology, whereby an algorithm determines a near-optimal set of classification rules based upon training examples. Using this approach, we were able to construct a classifier which distinguishes objects to the same level of accuracy as in previous surveys using comparable plate material, but nearly one magnitude fainter (or an equivalent B_J ~ 21.0). Our first analysis of DPOSS using SKICAT is of an overlapping set of four survey fields near the North Galactic Pole, in both the J and F passbands. Through detailed simulations of a subset of these data, we were able to analyze systematic aspects of our detection and measurement procedures, as well as optimize them. We discuss how we calibrate the plate magnitudes to the Gunn-Thuan g and r photometric system using CCD sequences obtained in a program devoted expressly to calibrating DPOSS. Our technique results in an estimated plate-to-plate zero point standard error of under 0.10 mag in g and below 0.05 mag in r, for J and F plates, respectively. Using the catalogs derived from these fields, we compare our differential galaxy counts in g and r with those from recent Schmidt plate surveys as well as predictions from evolutionary and non-evolutionary (NE) galaxy models. We find generally good agreement between our counts and recent NE and mild evolutionary models calibrated to consistently fit bright and faint galaxy counts, colors, and redshift
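
    The machine-learning step can be illustrated with a generic decision-tree classifier trained on catalog attributes (the features, labels, and library below are illustrative assumptions; SKICAT's actual rule-induction system and attribute set are described in the survey papers):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(42)
n = 400
# Hypothetical object attributes: magnitude, image area (pixels), ellipticity.
stars    = np.column_stack([rng.normal(19, 1, n), rng.normal(12, 2, n), rng.uniform(0.0, 0.2, n)])
galaxies = np.column_stack([rng.normal(20, 1, n), rng.normal(30, 8, n), rng.uniform(0.1, 0.7, n)])
X = np.vstack([stars, galaxies])
y = np.array(["star"] * n + ["galaxy"] * n)

# Learn a small set of classification rules from labeled training examples.
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(clf, feature_names=["mag", "area", "ellipticity"]))
```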

  16. ILC Polarized Electron Source Design and R&D Program

    SciTech Connect

    Brachmann, A.; Sheppard, J.; Zhou, F.; Poelker, M.; /SLAC

    2012-04-06

    The R&D program for the ILC electron source focuses on three areas: the source drive laser system, the electron gun, and the photocathodes necessary to produce a highly polarized electron beam. Currently, the laser system and photocathode development take place at SLAC's 'ILC Injector Test Facility', an integrated lab (laser and gun) that allows production of the electron beam and is equipped with a set of diagnostics necessary to characterize the source performance. Development of the ILC electron gun takes place at Jefferson Lab, where advanced concepts and technologies for HV DC electron guns for polarized beams are being developed. The goal is to combine both efforts at one facility to demonstrate an electron beam with ILC specifications, namely the electron beam charge and polarization as well as the cathode's lifetime. The source parameters are summarized in Table 1. The current schematic design of the ILC central complex is depicted in Figure 1. The electron and positron sources are located and laid out approximately symmetrically on either side of the damping rings.

  17. Maintainability design criteria for packaging of spacecraft replaceable electronic equipment.

    NASA Technical Reports Server (NTRS)

    Kappler, J. R.; Folsom, A. B.

    1972-01-01

    Maintainability must be designed into long-duration spacecraft and equipment to provide the required high probability of mission success with the least cost and weight. The ability to perform repairs quickly and easily in a space environment can be achieved by imposing specific maintainability design criteria on spacecraft equipment design and installation. A study was funded to investigate and define design criteria for electronic equipment that would permit rapid removal and replacement in a space environment. The results of the study are discussed together with subsequent simulated zero-g demonstration tests of a mockup with new concepts for packaging.

  18. Lattice design for the ERL electron ion collider in RHIC

    SciTech Connect

    Trbojevic, D.; Beebe-Wang, J.; Tsoupas, N.; Chang, X.; Kayran, D.; Ptitsyn, V.; Litvinenko, V.; Hao, Y.; Parker, B.; Pozdeyev, E.

    2010-05-23

    We present the electron-ion collider lattice design for the Relativistic Heavy Ion Collider (eRHIC), in which the electrons make multiple passes through recirculating energy recovery linacs (ERLs) and arcs placed in the existing RHIC tunnel. The present RHIC interaction regions (IRs), where the electron-ion collisions will occur, are modified to allow for large luminosity. Staging of eRHIC will bring the electron energy from 4 up to 20 (30) GeV as the superconducting cavities are built and installed sequentially. The synchrotron radiation from the electrons at the IR is reduced because they arrive on a straight path at the collision point, while the ions and protons arrive with a 10 mrad crossing angle using crab cavities.

  19. STATISTICAL CONSIDERATIONS IN THE EMPLOYMENT OF SAX (SCANNING ELECTRON MICROSCOPY WITH AUTOMATED IMAGE ANALYSIS AND X-RAY ENERGY SPECTROSCOPY) RESULTS FOR RECEPTOR MODELS

    EPA Science Inventory

    Hundreds of thousands of individual particle measurements may be accumulated in a receptor model study employing Scanning electron microscopy with Automated image analysis and X-ray energy spectroscopy (SAX). At present, the summaries of these data are utilized in apportionment c...

  20. Automated Detection of Postoperative Surgical Site Infections Using Supervised Methods with Electronic Health Record Data.

    PubMed

    Hu, Zhen; Simon, Gyorgy J; Arsoniadis, Elliot G; Wang, Yan; Kwaan, Mary R; Melton, Genevieve B

    2015-01-01

    The National Surgical Quality Improvement Project (NSQIP) is widely recognized as "the best in the nation" surgical quality improvement resource in the United States. In particular, it rigorously defines postoperative morbidity outcomes, including surgical adverse events occurring within 30 days of surgery. Because of its manual, and therefore expensive, construction process, the NSQIP registry is of exceptionally high quality, but its high cost remains a significant bottleneck to NSQIP's wider dissemination. In this work, we propose an automated surgical adverse event detection tool, aimed at accelerating the process of extracting postoperative outcomes from medical charts. As a prototype system, we combined local EHR data with the NSQIP gold standard outcomes and developed machine-learned models to retrospectively detect Surgical Site Infections (SSI), a particular family of adverse events that NSQIP extracts. The resulting models have high specificity (from 0.788 to 0.988) as well as very high negative predictive values (>0.98), reliably eliminating the vast majority of patients without SSI and thereby significantly reducing the NSQIP extractors' burden. PMID:26262143
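
    For reference, the reported specificity and negative predictive value figures are simple functions of confusion-matrix counts; a minimal sketch with hypothetical counts (not the study's data):

```python
def specificity_and_npv(tp: int, fp: int, tn: int, fn: int) -> tuple[float, float]:
    """Specificity = TN / (TN + FP); negative predictive value = TN / (TN + FN)."""
    specificity = tn / (tn + fp)
    npv = tn / (tn + fn)
    return specificity, npv

# Hypothetical screening result: 60 true SSIs found, 5 missed, 40 false alarms, 1900 true negatives.
spec, npv = specificity_and_npv(tp=60, fp=40, tn=1900, fn=5)
print(f"specificity = {spec:.3f}, NPV = {npv:.3f}")
```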

  1. An automated system for studying the power distribution of electron beams

    SciTech Connect

    Filarowski, C.A.

    1994-12-01

    Precise welds with an electron beam welder are difficult to reproduce because the factors affecting the electron beam current density distribution are not easily controlled. One method for measuring the power density distribution in EB welds uses computer tomography to reconstruct an image of the current density distribution. This technique uses many separate pieces of hardware and software to obtain the data and then reconstruct it; consequently, transferring this technology between different machines and operators is difficult. Consolidating all of the hardware and software into one machine to execute the same tasks will allow for real-time measurement of the EB power density distribution and will provide a facilitated means for transferring various welding procedures between different machines and operators, thereby enhancing the reproducibility of electron beam welds.
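
    The reconstruction idea can be sketched with standard tomography routines: projections of the beam's current density taken at several angles form a sinogram, from which the 2-D distribution is recovered by filtered back-projection. A minimal illustration with scikit-image on synthetic data (not the system described above):

```python
import numpy as np
from skimage.transform import radon, iradon

# Synthetic stand-in for an electron-beam current density distribution (a Gaussian spot).
size = 128
yy, xx = np.mgrid[:size, :size]
beam = np.exp(-(((xx - 70) ** 2 + (yy - 60) ** 2) / (2 * 8.0 ** 2)))

# Forward model: projections at many angles, as a rotating slit/wire scan would measure.
angles = np.linspace(0.0, 180.0, 60, endpoint=False)
sinogram = radon(beam, theta=angles)

# Filtered back-projection reconstructs the 2-D power density distribution.
reconstruction = iradon(sinogram, theta=angles)
peak = np.unravel_index(np.argmax(reconstruction), reconstruction.shape)
print("reconstructed peak at", peak, "(true peak near (60, 70))")
```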

  2. Note: Design of transverse electron gun for electron beam based reactive evaporation system.

    PubMed

    Maiti, Namita; Barve, U D; Bhatia, M S; Das, A K

    2011-05-01

    In this paper design of a 10 kV, 10 kW transverse electron gun, suitable for reactive evaporation, supported by simulation and modeling, is presented. Simulation of the electron beam trajectory helps in locating the emergence aperture after 90° bend and also in designing the crucible on which the beam is finally incident after 270° bend. The dimension of emergence aperture plays a vital role in designing the differential pumping system between the gun chamber and the substrate chamber. Experimental validation is done for beam trajectory by piercing a stainless steel plate at 90° position which is kept above the crucible. PMID:21639554

  3. 76 FR 27606 - Technical Corrections To Remove Obsolete References to Non-Automated Carriers From Electronic...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-12

    ...) in the Federal Register (67 FR 66318) amending 19 CFR 4.7 pertaining to vessel manifests to require... 5, 2003, CBP published a final rule (2003 final rule) in the Federal Register (68 FR 68140) further... FR 68145). In order to conform the regulation to the statute's mandatory electronic...

  4. Automated Methods to Extract Patient New Information from Clinical Notes in Electronic Health Record Systems

    ERIC Educational Resources Information Center

    Zhang, Rui

    2013-01-01

    The widespread adoption of Electronic Health Record (EHR) has resulted in rapid text proliferation within clinical care. Clinicians' use of copying and pasting functions in EHR systems further compounds this by creating a large amount of redundant clinical information in clinical documents. A mixture of redundant information (especially outdated…

  5. Automation of industrial bioprocesses.

    PubMed

    Beyeler, W; DaPra, E; Schneider, K

    2000-01-01

    The dramatic development of new electronic devices within the last 25 years has had a substantial influence on the control and automation of industrial bioprocesses. Within this short period of time the method of controlling industrial bioprocesses has changed completely. In this paper, the authors will use a practical approach focusing on the industrial applications of automation systems. From the early attempts to use computers for the automation of biotechnological processes up to the modern process automation systems some milestones are highlighted. Special attention is given to the influence of Standards and Guidelines on the development of automation systems. PMID:11092132

  6. Beam by design: Laser manipulation of electrons in modern accelerators

    NASA Astrophysics Data System (ADS)

    Hemsing, Erik; Stupakov, Gennady; Xiang, Dao; Zholents, Alexander

    2014-07-01

    Accelerator-based light sources such as storage rings and free-electron lasers use relativistic electron beams to produce intense radiation over a wide spectral range for fundamental research in physics, chemistry, materials science, biology, and medicine. More than a dozen such sources operate worldwide, and new sources are being built to deliver radiation that meets with the ever-increasing sophistication and depth of new research. Even so, conventional accelerator techniques often cannot keep pace with new demands and, thus, new approaches continue to emerge. In this article, a variety of recently developed and promising techniques that rely on lasers to manipulate and rearrange the electron distribution in order to tailor the properties of the radiation are reviewed. Basic theories of electron-laser interactions, techniques to create microstructures and nanostructures in electron beams, and techniques to produce radiation with customizable waveforms are reviewed. An overview of laser-based techniques for the generation of fully coherent x rays, mode-locked x-ray pulse trains, light with orbital angular momentum, and attosecond or even zeptosecond long coherent pulses in free-electron lasers is presented. Several methods to generate femtosecond pulses in storage rings are also discussed. Additionally, various schemes designed to enhance the performance of light sources through precision beam preparation including beam conditioning, laser heating, emittance exchange, and various laser-based diagnostics are described. Together these techniques represent a new emerging concept of "beam by design" in modern accelerators, which is the primary focus of this article.

  7. Design of 300A constant current electronic load

    NASA Astrophysics Data System (ADS)

    Cai, Ying

    2016-01-01

    An energy-efficient and stable power supply is at the core of most electronic products, and a DC electronic load is essential equipment for calibrating DC regulated power supplies. As the power industry develops toward greater diversity and complexity, higher requirements are placed on the electronic load equipment used to test power supplies. The quality of electronic load equipment is mainly reflected in three aspects: measurement accuracy, completeness of the measurement functions, and richness of the load characteristics. In this paper, a high-power, constant-current DC electronic load is designed. Two D/A converters are combined into a 20-bit D/A conversion unit to realize a minimum resolution of 0.045 mV. Four magnetic rings with high permeability and consistent magnetic properties, together with the corresponding processing circuit, form the current sampling unit, which solves a key difficulty of high-precision, large-current measurement. Three groups of 600 W power modules are connected in parallel to realize 1800 W of constant-current capacity. The electronic load has a 0-300 A constant-current characteristic, a measurement uncertainty of 1×10-4, and a maximum load voltage of 5 V. After testing, all specifications met the design requirements.
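
    The resolution figure relates to the D/A word length in the usual way; the full-scale value in the sketch below is hypothetical, chosen only to show the arithmetic, and the interpretation of the 1×10-4 uncertainty as relative to full scale is an assumption:

```python
def dac_lsb(full_scale: float, bits: int) -> float:
    """Smallest programmable step of an ideal DAC: full scale divided by 2**bits."""
    return full_scale / (2 ** bits)

# Hypothetical example: a 20-bit setpoint over a ~47 V full-scale range gives ~0.045 mV steps.
print(f"LSB = {dac_lsb(47.0, 20) * 1e3:.3f} mV")

# If the stated 1e-4 uncertainty is taken relative to the 300 A full-scale current:
print(f"current uncertainty = {1e-4 * 300:.2f} A")
```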

  8. Designing the Electronic Classroom: Applying Learning Theory and Ergonomic Design Principles.

    ERIC Educational Resources Information Center

    Emmons, Mark; Wilkinson, Frances C.

    2001-01-01

    Applies learning theory and ergonomic principles to the design of effective learning environments for library instruction. Discusses features of electronic classroom ergonomics, including the ergonomics of physical space, environmental factors, and workstations; and includes classroom layouts. (Author/LRW)

  9. Teachers' Grammar on the Electronic Highway: Design Criteria for "Telegram."

    ERIC Educational Resources Information Center

    Wu, Kamyin; Tsui, Amy B. M.

    1997-01-01

    Discusses the rationale and criteria for developing "Telegram," an electronic grammar database for English-as-a- Second-Language teachers in Hong Kong. Describes the importance of explicit grammatical knowledge in effective language teaching, and describes the design criteria for "Telegram," which aims to provide a body of content knowledge and…

  10. Designing an Electronic Classroom for Large College Courses.

    ERIC Educational Resources Information Center

    Aiken, Milam W.; Hawley, Delvin D.

    1995-01-01

    Describes a state-of-the-art electronic classroom at the University of Mississippi School of Business designed for large numbers of students and regularly scheduled classes. Highlights include: architecture of the room, hardware components, software utilized in the room, and group decision support system software and its uses. (JKP)

  11. Designing an Electronic Educational Game to Facilitate Immersion and Flow

    ERIC Educational Resources Information Center

    Ma, Yuxin; Williams, Doug; Prejean, Louise

    2014-01-01

    Advocates of electronic educational games often cite the work on motivation to support the use of games in education. However, motivation alone is inadequate to facilitate learning. Many of the educational games that focused their game design solely on the motivational effect failed to be either educational or entertaining. Theory and research is…

  12. An expert system for choosing the best combination of options in a general-purpose program for automated design synthesis

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.; Barthelemy, J. F. M.

    1985-01-01

    An expert system was developed to aid a user of the Automated Design Synthesis (ADS) general-purpose optimization computer program in selecting the best combination of strategy, optimizer, and one-dimensional search options for solving a problem. There are approximately 100 such combinations available in ADS. The knowledge base contains over 200 rules, and is divided into three categories: constrained problems, unconstrained problems, and constrained problems treated as unconstrained problems. The inference engine is written in LISP and is available on DEC-VAX and IBM PC/XT computers.
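
    The flavor of such a rule base can be conveyed with a tiny sketch (the rules below are invented for illustration and are not taken from the actual ~200-rule knowledge base, which was written in LISP):

```python
def recommend_ads_options(constrained: bool, smooth_gradients: bool, many_variables: bool):
    """Toy rule-based selection of (strategy, optimizer, 1-D search) options.
    Hypothetical rules only -- the real knowledge base covers ~100 option combinations."""
    if not constrained:
        optimizer = "BFGS variable metric" if smooth_gradients else "Powell's method"
        return ("none", optimizer, "golden section + polynomial interpolation")
    if many_variables:
        return ("sequential linear programming", "method of feasible directions", "golden section")
    return ("exterior penalty", "conjugate gradient (Fletcher-Reeves)", "golden section")

print(recommend_ads_options(constrained=True, smooth_gradients=True, many_variables=False))
```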

  13. Design and test of a semi-automated system for metrological verification of non-contact clinic thermometers

    NASA Astrophysics Data System (ADS)

    Giannetti, R., Dr; Sáenz-Nuño, M. A., Dr; Valderrama, J. M.; Fernandez, A.

    2013-09-01

    Clinic thermometers are probably the most widely used measurement instrument in medical facilities (hospitals, clinics, etc.) around the world. A good part of the physician's assessment of the patient's health status depends on the result of such a measurement. In this work, a system to assess the quality of non-contact clinic thermometers is developed and presented; the accuracy of the system is designed to make it a useful tool in the instrument verification phase and a basis for future automated calibration facilities.

  14. Electronics design of the airborne stabilized platform attitude acquisition module

    NASA Astrophysics Data System (ADS)

    Xu, Jiang; Wei, Guiling; Cheng, Yong; Li, Baolin; Bu, Hongyi; Wang, Hao; Zhang, Zhanwei; Li, Xingni

    2014-02-01

    We present the electronics design of an attitude acquisition module for an airborne stabilized platform. The design, based on the integrated MEMS sensor ADIS16405, develops the attitude information processing algorithms and the hardware circuit. The hardware circuit, with a small volume of only 44.9 x 43.6 x 24.6 mm3, is lightweight, modular, and digital. The PC software interface combines a plan chart with a track line to receive and display the attitude information. The attitude calculation uses a Kalman filtering algorithm to improve the measurement accuracy of the module in a dynamic environment.
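
    As background on the filtering step, a minimal one-axis Kalman filter that fuses a gyro rate with an accelerometer-derived angle is sketched below (a textbook formulation with made-up noise parameters, not the module's actual implementation):

```python
import numpy as np

def kalman_attitude(gyro_rates, accel_angles, dt=0.01, q=1e-4, r=1e-2):
    """One-axis angle estimate: predict with the gyro rate, correct with the accel angle."""
    angle, p = 0.0, 1.0          # state estimate and its variance
    estimates = []
    for rate, meas in zip(gyro_rates, accel_angles):
        # Predict step: integrate the gyro rate and inflate the variance.
        angle += rate * dt
        p += q
        # Update step: blend in the accelerometer-derived angle measurement.
        k = p / (p + r)
        angle += k * (meas - angle)
        p *= (1.0 - k)
        estimates.append(angle)
    return np.array(estimates)

rng = np.random.default_rng(7)
true_angle = np.linspace(0.0, 0.5, 500)                          # rad, slow pitch-up
gyro = np.gradient(true_angle, 0.01) + rng.normal(0, 0.02, 500)  # noisy rate
accel = true_angle + rng.normal(0, 0.05, 500)                    # noisy angle
est = kalman_attitude(gyro, accel)
print(f"final error: {abs(est[-1] - true_angle[-1]):.4f} rad")
```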

  15. Self-shielded electron linear accelerators designed for radiation technologies

    NASA Astrophysics Data System (ADS)

    Belugin, V. M.; Rozanov, N. E.; Pirozhenko, V. M.

    2009-09-01

    This paper describes self-shielded high-intensity electron linear accelerators designed for radiation technologies. The specific property of the accelerators is that they do not apply an external magnetic field; acceleration and focusing of the electron beams are performed by radio-frequency fields in the accelerating structures. The main characteristics of the accelerators are high current and beam power, as well as reliable operation and a long service life. To obtain these characteristics, a number of problems have been solved, including a particular optimization of the accelerator components and the application of a variety of specific means. The paper describes features of the electron beam dynamics, the accelerating structure, and the radio-frequency power supply. Several compact self-shielded accelerators for radiation sterilization and x-ray cargo inspection have been created. The introduced methods made it possible to obtain a high intensity of the electron beam and good performance of the accelerators.

  16. Electron identification and implications in SSC detector design

    SciTech Connect

    Bensinger, J. (Superconducting Super Collider Lab., Dallas, TX); Wang, E.M.; Yamamoto, H.

    1990-05-01

    In the context of Heavy Higgs searches in the decay mode H → ZZ → 4e, electron identification issues and their implications on detector design are discussed (though many of the issues are valid for muon modes as well). The backgrounds considered seem manageable (a net rejection of 100 for the combined electron ID and isolation cut is needed and seems fairly straightforward). A detector must have wide electron rapidity coverage, η < 2.5 to 3, and the ability to identify and measure an electron with P_T > GeV; be hermetic (in the sense of minimizing regions where electrons can disappear through cracks, dead spaces, or poorly placed walls); and have high-efficiency electron ID (~0.90), since we are trying to be sensitive to a feeble signal and we need 4 electrons. The product of a number of fairly high acceptances based on optimistic estimates still yields in the end a net Higgs acceptance of about 0.15 to 0.25, depending on how hermetic a detector is assumed. For M_Higgs < 500 GeV, this may be tolerable; whereas, for higher Higgs masses, the situation is much less clear.

  17. Mission Design Evaluation Using Automated Planning for High Resolution Imaging of Dynamic Surface Processes from the ISS

    NASA Technical Reports Server (NTRS)

    Knight, Russell; Donnellan, Andrea; Green, Joseph J.

    2013-01-01

    A challenge for any proposed mission is to demonstrate convincingly that the proposed systems will in fact deliver the science promised. Funding agencies and mission design personnel are becoming ever more skeptical of the abstractions that form the basis of the current state of the practice with respect to approximating science return. To address this, we have been using automated planning and scheduling technology to provide actual coverage campaigns that provide better predictive performance with respect to science return for a given mission design and set of mission objectives given implementation uncertainties. Specifically, we have applied an adaptation of ASPEN and SPICE to the Eagle-Eye domain that demonstrates the performance of the mission design with respect to coverage of science imaging targets that address climate change and disaster response. Eagle-Eye is an Earth-imaging telescope that has been proposed to fly aboard the International Space Station (ISS).

  18. Optical and electronic design of a calibrated multichannel electronic interferometer for quantitative flow visualization

    NASA Astrophysics Data System (ADS)

    Upton, T. D.; Watt, D. W.

    1995-09-01

    Calibrated multichannel electronic interferometry is an electro-optic technique for performing phase-shifting interferometry of transient phenomena. The design of an improved system for calibrated multichannel electronic interferometry is discussed. This includes a computational method for alignment of three phase-shifted interferograms and determination of the pixel correspondence. During calibration the phase, modulation, and bias of the optical system are determined. These data are stored electronically and used to compensate for errors associated with the path differences in the interferometer, the separation of the phase-shifted interferograms, and the measurement of the phase shift.
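
    For context, with three interferograms mutually shifted by 120° the wrapped phase follows from the standard three-step formula; a minimal sketch (the generic textbook algorithm, not necessarily the calibration-compensated processing described above):

```python
import numpy as np

def three_step_phase(i1, i2, i3):
    """Wrapped phase from three interferograms shifted by -120, 0 and +120 degrees:
    phi = atan2( sqrt(3) * (I1 - I3), 2*I2 - I1 - I3 )."""
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

# Synthetic test: a known phase ramp with unit bias and modulation.
x = np.linspace(0.0, 4.0 * np.pi, 256)
frames = [1.0 + np.cos(x + shift) for shift in (-2 * np.pi / 3, 0.0, 2 * np.pi / 3)]
phi = three_step_phase(*frames)
print("max phase error:", np.abs(np.angle(np.exp(1j * (phi - x)))).max())
```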

  19. Coherent electron cooling proof of principle instrumentation design

    SciTech Connect

    Gassner D. M.; Litvinenko, V.; Michnoff, R.; Miller, T.; Minty, M.; Pinayev, I.

    2012-04-15

    The goal of the Coherent Electron Cooling Proof-of-Principle (CeC PoP) experiment being designed at RHIC is to demonstrate longitudinal (energy spread) cooling before the expected CD-2 for eRHIC. The scope of the experiment is to longitudinally cool a single bunch of 40 GeV/u gold ions in RHIC. This paper will describe the instrumentation systems proposed to meet the diagnostics challenges. These include measurements of beam intensity, emittance, energy spread, bunch length, position, orbit stability, and transverse and temporal alignment of electron and ion beams.

  20. Automated identification of patients with a diagnosis of binge eating disorder from narrative electronic health records

    PubMed Central

    Bellows, Brandon K; LaFleur, Joanne; Kamauu, Aaron W C; Ginter, Thomas; Forbush, Tyler B; Agbor, Stephen; Supina, Dylan; Hodgkins, Paul; DuVall, Scott L

    2014-01-01

    Binge eating disorder (BED) does not have an International Classification of Diseases, 9th or 10th edition code, but is included under ‘eating disorder not otherwise specified’ (EDNOS). This historical cohort study identified patients with clinician-diagnosed BED from electronic health records (EHR) in the Department of Veterans Affairs between 2000 and 2011 using natural language processing (NLP) and compared their characteristics to patients identified by EDNOS diagnosis codes. NLP identified 1487 BED patients with classification accuracy of 91.8% and sensitivity of 96.2% compared to human review. After applying study inclusion criteria, 525 patients had NLP-identified BED only, 1354 had EDNOS only, and 68 had both BED and EDNOS. Patient characteristics were similar between the groups. This is the first study to use NLP as a method to identify BED patients from EHR data and will allow further epidemiological study of patients with BED in systems with adequate clinical notes. PMID:24201026

  1. Automated evaluation of electronic discharge notes to assess quality of care for cardiovascular diseases using Medical Language Extraction and Encoding System (MedLEE)

    PubMed Central

    Lin, Jou-Wei; Yang, Chen-Wei

    2010-01-01

    The objective of this study was to develop and validate an automated acquisition system to assess quality of care (QC) measures for cardiovascular diseases. This system combining searching and retrieval algorithms was designed to extract QC measures from electronic discharge notes and to estimate the attainment rates to the current standards of care. It was developed on the patients with ST-segment elevation myocardial infarction and tested on the patients with unstable angina/non-ST-segment elevation myocardial infarction, both diseases sharing almost the same QC measures. The system was able to reach a reasonable agreement (κ value) with medical experts from 0.65 (early reperfusion rate) to 0.97 (β-blockers and lipid-lowering agents before discharge) for different QC measures in the test set, and then applied to evaluate QC in the patients who underwent coronary artery bypass grafting surgery. The result has validated a new tool to reliably extract QC measures for cardiovascular diseases. PMID:20442141
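
    The κ agreement statistic quoted above is computed from observed and chance agreement in the usual way; a minimal sketch for a binary QC measure with hypothetical counts (not the study's data):

```python
def cohens_kappa(a_yes_b_yes: int, a_yes_b_no: int, a_no_b_yes: int, a_no_b_no: int) -> float:
    """Cohen's kappa for two raters (e.g., the automated system vs. a medical expert)."""
    n = a_yes_b_yes + a_yes_b_no + a_no_b_yes + a_no_b_no
    p_observed = (a_yes_b_yes + a_no_b_no) / n
    p_a_yes = (a_yes_b_yes + a_yes_b_no) / n
    p_b_yes = (a_yes_b_yes + a_no_b_yes) / n
    p_chance = p_a_yes * p_b_yes + (1 - p_a_yes) * (1 - p_b_yes)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical counts for one QC measure (system vs. expert): agreement on 180 of 200 charts.
print(f"kappa = {cohens_kappa(90, 10, 10, 90):.2f}")
```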

  2. Automated Work Packages Prototype: Initial Design, Development, and Evaluation. Light Water Reactor Sustainability Program

    SciTech Connect

    Oxstrand, Johanna Helene; Ahmad Al Rashdan; Le Blanc, Katya Lee; Bly, Aaron Douglas; Agarwal, Vivek

    2015-07-01

    The goal of the Automated Work Packages (AWP) project is to demonstrate how to enhance work quality, cost management, and nuclear safety through the use of advanced technology. The work described in this report is part of the digital architecture for a highly automated plant project of the technical program plan for advanced instrumentation, information, and control (II&C) systems technologies. This report addresses the DOE Milestone M2LW-15IN0603112: Describe the outcomes of field evaluations/demonstrations of the AWP prototype system and plant surveillance and communication framework requirements at host utilities. A brief background to the need for AWP research is provided, then two human factors field evaluation studies are described. These studies focus on the user experience of conducting a task (in this case a preventive maintenance and a surveillance test) while using an AWP system. The remaining part of the report describes an II&C effort to provide real time status updates to the technician by wireless transfer of equipment indications and a dynamic user interface.

  3. Automated Hydrogen/Deuterium Exchange Electron Transfer Dissociation High Resolution Mass Spectrometry Measured at Single-Amide Resolution

    NASA Astrophysics Data System (ADS)

    Landgraf, Rachelle R.; Chalmers, Michael J.; Griffin, Patrick R.

    2012-02-01

    Hydrogen deuterium exchange mass spectrometry (HDX-MS) is a well-established method for the measurement of solution-phase deuterium incorporation into proteins, which can provide insight into protein conformational mobility. However, most HDX measurements are constrained to regions of the protein where pepsin proteolysis allows detection at peptide resolution. Recently, single-amide resolution deuterium incorporation has been achieved by limiting gas-phase scrambling in the mass spectrometer. This was accomplished by employing a combination of soft ionization and desolvation conditions coupled with the radical-driven fragmentation technique electron transfer dissociation (ETD). Here, a hybrid LTQ-Orbitrap XL is systematically evaluated for its utility in providing single-amide deuterium incorporation for differential HDX analysis of a nuclear receptor upon binding small molecule ligands. We show that instrumental parameters can be optimized to minimize scrambling and can be incorporated into an established and fully automated HDX platform, making differential single-amide HDX possible for bottom-up analysis of complex systems. We have applied this system to determine differential single-amide resolution HDX data for the peroxisome proliferator-activated receptor bound with two ligands of interest.

  4. Conceptual design of industrial free electron laser using superconducting accelerator

    SciTech Connect

    Saldin, E.L.; Schneidmiller, E.A.; Ulyanov, Yu.N.

    1995-12-31

    This paper presents the conceptual design of a free electron laser (FEL) complex for industrial applications. The FEL complex consists of three FEL oscillators with optical output spanning the infrared (IR) and ultraviolet (UV) wavelengths ({lambda} = 0.3...20 {mu}m) and with an average output power of 10-20 kW. The driving beam for the FELs is produced by a superconducting accelerator. The electron beam is transported to the FELs via three beam lines (125 MeV and 2 x 250 MeV). A distinctive feature of the proposed complex is the high efficiency of the FEL oscillators, up to 20%. This is made possible by the use of a quasi-continuous electron beam and time-dependent undulator tapering.

  5. INTERACTION REGION DESIGN FOR THE ELECTRON-ION COLLIDER ERHIC.

    SciTech Connect

    MONTAG, C.; PARKER, B.; TEPIKIAN, S.; ET AL.

    2005-05-16

    To facilitate the study of collisions between 10 GeV polarized electrons and 100 GeV/u heavy ions or 250 GeV polarized protons at luminosities in the 10{sup 33} cm{sup -2} sec{sup -1} range (e-p case), adding a 10 GeV electron storage ring to the existing RHIC complex has been proposed. The interaction region of this electron-ion collider eRHIC has to provide the required low-beta focusing, while simultaneously accommodating the synchrotron radiation fan generated by beam separation close to the interaction point, which is particularly challenging. The latest design status of the eRHIC interaction region will be presented.

  6. High-temperature electronic components and circuit designs

    NASA Astrophysics Data System (ADS)

    Chang, H. T.

    Downhole logging instruments for geothermal application must have electronic circuits capable of operating from room temperature to 250 °C. A nondestructive evaluation instrument for geothermal wells requires a circuit that can be operated at high voltage and high current in order to provide high power output. To meet this need, a high-power, high-speed, cold-cathode switching tube was developed as a substitute for SCRs or thyratrons. The possibility of using low-leakage JFETs beyond their rated temperature in a circuit design is discussed. Commercial high-temperature components are reviewed.

  7. Molecular scene analysis: application of a topological approach to the automated interpretation of protein electron-density maps.

    PubMed

    Leherte, L; Fortier, S; Glasgow, J; Allen, F H

    1994-03-01

    Methods to assist in the spatial and visual analysis of electron-density maps have been investigated as part of a project in molecular scene analysis [Fortier, Castleden, Glasgow, Conklin, Walmsley, Leherte & Allen (1993). Acta Cryst. D49, 168-178]. In particular, the usefulness of the topological approach for the segmentation of medium-resolution (3 Å) maps of proteins and their interpretation in terms of structural motifs has been assessed. The approach followed is that proposed by Johnson [Johnson (1977). ORCRIT. The Oak Ridge Critical Point Network Program. Chemistry Division, Oak Ridge National Laboratory, USA] which provides a global representation of the electron-density distribution through the location, identification and linkage of its critical points. In the first part of the study, the topological approach was applied to calculated maps of three proteins of small to medium size so as to develop a methodology that could then be used for analyzing maps of medium resolution. The methodology was then applied to both calculated and experimental maps of penicillopepsin at 3 Å resolution. The study shows that the networks of critical points can provide a useful segmentation of the maps, tracing the protein main chains and capturing their conformation. In addition, these networks can be parsed in terms of secondary-structure motifs, through a geometrical analysis of the critical points. The procedure adopted for secondary-structure recognition, which was phrased in terms of geometry-based rules, provides a basis for a further automated implementation of a more complete set of recognition operations through the use of artificial-intelligence techniques. PMID:15299453
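
    The segmentation above is built from critical points of the electron-density distribution. The sketch below illustrates only the simplest ingredient of such an analysis, locating density peaks on a gridded synthetic map with scipy; it is not the ORCRIT critical-point network program cited in the abstract.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter, maximum_filter

    def density_peaks(rho, threshold, size=3):
        """Grid indices of local maxima ('peak' critical points) above a density cutoff."""
        local_max = maximum_filter(rho, size=size) == rho
        return np.argwhere(local_max & (rho > threshold))

    # Synthetic map: two smoothed point densities on a 32^3 grid stand in for structure.
    grid = np.zeros((32, 32, 32))
    grid[10, 10, 10] = grid[20, 22, 18] = 1.0
    rho = gaussian_filter(grid, sigma=2.0)
    print(density_peaks(rho, threshold=0.5 * rho.max()))   # the two peak positions
    ```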

  8. Automated transmission-mode scanning electron microscopy (tSEM) for large volume analysis at nanoscale resolution.

    PubMed

    Kuwajima, Masaaki; Mendenhall, John M; Lindsey, Laurence F; Harris, Kristen M

    2013-01-01

    Transmission-mode scanning electron microscopy (tSEM) on a field emission SEM platform was developed for efficient and cost-effective imaging of circuit-scale volumes from brain at nanoscale resolution. Image area was maximized while optimizing the resolution and dynamic range necessary for discriminating key subcellular structures, such as small axonal, dendritic and glial processes, synapses, smooth endoplasmic reticulum, vesicles, microtubules, polyribosomes, and endosomes which are critical for neuronal function. Individual image fields from the tSEM system were up to 4,295 µm² (65.54 µm per side) at 2 nm pixel size, contrasting with image fields from a modern transmission electron microscope (TEM) system, which were only 66.59 µm² (8.160 µm per side) at the same pixel size. The tSEM produced outstanding images and had reduced distortion and drift relative to TEM. Automated stage and scan control in tSEM easily provided unattended serial section imaging and montaging. Lens and scan properties on both TEM and SEM platforms revealed no significant nonlinear distortions within a central field of ∼100 µm² and produced near-perfect image registration across serial sections using the computational elastic alignment tool in Fiji/TrakEM2 software, and reliable geometric measurements from RECONSTRUCT™ or Fiji/TrakEM2 software. Axial resolution limits the analysis of small structures contained within a section (∼45 nm). Since this new tSEM is non-destructive, objects within a section can be explored at finer axial resolution in TEM tomography with current methods. Future development of tSEM tomography promises thinner axial resolution producing nearly isotropic voxels and should provide within-section analyses of structures without changing platforms. Brain was the test system given our interest in synaptic connectivity and plasticity; however, the new tSEM system is readily applicable to other biological systems. PMID:23555711

  9. GeV C. W. electron microtron design report

    SciTech Connect

    Not Available

    1982-05-01

    Rising interest in the nuclear physics community in a GeV C.W. electron accelerator reflects the growing importance of high-resolution short-range nuclear physics to future advances in the field. In this report major current problems are reviewed and the details of prospective measurements which could be made with a GeV C.W. electron facility are discussed, together with their impact on an understanding of nuclear forces and the structure of nuclear matter. The microtron accelerator has been chosen as the technology to generate the electron beams required for the research discussed because of the advantages of superior beam quality, low capital and operating cost and capability of furnishing beams of several energies and intensities simultaneously. A complete technical description of the conceptual design for a 2 GeV double-sided C.W. electron microtron is presented. The accelerator can furnish three beams with independently controlled energy and intensity. The maximum current per beam is 100 μA. Although the precise objective for maximum beam energy is still a subject of debate, the design developed in this study provides the base technology for microtron accelerators at higher energies (2 to 6 GeV) using multi-sided geometries.

  10. Modulated Electron Radiation Therapy: An Investigation On Fast Beam Models and Radiation-Tolerant Solutions For Automated Motion Control Of A Few Leaf Electron Collimator

    NASA Astrophysics Data System (ADS)

    Papaconstadopoulos, Paul

    The purpose of this study was to address two specific issues related to the clinical application of Modulated Electron Radiation Therapy (MERT). The first was to investigate radiation-tolerant solutions for automated motion control of a Few Leaf Electron Collimator. The second was to implement a fast, Monte Carlo-based, parameterized beam model for characterization of the electron beam in modulated deliveries. Two approaches were investigated for the implementation of a radiation-tolerant position feedback system: (i) the use of CMOS-based optical encoders protected by a prototype shield and (ii) the use of an analog device, such as a potentiometer, whose radiation tolerance is significantly higher. The two approaches were implemented and their performance tested. Results indicated that the optical encoders could not be safely used under radiation even with the presence of a shield. The analog position feedback system proved to be a viable solution. Future work will focus on implementing an analog position feedback system suitable for clinical use. The MC-based, parameterized beam model is based on the idea of deriving the scattered electron beam characteristics directly on the exit plane of the linear accelerator by the use of source scatter fluence kernels. Primary beam characteristics are derived by fast Monte Carlo simulations. The novelty of the method is that arbitrary rectangular fields can be recreated fast by superposition of the appropriate source kernels directly on the output plane. Depth and profile dose distributions and dose output were derived for three field sizes (8 x 8, 2 x 2 and 2 x 8 cm2) and energies of 6 MeV and 20 MeV electron beams by the beam model and compared with full Monte Carlo simulations. The primary beam showed excellent agreement in all cases. Scattered particles agreed well for the larger field sizes of 8 x 8 and 2 x 8 cm2, while discrepancies were encountered for scattered particles for the smaller field

  11. Automated background subtraction technique for electron energy-loss spectroscopy and application to semiconductor heterostructures.

    PubMed

    Angadi, Veerendra C; Abhayaratne, Charith; Walther, Thomas

    2016-05-01

    Electron energy-loss spectroscopy (EELS) has become a standard tool for identification and sometimes also quantification of elements in materials science. This is important for understanding the chemical and/or structural composition of processed materials. In EELS, the background is often modelled using an inverse power-law function. Core-loss ionization edges are superimposed on top of the dominating background, making it difficult to quantify their intensities. The inverse power law has to be modelled individually in the pre-edge region of each ionization edge in the spectrum rather than for the entire spectrum. To achieve this, one must know in advance which core losses may be present. The aim of this study is to automatically detect core-loss edges, model the background and extract quantitative elemental maps and profiles from EELS, based on several EELS spectrum images (EELS SI), without any prior knowledge of the material. The algorithm provides elemental maps and concentration profiles by making smart decisions in selecting pre-edge regions and integration ranges. The results of the quantification for a semiconductor thin film heterostructure show high chemical sensitivity, reasonable group III/V intensity ratios but also quantification issues when narrow integration windows are used without deconvolution. PMID:26998582
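
    For background, the inverse power law mentioned above is fitted in a pre-edge window and extrapolated under the edge before the edge intensity is integrated. A minimal sketch of that standard step on a synthetic spectrum follows (made-up numbers; the paper's automated selection of pre-edge regions and integration windows is not reproduced here).

    ```python
    import numpy as np

    def powerlaw_background(energy, counts, fit_window):
        """Fit I(E) = A * E**(-r) over a pre-edge window and extrapolate it."""
        lo, hi = fit_window
        mask = (energy >= lo) & (energy <= hi)
        # Linear least squares in log-log space: ln I = ln A - r ln E
        slope, intercept = np.polyfit(np.log(energy[mask]), np.log(counts[mask]), 1)
        return np.exp(intercept) * energy ** slope

    # Synthetic spectrum: power-law background plus a step-like core-loss edge at 532 eV.
    energy = np.linspace(400.0, 700.0, 601)
    counts = 1e6 * energy ** -3.0 + 2e3 * (energy > 532)
    background = powerlaw_background(energy, counts, fit_window=(450.0, 525.0))
    edge = counts - background                                   # background-subtracted signal
    window = (energy > 532.0) & (energy < 582.0)                 # 50 eV integration window
    print(f"integrated edge intensity: {edge[window].sum():.0f}")
    ```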

  12. Automated Physician Order Recommendations and Outcome Predictions by Data-Mining Electronic Medical Records

    PubMed Central

    Chen, Jonathan H.; Altman, Russ B.

    2014-01-01

    The meaningful use of electronic medical records (EMR) will come from effective clinical decision support (CDS) applied to physician orders, the concrete manifestation of clinical decision making. CDS development is currently limited by a top-down approach, requiring manual production and limited end-user awareness. A statistical data-mining alternative automatically extracts expertise as association statistics from structured EMR data (>5.4M data elements from >19K inpatient encounters). This powers an order recommendation system analogous to commercial systems (e.g., Amazon.com’s “Customers who bought this…”). Compared to a standard benchmark, the association method improves order prediction precision from 26% to 37% (p<0.01). Introducing an inverse frequency weighted recall metric demonstrates a quantifiable improvement from 3% to 17% (p<0.01) in recommending more specifically relevant orders. The system also predicts clinical outcomes, such as 30 day mortality and 1 week ICU intervention, with ROC AUC of 0.88 and 0.78 respectively, comparable to state-of-the-art prognosis scores. PMID:25717414
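
    The recommendation step above rests on co-occurrence (association) statistics between orders in past encounters. The sketch below shows that kind of counting and ranking on toy encounters with invented order names; it is not the authors' system, benchmark, or data model.

    ```python
    from collections import Counter
    from itertools import combinations

    # Toy inpatient encounters, each a set of orders (hypothetical names).
    encounters = [
        {"cbc", "troponin", "ecg", "aspirin"},
        {"cbc", "troponin", "ecg", "heparin"},
        {"cbc", "basic_metabolic", "chest_xray"},
        {"troponin", "ecg", "aspirin", "heparin"},
    ]

    item_count, pair_count = Counter(), Counter()
    for enc in encounters:
        item_count.update(enc)
        pair_count.update(frozenset(p) for p in combinations(sorted(enc), 2))

    def recommend(order, top_n=3):
        """Rank co-ordered items by P(other | order), a simple association statistic."""
        scores = {}
        for pair, n in pair_count.items():
            if order in pair:
                (other,) = pair - {order}
                scores[other] = n / item_count[order]
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

    print(recommend("troponin"))   # e.g. [('ecg', 1.0), ('aspirin', 0.67), ...]
    ```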

  14. Automated extraction of clinical traits of multiple sclerosis in electronic medical records

    PubMed Central

    Davis, Mary F; Sriram, Subramaniam; Bush, William S; Denny, Joshua C; Haines, Jonathan L

    2013-01-01

    Objectives The clinical course of multiple sclerosis (MS) is highly variable, and research data collection is costly and time consuming. We evaluated natural language processing techniques applied to electronic medical records (EMR) to identify MS patients and the key clinical traits of their disease course. Materials and methods We used four algorithms based on ICD-9 codes, text keywords, and medications to identify individuals with MS from a de-identified, research version of the EMR at Vanderbilt University. Using a training dataset of the records of 899 individuals, algorithms were constructed to identify and extract detailed information regarding the clinical course of MS from the text of the medical records, including clinical subtype, presence of oligoclonal bands, year of diagnosis, year and origin of first symptom, Expanded Disability Status Scale (EDSS) scores, timed 25-foot walk scores, and MS medications. Algorithms were evaluated on a test set validated by two independent reviewers. Results We identified 5789 individuals with MS. For all clinical traits extracted, precision was at least 87% and specificity was greater than 80%. Recall values for clinical subtype, EDSS scores, and timed 25-foot walk scores were greater than 80%. Discussion and conclusion This collection of clinical data represents one of the largest databases of detailed, clinical traits available for research on MS. This work demonstrates that detailed clinical information is recorded in the EMR and can be extracted for research purposes with high reliability. PMID:24148554
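
    One of the traits above, the EDSS score, usually appears in free text in a small number of surface patterns. The sketch below shows a simple regular-expression extraction of that kind on invented note snippets; the authors' algorithms, which also use ICD-9 codes and medication lists, are not reproduced here.

    ```python
    import re

    # Hypothetical de-identified note snippets.
    notes = [
        "Relapsing-remitting MS. EDSS 3.5 today, stable on interferon.",
        "Secondary progressive MS; EDSS score of 6.0. Timed 25-foot walk 8.2 s.",
        "No disability score documented at this visit.",
    ]

    EDSS_RE = re.compile(r"\bEDSS(?:\s+score)?(?:\s+of)?\s+(\d{1,2}(?:\.\d)?)", re.IGNORECASE)

    def extract_edss(text):
        """Return EDSS scores found in the text as floats (empty list if none)."""
        return [float(value) for value in EDSS_RE.findall(text)]

    for note in notes:
        print(extract_edss(note))   # [3.5] / [6.0] / []
    ```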

  15. Automated defect review of the wafer bevel with a defect review scanning electron microscope

    NASA Astrophysics Data System (ADS)

    McGarvey, Steve; Kanezawa, Masakazu

    2009-03-01

    One of the few remaining bastions of non-regulated Integrated Circuit defectivity is the wafer bevel. Recent internal Integrated Circuit Manufacturing studies have suggested that the edge bevel may be responsible for as much as a two to three percent yield loss during a defect excursion on the manufacturing line and a one to two percent yield loss during ongoing wafer manufacturing. A new generation of defect inspection equipment has been introduced to the Research and Development, Integrated Circuit, MEMS and Si wafer manufacturing markets that has given end users the ability to detect defects located on the bevel of the wafer. The inherent weakness of the current batch of wafer bevel inspection equipment is the lack of automatic classification of discrete defects into multiple, significant classification bins and the lack of discrete elemental analysis data. Root-cause analysis is based on minimal discrete defect analysis as a surrogate for a statistically valid sampling of defects from the bevel. This paper provides a study of the methods employed with a Hitachi RS-5500EQ Defect Review Scanning Electron Microscope (DRSEM) to automatically capture high resolution/high magnification images and collect elemental analysis on a statistically valid sample of the discrete defects that were located by a bevel inspection system.

  16. GASICA: generic automated stress induction and control application design of an application for controlling the stress state

    PubMed Central

    van der Vijgh, Benny; Beun, Robbert J.; van Rood, Maarten; Werkhoven, Peter

    2014-01-01

    In a multitude of research and therapy paradigms it is relevant to know, and desirably to control, the stress state of a patient or participant. Examples include research paradigms in which the stress state is the dependent or independent variable, or therapy paradigms where this state indicates the boundaries of the therapy. To our knowledge, no application currently exists that focuses specifically on the automated control of the stress state while at the same time being generic enough to be used for various therapy and research purposes. Therefore, we introduce GASICA, an application aimed at the automated control of the stress state in a multitude of therapy and research paradigms. The application consists of three components: a digital stressor game, a set of measurement devices, and a feedback model. These three components form a closed loop (called a biocybernetic loop by Pope et al. (1995) and Fairclough (2009)) that continuously presents an acute psychological stressor, measures several physiological responses to this stressor, and adjusts the stressor intensity based on these measurements by means of the feedback model, thereby aiming to control the stress state. In this manner, GASICA presents multidimensional and ecologically valid stressors while remaining continuously in control of their form and intensity, aiming at the automated control of the stress state. Furthermore, the application is designed as a modular open-source application to easily implement different therapy and research tasks using a high-level programming interface and configuration file, and allows for the addition of (existing) measurement equipment, making it usable for various paradigms. PMID:25538554
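
    The biocybernetic loop described above raises or lowers stressor intensity from physiological measurements. A minimal, hypothetical proportional-control sketch of such a loop follows; the signal names, gain, and simulated response are invented and are not GASICA's actual feedback model.

    ```python
    def update_intensity(intensity, measured_arousal, target_arousal, gain=0.1,
                         bounds=(0.0, 1.0)):
        """One feedback iteration: raise the stressor intensity when measured arousal is
        below target, lower it when above, and clamp to the allowed range."""
        error = target_arousal - measured_arousal
        return min(max(intensity + gain * error, bounds[0]), bounds[1])

    # Hypothetical session in which arousal lags the stressor (toy first-order response).
    intensity, arousal, target = 0.3, 0.2, 0.6
    for step in range(5):
        intensity = update_intensity(intensity, arousal, target)
        arousal += 0.5 * (intensity - arousal)      # simulated physiological response
        print(step, round(intensity, 2), round(arousal, 2))
    ```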

  17. IMPACT OF CANAL DESIGN LIMITATIONS ON WATER DELIVERY OPERATIONS AND AUTOMATION

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Irrigation canals are often designed for water transmission. The design engineer simply ensures that the canal will pass the maximum design discharge. However, irrigation canals are frequently operated far below design capacity. Because demands and the distribution of flow at bifurcations (branch points...

  18. Preliminary Design and Evaluation of Portable Electronic Flight Progress Strips

    NASA Technical Reports Server (NTRS)

    Doble, Nathan A.; Hansman, R. John

    2002-01-01

    There has been growing interest in using electronic alternatives to the paper Flight Progress Strip (FPS) for air traffic control. However, most research has been centered on radar-based control environments, and has not considered the unique operational needs of the airport air traffic control tower. Based on an analysis of the human factors issues for control tower Decision Support Tool (DST) interfaces, a requirement has been identified for an interaction mechanism which replicates the advantages of the paper FPS (e.g., head-up operation, portability) but also enables input and output with DSTs. An approach has been developed which uses a Portable Electronic FPS that has attributes of both a paper strip and an electronic strip. The prototype flight strip system uses Personal Digital Assistants (PDAs) to replace individual paper strips in addition to a central management interface which is displayed on a desktop computer. Each PDA is connected to the management interface via a wireless local area network. The Portable Electronic FPSs replicate the core functionality of paper flight strips and have additional features which provide a heads-up interface to a DST. A departure DST is used as a motivating example. The central management interface is used for aircraft scheduling and sequencing and provides an overview of airport departure operations. This paper will present the design of the Portable Electronic FPS system as well as preliminary evaluation results.

  19. Automated Classification Of Scanning Electron Microscope Particle Images Using Morphological Analysis

    NASA Astrophysics Data System (ADS)

    Lamarche, B. L.; Lewis, R. R.; Girvin, D. C.; McKinley, J. P.

    2008-12-01

    We are developing a software tool that can automatically classify anthropogenic and natural aerosol particulates using morphological analysis. Our method was developed using SEM (backscattered and secondary electron) images of single particles. Particle silhouettes are detected and converted into polygons using Intel's OpenCV image processing library. Our analysis then proceeds independently for the two kinds of images. Analysis of secondary images concerns itself solely with the silhouette and seeks to quantify its shape and roughness. Traversing the polygon with spline interpolation, we uniformly sample k(s), the signed curvature of the silhouette's path as a function of distance along the perimeter s. k(s) is invariant under rotation and translation. The power spectrum of k(s) qualitatively shows both shape and roughness: more power at low frequencies indicates variation in shape; more power at higher frequencies indicates a rougher silhouette. We present a series of filters (low-, band-, and high-pass) which we convolve with k(s) to yield a set of parameters that characterize the shape and roughness numerically. Analysis of backscatter images focuses on the (visual) texture, which is the result of both composition and geometry. Using the silhouette as a boundary, we compute the variogram, a statistical measure of inter-pixel covariance as a function of distance. Variograms take on characteristic curves, which we fit with a heuristic, asymptotic function that uses a small set of parameters. The combination of silhouette and variogram fit parameters forms the basis of a multidimensional classification space whose dimensionality we may reduce by principal component analysis and whose region boundaries allow us to classify new particles. This analysis is performed without a priori knowledge of other physical, chemical, or climatic properties. The method will be adapted to multi-particulate images.
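
    The silhouette analysis above samples the signed curvature k(s) along the perimeter and examines its power spectrum. The sketch below computes both quantities for a toy closed contour using periodic finite differences; the band-pass filter bank and the variogram analysis of backscattered images are not shown.

    ```python
    import numpy as np

    def curvature_spectrum(x, y):
        """Signed curvature k(s) of a closed contour and the power spectrum of k(s).

        x, y: silhouette coordinates sampled (approximately) uniformly along the perimeter.
        """
        def pgrad(f):
            # Periodic central difference, appropriate for a closed silhouette.
            return (np.roll(f, -1) - np.roll(f, 1)) / 2.0

        dx, dy = pgrad(x), pgrad(y)
        ddx, ddy = pgrad(dx), pgrad(dy)
        k = (dx * ddy - dy * ddx) / (dx ** 2 + dy ** 2) ** 1.5
        power = np.abs(np.fft.rfft(k - k.mean())) ** 2
        return k, power

    # Toy silhouette: a circle with a 12-cycle ripple standing in for surface roughness.
    theta = np.linspace(0.0, 2.0 * np.pi, 512, endpoint=False)
    radius = 10.0 + 0.3 * np.sin(12 * theta)
    k, power = curvature_spectrum(radius * np.cos(theta), radius * np.sin(theta))
    print(power.argmax())   # dominant harmonic of k(s); 12 for this toy contour
    ```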

  20. Collimated Photo-Electron Gun (CPEG) Development for Spaceflight Applications: Electronics Design and Preliminary Testing

    NASA Astrophysics Data System (ADS)

    Taylor, A.; Everding, D.; Krause, L. H.

    2012-12-01

    In previous decades, active space experiments have been conducted with electron beams to generate artificial aurora, trace magnetic field lines, and stimulate Very Low Frequency (VLF) emissions. A new electron source called the collimated photo-electron gun (CPEG) is presently under development for spaceflight applications. High-energy Light Emitting Diodes (LEDs) are used to photo-eject electrons off a target material, and these photoelectrons are then focused into a beam using electrostatic lenses. The beam electron energy is controlled by the voltage on the lenses, and the electron flux is controlled by the brightness of the LEDs. The LEDs require a narrow range of both voltage and current setpoints, and thus must be pulse-width modulated at a high frequency to control the brightness. Because the lens and target voltages must be kept at a fixed ratio to ensure a laminar beam, the target is powered by a voltage-controlled current source. An Arduino is used to provide command and data handling for the electron gun and the telemetry interface with the host spacecraft. To measure the current flowing to the target, an instrumentation amplifier boosts the voltage from a current-viewing resistor and feeds this voltage to one of the analog inputs of the Arduino. The LEDs are powered using a highly-specialized integrated circuit designed for sourcing high-power LEDs: the LM3500-21. The detailed design and preliminary results of the calibration of the electronics are presented in this paper. The CPEG is presently under consideration for numerous flight opportunities, and a prototype is scheduled for environmental and functional testing in the fourth quarter of 2012.

  1. Maneuver Automation Software

    NASA Technical Reports Server (NTRS)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  2. Tools for Designing, Evaluating, and Certifying NextGen Technologies and Procedures: Automation Roles and Responsibilities

    NASA Technical Reports Server (NTRS)

    Kanki, Barbara G.

    2011-01-01

    Barbara Kanki from NASA Ames Research Center will discuss research that focuses on the collaborations between pilots, air traffic controllers and dispatchers that will change in NextGen systems as automation increases and roles and responsibilities change. The approach taken by this NASA Ames team is to build a collaborative systems assessment template (CSAT) based on detailed task descriptions within each system to establish a baseline of the current operations. The collaborative content and context are delineated through the review of regulatory and advisory materials, policies, procedures and documented practices as augmented by field observations and interviews. The CSAT is developed to aid the assessment of key human factors and performance tradeoffs that result from considering different collaborative arrangements under NextGen system changes. In theory, the CSAT product may be applied to any NextGen application (such as Trajectory Based Operations) with specified ground and aircraft capabilities.

  3. Design and automated production of 11C-alpha-methyl-l-tryptophan (11C-AMT).

    PubMed

    Huang, Xuan; Xiao, Xia; Gillies, Robert J; Tian, Haibin

    2016-05-01

    11C-alpha-methyl-l-tryptophan ([11C]AMT), a tryptophan metabolism PET tracer, has successfully been employed for brain serotonin pathway and indoleamine 2,3-dioxygenase (IDO) pathway related tumor imaging. We here report a reliable, automated procedure for routine synthesis of [11C]AMT based on an Eckert and Ziegler Modular-Lab system. The semi-preparative HPLC was incorporated into the system to improve chemical purity and specific activity. The 6-step radiosynthesis followed by HPLC purification provided [11C]AMT in 5.3±1.2% (n=6, non-decay-corrected) overall radiochemical yield with radiochemical purity >99% and specific activity of 35-116 GBq/μmol. Usually, a patient-ready dose of 2.95±0.65 GBq (n=6, EOS) was produced from about 55.5 GBq of [11C]CO2 in 50 min. PMID:27150033

  4. Numerical simulation and design of a thermionic electron gun

    NASA Astrophysics Data System (ADS)

    Hoseinzade, M.; Nijatie, M.; Sadighzadeh, A.

    2016-05-01

    This paper reports the simulation of an electron gun. The effects of some parameters on the beam quality were studied and optimal choices were identified. Numerical beam qualities are given for a common electrostatic triode gun, and the dependence on design parameters such as electrode geometries and bias voltages applied to these electrodes is shown. An electron beam of diameter 5 mm with energy of 5 keV was assumed for the simulation process. Some design parameters were identified as variable parameters in the presence of space charge. These parameters are the inclination angle of the emission electrode, the applied voltage to the focusing electrode, the gap width between the emission electrode and the focusing electrode, and the diameter of the focusing electrode. The triode extraction system is designed and optimized by using CST software (for Particle Beam Simulations). The physical design of the extraction system is given in this paper. From the simulation results, it is concluded that the optimal inclination angle of the emission electrode is 22.5°, the optimal applied voltage to the focusing electrode is Vfoc = -600 V, the optimal separation distance (gap between the emission electrode and the focusing electrode) is 4 mm, and the optimal diameter of the emission electrode is 14 mm. Initial results for these efforts aimed at emittance improvement are also given.

  5. Design and demonstration of automated data analysis algorithms for ultrasonic inspection of complex composite panels with bonds

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Forsyth, David S.; Welter, John T.

    2016-02-01

    To address the data review burden and improve the reliability of the ultrasonic inspection of large composite structures, automated data analysis (ADA) algorithms have been developed to make calls on indications that satisfy the detection criteria and minimize false calls. The original design followed standard procedures for analyzing signals for time-of-flight indications and backwall amplitude dropout. However, certain complex panels with varying shape, ply drops and the presence of bonds can complicate this interpretation process. In this paper, enhancements to the automated data analysis algorithms are introduced to address these challenges. To estimate the thickness of the part and presence of bonds without prior information, an algorithm tracks potential backwall or bond-line signals, and evaluates a combination of spatial, amplitude, and time-of-flight metrics to identify bonded sections. Once part boundaries, thickness transitions and bonded regions are identified, feature extraction algorithms are applied to multiple sets of through-thickness and backwall C-scan images, for evaluation of both first layer through thickness and layers under bonds. ADA processing results are presented for a variety of complex test specimens with inserted materials and other test discontinuities. Lastly, enhancements to the ADA software interface are presented, which improve the software usability for final data review by the inspectors and support the certification process.
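
    Two of the detection criteria above are time-of-flight indications and backwall amplitude dropout. The sketch below illustrates only a per-pixel backwall-dropout check on a synthetic A-scan volume (gate the backwall echo, compare against the median gated amplitude); it is a simplified stand-in for, not a reproduction of, the ADA algorithms described in the paper.

    ```python
    import numpy as np

    def backwall_dropout_map(ascan_volume, dt, gate, drop_db=6.0):
        """Flag C-scan pixels whose gated backwall echo falls more than `drop_db` below
        the median backwall amplitude.

        ascan_volume: (ny, nx, nt) rectified A-scan amplitudes.
        dt: sample interval (microseconds); gate: (t_start, t_end) bracketing the backwall.
        """
        i0, i1 = int(gate[0] / dt), int(gate[1] / dt)
        backwall_amp = ascan_volume[:, :, i0:i1].max(axis=2)       # peak inside the gate
        reference = np.median(backwall_amp)
        drop = 20.0 * np.log10(np.maximum(backwall_amp, 1e-12) / reference)
        return drop < -drop_db                                      # True = indication

    # Synthetic scan: a backwall echo near 6 us everywhere, with one region of signal loss.
    rng = np.random.default_rng(0)
    vol = 0.05 * rng.random((50, 50, 400))
    t = np.arange(400) * 0.02                                       # dt = 0.02 us
    vol += np.exp(-((t - 6.0) / 0.05) ** 2)                         # backwall echo
    vol[20:30, 20:30] *= 0.2                                        # simulated dropout
    print(backwall_dropout_map(vol, dt=0.02, gate=(5.5, 6.5)).sum())  # 100 flagged pixels
    ```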

  6. Sci—Thur PM: Planning and Delivery — 03: Automated delivery and quality assurance of a modulated electron radiation therapy plan

    SciTech Connect

    Connell, T; Papaconstadopoulos, P; Alexander, A; Serban, M; Devic, S; Seuntjens, J

    2014-08-15

    Modulated electron radiation therapy (MERT) offers the potential to improve healthy tissue sparing through increased dose conformity. Challenges remain, however, in accurate beamlet dose calculation, plan optimization, collimation method and delivery accuracy. In this work, we investigate the accuracy and efficiency of an end-to-end MERT plan and automated-delivery workflow for the electron boost portion of a previously treated whole breast irradiation case. Dose calculations were performed using Monte Carlo methods and beam weights were determined using a research-based treatment planning system capable of inverse optimization. The plan was delivered to radiochromic film placed in a water equivalent phantom for verification, using an automated motorized tertiary collimator. The automated delivery, which covered 4 electron energies, 196 subfields and 6183 total MU, was completed in 25.8 minutes, including 6.2 minutes of beam-on time with the remainder of the delivery time spent on collimator leaf motion and the automated interfacing with the accelerator in service mode. The delivery time could be reduced by 5.3 minutes with minor electron collimator modifications and the beam-on time could be reduced by an estimated factor of 2–3 through redesign of the scattering foils. Comparison of the planned and delivered film dose gave 3%/3 mm gamma pass rates of 62.1, 99.8, 97.8, 98.3, and 98.7 percent for the 9, 12, 16, 20 MeV, and combined energy deliveries respectively. Good results were also seen in the delivery verification performed with a MapCHECK 2 device. The results showed that accurate and efficient MERT delivery is possible with current technologies.

  7. Design and function of an electron mobility spectrometer with a thick gas electron multiplier

    NASA Astrophysics Data System (ADS)

    Orchard, Gloria M.; Puddu, Silvia; Waker, Anthony J.

    2016-04-01

    The design and function of an electron mobility spectrometer (EMS) including a thick gas electron multiplier (THGEM) is presented. The THGEM was designed to be easily incorporated into an existing EMS to investigate the ability to detect tritium in air using a micropattern gas detector. The THGEM and a collection plate (anode) were installed and the appropriate circuitry was designed and connected to supply the required voltages to the THGEM-EMS. An alpha source (241Am) was used to generate electron-ion pairs within the gas-filled sensitive volume of the EMS. The electrons were used to investigate the THGEM-EMS response as a function of applied voltage to the THGEM and anode. The relative gas-gain and system resolution of the THGEM-EMS were measured at various applied voltage settings. It was observed that, for this EMS setup, a potential difference of +420 V across the THGEM and +150 V across the induction region represented the minimum voltage requirements to operate with stable gain and system resolution. Furthermore, as expected, the gain is strongly affected not only by the potential difference across the THGEM, but also by the applied voltage to the anode and resulting potential difference between the THGEM and anode.

  8. Evaluation of green infrastructure designs using the Automated Geospatial Watershed Assessment Tool

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In arid and semi-arid regions, green infrastructure (GI) designs can address several issues facing urban environments, including augmenting water supply, mitigating flooding, decreasing pollutant loads, and promoting greenness in the built environment. An optimum design captures stormwater, addressi...

  9. Design of the large hadron electron collider interaction region

    NASA Astrophysics Data System (ADS)

    Cruz-Alaniz, E.; Newton, D.; Tomás, R.; Korostelev, M.

    2015-11-01

    The large hadron electron collider (LHeC) is a proposed upgrade of the Large Hadron Collider (LHC) within the high luminosity LHC (HL-LHC) project, to provide electron-nucleon collisions and explore a new regime of energy and luminosity for deep inelastic scattering. The design of an interaction region for any collider is always a challenging task given that the beams are brought into crossing with the smallest beam sizes in a region where there are tight detector constraints. In this case integrating the LHeC into the existing HL-LHC lattice, to allow simultaneous proton-proton and electron-proton collisions, increases the difficulty of the task. A nominal design was presented in the LHeC conceptual design report in 2012 featuring an optical configuration that focuses one of the proton beams of the LHC to β*=10 cm at the LHeC interaction point to reach the desired luminosity of L = 10^33 cm^-2 s^-1. This value is achieved with the aid of a new inner triplet of quadrupoles at a distance L*=10 m from the interaction point. However, the chromatic beta beating was found to be intolerable with regard to machine protection issues. An advanced chromatic correction scheme was required. This paper explores the feasibility of the extension of a novel optical technique called the achromatic telescopic squeezing scheme and the flexibility of the interaction region design, in order to find the optimal solution that would produce the highest luminosity while controlling the chromaticity, minimizing the synchrotron radiation power and maintaining the dynamic aperture required for stability.
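
    For orientation, the quoted luminosity target and the β* = 10 cm focusing are connected by the textbook expression for the luminosity of head-on colliding bunched beams (a standard relation, not a formula taken from this paper):

    ```latex
    L = \frac{f_{\mathrm{coll}}\, N_e N_p}{4\pi\, \sigma_x^* \sigma_y^*},
    \qquad
    \sigma_{x,y}^* = \sqrt{\varepsilon_{x,y}\, \beta_{x,y}^*},
    ```

    so that, for fixed bunch populations N_e and N_p and collision frequency f_coll, reducing β* at the interaction point shrinks the beam sizes σ* and raises L, which is why strong low-beta focusing drives the interaction-region design.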

  10. Work and Programmable Automation.

    ERIC Educational Resources Information Center

    DeVore, Paul W.

    A new industrial era based on electronics and the microprocessor has arrived, an era that is being called intelligent automation. Intelligent automation, in the form of robots, replaces workers, and the new products, using microelectronic devices, require significantly less labor to produce than the goods they replace. The microprocessor thus…

  11. 3D-Printed Microfluidic Automation

    PubMed Central

    Au, Anthony K.; Bhattacharjee, Nirveek; Horowitz, Lisa F.; Chang, Tim C.; Folch, Albert

    2015-01-01

    Microfluidic automation – the automated routing, dispensing, mixing, and/or separation of fluids through microchannels – generally remains a slowly-spreading technology because device fabrication requires sophisticated facilities and the technology’s use demands expert operators. Integrating microfluidic automation in devices has involved specialized multi-layering and bonding approaches. Stereolithography is an assembly-free, 3D-printing technique that is emerging as an efficient alternative for rapid prototyping of biomedical devices. Here we describe fluidic valves and pumps that can be stereolithographically printed in optically-clear, biocompatible plastic and integrated within microfluidic devices at low cost. User-friendly fluid automation devices can be printed and used by non-engineers as replacement for costly robotic pipettors or tedious manual pipetting. Engineers can manipulate the designs as digital modules into new devices of expanded functionality. Printing these devices only requires the digital file and electronic access to a printer. PMID:25738695

  12. Design of Electron and Ion Crabbing Cavities for an Electron-Ion Collider

    SciTech Connect

    Alejandro Castilla Loeza, Geoffrey Krafft, Jean Delayen

    2012-07-01

    Beyond the 12 GeV upgrade at Jefferson Lab, a Medium Energy Electron-Ion Collider (MEIC) has been considered. In order to achieve the desired high luminosities at the Interaction Points (IP), the use of crabbing cavities is under study. In this work, we present the to-date designs of superconducting cavities considered for crabbing both ion and electron bunches. A discussion of properties such as peak surface fields and higher-order mode separation is presented. Keywords: superconducting, deflecting cavity, crab cavity.

  13. Design of materials configurations for enhanced phononic and electronic properties

    NASA Astrophysics Data System (ADS)

    Daraio, Chiara

    The discovery of novel nonlinear dynamic and electronic phenomena is presented for the specific cases of granular materials and carbon nanotubes. This research was conducted for designing and constructing optimized macro-, micro- and nano-scale structural configurations of materials, and for studying their phononic and electronic behavior. Variation of composite arrangements of granular elements with different elastic properties in linear chain-of-spheres, Y-junction or 3-D configurations led to a variety of novel phononic phenomena and interesting physical properties, which can be potentially useful for security, communications, mechanical and biomedical engineering applications. Mechanical and electronic properties of carbon nanotubes with different atomic arrangements and microstructures were also investigated. Electronic properties of Y-junction configured carbon nanotubes exhibit an exciting transistor switch behavior which is not seen in linear configuration nanotubes. Strongly nonlinear materials were designed and fabricated using novel and innovative concepts. Due to their unique strongly nonlinear and anisotropic nature, novel wave phenomena have been discovered. Specifically, violations of Snell's law were detected and a new mechanism of wave interaction with interfaces between NTPCs (Nonlinear Tunable Phononic Crystals) was established. Polymer-based systems were tested for the first time, and the tunability of the solitary wave speed was demonstrated. New materials with transformed signal propagation speed in the manageable range of 10-100 m/s and signal amplitude typical for audible speech have been developed. Enhanced mitigation of solitary and shock waves in 1-D chains was demonstrated, and a new protective medium was designed for practical applications. 1-D, 2-D and 3-D strongly nonlinear systems have been investigated, providing a broad impact on the whole area of strongly nonlinear wave dynamics and creating an experimental basis for new

  14. Automated a complex computer aided design concept generated using macros programming

    NASA Astrophysics Data System (ADS)

    Rizal Ramly, Mohammad; Asrokin, Azharrudin; Abd Rahman, Safura; Zulkifly, Nurul Ain Md

    2013-12-01

    Changing a complex computer-aided design profile such as car and aircraft surfaces has always been difficult and challenging. The capability of CAD software such as AutoCAD and CATIA shows that a simple configuration of a CAD design can be easily modified without hassle, but this is not the case with complex design configurations. Design changes help users to test and explore various configurations of the design concept before the production of a model. The purpose of this study is to look into macros programming as a parametric method for commercial aircraft design. Macros programming is a method in which the configurations of the design are produced by recording a script of commands, editing the data values and adding new command lines to create an element of parametric design. The steps and procedure to create a macro program are discussed, along with some difficulties encountered during the process of creation and the advantages of its use. Generally, the advantages of macros programming as a method of parametric design are: flexibility for design exploration, increased usability of the design solution, allowing certain changes to be properly contained within the model while restricting others, and real-time feedback on changes.

  15. A Fully Automated Diabetes Prevention Program, Alive-PD: Program Design and Randomized Controlled Trial Protocol

    PubMed Central

    Azar, Kristen MJ; Block, Torin J; Romanelli, Robert J; Carpenter, Heather; Hopkins, Donald; Palaniappan, Latha; Block, Clifford H

    2015-01-01

    Background In the United States, 86 million adults have pre-diabetes. Evidence-based interventions that are both cost effective and widely scalable are needed to prevent diabetes. Objective Our goal was to develop a fully automated diabetes prevention program and determine its effectiveness in a randomized controlled trial. Methods Subjects with verified pre-diabetes were recruited to participate in a trial of the effectiveness of Alive-PD, a newly developed, 1-year, fully automated behavior change program delivered by email and Web. The program involves weekly tailored goal-setting, team-based and individual challenges, gamification, and other opportunities for interaction. An accompanying mobile phone app supports goal-setting and activity planning. For the trial, participants were randomized by computer algorithm to start the program immediately or after a 6-month delay. The primary outcome measures are change in HbA1c and fasting glucose from baseline to 6 months. The secondary outcome measures are change in HbA1c, glucose, lipids, body mass index (BMI), weight, waist circumference, and blood pressure at 3, 6, 9, and 12 months. Randomization and delivery of the intervention are independent of clinic staff, who are blinded to treatment assignment. Outcomes will be evaluated for the intention-to-treat and per-protocol populations. Results A total of 340 subjects with pre-diabetes were randomized to the intervention (n=164) or delayed-entry control group (n=176). Baseline characteristics were as follows: mean age 55 (SD 8.9); mean BMI 31.1 (SD 4.3); male 68.5%; mean fasting glucose 109.9 (SD 8.4) mg/dL; and mean HbA1c 5.6 (SD 0.3)%. Data collection and analysis are in progress. We hypothesize that participants in the intervention group will achieve statistically significant reductions in fasting glucose and HbA1c as compared to the control group at 6 months post baseline. Conclusions The randomized trial will provide rigorous evidence regarding the efficacy of

  16. Computational strategies for the automated design of RNA nanoscale structures from building blocks using NanoTiler☆

    PubMed Central

    Bindewald, Eckart; Grunewald, Calvin; Boyle, Brett; O’Connor, Mary; Shapiro, Bruce A.

    2013-01-01

    One approach to designing RNA nanoscale structures is to use known RNA structural motifs such as junctions, kissing loops or bulges and to construct a molecular model by connecting these building blocks with helical struts. We previously developed an algorithm for detecting internal loops, junctions and kissing loops in RNA structures. Here we present algorithms for automating or assisting many of the steps that are involved in creating RNA structures from building blocks: (1) assembling building blocks into nanostructures using either a combinatorial search or constraint satisfaction; (2) optimizing RNA 3D ring structures to improve ring closure; (3) sequence optimisation; (4) creating a unique non-degenerate RNA topology descriptor. This effectively creates a computational pipeline for generating molecular models of RNA nanostructures and more specifically RNA ring structures with optimized sequences from RNA building blocks. We show several examples of how the algorithms can be utilized to generate RNA tecto-shapes. PMID:18838281

  17. An expert system for choosing the best combination of options in a general purpose program for automated design synthesis

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.; Barthelemy, J.-F. M.

    1986-01-01

    An expert system called EXADS has been developed to aid users of the Automated Design Synthesis (ADS) general purpose optimization program. ADS has approximately 100 combinations of strategy, optimizer, and one-dimensional search options from which to choose. It is difficult for a nonexpert to make this choice. This expert system aids the user in choosing the best combination of options based on the user's knowledge of the problem and the expert knowledge stored in the knowledge base. The knowledge base is divided into three categories: constrained problems, unconstrained problems, and constrained problems being treated as unconstrained problems. The inference engine and rules are written in LISP; the system contains about 200 rules and executes on DEC-VAX (with Franz-LISP) and IBM PC (with IQ-LISP) computers.
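
    The expert system above encodes its expertise as rules that map problem characteristics to a recommended combination of ADS options. The sketch below shows that style of rule-table dispatch in Python with invented rules and placeholder option names; the actual EXADS knowledge base is written in LISP and contains about 200 rules.

    ```python
    # Each rule: (predicate over a problem description, recommended option combination).
    # The recommendations are placeholders for illustration, not actual ADS option codes.
    RULES = [
        (lambda p: p["constrained"] and p["gradients_available"],
         "constrained strategy with a gradient-based optimizer"),
        (lambda p: p["constrained"] and not p["gradients_available"],
         "penalty-function strategy with a derivative-free optimizer"),
        (lambda p: not p["constrained"] and p["gradients_available"],
         "variable-metric optimizer with polynomial line search"),
        (lambda p: True, "pattern/random search (fallback)"),
    ]

    def recommend(problem):
        """Return the first recommendation whose rule fires (simple forward chaining)."""
        for predicate, recommendation in RULES:
            if predicate(problem):
                return recommendation

    print(recommend({"constrained": True, "gradients_available": False}))
    ```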

  18. Design and construction of a medium-scale automated direct measurement respirometric system to assess aerobic biodegradation of polymers

    NASA Astrophysics Data System (ADS)

    Castro Aguirre, Edgar

    A medium-scale automated direct measurement respirometric (DMR) system was designed and built to assess the aerobic biodegradation of up to 30 materials in triplicate simultaneously. In addition, a computer application was developed for rapid analysis of the data generated. The developed DMR system was able to simulate different testing conditions by varying temperature and relative humidity, which are the major exposure conditions affecting biodegradation. Two complete tests for determining the aerobic biodegradation of polymers under composting conditions were performed to show the efficacy and efficiency of both the DMR system and the DMR data analyzer. In the two tests, cellulose reached 70% mineralization at 139 and 45 days, respectively. The difference in time for cellulose to reach 70% mineralization was attributed to the composition of the compost and water availability, which strongly affect the biodegradation rate. Finally, among the tested materials, at least 60% of the organic carbon content of the biodegradable polymers was converted into carbon dioxide by the end of the test.
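
    The percent-mineralization figures above follow the usual respirometric bookkeeping: CO2 evolved beyond the compost-only blank, divided by the theoretical CO2 of the sample's organic carbon. A minimal sketch with hypothetical numbers follows; in the actual system the CO2 measurements are logged automatically.

    ```python
    def percent_mineralization(co2_sample_g, co2_blank_g, sample_mass_g, carbon_fraction):
        """Percent of the material's organic carbon evolved as CO2.

        co2_sample_g: cumulative CO2 from the vessel with compost plus test material (g)
        co2_blank_g:  cumulative CO2 from the compost-only blank (g)
        carbon_fraction: organic-carbon mass fraction of the material (~0.444 for cellulose)
        """
        theoretical_co2 = sample_mass_g * carbon_fraction * (44.0 / 12.0)  # g CO2 per g C
        return 100.0 * (co2_sample_g - co2_blank_g) / theoretical_co2

    # Hypothetical cumulative readings for a 20 g cellulose sample:
    print(round(percent_mineralization(28.0, 5.0, 20.0, 0.444), 1))   # about 70.6 %
    ```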

  19. Electron-proton spectrometer: Summary for critical design review

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The electron-proton spectrometer (EPS) is mounted externally to the Skylab module complex on the command service module. It is designed to make a 2π omnidirectional measurement of electrons and protons which result from solar flares or enhancement of the radiation belts. The EPS data will provide accurate radiation dose information so that uncertain relative biological effectiveness factors are eliminated by measuring the external particle spectra. Astronaut radiation safety, therefore, can be ensured, as the EPS data can be used to correct or qualify radiation dose measurements recorded by other radiation measuring instrumentation within the Skylab module complex. The EPS has the capability of measuring an extremely wide dynamic radiation dose rate range, approaching 10 to the 7th power. Simultaneously, the EPS has the capability to process data from extremely high radiation fields such as might be encountered in the wake of an intense solar flare.

  20. Automated divertor target design by adjoint shape sensitivity analysis and a one-shot method

    SciTech Connect

    Dekeyser, W.; Reiter, D.; Baelmans, M.

    2014-12-01

    As magnetic confinement fusion progresses towards the development of first reactor-scale devices, computational tokamak divertor design is a topic of high priority. Presently, edge plasma codes are used in a forward approach, where magnetic field and divertor geometry are manually adjusted to meet design requirements. Due to the complex edge plasma flows and large number of design variables, this method is computationally very demanding. On the other hand, efficient optimization-based design strategies have been developed in computational aerodynamics and fluid mechanics. Such an optimization approach to divertor target shape design is elaborated in the present paper. A general formulation of the design problems is given, and conditions characterizing the optimal designs are formulated. Using a continuous adjoint framework, design sensitivities can be computed at a cost of only two edge plasma simulations, independent of the number of design variables. Furthermore, by using a one-shot method the entire optimization problem can be solved at an equivalent cost of only a few forward simulations. The methodology is applied to target shape design for uniform power load, in simplified edge plasma geometry.
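
    The key computational point above is that an adjoint solve yields the design gradient at a cost independent of the number of design variables. The sketch below demonstrates the discrete adjoint identity on a generic toy linear system (not the edge-plasma model), checking the adjoint gradient against a finite difference.

    ```python
    import numpy as np

    # Toy "forward model": A(d) u = b with one design parameter d entering the matrix,
    # and objective J(u) = c.T @ u. The adjoint solve A.T lam = c gives dJ/dd cheaply.
    n = 5
    rng = np.random.default_rng(1)
    A0 = 4.0 * np.eye(n) + 0.1 * rng.random((n, n))
    dA = 0.05 * rng.random((n, n))            # dA/dd (fixed direction for the toy model)
    b, c = rng.random(n), rng.random(n)

    def J(d):
        return c @ np.linalg.solve(A0 + d * dA, b)

    def dJ_dd_adjoint(d):
        A = A0 + d * dA
        u = np.linalg.solve(A, b)             # one forward solve
        lam = np.linalg.solve(A.T, c)         # one adjoint solve
        return -lam @ (dA @ u)                # dJ/dd = -lam.T (dA/dd) u  (b independent of d)

    d = 0.3
    fd = (J(d + 1e-6) - J(d - 1e-6)) / 2e-6   # finite-difference check
    print(dJ_dd_adjoint(d), fd)               # the two values agree closely
    ```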

  1. Design of III-Nitride Hot Electron Transistors

    NASA Astrophysics Data System (ADS)

    Gupta, Geetak

    III-Nitride based devices have made great progress over the past few decades in electronics and photonics applications. As the technology and theoretical understanding of the III-N system matures, the limitations on further development are based on very basic electronic properties of the material, one of which is electron scattering (or ballistic electron effects). This thesis explores the design space of III-N based ballistic electron transistors using novel design, growth and process techniques. The hot electron transistor (HET) is a unipolar vertical device that operates on the principle of injecting electrons over a high-energy barrier (φBE) called the emitter into an n-doped region called the base and finally collecting the high energy electrons (hot electrons) over another barrier (φBC) called the collector barrier. The injected electrons traverse the base in a quasi-ballistic manner. Electrons that get scattered in the base contribute to base current. High gain in the HET is thus achieved by enabling ballistic transport of electrons in the base. In addition, low leakage across the collector barrier (I_BC,leak) and low base resistance (RB) are needed to achieve high performance. Because of device attributes such as vertical structure, ballistic transport and low-resistance n-type base, the HET has the potential of operating at very high frequencies. Electrical measurements of a HET structure can be used to understand high-energy electron physics and extract information like mean free path in semiconductors. The III-Nitride material system is particularly suited for HETs as it offers a wide range of conduction-band offsets (ΔEc) and polarization charges, which can be engineered to obtain barriers that can inject hot electrons and have low leakage at room temperature. In addition, polarization charges in the III-N system can be engineered to obtain a high-density and high-mobility 2DEG in the base, which can be used to reduce base resistance and allow vertical scaling. With these

  2. Cost effective, weight sensitive design for military airborne electronic systems

    SciTech Connect

    Peck, W.M.

    1996-12-31

    Thermal management of Military airborne electronic systems is governed by many trade-offs. While the trade-offs may change depending on the customer and system requirements, minimizing weight is usually the primary engineering concern because it saves aircraft fuel. Fuel savings provide increased range and time aloft for the aircraft. The most common approach to achieving meaningful reductions in equipment weight is to reduce system volume. Reduced volume is achieved by increasing electronic packaging density, which is accomplished by incorporating new materials, processes, and technologies into the system design. The following four considerations are currently under study in the development of an 8 kW high-altitude Military electronic system in order to reduce system volume: (1) identifying design parameters and performing trade-off studies between the use of liquid vs. forced air for system cooling; (2) modeling the total system thermal resistance path to identify possible areas for reducing component temperature rise in order to provide enhanced system reliability; (3) substituting commercial plastic integrated circuits (ICs) for Mil ceramic components to reduce material cost while still meeting system requirements; and (4) using TC1050 material technology in: Militarizing Commercial Off The Shelf (COTS) Circuit Card Assemblies (CCAs), developing high conductivity ceramic packaging for Multi Chip Modules (MCMs), and developing low coefficient of thermal expansion (CTE) composites for use at both the chip and chassis level. Current results from these studies have yielded a design that has a 2:1 reduction in system volume and a weight reduction of 480 lbs from a currently fielded system.

  3. Summary, Working Group 1: Electron guns and injector designs

    NASA Astrophysics Data System (ADS)

    Ben-Zvi, I.; Bazarov, I. V.

    2006-02-01

    We summarize the proceedings of Working Group 1 of the 2005 Energy Recovery Linac (ERL) Workshop. The subject of this working group, the electron gun and injector design, is arguably the most critical part of the ERL as it determines the ultimate performance of this type of accelerator. Working Group 1 dealt with a variety of subjects: the technology of DC, normal-conducting RF and superconducting RF guns; beam dynamics in the gun and injector; the cathode and laser package; modeling and computational issues; magnetized beams and polarization. A short overview of the issues covered in the Working Group is presented in this paper.

  4. Collaborating with human factors when designing an electronic textbook

    SciTech Connect

    Ratner, J.A.; Zadoks, R.I.; Attaway, S.W.

    1996-04-01

    The development of on-line engineering textbooks presents new challenges to authors to effectively integrate text and tools in an electronic environment. By incorporating human factors principles of interface design and cognitive psychology early in the design process, a team at Sandia National Laboratories was able to make the end product more usable and shorten the prototyping and editing phases. A critical issue was simultaneous development of paper and on-line versions of the textbook. In addition, interface consistency presented difficulties, given the distinct goals and limitations of each medium. Many of these problems were resolved swiftly with human factors input using templates, style guides and iterative usability testing of both paper and on-line versions. Writing style continuity was also problematic, with numerous authors contributing to the text.

  5. High-temperature electronic components and circuit designs

    SciTech Connect

    Chang, H.T.

    1982-01-01

    Downhole logging instruments for geothermal application must have electronic circuits capable of operating from room temperature to 250°C. Previous research was centered on low voltage/low current hybrid microcircuits. However, a nondestructive evaluation (NDE) instrument for geothermal wells requires a circuit that can be operated at high voltage and high current in order to provide high-power output. In designing such a circuit, Sandia Laboratories is developing a high-power, high-speed, cold-cathode switching tube to be used as a substitute for SCRs or thyratrons. The possibility of using low-leakage JFETs beyond their rated temperature in a circuit design will be discussed. Commercial high-temperature components will be reviewed.

  6. Dual scattering foil design for poly-energetic electron beams.

    PubMed

    Kainz, K K; Antolak, J A; Almond, P R; Bloch, C D; Hogstrom, K R

    2005-03-01

    The laser wakefield acceleration (LWFA) mechanism can accelerate electrons to energies within the 6-20 MeV range desired for therapy applications. However, the energy spectrum of LWFA-generated electrons is broad, on the order of tens of MeV. Using existing laser technology, the therapeutic beam might require a significant energy spread to achieve clinically acceptable dose rates. The purpose of this work was to test the assumption that a scattering foil system designed for a mono-energetic beam would be suitable for a poly-energetic beam with a significant energy spread. Dual scattering foil systems were designed for mono-energetic beams using an existing analytical formalism based on Gaussian multiple-Coulomb scattering theory. The design criterion was to create a flat beam that would be suitable for fields up to 25 x 25 cm2 at 100 cm from the primary scattering foil. Radial planar fluence profiles for poly-energetic beams with energy spreads ranging from 0.5 MeV to 6.5 MeV were calculated using two methods: (a) analytically by summing beam profiles for a range of mono-energetic beams through the scattering foil system, and (b) by Monte Carlo using the EGS/BEAM code. The analytic calculations facilitated fine adjustments to the foil design, and the Monte Carlo calculations enabled us to verify the results of the analytic calculation and to determine the phase-space characteristics of the broadened beam. Results showed that the flatness of the scattered beam is fairly insensitive to the width of the input energy spectrum. Also, results showed that dose calculated by the analytical and Monte Carlo methods agreed very well in the central portion of the beam. Outside the useable field area, the differences between the analytical and Monte Carlo results were small but significant, possibly due to the small angle approximation. However, these did not affect the conclusion that a scattering foil system designed for a mono-energetic beam will be suitable for a poly-energetic beam.
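
    A minimal sketch of calculation method (a), under simplifying assumptions: each mono-energetic component is reduced to a single Gaussian radial fluence whose width scales as 1/E, and the poly-energetic profile is the energy-spectrum-weighted sum of those components. The widths, mean energy and spreads are illustrative only and do not reproduce the published dual-foil design; the point is that the off-axis falloff changes little between a narrow and a wide spectrum.

```python
# Minimal sketch (not the paper's formalism): method (a), approximating the
# poly-energetic radial planar fluence as the energy-spectrum-weighted sum of
# mono-energetic profiles. Each mono-energetic profile is reduced to a single
# Gaussian whose width scales as 1/E (small-angle multiple scattering); all
# numbers are illustrative only, not the published foil design.
import numpy as np

def mono_profile(r_cm, energy_mev, sigma_at_10mev_cm=30.0):
    sigma = sigma_at_10mev_cm * 10.0 / energy_mev
    return np.exp(-0.5 * (r_cm / sigma) ** 2)

def poly_profile(r_cm, mean_mev, spread_mev, n_components=61):
    energies = np.linspace(max(1.0, mean_mev - 3 * spread_mev),
                           mean_mev + 3 * spread_mev, n_components)
    weights = np.exp(-0.5 * ((energies - mean_mev) / spread_mev) ** 2)
    mix = sum(w * mono_profile(r_cm, e) for w, e in zip(weights, energies))
    return mix / weights.sum()

r = np.linspace(0.0, 17.7, 200)       # out to the corner of a 25 x 25 cm field
for spread in (0.5, 6.5):             # narrow vs. wide LWFA-like spectrum, MeV
    p = poly_profile(r, mean_mev=13.0, spread_mev=spread)
    print(f"spread {spread:>3} MeV: corner-to-axis fluence ratio = {p[-1] / p[0]:.3f}")
```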

  7. Investigating the intrinsic cleanliness of automated handling designed for EUV mask pod-in-pod systems

    NASA Astrophysics Data System (ADS)

    Brux, O.; van der Walle, P.; van der Donck, J. C. J.; Dress, P.

    2011-11-01

    Extreme Ultraviolet Lithography (EUVL) is the most promising solution for technology nodes of 16 nm (hp) and below. However, several unique EUV mask challenges must be resolved for a successful launch of the technology into the market. Uncontrolled introduction of particles and/or contamination into the EUV scanner significantly increases the risk for device yield loss and potentially scanner down-time. With the absence of a pellicle to protect the surface of the EUV mask, a zero particle adder regime between final clean and the point-of-exposure is critical for the active areas of the mask. A Dual Pod concept for handling EUV masks had been proposed by the industry as a means to minimize the risk of mask contamination during transport and storage. SuSS-HamaTech introduces MaskTrackPro InSync as a fully automated solution for the handling of EUV masks in and out of this Dual Pod System and therefore constitutes an interface between various tools inside the Fab. The intrinsic cleanliness of each individual handling and storage step of the inner shell (EIP) of this Dual Pod and the EUV mask inside the InSync Tool has been investigated to confirm the capability for minimizing the risk of cross-contamination. An Entegris Dual Pod EUV-1000A-A110 has been used for the qualification. The particle detection for the qualification procedure was executed with the TNO's RapidNano Particle Scanner, qualified for particle sizes down to 50nm (PSL equivalent). It has been shown that the target specification of < 2 particles @ 60nm per 25 cycles has been achieved. In cases where added particles were measured, the EIP has been identified as a potential root cause for Ni particle generation. Any direct Ni-Al contact has to be avoided to mitigate the risk of material abrasion.

  8. Design and Practices for Use of Automated Drilling and Sample Handling in MARTE While Minimizing Terrestrial and Cross Contamination

    NASA Astrophysics Data System (ADS)

    Miller, David P.; Bonaccorsi, Rosalba; Davis, Kiel

    2008-10-01

    Mars Astrobiology Research and Technology Experiment (MARTE) investigators used an automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources -- whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination) -- to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples.

  9. HotSpot Wizard 2.0: automated design of site-specific mutations and smart libraries in protein engineering

    PubMed Central

    Bendl, Jaroslav; Stourac, Jan; Sebestova, Eva; Vavra, Ondrej; Musil, Milos; Brezovsky, Jan; Damborsky, Jiri

    2016-01-01

    HotSpot Wizard 2.0 is a web server for automated identification of hot spots and design of smart libraries for engineering proteins’ stability, catalytic activity, substrate specificity and enantioselectivity. The server integrates sequence, structural and evolutionary information obtained from 3 databases and 20 computational tools. Users are guided through the processes of selecting hot spots using four different protein engineering strategies and optimizing the resulting library's size by narrowing down a set of substitutions at individual randomized positions. The only required input is a query protein structure. The results of the calculations are mapped onto the protein's structure and visualized with a JSmol applet. HotSpot Wizard lists annotated residues suitable for mutagenesis and can automatically design appropriate codons for each implemented strategy. Overall, HotSpot Wizard provides comprehensive annotations of protein structures and assists protein engineers with the rational design of site-specific mutations and focused libraries. It is freely available at http://loschmidt.chemi.muni.cz/hotspotwizard. PMID:27174934
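
    The library-size bookkeeping described above can be illustrated with a short calculation; this is a generic sketch, not HotSpot Wizard's code. The allowed substitutions per position are hypothetical, and the oversampling estimate uses the standard sampling-with-replacement formula N = L ln(1/(1 - P)).

```python
# Minimal sketch (not HotSpot Wizard's code): estimating the size of a focused
# library from the substitutions allowed at each randomized position, and the
# number of clones to screen for a target coverage. The allowed substitutions
# are illustrative; N = L * ln(1/(1 - P)) is the standard oversampling estimate.
import math

# amino acids allowed at each hot-spot position (hypothetical example)
allowed = {
    "pos41": ["A", "V", "L", "I"],
    "pos87": ["S", "T"],
    "pos142": ["F", "W", "Y", "H", "N"],
}

library_size = math.prod(len(aas) for aas in allowed.values())   # 4 * 2 * 5 = 40 variants
coverage = 0.95
clones_to_screen = math.ceil(library_size * math.log(1.0 / (1.0 - coverage)))

print(f"protein-level library size: {library_size}")
print(f"clones for {coverage:.0%} coverage: {clones_to_screen}")
```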

  10. Design and practices for use of automated drilling and sample handling in MARTE while minimizing terrestrial and cross contamination.

    PubMed

    Miller, David P; Bonaccorsi, Rosalba; Davis, Kiel

    2008-10-01

    Mars Astrobiology Research and Technology Experiment (MARTE) investigators used an automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources--whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination)--to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples. PMID:19105753

  11. HotSpot Wizard 2.0: automated design of site-specific mutations and smart libraries in protein engineering.

    PubMed

    Bendl, Jaroslav; Stourac, Jan; Sebestova, Eva; Vavra, Ondrej; Musil, Milos; Brezovsky, Jan; Damborsky, Jiri

    2016-07-01

    HotSpot Wizard 2.0 is a web server for automated identification of hot spots and design of smart libraries for engineering proteins' stability, catalytic activity, substrate specificity and enantioselectivity. The server integrates sequence, structural and evolutionary information obtained from 3 databases and 20 computational tools. Users are guided through the processes of selecting hot spots using four different protein engineering strategies and optimizing the resulting library's size by narrowing down a set of substitutions at individual randomized positions. The only required input is a query protein structure. The results of the calculations are mapped onto the protein's structure and visualized with a JSmol applet. HotSpot Wizard lists annotated residues suitable for mutagenesis and can automatically design appropriate codons for each implemented strategy. Overall, HotSpot Wizard provides comprehensive annotations of protein structures and assists protein engineers with the rational design of site-specific mutations and focused libraries. It is freely available at http://loschmidt.chemi.muni.cz/hotspotwizard. PMID:27174934

  12. Ultraviolet Free Electron Laser Facility preliminary design report

    SciTech Connect

    Ben-Zvi, I.

    1993-02-01

    This document, the Preliminary Design Report (PDR) for the Brookhaven Ultraviolet Free Electron Laser (UV FEL) facility, describes all the elements of a facility proposed to meet the needs of a research community which requires ultraviolet sources not currently available as laboratory based lasers. Further, for these experiments, the requisite properties are not extant in either the existing second or upcoming third generation synchrotron light sources. This document is the result of our effort at BNL to identify potential users, determine the requirements of their experiments, and to design a facility which can not only satisfy the existing need, but have adequate flexibility for possible future extensions as need dictates and as evolving technology allows. The PDR is comprised of three volumes. In this, the first volume, background for the development of the proposal is given, including descriptions of the UV FEL facility, and representative examples of the science it was designed to perform. Discussion of the limitations and potential directions for growth are also included. A detailed description of the facility design is then provided, which addresses the accelerator, optical, and experimental systems. Information regarding the conventional construction for the facility is contained in an addendum to volume one (IA).

  13. Design analysis of a novel hot-electron microbolometer

    SciTech Connect

    Nahum, M.; Richards, P.L. ); Mears, C.A. )

    1992-08-01

    We propose a novel antenna coupled microbolometer which makes use of the weak coupling between electrons and phonons in a metal at low temperatures. The radiation is collected by a planar lithographed antenna and thermalized in a thin metal strip. The resulting temperature rise of the electrons is detected by a tunnel junction, where part of the metal strip forms the normal electrode. All components are deposited directly on a substrate so that arrays can be conveniently produced by conventional lithographic techniques. The active area of the bolometer is thermally decoupled by its small volume, by the thermal resistance between electrons and phonons in the strip, and by the reflection of quasiparticles at the interface between the strip and the superconducting antenna. Design calculations based on a metal volume of 2 × 6 × 0.05 μm³ at an operating temperature of T = 100 mK give an NEP ≈ 3 × 10⁻¹⁹ W Hz⁻¹/²

  14. Design analysis of a novel hot-electron microbolometer

    SciTech Connect

    Nahum, M.; Richards, P.L.; Mears, C.A.

    1992-08-01

    We propose a novel antenna coupled microbolometer which makes use of the weak coupling between electrons and phonons in a metal at low temperatures. The radiation is collected by a planar lithographed antenna and thermalized in a thin metal strip. The resulting temperature rise of the electrons is detected by a tunnel junction, where part of the metal strip forms the normal electrode. All components are deposited directly on a substrate so that arrays can be conveniently produced by conventional lithographic techniques. The active area of the bolometer is thermally decoupled by its small volume, by the thermal resistance between electrons and phonons in the strip, and by the reflection of quasiparticles at the interface between the strip and the superconducting antenna. Design calculations based on a metal volume of 2 × 6 × 0.05 μm³ at an operating temperature of T = 100 mK give an NEP ≈ 3 × 10⁻¹⁹ W Hz⁻¹/², a time constant ≈ 10 μs, and a responsivity ≈ 10⁹ V/W. The calculated sensitivity is almost two orders of magnitude higher than that of the best available direct detectors of millimeter and submillimeter radiation at the same temperature.
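
    The quoted numbers can be roughly reproduced from the standard hot-electron bolometer relations: the electron-phonon thermal conductance G = 5ΣVT⁴ and the thermal-fluctuation-limited NEP = (4kBT²G)^1/2. This is a back-of-the-envelope sketch rather than the authors' calculation; the coupling constant Σ below is a typical literature value for a normal metal and is an assumption, so the result agrees with the quoted 3 × 10⁻¹⁹ W Hz⁻¹/² only to within a factor of a few.

```python
# Minimal sketch (not the authors' calculation): thermal-fluctuation-limited
# NEP of a hot-electron bolometer from the electron-phonon thermal conductance
# G = 5 * Sigma * V * T^4 and NEP = sqrt(4 * k_B * T^2 * G). Sigma is a
# typical literature value for a normal metal and is an assumption.
K_B = 1.380649e-23          # J/K
SIGMA = 2.0e9               # W m^-3 K^-5, electron-phonon coupling (assumed)

volume_m3 = 2e-6 * 6e-6 * 0.05e-6        # 2 x 6 x 0.05 um^3 metal strip
temperature_k = 0.100

g_eph = 5.0 * SIGMA * volume_m3 * temperature_k**4      # W/K
nep = (4.0 * K_B * temperature_k**2 * g_eph) ** 0.5     # W / sqrt(Hz)

print(f"G_e-ph = {g_eph:.2e} W/K, phonon-noise NEP = {nep:.1e} W/Hz^0.5")
```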

  15. Demonstration of electronic design automation flow for massively parallel e-beam lithography

    NASA Astrophysics Data System (ADS)

    Brandt, Pieter; Belledent, Jérôme; Tranquillin, Céline; Figueiro, Thiago; Meunier, Stéfanie; Bayle, Sébastien; Fay, Aurélien; Milléquant, Matthieu; Icard, Beatrice; Wieland, Marco

    2014-07-01

    For proximity effect correction in 5 keV e-beam lithography, three elementary building blocks exist: dose modulation, geometry (size) modulation, and background dose addition. Combinations of these three methods are quantitatively compared in terms of throughput impact and process window (PW). In addition, overexposure in combination with negative bias results in PW enhancement at the cost of throughput. In proximity effect correction by overexposure (PEC-OE), the entire layout is set to a fixed dose and geometry sizes are adjusted. In PEC-dose to size (DTS), both dose and geometry sizes are locally optimized. In PEC-background (BG), a background is added to correct the long-range part of the point spread function. In single e-beam tools (Gaussian or Shaped-beam), throughput heavily depends on the number of shots. In raster scan tools such as MAPPER Lithography's FLX 1200 (MATRIX platform) this is not the case: instead of pattern density, the maximum local dose on the wafer limits throughput. The smallest considered half-pitch is 28 nm, which may be considered the 14-nm node for Metal-1 and the 10-nm node for the Via-1 layer, achieved in a single exposure with e-beam lithography. For typical 28-nm-hp Metal-1 layouts, it was shown that dose latitudes (size of process window) of around 10% are realizable with available PEC methods. For 28-nm-hp Via-1 layouts this is even higher at 14% and up. When the layouts do not reach the highest densities (up to 10:1 in this study), PEC-BG and PEC-OE provide the capability to trade throughput for dose latitude. At the highest densities, PEC-DTS is required for proximity correction, as this method adjusts both geometry edges and doses and will reduce the dose at the densest areas. For 28-nm-hp line critical dimension (CD), hole & dot CD, and line-end edge placement error, the data path errors are typically 0.9, 1.0 and 0.7 nm (3σ) or below, respectively. There is not a clear data path performance difference between the investigated PEC methods. After the simulations, the methods were successfully validated in exposures on a MAPPER pre-alpha tool. 28-nm half-pitch Metal-1 and Via-1 layouts show good performance in resist, consistent with the simulation results. Exposures of soft-edge stitched layouts show that beam-to-beam position errors up to the ±7 nm specified for the FLX 1200 have no noticeable impact on CD. The research leading to these results has been performed in the frame of the industrial collaborative consortium IMAGINE.
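
    A minimal one-dimensional sketch of the PEC-BG idea described above, not MAPPER's data path: the deposited dose is the pattern convolved with a double-Gaussian point spread function (short-range alpha, long-range beta, backscatter ratio eta), dense features therefore collect more long-range dose than sparse ones, and a background exposure in the sparse region can even this out. The PSF parameters and geometry are illustrative assumptions.

```python
# Minimal sketch (not MAPPER's data path): 1-D illustration of proximity
# effect correction by background dose (PEC-BG). Dose = pattern convolved with
# a double-Gaussian PSF; the sparse region receives less long-range dose than
# the dense region. All parameters are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter1d

nm_per_px = 7.0
x = np.arange(0.0, 40000.0, nm_per_px)            # 40 um strip, 7 nm pixels
pattern = ((x % 56) < 28).astype(float)           # dense 28 nm half-pitch lines
sparse = x > 20000
pattern[sparse] = (x[sparse] % 560) < 28          # sparse lines in second half

alpha_nm, beta_nm, eta = 30.0, 5000.0, 0.5        # PSF ranges and backscatter ratio (assumed)
short = gaussian_filter1d(pattern, alpha_nm / nm_per_px, mode="nearest")
long_ = gaussian_filter1d(pattern, beta_nm / nm_per_px, mode="nearest")
dose = (short + eta * long_) / (1.0 + eta)

on_dense = (~sparse) & (pattern > 0)
on_sparse = sparse & (pattern > 0)
bg_needed = dose[on_dense].mean() - dose[on_sparse].mean()
print(f"mean dose on dense features : {dose[on_dense].mean():.2f} (relative units)")
print(f"mean dose on sparse features: {dose[on_sparse].mean():.2f}")
print(f"extra dose the sparse features would need (supplied as background in PEC-BG): {bg_needed:.2f}")
```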

  16. Green Infrastructure Design Evaluation Using the Automated Geospatial Watershed Assessment Tool

    EPA Science Inventory

    In arid and semi-arid regions, green infrastructure (GI) can address several issues facing urban environments, including augmenting water supply, mitigating flooding, decreasing pollutant loads, and promoting greenness in the built environment. An optimum design captures stormwat...

  17. Evaluation of Green Infrastructure Designs Using the Automated Geospatial Watershed Assessment Tool

    EPA Science Inventory

    In arid and semi-arid regions, green infrastructure (GI) can address several issues facing urban environments, including augmenting water supply, mitigating flooding, decreasing pollutant loads, and promoting greenness in the built environment. An optimum design captures stormwat...

  18. Automated water analyser computer supported system (AWACSS) Part I: Project objectives, basic technology, immunoassay development, software design and networking.

    PubMed

    Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter

    2005-02-15

    A novel analytical system AWACSS (automated water analyser computer-supported system) based on immunochemical technology has been developed that can measure several organic pollutants at low nanogram per litre level in a single few-minutes analysis without any prior sample pre-concentration or pre-treatment steps. Having in mind the actual needs of water-sector managers related to the implementation of the Drinking Water Directive (DWD) (98/83/EC, 1998) and Water Framework Directive WFD (2000/60/EC, 2000), drinking, ground, surface, and waste waters were the major media used for the evaluation of the system performance. The instrument was equipped with remote control and surveillance facilities. The system's software allows for the internet-based networking between the measurement and control stations, global management, trend analysis, and early-warning applications. The experience of water laboratories has been utilised in the design of the instrument's hardware and software in order to make the system rugged and user-friendly. Several market surveys were conducted during the project to assess the applicability of the final system. A web-based AWACSS database was created for automated evaluation and storage of the obtained data in a format compatible with major databases of environmental organic pollutants in Europe. This first part article gives the reader an overview of the aims and scope of the AWACSS project as well as details about basic technology, immunoassays, software, and networking developed and utilised within the research project. The second part article reports on the system performance, first real sample measurements, and an international collaborative trial (inter-laboratory tests) to compare the biosensor with conventional analytical methods. PMID:15626603

  19. The Design and Production of a Procedure Training Aid Using the Procedure Learning Format and the Computer Automated Page Layout (PLA) Routine. Technical Note 12-83.

    ERIC Educational Resources Information Center

    Terrell, William R.; And Others

    This report describes a field application of the Computer Automated Page Layout (PLA) system to the development of a procedure training aid for the SH-3D/H Helicopter, as part of the Training Analysis and Evaluation Group's (TAEG) ongoing development effort to provide tools for the design and publication of technical training aids in a format…

  20. A Tool for the Automated Design and Evaluation of Habitat Interior Layouts

    NASA Technical Reports Server (NTRS)

    Simon, Matthew A.; Wilhite, Alan W.

    2013-01-01

    The objective of space habitat design is to minimize mass and system size while providing adequate space for all necessary equipment and a functional layout that supports crew health and productivity. Unfortunately, development and evaluation of interior layouts is often ignored during conceptual design because of the subjectivity and long times required using current evaluation methods (e.g., human-in-the-loop mockup tests and in-depth CAD evaluations). Early, more objective assessment could prevent expensive design changes that may increase vehicle mass and compromise functionality. This paper describes a new interior design evaluation method to enable early, structured consideration of habitat interior layouts. This interior layout evaluation method features a comprehensive list of quantifiable habitat layout evaluation criteria, automatic methods to measure these criteria from a geometry model, and application of systems engineering tools and numerical methods to construct a multi-objective value function measuring the overall habitat layout performance. In addition to a detailed description of this method, a C++/OpenGL software tool which has been developed to implement this method is also discussed. This tool leverages geometry modeling coupled with collision detection techniques to identify favorable layouts subject to multiple constraints and objectives (e.g., minimize mass, maximize contiguous habitable volume, maximize task performance, and minimize crew safety risks). Finally, a few habitat layout evaluation examples are described to demonstrate the effectiveness of this method and tool to influence habitat design.
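
    A minimal sketch of the multi-objective value function idea, not the paper's C++/OpenGL implementation: each layout criterion is normalized to [0, 1], hard constraints reject infeasible layouts, and a weighted sum produces a single layout score. The criteria names, weights and candidate values below are hypothetical.

```python
# Minimal sketch (not the paper's C++/OpenGL tool): a weighted-sum
# multi-objective value function over normalized layout criteria, with hard
# constraints rejecting infeasible layouts. All names and numbers are
# hypothetical.
def layout_value(criteria: dict, weights: dict, constraints: dict) -> float:
    """Each criterion is pre-normalized to [0, 1], higher is better."""
    for name, minimum in constraints.items():
        if criteria[name] < minimum:
            return 0.0                       # infeasible layout scores zero
    total_w = sum(weights.values())
    return sum(weights[k] * criteria[k] for k in weights) / total_w

candidate = {
    "contiguous_habitable_volume": 0.72,     # fraction of ideal open volume
    "task_performance": 0.64,                # aggregated task adjacency score
    "crew_safety": 0.88,                     # 1 - normalized safety risk
    "mass_efficiency": 0.55,                 # 1 - normalized outfitting mass
}
weights = {"contiguous_habitable_volume": 0.3, "task_performance": 0.3,
           "crew_safety": 0.25, "mass_efficiency": 0.15}
constraints = {"crew_safety": 0.5}

print(f"layout value: {layout_value(candidate, weights, constraints):.3f}")
```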

  1. Automated procedure for design of wing structures to satisfy strength and flutter requirements

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.

    1973-01-01

    A pilot computer program was developed for the design of minimum mass wing structures under flutter, strength, and minimum gage constraints. The wing structure is idealized by finite elements, and second-order piston theory aerodynamics is used in the flutter calculation. Mathematical programming methods are used for the optimization. Computation times during the design process are reduced by three techniques. First, iterative analysis methods are used to significantly reduce reanalysis times. Second, the number of design variables is kept small by not using a one-to-one correspondence between finite elements and design variables. Third, a technique for using approximate second derivatives with Newton's method for the optimization is incorporated. The program output is compared with previously published results. It is found that some flutter characteristics, such as the flutter speed, can display discontinuous dependence on the design variables (which are the thicknesses of the structural elements). It is concluded that it is undesirable to use such quantities in the formulation of the flutter constraint.

  2. Optimization and automation of the semi-submersible platforms mooring design

    SciTech Connect

    Ferrari, J.A. Jr.; Morooka, C.K.

    1994-12-31

    There are a few calculation programs around the world used for determining the main aspects of the Mooring Design of Semi-Submersible Platforms. These programs hold worldwide acknowledgement and their results are reliable. But they require many runs to get a solution that complies with the Classification Society requirements. This paper presents some procedures in order to optimize the semi-submersible mooring design as well as to make it automatic. Regarding the optimization philosophies, the following aspects are treated: (1) the optimization of the platform heading and the mooring pattern based on the spreading of the environmental forces; (2) the search for the optimum mooring line composition in an automatic mode. Basically, the paper's main goal is to introduce some methods to find the lowest-cost solution for the mooring system in a short time. All of these methods were computationally implemented, creating the intelligent system named PROANC, which deals with the semi-submersible mooring design in a quasi-static and deterministic approach. It should be noted that the proposed system exerts a strong appeal as a design tool for feasibility studies of a given oil field, and its quasi-static results can be directly applied to a mooring program capable of performing dynamic analysis. Finally, some simulations are executed for different water depths, and their final results, including the time expended to run, are presented in order to demonstrate the PROANC system's wide potential as a design tool.

  3. Design and Performance of an Automated Bioreactor for Cell Culture Experiments in a Microgravity Environment

    NASA Astrophysics Data System (ADS)

    Kim, Youn-Kyu; Park, Seul-Hyun; Lee, Joo-Hee; Choi, Gi-Hyuk

    2015-03-01

    In this paper, we describe the development of a bioreactor for a cell-culture experiment on the International Space Station (ISS). The bioreactor is an experimental device for culturing mouse muscle cells in a microgravity environment. The purpose of the experiment was to assess the impact of microgravity on the muscles to address the possibility of long-term human residence in space. After investigation of previously developed bioreactors, and analysis of the requirements for microgravity cell culture experiments, a bioreactor design is herein proposed that is able to automatically culture 32 samples simultaneously. This reactor design is capable of automatic control of temperature, humidity, and culture-medium injection rate, and satisfies the interface requirements of the ISS. Since bioreactors are vulnerable to cell contamination, the medium-circulation modules were designed to be completely replaceable, so that the bioreactor can be reused after each experiment. The bioreactor control system is designed to circulate culture media to 32 culture chambers at a maximum speed of 1 ml/min, to maintain the temperature of the reactor at 36°C, and to keep the relative humidity of the reactor above 70%. Because bubbles in the culture media negatively affect cell culture, a de-bubbler unit was provided to eliminate such bubbles. A working model of the reactor was built according to the new design, to verify its performance, and was used to perform a cell culture experiment that confirmed the feasibility of this device.
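
    A minimal control-loop sketch built around the set points quoted above (36°C, relative humidity above 70%, medium flow capped at 1 ml/min); this is not the flight controller, and the dead-band values and bang-bang structure are assumptions for illustration.

```python
# Minimal sketch (not the flight controller): a hysteresis control step for
# the bioreactor set points quoted in the abstract (36 C chamber temperature,
# >70% relative humidity, medium flow capped at 1 ml/min). The dead-band
# values and controller structure are assumptions.
from dataclasses import dataclass

@dataclass
class Actuators:
    heater_on: bool = False
    humidifier_on: bool = False
    pump_ml_per_min: float = 0.0

def control_step(temp_c: float, rh_percent: float, commanded_flow: float,
                 act: Actuators) -> Actuators:
    # temperature: hold 36 C with a +/-0.5 C dead band
    if temp_c < 35.5:
        act.heater_on = True
    elif temp_c > 36.5:
        act.heater_on = False
    # humidity: keep above 70% RH
    act.humidifier_on = rh_percent < 70.0
    # medium circulation: clamp to the 1 ml/min maximum
    act.pump_ml_per_min = max(0.0, min(commanded_flow, 1.0))
    return act

print(control_step(35.2, 68.0, 1.4, Actuators()))
```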

  4. Automated design of coupled RF cavities using 2-D and 3-D codes

    SciTech Connect

    Smith, Peter; Christiansen, D. W.; Greninger, P. T.; Spalek, G.

    2001-01-01

    Coupled RF cavities in the Accelerator Production of Tritium Project have been designed using a procedure in which a 2-D code (CCT) searches for a design that meets frequency and coupling requirements, while a 3-D code (HFSS) is used to obtain empirical factors used by CCT to characterize the coupling slot between cavities. Using assumed values of the empirical factors, CCT runs the Superfish code iteratively to solve for a trial cavity design that has a specified frequency and coupling. The frequency shifts and the coupling constant k of the slot are modeled in CCT using a perturbation theory, the results of which are adjusted using the empirical factors. Given a trial design, HFSS is run using periodic boundary conditions to obtain a mode spectrum. The mode spectrum is processed using the DISPER code to obtain values of the coupling and the frequencies with slots. These results are used to calculate a new set of empirical factors, which are fed back into CCT for another design iteration. Cold models have been fabricated and tested to validate the codes, and results will be presented.

  5. Search Hanford accessible reports electronically system design description. Revision 1

    SciTech Connect

    Gilomen, T.L.

    1995-12-31

    The Search Hanford Accessible Records Electronically (SHARE) system was produced by a combined team of personnel from Westinghouse Hanford Company (WHC) Corrective Action Data Systems (CADS) and Information Resource Management (IRM) Information and Scientific Systems (ISS) organizations. The ESQD Text Evaluation and exchange Tool (ETEXT) prototype was used as a basis for the requirements used to support this design/build effort. TOPIC was used to build the SHARE application. TOPIC is a text search and retrieval software product produced by the Verity Corporation. The TOPIC source code is not provided with the product, and the programs cannot be changed. TOPIC can be customized for special requirements. The software is fully documented. Help messages, menu and screen layouts, command edits and options, and internal system design are all described in the TOPIC documentation. This System Design Description (SDD) will not reiterate the TOPIC documentation and design. Instead, it will focus on the SHARE installation of TOPIC. This SDD is designed to assist the SHARE database/infobase administrator (DBA) in maintaining and supporting the application. It assumes that the assigned DBA is knowledgeable in using the TOPIC product, and is also knowledgeable in using a personal computer (PC), Disk Operating System (DOS) commands, and the document WHC-CM-3-10. SHARE is an Impact Level 4 system, and all activities related to SHARE must conform with the WHC-CM-3-10 procedures for an Impact Level 4 system. The Alternatives Analysis will be treated as a level 3-Q document, to allow for reference by potential future projects.

  6. PopupCAD: a tool for automated design, fabrication, and analysis of laminate devices

    NASA Astrophysics Data System (ADS)

    Aukes, Daniel M.; Wood, Robert J.

    2015-05-01

    Recent advances in laminate manufacturing techniques have driven the development of new classes of millimeter-scale sensorized medical devices, robots capable of terrestrial locomotion and sustained flight, and new techniques for sensing and actuation. Recently, the analysis of laminate micro-devices has focused more on manufacturability concerns than on mechanics. Considering the nature of such devices, we draw from existing research in composites, origami kinematics, and finite element methods in order to identify issues related to sequential assembly and self-folding prior to fabrication as well as the stiffness of composite folded systems during operation. These techniques can be useful for understanding how such devices will bend and flex under normal operating conditions, and when added to new design tools like popupCAD, will give designers another means to develop better devices throughout the design process.

  7. Human-Automation Integration: Principle and Method for Design and Evaluation

    NASA Technical Reports Server (NTRS)

    Billman, Dorrit; Feary, Michael

    2012-01-01

    Future space missions will increasingly depend on integration of complex engineered systems with their human operators. It is important to ensure that the systems that are designed and developed do a good job of supporting the needs of the work domain. Our research investigates methods for needs analysis. We included analysis of work products (plans for regulation of the space station) as well as work processes (tasks using current software), in a case study of Attitude Determination and Control Officers (ADCO) planning work. This allows comparing how well different designs match the structure of the work to be supported. Redesigned planning software that better matches the structure of work was developed and experimentally assessed. The new prototype enabled substantially faster and more accurate performance in plan revision tasks. This success suggests the approach to needs assessment and use in design and evaluation is promising, and merits investigation in future research.

  8. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, ABI 377 sequencing machine, automated centrifuge, automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective automation as the laboratory grows.

  9. Automated trajectory design for impulsive and low thrust interplanetary mission analysis

    NASA Astrophysics Data System (ADS)

    Wagner, Samuel Arthur

    This dissertation describes a hybrid optimization algorithm that is able to determine optimal trajectories for many complex mission analysis and design orbital mechanics problems. This new algorithm will be used to determine optimal trajectories for a variety of mission design problems, including asteroid rendezvous, multiple gravity-assist (MGA), multiple gravity-assist with deep-space maneuvers (MGA-DSM), and low-thrust trajectory missions. The research described here was conducted at the Asteroid Deflection Research Center (ADRC) at Iowa State University.

  10. Integrated Design Engineering Analysis (IDEA) Environment Automated Generation of Structured CFD Grids using Topology Methods

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hilmi N.

    2012-01-01

    This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework using the Adaptive Modeling Language (AML) as a framework and supporting the configuration design and parametric CFD grid generation. This report will focus on describing the work in the area of parametric CFD grid generation using novel concepts for defining the interaction between the mesh topology and the geometry in such a way as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.

  11. Data Mining Approaches for Modeling Complex Electronic Circuit Design Activities

    SciTech Connect

    Kwon, Yongjin; Omitaomu, Olufemi A; Wang, Gi-Nam

    2008-01-01

    A printed circuit board (PCB) is an essential part of modern electronic circuits. It is made of a flat panel of insulating materials with patterned copper foils that act as electric pathways for various components such as ICs, diodes, capacitors, resistors, and coils. The size of PCBs has been shrinking over the years, while the number of components mounted on these boards has increased considerably. This trend makes the design and fabrication of PCBs ever more difficult. At the beginning of design cycles, it is important to accurately estimate the time required to complete the design steps, based on many factors such as the required parts, approximate board size and shape, and a rough sketch of schematics. The current approach uses the multiple linear regression (MLR) technique for time and cost estimations. However, the need for accurate predictive models continues to grow as the technology becomes more advanced. In this paper, we analyze a large volume of historical PCB design data, extract some important variables, and develop predictive models based on the extracted variables using a data mining approach. The data mining approach uses an adaptive support vector regression (ASVR) technique; the benchmark model used is the MLR technique currently being used in the industry. The strengths of SVR for this data include its ability to represent data in high-dimensional space through kernel functions. The computational results show that a data mining approach is a better prediction technique for this data. Our approach reduces computation time and enhances the practical applications of the SVR technique.
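
    A minimal sketch of the comparison described above, using scikit-learn's standard SVR rather than the paper's adaptive SVR: a kernel SVR and a multiple linear regression are fitted to synthetic PCB features and compared by mean absolute error. The feature set and the data-generating function are invented purely to make the example runnable.

```python
# Minimal sketch (not the paper's adaptive SVR): comparing a kernel SVR with
# multiple linear regression for design-time prediction on synthetic PCB
# features. The features and the data-generating function are invented.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 400
parts = rng.integers(50, 2000, n)            # number of components
area = rng.uniform(20, 600, n)               # board area, cm^2
layers = rng.integers(2, 16, n)              # layer count
X = np.column_stack([parts, area, layers]).astype(float)
# synthetic, mildly nonlinear "design hours"
y = 0.05 * parts + 0.2 * np.sqrt(parts * layers) + 0.02 * area + rng.normal(0, 5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
mlr = LinearRegression().fit(X_tr, y_tr)
svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=1.0)).fit(X_tr, y_tr)

print(f"MLR MAE: {mean_absolute_error(y_te, mlr.predict(X_te)):.1f} h")
print(f"SVR MAE: {mean_absolute_error(y_te, svr.predict(X_te)):.1f} h")
```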

  12. Assessment of the application of an automated electronic milk analyzer for the enumeration of total bacteria in raw goat milk.

    PubMed

    Ramsahoi, L; Gao, A; Fabri, M; Odumeru, J A

    2011-07-01

    Automated electronic milk analyzers for rapid enumeration of total bacteria counts (TBC) are widely used for raw milk testing by many analytical laboratories worldwide. In Ontario, Canada, Bactoscan flow cytometry (BsnFC; Foss Electric, Hillerød, Denmark) is the official anchor method for TBC in raw cow milk. Penalties are levied at the BsnFC equivalent level of 50,000 cfu/mL, the standard plate count (SPC) regulatory limit. This study was conducted to assess the BsnFC for TBC in raw goat milk, to determine the mathematical relationship between the SPC and BsnFC methods, and to identify probable reasons for the difference in the SPC:BsnFC equivalents for goat and cow milks. Test procedures were conducted according to International Dairy Federation Bulletin guidelines. Approximately 115 farm bulk tank milk samples per month were tested for inhibitor residues, SPC, BsnFC, psychrotrophic bacteria count, composition (fat, protein, lactose, lactose and other solids, and freezing point), and somatic cell count from March 2009 to February 2010. Data analysis of the results for the samples tested indicated that the BsnFC method would be a good alternative to the SPC method, providing accurate and more precise results with a faster turnaround time. Although a linear regression model showed good correlation and prediction, tests for linearity indicated that the relationship was linear only beyond log 4.1 SPC. The logistic growth curve best modeled the relationship between the SPC and BsnFC for the entire sample population. The BsnFC equivalent to the SPC 50,000 cfu/mL regulatory limit was estimated to be 321,000 individual bacteria count (ibc)/mL. This estimate differs considerably from the BsnFC equivalent for cow milk (121,000 ibc/mL). Because of the low frequency of bulk tank milk pickups at goat farms, 78.5% of the samples had their oldest milking in the tank to be 6.5 to 9.0 d old when tested, compared with the cow milk samples, which had their oldest milking at 4 d
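
    A minimal sketch of the curve-fitting step described above, on synthetic data rather than the study's bulk-tank measurements: a logistic growth curve is fitted to paired log10(SPC) and log10(BsnFC) values and then evaluated at the 50,000 cfu/mL SPC limit to read off a BsnFC equivalent. The study's own estimate for goat milk was about 321,000 ibc/mL; the synthetic numbers below will not reproduce it.

```python
# Minimal sketch (not the study's analysis): fitting a logistic growth curve
# to paired log10(SPC) / log10(BsnFC) values and reading off the BsnFC level
# equivalent to the 50,000 cfu/mL SPC limit. The paired data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, upper, rate, midpoint):
    return upper / (1.0 + np.exp(-rate * (x - midpoint)))

rng = np.random.default_rng(1)
log_spc = rng.uniform(3.0, 6.5, 150)                      # log10 cfu/mL
log_bsn = logistic(log_spc, 7.5, 1.3, 4.3) + rng.normal(0, 0.15, 150)

params, _ = curve_fit(logistic, log_spc, log_bsn, p0=[7.0, 1.0, 4.0])
bsn_equiv = 10 ** logistic(np.log10(50_000), *params)
print(f"fitted parameters: {np.round(params, 2)}")
print(f"BsnFC equivalent of 50,000 cfu/mL: {bsn_equiv:,.0f} ibc/mL (synthetic data)")
```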

  13. A Camera and Multi-Sensor Automated Station Design for Polar Physical and Biological Systems Monitoring: AMIGOS

    NASA Astrophysics Data System (ADS)

    Bohlander, J. A.; Ross, R.; Scambos, T.; Haran, T. M.; Bauer, R. J.

    2012-12-01

    The Automated Meteorology - Ice/Indigenous species - Geophysics Observation System (AMIGOS) consists of a set of measurement instruments and camera(s) controlled by a single-board computer with a simplified Linux operating system and an Iridium satellite modem supporting two-way communication. Primary features of the system relevant to polar operations are low power requirements, daily data uploading, reprogramming, tolerance for low temperatures, and various approaches for automatic resets and recovery from low power or cold shut-down. Instruments include a compact weather station, C/A or dual-frequency GPS, solar flux and reflectivity sensors, sonic snow gages, simplified radio-echo-sounder, and resistance thermometer string in the firn column. In the current state of development, there are two basic designs. One is intended for in situ observations of glacier conditions. The other design supports a high-resolution camera for monitoring biological or geophysical systems from short distances (100 m to 20 km). The stations have been successfully used in several locations for operational support, monitoring rapid ice changes in response to climate change or iceberg drift, and monitoring penguin colony activity. As of June, 2012, there are 9 AMIGOS systems installed, all on the Antarctic continent. The stations are a working prototype for a planned series of upgraded stations, currently termed 'Sentinels'. These stations would carry further instrumentation, communications, and processing capability to investigate ice - ocean interaction from ice tongue, ice shelf, or fjord coastline areas.

  14. Designing Automated Adaptive Support to Improve Student Helping Behaviors in a Peer Tutoring Activity

    ERIC Educational Resources Information Center

    Walker, Erin; Rummel, Nikol; Koedinger, Kenneth R.

    2011-01-01

    Adaptive collaborative learning support systems analyze student collaboration as it occurs and provide targeted assistance to the collaborators. Too little is known about how to design adaptive support to have a positive effect on interaction and learning. We investigated this problem in a reciprocal peer tutoring scenario, where two students take…

  15. An automated system for chromosome analysis. Volume 1: Goals, system design, and performance

    NASA Technical Reports Server (NTRS)

    Castleman, K. R.; Melnyk, J. H.

    1975-01-01

    The design, construction, and testing of a complete system to produce karyotypes and chromosome measurement data from human blood samples, and a basis for statistical analysis of quantitative chromosome measurement data is described. The prototype was assembled, tested, and evaluated on clinical material and thoroughly documented.

  16. Technology and Jobs: Computer-Aided Design. Numerical-Control Machine-Tool Operators. Office Automation.

    ERIC Educational Resources Information Center

    Stanton, Michael; And Others

    1985-01-01

    Three reports on the effects of high technology on the nature of work include (1) Stanton on applications and implications of computer-aided design for engineers, drafters, and architects; (2) Nardone on the outlook and training of numerical-control machine tool operators; and (3) Austin and Drake on the future of clerical occupations in automated…

  17. HiVy automated translation of stateflow designs for model checking verification

    NASA Technical Reports Server (NTRS)

    Pingree, Paula

    2003-01-01

    The HiVy tool set enables model checking of finite-state machine designs. This is achieved by translating Stateflow state-chart specifications into the input language of the Spin model checker. An abstract syntax of hierarchical sequential automata (HSA) is provided as an intermediate format for the tool set.

  18. Automated preliminary design of simplified wing structures to satisfy strength and flutter requirements

    NASA Technical Reports Server (NTRS)

    Stroud, W. J.; Dexter, C. B.; Stein, M.

    1972-01-01

    A simple structural model of an aircraft wing is used to show the effects of strength (stress) and flutter requirements on the design of minimum-weight aircraft-wing structures. The wing is idealized as an isotropic sandwich plate with a variable cover thickness distribution and a variable depth between covers. Plate theory is used for the structural analysis, and piston theory is used for the unsteady aerodynamics in the flutter analysis. Mathematical programming techniques are used to find the minimum-weight cover thickness distribution which satisfies flutter, strength, and minimum-gage constraints. The method of solution, some sample results, and the computer program used to obtain these results are presented. The results indicate that the cover thickness distribution obtained when designing for the strength requirement alone may be quite different from the cover thickness distribution obtained when designing for either the flutter requirement alone or for both the strength and flutter requirements concurrently. This conclusion emphasizes the need for designing for both flutter and strength from the outset.

  19. The Zeus Mission Study — An application of automated collaborative design

    NASA Astrophysics Data System (ADS)

    Doyotte, Romain; Love, Stanley G.; Peterson, Craig E.

    1999-11-01

    The purpose of the Zeus Mission Study was threefold. As an element of a graduate course in spacecraft system engineering, its purpose was primarily educational — to allow the students to apply their knowledge in a real mission study. The second purpose was to investigate the feasibility of applying advanced technology (the power antenna and solar electric propulsion concepts) to a challenging mission. Finally, the study allowed evaluation of the benefits of using quality-oriented techniques (Quality Function Deployment (QFD) and Taguchi Methods) for a mission study. To encourage innovation, several constraints were placed on the study from the outset. While the primary goal was to place at least one lander on Europa, the additional constraint of no nuclear power sources posed an additional challenge, particularly when coupled with the mass constraints imposed by using a Delta II class launch vehicle. In spite of these limitations, the team was able to develop a mission and spacecraft design capable of carrying three simple, lightweight, yet capable landers. The science return will more than adequately meet the established science goals. QFD was used to determine the optimal choice of instrumentation. The lander design was selected from several competing lander concepts, including rovers. The carrier design was largely dictated by the needs of the propulsion system required to support the mission, although the development of a Project Trades Model (PTM) in software allowed for rapid recalculation of key system parameters as changes were made. Finally, Taguchi Methods (Design of Experiments) were used in conjunction with the PTM, allowing for some limited optimization of design features.

  20. Evaluating the Validity of an Automated Device for Asthma Monitoring for Adolescents: Correlational Design

    PubMed Central

    Belyea, Michael J; Sterling, Mark; Bocko, Mark F

    2015-01-01

    Background Symptom monitoring is a cornerstone of asthma self-management. Conventional methods of symptom monitoring have fallen short in producing objective data and eliciting patients’ consistent adherence, particularly in teen patients. We have recently developed an Automated Device for Asthma Monitoring (ADAM) using a consumer mobile device as a platform to facilitate continuous and objective symptom monitoring in adolescents in vivo. Objective The objectives of the study were to evaluate the validity of the device using spirometer data, fractional exhaled nitric oxide (FeNO), existing measures of asthma symptoms/control and health care utilization data, and to examine the sensitivity and specificity of the device in discriminating asthma cases from nonasthma cases. Methods A total of 84 teens (42 teens with a current asthma diagnosis; 42 without asthma) aged between 13 and 17 years participated in the study. All participants used ADAM for 7 consecutive days during which participants with asthma completed an asthma diary two times a day. ADAM recorded the frequency of coughing for 24 hours throughout the 7-day trial. Pearson correlation and multiple regression were used to examine the relationships between ADAM data and asthma control, quality of life, and health care utilization at the time of the 7-day trial and 3 months later. A receiver operating characteristic (ROC) curve analysis was conducted to examine sensitivity and specificity based on the area under the curve (AUC) as an indicator of the device’s capacity to discriminate between asthma versus nonasthma cases. Results ADAM data (cough counts) were negatively associated with forced expiratory volume in first second of expiration (FEV1) (r=–.26, P=.05), forced vital capacity (FVC) (r=–.31, P=.02), and overall asthma control (r=–.41, P=.009) and positively associated with daily activity limitation (r=.46, P=.01), nighttime (r=.40, P=.02) and daytime symptoms (r=.38, P=.02), and health care
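
    A minimal sketch of the ROC analysis described above, on simulated cough counts rather than the study's data: weekly cough totals for asthma and non-asthma groups are scored with roc_auc_score, and a Youden-index cutoff illustrates how sensitivity and specificity would be read off.

```python
# Minimal sketch (not the study's analysis): computing the ROC curve and AUC
# for discriminating asthma from non-asthma cases using 7-day cough counts.
# The counts below are simulated; the study used 42 participants per group.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(7)
coughs_asthma = rng.poisson(180, 42)        # simulated weekly cough counts
coughs_control = rng.poisson(90, 42)

y_true = np.r_[np.ones(42), np.zeros(42)]
scores = np.r_[coughs_asthma, coughs_control]

auc = roc_auc_score(y_true, scores)
fpr, tpr, thresholds = roc_curve(y_true, scores)
best = np.argmax(tpr - fpr)                 # Youden index for an example cutoff
print(f"AUC = {auc:.2f}; example cutoff {thresholds[best]:.0f} coughs/week "
      f"-> sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")
```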

  1. An automation of design and modelling tasks in NX Siemens environment with original software - cost module

    NASA Astrophysics Data System (ADS)

    Zbiciak, R.; Grabowik, C.; Janik, W.

    2015-11-01

    The design and construction process is a creative activity which strives to satisfy, as far as possible at a given moment in time, all demands and needs formulated by a user, taking into account social, technical and technological advances. The engineer's knowledge, skills and innate abilities have the greatest influence on the final product quality and cost. They also have a deciding influence on the product's technical and economic value. Taking the above into account, it seems advisable to create software tools that support an engineer in the process of manufacturing cost estimation. The Cost module is built with analytical procedures which are used for relative manufacturing cost estimation. As in the case of the Generator module, the Cost module was written in the object-oriented programming language C# in the Visual Studio environment. During the research the following eight factors, which have the greatest influence on the overall manufacturing cost, were distinguished and defined: (i) the gear wheel teeth type, i.e. straight or helical; (ii) the gear wheel design shape, A or B, with or without a wheel hub; (iii) the gear tooth module; (iv) the number of teeth; (v) the gear rim width; (vi) the gear wheel material; (vii) heat treatment or thermochemical treatment; (viii) the accuracy class. Knowledge of parameters (i) to (v) is indispensable for proper modelling of 3D gear wheel models in the CAD system environment. These parameters are also processed in the Cost module. The last three parameters, (vi) to (viii), are used exclusively in the Cost module. The estimation of the relative manufacturing cost is based on indexes calculated for each particular parameter. The relative manufacturing cost estimated in this way gives an overview of the influence of the design parameters on the final gear wheel manufacturing cost. This relative manufacturing cost takes values in the range from 0.00 to 1.00. The bigger the index value, the higher the relative manufacturing cost. Verification whether the proposed algorithm of relative manufacturing
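
    A minimal sketch of how per-parameter indexes for the eight factors might be combined into a relative manufacturing cost in the 0.00-1.00 range. The index tables, weights and example gear below are invented placeholders; the actual Cost module derives its indexes from its own analytical procedures.

```python
# Minimal sketch (not the Cost module itself): combining per-parameter cost
# indexes for the eight gear-wheel factors into a relative manufacturing cost
# in the 0.00-1.00 range. Index tables and the example gear are placeholders.
INDEX_TABLES = {
    "teeth_type": {"straight": 0.3, "helical": 0.7},
    "design_shape": {"A": 0.4, "A_hub": 0.6, "B": 0.5, "B_hub": 0.7},
    "module_mm": lambda m: min(m / 10.0, 1.0),
    "teeth_number": lambda z: min(z / 200.0, 1.0),
    "rim_width_mm": lambda b: min(b / 100.0, 1.0),
    "material": {"C45": 0.3, "18CrMo4": 0.6, "hardened_alloy": 0.9},
    "treatment": {"none": 0.0, "heat": 0.5, "thermochemical": 0.9},
    "accuracy_class": lambda c: (12 - c) / 11.0,   # class 1 (finest) .. 12 (coarsest)
}

def relative_cost(gear: dict) -> float:
    indexes = []
    for key, table in INDEX_TABLES.items():
        value = gear[key]
        indexes.append(table(value) if callable(table) else table[value])
    return sum(indexes) / len(indexes)             # equal weights, 0.00 .. 1.00

gear = {"teeth_type": "helical", "design_shape": "B_hub", "module_mm": 4,
        "teeth_number": 60, "rim_width_mm": 40, "material": "18CrMo4",
        "treatment": "thermochemical", "accuracy_class": 6}
print(f"relative manufacturing cost: {relative_cost(gear):.2f}")
```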

  2. Design Considerations for High Energy Electron -- Positron Storage Rings

    DOE R&D Accomplishments Database

    Richter, B.

    1966-11-01

    High energy electron-positron storage rings give a way of making a new attack on the most important problems of elementary particle physics. All of us who have worked in the storage ring field designing, building, or using storage rings know this. The importance of that part of storage ring work concerning tests of quantum electrodynamics and mu meson physics is also generally appreciated by the larger physics community. However, I do not think that most of the physicists working in the elementary particle physics field realize the importance of the contribution that storage ring experiments can make to our understanding of the strongly interacting particles. I would therefore like to spend the next few minutes discussing the sort of things that one can do with storage rings in the strongly interacting particle field.

  3. Design and implementation of an electronic investigational drug accountability system.

    PubMed

    Grilley, B J; Trissel, L A; Bluml, B M

    1991-12-01

    A software system designed to maintain protocol-specific investigational drug accountability records is described. The University of Texas M. D. Anderson Cancer Center and Cygnus Systems Development, Inc., worked together to create an electronic investigational drug accountability system (IDRx), which meets the requirements of the National Cancer Institute. This system performs record keeping, stores information on drugs and protocols, and generates standard and customized reports. On-screen assistance makes it easy to use. Security is achieved by granting access only to authorized users, and an audit trail is automatically generated. Systematic implementation at M. D. Anderson, initially in the investigational drug control area and subsequently in the satellite pharmacies, has resulted in increased accuracy and efficiency, and few problems have been encountered. The IDRx software package is useful for keeping records, generating reports, and tracking and evaluating data associated with an investigational drug accountability system. PMID:1814202

  4. Automated design of hammerhead ribozymes and validation by targeting the PABPN1 gene transcript

    PubMed Central

    Kharma, Nawwaf; Varin, Luc; Abu-Baker, Aida; Ouellet, Jonathan; Najeh, Sabrine; Ehdaeivand, Mohammad-Reza; Belmonte, Gabriel; Ambri, Anas; Rouleau, Guy; Perreault, Jonathan

    2016-01-01

    We present a new publicly accessible web-service, RiboSoft, which implements a comprehensive hammerhead ribozyme design procedure. It accepts as input a target sequence (and some design parameters) then generates a set of ranked hammerhead ribozymes, which target the input sequence. This paper describes the implemented procedure, which takes into consideration multiple objectives leading to a multi-objective ranking of the computer-generated ribozymes. Many ribozymes were assayed and validated, including four ribozymes targeting the transcript of a disease-causing gene (a mutant version of PABPN1). These four ribozymes were successfully tested in vitro and in vivo, for their ability to cleave the targeted transcript. The wet-lab positive results of the test are presented here demonstrating the real-world potential of both hammerhead ribozymes and RiboSoft. RiboSoft is freely available at the website http://ribosoft.fungalgenomics.ca/ribosoft/. PMID:26527730

  5. Automated Structure- and Sequence-Based Design of Proteins for High Bacterial Expression and Stability.

    PubMed

    Goldenzweig, Adi; Goldsmith, Moshe; Hill, Shannon E; Gertman, Or; Laurino, Paola; Ashani, Yacov; Dym, Orly; Unger, Tamar; Albeck, Shira; Prilusky, Jaime; Lieberman, Raquel L; Aharoni, Amir; Silman, Israel; Sussman, Joel L; Tawfik, Dan S; Fleishman, Sarel J

    2016-07-21

    Upon heterologous overexpression, many proteins misfold or aggregate, thus resulting in low functional yields. Human acetylcholinesterase (hAChE), an enzyme mediating synaptic transmission, is a typical case of a human protein that necessitates mammalian systems to obtain functional expression. We developed a computational strategy and designed an AChE variant bearing 51 mutations that improved core packing, surface polarity, and backbone rigidity. This variant expressed at ∼2,000-fold higher levels in E. coli compared to wild-type hAChE and exhibited 20°C higher thermostability with no change in enzymatic properties or in the active-site configuration as determined by crystallography. To demonstrate broad utility, we similarly designed four other human and bacterial proteins. Testing at most three designs per protein, we obtained enhanced stability and/or higher yields of soluble and active protein in E. coli. Our algorithm requires only a 3D structure and several dozen sequences of naturally occurring homologs, and is available at http://pross.weizmann.ac.il. PMID:27425410

  6. Design of power electronics for TVC EMA systems

    NASA Astrophysics Data System (ADS)

    Nelms, R. Mark

    1993-08-01

    The Composite Development Division of the Propulsion Laboratory at Marshall Space Flight Center (MSFC) is currently developing a class of electromechanical actuators (EMA's) for use in space transportation applications such as thrust vector control (TVC) and propellant control valves (PCV). These high power servomechanisms will require rugged, reliable, and compact power electronic modules capable of modulating several hundred amperes of current at up to 270 volts. MSFC has selected the brushless dc motor for implementation in EMA's. This report presents the results of an investigation into the applicability of two new technologies, MOS-controlled thyristors (MCT's) and pulse density modulation (PDM), to the control of brushless dc motors in EMA systems. MCT's are new power semiconductor devices, which combine the high voltage and current capabilities of conventional thyristors and the low gate drive requirements of metal oxide semiconductor field effect transistors (MOSFET's). The commanded signals in a PDM system are synthesized using a series of sinusoidal pulses instead of a series of square pulses as in a pulse width modulation (PWM) system. A resonant dc link inverter is employed to generate the sinusoidal pulses in the PDM system. This inverter permits zero-voltage switching of all semiconductors which reduces switching losses and switching stresses. The objectives of this project are to develop and validate an analytical model of the MCT device when used in high power motor control applications and to design, fabricate, and test a prototype electronic circuit employing both MCT and PDM technology for controlling a brushless dc motor.
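
    For readers unfamiliar with pulse density modulation, the following sketch illustrates the basic idea of synthesizing a command by delivering or skipping whole resonant pulses; it is a generic first-order accumulator scheme, not the controller developed in the report, and the normalized command scaling is an assumption.

```python
# Minimal sketch of pulse density modulation (PDM), assuming a first-order
# accumulator decides at each resonant-link pulse slot whether a full
# sinusoidal pulse is delivered or skipped so that the running pulse density
# tracks the commanded (normalized) stator voltage. This illustrates PDM in
# general, not the report's actual controller.

def pdm_pulse_train(command: float, n_slots: int) -> list[int]:
    """command: desired pulse density in [0, 1]; returns per-slot 0/1 decisions."""
    pulses, accumulator = [], 0.0
    for _ in range(n_slots):
        accumulator += command
        if accumulator >= 1.0:      # enough demand accumulated: deliver a pulse
            pulses.append(1)
            accumulator -= 1.0
        else:
            pulses.append(0)        # skip this resonant pulse
    return pulses

train = pdm_pulse_train(command=0.3, n_slots=20)
print(train, "density =", sum(train) / len(train))
```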

  7. Electronic hardware design of electrical capacitance tomography systems.

    PubMed

    Saied, I; Meribout, M

    2016-06-28

    Electrical tomography techniques for process imaging are very prominent for industrial applications, such as the oil and gas industry and chemical refineries, owing to their ability to provide the flow regime of a flowing fluid at a relatively high throughput. Among the various techniques, electrical capacitance tomography (ECT) is gaining popularity due to its non-invasive nature and its capability to differentiate between different phases based on their permittivity distribution. In recent years, several hardware designs have been proposed for ECT systems that have improved the measurement resolution to around attofarads (aF, 10⁻¹⁸ F) or increased the number of channels, which must be large for applications that require a significant amount of data. In terms of image acquisition time, some recent systems can achieve a throughput of a few hundred frames per second, while data processing takes only a few milliseconds per frame. This paper outlines the concept and main features of the most recent front-end and back-end electronic circuits dedicated to ECT systems. Multiple-excitation capacitance polling, a front-end electronic technique, shows promise for achieving fast data acquisition in ECT systems. A highly parallel field-programmable gate array (FPGA) based architecture for a fast reconstruction algorithm is also described. This article is part of the themed issue 'Supersensing through industrial process tomography'. PMID:27185964
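
    The paper does not give the reconstruction algorithm in detail; as a point of reference, the sketch below shows the linear back-projection step commonly used for fast ECT imaging, with the sensitivity matrix S and normalized capacitances c assumed as inputs.

```python
# Hedged sketch of linear back-projection (LBP), a standard fast ECT
# reconstruction step. S is a sensitivity matrix (n_measurements x n_pixels)
# and c_norm holds normalized inter-electrode capacitances; both are assumed
# inputs. The paper's FPGA architecture is not reproduced here.

import numpy as np

def lbp_reconstruct(S: np.ndarray, c_norm: np.ndarray) -> np.ndarray:
    """Return a normalized permittivity image as a flat pixel vector."""
    image = S.T @ c_norm                          # back-project measurements
    norm = S.T @ np.ones_like(c_norm)             # per-pixel sensitivity sum
    image = image / np.where(norm == 0, 1, norm)  # avoid division by zero
    return np.clip(image, 0.0, 1.0)

# Toy example: 66 measurements (12-electrode sensor) over a 32x32 pixel grid.
rng = np.random.default_rng(0)
S = rng.random((66, 32 * 32))
c = rng.random(66)
print(lbp_reconstruct(S, c).reshape(32, 32).shape)
```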

  8. Design of power electronics for TVC EMA systems

    NASA Technical Reports Server (NTRS)

    Nelms, R. Mark

    1993-01-01

    The Composite Development Division of the Propulsion Laboratory at Marshall Space Flight Center (MSFC) is currently developing a class of electromechanical actuators (EMA's) for use in space transportation applications such as thrust vector control (TVC) and propellant control valves (PCV). These high power servomechanisms will require rugged, reliable, and compact power electronic modules capable of modulating several hundred amperes of current at up to 270 volts. MSFC has selected the brushless dc motor for implementation in EMA's. This report presents the results of an investigation into the applicability of two new technologies, MOS-controlled thyristors (MCT's) and pulse density modulation (PDM), to the control of brushless dc motors in EMA systems. MCT's are new power semiconductor devices, which combine the high voltage and current capabilities of conventional thyristors and the low gate drive requirements of metal oxide semiconductor field effect transistors (MOSFET's). The commanded signals in a PDM system are synthesized using a series of sinusoidal pulses instead of a series of square pulses as in a pulse width modulation (PWM) system. A resonant dc link inverter is employed to generate the sinusoidal pulses in the PDM system. This inverter permits zero-voltage switching of all semiconductors which reduces switching losses and switching stresses. The objectives of this project are to develop and validate an analytical model of the MCT device when used in high power motor control applications and to design, fabricate, and test a prototype electronic circuit employing both MCT and PDM technology for controlling a brushless dc motor.

  9. An automated Langmuir probe controller for plasma characterization

    NASA Astrophysics Data System (ADS)

    Bustos, A.; Juarez, A. M.; de Urquijo, J.; Muñoz, M.

    2016-08-01

    We present the design, construction and test of an automated electronic controller for a Langmuir plasma probe. The novel aspect of this system lies in the isolation of the high voltage present in the discharge from the grounded reference of the controller. This controller detects currents over the range from  ±1 μA to  ±50 mA, using dynamic and automated switching of a transresistance amplifier. This automated Langmuir probe (LP) system has been successfully tested in a glow discharge in argon at 0.8 and 10 Torr.
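
    The range-switching logic is described only at a high level; the following sketch illustrates how an automated transresistance amplifier might select its feedback resistor so that currents from roughly ±1 μA to ±50 mA stay within a ±10 V output window. The resistor values, thresholds and function names are assumptions, not the authors' design.

```python
# Sketch of the dynamic range-switching idea: pick a transresistance
# (feedback resistor) so the measured current produces an output voltage
# inside an assumed +/-10 V ADC window. All component values are assumptions.

RANGES = [                      # (feedback resistance in ohms, full-scale current in A)
    (1e6, 1e-5),                # 1 MOhm  -> up to ~10 uA
    (1e4, 1e-3),                # 10 kOhm -> up to ~1 mA
    (2e2, 5e-2),                # 200 Ohm -> up to ~50 mA
]

def select_range(current_estimate_A: float) -> float:
    """Return the largest feedback resistor whose full scale covers the current."""
    for r_f, full_scale in RANGES:
        if abs(current_estimate_A) <= full_scale:
            return r_f
    return RANGES[-1][0]        # saturate at the least-sensitive range

def probe_current(v_out: float, r_f: float) -> float:
    """Transresistance relation: I = -V_out / R_f (sign depends on wiring)."""
    return -v_out / r_f

r_f = select_range(2.5e-4)              # ~250 uA estimate -> 10 kOhm range
print(r_f, probe_current(v_out=-2.5, r_f=r_f))
```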

  10. The Design and Synthesis of Epoxy Matrix Composites Curable by Electron Beam Induced Cationic Polymerization

    NASA Technical Reports Server (NTRS)

    Crivello, James V.

    2000-01-01

    Several new series of novel, high reactivity epoxy resins are described which are designed specifically for the fabrication of high performance carbon fiber reinforced composites for commercial aircraft structural applications using cationic UV and e-beam curing. The objective of this investigation is to provide resin matrices which rapidly and efficiently cure under low e-beam doses which are suitable to high speed automated composite fabrication techniques such as automated tape and tow placement. It was further the objective of this work to provide resins with superior thermal, oxidative and atomic oxygen resistance.

  11. Designing attractive models via automated identification of chaotic and oscillatory dynamical regimes

    PubMed Central

    Silk, Daniel; Kirk, Paul D.W.; Barnes, Chris P.; Toni, Tina; Rose, Anna; Moon, Simon; Dallman, Margaret J.; Stumpf, Michael P.H.

    2011-01-01

    Chaos and oscillations continue to capture the interest of both the scientific and public domains. Yet despite the importance of these qualitative features, most attempts at constructing mathematical models of such phenomena have taken an indirect, quantitative approach, for example, by fitting models to a finite number of data points. Here we develop a qualitative inference framework that allows us to both reverse-engineer and design systems exhibiting these and other dynamical behaviours by directly specifying the desired characteristics of the underlying dynamical attractor. This change in perspective, from quantitative to qualitative dynamics, provides fundamental new insights into the properties of dynamical systems. PMID:21971504

  12. Optimal part and module selection for synthetic gene circuit design automation.

    PubMed

    Huynh, Linh; Tagkopoulos, Ilias

    2014-08-15

    An integral challenge in synthetic circuit design is the selection of optimal parts to populate a given circuit topology, so that the resulting circuit behavior best approximates the desired one. In some cases, it is also possible to reuse multipart constructs or modules that have already been built and experimentally characterized. Efficient part and module selection algorithms are essential to systematically search the solution space, and their significance will only increase in the following years due to the projected explosion in part libraries and circuit complexity. Here, we address this problem by introducing a structured abstraction methodology and a dynamic programming-based algorithm that guarantees optimal part selection. In addition, we provide three extensions that are based on symmetry check, information look-ahead and branch-and-bound techniques, to reduce the running time and space requirements. We have evaluated the proposed methodology with a benchmark of 11 circuits, a database of 73 parts and 304 experimentally constructed modules with encouraging results. This work represents a fundamental departure from traditional heuristic-based methods for part and module selection and is a step toward maximizing efficiency in synthetic circuit design and construction. PMID:24933033
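
    The paper's abstraction and dynamic program are not reproduced here, but the sketch below shows a dynamic-programming part selection of the same flavour for the simplified case of a linear cascade with per-part costs and pairwise compatibility costs between adjacent positions; the cost model and toy data are assumptions.

```python
# Minimal dynamic-programming sketch of optimal part selection, under the
# simplifying assumption that the circuit is a linear cascade and the total
# cost is a sum of per-part costs plus pairwise compatibility costs between
# adjacent slots. The paper's extensions (symmetry check, look-ahead,
# branch-and-bound) are not reproduced.

def select_parts(part_cost, link_cost):
    """
    part_cost: part_cost[i][p] = cost of part p at slot i.
    link_cost: link_cost[i][p][q] = cost of part p at slot i next to q at i+1.
    Returns (minimum total cost, chosen part index per slot).
    """
    n = len(part_cost)
    best = [dict() for _ in range(n)]          # best[i][p] = (cost, predecessor)
    for p, c in enumerate(part_cost[0]):
        best[0][p] = (c, None)
    for i in range(1, n):
        for q, c in enumerate(part_cost[i]):
            best[i][q] = min(
                (best[i - 1][p][0] + link_cost[i - 1][p][q] + c, p)
                for p in best[i - 1]
            )
    # Trace back the optimal sequence of parts.
    q = min(best[-1], key=lambda k: best[-1][k][0])
    total, choice = best[-1][q][0], [q]
    for i in range(n - 1, 0, -1):
        q = best[i][q][1]
        choice.append(q)
    return total, choice[::-1]

# Toy instance: 3 slots, 2 candidate parts each.
part_cost = [[1.0, 2.0], [0.5, 0.2], [1.5, 0.7]]
link_cost = [[[0.0, 1.0], [1.0, 0.0]], [[0.0, 0.3], [0.3, 0.0]]]
print(select_parts(part_cost, link_cost))      # -> (2.5, [0, 0, 1])
```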

  13. Poster — Thur Eve — 51: An analysis of the effectiveness of automated pre-, post- and intra-treatment auditing of electronic health records

    SciTech Connect

    Joseph, A.; Seuntjens, J.; Parker, W.; Kildea, J.; Freeman, C.

    2014-08-15

    We describe development of automated, web-based, electronic health record (EHR) auditing software for use within our paperless radiation oncology clinic. By facilitating access to multiple databases within the clinic, each patient's EHR is audited prior to treatment, regularly during treatment, and post treatment. Anomalies such as missing documentation, non-compliant workflow and treatment parameters that differ significantly from the norm may be monitored, flagged and brought to the attention of clinicians. By determining historical trends using existing patient data and by comparing new patient data with the historical, we expect our software to provide a measurable improvement in the quality of radiotherapy at our centre.
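
    As an illustration of the kind of check such an auditing tool can run, the sketch below flags a treatment parameter that deviates strongly from its historical distribution; the z-score rule, threshold and example values are assumptions rather than the authors' actual audit criteria.

```python
# Hedged sketch of one auditing idea from the abstract: compare a new
# patient's numeric treatment parameter against the historical distribution
# and flag values that deviate strongly from the norm. Threshold and example
# values are assumptions.

from statistics import mean, stdev

def flag_outlier(value: float, historical: list[float], z_threshold: float = 3.0):
    """Return (is_anomalous, z_score) for a single numeric parameter."""
    if len(historical) < 2:
        return False, 0.0                      # not enough history to judge
    mu, sigma = mean(historical), stdev(historical)
    if sigma == 0:
        return value != mu, float("inf") if value != mu else 0.0
    z = (value - mu) / sigma
    return abs(z) > z_threshold, z

history = [200.0, 198.5, 202.0, 199.0, 201.5, 200.5]   # e.g. dose per fraction (cGy)
print(flag_outlier(260.0, history))                     # flagged
print(flag_outlier(200.8, history))                     # not flagged
```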

  14. Studies relevant to the design of reliable automotive electronics

    NASA Astrophysics Data System (ADS)

    Zeng, Hua

    This dissertation describes three independent studies related to the design of reliable automotive electronics. The topics covered are: the estimation of radiated emissions from power bus structures, EM propagation for tire pressure monitoring systems (TPMS), and examination of corrosion-induced faults in a connector. The first chapter describes a method for estimating the maximum possible radiated emissions from a printed circuit board power bus. An analysis based on a lossy cavity model is performed to determine the maximum possible radiated field corresponding to a given power bus noise voltage. A closed-form expression relating the maximum power bus noise voltage to the radiation peaks is then derived. This expression is solved in reverse to determine the minimum power bus voltage necessary to generate a radiated field, and it can be applied to measured values of power bus noise voltage to determine whether radiation directly from the power bus is potentially the emissions source. The second chapter identifies transmission parameters from a rotating tire and the vehicle body's effect on tire sensor transmission and propagation; relates these effects to receiver antenna packaging requirements; and then, based on these results, proposes an antenna design that employs the car body as part of the antenna. In the proposed TPMS design, a 20 mm x 5 mm loop antenna with a 40 mm x 10 mm slot beneath it is added to capture the surface currents of the car body and block the current path, increasing the current density around the loop antenna. The simulation results show that this design exhibits a propagation factor 150 times larger than the traditional design. The third chapter investigates the effects of different contaminants (salt, oil, grease) on the shunt resistance between pins of a cable connector. The test results show that salt-induced corrosion and moisture may cause intermittent shunting resistances capable of affecting the normal operation of automotive systems. One

  15. Design of an Electrically Automated RF Transceiver Head Coil in MRI.

    PubMed

    Sohn, Sung-Min; DelaBarre, Lance; Gopinath, Anand; Vaughan, John Thomas

    2015-10-01

    Magnetic resonance imaging (MRI) is a widely used nonionizing and noninvasive diagnostic instrument to produce detailed images of the human body. The radio-frequency (RF) coil is an essential part of MRI hardware as an RF front-end. RF coils transmit RF energy to the subject and receive the returning MR signal. This paper presents an MRI-compatible hardware design of a new automatic frequency tuning and impedance matching system. The system automatically corrects the detuned and mismatched condition that occurs due to loading effects caused by variable subjects (i.e., different human heads or torsos). An eight-channel RF transceiver head coil with the automatic system has been fabricated and tested in a 7 Tesla (T) MRI system. The automatic frequency tuning and impedance matching system uses digitally controlled capacitor arrays with real-time feedback control capability. The hardware design is not only compatible with current MRI scanners in all aspects but also performs the tuning and matching function rapidly and accurately. The experimental results show that the automatic function increases return losses from 8.4 dB to 23.7 dB (maximum difference) and from 12.7 dB to 19.6 dB (minimum difference) among eight channels within 550 ms. The reflected RF power decreases from 23.1% to 1.5% (maximum difference) and from 5.3% to 1.1% (minimum difference). These improvements increase the signal-to-noise ratio (SNR) in MR images of phantoms. PMID:25361512
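
    To connect the quantities quoted above, the sketch below converts return loss to reflected power fraction and shows a brute-force search over digital capacitor codes driven by a feedback measurement; the measurement function is a synthetic placeholder and the exhaustive search strategy is an assumption, not the authors' control loop.

```python
# Sketch relating return loss to reflected power and automating a search over
# digitally controlled capacitor codes. measure_return_loss_db() stands in for
# the real-time feedback measurement and is an assumption of this sketch.

def reflected_power_fraction(return_loss_db: float) -> float:
    """Fraction of forward power reflected for a given return loss in dB."""
    return 10 ** (-return_loss_db / 10.0)

def best_capacitor_code(measure_return_loss_db, codes=range(256)):
    """Try each digital capacitor code and keep the one with the highest
    return loss (i.e. the least reflected power)."""
    return max(codes, key=measure_return_loss_db)

# Example with a synthetic, single-peaked response standing in for hardware.
fake_measurement = lambda code: 25.0 - 0.002 * (code - 180) ** 2
code = best_capacitor_code(fake_measurement)
print(code, f"{reflected_power_fraction(fake_measurement(code)):.3%} reflected")
```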

  16. A hybrid systems strategy for automated spacecraft tour design and optimization

    NASA Astrophysics Data System (ADS)

    Stuart, Jeffrey R.

    As the number of operational spacecraft increases, autonomous operation is rapidly evolving into a critical necessity. Additionally, the capability to rapidly generate baseline trajectories greatly expands the range of options available to analysts as they explore the design space to meet mission demands. Thus, a general strategy is developed, one that is suitable for the construction of flight plans for both Earth-based and interplanetary spacecraft that encounter multiple objects, where these multiple encounters comprise a "tour". The proposed scheme is flexible in implementation and can readily be adjusted to a variety of mission architectures. Heuristic algorithms that autonomously generate baseline tour trajectories and, when appropriate, adjust reference solutions in the presence of rapidly changing environments are investigated. Furthermore, relative priorities for ranking the targets are explicitly accommodated during the construction of potential tour sequences. As a consequence, a priori, as well as newly acquired, knowledge concerning the target objects enhances the potential value of the ultimate encounter sequences. A variety of transfer options are incorporated, from rendezvous arcs enabled by low-thrust engines to more conventional impulsive orbit adjustments via chemical propulsion technologies. When advantageous, trajectories are optimized in terms of propellant consumption via a combination of indirect and direct methods; such a combination of available technologies is an example of hybrid optimization. Additionally, elements of hybrid systems theory, i.e., the blending of dynamical states, some discrete and some continuous, are integrated into the high-level tour generation scheme. For a preliminary investigation, this strategy is applied to mission design scenarios for a Sun-Jupiter Trojan asteroid tour as well as orbital debris removal for near-Earth applications.

  17. Design of an automated device to measure sagittal plane stiffness of an articulated ankle-foot orthosis.

    PubMed

    Kobayashi, Toshiki; Leung, Aaron K L; Akazawa, Yasushi; Naito, Hisashi; Tanaka, Masao; Hutchins, Stephen W

    2010-12-01

    The purpose of this study was to design a new automated stiffness measurement device which could perform a simultaneous measurement of both dorsi- and plantarflexion angles and the corresponding resistive torque around the rotational centre of an articulated ankle-foot orthosis (AAFO). This was achieved by controlling angular velocities and range of motion in the sagittal plane. The device consisted of a hydraulic servo fatigue testing machine, a torque meter, a potentiometer, a rotary plate and an upright supporter to enable an AAFO to be attached to the device via a surrogate shank. The accuracy of the device in reproducing the range of motion and angular velocity was within 4% and 1% respectively in the range of motion of 30° (15° plantarflexion to 15° dorsiflexion) at the angular velocity of 10°/s, while that in the measurement of AAFO torque was within 8% at the 0° position. The device should prove useful to assist an orthotist or a manufacturer to quantify the stiffness of an AAFO and inform its clinical use. PMID:20681928
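
    Given the synchronized angle and torque recordings the device produces, stiffness can be reported as the slope of torque versus angle; the sketch below shows such a least-squares estimate on invented data.

```python
# Sketch of extracting sagittal-plane stiffness from synchronized angle and
# torque recordings: fit a line to torque versus ankle angle and report the
# slope (N*m/deg). The sample data are invented.

import numpy as np

def afo_stiffness(angle_deg: np.ndarray, torque_Nm: np.ndarray) -> float:
    """Least-squares slope of torque vs. angle = rotational stiffness."""
    slope, _intercept = np.polyfit(angle_deg, torque_Nm, deg=1)
    return slope

angles = np.linspace(-15, 15, 61)            # 15 deg PF to 15 deg DF sweep
torques = 0.9 * angles + np.random.default_rng(1).normal(0, 0.2, angles.size)
print(f"stiffness = {afo_stiffness(angles, torques):.2f} N*m/deg")
```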

  18. Automated Reflectance Measurement System Designed and Fabricated to Determine the Limits of Atomic Oxygen Treatment of Art Through Contrast Optimization

    NASA Technical Reports Server (NTRS)

    Sechkar, Edward A.; Stueber, Thomas J.; Rutledge, Sharon K.

    2000-01-01

    Atomic oxygen generated in ground-based research facilities has been used to not only test erosion of candidate spacecraft materials but as a noncontact technique for removing organic deposits from the surfaces of artwork. NASA has patented the use of atomic oxygen to remove carbon-based soot contamination from fire-damaged artwork. The process of cleaning soot-damaged paintings with atomic oxygen requires exposures for variable lengths of time, dependent on the condition of a painting. Care must be exercised while cleaning to prevent the removal of pigment. The cleaning process must be stopped as soon as visual inspection or surface reflectance measurements indicate that cleaning is complete. Both techniques rely on optical comparisons of known bright locations against known dark locations on the artwork being cleaned. Difficulties arise with these techniques when either a known bright or dark location cannot be determined readily. Furthermore, dark locations will lighten with excessive exposure to atomic oxygen. Therefore, an automated test instrument to quantitatively characterize cleaning progression was designed and developed at the NASA Glenn Research Center at Lewis Field to determine when atomic oxygen cleaning is complete.

  19. Automated Defect Classification (ADC)

    Energy Science and Technology Software Center (ESTSC)

    1998-01-01

    The ADC Software System is designed to provide semiconductor defect feature analysis and defect classification capabilities. Defect classification is an important software method used by semiconductor wafer manufacturers to automate the analysis of defect data collected by a wide range of microscopy techniques in semiconductor wafer manufacturing today. These microscopies (e.g., optical bright and dark field, scanning electron microscopy, atomic force microscopy, etc.) generate images of anomalies that are induced or otherwise appear on wafer surfaces as a result of errant manufacturing processes or simple atmospheric contamination (e.g., airborne particles). This software provides methods for analyzing these images, extracting statistical features from the anomalous regions, and applying supervised classifiers to label the anomalies into user-defined categories.

  20. Statistical learning for alloy design from electronic structure calculations

    NASA Astrophysics Data System (ADS)

    Broderick, Scott R.

    The objective of this thesis is to explore how statistical learning methods can contribute to the interpretation and efficacy of electronic structure calculations. This study develops new applications of statistical learning and data mining methods to both semi-empirical and density functional theory (DFT) calculations. Each of these classes of electronic structure calculations serves as templates for different data driven discovery strategies for materials science applications. In our study of semi-empirical methods, we take advantage of the ability of data mining methods to quantitatively assess high dimensional parameterization schemes. The impact of this work includes the development of accelerated computational schemes for developing reduced order models. Another application is the use of these informatics based techniques to serve as a means for estimating parameters when data for such calculations are not available. Using density of states (DOS) spectra derived from DFT calculations we have demonstrated the classification power of singular value decomposition methods to accurately develop structural and stoichiometric classifications of compounds. Building on this work we have extended this analytical strategy to apply the predictive capacity of informatics methods to develop a new and far more robust modeling approach for DOS spectra, addressing an issue that has gone relatively unchallenged over two decades. By exploring a diverse array of materials systems (metals, ceramics, different crystal structures) this work has laid the foundations for expanding the linkages between statistical learning and statistical thermodynamics. The results of this work provide exciting new opportunities in computational based design of materials that have not been explored before.
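
    As a concrete illustration of the SVD-based classification of DOS spectra mentioned above, the sketch below projects spectra onto their leading singular vectors and classifies a new spectrum by its nearest labelled neighbour in that reduced space; the data, labels and nearest-neighbour rule are assumptions, not the thesis' actual workflow.

```python
# Hedged sketch of SVD-based classification: stack DOS spectra as matrix rows,
# project onto the top-k right singular vectors, then classify a new spectrum
# by its nearest labelled neighbour in the reduced space. Data are synthetic.

import numpy as np

def svd_embed(spectra: np.ndarray, k: int = 3):
    """Return the mean, the top-k basis, and the embedded training spectra."""
    mean = spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(spectra - mean, full_matrices=False)
    basis = vt[:k].T
    return mean, basis, (spectra - mean) @ basis

def classify(new_spectrum, mean, basis, embeddings, labels):
    z = (new_spectrum - mean) @ basis
    nearest = np.argmin(np.linalg.norm(embeddings - z, axis=1))
    return labels[nearest]

rng = np.random.default_rng(3)
spectra = rng.random((20, 300))              # 20 spectra, 300 energy bins
labels = ["fcc"] * 10 + ["bcc"] * 10         # hypothetical structure labels
mean, basis, emb = svd_embed(spectra)
print(classify(spectra[4] + 0.01 * rng.random(300), mean, basis, emb, labels))
```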

  1. Automated semantic indexing of imaging reports to support retrieval of medical images in the multimedia electronic medical record.

    PubMed

    Lowe, H J; Antipov, I; Hersh, W; Smith, C A; Mailhot, M

    1999-12-01

    This paper describes preliminary work evaluating automated semantic indexing of radiology imaging reports to represent images stored in the Image Engine multimedia medical record system at the University of Pittsburgh Medical Center. The authors used the SAPHIRE indexing system to automatically identify important biomedical concepts within radiology reports and represent these concepts with terms from the 1998 edition of the U.S. National Library of Medicine's Unified Medical Language System (UMLS) Metathesaurus. This automated UMLS indexing was then compared with manual UMLS indexing of the same reports. Human indexing identified appropriate UMLS Metathesaurus descriptors for 81% of the important biomedical concepts contained in the report set. SAPHIRE automatically identified UMLS Metathesaurus descriptors for 64% of the important biomedical concepts contained in the report set. The overall conclusions of this pilot study were that the UMLS metathesaurus provided adequate coverage of the majority of the important concepts contained within the radiology report test set and that SAPHIRE could automatically identify and translate almost two thirds of these concepts into appropriate UMLS descriptors. Further work is required to improve both the recall and precision of this automated concept extraction process. PMID:10805018

  2. EGUN: An electron optics and gun design program

    SciTech Connect

    Herrmannsfeldt, W.B.

    1988-10-01

    The name EGUN has become commonly associated with the program also known as the SLAC Electron Trajectory Program. This document is an updated version of SLAC-226, published in 1979. The program itself has had substantial upgrading since then, but only a few new features are of much concern to the user. Most of the improvements are internal and are intended to improve speed or accuracy. EGUN is designed to compute trajectories of charged particles in electrostatic and magnetostatic fields, including the effects of space charge and self-magnetic fields. Starting options include Child's Law conditions on cathodes of various shapes, as well as user-specified initial conditions. Either rectangular or cylindrical symmetry may be used. In the new jargon, the program is a 2-1/2 dimension code meaning 2-D in all fields and 3-D in all particle motion. A Poisson's Equation Solver is used to find the electrostatic fields by using difference equations derived from the boundary conditions. Magnetic fields are to be specified externally, by the user, by using one of several methods including data from another program or arbitrary configurations of coils. This edition of the documentation also covers the program EGN87c, which is a recently developed version of EGUN designed to be used on the newer models of personal computers, small main frames, work stations, etc. The EGN87c program uses the programming language C which is very transportable so the program should operate on any system that supports C. Plotting routines for most common PC monitors are included, and the capability to make hard copy plots on dot-matrix printer-plotters is provided. 18 refs., 7 figs.
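
    The following sketch shows the finite-difference relaxation idea behind a Poisson/Laplace field solve of the kind EGUN performs internally; electrode shapes, cylindrical symmetry and the space charge handled by the real program are omitted, and the toy geometry is an assumption.

```python
# Minimal sketch of finite-difference relaxation (Jacobi iteration) for a
# Laplace field solve on a uniform Cartesian grid with fixed-potential
# boundary nodes and no space charge. This only illustrates the numerical
# idea, not EGUN's actual solver.

import numpy as np

def solve_laplace(potential: np.ndarray, fixed: np.ndarray,
                  tol: float = 1e-4, max_iter: int = 20_000) -> np.ndarray:
    """Relax interior nodes toward the average of their four neighbours."""
    v = potential.astype(float).copy()
    for _ in range(max_iter):
        new = v.copy()
        new[1:-1, 1:-1] = 0.25 * (v[:-2, 1:-1] + v[2:, 1:-1] +
                                  v[1:-1, :-2] + v[1:-1, 2:])
        new[fixed] = potential[fixed]          # re-impose electrode potentials
        if np.max(np.abs(new - v)) < tol:
            break
        v = new
    return new

# Toy geometry: a grounded square box with its right-hand wall held at 1000 V.
grid = np.zeros((40, 40))
grid[:, -1] = 1000.0
fixed = np.zeros_like(grid, dtype=bool)
fixed[0, :] = fixed[-1, :] = fixed[:, 0] = fixed[:, -1] = True
print(np.round(solve_laplace(grid, fixed)[20, ::8], 1))   # potential along mid-row
```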

  3. Design and Development of the SMAP Microwave Radiometer Electronics

    NASA Technical Reports Server (NTRS)

    Piepmeier, Jeffrey R.; Medeiros, James J.; Horgan, Kevin A.; Brambora, Clifford K.; Estep, Robert H.

    2014-01-01

    The SMAP microwave radiometer will measure land surface brightness temperature at L-band (1413 MHz) in the presence of radio frequency interference (RFI) for soil moisture remote sensing. The radiometer design was driven by the requirements to incorporate internal calibration, to operate synchronously with the SMAP radar, and to mitigate the deleterious effects of RFI. The system design includes a highly linear super-heterodyne microwave receiver with internal reference loads and noise sources for calibration and an innovative digital signal processor and detection system. The front-end comprises a coaxial cable-based feed network, with a pair of diplexers and a coupled noise source, and radiometer front-end (RFE) box. Internal calibration is provided by reference switches and a common noise source inside the RFE. The RF back-end (RBE) downconverts the 1413 MHz channel to an intermediate frequency (IF) of 120 MHz. The IF signals are then sampled and quantized by high-speed analog-to-digital converters in the radiometer digital electronics (RDE) box. The RBE local oscillator and RDE sampling clocks are phase-locked to a common reference to ensure coherency between the signals. The RDE performs additional filtering, sub-band channelization, cross-correlation for measuring third and fourth Stokes parameters, and detection and integration of the first four raw moments of the signals. These data are packetized and sent to the ground for calibration and further processing. Here we discuss the novel features of the radiometer hardware particularly those influenced by the need to mitigate RFI.
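
    The sketch below illustrates the detection quantities named in the abstract, the first four raw moments of each channel and the cross-products related to the third and fourth Stokes parameters, computed offline on synthetic data; the exact on-board formulation, calibration and sub-band channelization are not reproduced.

```python
# Sketch of the detection stage described for the RDE: from a block of
# digitized samples per channel, accumulate the first four raw moments and
# the cross-products used for the third and fourth Stokes parameters (the
# quadrature component is obtained here with an offline Hilbert transform).
# The on-board formulation is an assumption of this sketch.

import numpy as np
from scipy.signal import hilbert

def radiometer_block_stats(v: np.ndarray, h: np.ndarray) -> dict:
    """v, h: sampled voltages of the two polarization channels for one block."""
    return {
        "v_moments": [float(np.mean(v ** k)) for k in (1, 2, 3, 4)],
        "h_moments": [float(np.mean(h ** k)) for k in (1, 2, 3, 4)],
        "re_vh": float(np.mean(v * h)),                    # ~ third Stokes
        "im_vh": float(np.mean(v * np.imag(hilbert(h)))),  # ~ fourth Stokes
    }

rng = np.random.default_rng(0)
common = rng.normal(size=4096)                 # partially correlated channels
v = common + 0.3 * rng.normal(size=4096)
h = common + 0.3 * rng.normal(size=4096)
print(radiometer_block_stats(v, h)["re_vh"])
```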

  4. Compact design for two-dimensional electronic spectroscopy

    NASA Astrophysics Data System (ADS)

    Huang, Zheng; Wang, Peng; Shen, Xiong; Yan, Tian-Min; Zhang, Yizhu; Liu, Jun

    2016-03-01

    We present a passively phase-stabilized two-dimensional electronic spectroscopy (2DES) with a compact size, and the ease of implementation and maintenance. Our design relies on a mask beam-splitter with four holes to form non-collinear box geometry, and a homebuilt stacked retroreflector, which introduces the phase-locked pulse sequence, remedying the instability of commonly used translation stages. The minimized size of the setup suppresses the influences of optical path-length fluctuations during measurements, improving the phase stability and precise timing of pulse sequences. In our 2DES, only few conventional optical components are used, which make this sophisticated instrumentation convenient to establish and particularly easy to conduct alignment. In data analysis, the self-referencing spectral interferometry (SRSI) method is first introduced to extract the complex-valued signal from spectral interferometry in 2DES. The alternative algorithm achieves the improvement of the signal-to-noise ratio (SNR) and considerable reduction of data acquisition time. The new setup is suitable over a tunable range of spectroscopic wavelength, from ultraviolet (UV) to the near-infrared (NIR) regime, and for ultra-broadband bandwidth, few-cycle laser pulses.

  5. Low emittance injector design for free electron lasers

    NASA Astrophysics Data System (ADS)

    Bettoni, S.; Pedrozzi, M.; Reiche, S.

    2015-12-01

    Several parameters determine the performance of free electron lasers: the slice and the projected emittance, the slice energy spread, and the peak current are the most crucial ones. The peak current is essentially obtained by magnetic compression stages along the machine or occasionally assisted by velocity bunching at low energy. The minimum emittance and the alignment of the slices along the bunch are mainly determined in the low energy part of the accelerator (injector). Variations at the per-mille level of several parameters in this section of the machine strongly influence these quantities with highly nonlinear dynamics. We developed a numerical tool to perform the optimization of the injector. We applied this code to optimize the SwissFEL injector, assuming different gun designs, initial bunch lengths and intrinsic emittances. We obtained an emittance along the bunch of 0.14 mm mrad and around 0.08 mm mrad for the maximum and the minimum SwissFEL charges (200 and 10 pC, respectively). We applied the same tool to a running injector, where we automated the optimization of the machine.

  6. Distribution automation applications of fiber optics

    NASA Technical Reports Server (NTRS)

    Kirkham, Harold; Johnston, A.; Friend, H.

    1989-01-01

    Motivations for interest and research in distribution automation are discussed. The communication requirements of distribution automation are examined and shown to exceed the capabilities of power line carrier, radio, and telephone systems. A fiber optic based communication system is described that is co-located with the distribution system and that could satisfy the data rate and reliability requirements. A cost comparison shows that it could be constructed at a cost that is similar to that of a power line carrier system. The requirements for fiber optic sensors for distribution automation are discussed. The design of a data link suitable for optically-powered electronic sensing is presented. Empirical results are given. A modeling technique that was used to understand the reflections of guided light from a variety of surfaces is described. An optical position-indicator design is discussed. Systems aspects of distribution automation are discussed, in particular, the lack of interface, communications, and data standards. The economics of distribution automation are examined.

  7. Aviation safety and automation technology for subsonic transports

    NASA Technical Reports Server (NTRS)

    Albers, James A.

    1991-01-01

    Discussed here are the aviation safety human factors and air traffic control (ATC) automation research conducted at the NASA Ames Research Center. Research results are given in the areas of flight deck and ATC automation, displays and warning systems, crew coordination, and crew fatigue and jet lag. Accident investigation and an incident reporting system that is used to guide the human factors research is discussed. A design philosophy for human-centered automation is given, along with an evaluation of automation on advanced technology transports. Intelligent error tolerant systems such as electronic checklists are discussed along with design guidelines for reducing procedure errors. The data on evaluation of Crew Resource Management (CRM) training indicates highly significant positive changes in appropriate flight deck behavior and more effective use of available resources for crew members receiving the training.

  8. Automation of Resistance Bridge Calibrator

    NASA Astrophysics Data System (ADS)

    Podgornik, Tadej; Bojkovski, Jovan; Batagelj, Valentin; Drnovšek, Janko

    2008-02-01

    The article addresses the automation of the resistance bridge calibrator (RBC). The automation of the RBC is performed in order to facilitate the operation of the RBC, improve the reliability, and enable several additional possibilities compared to the tedious manual operation, thereby making the RBC a more practical device for routine use. The RBC is used to calibrate AC and DC resistance bridges, which are mainly used in a primary thermometry laboratory. It consists of a resistor network made up from four main resistors from which 35 different resistance values can be realized using toggle switches. Literature shows that the resistors’ non-zero temperature coefficient can influence the measurements, causing difficulties when calibrating resistance bridges with low uncertainty. Placing the RBC in a thermally stable environment can reduce this, but it does not solve the problem of the time-consuming manual selection of the resistance values. To solve this, an automated means to manipulate the switches, while the device is placed within a thermally stable environment, was created. Computer operation completely substitutes for any manual operation during which an operator would normally have to be present. The computer also acquires measurements from the bridge. In this way, repeated and reproducible calibration measurements inside a temperature-stable environment can be carried out with no active involvement of personnel. The automation process itself was divided into several stages. They included the construction of a servo-manipulator to move the switches, the design of a dedicated electronic controller that also provides a serial interface (RS-232) to the computer, and the development of custom computer software to configure the servo-manipulator and control the calibration process. Measurements show that automation does not affect the long-term stability and mechanical repeatability of the RBC. The repeatability and reproducibility of bridge calibration ratios

  9. Design of power electronics for TVC and EMA systems

    NASA Astrophysics Data System (ADS)

    Nelms, R. Mark; Bell, J. Brett; Shepherd, Michael T.

    1994-11-01

    The Component Development Division of the Propulsion Laboratory at Marshall Space Flight Center (MSFC) is currently developing a class of electromechanical actuators (EMA's) for use in space transportation applications such as thrust vector control (TVC) and propellant control valves (PCV). These high power servomechanisms will require rugged, reliable, and compact power electronic modules capable of modulating several hundred amperes of current at up to 270 volts. MSFC has selected the brushless dc motor for implementation in EMA's. A previous project performed by Auburn University examined the use of the resonant dc link (RDCL) inverter, pulse density modulation (PDM), and mos-controlled thyristors (MCT's) for speed control of a brushless dc motor. The speed of the brushless dc motor is proportional to the applied stator voltage. In a PDM system, the control system determines the number of resonant voltage pulses which must be applied to the stator to achieve a desired speed. The addition of a waveshaping circuit to the front end of a standard three-phase inverter yields a RDCL inverter; the resonant voltage pulses are produced through the action of this wave shaping circuit and the inverter. This project has focused on the implementation of a system which permits zero-voltage switching with the bus voltage clamped at the input voltage level. In the same manner as the RDCL inverter, the inverter selected for this implementation is a combination of waveshaping circuit and a standard three-phase inverter. In addition, this inverter allows a pulse-width modulated (PWM)-like control scheme instead of a PDM scheme. The operation of waveshaping circuit will be described through analysis and waveforms. Design relationships will also be presented.

  10. Design of power electronics for TVC and EMA systems

    NASA Technical Reports Server (NTRS)

    Nelms, R. Mark; Bell, J. Brett; Shepherd, Michael T.

    1994-01-01

    The Component Development Division of the Propulsion Laboratory at Marshall Space Flight Center (MSFC) is currently developing a class of electromechanical actuators (EMA's) for use in space transportation applications such as thrust vector control (TVC) and propellant control valves (PCV). These high power servomechanisms will require rugged, reliable, and compact power electronic modules capable of modulating several hundred amperes of current at up to 270 volts. MSFC has selected the brushless dc motor for implementation in EMA's. A previous project performed by Auburn University examined the use of the resonant dc link (RDCL) inverter, pulse density modulation (PDM), and mos-controlled thyristors (MCT's) for speed control of a brushless dc motor. The speed of the brushless dc motor is proportional to the applied stator voltage. In a PDM system, the control system determines the number of resonant voltage pulses which must be applied to the stator to achieve a desired speed. The addition of a waveshaping circuit to the front end of a standard three-phase inverter yields a RDCL inverter; the resonant voltage pulses are produced through the action of this wave shaping circuit and the inverter. This project has focused on the implementation of a system which permits zero-voltage switching with the bus voltage clamped at the input voltage level. In the same manner as the RDCL inverter, the inverter selected for this implementation is a combination of waveshaping circuit and a standard three-phase inverter. In addition, this inverter allows a pulse-width modulated (PWM)-like control scheme instead of a PDM scheme. The operation of waveshaping circuit will be described through analysis and waveforms. Design relationships will also be presented.

  11. Design, construction and testing of a low-cost automated 68Gallium-labeling synthesis unit for clinical use

    PubMed Central

    Heidari, Pedram; Szretter, Alicia; Rushford, Laura E; Stevens, Maria; Collier, Lee; Sore, Judit; Hooker, Jacob; Mahmood, Umar

    2016-01-01

    The interest in 68Gallium labeled PET probes continues to increase around the world. Widespread use in Europe and Asia has led to great interest for use at numerous sites in the US. One barrier to entry is the cost of the automated synthesis units for relatively simple labeling procedures. We describe the construction and testing of a relatively low-cost automated 68Ga-labeling unit for human-use. We provide a guide for construction, including part lists and synthesis timelists to facilitate local implementation. Such inexpensive systems could help increase use around the globe and in the US in particular by removing one of the barriers to greater widespread availability. The developed automated synthesis unit reproducibly synthesized 68Ga-DOTATOC with average yield of 71 ± 8% and a radiochemical purity ≥ 95% in a synthesis time of 25 ± 1 minutes. Automated product yields are comparable to that of manual synthesis. We demonstrate in-house construction and use of a low-cost automated synthesis unit for labeling of DOTATOC and similar peptides with 68Gallium. PMID:27508104

  12. Design, construction and testing of a low-cost automated (68)Gallium-labeling synthesis unit for clinical use.

    PubMed

    Heidari, Pedram; Szretter, Alicia; Rushford, Laura E; Stevens, Maria; Collier, Lee; Sore, Judit; Hooker, Jacob; Mahmood, Umar

    2016-01-01

    The interest in (68)Gallium labeled PET probes continues to increase around the world. Widespread use in Europe and Asia has led to great interest for use at numerous sites in the US. One barrier to entry is the cost of the automated synthesis units for relatively simple labeling procedures. We describe the construction and testing of a relatively low-cost automated (68)Ga-labeling unit for human-use. We provide a guide for construction, including part lists and synthesis timelists to facilitate local implementation. Such inexpensive systems could help increase use around the globe and in the US in particular by removing one of the barriers to greater widespread availability. The developed automated synthesis unit reproducibly synthesized (68)Ga-DOTATOC with average yield of 71 ± 8% and a radiochemical purity ≥ 95% in a synthesis time of 25 ± 1 minutes. Automated product yields are comparable to that of manual synthesis. We demonstrate in-house construction and use of a low-cost automated synthesis unit for labeling of DOTATOC and similar peptides with (68)Gallium. PMID:27508104

  13. Automated software development workstation

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Engineering software development was automated using an expert system (rule-based) approach. The use of this technology offers benefits not available from current software development and maintenance methodologies. A workstation was built with a library or program data base with methods for browsing the designs stored; a system for graphical specification of designs including a capability for hierarchical refinement and definition in a graphical design system; and an automated code generation capability in FORTRAN. The workstation was then used in a demonstration with examples from an attitude control subsystem design for the space station. Documentation and recommendations are presented.

  14. Alternative positron-target design for electron-positron colliders

    SciTech Connect

    Donahue, R.J. ); Nelson, W.R. )

    1991-04-01

    Current electron-positron linear colliders are limited in luminosity by the number of positrons which can be generated from targets presently used. This paper examines the possibility of using an alternate wire-target geometry for the production of positrons via an electron-induced electromagnetic cascade shower. 39 refs., 38 figs., 5 tabs.

  15. Design certification review assessment report. Electron/proton spectrometer

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The characteristics of the electron/proton spectrometer developed for the Skylab program are presented. The spectrometer is externally mounted on the Skylab module complex and provides omnidirectional measurement of electrons and protons which result from solar flares or enhancement of the radiation belts. The data are applied to the determination of relative biological effectiveness factors as a safety factor for manned space flight.

  16. RHIC electron lenses upgrades

    SciTech Connect

    Gu, X.; Altinbas, Z.; Bruno, D.; Binello, S.; Costanzo, M.; Drees, A.; Fischer, W.; Gassner, D. M.; Hock, J.; Hock, K.; Harvey, M.; Luo, Y.; Marusic, A.; Mi, C.; Mernick, K.; Minty, M.; Michnoff, R.; Miller, T. A.; Pikin, A. I.; Robert-Demolaize, G.; Samms, T.; Shrey, T. C.; Schoefer, V.; Tan, Y.; Than, R.; Thieberger, P.; White, S. M.

    2015-05-03

    In the Relativistic Heavy Ion Collider (RHIC) 100 GeV polarized proton run in 2015, two electron lenses were used to partially compensate for the head-on beam-beam effect for the first time. Here, we describe the design of the current electron lens, detailing the hardware modifications made after the 2014 commissioning run with heavy ions. A new electron gun with 15-mm diameter cathode is characterized. The electron beam transverse profile was measured using a YAG screen and fitted with a Gaussian distribution. During operation, the overlap of the electron and proton beams was achieved using the electron backscattering detector in conjunction with an automated orbit control program.
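
    Fitting the YAG-screen profile with a Gaussian, as described above, can be done with a standard least-squares fit; the sketch below uses a synthetic 1-D profile, whereas a real analysis would fit the projections (or a 2-D Gaussian) of the screen image.

```python
# Sketch of fitting a Gaussian to a measured transverse beam profile.
# The 1-D profile here is synthetic; parameter names are illustrative.

import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amplitude, center, sigma, offset):
    return amplitude * np.exp(-0.5 * ((x - center) / sigma) ** 2) + offset

x = np.linspace(-10, 10, 200)                        # mm across the screen
rng = np.random.default_rng(2)
profile = gaussian(x, 1.0, 0.5, 2.3, 0.05) + rng.normal(0, 0.02, x.size)

popt, _ = curve_fit(gaussian, x, profile, p0=[1.0, 0.0, 1.0, 0.0])
print(f"fitted center = {popt[1]:.2f} mm, sigma = {popt[2]:.2f} mm")
```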

  17. Design and performance of an automated radionuclide separator: its application on the determination of ⁹⁹Tc in groundwater.

    PubMed

    Chung, Kun Ho; Choi, Sang Do; Choi, Geun Sik; Kang, Mun Ja

    2013-11-01

    A modular automated radionuclide separator for (99)Tc (MARS Tc-99) has been developed for the rapid and reproducible separation of technetium in groundwater samples. The control software of MARS Tc-99 was developed in the LabView programming language. An automated radiochemical method for separating (99)Tc was developed and validated by the purification of (99m)Tc tracer solution eluted from a commercial (99)Mo/(99m)Tc generator. The chemical recovery and analytical time for this radiochemical method were found to be 96 ± 2% and 81 min, respectively. PMID:23602584

  18. Automated Pilot Advisory System

    NASA Technical Reports Server (NTRS)

    Parks, J. L., Jr.; Haidt, J. G.

    1981-01-01

    An Automated Pilot Advisory System (APAS) was developed and operationally tested to demonstrate the concept that low cost automated systems can provide air traffic and aviation weather advisory information at high density uncontrolled airports. The system was designed to enhance the see and be seen rule of flight, and pilots who used the system preferred it over the self announcement system presently used at uncontrolled airports.

  19. An ultra-lightweight design for imperceptible plastic electronics.

    PubMed

    Kaltenbrunner, Martin; Sekitani, Tsuyoshi; Reeder, Jonathan; Yokota, Tomoyuki; Kuribara, Kazunori; Tokuhara, Takeyoshi; Drack, Michael; Schwödiauer, Reinhard; Graz, Ingrid; Bauer-Gogonea, Simona; Bauer, Siegfried; Someya, Takao

    2013-07-25

    Electronic devices have advanced from their heavy, bulky origins to become smart, mobile appliances. Nevertheless, they remain rigid, which precludes their intimate integration into everyday life. Flexible, textile and stretchable electronics are emerging research areas and may yield mainstream technologies. Rollable and unbreakable backplanes with amorphous silicon field-effect transistors on steel substrates only 3 μm thick have been demonstrated. On polymer substrates, bending radii of 0.1 mm have been achieved in flexible electronic devices. Concurrently, the need for compliant electronics that can not only be flexed but also conform to three-dimensional shapes has emerged. Approaches include the transfer of ultrathin polyimide layers encapsulating silicon CMOS circuits onto pre-stretched elastomers, the use of conductive elastomers integrated with organic field-effect transistors (OFETs) on polyimide islands, and fabrication of OFETs and gold interconnects on elastic substrates to realize pressure, temperature and optical sensors. Here we present a platform that makes electronics both virtually unbreakable and imperceptible. Fabricated directly on ultrathin (1 μm) polymer foils, our electronic circuits are light (3 g m(-2)) and ultraflexible and conform to their ambient, dynamic environment. Organic transistors with an ultra-dense oxide gate dielectric a few nanometres thick formed at room temperature enable sophisticated large-area electronic foils with unprecedented mechanical and environmental stability: they withstand repeated bending to radii of 5 μm and less, can be crumpled like paper, accommodate stretching up to 230% on prestrained elastomers, and can be operated at high temperatures and in aqueous environments. Because manufacturing costs of organic electronics are potentially low, imperceptible electronic foils may be as common in the future as plastic wrap is today. Applications include matrix-addressed tactile sensor foils for health care and

  20. Note: design and development of improved indirectly heated cathode based strip electron gun.

    PubMed

    Maiti, Namita; Bade, Abhijeet; Tembhare, G U; Patil, D S; Dasgupta, K

    2015-02-01

    An improved design of an indirectly heated solid-cathode based electron gun (200 kW, 45 kV, 270° bent strip type electron gun) is presented. The solid cathode is made of thoriated tungsten, which acts as an improved source of electrons at lower temperature, so high power operation is possible without affecting the structural integrity of the electron gun. The design issues are addressed based on the uniformity of temperature on the solid cathode and the single long filament based design. The design approach consists of simulation followed by extensive experimentation. In the design, effort has been put into tailoring the non-uniformity of the heat flux from the filament to the solid cathode to obtain better temperature uniformity on the solid cathode. Trial beam experiments have been carried out, and it is seen that the modified design achieves a one-to-one correspondence between the solid cathode length and the electron beam length. PMID:25725898