Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques
US Army Research Laboratory, ARL-TR-8225
2017-11-01
Constraint-based component-modeling for knowledge-based design
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1992-01-01
The paper describes the application of various advanced programming techniques derived from artificial intelligence research to the development of flexible design tools for conceptual design. Special attention is given to two techniques that appear to be readily applicable to such design tools: constraint propagation and object-oriented programming. The implementation of these techniques in a prototype computer tool, Rubber Airplane, is described.
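The two techniques the abstract highlights can be sketched together in a few lines: design objects hold parameter values, and constraints propagate a value as soon as exactly one participant is unknown. All class and parameter names below are illustrative, not taken from Rubber Airplane.

```python
# Minimal sketch of constraint propagation over object-oriented design
# components. Names and the wing-loading example are hypothetical.

class Constraint:
    """Relates named parameters; solves for whichever one is missing."""
    def __init__(self, params, solvers):
        self.params = params      # parameter names involved
        self.solvers = solvers    # {name: fn(values) -> value}

    def propagate(self, values):
        unknown = [p for p in self.params if p not in values]
        if len(unknown) == 1:     # exactly one unknown -> solve for it
            p = unknown[0]
            values[p] = self.solvers[p](values)
            return True
        return False

class Component:
    """A design object holding parameter values and constraints."""
    def __init__(self):
        self.values = {}
        self.constraints = []

    def set(self, name, value):
        self.values[name] = value
        changed = True
        while changed:            # fixed-point propagation
            changed = any(c.propagate(self.values) for c in self.constraints)

# Example: wing loading relates weight and area (W/S = loading).
wing = Component()
wing.constraints.append(Constraint(
    ["weight", "area", "loading"],
    {"loading": lambda v: v["weight"] / v["area"],
     "area":    lambda v: v["weight"] / v["loading"],
     "weight":  lambda v: v["loading"] * v["area"]}))
wing.set("weight", 1000.0)
wing.set("loading", 50.0)        # area is now deduced automatically
```

Setting any two of the three parameters is enough; the constraint fills in the third.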
Memristor-Based Computing Architecture: Design Methodologies and Circuit Techniques
2013-03-01
MEMRISTOR-BASED COMPUTING ARCHITECTURE: DESIGN METHODOLOGIES AND CIRCUIT TECHNIQUES. Polytechnic Institute of New York University. Technical report; dates covered: Oct 2010 - Oct 2012. ...schemes for a memristor-based reconfigurable architecture design have not been fully explored yet. Therefore, in this project, we investigated
Issues to Consider in Designing WebQuests: A Literature Review
ERIC Educational Resources Information Center
Kurt, Serhat
2012-01-01
A WebQuest is an inquiry-based online learning technique. This technique has been widely adopted in K-16 education. Therefore, it is important that conditions of effective WebQuest design are defined. Through this article the author presents techniques for improving WebQuest design based on current research. More specifically, the author analyzes…
Wood lens design philosophy based on a binary additive manufacturing technique
NASA Astrophysics Data System (ADS)
Marasco, Peter L.; Bailey, Christopher
2016-04-01
Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.
The role of optimization in the next generation of computer-based design tools
NASA Technical Reports Server (NTRS)
Rogan, J. Edward
1989-01-01
There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.
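The landing-gear layout example lends itself to a toy sketch: a surrogate weight objective minimized over a feasible grid of gear positions, subject to a tip-back-angle constraint. The objective, constraint, and numbers below are invented stand-ins, not the report's formulation.

```python
# Hypothetical toy version of a main-landing-gear layout problem:
# choose gear longitudinal offset x and strut length l (both in meters,
# illustrative) to minimize a surrogate weight while keeping a minimum
# tip-back angle between the CG and the gear contact point.
import math

def weight(x, l):                      # surrogate objective, not the report's
    return 10.0 * l + 2.0 * x

def tipback_ok(x, l, min_deg=15.0):    # feasibility: CG-to-gear angle
    return math.degrees(math.atan2(x, l)) >= min_deg

# Brute-force search over the feasible layout grid (0.5 m .. 3.0 m)
candidates = [(weight(x / 10, l / 10), x / 10, l / 10)
              for x in range(5, 31) for l in range(5, 31)
              if tipback_ok(x / 10, l / 10)]
w_best, x_best, l_best = min(candidates)
```

In practice an approximation of the objective would replace the grid search, which is the role the report assigns to its approximation technique.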
Design of Composite Structures Using Knowledge-Based and Case Based Reasoning
NASA Technical Reports Server (NTRS)
Lambright, Jonathan Paul
1996-01-01
A method of using knowledge-based and case-based reasoning to assist designers during conceptual design tasks of composite structures was proposed. The cooperative use of heuristics, procedural knowledge, and previous similar design cases suggests a potential reduction in design cycle time and ultimately product lead time. The hypothesis of this work is that the design process of composite structures can be improved by using Case-Based Reasoning (CBR) and Knowledge-Based (KB) reasoning in the early design stages. The technique of using knowledge-based and case-based reasoning facilitates the gathering of disparate information into one location that is easily and readily available. The method suggests that the inclusion of downstream life-cycle issues in the conceptual design phase reduces the potential for defective and sub-optimal composite structures. Three industry experts were interviewed extensively. The experts provided design rules, previous design cases, and test problems. A knowledge-based reasoning system was developed using the CLIPS (C Language Integrated Production System) environment, and a case-based reasoning system was developed using the Design Memory Utility for Sharing Experiences (MUSE) environment. A Design Characteristic State (DCS) was used to document the design specifications, constraints, and problem areas using attribute-value pair relationships. The DCS provided consistent design information between the knowledge base and case base. Results indicated that the use of knowledge-based and case-based reasoning provided a robust design environment for composite structures. The knowledge base provided design guidance from well-defined rules and procedural knowledge. The case base provided suggestions on design and manufacturing techniques based on previous similar designs, along with warnings of potential problems and pitfalls.
The case base complemented the knowledge base and extended the problem-solving capability beyond the existence of limited well-defined rules. The findings indicated that the technique is most effective when used as a design aid and not as a tool to totally automate the composites design process. Other areas of application and implications for future research are discussed.
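The attribute-value retrieval at the heart of the case-based side can be sketched briefly; the cases, attributes, and advice strings below are invented for illustration, not drawn from the MUSE case base.

```python
# Minimal case-based retrieval over attribute-value pairs, in the spirit
# of the Design Characteristic State described above. All data invented.

def similarity(query, case):
    """Fraction of the query's attribute-value pairs matched by the case."""
    matches = sum(1 for k, v in query.items() if case.get(k) == v)
    return matches / len(query)

def retrieve(query, case_base):
    """Return the stored case most similar to the query."""
    return max(case_base, key=lambda c: similarity(query, c["attributes"]))

case_base = [
    {"attributes": {"material": "graphite-epoxy", "layup": "quasi-isotropic",
                    "load": "compression"},
     "advice": "watch for buckling; add core stiffening"},
    {"attributes": {"material": "glass-epoxy", "layup": "unidirectional",
                    "load": "tension"},
     "advice": "check ply drop-offs near joints"},
]
query = {"material": "graphite-epoxy", "load": "compression"}
best = retrieve(query, case_base)   # most similar prior design case
```

A real system would weight attributes and adapt the retrieved case; this shows only the retrieval step.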
A decision-based perspective for the design of methods for systems design
NASA Technical Reports Server (NTRS)
Mistree, Farrokh; Muster, Douglas; Shupe, Jon A.; Allen, Janet K.
1989-01-01
Organization of material, a definition of decision based design, a hierarchy of decision based design, the decision support problem technique, a conceptual model design that can be manufactured and maintained, meta-design, computer-based design, action learning, and the characteristics of decisions are among the topics covered.
ERIC Educational Resources Information Center
Smith, Shaunna
2018-01-01
In the context of a 10-day summer camp makerspace experience that employed design-based learning (DBL) strategies, the purpose of this descriptive case study was to better understand the ways in which children use visualization skills to negotiate design as they move back and forth between the world of nondigital design techniques (i.e., drawing,…
A knowledge based system for scientific data visualization
NASA Technical Reports Server (NTRS)
Senay, Hikmet; Ignatius, Eve
1992-01-01
A knowledge-based system, called visualization tool assistant (VISTA), which was developed to assist scientists in the design of scientific data visualization techniques, is described. The system derives its knowledge from several sources which provide information about data characteristics, visualization primitives, and effective visual perception. The design methodology employed by the system is based on a sequence of transformations which decomposes a data set into a set of data partitions, maps this set of partitions to visualization primitives, and combines these primitives into a composite visualization technique design. Although the primary function of the system is to generate an effective visualization technique design for a given data set using principles of visual perception, the system also allows users to interactively modify the design and renders the resulting image using a variety of rendering algorithms. The current version of the system primarily supports visualization techniques applicable in the earth and space sciences, although it may easily be extended to include techniques useful in other disciplines such as computational fluid dynamics, finite-element analysis, and medical imaging.
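The partition-to-primitive mapping step the abstract describes can be sketched as a small rule base ranked by perceptual effectiveness. The rules and rankings below are illustrative (loosely following Cleveland-and-McGill-style orderings), not VISTA's actual knowledge.

```python
# Toy rule base: (data type, visual primitive) -> effectiveness rank.
# Higher rank = perceptually more effective for that data type.
EFFECTIVENESS = {
    ("quantitative", "position"): 3,
    ("quantitative", "length"):   2,
    ("quantitative", "color"):    1,
    ("nominal", "color"):         3,
    ("nominal", "shape"):         2,
    ("nominal", "position"):      1,
}

def assign_primitives(partitions):
    """Greedily map each data partition to the best unused visual primitive."""
    used = set()
    design = {}
    for name, dtype in partitions:
        candidates = sorted(
            ((rank, prim) for (t, prim), rank in EFFECTIVENESS.items()
             if t == dtype and prim not in used),
            reverse=True)
        rank, prim = candidates[0]   # take the most effective remaining
        design[name] = prim
        used.add(prim)
    return design

design = assign_primitives([("temperature", "quantitative"),
                            ("land_cover", "nominal")])
```

The composite design step would then combine the chosen primitives into one rendering; only the mapping is shown here.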
NASA Astrophysics Data System (ADS)
Ibraheem, Omveer, Hasan, N.
2010-10-01
A new hybrid stochastic search technique is proposed for the design of a suboptimal AGC regulator for a two-area interconnected non-reheat thermal power system incorporating a DC link in parallel with the AC tie-line. In this technique, we propose a hybrid of a Genetic Algorithm (GA) and simulated annealing (SA) to form the regulator. GASA has been successfully applied to constrained feedback control problems where other PI-based techniques have often failed. The main idea in this scheme is to seek a feasible PI-based suboptimal solution at each sampling time; the feasible solution decreases the cost function rather than minimizing it.
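The hybrid GA-plus-SA search can be sketched compactly: a genetic algorithm proposes PI gains, and a simulated-annealing acceptance rule occasionally keeps worse mutants to escape local minima. The quadratic cost below is a stand-in for the AGC performance index, and all parameter values are illustrative.

```python
# Toy GASA sketch: GA selection/mutation with SA acceptance, searching
# for PI gains (kp, ki). Cost function and constants are hypothetical.
import math
import random

def cost(gains):
    kp, ki = gains                      # stand-in cost, optimum at (2, 0.5)
    return (kp - 2.0) ** 2 + (ki - 0.5) ** 2

def gasa(pop_size=20, generations=60, temp=1.0, cooling=0.95, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 5), rng.uniform(0, 2)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[:pop_size // 2]   # selection: keep the better half
        children = []
        for p in parents:
            child = (p[0] + rng.gauss(0, 0.2),   # Gaussian mutation
                     p[1] + rng.gauss(0, 0.2))
            delta = cost(child) - cost(p)
            # SA acceptance: take improvements; sometimes accept worse
            if delta < 0 or rng.random() < math.exp(-delta / temp):
                children.append(child)
            else:
                children.append(p)
        pop = parents + children
        temp *= cooling                  # cool the acceptance schedule
    return min(pop, key=cost)

best = gasa()
```

In the paper's setting, `cost` would be replaced by a simulation of the two-area system's frequency response.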
Optimization of Turbine Blade Design for Reusable Launch Vehicles
NASA Technical Reports Server (NTRS)
Shyy, Wei
1998-01-01
To facilitate design optimization of turbine blade shape for reusable launch vehicles, appropriate techniques need to be developed to process and estimate the characteristics of the design variables and the response of the output with respect to variations of the design variables. The purpose of this report is to offer insight into developing appropriate techniques for supporting such design and optimization needs. Neural-network and polynomial-based techniques are applied to process aerodynamic data obtained from computational simulations for flows around a two-dimensional airfoil and a generic three-dimensional wing/blade. For the two-dimensional airfoil, a two-layered radial-basis network is designed and trained, and the performances of two different design functions for radial-basis networks are compared: one based on an accuracy requirement, the other on a limit on the network size. While the number of neurons needed to satisfactorily reproduce the information depends on the size of the data, the neural-network technique is shown to be more accurate for large data sets (up to 765 simulations have been used) than the polynomial-based response surface method. For the three-dimensional wing/blade case, smaller aerodynamic data sets (between 9 and 25 simulations) are considered, and both the neural-network and the polynomial-based response surface techniques improve their performance as the data size increases. It is found that, while the relative performance of two different network types, a radial-basis network and a back-propagation network, depends on the number of input data, the number of iterations required for the radial-basis network is less than that for the back-propagation network.
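The radial-basis response-surface idea can be illustrated in one dimension: fit Gaussian basis weights to sampled "simulation" data by least squares, then query the surrogate. The sine response below is a stand-in for an aerodynamic quantity, and the widths and center counts are illustrative.

```python
# Minimal radial-basis surrogate: Gaussian bases fit by least squares.
import numpy as np

def rbf_fit(x, y, centers, width=0.3):
    """Solve for weights of Gaussian radial basis functions."""
    phi = np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)
    w, *_ = np.linalg.lstsq(phi, y, rcond=None)
    return w

def rbf_predict(xq, centers, w, width=0.3):
    phi = np.exp(-((xq[:, None] - centers[None, :]) / width) ** 2)
    return phi @ w

x = np.linspace(0.0, 1.0, 20)           # sampled design points
y = np.sin(2 * np.pi * x)               # stand-in response (e.g. a lift coeff.)
centers = np.linspace(0.0, 1.0, 10)     # the network's "neurons"
w = rbf_fit(x, y, centers)
pred = rbf_predict(np.array([0.25]), centers, w)   # near sin(pi/2) = 1
```

Trading the number of centers against accuracy is exactly the size-versus-accuracy choice the report studies.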
From laptop to benchtop to bedside: Structure-based Drug Design on Protein Targets
Chen, Lu; Morrow, John K.; Tran, Hoang T.; Phatak, Sharangdhar S.; Du-Cuny, Lei; Zhang, Shuxing
2013-01-01
As an important aspect of computer-aided drug design, structure-based drug design brought a new horizon to pharmaceutical development. This in silico method permeates all aspects of drug discovery today, including lead identification, lead optimization, ADMET prediction, and drug repurposing. Structure-based drug design has resulted in fruitful successes in drug discovery targeting protein-ligand and protein-protein interactions. Meanwhile, challenges, notably low accuracy and combinatoric issues, may also cause failures. In this review, state-of-the-art techniques for protein modeling (e.g., structure prediction, modeling protein flexibility), hit identification/optimization (e.g., molecular docking, focused library design, fragment-based design, molecular dynamics), and polypharmacology design will be discussed. We will explore how structure-based techniques can facilitate the drug discovery process and interplay with other experimental approaches. PMID:22316152
Singh, Jay; Chatterjee, Kalyan; Vishwakarma, C B
2018-01-01
A load frequency controller has been designed for reduced-order models of single-area and two-area reheat hydro-thermal power systems through internal model control - proportional integral derivative (IMC-PID) control techniques. The controller design method is based on two-degree-of-freedom (2DOF) internal model control combined with a model order reduction technique. Here, instead of the full-order system model, a reduced-order model is used for the 2DOF-IMC-PID design, and the designed controller is applied directly to the full-order system model. A logarithm-based model order reduction technique is proposed to reduce the single- and two-area high-order power systems for the application of controller design. The proposed IMC-PID design based on the reduced-order model achieves good dynamic response and robustness against load disturbance with the original high-order system. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
Design Tools for Reconfigurable Hardware in Orbit (RHinO)
NASA Technical Reports Server (NTRS)
French, Mathew; Graham, Paul; Wirthlin, Michael; Larchev, Gregory; Bellows, Peter; Schott, Brian
2004-01-01
The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable-gate-array (FPGA) technology. These tools leverage an established FPGA design environment and focus primarily on space-effects mitigation and power optimization. The project is creating software to automatically test and evaluate the single-event-upset (SEU) sensitivities of an FPGA design and insert mitigation techniques. Extensions to the tool suite will also allow evolvable-algorithm techniques to reconfigure around single-event-latchup (SEL) events. In the power domain, tools are being created for dynamic power visualization and optimization. Thus, this technology seeks to enable the use of reconfigurable hardware in orbit, via an integrated design tool suite aiming to reduce the risk, cost, and design time of multimission reconfigurable space processors using SRAM-based FPGAs.
Artificial Intelligence Techniques: Applications for Courseware Development.
ERIC Educational Resources Information Center
Dear, Brian L.
1986-01-01
Introduces some general concepts and techniques of artificial intelligence (natural language interfaces, expert systems, knowledge bases and knowledge representation, heuristics, user-interface metaphors, and object-based environments) and investigates ways these techniques might be applied to analysis, design, development, implementation, and…
Approximate analytical relationships for linear optimal aeroelastic flight control laws
NASA Astrophysics Data System (ADS)
Kassem, Ayman Hamdy
1998-09-01
This dissertation introduces new methods to uncover functional relationships between design parameters of a contemporary control design technique and the resulting closed-loop properties. Three new methods are developed for generating such relationships through analytical expressions: the Direct Eigen-Based Technique, the Order of Magnitude Technique, and the Cost Function Imbedding Technique. Efforts concentrated on the linear-quadratic state-feedback control-design technique applied to an aeroelastic flight control task. For this specific application, simple and accurate analytical expressions for the closed-loop eigenvalues and zeros in terms of basic parameters such as stability and control derivatives, structural vibration damping and natural frequency, and cost function weights are generated. These expressions explicitly indicate how the weights augment the short period and aeroelastic modes, as well as the closed-loop zeros, and by what physical mechanism. The analytical expressions are used to address topics such as damping, nonminimum phase behavior, stability, and performance with robustness considerations, and design modifications. This type of knowledge is invaluable to the flight control designer and would be more difficult to formulate when obtained from numerical-based sensitivity analysis.
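The relationship the dissertation derives analytically, i.e. how the LQ cost-function weights move the closed-loop modes, can be checked numerically in a few lines. The two-state lightly damped model below is generic, not the aeroelastic model from the dissertation.

```python
# Numerical check: increasing the LQ state weight drives the closed-loop
# poles of a lightly damped mode further into the left half-plane.
# Model matrices are illustrative stand-ins.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [-4.0, -0.2]])          # wn = 2 rad/s, light damping
B = np.array([[0.0], [1.0]])
R = np.array([[1.0]])

def closed_loop_eigs(q11):
    Q = np.diag([q11, 0.0])
    P = solve_continuous_are(A, B, Q, R)   # Riccati solution
    K = np.linalg.solve(R, B.T @ P)        # optimal state-feedback gain
    return np.linalg.eigvals(A - B @ K)

eigs_low = closed_loop_eigs(0.1)
eigs_high = closed_loop_eigs(100.0)
# larger state weighting pushes the closed-loop poles further left
```

The dissertation's contribution is to make this weight-to-pole mapping explicit in closed form; the sketch only reproduces the trend numerically.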
An Electronic Engineering Curriculum Design Based on Concept-Mapping Techniques
ERIC Educational Resources Information Center
Toral, S. L.; Martinez-Torres, M. R.; Barrero, F.; Gallardo, S.; Duran, M. J.
2007-01-01
Curriculum design is a concern in European Universities as they face the forthcoming European Higher Education Area (EHEA). This process can be eased by the use of scientific tools such as Concept-Mapping Techniques (CMT) that extract and organize the most relevant information from experts' experience using statistics techniques, and helps a…
Dual-band frequency selective surface with large band separation and stable performance
NASA Astrophysics Data System (ADS)
Zhou, Hang; Qu, Shao-Bo; Peng, Wei-Dong; Lin, Bao-Qin; Wang, Jia-Fu; Ma, Hua; Zhang, Jie-Qiu; Bai, Peng; Wang, Xu-Hua; Xu, Zhuo
2012-05-01
A new technique of designing a dual-band frequency selective surface with large band separation is presented. This technique is based on a delicately designed topology of L- and Ku-band microwave filters. The two band-pass responses are generated by a capacitively-loaded square-loop frequency selective surface and an aperture-coupled frequency selective surface, respectively. A Faraday cage is located between the two frequency selective surface structures to eliminate undesired couplings. Based on this technique, a dual-band frequency selective surface with large band separation is designed, which possesses large band separation, high selectivity, and stable performance under various incident angles and different polarizations.
[An object-oriented intelligent engineering design approach for lake pollution control].
Zou, Rui; Zhou, Jing; Liu, Yong; Zhu, Xiang; Zhao, Lei; Yang, Ping-Jian; Guo, Huai-Cheng
2013-03-01
Regarding the shortage and deficiency of traditional lake pollution control engineering techniques, a new lake pollution control engineering approach was proposed in this study, based on object-oriented intelligent design (OOID) from the perspective of intelligence. It can provide a new methodology and framework for effectively controlling lake pollution and improving water quality. The differences between the traditional engineering techniques and the OOID approach were compared. The key points of OOID were described as the object perspective, a cause-and-effect foundation, aggregating points into surfaces, and temporal and spatial optimization. Blue-algae control in a lake was taken as an example in this study. The effects of algae control and water quality improvement were analyzed in detail from the perspective of object-oriented intelligent design based on two engineering techniques (vertical hydrodynamic mixers and pumped algaecide recharge). The modeling results showed that the traditional engineering design paradigm cannot provide scientific and effective guidance for engineering design and decision-making regarding lake pollution. The intelligent design approach is based on the object perspective and quantitative causal analysis in this case. This approach identified that the efficiency of mixers was much higher than that of pumps in achieving low-to-moderate water quality improvement. However, when the water quality objective exceeded a certain value (such as a control objective for peak Chla concentration exceeding 100 microg x L(-1) in this experimental water), the mixers could not achieve the goal. The pump technique could achieve the goal, but at higher cost. The efficiency of combining the two techniques was higher than that of using either technique alone. Moreover, the quantitative scale control of the two engineering techniques has a significant impact on the actual project benefits and costs.
Design of secondary and subdivision roads in Virginia based on thickness equivalency values.
DOT National Transportation Integrated Search
1971-01-01
The design of secondary and subdivision roads in Virginia is based on the design charts recommended by the Highway Department. In view of recently gained knowledge of materials and design techniques, the Pavement Research Advisory Committee requested...
2006-03-31
From existing image steganography and steganalysis techniques, the overall objective of Task (b) is to design and implement audio steganography in... The general design of the VoIP steganography algorithm is based on known LSB hiding techniques (used, for example, in StegHide (http...) system. Nasir Memon et al. described a steganalyzer based on image quality metrics [AMS03]. Basically, the main idea to detect steganography by
Technique for Early Reliability Prediction of Software Components Using Behaviour Models
Ali, Awad; N. A. Jawawi, Dayang; Adham Isa, Mohd; Imran Babar, Muhammad
2016-01-01
Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction. PMID:27668748
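The CPDG evaluation can be sketched as a stack-based traversal that sums, over execution paths, the product of node reliabilities and transition probabilities. The graph, probabilities, and reliabilities below are invented for illustration and assume an acyclic graph (the paper's loop entry/exit handling is omitted).

```python
# Toy CPDG reliability: explicit stack, path-by-path accumulation.
# Node reliabilities and transition probabilities are illustrative.

def system_reliability(graph, rel, start, end):
    """Sum over paths: product of node reliabilities x transition probs."""
    total = 0.0
    stack = [(start, rel[start])]       # (node, accumulated reliability)
    while stack:
        node, acc = stack.pop()
        if node == end:
            total += acc
            continue
        for nxt, p in graph.get(node, []):
            stack.append((nxt, acc * p * rel[nxt]))
    return total

rel = {"A": 0.99, "B": 0.95, "C": 0.90, "D": 0.98}
graph = {"A": [("B", 0.6), ("C", 0.4)],   # A branches to B or C
         "B": [("D", 1.0)],
         "C": [("D", 1.0)]}
r = system_reliability(graph, rel, "A", "D")
```

The outgoing transition probabilities from each node sum to one, so `r` is the expected end-to-end reliability over both execution paths.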
Synthesis of concentric circular antenna arrays using dragonfly algorithm
NASA Astrophysics Data System (ADS)
Babayigit, B.
2018-05-01
Due to the strong non-linear relationship between the array factor and the array elements, the concentric circular antenna array (CCAA) synthesis problem is challenging. Nature-inspired optimisation techniques have been playing an important role in solving array synthesis problems. The dragonfly algorithm (DA) is a novel nature-inspired optimisation technique based on the static and dynamic swarming behaviours of dragonflies in nature. This paper presents the design of CCAAs with low sidelobes using DA. The effectiveness of the proposed DA is investigated in two different cases (with and without a centre element) for two three-ring CCAA designs (having 4-, 6-, 8-element or 8-, 10-, 12-element rings). The radiation pattern for each design case is obtained by finding optimal excitation weights of the array elements using DA. Simulation results show that the proposed algorithm outperforms the other state-of-the-art techniques (symbiotic organisms search, biogeography-based optimisation, sequential quadratic programming, opposition-based gravitational search algorithm, cat swarm optimisation, firefly algorithm, evolutionary programming) for all design cases. DA can be a promising technique for electromagnetic problems.
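Whatever optimizer is used, CCAA synthesis repeatedly evaluates the ring-array factor inside its cost function. A minimal sketch of that evaluation, for a single azimuth cut with illustrative geometry and uniform weights, is:

```python
# Array factor of concentric circular rings, evaluated along one azimuth
# cut (phi = 0). Ring radii (in wavelengths) and weights are illustrative.
import numpy as np

def ccaa_array_factor(rings, theta, wavelength=1.0):
    """|AF| of concentric rings; rings = [(radius, excitation weights)]."""
    k = 2 * np.pi / wavelength
    af = np.zeros_like(theta, dtype=complex)
    for radius, weights in rings:
        n = len(weights)
        phi = 2 * np.pi * np.arange(n) / n          # element angular positions
        for w, p in zip(weights, phi):
            af += w * np.exp(1j * k * radius * np.sin(theta) * np.cos(p))
    return np.abs(af)

theta = np.linspace(-np.pi / 2, np.pi / 2, 181)
rings = [(0.5, np.ones(4)), (1.0, np.ones(6)), (1.5, np.ones(8))]
af = ccaa_array_factor(rings, theta)
# broadside (theta = 0) carries the peak for uniform excitation
```

An optimizer such as DA would perturb the `weights` arrays and score each candidate by its sidelobe level computed from `af`.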
Metamodels for Computer-Based Engineering Design: Survey and Recommendations
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
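The response-surface methodology surveyed here can be sketched end to end: run a small design of experiments against an "expensive" analysis, fit a second-order polynomial by least squares, and query the cheap metamodel. The analysis function below is a quadratic stand-in, so the fit is exact.

```python
# Response-surface metamodel sketch: DOE -> least-squares quadratic fit.
import numpy as np

def expensive_analysis(x):            # stand-in for a long-running solver
    return 3.0 + 2.0 * x + 0.5 * x ** 2

# Design of experiments: a handful of sample points
x_doe = np.linspace(-2.0, 2.0, 7)
y_doe = expensive_analysis(x_doe)

# Second-order response surface: least-squares fit of [1, x, x^2]
A = np.vander(x_doe, 3, increasing=True)      # columns: 1, x, x^2
coef, *_ = np.linalg.lstsq(A, y_doe, rcond=None)

def metamodel(x):
    return coef[0] + coef[1] * x + coef[2] * x ** 2

approx = metamodel(1.5)               # cheap surrogate query
```

The danger the paper discusses arises when the true response is not polynomial: a deterministic code has no random error, so classical fit statistics can mislead.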
Theoretical performance analysis of doped optical fibers based on pseudo parameters
NASA Astrophysics Data System (ADS)
Karimi, Maryam; Seraji, Faramarz E.
2010-09-01
Characterization of doped optical fibers (DOFs) is an essential primary stage in the design of DOF-based devices. This paper presents the design of novel measurement techniques to determine DOF parameters using mono-beam propagation in a low-loss medium by generating pseudo parameters for the DOFs. The designed techniques are able to characterize simultaneously the absorption and emission cross-sections (ACS and ECS) and the dopant concentration of DOFs. In both proposed techniques, we assume pseudo parameters for the DOFs instead of their actual values and show that the choice of these pseudo parameter values for the design of DOF-based devices, such as the erbium-doped fiber amplifier (EDFA), is appropriate, and the resulting error is quite negligible when compared with the actual parameter values. Utilization of pseudo ACS and ECS values in the design procedure of EDFAs does not require measurement of the background loss coefficient (BLC) and simplifies the rate equation of the DOFs. It is shown that by using the pseudo parameter values obtained by the proposed techniques, the error in the gain of a designed EDFA with a BLC of about 1 dB/km is about 0.08 dB. It is further indicated that the same scenario holds good for BLCs lower than 5 dB/m and higher than 12 dB/m. The proposed characterization techniques have simple procedures and are low-cost, which can be advantageous in the manufacturing of DOFs.
NASA Astrophysics Data System (ADS)
Parvathi, S. P.; Ramanan, R. V.
2018-06-01
An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of Earth and third-body perturbations due to the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without including the perturbations and is then modified to include them. The modification is based on (i) backward analytical propagation of the state vector obtained from the iterative patched conic technique at the sphere of influence, including the perturbations, and (ii) quantification of deviations in the orbital elements at the periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend upon numerical integration, and all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. Design analysis using the proposed technique provides a realistic insight into the mission aspects. Also, the proposed design is an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.
Evolutionary and biological metaphors for engineering design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jakiela, M.
1994-12-31
Since computing became generally available, there has been strong interest in using computers to assist and automate engineering design processes. Specifically, for design optimization and automation, nonlinear programming and artificial intelligence techniques have been extensively studied. New computational techniques, based upon the natural processes of evolution, adaptation, and learning, are showing promise because of their generality and robustness. This presentation will describe the use of two such techniques, genetic algorithms and classifier systems, for a variety of engineering design problems. Structural topology optimization, meshing, and general engineering optimization are shown as example applications.
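A genetic algorithm of the kind described reduces to a short loop: selection, crossover, and mutation over bitstring genomes. The OneMax-style objective below is a toy stand-in for, say, a structural topology bitmap; all parameters are illustrative.

```python
# Minimal genetic algorithm over bitstrings: truncation selection,
# one-point crossover, point mutation. Objective is a toy (count ones).
import random

def evolve(bits=20, pop_size=30, generations=40, seed=7):
    rng = random.Random(seed)
    fitness = lambda g: sum(g)                    # toy design objective
    pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, bits)          # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(bits)               # point mutation
            child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

For topology optimization, each bit would mark material presence in one cell of the design domain, and `fitness` would call a structural analysis.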
Structural reanalysis via a mixed method. [using Taylor series for accuracy improvement
NASA Technical Reports Server (NTRS)
Noor, A. K.; Lowder, H. E.
1975-01-01
A study is made of the approximate structural reanalysis technique based on the use of Taylor series expansion of response variables in terms of design variables in conjunction with the mixed method. In addition, comparisons are made with two reanalysis techniques based on the displacement method. These techniques are the Taylor series expansion and the modified reduced basis. It is shown that the use of the reciprocals of the sizing variables as design variables (which is the natural choice in the mixed method) can result in a substantial improvement in the accuracy of the reanalysis technique. Numerical results are presented for a space truss structure.
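The accuracy gain from expanding in reciprocal sizing variables can be shown with a one-member example: for a stress of the form F/A, the first-order Taylor series in r = 1/A is exact, while the series in A is not. Numbers are illustrative.

```python
# One-dimensional reanalysis sketch: Taylor expansion of member stress
# s = F/A in the sizing variable A versus in its reciprocal r = 1/A.
F = 1000.0                     # member force (stand-in value)
A0, A1 = 2.0, 1.5              # original and modified cross-section areas

exact = F / A1                 # fully re-solved response

# Expansion in A:  s(A) ~ s(A0) + ds/dA * (A - A0), with ds/dA = -F/A0^2
approx_direct = F / A0 + (-F / A0 ** 2) * (A1 - A0)

# Expansion in r = 1/A:  s(r) = F * r is linear, so the series is exact
r0, r1 = 1.0 / A0, 1.0 / A1
approx_reciprocal = F * r0 + F * (r1 - r0)
```

The direct expansion gives 625 against the exact 666.7, while the reciprocal expansion reproduces the exact value, which is the accuracy improvement the study reports for reciprocal design variables.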
Shape optimization of disc-type flywheels
NASA Technical Reports Server (NTRS)
Nizza, R. S.
1976-01-01
Techniques were developed for presenting an analytical and graphical means of selecting an optimum flywheel system design, based on system requirements, geometric constraints, and weight limitations. The techniques for creating an analytical solution are formulated from energy and structural principles. The resulting flywheel design relates stress and strain pattern distribution, operating speeds, geometry, and specific energy levels. The design techniques identify the lowest-stressed flywheel for any particular application and achieve the highest specific energy per unit flywheel weight possible. Stress and strain contour mapping and sectional profile plotting reflect the results of the structural behavior manifested under rotating conditions. This approach toward flywheel design is applicable to any metal flywheel, and permits the selection of the flywheel design to be based solely on the criteria of the system requirements that must be met, those that must be optimized, and those system parameters that may be permitted to vary.
A comparative study on stress and compliance based structural topology optimization
NASA Astrophysics Data System (ADS)
Hailu Shimels, G.; Dereje Engida, W.; Fakhruldin Mohd, H.
2017-10-01
Most structural topology optimization problems have been formulated and solved either to minimize compliance under a volume constraint or to minimize weight under stress constraints. Although much research has been conducted on these two formulation techniques separately, there is no clear comparative study between the two approaches. This paper compares these formulation techniques so that an end user or designer can choose the better one for the problem at hand. Benchmark problems under the same boundary and loading conditions are defined, solved, and their results compared under both formulations. Simulation results show that the two formulation techniques are sensitive to the type of loading and boundary conditions defined. The maximum stress induced in the design domain is higher when the design domain is formulated using compliance-based formulations. Optimal layouts from compliance minimization are more complex than stress-based ones, which may make manufacturing the optimal layouts challenging. Optimal layouts from compliance-based formulations depend on the amount of material to be distributed; optimal layouts from stress-based formulations depend on the type of material used to define the design domain. High computational time for stress-based topology optimization remains a challenge because stress constraints are defined at the element level. Results also show that adjusting the convergence criteria can be an alternative way to reduce the maximum stress developed in optimal layouts. Therefore, a designer or end user should choose a formulation based on the design domain defined and the boundary conditions considered.
Gaming against medical errors: methods and results from a design game on CPOE.
Kanstrup, Anne Marie; Nøhr, Christian
2009-01-01
The paper presents the design game as a technique for the participatory design of a Computerized Decision Support System (CDSS) for minimizing medical errors. The design game is used as a technique for working with the skills of users, the complexity of the use practice, and the negotiation of design, here within the challenging domain of medication. The paper presents a design game developed from game inspiration drawn from a computer game, theoretical inspiration on electronic decision support, and empirical grounding in scenarios of medical errors. The game was played in a two-hour workshop with six clinicians. The result is presented as a list of central themes for the design of a CDSS, together with design principles derived from these themes. These principles are currently under further exploration in follow-up prototype-based activities.
Kuldeep, B; Singh, V K; Kumar, A; Singh, G K
2015-01-01
In this article, a novel approach for the design of 2-channel linear-phase quadrature mirror filter (QMF) banks, based on a hybrid of gradient-based optimization and nature-inspired optimization under fractional derivative constraints, is introduced. For this work, recently proposed nature-inspired optimization techniques such as cuckoo search (CS), modified cuckoo search (MCS) and wind driven optimization (WDO) are explored for the design of the QMF bank. The 2-channel QMF bank is also designed with the particle swarm optimization (PSO) and artificial bee colony (ABC) nature-inspired optimization techniques. The design problem is formulated in the frequency domain as the sum of the L2 norms of the error in the passband and stopband and of the transition-band error at the quadrature frequency. The contribution of this work is the novel hybrid combination of gradient-based optimization (the Lagrange multiplier method) and nature-inspired optimization (CS, MCS, WDO, PSO and ABC), and its use for optimizing the design problem. Performance of the proposed method is evaluated by passband error (ϕp), stopband error (ϕs), transition band error (ϕt), peak reconstruction error (PRE), stopband attenuation (As) and computational time. The design examples illustrate the effectiveness of the proposed method. Results are also compared with other existing algorithms, and it was found that the proposed method gives the best results in terms of peak reconstruction error and transition-band error, while being comparable in terms of passband and stopband error. Results show that the proposed method is successful for both lower- and higher-order 2-channel QMF bank design. A comparative study of the various nature-inspired optimization techniques is also presented, and the study singles out CS as the best QMF optimization technique.
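As an illustrative sketch of the frequency-domain objective described in this abstract (the band edges, grid density and unit weights below are assumptions, not the paper's values), the sum of L2-norm band errors for a prototype filter h might be computed as:

```python
import numpy as np

def qmf_objective(h, wp=0.4 * np.pi, ws=0.6 * np.pi, n=512):
    """Sum of L2 band errors for a prototype lowpass filter h.

    phi_p: passband error  (|H| should be 1 on [0, wp])
    phi_s: stopband error  (|H| should be 0 on [ws, pi])
    phi_t: transition-band error at the quadrature frequency pi/2,
           where |H| should be 1/sqrt(2) for perfect reconstruction.
    Band edges and unit weights are illustrative assumptions.
    """
    h = np.asarray(h, dtype=float)
    w = np.linspace(0.0, np.pi, n)
    # H(e^{jw}) = sum_n h[n] e^{-jwn}
    H = np.abs(np.exp(-1j * np.outer(w, np.arange(len(h)))) @ h)
    dw = w[1] - w[0]
    phi_p = np.sum((H[w <= wp] - 1.0) ** 2) * dw
    phi_s = np.sum(H[w >= ws] ** 2) * dw
    Hq = np.abs(np.exp(-1j * (np.pi / 2) * np.arange(len(h))) @ h)
    phi_t = (Hq - 1.0 / np.sqrt(2.0)) ** 2
    return float(phi_p + phi_s + phi_t)
```

A simple halfband prototype such as `[0.5, 0.5]` scores better under this objective than a pass-through filter `[1.0]`, since the latter has no stopband attenuation.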
NASA Astrophysics Data System (ADS)
San-Blas, A. A.; Roca, J. M.; Cogollos, S.; Morro, J. V.; Boria, V. E.; Gimeno, B.
2016-06-01
In this work, a full-wave tool for the accurate analysis and design of compensated E-plane multiport junctions is proposed. The implemented tool is capable of evaluating the undesired effects related to the use of low-cost manufacturing techniques, which are mostly due to the introduction of rounded corners in the cross section of the rectangular waveguides of the device. The obtained results show that, although stringent mechanical effects are imposed, it is possible to compensate for the impact of the cited low-cost manufacturing techniques by redesigning the matching elements considered in the original device. Several new designs concerning a great variety of E-plane components (such as right-angled bends, T-junctions and magic-Ts) are presented, and useful design guidelines are provided. The implemented tool, which is mainly based on the boundary integral-resonant mode expansion technique, has been successfully validated by comparing the obtained results to simulated data provided by commercial software based on the finite element method.
Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lower computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach.
A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
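In generic RBDO notation (symbols assumed, not the dissertation's), the nested formulation and the unilevel idea of replacing the inner reliability problem by its first-order optimality conditions can be sketched as:

```latex
% Nested (double-loop) formulation: each reliability constraint hides an
% inner first-order reliability (FORM) problem in standard-normal space u.
\min_{\mathbf{d}} \; f(\mathbf{d})
\quad \text{s.t.} \quad \beta_i(\mathbf{d}) \ge \beta_i^{t},
\qquad
\beta_i(\mathbf{d}) = \min_{\mathbf{u}} \; \|\mathbf{u}\|
\;\; \text{s.t.} \;\; g_i(\mathbf{d},\mathbf{u}) = 0

% Unilevel reformulation: replace each inner problem by its first-order
% optimality conditions, yielding a single-level problem in (d, u_i).
\min_{\mathbf{d},\,\mathbf{u}_i} \; f(\mathbf{d})
\quad \text{s.t.} \quad \|\mathbf{u}_i\| \ge \beta_i^{t},
\quad g_i(\mathbf{d},\mathbf{u}_i) = 0,
\quad \mathbf{u}_i \,\|\nabla_{\mathbf{u}} g_i\| + \|\mathbf{u}_i\|\, \nabla_{\mathbf{u}} g_i = \mathbf{0}
```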
Reduced-order modeling for hyperthermia: an extended balanced-realization-based approach.
Mattingly, M; Bailey, E A; Dutton, A W; Roemer, R B; Devasia, S
1998-09-01
Accurate thermal models are needed in hyperthermia cancer treatments for such tasks as actuator and sensor placement design, parameter estimation, and feedback temperature control. The complexity of the human body produces full-order models which are too large for effective execution of these tasks, making use of reduced-order models necessary. However, standard balanced-realization (SBR)-based model reduction techniques require a priori knowledge of the particular placement of actuators and sensors for model reduction. Since placement design is intractable (computationally) on the full-order models, SBR techniques must use ad hoc placements. To alleviate this problem, an extended balanced-realization (EBR)-based model-order reduction approach is presented. The new technique allows model order reduction to be performed over all possible placement designs and does not require ad hoc placement designs. It is shown that models obtained using the EBR method are more robust to intratreatment changes in the placement of the applied power field than those models obtained using the SBR method.
Hybrid real-code ant colony optimisation for constrained mechanical design
NASA Astrophysics Data System (ADS)
Pholdee, Nantiwat; Bureerat, Sujin
2016-01-01
This paper proposes a hybrid meta-heuristic based on integrating a local search simplex downhill (SDH) method into the search procedure of real-code ant colony optimisation (ACOR). This hybridisation leads to five hybrid algorithms where a Monte Carlo technique, a Latin hypercube sampling technique (LHS) and a translational propagation Latin hypercube design (TPLHD) algorithm are used to generate an initial population. Also, two numerical schemes for selecting an initial simplex are investigated. The original ACOR and its hybrid versions along with a variety of established meta-heuristics are implemented to solve 17 constrained test problems where a fuzzy set theory penalty function technique is used to handle design constraints. The comparative results show that the hybrid algorithms are the top performers. Using the TPLHD technique gives better results than the other sampling techniques. The hybrid optimisers are a powerful design tool for constrained mechanical design problems.
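A minimal sketch of the hybridisation idea described above (LHS initialisation, ACOR Gaussian-kernel sampling around a ranked archive, and a single simplex-style reflection per iteration as a crude stand-in for the full simplex downhill method; all parameter values are assumptions, not the paper's):

```python
import math
import random

def lhs(n, dim, lo, hi):
    """Latin hypercube sampling: one point per stratum per dimension."""
    cols = []
    for _ in range(dim):
        cuts = [(i + random.random()) / n for i in range(n)]
        random.shuffle(cuts)
        cols.append([lo + c * (hi - lo) for c in cuts])
    return [list(p) for p in zip(*cols)]

def acor_sdh(f, dim=2, lo=-5.0, hi=5.0, k_arch=10, ants=10,
             iters=100, q=0.2, xi=0.85):
    """ACOR with a simplex-style local move (illustrative sketch only)."""
    arch = sorted(lhs(k_arch, dim, lo, hi), key=f)
    # Rank-based kernel weights, as in standard ACOR
    wts = [math.exp(-(j * j) / (2.0 * (q * k_arch) ** 2)) for j in range(k_arch)]
    for _ in range(iters):
        # Sample ants from Gaussian kernels centred on archive solutions
        for _ in range(ants):
            x = []
            for d in range(dim):
                j = random.choices(range(k_arch), weights=wts)[0]
                mu = arch[j][d]
                sig = xi * sum(abs(a[d] - mu) for a in arch) / (k_arch - 1)
                x.append(min(hi, max(lo, random.gauss(mu, sig + 1e-12))))
            arch.append(x)
        arch = sorted(arch, key=f)[:k_arch]
        # Simplex-style move: reflect the worst point through the centroid
        cen = [sum(a[d] for a in arch[:-1]) / (k_arch - 1) for d in range(dim)]
        refl = [min(hi, max(lo, 2.0 * cen[d] - arch[-1][d])) for d in range(dim)]
        if f(refl) < f(arch[-1]):
            arch[-1] = refl
            arch.sort(key=f)
    return arch[0], f(arch[0])

random.seed(1)
sphere = lambda x: sum(v * v for v in x)
best, val = acor_sdh(sphere)  # val should be near zero on this smooth test
```

Constraint handling (the paper's fuzzy-set penalty function) is omitted here; bounds are enforced by simple clipping.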
Space shuttle recommendations based on aircraft maintenance experience
NASA Technical Reports Server (NTRS)
Spears, J. M.; Fox, C. L.
1972-01-01
Space shuttle design recommendations based on aircraft maintenance experience are developed. The recommendations are specifically applied to the landing gear system, nondestructive inspection techniques, hydraulic system design, materials and processes, and program support.
An analytical approach to test and design upper limb prosthesis.
Veer, Karan
2015-01-01
In this work, the signal acquisition technique, the analysis models and the design protocols of the prosthesis are discussed. Different methods for estimating the motion intended by the amputee from surface electromyogram (SEMG) signals, based on time- and frequency-domain parameters, are presented. The experiments showed that the techniques used can help significantly in discriminating the amputee's motions among four independent activities using a dual-channel set-up. Further, based on the experimental results, the design and working of an artificial arm are covered under two constituents: the electronics design and the mechanical assembly. Finally, the developed hand prosthesis allows amputated persons to perform daily routine activities easily.
NASA Astrophysics Data System (ADS)
Sánchez, H. T.; Estrems, M.; Franco, P.; Faura, F.
2009-11-01
In recent years, the heat exchanger market has been increasingly demanding new products with short cycle times, which means that both the design and manufacturing stages must be drastically shortened. The design stage can be shortened by means of CAD-based parametric design techniques. The methodology presented here is based on the optimized control of the geometric parameters of the service chamber of a heat exchanger by means of the Application Programming Interface (API) provided by the Solidworks CAD package. Using this implementation, a set of different design configurations of the service chamber, made of stainless steel AISI 316, is studied by means of the FE method. As a result of this study, a set of knowledge rules based on the fatigue behaviour is constructed and integrated into the design optimization process.
A knowledge-based tool for multilevel decomposition of a complex design problem
NASA Technical Reports Server (NTRS)
Rogers, James L.
1989-01-01
Although much work has been done in applying artificial intelligence (AI) tools and techniques to problems in different engineering disciplines, only recently has the application of these tools begun to spread to the decomposition of complex design problems. A new tool based on AI techniques has been developed to implement a decomposition scheme suitable for multilevel optimization and display of data in an N x N matrix format.
[A comprehensive approach to designing of magnetotherapy techniques based on the Atos device].
Raĭgorodskiĭ, Iu M; Semiachkin, G P; Tatarenko, D A
1995-01-01
The paper describes how to apply a comprehensive approach to designing magnetotherapy techniques based on concomitant exposure to two or more physical factors. It shows the advantages of the running pattern of a magnetic field and photostimuli in terms of optimizing physiotherapeutic exposures. The Atos apparatus with the Amblio-1 attachment is used as an example to demonstrate how to apply the comprehensive approach in ophthalmology.
A supportive architecture for CFD-based design optimisation
NASA Astrophysics Data System (ADS)
Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong
2014-03-01
Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO that requires the analysis of fluid dynamics poses a special challenge due to its extremely intensive computation. The rapid development of computational fluid dynamics (CFD) techniques has led to a rise in their application in various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, alongside analytical approaches and wind tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under the given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward. It is a challenge to solve a CFD-based design problem, which usually has high dimensionality and multiple objectives and constraints. It is therefore desirable to have an integrated architecture for CFD-based design optimisation. However, our review of existing works found that very few researchers have studied assistive tools to facilitate CFD-based design optimisation. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing techniques and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or at the data level to fully utilise the capabilities of the different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation.
To illustrate the effectiveness of the proposed architecture and algorithms, the case studies on aerodynamic shape design of a hypersonic cruising vehicle are provided, and the result has shown that the proposed architecture and developed algorithms have performed successfully and efficiently in dealing with the design optimisation with over 200 design variables.
Application of optimization techniques to vehicle design: A review
NASA Technical Reports Server (NTRS)
Prasad, B.; Magee, C. L.
1984-01-01
The work done in the last decade or so on the application of optimization techniques to vehicle design is discussed. Much of the work reviewed deals with the design of body or suspension (chassis) components for reduced weight. Also reviewed are studies dealing with system optimization problems for improved functional performance, such as ride or handling. In reviewing the work on the use of optimization techniques, one notes the transition from rare mention of the methods in the 1970s to an increased effort in the early 1980s. Efficient and convenient optimization and analysis tools still need to be developed so that they can be regularly applied in the early design stage of the vehicle development cycle, where they are most effective. Based on the reported applications, an attempt is made to assess the potential for automotive application of optimization techniques. The major issue remains the creation of quantifiable means of analysis for use in vehicle design. The conventional process of vehicle design still contains much experience-based input because it has not yet proven possible to quantify all important constraints. This limitation of the analysis will continue to be a major limiting factor in the application of optimization to vehicle design.
Lee, M-Y; Chang, C-C; Ku, Y C
2008-01-01
Fixed dental restoration by conventional methods relies greatly on the skill and experience of the dental technician. The quality and accuracy of the final product depend mostly on the technician's subjective judgment. In addition, the traditional manual operation involves many complex procedures, and is a time-consuming and labour-intensive job. Most importantly, no quantitative design and manufacturing information is preserved for future retrieval. In this paper, a new device for scanning the dental profile and reconstructing 3D digital information of a dental model based on a layer-based imaging technique, called abrasive computer tomography (ACT), was designed in-house and proposed for the design of custom dental restorations. The fixed partial dental restoration was then produced by rapid prototyping (RP) and computer numerical control (CNC) machining methods based on the ACT-scanned digital information. A force feedback sculptor (FreeForm system, Sensible Technologies, Inc., Cambridge MA, USA), which comprises 3D Touch technology, was applied to modify the morphology and design of the fixed dental restoration. In addition, a comparison of conventional manual operation and digital manufacture using both RP and CNC machining technologies for fixed dental restoration production is presented. Finally, a digital custom fixed restoration manufacturing protocol integrating the proposed layer-based dental profile scanning, computer-aided design, 3D force feedback feature modification and advanced fixed restoration manufacturing techniques is illustrated. The proposed method provides solid evidence that computer-aided design and manufacturing technologies may become a new avenue for custom-made fixed restoration design, analysis, and production in the 21st century.
Diffraction based overlay and image based overlay on production flow for advanced technology node
NASA Astrophysics Data System (ADS)
Blancquaert, Yoann; Dezauzier, Christophe
2013-04-01
One of the main challenges of the lithography step is overlay control. For advanced technology nodes like 28 nm and 14 nm, the overlay budget becomes very tight. Two overlay techniques compete in our advanced semiconductor manufacturing: diffraction-based overlay (DBO) with the YieldStar S200 (ASML) and image-based overlay (IBO) with the ARCHER (KLA). In this paper we compare these two methods on three critical production layers: poly gate, contact and first metal. We show the overlay results of the two techniques, explore their accuracy and compare the total measurement uncertainty (TMU) for the standard overlay targets of both techniques. We also examine the response and impact of the IBO and DBO techniques under a process change, such as an additional hardmask TEOS layer in the front-end stack. The importance of target design is addressed, and we propose a more suitable design for image-based targets. Finally, we present embedded targets in 14FDSOI, with first results.
NASA Astrophysics Data System (ADS)
Villanueva Perez, Carlos Hernan
Computational design optimization provides designers with automated techniques to develop novel and non-intuitive optimal designs. Topology optimization is a design optimization technique that allows for the evolution of a broad variety of geometries in the optimization process. Traditional density-based topology optimization methods often lack a sufficient resolution of the geometry and physical response, which prevents direct use of the optimized design in manufacturing and the accurate modeling of the physical response of boundary conditions. The goal of this thesis is to introduce a unified topology optimization framework that uses the Level Set Method (LSM) to describe the design geometry and the eXtended Finite Element Method (XFEM) to solve the governing equations and measure the performance of the design. The methodology is presented as an alternative to density-based optimization approaches, and is able to accommodate a broad range of engineering design problems. The framework presents state-of-the-art methods for immersed boundary techniques to stabilize the systems of equations and enforce the boundary conditions, and is studied with applications in 2D and 3D linear elastic structures, incompressible flow, and energy and species transport problems to test the robustness and the characteristics of the method. The framework is compared against density-based topology optimization approaches with regard to convergence, performance, and the capability to manufacture the designs. Furthermore, the ability to control the shape of the design to operate within manufacturing constraints is developed and studied. The analysis capability of the framework is validated quantitatively through comparison against previous benchmark studies, and qualitatively through its application to topology optimization problems. The design optimization problems converge to intuitive designs that closely resemble the results of previous 2D and density-based studies.
Al Mortadi, Noor; Eggbeer, Dominic; Lewis, Jeffrey; Williams, Robert J
2013-04-01
The aim of this study was to analyze the latest innovations in additive manufacture techniques and uniquely apply them to dentistry, to build a sleep apnea device requiring rotating hinges. Laser scanning was used to capture the three-dimensional topography of an upper and lower dental cast. The data sets were imported into an appropriate computer-aided design software environment, which was used to design a sleep apnea device. This design was then exported as a stereolithography file and transferred for three-dimensional printing by an additive manufacture machine. The results not only revealed that the novel computer-based technique presented provides new design opportunities but also highlighted limitations that must be addressed before the techniques can become clinically viable.
Construction of dynamic stochastic simulation models using knowledge-based techniques
NASA Technical Reports Server (NTRS)
Williams, M. Douglas; Shiva, Sajjan G.
1990-01-01
Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).
NASA Astrophysics Data System (ADS)
Li, Qing; Lin, Haibo; Xiu, Yu-Feng; Wang, Ruixue; Yi, Chuijie
A test platform for wheat precision seeding based on image processing techniques was designed to support the development of a wheat precision seed metering device with high efficiency and precision. Using image processing techniques, the platform gathers images of the seeds (wheat) falling from the seed metering device onto the conveyor belt. These data are then processed and analyzed to calculate the qualified rate, the reseeding rate and the miss-seeding rate, etc. This paper introduces the overall structure and design parameters of the platform and the hardware and software of the image acquisition system, as well as the method of seed identification and seed-spacing measurement based on image thresholding and locating each seed's center. Analysis of the experimental results shows that the measurement error is less than ±1 mm.
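A simplified sketch of the seed identification and spacing-measurement step (thresholding, connected-component labelling and centroid computation; the threshold value and image format are assumptions, not the platform's actual code):

```python
def seed_centres(img, thresh=128):
    """Identify seeds in a grey-level image (list of pixel rows):
    threshold, label 4-connected components with a flood fill, and
    return each seed's centre as (row, col). Illustrative stand-in
    for the platform's image-processing pipeline."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    centres = []
    for r in range(h):
        for c in range(w):
            if img[r][c] > thresh and not seen[r][c]:
                seen[r][c] = True
                stack, pix = [(r, c)], []
                while stack:
                    y, x = stack.pop()
                    pix.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny][nx] and img[ny][nx] > thresh):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                centres.append((sum(p[0] for p in pix) / len(pix),
                                sum(p[1] for p in pix) / len(pix)))
    return centres

def seed_spacings(centres):
    """Seed spacing along the belt direction (here, the column axis)."""
    xs = sorted(c[1] for c in centres)
    return [b - a for a, b in zip(xs, xs[1:])]
```

Pixel distances would then be converted to millimetres with the camera's calibration factor before checking rates against the agronomic target spacing.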
Silicon-based optoelectronics: Monolithic integration for WDM
NASA Astrophysics Data System (ADS)
Pearson, Matthew Richard T.
2000-10-01
This thesis details the development of enabling technologies required for inexpensive, monolithic integration of Si-based wavelength division multiplexing (WDM) components and photodetectors. The work involves the design and fabrication of arrayed waveguide grating demultiplexers in silicon-on-insulator (SOI), the development of advanced SiGe photodetectors capable of photodetection at 1.55 μm wavelengths, and the development of a low cost fabrication technique that enables the high volume production of Si-based photonic components. Arrayed waveguide grating (AWG) demultiplexers were designed and fabricated in SOI. The fabrication of AWGs in SOI has been reported in the literature, however there are a number of design issues specific to the SOI material system that can have a large effect on device performance and design, and have not been theoretically examined in earlier work. The SOI AWGs presented in this thesis are the smallest devices of this type reported, and they exhibit performance acceptable for commercial applications. The SiGe photodetectors reported in the literature exhibit extremely low responsivities at wavelengths near 1.55 μm. We present the first use of three dimensional growth modes to enhance the photoresponse of SiGe at 1.55 μm wavelengths. Metal-semiconductor-metal (MSM) photodetectors were fabricated using this undulating quantum well structure, and demonstrate the highest responsivities yet reported for a SiGe-based photodetector at 1.55 μm. These detectors were monolithically integrated with low-loss SOI waveguides, enabling integration with nearly any Si-based passive WDM component. The pursuit of inexpensive Si-based photonic components also requires the development of new manufacturing techniques that are more suitable for high volume production. This thesis presents the development of a low cost fabrication technique based on the local oxidation of silicon (LOCOS), a standard processing technique used for Si integrated circuits.
This process is developed for both SiGe and SOI waveguides, but is shown to be commercially suitable only for SOI waveguide devices. The technique allows nearly any Si microelectronics fabrication facility to begin manufacturing optical components with minimal change in processing equipment or techniques. These enabling technologies provide the critical elements for inexpensive, monolithic integration in a Si-based system.
On verifying a high-level design. [cost and error analysis
NASA Technical Reports Server (NTRS)
Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.
1993-01-01
An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages capable of adequately expressing design specifications have been developed, but some time will be required before they achieve the expressive power needed for real applications. Simulation-based approaches are more useful in finding errors in designs than in proving the correctness of a given design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.
CMOS array design automation techniques. [metal oxide semiconductors
NASA Technical Reports Server (NTRS)
Ramondetta, P.; Feller, A.; Noto, R.; Lombardi, T.
1975-01-01
A low cost, quick turnaround technique for generating custom metal oxide semiconductor arrays using the standard cell approach was developed, implemented, tested and validated. Basic cell design topology and guidelines are defined based on an extensive analysis that includes circuit, layout, process, array topology and required performance considerations, particularly high circuit speed.
On Developing a Taxonomy for Multidisciplinary Design Optimization: A Decision-Based Perspective
NASA Technical Reports Server (NTRS)
Lewis, Kemper; Mistree, Farrokh
1995-01-01
In this paper, we approach MDO from a Decision-Based Design (DBD) perspective and explore classification schemes for designing complex systems and processes. Specifically, we focus on decisions, which are only a small portion of the Decision Support Problem (DSP) Technique, our implementation of DBD. We map coupled nonhierarchical and hierarchical representations from the DSP Technique into the Balling-Sobieski (B-S) framework (Balling and Sobieszczanski-Sobieski, 1994), and integrate domain-independent linguistic terms to complete our taxonomy. Applications of DSPs to the design of complex, multidisciplinary systems include passenger aircraft, ships, damage-tolerant structural and mechanical systems, and thermal energy systems. In this paper we show that the Balling-Sobieski framework is consistent with that of the Decision Support Problem Technique through the use of linguistic entities to describe the same types of formulations. We show that the underlying linguistics of the solution approaches are the same and can be coalesced into a homogeneous framework on which to base MDO research, application, and technology. We present, in the Balling-Sobieski framework, examples of multidisciplinary design, namely aircraft, damage-tolerant structural and mechanical systems, and thermal energy systems.
Design and Evaluation of Fusion Approach for Combining Brain and Gaze Inputs for Target Selection
Évain, Andéol; Argelaguet, Ferran; Casiez, Géry; Roussel, Nicolas; Lécuyer, Anatole
2016-01-01
Gaze-based interfaces and Brain-Computer Interfaces (BCIs) allow for hands-free human–computer interaction. In this paper, we investigate the combination of gaze and BCIs. We propose a novel selection technique for 2D target acquisition based on input fusion. This new approach combines the probabilistic models for each input in order to better estimate the intent of the user. We evaluated its performance against the existing gaze and brain–computer interaction techniques. Twelve participants took part in our study, in which they had to search for and select 2D targets with each of the evaluated techniques. Our fusion-based hybrid interaction technique was found to be more reliable than the previous gaze and BCI hybrid interaction techniques for 10 of the 12 participants, while being 29% faster on average. However, similarly to what has been observed in hybrid gaze-and-speech interaction, the gaze-only interaction technique still provides the best performance. Our results should encourage the use of input fusion, as opposed to sequential interaction, in order to design better hybrid interfaces. PMID:27774048
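The input-fusion idea of combining per-target probabilities from the two modalities can be sketched as a naive Bayesian product rule (an assumption for illustration; the paper's probabilistic models are more elaborate):

```python
def fuse(p_gaze, p_bci):
    """Fuse per-target probability estimates from gaze and BCI inputs
    by multiplying them target-wise and renormalising. Illustrative
    only; not the paper's exact fusion model."""
    prod = [g * b for g, b in zip(p_gaze, p_bci)]
    total = sum(prod)
    return [p / total for p in prod]

# Gaze favours target 0, the BCI favours target 1; fusion arbitrates:
fused = fuse([0.6, 0.3, 0.1], [0.2, 0.5, 0.3])   # ≈ [0.4, 0.5, 0.1]
best_target = max(range(len(fused)), key=lambda i: fused[i])
```

A real system would map raw gaze coordinates and BCI classifier scores to these per-target probabilities before fusing, and select only when the fused maximum exceeds a confidence threshold.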
Conformational Analysis of Drug Molecules: A Practical Exercise in the Medicinal Chemistry Course
ERIC Educational Resources Information Center
Yuriev, Elizabeth; Chalmers, David; Capuano, Ben
2009-01-01
Medicinal chemistry is a specialized, scientific discipline. Computational chemistry and structure-based drug design constitute important themes in the education of medicinal chemists. This problem-based task is associated with structure-based drug design lectures. It requires students to use computational techniques to investigate conformational…
Realizable optimal control for a remotely piloted research vehicle. [stability augmentation
NASA Technical Reports Server (NTRS)
Dunn, H. J.
1980-01-01
The design of a control system using linear-quadratic regulator (LQR) control law theory for time-invariant systems, in conjunction with an incremental gradient procedure, is presented. The incremental gradient technique reduces the full-state feedback controller design, generated by the LQR algorithm, to a realizable design. With a realizable controller, the feedback gains are based only on the available system outputs rather than on the full state. The design is for a remotely piloted research vehicle (RPRV) stability augmentation system. The design includes methods for accounting for noisy measurements, discrete controls with zero-order-hold outputs, and computational delay errors. Results from simulation studies of the response of the RPRV to a step in the elevator, along with frequency analysis techniques, are included to illustrate these nonidealities and their influence on the controller design.
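The LQR starting point of the design can be illustrated for a scalar plant, where the algebraic Riccati equation has a closed-form solution (an illustrative sketch only; the report's RPRV model is multivariable and the final gains are reduced to output feedback):

```python
import math

def scalar_lqr(a, b, q, r):
    """LQR state-feedback gain for the scalar plant x' = a*x + b*u with
    cost  integral of (q*x**2 + r*u**2) dt,  using the positive root of
    the scalar algebraic Riccati equation  2*a*p - (b**2/r)*p**2 + q = 0.
    Returns the gain k (control u = -k*x) and the closed-loop pole a - b*k.
    """
    p = r * (a + math.sqrt(a * a + q * b * b / r)) / (b * b)
    k = b * p / r
    return k, a - b * k

# An unstable plant (a = 1) is stabilised; here the closed-loop pole
# works out to -sqrt(a**2 + q*b**2/r) = -sqrt(2):
k, pole = scalar_lqr(a=1.0, b=1.0, q=1.0, r=1.0)
```

The incremental gradient step of the report would then constrain such gains to feed back only measured outputs, re-optimising the cost over that restricted gain structure.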
NASA Technical Reports Server (NTRS)
Adams, W. M., Jr.; Tiffany, S. H.
1983-01-01
A control law is developed to suppress symmetric flutter for a mathematical model of an aeroelastic research vehicle. An implementable control law is attained by including modified LQG (linear quadratic Gaussian) design techniques, controller order reduction, and gain scheduling. An alternate (complementary) design approach is illustrated for one flight condition wherein nongradient-based constrained optimization techniques are applied to maximize controller robustness.
Information System Design Methodology Based on PERT/CPM Networking and Optimization Techniques.
ERIC Educational Resources Information Center
Bose, Anindya
The dissertation attempts to demonstrate that the program evaluation and review technique (PERT)/Critical Path Method (CPM) or some modified version thereof can be developed into an information system design methodology. The methodology utilizes PERT/CPM which isolates the basic functional units of a system and sets them in a dynamic time/cost…
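The PERT/CPM machinery such a methodology builds on reduces to a forward and a backward pass over the activity network. A minimal sketch, using an invented four-activity network rather than anything from the dissertation:

```python
from collections import defaultdict

# Toy activity network: durations and successor lists (topological
# order A, B, C, D is assumed for simplicity).
durations = {"A": 3, "B": 2, "C": 4, "D": 2}
succ = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
order = ["A", "B", "C", "D"]

pred = defaultdict(list)
for act, successors in succ.items():
    for s in successors:
        pred[s].append(act)

# Forward pass: earliest finish time of each activity.
early = {}
for act in order:
    start = max((early[p] for p in pred[act]), default=0)
    early[act] = start + durations[act]
project_len = max(early.values())

# Backward pass: latest finish time without delaying the project.
late = {}
for act in reversed(order):
    late[act] = min((late[s] - durations[s] for s in succ[act]),
                    default=project_len)

# Zero slack (earliest finish == latest finish) marks the critical path.
critical = [act for act in order if early[act] == late[act]]
print(project_len, critical)  # -> 9 ['A', 'C', 'D']
```

Activities with zero slack form the critical path that drives the time/cost analysis at the core of the methodology.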
NASA Astrophysics Data System (ADS)
Safour, Salaheddine; Bernard, Yves
2017-10-01
This paper focuses on the design of a wireless power supply system for low power devices (e.g. sensors) located in a harsh electromagnetic environment with ferromagnetic and conductive materials. Such an environment can be found in linear and rotating actuators. The studied power transfer system is based on resonant magnetic coupling between a fixed transmitter coil and a moving receiver coil. The technique has been used successfully for rotary machines; the aim of this paper is to extend it to linear actuators. A modeling approach based on a 2D axisymmetric finite element model, and an electrical lumped model based on two-port network theory, are introduced. The study shows the limitations of the technique in transferring the required power in the presence of ferromagnetic and conductive materials. Parametric and circuit analyses were conducted in order to design a resonant magnetic coupler that ensures good power transfer capability and efficiency, and a design methodology is proposed based on this study. Measurements on the prototype show efficiency up to 75% at a linear distance of 20 mm.
NASA Technical Reports Server (NTRS)
1994-01-01
This manual presents a series of recommended techniques that can increase overall operational effectiveness of both flight and ground based NASA systems. It provides a set of tools that minimizes risk associated with: (1) restoring failed functions (both ground and flight based); (2) conducting complex and highly visible maintenance operations; and (3) sustaining a technical capability to support the NASA mission using aging equipment or facilities. It considers (1) program management - key elements of an effective maintainability effort; (2) design and development - techniques that have benefited previous programs; (3) analysis and test - quantitative and qualitative analysis processes and testing techniques; and (4) operations and operational design techniques that address NASA field experience. This document is a valuable resource for continuous improvement ideas in executing the systems development process in accordance with the NASA 'better, faster, smaller, and cheaper' goal without compromising safety.
Model-based optimal design of experiments - semidefinite and nonlinear programming formulations
Duarte, Belmiro P.M.; Wong, Weng Kee; Oliveira, Nuno M.C.
2015-01-01
We use mathematical programming tools, such as Semidefinite Programming (SDP) and Nonlinear Programming (NLP) based formulations, to find optimal designs for models used in chemistry and chemical engineering. In particular, we employ local design-based setups in linear models and a Bayesian setup in nonlinear models to find optimal designs. In the latter case, Gaussian Quadrature Formulas (GQFs) are used to evaluate the optimality criterion averaged over the prior distribution for the model parameters. Mathematical programming techniques are then applied to solve the optimization problems. Because such methods require the design space to be discretized, we also evaluate the impact of the discretization scheme on the generated design. We demonstrate the techniques by finding D-, A-, and E-optimal designs for design problems in biochemical engineering, and show that the method can also be applied directly to further issues, such as heteroscedasticity in the model. Our results show that the NLP formulation produces highly efficient D-optimal designs but requires more computation than the SDP formulation. The efficiencies of the designs generated by the two methods are generally very close, so we recommend the SDP formulation in practice. PMID:26949279
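For intuition about the discretized problem, a classical multiplicative weight update (not the paper's SDP/NLP formulations) recovers the D-optimal design for a straight-line model on a grid; the grid, iteration count, and support cutoff below are arbitrary illustrative choices:

```python
# D-optimal design for f(x) = (1, x) on a grid over [-1, 1] via the
# multiplicative update w_i <- w_i * d(x_i) / p, where d(x) is the
# variance function f(x)^T M^{-1} f(x) and p the number of parameters.
# The known optimum puts weight 1/2 at each endpoint.
xs = [-1.0 + 0.1 * i for i in range(21)]
w = [1.0 / len(xs)] * len(xs)
p = 2

for _ in range(2000):
    # Information matrix M = sum_i w_i f(x_i) f(x_i)^T (2x2, symmetric).
    m11 = sum(w)
    m12 = sum(wi * x for wi, x in zip(w, xs))
    m22 = sum(wi * x * x for wi, x in zip(w, xs))
    det = m11 * m22 - m12 * m12
    # Variance function d(x) = f(x)^T M^{-1} f(x).
    d = [(m22 - 2 * m12 * x + m11 * x * x) / det for x in xs]
    w = [wi * di / p for wi, di in zip(w, d)]

support = [round(x, 1) for x, wi in zip(xs, w) if wi > 1e-3]
weights = [round(wi, 2) for wi in w if wi > 1e-3]
print(support, weights)  # -> [-1.0, 1.0] [0.5, 0.5]
```

The update preserves the total weight automatically (the average of d over the design equals p), and mass drains away from any grid point where d(x) stays below p.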
Wei, Xuelei; Dong, Fuhui
2011-12-01
To review recent advances in the research and application of computer-aided forming techniques for constructing bone tissue engineering scaffolds. The literature of recent years concerning computer-aided forming techniques for constructing bone tissue engineering scaffolds was reviewed extensively and summarized. Several studies over the last decade have focused on computer-aided forming techniques for bone scaffold construction using various scaffold materials, based on computer-aided design (CAD) and rapid prototyping (RP) of bone scaffolds. CAD approaches include medical CAD, STL, and reverse design; reverse design can fully simulate normal bone tissue and is therefore very useful in CAD. RP techniques include fused deposition modeling, three-dimensional printing, selective laser sintering, three-dimensional bioplotting, and low-temperature deposition manufacturing. These techniques provide a new way to construct bone tissue engineering scaffolds with complex internal structures. With the rapid development of molding and forming techniques, computer-aided forming is expected to provide ideal bone tissue engineering scaffolds.
Shape and Reinforcement Optimization of Underground Tunnels
NASA Astrophysics Data System (ADS)
Ghabraie, Kazem; Xie, Yi Min; Huang, Xiaodong; Ren, Gang
Design of the support system and selection of an optimum shape for the opening are two important steps in designing excavations in rock masses. Currently, shape selection and support design are based mainly on the designer's judgment and experience. Both problems can be viewed as material distribution problems, in which one seeks the optimum distribution of a material in a domain. Topology optimization techniques have proved useful in solving such problems in structural design, and their application to reinforcement design around underground excavations has recently been studied by several researchers. In this paper a three-phase material model is introduced, switching between normal rock, reinforced rock, and void. With such a material model, the shape and reinforcement design problems can be solved together. A well-known topology optimization technique used in structural design is bi-directional evolutionary structural optimization (BESO). Here the BESO technique is extended to simultaneously optimize the shape of the opening and the distribution of reinforcement. The validity and capability of the proposed approach are demonstrated through examples.
Coupling artificial intelligence and numerical computation for engineering design (Invited paper)
NASA Astrophysics Data System (ADS)
Tong, S. S.
1986-01-01
The possibility of combining artificial intelligence (AI) systems and numerical computation methods for engineering designs is considered. Attention is given to three possible areas of application involving fan design, controlled vortex design of turbine stage blade angles, and preliminary design of turbine cascade profiles. Among the AI techniques discussed are: knowledge-based systems; intelligent search; and pattern recognition systems. The potential cost and performance advantages of an AI-based design-generation system are discussed in detail.
Enhancing the Front-End Phase of Design Methodology
ERIC Educational Resources Information Center
Elias, Erasto
2006-01-01
Design methodology (DM) is defined by the procedural path, expressed in design models, and techniques or methods used to untangle the various activities within a design model. Design education in universities is mainly based on descriptive design models. Much knowledge and organization have been built into DM to facilitate design teaching.…
Design Oriented Structural Modeling for Airplane Conceptual Design Optimization
NASA Technical Reports Server (NTRS)
Livne, Eli
1999-01-01
The main goal of the research conducted under this grant was to develop design-oriented structural optimization methods for the conceptual design of airplanes. Traditionally, in conceptual design, airframe weight is estimated from statistical equations developed over years of fitting weight data from databases of similar existing airplanes. Use of such regression equations for a new airplane can be justified only if the new airplane employs structural technology similar to that of the airplanes in the weight databases. If any new structural technology is to be pursued, or any unconventional configuration designed, the statistical weight equations cannot be used; structural weight estimation must then rest on rigorous, physics-based structural analysis and optimization of the airframes under consideration. Work under this grant explored airframe design-oriented structural optimization along two lines of research: methods based on "fast" design-oriented finite element technology, and methods based on equivalent plate / equivalent shell models of airframes, in which the vehicle is modeled as an assembly of plate and shell components, each simulating a lifting surface or nacelle/fuselage piece. Because response to changes in geometry is essential in the conceptual design of airplanes, as is the capability to optimize the shape itself, the research also sought efficient techniques for parametrization of airplane shape and for sensitivity analysis with respect to shape design variables. Toward the end of the grant period, a prototype automated structural analysis code designed to work with the NASA Aircraft Synthesis conceptual design code (ACSYNT) was delivered to NASA Ames.
Decomposition-Based Decision Making for Aerospace Vehicle Design
NASA Technical Reports Server (NTRS)
Borer, Nicholas K.; Mavris, DImitri N.
2005-01-01
Most practical engineering systems design problems have multiple, conflicting objectives. Furthermore, the satisfactory attainment level for each objective (requirement) is likely uncertain early in the design process. Systems with long design cycle times exhibit more of this uncertainty throughout the design process, and the situation is further complicated if the system is expected to perform for a relatively long period of time, since it will then need to grow as new requirements are identified and new technologies are introduced. These points identify the need for a systems design technique that enables decision making among multiple objectives in the presence of uncertainty. Traditional design techniques deal with a single objective, or a small number of objectives that are often aggregates of the overarching goals sought through the generation of a new system; other requirements, although uncertain, are treated as static constraints on this single- or multiple-objective optimization problem. With either formulation, trading off requirements, objectives, or combinations thereof is a slow, serial process that becomes increasingly complex as more criteria are added. This research proposal outlines a technique that attempts to address these and other idiosyncrasies of modern aerospace systems design. The proposed formulation first recasts systems design as a multiple-criteria decision-making problem. The multiple objectives are then decomposed to discover the critical characteristics of the objective space, and tradeoffs between the objectives are considered among these critical characteristics by comparison to a probabilistic ideal tradeoff solution. The formulation represents a radical departure from traditional methods, and a pitfall of the technique lies in validating the solution: in a multi-objective sense, how can a decision maker justify a choice between non-dominated alternatives? A series of examples helps the reader observe how this technique can be applied to aerospace systems design, and compares the results of this so-called Decomposition-Based Decision Making to more traditional design approaches.
NASA Technical Reports Server (NTRS)
Henderson, M. L.
1979-01-01
The benefits to high-lift-system maximum lift and, alternatively, to high-lift-system complexity of applying analytic design and analysis techniques to the design of high-lift sections for flight conditions were determined, and two high-lift sections were designed for flight conditions. The influence of the high-lift section on the sizing and economics of a specific energy efficient transport (EET) was clarified using a computerized sizing technique and an existing advanced-airplane design database. The impact on EET sizing and economics of the best design resulting from the design application studies was evaluated. Flap technology trade studies, climb and descent studies, and augmented stability studies are included, along with a description of the baseline high-lift system geometry, a calculation of lift and pitching moment in the presence of separation, and an inverse boundary layer technique for pressure distribution synthesis and optimization.
Yoo, Dongjin
2012-07-01
Advanced additive manufacture (AM) techniques are now being developed to fabricate scaffolds with controlled internal pore architectures in the field of tissue engineering. In general, these techniques use a hybrid method which combines computer-aided design (CAD) with computer-aided manufacturing (CAM) tools to design and fabricate complicated three-dimensional (3D) scaffold models. The mathematical descriptions of micro-architectures along with the macro-structures of the 3D scaffold models are limited by current CAD technologies as well as by the difficulty of transferring the designed digital models to standard formats for fabrication. To overcome these difficulties, we have developed an efficient internal pore architecture design system based on triply periodic minimal surface (TPMS) unit cell libraries and associated computational methods to assemble TPMS unit cells into an entire scaffold model. In addition, we have developed a process planning technique based on TPMS internal architecture pattern of unit cells to generate tool paths for freeform fabrication of tissue engineering porous scaffolds. Copyright © 2012 IPEM. Published by Elsevier Ltd. All rights reserved.
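As a concrete instance of a TPMS unit cell that such libraries typically include, the gyroid has the implicit approximation sin(x)cos(y) + sin(y)cos(z) + sin(z)cos(x) = 0; voxelizing one period gives a porous solid whose porosity is set by the level threshold. The grid resolution and threshold below are arbitrary:

```python
import math

# Voxelize one period of the gyroid TPMS: cells with f(x,y,z) < t form
# the solid phase; varying t tunes the scaffold porosity.
n = 40
t = 0.0
count_solid = 0
for i in range(n):
    for j in range(n):
        for k in range(n):
            x, y, z = (2 * math.pi * v / n for v in (i, j, k))
            f = (math.sin(x) * math.cos(y)
                 + math.sin(y) * math.cos(z)
                 + math.sin(z) * math.cos(x))
            if f < t:
                count_solid += 1

porosity = 1.0 - count_solid / n**3
print(round(porosity, 2))  # the zero level set splits space ~50/50
```

By the odd symmetry f(-x, -y, -z) = -f(x, y, z), the zero level set divides the period into two equal volumes, which is why the computed porosity sits at about one half; raising or lowering t thickens or thins the solid network.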
NASA Astrophysics Data System (ADS)
Luo, Jianjun; Wei, Caisheng; Dai, Honghua; Yuan, Jianping
2018-03-01
This paper focuses on robust adaptive control for a class of uncertain nonlinear systems subject to input saturation and external disturbance, with guaranteed predefined tracking performance. To reduce the limitations of the classical predefined performance control method in the presence of unknown initial tracking errors, a novel predefined performance function with time-varying design parameters is first proposed. Then, to reduce the complexity of the nonlinear approximations, only two least-squares support vector machine (LS-SVM) based approximators with two design parameters are required, through a norm-form transformation of the original system. Further, a novel LS-SVM-based adaptive constrained control scheme is developed under the time-varying predefined performance using the backstepping technique. To avoid the tedious analysis and repeated differentiation of virtual control laws in the backstepping technique, a simple and robust finite-time-convergent differentiator is devised that extracts only the first-order derivative at each step in the presence of external disturbance. In this sense, the inherent demerit of the backstepping technique, the "explosion of terms" brought about by the recursive virtual controller design, is overcome. Moreover, an auxiliary system is designed to compensate for the control saturation. Finally, three groups of numerical simulations are employed to validate the effectiveness of the newly developed differentiator and the proposed adaptive constrained control scheme.
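For reference, the classical constant-parameter predefined performance function that the paper's time-varying variant generalizes is rho(t) = (rho0 - rho_inf)·exp(-lam·t) + rho_inf, and the controller must keep the tracking error inside the funnel (-rho(t), rho(t)). The numbers and the example trajectory below are invented for illustration:

```python
import math

# Predefined performance funnel: decays from an initial bound rho0
# to a steady-state bound rho_inf at rate lam.
rho0, rho_inf, lam = 2.0, 0.1, 1.0

def rho(t):
    return (rho0 - rho_inf) * math.exp(-lam * t) + rho_inf

def e(t):
    # An example tracking error that decays faster than the funnel.
    return 1.5 * math.exp(-2.0 * t) * math.cos(5.0 * t)

# Check the constraint -rho(t) < e(t) < rho(t) on a sampled horizon.
inside = all(abs(e(0.01 * k)) < rho(0.01 * k) for k in range(1000))
print(inside)
```

The classical method's limitation mentioned above is visible here: rho0 must dominate the unknown initial error |e(0)|, which is exactly what time-varying design parameters are introduced to relax.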
Problem Solving Techniques for the Design of Algorithms.
ERIC Educational Resources Information Center
Kant, Elaine; Newell, Allen
1984-01-01
Presents model of algorithm design (activity in software development) based on analysis of protocols of two subjects designing three convex hull algorithms. Automation methods, methods for studying algorithm design, role of discovery in problem solving, and comparison of different designs of case study according to model are highlighted.…
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1990-01-01
Originally, computer programs for engineering design focused on detailed geometric design. Later, computer programs for algorithmically performing the preliminary design of specific, well-defined classes of objects became commonplace. However, because of the need for extreme flexibility, it appears unlikely that conventional programming techniques will prove fruitful in developing computer aids for engineering conceptual design. Symbolic processing techniques, such as object-oriented programming and constraint propagation, facilitate such flexibility. Object-oriented programming allows programs to be organized around the objects and behavior to be simulated, rather than around fixed sequences of function and subroutine calls. Constraint propagation allows declarative statements to be understood as designating multi-directional mathematical relationships among all the variables of an equation, rather than as unidirectional assignments to the variable on the left-hand side, as in conventional computer programs. The research has concentrated on applying these two techniques to the development of a general-purpose computer aid for engineering conceptual design. Object-oriented programming techniques are used to implement a user-extensible database of design components, and the mathematical relationships that model both the geometry and the physics of these components are managed via constraint propagation. In addition to this component-based hierarchy, special-purpose data structures are provided for describing component interactions and supporting state-dependent parameters. To investigate the utility of this approach, a number of sample design problems from the field of aerospace engineering were implemented using the prototype design tool, Rubber Airplane. The additional level of organizational structure obtained by representing design knowledge in terms of components is observed to provide greater convenience to the program user, and to result in a database of engineering information that is easier both to maintain and to extend.
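The multi-directional behavior of constraint propagation described above can be sketched in a few lines (invented class names, not Rubber Airplane's actual interface): a single relation a * b = c fires in whichever direction leaves exactly one unknown, instead of assigning only left-to-right.

```python
class Cell:
    """A named value that notifies its constraints when set."""
    def __init__(self, name):
        self.name, self.value, self.constraints = name, None, []

    def set(self, value):
        self.value = value
        for constraint in self.constraints:
            constraint.propagate()

class Product:
    """Enforces a * b = c in any direction, not just left-to-right."""
    def __init__(self, a, b, c):
        self.a, self.b, self.c = a, b, c
        for cell in (a, b, c):
            cell.constraints.append(self)

    def propagate(self):
        a, b, c = self.a, self.b, self.c
        known = [cell for cell in (a, b, c) if cell.value is not None]
        if len(known) != 2:
            return              # nothing to deduce yet (or already done)
        if c.value is None:
            c.set(a.value * b.value)
        elif b.value is None:
            b.set(c.value / a.value)
        else:
            a.set(c.value / b.value)

a, b, c = Cell("a"), Cell("b"), Cell("c")
Product(a, b, c)
c.set(1200.0)   # setting c alone determines nothing yet
a.set(40.0)     # now a and c are known, so b is deduced backward
print(b.value)  # -> 30.0
```

Setting any two of the three cells, in any order, determines the third; chaining many such constraints over a component database is what lets a declarative model answer "what input gives this output" as easily as the reverse.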
Using cognitive task analysis to develop simulation-based training for medical tasks.
Cannon-Bowers, Jan; Bowers, Clint; Stout, Renee; Ricci, Katrina; Hildabrand, Annette
2013-10-01
Pressures to increase the efficacy and effectiveness of medical training are causing the Department of Defense to investigate the use of simulation technologies. This article describes a comprehensive cognitive task analysis technique that can be used to simultaneously generate training requirements, performance metrics, scenario requirements, and simulator/simulation requirements for medical tasks. On the basis of a variety of existing techniques, we developed a scenario-based approach that asks experts to perform the targeted task multiple times, with each pass probing a different dimension of the training development process. In contrast to many cognitive task analysis approaches, we argue that our technique can be highly cost effective because it is designed to accomplish multiple goals. The technique was pilot tested with expert instructors from a large military medical training command. These instructors were employed to generate requirements for two selected combat casualty care tasks: cricothyroidotomy and hemorrhage control. Results indicated that the technique is feasible to use and generates usable data to inform simulation-based training system design. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
Nonlinear relaxation algorithms for circuit simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saleh, R.A.
Circuit simulation is an important Computer-Aided Design (CAD) tool in the design of Integrated Circuits (IC). However, the standard techniques used in programs such as SPICE result in very long computer-run times when applied to large problems. In order to reduce the overall run time, a number of new approaches to circuit simulation were developed and are described. These methods are based on nonlinear relaxation techniques and exploit the relative inactivity of large circuits. Simple waveform-processing techniques are described to determine the maximum possible speed improvement that can be obtained by exploiting this property of large circuits. Three simulation algorithms are described, two of which are based on the Iterated Timing Analysis (ITA) method and a third based on the Waveform-Relaxation Newton (WRN) method. New programs that incorporate these techniques were developed and used to simulate a variety of industrial circuits. The results from these simulations are provided. The techniques are shown to be much faster than the standard approach. In addition, a number of parallel aspects of these algorithms are described, and a general space-time model of parallel-task scheduling is developed.
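The nonlinear relaxation idea behind ITA-style methods can be sketched on a toy two-node circuit: a Gauss-Seidel sweep solves one nodal equation at a time, with a damped scalar Newton iteration at the nonlinear (diode) node. The component values and diode model below are invented for illustration, not taken from the dissertation:

```python
import math

# Toy circuit: VS --R1-- node1 --R2-- node2 --diode-- ground.
R1, R2, VS = 1e3, 1e3, 5.0       # ohms, volts
IS, VT = 1e-12, 0.025            # diode saturation current, thermal voltage

v1, v2 = 0.0, 0.0
for _ in range(100):             # outer Gauss-Seidel sweeps over the nodes
    # Node 1: (v1 - VS)/R1 + (v1 - v2)/R2 = 0  -> linear in v1.
    v1 = (VS / R1 + v2 / R2) / (1.0 / R1 + 1.0 / R2)
    # Node 2: (v2 - v1)/R2 + IS*(exp(v2/VT) - 1) = 0  -> scalar Newton.
    for _ in range(200):
        f = (v2 - v1) / R2 + IS * math.expm1(v2 / VT)
        if abs(f) < 1e-12:
            break
        df = 1.0 / R2 + (IS / VT) * math.exp(v2 / VT)
        step = f / df
        v2 -= max(-0.1, min(0.1, step))  # damp the step to tame the exp

print(round(v1, 3), round(v2, 3))  # v2 settles near a ~0.54 V diode drop
```

The payoff exploited by relaxation methods is that, in a large circuit, most node voltages barely change between time points, so most of these per-node solves can be skipped.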
NASA Technical Reports Server (NTRS)
Fischer, Robert E. (Editor); Rogers, Philip J. (Editor)
1986-01-01
The present conference considers topics in the fields of optical systems design software, the design and analysis of optical systems, illustrative cases of advanced optical system design, the integration of optical designs into greater systems, and optical fabrication and testing techniques. Attention is given to an extended range diffraction-based merit function for lens design optimization, an assessment of technologies for stray light control and evaluation, the automated characterization of IR systems' spatial resolution, a spectrum of design techniques based on aberration theory, a three-field IR telescope, a large aperture zoom lens for 16-mm motion picture cameras, and the use of concave holographic gratings as monochromators. Also discussed are the use of aspherics in optical systems, glass choice procedures for periscope design, the fabrication and testing of unconventional optics, low mass mirrors for large optics, and the diamond grinding of optical surfaces on aspheric lens molds.
Magnetic induction tomography of objects for security applications
NASA Astrophysics Data System (ADS)
Ward, Rob; Joseph, Max; Langley, Abbi; Taylor, Stuart; Watson, Joe C.
2017-10-01
A coil array imaging system has been further developed from previous investigations, focusing on designing its application for fast screening of small bags or parcels, with a view to the production of a compact instrument for security applications. In addition to reducing image acquisition times, work was directed toward exploring potentially cost-effective manufacturing routes. Based on magnetic induction tomography and eddy-current principles, the instrument captured images of conductive targets using a lock-in amplifier, individually multiplexing signals between a primary driver coil and a 20 by 21 imaging array of secondary passive coils constructed using a reproducible multiple-tile design. The design was based on additive manufacturing techniques and provided 2 orthogonal imaging planes with the ability to reconstruct images in less than 10 seconds. An assessment of one of the imaging planes is presented. This technique potentially provides a cost-effective threat evaluation technique that may complement conventional radiographic approaches.
Beam by design: Laser manipulation of electrons in modern accelerators
NASA Astrophysics Data System (ADS)
Hemsing, Erik; Stupakov, Gennady; Xiang, Dao; Zholents, Alexander
2014-07-01
Accelerator-based light sources such as storage rings and free-electron lasers use relativistic electron beams to produce intense radiation over a wide spectral range for fundamental research in physics, chemistry, materials science, biology, and medicine. More than a dozen such sources operate worldwide, and new sources are being built to deliver radiation that meets the ever-increasing sophistication and depth of new research. Even so, conventional accelerator techniques often cannot keep pace with new demands and, thus, new approaches continue to emerge. In this article, a variety of recently developed and promising techniques that rely on lasers to manipulate and rearrange the electron distribution in order to tailor the properties of the radiation are reviewed. Basic theories of electron-laser interactions, techniques to create microstructures and nanostructures in electron beams, and techniques to produce radiation with customizable waveforms are reviewed. An overview of laser-based techniques for the generation of fully coherent x rays, mode-locked x-ray pulse trains, light with orbital angular momentum, and attosecond- or even zeptosecond-long coherent pulses in free-electron lasers is presented. Several methods to generate femtosecond pulses in storage rings are also discussed. Additionally, various schemes designed to enhance the performance of light sources through precision beam preparation, including beam conditioning, laser heating, emittance exchange, and various laser-based diagnostics, are described. Together these techniques represent a new emerging concept of "beam by design" in modern accelerators, which is the primary focus of this article.
Study of synthesis techniques for insensitive aircraft control systems
NASA Technical Reports Server (NTRS)
Harvey, C. A.; Pope, R. E.
1977-01-01
Insensitive flight control system design criteria were defined in terms of maximizing performance (handling qualities, RMS gust response, transient response, stability margins) over a defined parameter range. Wing load alleviation for the C-5A was chosen as a design problem. The C-5A model was a 79-state, two-control structure with uncertainties assumed to exist in dynamic pressure, structural damping and frequency, and the stability derivative M sub w. Five new techniques (mismatch estimation, uncertainty weighting, finite dimensional inverse, maximum difficulty, dual Lyapunov) were developed. Six existing techniques (additive noise, minimax, multiplant, sensitivity vector augmentation, state dependent noise, residualization) and the mismatch estimation and uncertainty weighting techniques were synthesized and evaluated on the design example. Evaluation and comparison of these techniques indicated that the minimax and uncertainty weighting techniques were superior to the other six, and of these two, uncertainty weighting has lower computational requirements. Techniques based on the three remaining new concepts appear promising and are recommended for further research.
Optimizations and Applications in Head-Mounted Video-Based Eye Tracking
ERIC Educational Resources Information Center
Li, Feng
2011-01-01
Video-based eye tracking techniques have become increasingly attractive in many research fields, such as visual perception and human-computer interface design. The technique primarily relies on the positional difference between the center of the eye's pupil and the first-surface reflection at the cornea, the corneal reflection (CR). This…
Image-Based 3d Reconstruction and Analysis for Orthodontia
NASA Astrophysics Data System (ADS)
Knyaz, V. A.
2012-08-01
Among the main tasks of orthodontia are the analysis of teeth arches and treatment planning to provide the correct position for every tooth. The treatment plan is based on measurement of teeth parameters and design of the ideal arch curve that the teeth are to form after treatment. The most common technique for moving teeth uses standard brackets bonded to the teeth and a wire of given shape clamped by these brackets, producing the forces needed to move each tooth in a given direction. The disadvantages of the standard bracket technique are the low accuracy of tooth dimension measurements and the difficulty of applying the standard approach to a wide variety of complex orthodontic cases. An image-based technique for orthodontic planning, treatment, and documentation, aimed at overcoming these disadvantages, is proposed. The proposed approach provides accurate measurement of the teeth parameters needed for adequate planning, design of correct teeth positions, and monitoring of the treatment process. The developed technique applies photogrammetric means to teeth-arch 3D model generation, bracket position determination, and teeth-shifting analysis.
A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Rinehart, Aidan W.
2014-01-01
This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
Qin, Mian; Liu, Yaxiong; He, Jiankang; Wang, Ling; Lian, Qin; Li, Dichen; Jin, Zhongmin; He, Sanhu; Li, Gang; Liu, Yanpu; Wang, Zhen
2014-03-01
To summarize the latest research developments in the application of digital design and three-dimensional (3-D) printing techniques to individualized medical treatment. Recent research data and clinical literature on the application of digital design and 3-D printing to individualized medical treatment at Xi'an Jiaotong University and its cooperating units were summarized, reviewed, and analyzed. Digital design and 3-D printing make it possible to design and manufacture individualized implants based on a patient's specific disease conditions. Such implants can satisfy the patient's specific shape and functional needs, reducing dependence on the surgeon's level of experience. As a result, 3-D printing is gaining increasing recognition among surgeons for the individualized repair of human tissue. Xi'an Jiaotong University was the first institution to develop a commercial 3-D printer and to conduct in-depth research on the design and manufacture of individualized medical implants, and complete technological processes and product quality standards have been developed. Individualized medical implants manufactured by 3-D printing can not only achieve a personalized fit but also meet the functional and aesthetic requirements of patients. In addition, such implants offer accurate positioning, stable connection, and high strength. Thus, 3-D printing has broad prospects for the manufacture and application of individualized implants.
Knowledge-Based Instructional Gaming: GEO.
ERIC Educational Resources Information Center
Duchastel, Philip
1989-01-01
Describes the design and development of an instructional game, GEO, in which the user learns elements of Canadian geography. The use of knowledge-based artificial intelligence techniques is discussed, the use of HyperCard in the design of GEO is explained, and future directions are suggested. (15 references) (Author/LRW)
OPC for curved designs in application to photonics on silicon
NASA Astrophysics Data System (ADS)
Orlando, Bastien; Farys, Vincent; Schneider, Loïc.; Cremer, Sébastien; Postnikov, Sergei V.; Millequant, Matthieu; Dirrenberger, Mathieu; Tiphine, Charles; Bayle, Sébastian; Tranquillin, Céline; Schiavone, Patrick
2016-03-01
Today's designs for photonic devices on silicon rely on non-Manhattan features such as curves and a wide variety of angles, with minimum feature sizes below 100 nm. Industrial manufacturing of such devices requires an optimized process window with 193-nm lithography. Therefore, the Resolution Enhancement Techniques (RET) commonly used for CMOS manufacturing are required. However, most RET algorithms are based on Manhattan fragmentation (0°, 45° and 90°), which can generate large CD dispersion on masks for photonic designs. Industrial implementation of RET solutions for photonic designs is challenging, as most currently available OPC tools are CMOS-oriented. Discrepancies between the design and the final result induced by RET techniques can lower photonic device performance. We propose a novel sizing algorithm that allows adjustment of design edge fragments while preserving the topology of the original structures. The results of implementing the algorithm in rule-based sizing, SRAF placement, and model-based correction are discussed in this paper. Corrections based on this novel algorithm were applied to and characterized on real photonic devices. The obtained results demonstrate the validity of the proposed correction method integrated in the Inscale software of Aselta Nanographics.
Design of vibration isolation systems using multiobjective optimization techniques
NASA Technical Reports Server (NTRS)
Rao, S. S.
1984-01-01
The design of vibration isolation systems is considered using multicriteria optimization techniques. The integrated values of the square of the force transmitted to the main mass and the square of the relative displacement between the main mass and the base are taken as the performance indices. The design of a three degrees-of-freedom isolation system with an exponentially decaying type of base disturbance is considered for illustration. Numerical results are obtained using the global criterion, utility function, bounded objective, lexicographic, goal programming, goal attainment and game theory methods. It is found that the game theory approach is superior in finding a better optimum solution with proper balance of the various objective functions.
An ultra-low-power filtering technique for biomedical applications.
Zhang, Tan-Tan; Mak, Pui-In; Vai, Mang-I; Mak, Peng-Un; Wan, Feng; Martins, R P
2011-01-01
This paper describes an ultra-low-power filtering technique for biomedical applications such as T-wave sensing in heart-activity detection systems. The topology is based on a source-follower-based biquad operating in the sub-threshold region. With the intrinsic simplicity and high linearity of the source follower, ultra-low-cutoff filtering can be achieved simultaneously with ultra-low power consumption and good linearity. An 8th-order 2.4-Hz lowpass filter design example in a 0.35-μm CMOS process achieves over 85-dB dynamic range and 74-dB stopband attenuation while consuming only 0.36 nW from a 3-V supply.
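As a rough, digital-domain illustration of how a higher-order ultra-low-cutoff lowpass can be built by cascading simple sections: the sketch below cascades eight identical first-order sections at an assumed sample rate. This does not model the paper's analog source-follower biquad (and identical cascaded sections are not a Butterworth response); it only shows the cascade idea and the 2.4-Hz cutoff scale.

```python
# Toy cascade of first-order IIR lowpass sections (illustrative rates only).
import math

def one_pole_lowpass(x, fc, fs):
    """Cascadable one-pole IIR lowpass: y[n] = a*x[n] + (1-a)*y[n-1]."""
    a = 1 - math.exp(-2 * math.pi * fc / fs)
    y, out = 0.0, []
    for v in x:
        y = a * v + (1 - a) * y
        out.append(y)
    return out

fs, fc = 1000.0, 2.4          # assumed sample rate and cutoff in Hz
x = [1.0] * 5000              # 5-second step input
y = x
for _ in range(8):            # eight cascaded first-order sections
    y = one_pole_lowpass(y, fc, fs)
print(round(y[-1], 3))        # step response settles toward the DC gain of 1
```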
Balancing generality and specificity in component-based reuse
NASA Technical Reports Server (NTRS)
Eichmann, David A.; Beck, Jon
1992-01-01
For a component industry to be successful, we must move beyond the current techniques of black-box reuse and genericity to a more flexible framework supporting customization of components as well as instantiation and composition of components. Customization of components strikes a balance between creating dozens of variations of a base component and requiring the overhead of unnecessary features of an 'everything but the kitchen sink' component. We argue that design and instantiation of reusable components have competing criteria (design-for-use strives for generality; design-with-reuse strives for specificity) and that providing mechanisms for each can be complementary rather than antagonistic. In particular, we demonstrate how program slicing techniques can be applied to the customization of reusable components.
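The slicing idea behind such customization can be shown on a toy scale: a backward slice keeps only the statements a chosen output transitively depends on, so unneeded component features fall away. The statement list and dependency representation below are invented for illustration; real slicers work on control-flow graphs, not straight-line assignment lists.

```python
# Toy backward program slice over straight-line code: each statement is
# (defined_var, used_vars); return the indices needed to compute `target`.

def backward_slice(stmts, target):
    needed, keep = {target}, []
    for i in range(len(stmts) - 1, -1, -1):
        var, uses = stmts[i]
        if var in needed:
            keep.append(i)
            needed.discard(var)      # this definition satisfies the need...
            needed.update(uses)      # ...but introduces needs of its own
    return sorted(keep)

stmts = [("a", []),          # 0: a = 1
         ("b", ["a"]),       # 1: b = a + 1
         ("c", []),          # 2: c = 42   (irrelevant to d, sliced away)
         ("d", ["b"])]       # 3: d = b * 2
print(backward_slice(stmts, "d"))
```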
A Design of Product Collaborative Online Configuration Model
NASA Astrophysics Data System (ADS)
Wang, Xiaoguo; Zheng, Jin; Zeng, Qian
According to the actual needs of mass customization, product personalization, and collaborative design, this paper analyzes and studies the working mechanism of module-based product configuration technology and puts forward an information model of a modular product family. Combining case-based reasoning (CBR) with constraint-satisfaction problem (CSP) solving techniques, we design and study an algorithm for product configuration and analyze its time complexity. Taking a car chassis as the application object, we provide a prototype system for online configuration. Using this system, designers can make appropriate changes to existing configurations in accordance with demand. This accelerates all aspects of product development and shortens the product cycle, and the system provides strong technical support for enterprises seeking to improve their market competitiveness.
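The constraint-satisfaction half of such a configurator can be sketched in miniature. The component options and compatibility rules below are invented (the paper's chassis model is far richer), and a brute-force enumeration stands in for a real CSP solver.

```python
# Toy constraint-based product configurator: enumerate option combinations
# and keep those satisfying the compatibility constraints (all rules invented).
from itertools import product

options = {
    "engine":     ["1.6L", "2.0L"],
    "suspension": ["standard", "sport"],
    "brakes":     ["disc", "drum"],
}

def constraints(cfg):
    if cfg["engine"] == "2.0L" and cfg["brakes"] == "drum":
        return False          # larger engine requires disc brakes
    if cfg["suspension"] == "sport" and cfg["engine"] == "1.6L":
        return False          # sport suspension only with the larger engine
    return True

keys = list(options)
valid = [dict(zip(keys, combo))
         for combo in product(*options.values())
         if constraints(dict(zip(keys, combo)))]
print(len(valid))             # number of feasible configurations
```

A production configurator would use constraint propagation and backtracking instead of full enumeration, and CBR would seed the search with a similar past configuration.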
Interacting with Visual Poems through AR-Based Digital Artwork
ERIC Educational Resources Information Center
Lin, Hao-Chiang Koong; Hsieh, Min-Chai; Liu, Eric Zhi-Feng; Chuang, Tsung-Yen
2012-01-01
In this study, an AR-based digital artwork called "Mind Log" was designed and evaluated. The augmented reality technique was employed to create digital artwork that would present interactive poems. A digital poem was generated via the interplay between a video film and a text-based poem. This artwork was created following a rigorous design flow,…
The influence of surface finishing methods on touch-sensitive reactions
NASA Astrophysics Data System (ADS)
Kukhta, M. S.; Sokolov, A. P.; Krauinsh, P. Y.; Kozlova, A. D.; Bouchard, C.
2017-02-01
This paper describes modern technological development trends in jewelry design. In the jewelry industry, new trends associated with the introduction of updated non-traditional materials and finishing techniques are appearing. Today's information-oriented society enhances the visual aesthetics of new jewelry forms, decoration techniques (depth and surface), and the synthesis of different materials, which together reveal a bias towards the positive effects of visual design. The jewelry industry now includes not only traditional techniques but also such improved techniques as computer-assisted design, 3D prototyping, and other alternatives that raise the level of jewelry material processing. The authors present the specific features of ornamental pattern design, decoration types (depth and surface), and a comparative analysis of different approaches to surface finishing. The appearance and effect of jewelry are identified using the proposed evaluation criteria, in which advanced visual aesthetics is grounded in touch-sensitive responses.
Niamul Islam, Naz; Hannan, M A; Mohamed, Azah; Shareef, Hussain
2016-01-01
Power system oscillation is a serious threat to the stability of multimachine power systems. The coordinated control of power system stabilizers (PSS) and thyristor-controlled series compensation (TCSC) damping controllers is a commonly used technique to provide the required damping over different modes of growing oscillations. However, their coordinated design is a complex multimodal optimization problem that is very hard to solve using traditional tuning techniques. In addition, several limitations of traditionally used techniques prevent the optimum design of coordinated controllers. In this paper, an alternate technique for robust damping of oscillations is presented using the backtracking search algorithm (BSA). A 5-area 16-machine benchmark power system is considered to evaluate the design efficiency. The complete design process is conducted in a linear time-invariant (LTI) model of the power system; it includes formulating the design as a multi-objective function based on the system eigenvalues. Nonlinear time-domain simulations are then used to compare the damping performance for different local and inter-area modes of power system oscillation. The performance of the BSA technique is compared against that of the popular particle swarm optimization (PSO) for coordinated design efficiency. Damping performance under the different design techniques is compared in terms of settling time and overshoot of oscillations. The results obtained verify that the BSA-based design improves system stability significantly. The stability of the multimachine power system is improved by up to 74.47% and 79.93% for an inter-area mode and a local mode of oscillation, respectively. Thus, the proposed technique for coordinated design has great potential to improve power system stability and maintain secure operation.
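The eigenvalue-based objective at the heart of such a design can be sketched directly: for each oscillatory mode sigma + j*omega of the LTI model, the damping ratio is zeta = -sigma / sqrt(sigma^2 + omega^2), and the optimizer pushes up the worst (smallest) zeta. The eigenvalues below are invented, not taken from the 16-machine benchmark.

```python
# Sketch of the damping-ratio objective used in eigenvalue-based PSS/TCSC
# tuning (illustrative eigenvalues; a real design extracts them from the
# linearized system matrix).
import math

def damping_ratio(eig):
    """zeta = -sigma / sqrt(sigma^2 + omega^2) for eigenvalue sigma + j*omega."""
    return -eig.real / math.hypot(eig.real, eig.imag)

# oscillatory modes of a hypothetical closed-loop system
modes = [complex(-0.2, 3.0), complex(-1.5, 8.0), complex(-0.05, 1.2)]
worst = min(damping_ratio(m) for m in modes)
print(round(worst, 3))   # the controller design tries to maximize this value
```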
Global Design Optimization for Aerodynamics and Rocket Propulsion Components
NASA Technical Reports Server (NTRS)
Shyy, Wei; Papila, Nilay; Vaidyanathan, Rajkumar; Tucker, Kevin; Turner, James E. (Technical Monitor)
2000-01-01
Modern computational and experimental tools for aerodynamics and propulsion applications have matured to a stage where they can provide substantial insight into engineering processes involving fluid flows, and can be fruitfully utilized to help improve the design of practical devices. In particular, rapid and continuous development in aerospace engineering demands that new design concepts be regularly proposed to meet goals for increased performance, robustness and safety while concurrently decreasing cost. To date, the majority of the effort in design optimization of fluid dynamics has relied on gradient-based search algorithms. Global optimization methods can utilize the information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle the existence of multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. However, a successful application of the global optimization method needs to address issues related to data requirements with an increase in the number of design variables, and methods for predicting the model performance. In this article, we review recent progress made in establishing suitable global optimization techniques employing neural network and polynomial-based response surface methodologies. Issues addressed include techniques for construction of the response surface, design of experiment techniques for supplying information in an economical manner, optimization procedures and multi-level techniques, and assessment of relative performance between polynomials and neural networks. Examples drawn from wing aerodynamics, turbulent diffuser flows, gas-gas injectors, and supersonic turbines are employed to help demonstrate the issues involved in an engineering design context. 
Both the usefulness of the existing knowledge to aid current design practices and the need for future research are identified.
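The polynomial response-surface construction discussed above reduces, in its simplest form, to a linear least-squares fit of a low-order polynomial to sampled design-point data. The synthetic data and the two-variable quadratic basis below are illustrative; the article's applications fit surfaces over many more design variables from CFD and experimental data.

```python
# Minimal quadratic response-surface fit via least squares (synthetic,
# noiseless data generated from a known polynomial for checkability).
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=(30, 2))                        # two design variables
y = 1.0 + 2.0*x[:, 0] - 3.0*x[:, 1] + 0.5*x[:, 0]*x[:, 1]   # true response

# design matrix for a full quadratic in two variables
A = np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                     x[:, 0]**2, x[:, 0]*x[:, 1], x[:, 1]**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coef, 3))    # recovers the generating coefficients
```

Design-of-experiment techniques choose the sample points `x` so that this fit is well conditioned with as few expensive evaluations as possible.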
Favazza, Christopher P; Yu, Lifeng; Leng, Shuai; Kofler, James M; McCollough, Cynthia H
2015-01-01
To compare computed tomography dose and noise between an automatic exposure control (AEC) system designed to maintain constant image noise as patient size varies, clinically accepted technique charts, and AEC systems designed to allow image noise to vary. A model was developed to describe tube current modulation as a function of patient thickness. Relative dose and noise values were calculated as patient width varied for AEC settings designed to yield constant or variable noise levels and were compared to empirically derived values used in our clinical practice. Phantom experiments were performed in which tube current was measured as a function of thickness using a constant-noise-based AEC system, and the results were compared with clinical technique charts. For 12-, 20-, 28-, 44-, and 50-cm patient widths, the requirement of constant noise across patient size yielded relative doses of 5%, 14%, 38%, 260%, and 549% and relative noise of 435%, 267%, 163%, 61%, and 42%, respectively, compared with our clinically used technique chart settings at each width. Experimental measurements showed that a constant-noise-based AEC system yielded 175% relative noise for a 30-cm phantom and 206% relative dose for a 40-cm phantom compared with our clinical technique chart. AEC systems that prescribe constant noise as patient size varies can yield excessive noise in small patients and excessive dose in obese patients compared with clinically accepted technique charts. Use of noise-level technique charts and tube current limits can mitigate these effects.
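The steep dose scaling reported above follows from basic attenuation physics: detected photons scale as dose times exp(-mu * width), quantum noise scales as 1/sqrt(detected photons), so holding noise constant forces dose to grow as exp(mu * width). The sketch below uses an assumed effective attenuation coefficient, not the authors' fitted tube-current model, and is only a back-of-envelope illustration.

```python
# Back-of-envelope model of constant-noise AEC dose scaling with patient
# width (MU is an assumed effective attenuation per mm, not a fitted value).
import math

MU = 0.02   # assumed effective attenuation per mm of water-equivalent width

def relative_dose_for_constant_noise(width_mm, ref_width_mm):
    # N_detected ~ dose * exp(-MU * width); noise ~ 1/sqrt(N_detected),
    # so constant noise requires dose ~ exp(MU * width)
    return math.exp(MU * (width_mm - ref_width_mm))

for w in (120, 280, 440):   # small, reference, large patient widths in mm
    print(w, round(relative_dose_for_constant_noise(w, 280), 2))
```

The qualitative behavior matches the abstract: a few percent of the reference dose for small patients and a many-fold increase for large ones.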
2011-08-01
Fragmentary program excerpt: challenges in new design methodologies, with examples involving in-circuit functional timing testing of systems with millions of cores. Session on THz detection-based techniques (Chair: Dwight Woolard, U.S. Army Research Office), including "Experimental Design of Single-Crystal DNA for THz Spectroscopy" (E. R. Brown, M. L. Norton, M. Rahman, W. Zhang, Wright).
Structured codebook design in CELP
NASA Technical Reports Server (NTRS)
Leblanc, W. P.; Mahmoud, S. A.
1990-01-01
Code-Excited Linear Prediction (CELP) is a popular analysis-by-synthesis technique for quantizing speech at bit rates from 4 to 6 kbps. Codebook design techniques to date have been largely based either on random (often Gaussian) codebooks or on known binary or ternary codes that efficiently map the space of (assumed white) excitation codevectors. It has been shown that by introducing symmetries into the codebook, good complexity reduction can be realized with only a marginal decrease in performance. Codebook design algorithms are considered for a wide range of structured codebooks.
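The analysis-by-synthesis search underlying CELP can be shown in miniature: for each codevector, compute the optimal scalar gain against a target vector and keep the codevector with the smallest squared error. The codebook and target below are toy values, and a real coder would first filter each codevector through the LPC synthesis filter, which this sketch omits.

```python
# Miniature CELP-style codebook search: pick the (codevector, gain) pair
# minimizing mean-squared error against a target (toy data, no LPC filtering).

def best_codevector(target, codebook):
    best = None
    for i, c in enumerate(codebook):
        energy = sum(v * v for v in c)
        if energy == 0:
            continue
        gain = sum(t * v for t, v in zip(target, c)) / energy  # optimal gain
        err = sum((t - gain * v) ** 2 for t, v in zip(target, c))
        if best is None or err < best[2]:
            best = (i, gain, err)
    return best[0], best[1]

codebook = [[1, 0, -1, 0], [1, 1, 1, 1], [0, 1, 0, -1]]
idx, gain = best_codevector([2.0, 0.1, -1.9, 0.0], codebook)
print(idx, round(gain, 2))
```

Structured codebooks reduce the cost of exactly this loop: symmetries let many codevectors share partial correlation and energy computations.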
Collaborative Learning in the Dance Technique Class
ERIC Educational Resources Information Center
Raman, Tanja
2009-01-01
This research was designed to enhance dance technique learning by promoting critical thinking amongst students studying on a degree programme at the University of Wales Institute, Cardiff. Students were taught Cunningham-based dance technique using pair work together with the traditional demonstration/copying method. To evaluate the study,…
NASA Technical Reports Server (NTRS)
Walker, Carrie K.
1991-01-01
A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.
NASA Technical Reports Server (NTRS)
Ostroff, A. J.
1973-01-01
Some of the major difficulties associated with large orbiting astronomical telescopes are the cost of manufacturing the primary mirror to precise tolerances and the maintaining of diffraction-limited tolerances while in orbit. One successfully demonstrated approach for minimizing these problem areas is the technique of actively deforming the primary mirror by applying discrete forces to the rear of the mirror. A modal control technique, as applied to active optics, has previously been developed and analyzed. The modal control technique represents the plant to be controlled in terms of its eigenvalues and eigenfunctions which are estimated via numerical approximation techniques. The report includes an extension of previous work using the modal control technique and also describes an optimal feedback controller. The equations for both control laws are developed in state-space differential form and include such considerations as stability, controllability, and observability. These equations are general and allow the incorporation of various mode-analyzer designs; two design approaches are presented. The report also includes a technique for placing actuator and sensor locations at points on the mirror based upon the flexibility matrix of the uncontrolled or unobserved modes of the structure. The locations selected by this technique are used in the computer runs which are described. The results are based upon three different initial error distributions, two mode-analyzer designs, and both the modal and optimal control laws.
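The force-computation step implicit in such an active-optics controller can be sketched with a small linear model: given an influence (flexibility) matrix mapping actuator forces to surface displacements at the sensor points, solve for the forces that cancel a measured figure error. The 3x3 matrix and error vector below are invented toy values, not a real mirror model.

```python
# Hedged sketch: least-squares actuator forces from a measured figure error
# via an assumed influence matrix (toy 3-actuator, 3-sensor system).
import numpy as np

# column j = surface response at the sensor points to a unit force at actuator j
A = np.array([[1.0, 0.2, 0.1],
              [0.2, 1.0, 0.2],
              [0.1, 0.2, 1.0]])
error = np.array([0.5, -0.3, 0.1])      # measured figure error at the sensors

forces = np.linalg.lstsq(A, -error, rcond=None)[0]   # forces canceling the error
residual = error + A @ forces                        # figure error after control
print(np.round(residual, 12))
```

The report's actuator/sensor placement technique amounts to choosing locations that keep a matrix like `A` well conditioned for the modes of interest.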
2D Presentation Techniques of Mind-maps for Blind Meeting Participants.
Pölzer, Stephan; Miesenberger, Klaus
2015-01-01
Mind-maps, used as an ideation technique in co-located meetings (e.g., in brainstorming sessions), which are of increasing importance in business and education, pose considerable accessibility challenges for blind meeting participants. Besides an overview of general accessibility issues in co-located meetings, this paper focuses on the design and development of alternative non-visual presentation techniques for mind-maps. The different aspects of serialized presentation techniques (e.g., treeview) for Braille and audio rendering and of two-dimensional presentation techniques (e.g., a tactile two-dimensional array matrix and the edge-projection method [1]) are discussed based on user feedback gathered in intermediate tests following a user-centered design approach.
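The serialized (treeview) presentation mentioned above can be sketched as a simple recursive rendering of the mind-map hierarchy into indented lines suitable for line-by-line Braille or audio output. The map contents below are invented; the paper's actual rendering pipeline is not shown.

```python
# Sketch of a treeview serialization of a mind-map for non-visual rendering
# (toy map; each node is a (label, children) pair).

def treeview(node, depth=0):
    """Render a nested mind-map as a list of indented lines."""
    lines = ["  " * depth + node[0]]
    for child in node[1]:
        lines.extend(treeview(child, depth + 1))
    return lines

mindmap = ("Project", [
    ("Goals", [("Accessibility", [])]),
    ("Tasks", [("Design", []), ("Testing", [])]),
])
print("\n".join(treeview(mindmap)))
```

A screen reader can then navigate the list linearly, with the indentation depth announced as the level in the hierarchy.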
Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando
2017-01-01
Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) initial conceptual stage for mapping structural criteria and parameters of the simulation training framework and scenario development (n = 16), (2) training facility design using Distributed Simulation, (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for realism of the simulation training framework, realism of the training facility and relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally.
This corresponded with the qualitative component, which augmented the quantitative findings; trainees' user feedback was used to perform iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from conceptual, development, and implementation stages, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.
CometBoards Users Manual Release 1.0
NASA Technical Reports Server (NTRS)
Guptill, James D.; Coroneos, Rula M.; Patnaik, Surya N.; Hopkins, Dale A.; Berke, Lazlo
1996-01-01
Several nonlinear mathematical programming algorithms for structural design applications are available at present. These include the sequence of unconstrained minimizations technique, the method of feasible directions, and the sequential quadratic programming technique. The optimality criteria technique and the fully utilized design concept are two other structural design methods. A project was undertaken to bring all these design methods under a common computer environment so that a designer can select any one of these tools that may be suitable for his/her application. To facilitate selection of a design algorithm, to validate and check out the computer code, and to ascertain the relative merits of the design tools, modest finite element structural analysis programs based on the concept of stiffness and integrated force methods have been coupled to each design method. The code that contains both these design and analysis tools, by reading input information from analysis and design data files, can cast the design of a structure as a minimum-weight optimization problem. The code can then solve it with a user-specified optimization technique and a user-specified analysis method. This design code is called CometBoards, which is an acronym for Comparative Evaluation Test Bed of Optimization and Analysis Routines for the Design of Structures. This manual describes for the user a step-by-step procedure for setting up the input data files and executing CometBoards to solve a structural design problem. The manual includes the organization of CometBoards; instructions for preparing input data files; the procedure for submitting a problem; illustrative examples; and several demonstration problems. A set of 29 structural design problems have been solved by using all the optimization methods available in CometBoards. A summary of the optimum results obtained for these problems is appended to this users manual. 
CometBoards, at present, is available for Posix-based Cray and Convex computers, Iris and Sun workstations, and the VM/CMS system.
Safety Guided Design of Crew Return Vehicle in Concept Design Phase Using STAMP/STPA
NASA Astrophysics Data System (ADS)
Nakao, H.; Katahira, M.; Miyamoto, Y.; Leveson, N.
2012-01-01
In the concept development and design phase of a new space system, such as a crew vehicle, designers tend to focus on how to implement new technology. Designers also consider the difficulty of using the new technology, trade off several candidate system designs, and then choose an optimal design from the candidates. Safety should be a key aspect driving optimal concept design. However, in past concept design activities, safety analyses such as FTA have not been used to drive the design, because such techniques focus on component failure, and component failure cannot be considered in the concept design phase. The solution to these problems is to apply a new hazard analysis technique called STAMP/STPA. STAMP/STPA treats safety as a control problem rather than a failure problem and identifies hazardous scenarios and their causes. Defining control flow is essential in the concept design phase, so STAMP/STPA can be a useful tool for assessing the safety of candidate systems and for providing part of the rationale for choosing a design as the system baseline. In this paper, we explain our case study of safety-guided concept design using STPA, the new hazard analysis technique, together with a model-based specification technique, on a Crew Return Vehicle design, and evaluate the benefits of using STAMP/STPA in the concept development phase.
NASA Technical Reports Server (NTRS)
Sreekanta Murthy, T.
1992-01-01
Results of the investigation of formal nonlinear programming-based numerical optimization techniques of helicopter airframe vibration reduction are summarized. The objective and constraint function and the sensitivity expressions used in the formulation of airframe vibration optimization problems are presented and discussed. Implementation of a new computational procedure based on MSC/NASTRAN and CONMIN in a computer program system called DYNOPT for optimizing airframes subject to strength, frequency, dynamic response, and dynamic stress constraints is described. An optimization methodology is proposed which is thought to provide a new way of applying formal optimization techniques during the various phases of the airframe design process. Numerical results obtained from the application of the DYNOPT optimization code to a helicopter airframe are discussed.
ERIC Educational Resources Information Center
Hitt, Fernando; Kieran, Carolyn
2009-01-01
Our research project aimed at understanding the complexity of the construction of knowledge in a CAS environment. Basing our work on the French instrumental approach, in particular the Task-Technique-Theory (T-T-T) theoretical frame as adapted from Chevallard's Anthropological Theory of Didactics, we were mindful that a careful task design process…
NanoDesign: Concepts and Software for a Nanotechnology Based on Functionalized Fullerenes
NASA Technical Reports Server (NTRS)
Globus, Al; Jaffe, Richard; Chancellor, Marisa K. (Technical Monitor)
1996-01-01
Eric Drexler has proposed a hypothetical nanotechnology based on diamond and investigated the properties of such molecular systems. While attractive, diamondoid nanotechnology is not physically accessible with straightforward extensions of current laboratory techniques. We propose a nanotechnology based on functionalized fullerenes and investigate carbon-nanotube-based gears with teeth added via a benzyne reaction known to occur with C60. The gears are single-walled carbon nanotubes with appended benzyne groups for teeth. Fullerenes are in widespread laboratory use and can be functionalized in many ways. Companion papers computationally demonstrate the properties of these gears (they appear to work) and the accessibility of the benzyne/nanotube reaction. This paper describes the molecular design techniques and rationale, as well as the software that implements them. The software is a set of persistent C++ objects controlled by Tcl command scripts. The C++/Tcl interface is automatically generated by a software system called tcl_c++, developed by the author and described here. The objects keep track of different portions of the molecular machinery so that different simulation techniques and boundary conditions can be applied as appropriate. This capability has been required to demonstrate (computationally) our gear's feasibility. A new distributed software architecture featuring a WWW universal client, CORBA distributed objects, and agent software is under consideration. The software architecture is intended to eventually enable a widely dispersed group to develop complex simulated molecular machines.
Logic-based assessment of the compatibility of UMLS ontology sources
2011-01-01
Background The UMLS Metathesaurus (UMLS-Meta) is currently the most comprehensive effort for integrating independently-developed medical thesauri and ontologies. UMLS-Meta is being used in many applications, including PubMed and ClinicalTrials.gov. The integration of new sources combines automatic techniques, expert assessment, and auditing protocols. The automatic techniques currently in use, however, are mostly based on lexical algorithms and often disregard the semantics of the sources being integrated. Results In this paper, we argue that UMLS-Meta’s current design and auditing methodologies could be significantly enhanced by taking into account the logic-based semantics of the ontology sources. We provide empirical evidence suggesting that UMLS-Meta in its 2009AA version contains a significant number of errors; these errors become immediately apparent if the rich semantics of the ontology sources is taken into account, manifesting themselves as unintended logical consequences that follow from the ontology sources together with the information in UMLS-Meta. We then propose general principles and specific logic-based techniques to effectively detect and repair such errors. Conclusions Our results suggest that the methodologies employed in the design of UMLS-Meta are not only very costly in terms of human effort, but also error-prone. The techniques presented here can be useful for both reducing human effort in the design and maintenance of UMLS-Meta and improving the quality of its contents. PMID:21388571
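The kind of unintended logical consequence described above can be illustrated on a toy scale: when is-a edges from independent sources are combined with cross-source mappings, cycles can appear in the merged hierarchy, making distinct concepts logically equivalent. The concept names and edges below are invented, and a naive transitive closure stands in for a real description-logic reasoner.

```python
# Toy detector for unintended equivalences after merging ontology sources:
# find concept pairs that are mutually reachable via is-a edges (a cycle).

def find_cycle_pairs(edges):
    """Return pairs (a, b), a < b, with a is-a b and b is-a a transitively."""
    nodes = {n for e in edges for n in e}
    reach = {n: {n} for n in nodes}
    changed = True
    while changed:                    # naive transitive closure
        changed = False
        for a, b in edges:
            new = reach[b] - reach[a]
            if new:
                reach[a] |= new
                changed = True
    return sorted((a, b) for a in nodes for b in nodes
                  if a < b and b in reach[a] and a in reach[b])

edges = [("Pneumonia", "LungDisease"),       # from source 1
         ("LungInfection", "Pneumonia"),     # from source 2
         ("LungDisease", "LungInfection")]   # erroneous cross-source mapping
cycles = find_cycle_pairs(edges)
print(cycles)
```

Logic-based auditing generalizes this idea: classify the merged ontology and flag entailments (equivalences, unsatisfiable classes) that no single source supports.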
NASA Technical Reports Server (NTRS)
Ricks, Wendell R.; Abbott, Kathy H.
1987-01-01
The software design community is greatly concerned with the costs associated with a program's execution time and implementation. It is always desirable, and sometimes imperative, to choose the programming technique that minimizes all costs for a given application or type of application. A study is described that compared cost-related factors of traditional programming techniques with those of rule-based programming techniques for a specific application. The results favored the traditional approach for execution efficiency, but favored the rule-based approach for programmer productivity (implementation ease). Although this study examined a specific application, the results should be widely applicable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Kuangcai
The goal of this study is to inform future data analysis and experiment design in rotational dynamics research using the DIC-based SPORT technique. Most current studies using DIC-based SPORT are technical demonstrations; understanding the mechanisms behind the observed rotational behaviors of the imaging probes should be the focus of future SPORT studies. More effort is still needed in the development of new imaging probes, particle-tracking methods, instrumentation, and advanced data analysis methods to further extend the potential of the DIC-based SPORT technique.
Concept of combinatorial de novo design of drug-like molecules by particle swarm optimization.
Hartenfeller, Markus; Proschak, Ewgenij; Schüller, Andreas; Schneider, Gisbert
2008-07-01
We present a fast stochastic optimization algorithm for fragment-based molecular de novo design (COLIBREE, Combinatorial Library Breeding). The search strategy is based on a discrete version of particle swarm optimization. Molecules are represented by a scaffold, which remains constant during optimization, and variable linkers and side chains. Different linkers represent virtual chemical reactions. Side-chain building blocks were obtained from pseudo-retrosynthetic dissection of large compound databases. Here, ligand-based design was performed using chemically advanced template search (CATS) topological pharmacophore similarity to reference ligands as fitness function. A weighting scheme was included for particle swarm optimization-based molecular design, which permits the use of many reference ligands and allows for positive and negative design to be performed simultaneously. In a case study, the approach was applied to the de novo design of potential peroxisome proliferator-activated receptor subtype-selective agonists. The results demonstrate the ability of the technique to cope with large combinatorial chemistry spaces and its applicability to focused library design. The technique was able to perform exploitation of a known scheme and at the same time explorative search for novel ligands within the framework of a given molecular core structure. It thereby represents a practical solution for compound screening in the early hit and lead finding phase of a drug discovery project.
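The discrete particle swarm step described above (fixed scaffold, variable building-block choices per position, fitness from similarity to reference ligands) can be sketched roughly as follows. This is an illustrative sketch, not COLIBREE itself: the fitness function, slot counts, and update probabilities below are hypothetical stand-ins for the CATS pharmacophore similarity and the paper's actual weighting scheme.

```python
import random

# Hypothetical fitness: closeness of a candidate (tuple of building-block
# indices) to a reference; stands in for the CATS pharmacophore score.
def fitness(candidate, reference):
    return -sum(abs(c - r) for c, r in zip(candidate, reference))

def discrete_pso(n_positions, n_blocks, reference, n_particles=20, iters=100):
    # Each particle is a vector of building-block indices,
    # one index per linker/side-chain slot on the fixed scaffold.
    swarm = [[random.randrange(n_blocks) for _ in range(n_positions)]
             for _ in range(n_particles)]
    pbest = list(swarm)  # personal bests
    gbest = max(swarm, key=lambda c: fitness(c, reference))
    for _ in range(iters):
        for i, particle in enumerate(swarm):
            # Discrete "velocity" update: each slot copies from the current
            # particle, its personal best, or the global best, with a small
            # chance of random mutation (exploration).
            new = [random.choice((p, pb, gb)) if random.random() < 0.9
                   else random.randrange(n_blocks)
                   for p, pb, gb in zip(particle, pbest[i], gbest)]
            swarm[i] = new
            if fitness(new, reference) > fitness(pbest[i], reference):
                pbest[i] = new
        gbest = max(pbest, key=lambda c: fitness(c, reference))
    return gbest
```

Because the representation is a vector of discrete library indices rather than continuous coordinates, the "velocity" reduces to a biased resampling rule, which is what lets the method cope with very large combinatorial chemistry spaces.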
Melt Flow Control in the Directional Solidification of Binary Alloys
NASA Technical Reports Server (NTRS)
Zabaras, Nicholas
2003-01-01
Our main project objectives are to develop computational techniques based on inverse problem theory that can be used to design directional solidification processes that lead to desired temperature gradient and growth conditions at the freezing front at various levels of gravity. It is known that control of these conditions plays a significant role in the selection of the form and scale of the obtained solidification microstructures. Emphasis is given on the control of the effects of various melt flow mechanisms on the local to the solidification front conditions. The thermal boundary conditions (furnace design) as well as the magnitude and direction of an externally applied magnetic field are the main design variables. We will highlight computational design models for sharp front solidification models and briefly discuss work in progress toward the development of design techniques for multi-phase volume-averaging based solidification models.
Nonlinear program based optimization of boost and buck-boost converter designs
NASA Astrophysics Data System (ADS)
Rahman, S.; Lee, F. C.
The utility of an Augmented Lagrangian (ALAG) multiplier-based nonlinear programming technique is demonstrated for minimum-weight design optimization of boost and buck-boost power converters. Important features of ALAG are presented in the framework of a comprehensive design example for buck-boost power converter design optimization. The study provides fresh design insight into power converters, presenting such information as the weight and loss profiles of the various semiconductor components and magnetics as functions of the switching frequency.
A review of techniques to determine alternative selection in design for remanufacturing
NASA Astrophysics Data System (ADS)
Noor, A. Z. Mohamed; Fauadi, M. H. F. Md; Jafar, F. A.; Mohamad, N. R.; Yunos, A. S. Mohd
2017-10-01
This paper discusses techniques used for optimization in manufacturing systems. Although the problem domain is sustainable manufacturing, techniques used to optimize general manufacturing systems are also discussed. Important Design for Remanufacturing (DFReM) approaches considered include indexes, weighted averages, grey decision making, and Fuzzy TOPSIS. The main limitation of the existing techniques is that most depend heavily on the decision maker's perspective: different experts may understand a criterion differently and consequently scale it differently. The objective of this paper is therefore to survey the available techniques and identify the features they lack; based on this review, a new technique is proposed to counter the shortcomings of those discussed. The paper shows that a hybrid of Fuzzy Analytic Hierarchy Process (AHP) and Artificial Neural Network (ANN) computation is suitable and fills the gap left by the techniques discussed.
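As a rough illustration of the crisp core inside a Fuzzy AHP step, the geometric-mean method derives criterion weights from a pairwise-comparison matrix. The criteria and comparison values below are hypothetical, and the fuzzification and ANN stages of the proposed hybrid are omitted.

```python
from math import prod

def ahp_weights(matrix):
    """Criterion weights from a (reciprocal) pairwise-comparison matrix
    via the geometric-mean method: normalize each row's geometric mean."""
    n = len(matrix)
    geo_means = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Hypothetical DFReM criteria: disassembly effort vs. material value vs. demand.
# Saaty-style judgments: criterion 1 is moderately-to-strongly preferred.
comparisons = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]
weights = ahp_weights(comparisons)  # weights sum to 1, in criterion order
```

In a Fuzzy AHP, each judgment would be a fuzzy (e.g. triangular) number rather than a crisp ratio, which is precisely how the hybrid softens the dependence on a single expert's scaling.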
Design, construction and evaluation of a 12.2 GHz, 4.0 kW-CW coupled-cavity traveling wave tube
NASA Technical Reports Server (NTRS)
Ayers, W. R.; Harman, W. A.
1973-01-01
An analytical and experimental program to study design techniques and to utilize these techniques to optimize the performance of an X-band 4 kW, CW traveling wave tube ultimately intended for satellite-borne television broadcast transmitters is described. The design is based on the coupled-cavity slow-wave circuit with velocity resynchronization to maximize the conversion efficiency. The design incorporates a collector which is demountable from the tube. This was done to facilitate multistage depressed collector experiments employing a NASA designed axisymmetric, electrostatic collector for linear beam microwave tubes after shipment of the tubes to NASA.
USDA-ARS?s Scientific Manuscript database
The molecular biological techniques for plasmid-based assembly and cloning of gene open reading frames are essential for elucidating the function of the proteins encoded by the genes. These techniques involve the production of full-length cDNA libraries as a source of plasmid-based clones to expres...
The design and implementation of hydrographical information management system (HIMS)
NASA Astrophysics Data System (ADS)
Sui, Haigang; Hua, Li; Wang, Qi; Zhang, Anming
2005-10-01
With the development of hydrographical work and information techniques, a large variety of hydrographical information, including electronic charts, documents, and other materials, is widely used, and traditional management modes and techniques have become unsuitable for the development of the Chinese Marine Safety Administration Bureau (CMSAB). How to manage all kinds of hydrographical information has become an important and urgent problem. A number of advanced techniques, including GIS, RS, spatial database management, and VR, are introduced to solve these problems. Design principles and key techniques of the HIMS are illustrated in detail, including a mixed mode based on B/S, C/S, and stand-alone computer modes; multi-source and multi-scale data organization and management; multi-source data integration and diverse visualization of digital charts; and efficient security control strategies. Based on the above ideas and strategies, an integrated system named the Hydrographical Information Management System (HIMS) was developed. The HIMS has been applied in the Shanghai Marine Safety Administration Bureau and has received good evaluations.
Geuna, S
2000-11-20
Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterize the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.
Low-Bit Rate Feedback Strategies for Iterative IA-Precoded MIMO-OFDM-Based Systems
Teodoro, Sara; Silva, Adão; Dinis, Rui; Gameiro, Atílio
2014-01-01
Interference alignment (IA) is a promising technique that allows high-capacity gains in interference channels, but which requires the knowledge of the channel state information (CSI) for all the system links. We design low-complexity and low-bit rate feedback strategies where a quantized version of some CSI parameters is fed back from the user terminal (UT) to the base station (BS), which shares it with the other BSs through a limited-capacity backhaul network. This information is then used by BSs to perform the overall IA design. With the proposed strategies, we only need to send part of the CSI information, and this can even be sent only once for a set of data blocks transmitted over time-varying channels. These strategies are applied to iterative MMSE-based IA techniques for the downlink of broadband wireless OFDM systems with limited feedback. A new robust iterative IA technique, where channel quantization errors are taken into account in IA design, is also proposed and evaluated. With our proposed strategies, we need a small number of quantization bits to transmit and share the CSI, when comparing with the techniques used in previous works, while allowing performance close to the one obtained with perfect channel knowledge. PMID:24678274
Logic Design Pathology and Space Flight Electronics
NASA Technical Reports Server (NTRS)
Katz, Richard B.; Barto, Rod L.; Erickson, Ken
1999-01-01
This paper presents a look at logic design from early in the US Space Program and examines faults in recent logic designs. Most examples are based on flight hardware failures and analysis of new tools and techniques. The paper is presented in viewgraph form.
Hardware Implementation of 32-Bit High-Speed Direct Digital Frequency Synthesizer
Ibrahim, Salah Hasan; Ali, Sawal Hamid Md.; Islam, Md. Shabiul
2014-01-01
The design and implementation of a high-speed direct digital frequency synthesizer are presented. A modified Brent-Kung parallel adder is combined with a pipelining technique to improve the speed of the system. A gated-clock technique is proposed to reduce the number of registers in the phase accumulator design. The quarter-wave symmetry technique is used to store only one quarter of the sine wave. The ROM lookup table (LUT) is partitioned into three 4-bit sub-ROMs based on an angular decomposition technique and a trigonometric identity. Exploiting the symmetry between sine and cosine together with XOR logic gates, one sub-ROM block can be removed from the design. These techniques compress the ROM to 368 bits, a compression ratio of 534.2:1, using only two adders, two multipliers, and XOR gates, while achieving a high frequency resolution of 0.029 Hz. These features make the direct digital frequency synthesizer an attractive candidate for wireless communication applications. PMID:24991635
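The quarter-wave symmetry trick can be sketched in software as follows: store only the first quadrant of the sine, use the two MSBs of the phase accumulator to select the quadrant, and recover the other three quadrants by address mirroring and sign negation. The LUT depth and amplitude width here are illustrative; this sketch does not reproduce the paper's angular-decomposition sub-ROM partitioning or its 368-bit compressed ROM.

```python
import math

PHASE_BITS = 32      # 32-bit phase accumulator, as in the paper
LUT_ADDR_BITS = 8    # quarter-wave LUT depth (illustrative size)
AMP_BITS = 10        # output amplitude width (illustrative)

# Store only the first quadrant of the sine wave.
QUARTER_LUT = [round((2**AMP_BITS - 1) *
                     math.sin(math.pi / 2 * i / 2**LUT_ADDR_BITS))
               for i in range(2**LUT_ADDR_BITS)]

def sine_sample(phase):
    """Reconstruct a full sine period from the quarter-wave LUT."""
    quadrant = (phase >> (PHASE_BITS - 2)) & 0b11
    index = (phase >> (PHASE_BITS - 2 - LUT_ADDR_BITS)) & (2**LUT_ADDR_BITS - 1)
    if quadrant in (1, 3):                       # mirror the LUT address
        index = (2**LUT_ADDR_BITS - 1) - index
    value = QUARTER_LUT[index]
    return -value if quadrant >= 2 else value    # negate the second half-period

def ddfs(tuning_word, n_samples):
    """Phase accumulator: output frequency = f_clk * tuning_word / 2**32,
    giving the 2**-32 relative frequency resolution cited in the abstract."""
    phase, out = 0, []
    for _ in range(n_samples):
        out.append(sine_sample(phase))
        phase = (phase + tuning_word) & (2**PHASE_BITS - 1)
    return out
```

With a tuning word of 2**24, the phase wraps every 256 samples, so `ddfs(2**24, 256)` yields exactly one sine period.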
NASA Technical Reports Server (NTRS)
Vishida, J. M.; Brodersen, L. K.
1974-01-01
An analytical and experimental program is described for studying design techniques to optimize the conversion efficiency of klystron amplifiers, and for utilizing these techniques in the development and fabrication of an X-band 4 kW CW klystron for use in satellite-borne television broadcast transmitters. The design is based on a technique for increasing the RF beam current by using the second-harmonic space-charge forces in the bunched beam. A method to enhance circuit efficiency in the klystron cavities was also analyzed experimentally. The design incorporates a collector that is demountable from the tube to facilitate multistage depressed-collector experiments employing an axisymmetric, electrostatic collector for linear-beam microwave tubes.
Design of a laser system for instantaneous location of a longwall shearer
NASA Technical Reports Server (NTRS)
Stein, R.
1981-01-01
Calculations and measurements were made for the design of a laser system for instantaneous location of a longwall shearer. The designs determine shearer location to approximately one foot, and the roll, pitch, and yaw angles of the shearer track to approximately two degrees. The first technique uses the water target system, with a single silicon sensor and three gallium arsenide laser beams. The second technique is based on an arrangement similar to that employed in aircraft omnidirectional position finding: the angle between two points is determined by combining information in an omnidirectional flash with a scanned, narrow-beam beacon. It is concluded that this approach maximizes the signal levels.
Fortuna, A O; Gurd, J R
1999-01-01
During certain medical procedures, it is important to continuously measure the respiratory flow of a patient, as lack of proper ventilation can cause brain damage and ultimately death. The monitoring of the ventilatory condition of a patient is usually performed with the aid of flowmeters. However, water and other secretions present in the expired air can build up and ultimately block a traditional, restriction-based flowmeter; by using an orifice plate flowmeter, such blockages are minimized. This paper describes the design of an orifice plate flowmetering system including, especially, a description of the numerical and computational techniques adopted in order to simulate human respiratory and sinusoidal air flow across various possible designs for the orifice plate flowmeter device. Parallel computation and multigrid techniques were employed in order to reduce execution time. The simulated orifice plate was later built and tested under unsteady sinusoidal flows. Experimental tests show reasonable agreement with the numerical simulation, thereby reinforcing the general hypothesis that computational exploration of the design space is sufficiently accurate to allow designers of such systems to use this in preference to the more traditional, mechanical prototyping techniques.
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1988-01-01
The Rubber Airplane program, which combines two symbolic processing techniques with a component-based database of design knowledge, is proposed as a computer aid for conceptual design. Using object-oriented programming, programs are organized around the objects and behavior to be simulated, and using constraint propagation, declarative statements designate mathematical relationships among all the equation variables. It is found that the additional level of organizational structure resulting from the arrangement of the design information in terms of design components provides greater flexibility and convenience.
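Constraint propagation of the kind described, where a declarative mathematical relationship among variables can be evaluated in whichever direction the known values allow, can be sketched minimally as follows. The weight/wing-area relation is a hypothetical example for illustration, not Rubber Airplane's actual component database or constraint language.

```python
class Cell:
    """A variable in the constraint network; propagates when it gets a value."""
    def __init__(self, name):
        self.name, self.value, self.constraints = name, None, []
    def set(self, value):
        if self.value is None:
            self.value = value
            for c in self.constraints:   # wake every attached constraint
                c.propagate()

class Product:
    """Declarative constraint a * b = c: setting any two cells deduces the third."""
    def __init__(self, a, b, c):
        self.a, self.b, self.c = a, b, c
        for cell in (a, b, c):
            cell.constraints.append(self)
    def propagate(self):
        a, b, c = self.a.value, self.b.value, self.c.value
        if a is not None and b is not None and c is None:
            self.c.set(a * b)
        elif a is not None and c is not None and b is None:
            self.b.set(c / a)
        elif b is not None and c is not None and a is None:
            self.a.set(c / b)

# Hypothetical conceptual-design relation: weight = wing_area * wing_loading.
weight, area, loading = Cell("W"), Cell("S"), Cell("W/S")
Product(area, loading, weight)
area.set(20.0)      # m^2 (known)
weight.set(9000.0)  # N   (known) -> wing loading is deduced automatically
```

The designer states the relation once; which variable is "output" is decided at run time by which values happen to be known, which is what makes the declarative style convenient for conceptual design.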
Application of additive laser technologies in the gas turbine blades design process
NASA Astrophysics Data System (ADS)
Shevchenko, I. V.; Rogalev, A. N.; Osipov, S. K.; Bychkov, N. M.; Komarov, I. I.
2017-11-01
The emergence of modern innovative technologies requires both the creation of new design and production processes and the modernization of existing ones. This is especially relevant for the design of high-temperature turbines for gas turbine engines, whose development is characterized by a transition to higher working-medium parameters in order to improve efficiency. This article presents a design technique for gas turbine blades based on predictive verification of the thermal and hydraulic models of their cooling systems, by testing a blade prototype fabricated using selective laser melting. The technique was proven during development of the first-stage blade cooling system for the high-pressure turbine. An experimental procedure was developed for verifying a thermal model of blades with convective cooling systems, based on comparing the heat-flux density obtained from numerical simulation with the results of tests in a liquid-metal thermostat. The technique makes it possible to obtain an experimentally verified blade design and to avoid experimental adjustment after the start of mass production.
H(2)- and H(infinity)-design tools for linear time-invariant systems
NASA Technical Reports Server (NTRS)
Ly, Uy-Loi
1989-01-01
Recent advances in optimal control have brought design techniques based on optimization of H(2) and H(infinity) norm criteria closer to being attractive alternatives to single-loop design methods for linear time-invariant systems. Significant steps forward in this technology are a deeper understanding of the performance and robustness issues of these design procedures and the means to perform design trade-offs. However, acceptance of the technology is hindered by the lack of convenient design tools for exercising these powerful multivariable techniques while still allowing single-loop design formulation. Presented is a unique computer tool for designing arbitrary low-order linear time-invariant controllers that encompasses both performance and robustness issues via the familiar H(2) and H(infinity) norm optimization. Application to disturbance-rejection design for a commercial transport is demonstrated.
ROENTGEN: case-based reasoning and radiation therapy planning.
Berger, J.
1992-01-01
ROENTGEN is a design assistant for radiation therapy planning which uses case-based reasoning, an artificial intelligence technique. It learns both from specific problem-solving experiences and from direct instruction from the user. The first sort of learning is the normal case-based method of storing problem solutions so that they can be reused. The second sort is necessary because ROENTGEN does not, initially, have an internal model of the physics of its problem domain. This dependence on explicit user instruction brings to the forefront representational questions regarding indexing, failure definition, failure explanation and repair. This paper presents the techniques used by ROENTGEN in its knowledge acquisition and design activities. PMID:1482869
The Business Flight Simulator.
ERIC Educational Resources Information Center
Dwyer, P.; Simpson, D.
1989-01-01
The authors describe a simulation program based on a workshop approach designed for postsecondary business students. Features and benefits of the workshop technique are discussed. The authors cover practical aspects of designing and implementing simulation workshops. (CH)
Inquiry-based experiments for large-scale introduction to PCR and restriction enzyme digests.
Johanson, Kelly E; Watt, Terry J
2015-01-01
Polymerase chain reaction and restriction endonuclease digest are important techniques that should be included in all Biochemistry and Molecular Biology laboratory curriculums. These techniques are frequently taught at an advanced level, requiring many hours of student and faculty time. Here we present two inquiry-based experiments that are designed for introductory laboratory courses and combine both techniques. In both approaches, students must determine the identity of an unknown DNA sequence, either a gene sequence or a primer sequence, based on a combination of PCR product size and restriction digest pattern. The experimental design is flexible, and can be adapted based on available instructor preparation time and resources, and both approaches can accommodate large numbers of students. We implemented these experiments in our courses with a combined total of 584 students and have an 85% success rate. Overall, students demonstrated an increase in their understanding of the experimental topics, ability to interpret the resulting data, and proficiency in general laboratory skills. © 2015 The International Union of Biochemistry and Molecular Biology.
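The identification logic the students apply, matching an unknown by PCR product size plus restriction-fragment pattern, can be mimicked in silico as below. The sequences and enzyme site are illustrative, not the course's actual materials, and cutting at the start of the recognition site is a simplification (real enzymes cut at a defined offset within the site).

```python
def digest_sizes(seq, site):
    """Fragment lengths after cutting seq at every occurrence of site
    (simplified: the cut is placed at the start of the site)."""
    cuts, start = [], 0
    while (i := seq.find(site, start)) != -1:
        cuts.append(i)
        start = i + 1
    bounds = [0] + cuts + [len(seq)]
    return sorted(bounds[j + 1] - bounds[j] for j in range(len(bounds) - 1))

def identify(observed_size, observed_fragments, candidates, site):
    """Names of candidate sequences consistent with the gel data:
    PCR product length AND restriction-digest pattern must both match."""
    return [name for name, seq in candidates.items()
            if len(seq) == observed_size
            and digest_sizes(seq, site) == sorted(observed_fragments)]

# Toy candidates: only geneA carries the (EcoRI-like) GAATTC site.
candidates = {"geneA": "AAAAGAATTCAAAA", "geneB": "AAAATTTTAAAATT"}
matches = identify(14, [10, 4], candidates, "GAATTC")
```

Note that size alone cannot distinguish the two 14-bp candidates; it is the digest pattern that resolves the identity, mirroring the experimental design.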
A TEMPLATE-BASED FABRICATION TECHNIQUE FOR SPATIALLY-DESIGNED POLYMER MICRO/NANOFIBER COMPOSITES
Naik, Nisarga; Caves, Jeff; Kumar, Vivek; Chaikof, Elliot; Allen, Mark G.
2013-01-01
This paper reports a template-based technique for the fabrication of polymer micro/nanofiber composites, exercising control over the fiber dimensions and alignment. Unlike conventional spinning-based methods of fiber production, the presented approach is based on micro-transfer molding. It is a parallel processing technique capable of producing fibers with control over both in-plane and out-of-plane geometries, in addition to packing density and layout of the fibers. Collagen has been used as a test polymer to demonstrate the concept. Hollow and solid collagen fibers with various spatial layouts have been fabricated. Produced fibers have widths ranging from 2 µm to 50 µm, and fiber thicknesses ranging from 300 nm to 3 µm. Also, three-dimensionality of the process has been demonstrated by producing in-plane serpentine fibers with designed arc lengths, out-of-plane wavy fibers, fibers with focalized particle encapsulation, and porous fibers with desired periodicity and pore sizes. PMID:24533428
NASA Technical Reports Server (NTRS)
Grissom, D. S.; Schneider, W. C.
1971-01-01
The determination of a base line (minimum weight) design for the primary structure of the living quarters modules in an earth-orbiting space base was investigated. Although the design is preliminary in nature, the supporting analysis is sufficiently thorough to provide a reasonably accurate weight estimate of the major components that are considered to comprise the structural weight of the space base.
Physical-level synthesis for digital lab-on-a-chip considering variation, contamination, and defect.
Liao, Chen; Hu, Shiyan
2014-03-01
Microfluidic lab-on-a-chips have been widely utilized in biochemical analysis and human health studies due to high detection accuracy, high timing efficiency, and low cost. The increasing design complexity of lab-on-a-chips necessitates the computer-aided design (CAD) methodology in contrast to the classical manual design methodology. A key part in lab-on-a-chip CAD is physical-level synthesis. It includes the lab-on-a-chip placement and routing, where placement is to determine the physical location and the starting time of each operation and routing is to transport each droplet from the source to the destination. In the lab-on-a-chip design, variation, contamination, and defect need to be considered. This work designs a physical-level synthesis flow which simultaneously considers variation, contamination, and defect of the lab-on-a-chip design. It proposes a maze routing based, variation, contamination, and defect aware droplet routing technique, which is seamlessly integrated into an existing placement technique. The proposed technique improves the placement solution for routing and achieves the placement and routing co-optimization to handle variation, contamination, and defect. The simulation results demonstrate that our technique does not use any defective/contaminated grids, while the technique without considering contamination and defect uses 17.0% of the defective/contaminated grids on average. In addition, our routing variation aware technique significantly improves the average routing yield by 51.2% with only 3.5% increase in completion time compared to a routing variation unaware technique.
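A minimal sketch of the maze-routing idea, shortest-path droplet routing on the chip grid while avoiding cells flagged by the defect/contamination maps, is shown below. The grid representation and blocked-cell input format are assumptions for illustration; the variation-aware cost terms and placement co-optimization of the actual technique are omitted.

```python
from collections import deque

def route_droplet(grid_w, grid_h, source, target, blocked):
    """Breadth-first maze routing: shortest droplet path from source to
    target on a grid_w x grid_h grid, never entering a blocked cell.
    `blocked` is a set of (x, y) cells flagged as defective/contaminated."""
    frontier = deque([source])
    came_from = {source: None}
    while frontier:
        cell = frontier.popleft()
        if cell == target:               # reconstruct the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < grid_w and 0 <= nxt[1] < grid_h
                    and nxt not in blocked and nxt not in came_from):
                came_from[nxt] = cell
                frontier.append(nxt)
    return None   # no defect-free route exists
```

Because BFS explores cells in order of distance, the first time the target is reached the path is guaranteed shortest, which is the classic maze-routing (Lee-style) property the flow builds on.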
NASA Technical Reports Server (NTRS)
Sellers, William L., III; Dwoyer, Douglas L.
1992-01-01
The design of a hypersonic aircraft poses unique challenges to the engineering community. Problems with duplicating flight conditions in ground based facilities have made performance predictions risky. Computational fluid dynamics (CFD) has been proposed as an additional means of providing design data. At the present time, CFD codes are being validated based on sparse experimental data and then used to predict performance at flight conditions with generally unknown levels of uncertainty. This paper will discuss the facility and measurement techniques that are required to support CFD development for the design of hypersonic aircraft. Illustrations are given of recent success in combining experimental and direct numerical simulation in CFD model development and validation for hypersonic perfect gas flows.
Efficient Testing Combining Design of Experiment and Learn-to-Fly Strategies
NASA Technical Reports Server (NTRS)
Murphy, Patrick C.; Brandon, Jay M.
2017-01-01
Rapid modeling and efficient testing methods are important in a number of aerospace applications. In this study efficient testing strategies were evaluated in a wind tunnel test environment and combined to suggest a promising approach for both ground-based and flight-based experiments. Benefits of using Design of Experiment techniques, well established in scientific, military, and manufacturing applications are evaluated in combination with newly developing methods for global nonlinear modeling. The nonlinear modeling methods, referred to as Learn-to-Fly methods, utilize fuzzy logic and multivariate orthogonal function techniques that have been successfully demonstrated in flight test. The blended approach presented has a focus on experiment design and identifies a sequential testing process with clearly defined completion metrics that produce increased testing efficiency.
Investigation of optical/infrared sensor techniques for application satellites
NASA Technical Reports Server (NTRS)
Kaufman, I.
1972-01-01
A method of scanning an optical sensor array by acoustic surface waves is discussed. The data cover a detailed computer-based analysis of the operation of a multielement acoustic surface-wave-scanned optical sensor; the development of design and operation techniques that were used to show the feasibility of an integrated array and to design several such arrays; and experimental verification of a number of the calculations with discrete sensor devices.
Shuttle wave experiments. [space plasma investigations: design and instrumentation
NASA Technical Reports Server (NTRS)
Calvert, W.
1976-01-01
Wave experiments on the Shuttle are needed to verify dispersion relations, to study nonlinear and exotic phenomena, to support other plasma experiments, and to test engineering designs. Techniques based on coherent detection and bistatic geometry are described. New instrumentation required to provide modules for a variety of missions and to incorporate advanced signal processing and control techniques is discussed. An experiment for Z-to-O mode coupling is included.
NASA Astrophysics Data System (ADS)
Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan
2016-01-01
An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.
Lunar Contour Crafting: A Novel Technique for ISRU-Based Habitat Development
NASA Technical Reports Server (NTRS)
Khoshnevis, Behrokh; Bodiford, Melanie P.; Burks, Kevin H.; Ethridge, Ed; Tucker, Dennis; Kim, Won; Toutanji, Houssam; Fiske, Michael R.
2004-01-01
As the nation prepares to return to the Moon, it is apparent that the viability of long duration visits with appropriate radiation shielding/crew protection, hinges on the development of Lunar structures, preferably in advance of a manned landing, and preferably utilizing in-situ resources. Contour Crafting is a USC-patented technique for automated development of terrestrial concrete-based structures. The process is relatively fast, completely automated, and supports the incorporation of various infrastructure elements such as plumbing and electrical wiring. This paper will present a conceptual design of a Lunar Contour Crafting system designed to autonomously fabricate integrated structures on the Lunar surface using high-strength concrete based on Lunar regolith, including glass reinforcement rods or fibers fabricated from melted regolith. Design concepts will be presented, as well as results of initial tests aimed at concrete and glass production using Lunar regolith simulant. Key issues and concerns will be presented, along with design concepts for an LCC testbed to be developed at MSFC's Prototype Development Laboratory (PDL).
Game-Based Learning in Science Education: A Review of Relevant Research
ERIC Educational Resources Information Center
Li, Ming-Chaun; Tsai, Chin-Chung
2013-01-01
The purpose of this study is to review empirical research articles regarding game-based science learning (GBSL) published from 2000 to 2011. Thirty-one articles were identified through the Web of Science and SCOPUS databases. A qualitative content analysis technique was adopted to analyze the research purposes and designs, game design and…
Favazza, Christopher P.; Yu, Lifeng; Leng, Shuai; Kofler, James M.; McCollough, Cynthia H.
2015-01-01
Objective To compare computed tomography dose and noise arising from use of an automatic exposure control (AEC) system designed to maintain constant image noise as patient size varies with clinically accepted technique charts and AEC systems designed to vary image noise. Materials and Methods A model was developed to describe tube current modulation as a function of patient thickness. Relative dose and noise values were calculated as patient width varied for AEC settings designed to yield constant or variable noise levels and were compared to empirically derived values used by our clinical practice. Phantom experiments were performed in which tube current was measured as a function of thickness using a constant-noise-based AEC system and the results were compared with clinical technique charts. Results For 12-, 20-, 28-, 44-, and 50-cm patient widths, the requirement of constant noise across patient size yielded relative doses of 5%, 14%, 38%, 260%, and 549% and relative noises of 435%, 267%, 163%, 61%, and 42%, respectively, as compared with our clinically used technique chart settings at each respective width. Experimental measurements showed that a constant noise–based AEC system yielded 175% relative noise for a 30-cm phantom and 206% relative dose for a 40-cm phantom compared with our clinical technique chart. Conclusions Automatic exposure control systems that prescribe constant noise as patient size varies can yield excessive noise in small patients and excessive dose in obese patients compared with clinically accepted technique charts. Use of noise-level technique charts and tube current limits can mitigate these effects. PMID:25938214
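The dose-noise trade-off the authors quantify follows from a simple attenuation argument; a minimal sketch, assuming a single effective attenuation coefficient (the value of MU is illustrative, not taken from the study):

```python
import math

MU = 0.2  # assumed effective attenuation coefficient in 1/cm (illustrative)

def relative_dose_constant_noise(width_cm, ref_width_cm):
    """Relative dose required to hold image noise constant.

    Assumes detected signal ~ dose * exp(-MU * width) and
    noise ~ 1 / sqrt(detected signal), so the dose must grow
    exponentially as patient width increases."""
    return math.exp(MU * (width_cm - ref_width_cm))

def relative_noise_constant_dose(width_cm, ref_width_cm):
    """Relative noise when the dose is held at the reference level."""
    return math.exp(0.5 * MU * (width_cm - ref_width_cm))
```

Under this toy model, holding noise constant for a patient 20 cm wider than the reference requires roughly exp(0.2 x 20), about 55 times the dose, qualitatively matching the steep dose growth reported for obese patients.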
Acquiring Software Design Schemas: A Machine Learning Perspective
NASA Technical Reports Server (NTRS)
Harandi, Mehdi T.; Lee, Hing-Yan
1991-01-01
In this paper, we describe an approach based on machine learning that acquires software design schemas from design cases of existing applications. An overview of the technique, design representation, and acquisition system are presented. The paper also addresses issues associated with generalizing common features, such as biases. The generalization process is illustrated using an example.
NASA Astrophysics Data System (ADS)
Koziel, Slawomir; Bekasiewicz, Adrian
2016-10-01
Multi-objective optimization of antenna structures is a challenging task owing to the high computational cost of evaluating the design objectives as well as the large number of adjustable parameters. Design speed-up can be achieved by means of surrogate-based optimization techniques. In particular, a combination of variable-fidelity electromagnetic (EM) simulations, design space reduction techniques, response surface approximation models and design refinement methods permits identification of the Pareto-optimal set of designs within a reasonable timeframe. Here, a study concerning the scalability of surrogate-assisted multi-objective antenna design is carried out based on a set of benchmark problems, with the dimensionality of the design space ranging from six to 24 and a CPU cost of the EM antenna model from 10 to 20 min per simulation. Numerical results indicate that the computational overhead of the design process increases more or less quadratically with the number of adjustable geometric parameters of the antenna structure at hand, which is a promising result from the point of view of handling even more complex problems.
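Response surface approximation, the core speed-up named above, can be illustrated in one dimension: fit a cheap quadratic to a handful of expensive "EM simulations" and optimize the surrogate instead. A deliberately simplified sketch (real surrogate frameworks fit multivariate models over reduced design spaces):

```python
def fit_parabola(p0, p1, p2):
    """Exact quadratic a*x^2 + b*x + c through three (x, y) samples;
    returns (a, b), which is all the stationary point needs."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    d = (x0 - x1) * (x0 - x2) * (x1 - x2)
    a = (x2 * (y1 - y0) + x1 * (y0 - y2) + x0 * (y2 - y1)) / d
    b = (x2 ** 2 * (y0 - y1) + x1 ** 2 * (y2 - y0) + x0 ** 2 * (y1 - y2)) / d
    return a, b

def surrogate_minimizer(expensive_f, xs):
    """Run three expensive evaluations, fit a local response surface,
    and return its stationary point as the predicted optimum."""
    samples = [(x, expensive_f(x)) for x in xs]
    a, b = fit_parabola(*samples)
    return -b / (2.0 * a)
```

Three surrogate-building evaluations replace the many evaluations a direct search would spend on a 10-20 min EM model, which is the economics the scalability study quantifies.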
Logan, Heather; Wolfaardt, Johan; Boulanger, Pierre; Hodgetts, Bill; Seikaly, Hadi
2013-06-19
It is important to understand the perceived value of surgical design and simulation (SDS) amongst surgeons, as this will influence its implementation in clinical settings. The purpose of the present study was to examine the application of the convergent interview technique in the field of surgical design and simulation and evaluate whether the technique would uncover new perceptions of virtual surgical planning (VSP) and medical models not discovered by other qualitative case-based techniques. Five surgeons were asked to participate in the study. Each participant was interviewed following the convergent interview technique. After each interview, the interviewer interpreted the information by seeking agreements and disagreements among the interviewees in order to understand the key concepts in the field of SDS. Fifteen important issues were extracted from the convergent interviews. In general, the convergent interview was an effective technique in collecting information about the perception of clinicians. The study identified three areas where the technique could be improved upon for future studies in the SDS field.
Lin, Yang-Cheng; Yeh, Chung-Hsing; Wang, Chen-Cheng; Wei, Chun-Chun
2012-01-01
How to design highly reputable and hot-selling products is an essential issue in product design. Whether consumers choose a product depends largely on their perception of the product image. A consumer-oriented design approach presented in this paper helps product designers incorporate consumers' perceptions of product forms in the design process. The consumer-oriented design approach uses quantification theory type I, grey prediction (the linear modeling technique), and neural networks (the nonlinear modeling technique) to determine the optimal form combination of product design for matching a given product image. An experimental study based on the concept of Kansei Engineering is conducted to collect numerical data for examining the relationship between consumers' perception of product image and product form elements of personal digital assistants (PDAs). The result of performance comparison shows that the QTTI model is good enough to help product designers determine the optimal form combination of product design. Although the PDA form design is used as a case study, the approach is applicable to other consumer products with various design elements and product images. The approach provides an effective mechanism for facilitating the consumer-oriented product design process.
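Of the three modeling techniques named, grey prediction is the most compact to illustrate. A minimal GM(1,1) sketch in the standard textbook formulation (not the authors' exact implementation; it assumes a positive, roughly exponential sequence):

```python
import math

def gm11_predict(x0):
    """One-step-ahead forecast with the GM(1,1) grey model.

    x0: observed sequence of positive values (at least 4).
    Fits x0[k] = -a*z[k] + b by least squares over the accumulated
    sequence, then returns the predicted next x0 value."""
    n = len(x0)
    x1, s = [], 0.0
    for v in x0:               # accumulated generating operation
        s += v
        x1.append(s)
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # mean sequence
    y = x0[1:]
    # 2x2 normal equations for the least-squares fit
    m = len(z)
    szz = sum(zi * zi for zi in z)
    sz = sum(z)
    szy = sum(zi * yi for zi, yi in zip(z, y))
    sy = sum(y)
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    c = b / a
    # closed-form response: x1_hat(k+1) = (x0[0] - c) * exp(-a*k) + c
    x1_next = (x0[0] - c) * math.exp(-a * n) + c
    x1_last = (x0[0] - c) * math.exp(-a * (n - 1)) + c
    return x1_next - x1_last
```

For a near-geometric series the model recovers the growth rate almost exactly, which is why grey prediction suits the small Kansei data sets used in such studies.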
NASA Astrophysics Data System (ADS)
Vidal, Borja; Lafuente, Juan A.
2016-03-01
A simple technique to avoid color limitations in image capture systems based on chroma key video composition using retroreflective screens and light-emitting diode (LED) rings is proposed and demonstrated. The combination of an asynchronous temporal modulation of the background illumination and simple image processing removes the usual restrictions on foreground colors in the scene. The technique removes technical constraints on stage composition, allowing its design to be purely based on artistic grounds. Since it only requires adding a very simple electronic circuit to widely used chroma keying hardware based on retroreflective screens, the technique is easily applicable to TV and filming studios.
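The modulation scheme can be sketched in a few lines: the LED ring toggles the retroreflective screen's brightness between consecutive frames while the foreground stays stable, so per-pixel differencing recovers the background mask regardless of foreground color. A simplified grayscale sketch (the threshold and list-of-lists frame representation are illustrative assumptions):

```python
def background_mask(frame_a, frame_b, threshold=30):
    """Classify pixels of two consecutive frames: pixels whose gray
    level flips between frames are taken to be the modulated
    retroreflective background; stable pixels are foreground."""
    return [[abs(a - b) > threshold for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

def composite(frame, mask, replacement):
    """Substitute the masked background pixels with replacement imagery."""
    return [[r if m else f for f, m, r in zip(fr, mr, rr)]
            for fr, mr, rr in zip(frame, mask, replacement)]
```

A green-clad foreground actor would survive this keying, since the decision is temporal rather than chromatic.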
MANTA, a novel plug-based vascular closure device for large bore arteriotomies: technical report.
van Gils, Lennart; Daemen, Joost; Walters, Greg; Sorzano, Todd; Grintz, Todd; Nardone, Sam; Lenzen, Mattie; De Jaegere, Peter P T; Roubin, Gary; Van Mieghem, Nicolas M
2016-09-18
Catheter-based interventions have become a less invasive alternative to conventional surgical techniques for a wide array of cardiovascular diseases but often create large arteriotomies. A completely percutaneous technique is attractive as it may reduce the overall complication rate and procedure time. Currently, large bore arteriotomy closure relies on suture-based techniques. Access-site complications are not uncommon and often seem related to closure device failure. The MANTA VCD is a novel collagen-based closure device that specifically targets arteriotomies between 10 and 22 Fr. This technical report discusses the MANTA design concept, practical instructions for use and preliminary clinical experience.
Property-driven functional verification technique for high-speed vision system-on-chip processor
NASA Astrophysics Data System (ADS)
Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian
2017-04-01
The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in the vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. The complexity of vision chip verification is also related to the fact that in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area, while design functional verification is not explicitly considered at the earlier stages, at which the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can reduce the verification effort by up to 20% for a complex vision chip design while also reducing the simulation and debugging overheads.
An advanced artificial intelligence tool for menu design.
Khan, Abdus Salam; Hoffmann, Achim
2003-01-01
Computer-assisted menu design remains a difficult task. Usually, the knowledge that aids menu design by a computer is hard-coded, so a computerised menu planner cannot handle the menu design problem for an unanticipated client. To address this problem we developed a menu design tool, MIKAS (menu construction using incremental knowledge acquisition system), an artificial intelligence system that allows the incremental development of a knowledge base for menu design. We allow an incremental knowledge acquisition process in which the expert is only required to provide hints to the system in the context of actual problem instances during menu design, using menus stored in a so-called Case Base. Our system incorporates Case-Based Reasoning (CBR), an Artificial Intelligence (AI) technique developed to mimic human problem-solving behaviour. Ripple Down Rules (RDR) are a proven technique for acquiring classification knowledge directly from experts while they are using the system, and they complement CBR in a very fruitful way. This combination allows the incremental improvement of the menu design system while it is already in routine use. We believe MIKAS enables better dietary practice by leveraging a dietitian's skills and expertise. As such, MIKAS has the potential to be helpful for any institution where dietary advice is practised.
Model-based Acceleration Control of Turbofan Engines with a Hammerstein-Wiener Representation
NASA Astrophysics Data System (ADS)
Wang, Jiqiang; Ye, Zhifeng; Hu, Zhongzhi; Wu, Xin; Dimirovsky, Georgi; Yue, Hong
2017-05-01
Acceleration control of turbofan engines is conventionally designed through either a schedule-based or an acceleration-based approach. With the widespread acceptance of model-based design in the aviation industry, it becomes necessary to investigate the issues associated with model-based design for acceleration control. In this paper, the challenges of implementing model-based acceleration control are explained; a novel Hammerstein-Wiener representation of engine models is introduced; and, based on the Hammerstein-Wiener model, a nonlinear generalized minimum variance type of optimal control law is derived. A feature of the proposed approach is that it does not require the inversion operation that usually upsets nonlinear control techniques. The effectiveness of the proposed control design method is validated through a detailed numerical study.
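A Hammerstein-Wiener model sandwiches a linear dynamic block between two static nonlinearities. A minimal discrete-time sketch (the nonlinearities and time constant are illustrative placeholders, not the engine model from the paper):

```python
import math

def simulate_hw(u_seq, f_in=lambda u: u * abs(u), alpha=0.8,
                f_out=math.tanh):
    """Hammerstein-Wiener simulation: a static input nonlinearity
    (Hammerstein block), first-order linear dynamics
    x[k+1] = alpha*x[k] + (1-alpha)*v[k], then a static output
    nonlinearity (Wiener block)."""
    x = 0.0
    y = []
    for u in u_seq:
        v = f_in(u)                       # static input map
        x = alpha * x + (1 - alpha) * v   # linear dynamic block
        y.append(f_out(x))                # static output map
    return y
```

Because the structure isolates the nonlinearities at the boundaries, control laws can act on the linear block, which is the feature that lets the paper's design avoid explicit model inversion.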
Evaluation of architectures for an ASP MPEG-4 decoder using a system-level design methodology
NASA Astrophysics Data System (ADS)
Garcia, Luz; Reyes, Victor; Barreto, Dacil; Marrero, Gustavo; Bautista, Tomas; Nunez, Antonio
2005-06-01
Trends in multimedia consumer electronics, digital video and audio, aim to reach users through low-cost mobile devices connected to data broadcasting networks with limited bandwidth. An emergent broadcasting network is the digital audio broadcasting network (DAB) which provides CD quality audio transmission together with robustness and efficiency techniques to allow good quality reception in motion conditions. This paper focuses on the system-level evaluation of different architectural options to allow low bandwidth digital video reception over DAB, based on video compression techniques. Profiling and design space exploration techniques are applied over the ASP MPEG-4 decoder in order to find out the best HW/SW partition given the application and platform constraints. An innovative SystemC-based system-level design tool, called CASSE, is being used for modelling, exploration and evaluation of different ASP MPEG-4 decoder HW/SW partitions. System-level trade offs and quantitative data derived from this analysis are also presented in this work.
Aerodynamic Design of Complex Configurations Using Cartesian Methods and CAD Geometry
NASA Technical Reports Server (NTRS)
Nemec, Marian; Aftosmis, Michael J.; Pulliam, Thomas H.
2003-01-01
The objective of this paper is to present the development of an optimization capability for the Cartesian inviscid-flow analysis package of Aftosmis et al. We evaluate and characterize the following modules within the new optimization framework: (1) a component-based geometry parameterization approach using a CAD solid representation and the CAPRI interface; (2) the use of Cartesian methods in the development of automated optimization tools; and (3) optimization techniques using a genetic algorithm and a gradient-based algorithm. The discussion and investigations focus on several real-world problems of the optimization process. We examine the architectural issues associated with the deployment of a CAD-based design approach in a heterogeneous parallel computing environment that contains both CAD workstations and dedicated compute nodes. In addition, we study the influence of noise on the performance of optimization techniques, and the overall efficiency of the optimization process for aerodynamic design of complex three-dimensional configurations.
An Automated Approach to Instructional Design Guidance.
ERIC Educational Resources Information Center
Spector, J. Michael; And Others
This paper describes the Guided Approach to Instructional Design Advising (GAIDA), an automated instructional design tool that incorporates techniques of artificial intelligence. GAIDA was developed by the U.S. Air Force Armstrong Laboratory to facilitate the planning and production of interactive courseware and computer-based training materials.…
The Evolvement of Automobile Steering System Based on TRIZ
NASA Astrophysics Data System (ADS)
Zhao, Xinjun; Zhang, Shuang
Like organisms, products and techniques pass through a process of birth, growth, maturity, and death before quitting the stage. The development of products and techniques conforms to certain evolution rules. If people know and apply these rules, they can design new kinds of products and forecast product development trends. Enterprises can thereby grasp the future technical directions of their products and drive product and technique innovation. Below, based on TRIZ theory, the mechanism evolution, function evolution, and appearance evolution of the automobile steering system are analyzed, and some new ideas about the future automobile steering system are put forward.
Modelling Technique for Demonstrating Gravity Collapse Structures in Jointed Rock.
ERIC Educational Resources Information Center
Stimpson, B.
1979-01-01
Described is a base-friction modeling technique for studying the development of collapse structures in jointed rocks. A moving belt beneath weak material is designed to simulate gravity. A description is given of the model frame construction. (Author/SA)
NASA Technical Reports Server (NTRS)
Rais-Rohani, Masoud
2003-01-01
This report discusses the development and application of two alternative strategies in the form of global and sequential local response surface (RS) techniques for the solution of reliability-based optimization (RBO) problems. The problem of a thin-walled composite circular cylinder under axial buckling instability is used as a demonstrative example. In this case, the global technique uses a single second-order RS model to estimate the axial buckling load over the entire feasible design space (FDS) whereas the local technique uses multiple first-order RS models with each applied to a small subregion of FDS. Alternative methods for the calculation of unknown coefficients in each RS model are explored prior to the solution of the optimization problem. The example RBO problem is formulated as a function of 23 uncorrelated random variables that include material properties, thickness and orientation angle of each ply, cylinder diameter and length, as well as the applied load. The mean values of the 8 ply thicknesses are treated as independent design variables. While the coefficients of variation of all random variables are held fixed, the standard deviations of ply thicknesses can vary during the optimization process as a result of changes in the design variables. The structural reliability analysis is based on the first-order reliability method with reliability index treated as the design constraint. In addition to the probabilistic sensitivity analysis of reliability index, the results of the RBO problem are presented for different combinations of cylinder length and diameter and laminate ply patterns. The two strategies are found to produce similar results in terms of accuracy with the sequential local RS technique having a considerably better computational efficiency.
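The reliability index at the heart of the formulation can be estimated, in its simplest mean-value first-order form, from the limit-state value and sensitivities at the mean design point. A sketch for independent random variables (a simplification of the full FORM iteration used in the report; the buckling limit state here is a placeholder):

```python
import math

def reliability_index(g, means, stds, h=1e-6):
    """Mean-value first-order estimate of the reliability index.

    g: limit-state function of a list of variables, g(x) > 0 is safe.
    beta = g(mu) / sqrt(sum((dg/dx_i * sigma_i)^2)) for independent
    random variables; gradients by central finite differences."""
    g0 = g(means)
    var = 0.0
    for i, s in enumerate(stds):
        hi = list(means)
        lo = list(means)
        hi[i] += h
        lo[i] -= h
        dg = (g(hi) - g(lo)) / (2.0 * h)
        var += (dg * s) ** 2
    return g0 / math.sqrt(var)
```

In an RBO loop, beta computed this way (or by the full FORM iteration) is the quantity constrained while the optimizer moves the mean ply thicknesses.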
MEMS-based platforms for mechanical manipulation and characterization of cells
NASA Astrophysics Data System (ADS)
Pan, Peng; Wang, Wenhui; Ru, Changhai; Sun, Yu; Liu, Xinyu
2017-12-01
Mechanical manipulation and characterization of single cells are important experimental techniques in biological and medical research. Because of the microscale sizes and highly fragile structures of cells, conventional cell manipulation and characterization techniques are not sufficiently accurate and/or efficient, or even cannot meet the increasingly demanding needs of different types of cell-based studies. To this end, novel microelectromechanical systems (MEMS)-based technologies have been developed to improve the accuracy, efficiency, and consistency of various cell manipulation and characterization tasks, and to enable new types of cell research. This article summarizes existing MEMS-based platforms developed for cell mechanical manipulation and characterization, highlights the specific design considerations that make them suitable for their designated tasks, and discusses their advantages and limitations. In closing, an outlook into future trends is also provided.
Compiler-assisted multiple instruction rollback recovery using a read buffer
NASA Technical Reports Server (NTRS)
Alewine, N. J.; Chen, S.-K.; Fuchs, W. K.; Hwu, W.-M.
1993-01-01
Multiple instruction rollback (MIR) is a technique that has been implemented in mainframe computers to provide rapid recovery from transient processor failures. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow transformations. This paper focuses on compiler-assisted techniques to achieve multiple instruction rollback recovery. We observe that some data hazards resulting from instruction rollback can be resolved efficiently by providing an operand read buffer while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations indicate improved efficiency over previous hardware-based and compiler-based schemes.
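The operand read buffer idea can be sketched as a bounded undo log on the register file: each committed write saves the value it overwrites, and rollback replays the log backwards. A simplified behavioral model (register names and buffer depth are illustrative, not the hardware design from the paper):

```python
from collections import deque

class ReadBuffer:
    """Sketch of an operand read buffer supporting N-instruction rollback.

    Each committed write logs the old value of the register it
    overwrites; rollback(k) undoes the last k writes, resolving in
    hardware the on-path data hazards that would be expensive for
    the compiler to remove with transformations."""

    def __init__(self, regs, depth):
        self.regs = dict(regs)
        self.log = deque(maxlen=depth)  # bounded rollback window

    def write(self, reg, value):
        self.log.append((reg, self.regs[reg]))  # save overwritten operand
        self.regs[reg] = value

    def rollback(self, k):
        for _ in range(k):
            reg, old = self.log.pop()
            self.regs[reg] = old
```

The hybrid scheme in the paper splits responsibility: hazards like these go to the buffer, while the remainder are removed by compiler data-flow transformations.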
A Modal Approach to Compact MIMO Antenna Design
NASA Astrophysics Data System (ADS)
Yang, Binbin
MIMO (Multiple-Input Multiple-Output) technology offers new possibilities for wireless communication through transmission over multiple spatial channels, and enables linear increases in spectral efficiency as the number of transmitting and receiving antennas increases. However, the physical implementation of such systems in compact devices encounters many physical constraints, mainly from the design of multiple antennas. First, an antenna's bandwidth decreases dramatically as its electrical size reduces, a fact known as the antenna Q limit; secondly, multiple antennas closely spaced tend to couple with each other, undermining MIMO performance. Though different MIMO antenna designs have been proposed in the literature, there is still a lack of a systematic design methodology and knowledge of performance limits. In this dissertation, we employ characteristic mode theory (CMT) as a powerful tool for MIMO antenna analysis and design. CMT allows us to examine each physical mode of the antenna aperture, and to access its many physical parameters without even exciting the antenna. For the first time, we propose efficient circuit models for MIMO antennas of arbitrary geometry using this modal decomposition technique. These circuit models demonstrate the powerful physical insight of CMT for MIMO antenna modeling, and simplify the MIMO antenna design problem to just the design of specific antenna structural modes and a modal feed network, making possible the separate design of antenna aperture and feeds. We therefore develop a feed-independent shape synthesis technique for optimization of broadband multi-mode apertures. Combining the shape synthesis and circuit modeling techniques for MIMO antennas, we propose a shape-first, feed-next design methodology for MIMO antennas, and design and fabricate two planar MIMO antennas, each occupying an aperture much smaller than the conventional size of lambda/2 x lambda/2.
Facilitated by the newly developed source formulation for antenna stored energy and recently reported work on antenna Q factor minimization, we extend the minimum Q limit to antennas of arbitrary geometry, and show that, given an antenna aperture, any antenna design based on its substructure will result in minimum Q factors larger than or equal to that of the complete structure. This limit is much tighter than Chu's limit based on spherical modes, and applies to antennas of arbitrary geometry. Finally, considering the almost inevitable presence of mutual coupling effects within compact multiport antennas, we develop new decoupling networks (DN) and decoupling network synthesis techniques. An information-theoretic metric, the information mismatch loss (Gamma_info), is defined for DN characterization. Based on this metric, the optimization of decoupling networks for broadband system performance is conducted, which demonstrates the limitations of single-frequency decoupling techniques and the room for improvement.
NASA Technical Reports Server (NTRS)
Olds, John Robert; Walberg, Gerald D.
1993-01-01
Multidisciplinary design optimization (MDO) is an emerging discipline within aerospace engineering. Its goal is to bring structure and efficiency to the complex design process associated with advanced aerospace launch vehicles. Aerospace vehicles generally require input from a variety of traditional aerospace disciplines - aerodynamics, structures, performance, etc. As such, traditional optimization methods cannot always be applied. Several multidisciplinary techniques and methods were proposed as potentially applicable to this class of design problem. Among the candidate options are calculus-based (or gradient-based) optimization schemes and parametric schemes based on design of experiments theory. A brief overview of several applicable multidisciplinary design optimization methods is included. Methods from the calculus-based class and the parametric class are reviewed, but the research application reported focuses on methods from the parametric class. A vehicle of current interest was chosen as a test application for this research. The rocket-based combined-cycle (RBCC) single-stage-to-orbit (SSTO) launch vehicle combines elements of rocket and airbreathing propulsion in an attempt to produce an attractive option for launching medium-sized payloads into low earth orbit. The RBCC SSTO presents a particularly difficult problem for traditional one-variable-at-a-time optimization methods because of the lack of an adequate experience base and the highly coupled nature of the design variables. MDO, however, with its structured approach to design, is well suited to this problem. The results of the application of Taguchi methods, central composite designs, and response surface methods to the design optimization of the RBCC SSTO are presented. Attention is given to the aspect of Taguchi methods that attempts to locate a 'robust' design - that is, a design that is least sensitive to uncontrollable influences on the design.
Near-optimum minimum dry weight solutions are determined for the vehicle. A summary and evaluation of the various parametric MDO methods employed in the research are included. Recommendations for additional research are provided.
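The central composite designs mentioned above are generated mechanically from factorial corners, axial points, and center replicates. A sketch in coded units (alpha chosen for rotatability; the factor count is generic, not tied to the RBCC study's specific design variables):

```python
from itertools import product

def central_composite(k, n_center=1):
    """Coded-unit central composite design for k factors:
    2^k factorial corners at +/-1, 2k axial points at +/-alpha,
    and n_center replicated center points. alpha = (2^k)**0.25
    makes the design rotatable."""
    alpha = (2 ** k) ** 0.25
    corners = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for sign in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = sign
            axial.append(pt)
    centers = [[0.0] * k for _ in range(n_center)]
    return corners + axial + centers
```

Each design point corresponds to one vehicle sizing run; the axial points are what let a second-order response surface capture curvature in the dry-weight response.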
Enhanced Multiobjective Optimization Technique for Comprehensive Aerospace Design. Part A
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John N.
1997-01-01
A multidisciplinary design optimization procedure which couples formal multiobjective-based techniques with complex analysis procedures (such as computational fluid dynamics (CFD) codes) has been developed. The procedure has been demonstrated on a specific high speed flow application involving aerodynamics and acoustics (sonic boom minimization). In order to account for multiple design objectives arising from complex performance requirements, multiobjective formulation techniques are used to formulate the optimization problem. Techniques to enhance the existing Kreisselmeier-Steinhauser (K-S) function multiobjective formulation approach have been developed. The K-S function procedure used in the proposed work transforms a constrained multiple-objective-function problem into an unconstrained problem, which is then solved using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. Weight factors are introduced for each objective function during the transformation process. This enhanced procedure provides the designer with the capability to emphasize specific design objectives during the optimization process. The demonstration of the procedure utilizes a computational fluid dynamics (CFD) code which solves the three-dimensional parabolized Navier-Stokes (PNS) equations for the flow field, along with an appropriate sonic boom evaluation procedure, thus introducing both aerodynamic performance and sonic boom as the design objectives to be optimized simultaneously. Sensitivity analysis is performed using a discrete differentiation approach. An approximation technique is used within the optimizer to improve the overall computational efficiency of the procedure in order to make it suitable for design applications in an industrial setting.
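The K-S function referred to above has a compact closed form: a smooth, differentiable envelope over several objective or constraint values. A sketch using the standard max-shifted form for numerical stability (the per-objective weight factors mirror the enhancement described; rho is the usual draw-down parameter):

```python
import math

def ks_aggregate(values, rho=50.0, weights=None):
    """Kreisselmeier-Steinhauser envelope of several objective or
    constraint values: a smooth upper bound on the (weighted)
    maximum that tightens as rho grows, turning a constrained
    multiobjective problem into a single unconstrained scalar."""
    if weights is None:
        weights = [1.0] * len(values)
    scaled = [w * v for w, v in zip(weights, values)]
    m = max(scaled)  # subtract the max inside exp to avoid overflow
    return m + math.log(sum(math.exp(rho * (s - m)) for s in scaled)) / rho
```

Minimizing this scalar with BFGS, with weights chosen to emphasize, say, sonic boom over drag, is the structure of the procedure the abstract describes.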
Model and controller reduction of large-scale structures based on projection methods
NASA Astrophysics Data System (ADS)
Gildin, Eduardo
The design of low-order controllers for high-order plants is a challenging problem, theoretically as well as from a computational point of view. Frequently, robust controller design techniques result in high-order controllers. It is then interesting to achieve reduced-order models and controllers while maintaining robustness properties. Controllers designed for large structures, based on models obtained by finite element techniques, yield large state-space dimensions. In this case, problems related to storage, accuracy, and computational speed may arise. Thus, model reduction methods capable of addressing controller reduction problems are of primary importance for the practical applicability of advanced controller design methods to high-order systems. A challenging large-scale control problem that has emerged recently is the protection of civil structures, such as high-rise buildings and long-span bridges, from dynamic loadings such as earthquakes, high wind, heavy traffic, and deliberate attacks. Even though significant effort has been spent on the application of control theory to the design of civil structures in order to increase their safety and reliability, several challenging issues remain open problems for real-time implementation. This dissertation addresses the development of methodologies for controller reduction for real-time implementation in seismic protection of civil structures using projection methods. Three classes of schemes are analyzed for model and controller reduction: modal truncation, singular value decomposition methods, and Krylov-based methods. A family of benchmark problems for structural control is used as a framework for a comparative study of model and controller reduction techniques.
It is shown that classical model and controller reduction techniques, such as balanced truncation, modal truncation, and moment matching by Krylov techniques, yield reduced-order controllers that do not guarantee stability of the closed-loop system, that is, the reduced-order controller implemented with the full-order plant. A controller reduction approach is proposed that guarantees closed-loop stability. It is based on the concept of dissipativity (or positivity) of linear dynamical systems. Utilizing passivity-preserving model reduction together with dissipative LQG controllers, effective low-order optimal controllers are obtained. Results are shown through simulations.
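For a stable system already in modal (diagonal) form, the Hankel singular values that balanced truncation ranks states by have a closed form, which makes the truncation decision easy to sketch. The four-mode system below is entirely hypothetical and only illustrates the ranking idea, not the dissertation's passivity-preserving method.

```python
def hankel_singular_values(a, b, c):
    """For a stable modal realization x_i' = a_i*x_i + b_i*u,
    y = sum(c_i*x_i), both Gramians are diagonal and each Hankel
    singular value reduces to |b_i*c_i| / (2*|a_i|)."""
    return sorted((abs(bi * ci) / (2.0 * abs(ai))
                   for ai, bi, ci in zip(a, b, c)), reverse=True)

# Hypothetical 4-mode structure: fast, weakly coupled modes contribute
# little to the input-output map and can be truncated.
a = [-1.0, -5.0, -50.0, -200.0]
b = [1.0, 1.0, 0.5, 0.1]
c = [1.0, 0.8, 0.2, 0.05]
hsv = hankel_singular_values(a, b, c)
keep = [s for s in hsv if s > 1e-3 * hsv[0]]  # retain dominant modes only
```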
CMOS output buffer wave shaper
NASA Technical Reports Server (NTRS)
Albertson, L.; Whitaker, S.; Merrell, R.
1990-01-01
As the switching speeds and densities of digital CMOS integrated circuits continue to increase, output switching noise becomes more of a problem. A design technique which aids in the reduction of switching noise is reported. The output driver stage is analyzed through the use of an equivalent RLC circuit. The results of the analysis are used in the design of an output driver stage. A test circuit based on these techniques is being submitted to MOSIS for fabrication.
Graphic design of pinhole cameras
NASA Technical Reports Server (NTRS)
Edwards, H. B.; Chu, W. P.
1979-01-01
The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.
NASA Astrophysics Data System (ADS)
Majidzadeh, K.; Ilves, G. J.
1981-08-01
A ready reference to design procedures for asphaltic concrete overlay of flexible pavements based on elastic layer theory is provided. The design procedures and the analytical techniques presented were formulated to predict the structural fatigue response of asphaltic concrete overlays for various design conditions, including geometrical and material properties, loading conditions and environmental variables.
On the decomposition of synchronous state machines using sequence invariant state machines
NASA Technical Reports Server (NTRS)
Hebbalalu, K.; Whitaker, S.; Cameron, K.
1992-01-01
This paper presents a few techniques for the decomposition of synchronous state machines of medium to large sizes into smaller component machines. The methods are based on the nature of the transitions and sequences of states in the machine and on the number and variety of inputs to the machine. The results of the decomposition, and of using the Sequence Invariant State Machine (SISM) design technique for generating the component machines, include greatly eased and faster design and implementation. Furthermore, there is increased flexibility in making modifications to the original design, leading to negligible redesign time.
A Modified Particle Swarm Optimization Technique for Finding Optimal Designs for Mixture Models
Wong, Weng Kee; Chen, Ray-Bing; Huang, Chien-Chih; Wang, Weichung
2015-01-01
Particle Swarm Optimization (PSO) is a meta-heuristic algorithm that has been shown to be successful in solving a wide variety of real and complicated optimization problems in engineering and computer science. This paper introduces a projection-based PSO technique, named ProjPSO, to efficiently find different types of optimal designs, or nearly optimal designs, for mixture models with and without constraints on the components, and also for related models, like the log contrast models. We also compare the modified PSO's performance with Fedorov's algorithm, a popular algorithm used to generate optimal designs, the cocktail algorithm, and the recent algorithm proposed by [1]. PMID:26091237
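The basic PSO update that such variants build on can be sketched compactly. This is the textbook swarm update on a toy sphere function, not the ProjPSO projection step described in the paper; all parameter values are conventional defaults.

```python
import random

def pso(f, dim, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm: each particle is pulled toward its own
    best position and the swarm's global best position."""
    rng = random.Random(seed)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pb = [x[:] for x in xs]
    pbf = [f(x) for x in xs]
    g = min(range(n), key=lambda i: pbf[i])
    gb, gbf = pb[g][:], pbf[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pb[i][d] - xs[i][d])
                            + c2 * r2 * (gb[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            fx = f(xs[i])
            if fx < pbf[i]:              # update personal best
                pb[i], pbf[i] = xs[i][:], fx
                if fx < gbf:             # update global best
                    gb, gbf = xs[i][:], fx
    return gb, gbf

best, best_f = pso(lambda x: sum(xi * xi for xi in x), dim=3)
```

A projection-based variant would additionally project each candidate back onto the mixture simplex after every position update.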
Errorless-based techniques can improve route finding in early Alzheimer's disease: a case study.
Provencher, Véronique; Bier, Nathalie; Audet, Thérèse; Gagnon, Lise
2008-01-01
Topographical disorientation is a common and early manifestation of dementia of Alzheimer type, which threatens independence in activities of daily living. Errorless-based techniques appear to be effective in helping patients with amnesia to learn routes, but little is known about their effectiveness in early dementia of Alzheimer type. A 77-year-old woman with dementia of Alzheimer type had difficulty finding her way around her seniors residence, which reduced her social activities. This study used an ABA design (A is the baseline and B is the intervention) with multiple baselines across routes for going to the rosary (target), laundry, and game rooms (controls). The errorless-based technique intervention was applied to 2 of the 3 routes. Analyses showed significant improvement only for the routes learned with errorless-based techniques. Following the study, the participant increased her topographical knowledge of her surroundings. Route learning interventions based on errorless-based techniques appear to be a promising approach for improving independence in early dementia of Alzheimer type.
A Survey of Techniques for Modeling and Improving Reliability of Computing Systems
Mittal, Sparsh; Vetter, Jeffrey S.
2015-04-24
Recent trends of aggressive technology scaling have greatly exacerbated the occurrence and impact of faults in computing systems. This has made reliability a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving the resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, caches, and main memory. In addition, we discuss techniques proposed for non-volatile memory, GPUs, and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify the vulnerability of processor structures. Finally, we believe that this survey will help researchers, system architects, and processor designers gain insights into techniques for improving the reliability of computing systems.
A hybrid nonlinear programming method for design optimization
NASA Technical Reports Server (NTRS)
Rajan, S. D.
1986-01-01
Solutions to engineering design problems formulated as nonlinear programming (NLP) problems usually require the use of more than one optimization technique. Moreover, the interaction between the user (analysis/synthesis) program and the NLP system can lead to interface, scaling, or convergence problems. An NLP solution system is presented that seeks to solve these problems by providing a programming system to ease the user-system interface. A simple set of rules is used to select an optimization technique or to switch from one technique to another in an attempt to detect, diagnose, and solve some potential problems. Numerical examples involving finite element based optimal design of space trusses and rotor bearing systems are used to illustrate the applicability of the proposed methodology.
Iterative optimization method for design of quantitative magnetization transfer imaging experiments.
Levesque, Ives R; Sled, John G; Pike, G Bruce
2011-09-01
Quantitative magnetization transfer imaging (QMTI) using spoiled gradient echo sequences with pulsed off-resonance saturation can be a time-consuming technique. A method is presented for selection of an optimum experimental design for quantitative magnetization transfer imaging based on the iterative reduction of a discrete sampling of the Z-spectrum. The applicability of the technique is demonstrated for human brain white matter imaging at 1.5 T and 3 T, and optimal designs are produced to target specific model parameters. The optimal number of measurements and the signal-to-noise ratio required for stable parameter estimation are also investigated. In vivo imaging results demonstrate that this optimal design approach substantially improves parameter map quality. The iterative method presented here provides an advantage over free form optimal design methods, in that pragmatic design constraints are readily incorporated. In particular, the presented method avoids clustering and repeated measures in the final experimental design, an attractive feature for the purpose of magnetization transfer model validation. The iterative optimal design technique is general and can be applied to any method of quantitative magnetization transfer imaging. Copyright © 2011 Wiley-Liss, Inc.
Prado, Igor Afonso Acampora; Pereira, Mateus de Freitas Virgílio; de Castro, Davi Ferreira; Dos Santos, Davi Antônio; Balthazar, Jose Manoel
2018-06-01
The present paper is concerned with the design and experimental evaluation of optimal control laws for the nonlinear attitude dynamics of a multirotor aerial vehicle. Three design methods based on the Hamilton-Jacobi-Bellman equation are taken into account. The first is a linear control with a guarantee of stability for nonlinear systems. The second and third are nonlinear suboptimal control techniques. These techniques are based on an optimal control design approach that takes into account the nonlinearities present in the vehicle dynamics. The stability proof of the closed-loop system is presented. The performance of the control system designed is evaluated via simulations and also via an experimental scheme using the Quanser 3-DOF Hover. The experiments show the effectiveness of the linear control method over the nonlinear strategy. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Norinder, Ulf
1990-12-01
An experimental design based 3-D QSAR analysis using a combination of principal component and PLS analysis is presented and applied to human corticosteroid-binding globulin complexes. The predictive capability of the created model is good. The technique can also be used as guidance when selecting new compounds to be investigated.
Tablet-Based Math Assessment: What Can We Learn from Math Apps?
ERIC Educational Resources Information Center
Cayton-Hodges, Gabrielle A.; Feng, Gary; Pan, Xingyu
2015-01-01
In this report, we describe a survey of mathematics education apps in the Apple App Store, conducted as part of a research project to develop a tablet-based assessment prototype for elementary mathematics. This survey was performed with the goal of understanding the design principles and techniques used in mathematics apps designed for tablets. We…
A Project-Based Laboratory for Learning Embedded System Design with Industry Support
ERIC Educational Resources Information Center
Lee, Chyi-Shyong; Su, Juing-Huei; Lin, Kuo-En; Chang, Jia-Hao; Lin, Gu-Hong
2010-01-01
A project-based laboratory for learning embedded system design with support from industry is presented in this paper. The aim of this laboratory is to motivate students to learn the building blocks of embedded systems and practical control algorithms by constructing a line-following robot using the quadratic interpolation technique to predict the…
Database Design Learning: A Project-Based Approach Organized through a Course Management System
ERIC Educational Resources Information Center
Dominguez, Cesar; Jaime, Arturo
2010-01-01
This paper describes an active method for database design learning through practical tasks development by student teams in a face-to-face course. This method integrates project-based learning, and project management techniques and tools. Some scaffolding is provided at the beginning that forms a skeleton that adapts to a great variety of…
Recent developments of axial flow compressors under transonic flow conditions
NASA Astrophysics Data System (ADS)
Srinivas, G.; Raghunandana, K.; Satish Shenoy, B.
2017-05-01
The objective of this paper is to give a holistic view of the most advanced technologies and procedures practiced in the field of turbomachinery design. In CFD, the compressor flow solver relies on a turbulence model to solve viscous problems. Popular techniques like Jameson's rotated difference scheme were used to solve the potential flow equation in the transonic regime, first for two-dimensional aerofoils and later for three-dimensional wings. Gradient-based methods are also popular, especially for compressor blade shape optimization. Other available optimization techniques include evolutionary algorithms (EAs) and response surface methodology (RSM). It is observed that, to improve a compressor flow solver and obtain agreeable results, careful attention must be paid to viscous relations, grid resolution, turbulence modeling, and artificial viscosity in CFD. Advanced techniques like Jameson's rotated difference scheme have had a substantial impact on wing and aerofoil design. For compressor blade shape optimization, evolutionary algorithms are simpler than gradient-based techniques because they can handle the design parameters simultaneously by searching from multiple points in the design space. Response surface methodology builds empirical models of observed responses and studies experimental data systematically; it analyzes the relationship between expected responses (outputs) and design variables (inputs) through a series of mathematical and statistical processes. RSM has recently been implemented successfully for turbomachinery blade optimization. Well-designed, high-performance axial flow compressors find application in air-breathing jet engines.
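The response-surface idea described above can be sketched in its simplest one-variable form: fit a quadratic surrogate through sampled responses and read off its stationary point. The design points and responses below are hypothetical.

```python
def det3(m):
    """Determinant of a 3x3 matrix (expansion along the first row)."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def quadratic_rsm(xs, ys):
    """Fit y = b0 + b1*x + b2*x^2 through three design points via
    Cramer's rule: the simplest one-variable response surface."""
    A = [[1.0, x, x * x] for x in xs]
    d = det3(A)
    coeffs = []
    for j in range(3):
        M = [row[:] for row in A]
        for i in range(3):
            M[i][j] = ys[i]   # replace column j with the responses
        coeffs.append(det3(M) / d)
    return coeffs

# Hypothetical responses sampled at three design points
b0, b1, b2 = quadratic_rsm([0.0, 1.0, 2.0], [3.0, 2.0, 3.0])
x_star = -b1 / (2.0 * b2)  # stationary point of the fitted surface
```

In practice RSM uses more design points than coefficients and a least-squares fit; the exact three-point fit here is the degenerate case that keeps the sketch short.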
INDES User's guide multistep input design with nonlinear rotorcraft modeling
NASA Technical Reports Server (NTRS)
1979-01-01
The INDES computer program, a multistep input design program used as part of a data processing technique for rotorcraft system identification, is described. Flight test inputs based on INDES improve the accuracy of parameter estimates. The input design algorithm, program input, and program output are presented.
Optimization of brushless direct current motor design using an intelligent technique.
Shabanian, Alireza; Tousiwas, Armin Amini Poustchi; Pourmandi, Massoud; Khormali, Aminollah; Ataei, Abdolhay
2015-07-01
This paper presents a method for the optimal design of a slotless permanent magnet brushless DC (BLDC) motor with surface-mounted magnets using an improved bee algorithm (IBA). The characteristics of the motor are expressed as functions of motor geometries. The objective function is a combination of losses, volume, and cost to be minimized simultaneously. This method is based on the capability of swarm-based algorithms to find the optimal solution. One sample case is used to illustrate the performance of the design approach and optimization technique. The IBA has better performance and speed of convergence compared with the bee algorithm (BA). Simulation results show that the proposed method performs very efficiently. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Abdullahi, Auwalu M.; Mohamed, Z.; Selamat, H.; Pota, Hemanshu R.; Zainal Abidin, M. S.; Ismail, F. S.; Haruna, A.
2018-01-01
Payload hoisting and wind disturbance during crane operations are among the challenging factors that affect payload sway and thus the crane's performance. This paper proposes a new online adaptive output-based command shaping (AOCS) technique for effective payload sway reduction of an overhead crane under the influence of those effects. This technique enhances the previously developed output-based command shaping (OCS), which was effective only for a fixed system without external disturbances. Unlike the conventional input shaping design technique, which requires the system's natural frequency and damping ratio, the proposed technique is designed using the output signal, so an online adaptive algorithm can be formulated. To test the effectiveness of the AOCS, experiments are carried out using a laboratory overhead crane with payload hoisting in the presence of wind, and with different payloads. The superiority of the method is confirmed by 82% and 29% reductions in the overall sway and the maximum transient sway, respectively, when compared to the OCS and two robust input shapers, namely the Zero Vibration Derivative-Derivative and Extra-Insensitive shapers. Furthermore, the method demonstrates uniform crane performance under all conditions. It is envisaged that the proposed method can be very useful in designing an effective controller for a crane system with an unknown payload and under the influence of external disturbances.
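The conventional shaping that the AOCS is compared against can be sketched: a Zero-Vibration (ZV) shaper built from an assumed natural frequency and damping ratio, which are exactly the quantities the output-based technique avoids needing. The numeric values are illustrative only.

```python
import math

def zv_shaper(wn, zeta):
    """Zero-Vibration input shaper: two impulses whose amplitudes and
    timing cancel the residual vibration of a second-order mode with
    natural frequency wn (rad/s) and damping ratio zeta."""
    wd = wn * math.sqrt(1.0 - zeta * zeta)   # damped natural frequency
    K = math.exp(-zeta * math.pi / math.sqrt(1.0 - zeta * zeta))
    amps = [1.0 / (1.0 + K), K / (1.0 + K)]  # amplitudes sum to 1
    times = [0.0, math.pi / wd]              # second impulse at half period
    return amps, times

# Hypothetical lightly damped sway mode
amps, times = zv_shaper(wn=2.0, zeta=0.05)
```

Convolving the operator's command with these two impulses yields the shaped command; more robust shapers (ZVDD, EI) add impulses at the cost of a longer shaper duration.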
The ERL-based Design of Electron-Hadron Collider eRHIC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ptitsyn, Vadim
2016-06-01
Recent developments of the ERL-based design of the future high-luminosity electron-hadron collider eRHIC have focused on balancing the technological risks present in the design against its cost. As a result, a lower-risk design has been adopted at a moderate cost increase. The modifications include a change of the main linac RF frequency, a reduced number of SRF cavity types, and modified electron spin transport using a spin rotator. A luminosity-staged approach is being explored, with a Nominal design ($L \sim 10^{33}\,{\rm cm}^{-2}{\rm s}^{-1}$) that employs reduced electron current and could possibly be based on classical electron cooling, and an Ultimate design ($L > 10^{34}\,{\rm cm}^{-2}{\rm s}^{-1}$) that uses higher electron current and an innovative cooling technique (CeC). The paper describes the recent design modifications and presents the full status of the eRHIC ERL-based design.
An assessment of finite-element modeling techniques for thick-solid/thin-shell joints analysis
NASA Technical Reports Server (NTRS)
Min, J. B.; Androlake, S. G.
1993-01-01
The subject of finite-element modeling has long been of critical importance to the practicing designer/analyst, who is often faced with obtaining an accurate and cost-effective structural analysis of a particular design. Typically, these two goals are in conflict. The purpose is to discuss the topic of finite-element modeling for solid/shell connections (joints), which are significant for the practicing modeler. Several approaches are currently in use, but various assumptions frequently restrict their applicability. Such techniques currently used in practical applications were tested, especially to determine which technique is best suited for the computer-aided design (CAD) environment. Some basic considerations regarding each technique are also discussed. Based on the results, suggestions are given for obtaining reliable results in geometrically complex joints where the deformation and stress behavior are complicated.
Design Considerations of a Compounded Sterile Preparations Course
Petraglia, Christine; Mattison, Melissa J.
2016-01-01
Objective. To design a comprehensive learning and assessment environment for the practical application of compounded sterile preparations using a constructivist approach. Design. Compounded Sterile Preparations Laboratory is a required 1-credit course that builds upon the themes of training aseptic technique typically used in health system settings and threads application of concepts from other courses in the curriculum. Students used critical-thinking skills to devise appropriate strategies to compound sterile preparations. Assessment. Aseptic technique skills were assessed with objective, structured, checklist-based rubrics. Most students successfully completed practical assessments using appropriate technique (mean assessment grade=83.2%). Almost all students passed the practical media fill (98%) and gloved fingertip sampling (86%) tests on the first attempt; all passed on the second attempt. Conclusion. Employing a constructivist scaffold approach to teaching proper hygiene and aseptic technique prepared students to pass media fill and gloved fingertip tests and to perform well on practical compounding assessments. PMID:26941438
Computer Aided Drug Design: Success and Limitations.
Baig, Mohammad Hassan; Ahmad, Khurshid; Roy, Sudeep; Ashraf, Jalaluddin Mohammad; Adil, Mohd; Siddiqui, Mohammad Haris; Khan, Saif; Kamal, Mohammad Amjad; Provazník, Ivo; Choi, Inho
2016-01-01
Over the last few decades, computer-aided drug design has emerged as a powerful technique playing a crucial role in the development of new drug molecules. Structure-based drug design and ligand-based drug design are two methods commonly used in computer-aided drug design. In this article, we discuss the theory behind both methods, as well as their successful applications and limitations. To accomplish this, we reviewed structure-based and ligand-based virtual screening processes. Molecular dynamics simulation, which has become one of the most influential tools for predicting the conformation of small molecules and changes in their conformation within the biological target, has also been taken into account. Finally, we discuss the principles and concepts of molecular docking, pharmacophores, and other methods used in computer-aided drug design.
Ramrakhyani, A K; Mirabbasi, S; Mu Chiao
2011-02-01
Resonance-based wireless power delivery is an efficient technique to transfer power over a relatively long distance. This technique typically uses four coils as opposed to two coils used in conventional inductive links. In the four-coil system, the adverse effects of a low coupling coefficient between primary and secondary coils are compensated by using high-quality (Q) factor coils, and the efficiency of the system is improved. Unlike its two-coil counterpart, the efficiency profile of the power transfer is not a monotonically decreasing function of the operating distance and is less sensitive to changes in the distance between the primary and secondary coils. A four-coil energy transfer system can be optimized to provide maximum efficiency at a given operating distance. We have analyzed the four-coil energy transfer systems and outlined the effect of design parameters on power-transfer efficiency. Design steps to obtain the efficient power-transfer system are presented and a design example is provided. A proof-of-concept prototype system is implemented and confirms the validity of the proposed analysis and design techniques. In the prototype system, for a power-link frequency of 700 kHz and a coil distance range of 10 to 20 mm, using a 22-mm diameter implantable coil resonance-based system shows a power-transfer efficiency of more than 80% with an enhanced operating range compared to ~40% efficiency achieved by a conventional two-coil system.
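A rough sketch of why high-Q coils compensate for weak coupling, using the simplified two-coil maximum-efficiency expression in terms of the figure of merit k²Q₁Q₂; the paper's four-coil scheme generalizes this idea by raising the effective quality factors. All numeric values are illustrative, not taken from the prototype.

```python
import math

def link_efficiency(k, q1, q2):
    """Textbook maximum power-transfer efficiency of an inductive link
    as a function of the figure of merit k^2 * Q1 * Q2."""
    fom = k * k * q1 * q2
    return fom / (1.0 + math.sqrt(1.0 + fom)) ** 2

# Weak coupling (k = 0.05): high-Q coils recover most of the efficiency
eff_high_q = link_efficiency(0.05, 100.0, 100.0)
eff_low_q = link_efficiency(0.05, 10.0, 10.0)
```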
Application of real-time digitization techniques in beam measurement for accelerators
NASA Astrophysics Data System (ADS)
Zhao, Lei; Zhan, Lin-Song; Gao, Xing-Shun; Liu, Shu-Bin; An, Qi
2016-04-01
Beam measurement is very important for accelerators. In this paper, modern digital beam measurement techniques based on IQ (in-phase & quadrature-phase) analysis are discussed. Based on this method and high-speed, high-resolution analog-to-digital conversion, we have completed three beam measurement electronics systems designed for the China Spallation Neutron Source (CSNS), Shanghai Synchrotron Radiation Facility (SSRF), and Accelerator Driven Sub-critical system (ADS). Core techniques of hardware design and real-time system calibration are discussed, and performance test results of these three instruments are also presented. Supported by National Natural Science Foundation of China (11205153, 10875119), Knowledge Innovation Program of the Chinese Academy of Sciences (KJCX2-YW-N27), the Fundamental Research Funds for the Central Universities (WK2030040029), and the CAS Center for Excellence in Particle Physics (CCEPP).
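The IQ analysis mentioned above can be sketched in software: mix the sampled signal with quadrature references and average over an integer number of cycles to recover amplitude and phase. The tone parameters below are synthetic, not from the instruments described.

```python
import math

def iq_demodulate(samples, fs, f0):
    """Estimate the amplitude and phase of a tone at f0 by mixing the
    samples with quadrature references and averaging (digital IQ
    analysis over an integer number of cycles)."""
    n = len(samples)
    i_acc = sum(s * math.cos(2 * math.pi * f0 * k / fs)
                for k, s in enumerate(samples))
    q_acc = sum(s * math.sin(2 * math.pi * f0 * k / fs)
                for k, s in enumerate(samples))
    i_val, q_val = 2 * i_acc / n, 2 * q_acc / n
    return math.hypot(i_val, q_val), math.atan2(-q_val, i_val)

# Synthetic 50 Hz tone sampled at 1 kHz (amplitude 3, phase 0.5 rad);
# 1000 samples span exactly 50 cycles, so the cross terms average out.
fs, f0, n = 1000.0, 50.0, 1000
sig = [3.0 * math.cos(2 * math.pi * f0 * k / fs + 0.5) for k in range(n)]
amp, phase = iq_demodulate(sig, fs, f0)
```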
Conditional Monte Carlo randomization tests for regression models.
Parhat, Parwen; Rosenberger, William F; Diao, Guoqing
2014-08-15
We discuss the computation of randomization tests for clinical trials of two treatments when the primary outcome is based on a regression model. We begin by revisiting the seminal paper of Gail, Tan, and Piantadosi (1988), and then describe a method based on Monte Carlo generation of randomization sequences. The tests based on this Monte Carlo procedure are design-based, in that they incorporate the particular randomization procedure used. We discuss permuted block designs, complete randomization, and biased coin designs. We also use a new technique by Plamadeala and Rosenberger (2012) for simple computation of conditional randomization tests. Like Gail, Tan, and Piantadosi, we focus on residuals from generalized linear models and martingale residuals from survival models. Such techniques do not apply to longitudinal data analysis, and we introduce a method for computation of randomization tests based on the predicted rate of change from a generalized linear mixed model when outcomes are longitudinal. We show, by simulation, that these randomization tests preserve the size and power well under model misspecification. Copyright © 2014 John Wiley & Sons, Ltd.
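The Monte Carlo idea can be sketched in its simplest form: re-randomize the group labels many times and compare the observed test statistic to the reference distribution. This toy example uses a difference of means under complete randomization with hypothetical data; the paper's methods apply the same principle to regression residuals and other randomization procedures.

```python
import random

def randomization_test(x, y, n_perm=2000, seed=1):
    """Monte Carlo randomization test: shuffle group labels and count
    how often the re-randomized statistic is at least as extreme as
    the observed one (with the +1 correction for the observed draw)."""
    rng = random.Random(seed)
    pooled = x + y
    obs = abs(sum(x) / len(x) - sum(y) / len(y))
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[:len(x)], pooled[len(x):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= obs:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Hypothetical outcomes for two clearly separated treatment groups
p = randomization_test([5.1, 4.9, 5.3, 5.0], [3.2, 3.0, 3.4, 3.1])
```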
Live fire testing requirements - Assessing the impact
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Bryon, J.F.
1992-08-01
Full-up live-fire testing (LFT) of aircraft configured for combat is evaluated in terms of the practical implications of the technique. LFT legislation requires the testing of tactical fighters, helicopters, and other aircraft when they are loaded with the flammables and explosives associated with combat. LFT permits the study of damage mechanisms and battle-damage repair techniques during the design phase, and probability-of-kill estimates and novel systems designs can be developed based on LFT data.
NASA Astrophysics Data System (ADS)
Jiménez-Varona, J.; Ponsin Roca, J.
2015-06-01
Under a contract with AIRBUS MILITARY (AI-M), an exercise to analyze the potential of optimization techniques to improve wing performance at cruise conditions has been carried out using an in-house design code. The original wing was provided by AI-M, and several constraints were posed for the redesign. To maximize the aerodynamic efficiency at cruise, optimizations were performed using the design techniques developed internally at INTA under a research program (Programa de Termofluidodinámica). The code is a gradient-based optimization code, which uses a classical finite-differences approach for gradient computations. Several techniques for search direction computation are implemented for unconstrained and constrained problems. Techniques for geometry modification are based on different approaches, including perturbation functions for the thickness and/or mean-line distributions, and others based on Bézier curve fitting of a certain degree. It is very important to address a real design that involves several constraints, which significantly reduce the feasible design space. An assessment of the code is needed in order to check its capabilities and possible drawbacks; lessons learnt will help in the development of future enhancements. In addition, the results were validated using the well-known TAU flow solver and a far-field drag method in order to determine the improvement accurately in terms of drag counts.
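The classical finite-differences gradient computation mentioned above can be sketched as follows; the test function and step size are illustrative, and in a real design code each evaluation of `f` would be a full flow solution.

```python
def fd_gradient(f, x, h=1e-6):
    """Central finite-difference gradient: perturb each design
    variable by +/- h and difference the objective evaluations.
    Costs 2*len(x) objective evaluations per gradient."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2.0 * h))
    return g

# Gradient of a toy quadratic objective at a hypothetical design point
grad = fd_gradient(lambda v: sum(vi * vi for vi in v), [1.0, 2.0])
```

The central scheme is second-order accurate in h, but the step size trades truncation error against round-off, which is one reason adjoint methods are preferred when many design variables are present.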
NASA Technical Reports Server (NTRS)
Hall, David W.; Rogan, J. Edward
1989-01-01
A microcomputer-based integration of aircraft design disciplines has been applied theoretically to sailplane, microwave-powered aircraft, and High Altitude Long-Endurance (HALE) aircraft configuration definition efforts. Attention is presently given to the further development of such integrated-discipline approaches through the incorporation of AI techniques; these are then applied to the aforementioned HALE case. The windFrame language used, which is based on HyperTalk, allows designers to write programs in a highly graphical, user-interface-oriented environment.
Inherent secure communications using lattice based waveform design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pugh, Matthew Owen
2013-12-01
The wireless communications channel is innately insecure due to the broadcast nature of the electromagnetic medium. Many techniques have been developed and implemented in order to combat insecurities and ensure the privacy of transmitted messages. Traditional methods include encrypting the data via cryptographic methods, hiding the data in the noise floor as in wideband communications, or nulling the signal in the spatial direction of the adversary using array processing techniques. This work analyzes the design of signaling constellations, i.e., modulation formats, to prevent eavesdroppers from correctly decoding transmitted messages. It has been shown that in certain channel models the ability of an adversary to decode the transmitted messages can be degraded by a clever signaling constellation based on lattice theory. This work attempts to optimize certain lattice parameters in order to maximize the security of the data transmission. These techniques are of interest because they are orthogonal to, and can be used in conjunction with, traditional security techniques to create a more secure communication channel.
Low cost MATLAB-based pulse oximeter for deployment in research and development applications.
Shokouhian, M; Morling, R C S; Kale, I
2013-01-01
Problems such as motion artifacts and the effects of ambient light have forced developers to design different signal processing techniques and algorithms to increase the reliability and accuracy of the conventional pulse oximeter device. To evaluate the robustness of these techniques, they are applied either to recorded data or implemented on chip and applied to real-time data. Recorded data is the most common evaluation method; however, it is not as reliable as real-time measurement. On the other hand, hardware implementation can be both expensive and time-consuming. This paper presents a low-cost MATLAB-based pulse oximeter that can be used for rapid evaluation of newly developed signal processing techniques and algorithms. The flexibility to apply different signal processing techniques, the provision of both processed and unprocessed data, and low implementation cost are the important features of this design, which make it ideal for research and development purposes, as well as commercial, hospital, and healthcare applications.
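The core ratio-of-ratios computation that such a pulse oximeter performs can be sketched as follows. The linear calibration constants (110 and 25) are a commonly quoted textbook approximation, not taken from this paper; real devices use device-specific calibration curves, and the AC/DC values below are hypothetical.

```python
def spo2_estimate(ac_red, dc_red, ac_ir, dc_ir):
    """Ratio-of-ratios SpO2 estimate: compare the pulsatile (AC) to
    steady (DC) absorption at the red and infrared wavelengths, then
    map R to saturation with an empirical linear calibration."""
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    return 110.0 - 25.0 * r  # illustrative calibration constants

# Hypothetical normalized photoplethysmogram amplitudes
spo2 = spo2_estimate(ac_red=0.02, dc_red=1.0, ac_ir=0.04, dc_ir=1.0)
```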
Using EIGER for Antenna Design and Analysis
NASA Technical Reports Server (NTRS)
Champagne, Nathan J.; Khayat, Michael; Kennedy, Timothy F.; Fink, Patrick W.
2007-01-01
EIGER (Electromagnetic Interactions GenERalized) is a frequency-domain electromagnetics software package that is built upon a flexible framework designed using object-oriented techniques. The analysis methods used include moment method solutions of integral equations, finite element solutions of partial differential equations, and combinations thereof. The framework design permits new analysis techniques (boundary conditions, Green's functions, etc.) to be added to the software suite with reasonable effort. The code has been designed to execute (in serial or parallel) on a wide variety of platforms, from Intel-based PCs to Unix-based workstations. Recently, new potential integration schemes that avoid singularity extraction techniques have been added for integral equation analysis. These new integration schemes are required for facilitating the use of higher-order elements and basis functions. Higher-order elements are better able to model geometrical curvature using fewer elements than linear elements. Higher-order basis functions are beneficial for simulating structures with rapidly varying fields or currents. Results presented here will demonstrate current and future capabilities of EIGER with respect to analysis of installed antenna system performance in support of NASA's mission of exploration. Examples include antenna coupling within an enclosed environment and antenna analysis on electrically large manned space vehicles.
Singh, Pankaj Kumar; Negi, Arvind; Gupta, Pawan Kumar; Chauhan, Monika; Kumar, Raj
2016-08-01
Toxicity is a common drawback of newly designed chemotherapeutic agents. With the exception of pharmacophore-induced toxicity (lack of selectivity at higher concentrations of a drug), the toxicity due to chemotherapeutic agents is based on the toxicophore moiety present in the drug. To date, methodologies implemented to determine toxicophores may be broadly classified into biological, bioanalytical and computational approaches. The biological approach involves analysis of bioactivated metabolites, whereas the computational approach involves a QSAR-based method, mapping techniques, an inverse docking technique and a few toxicophore identification/estimation tools. Being one of the major steps in drug discovery process, toxicophore identification has proven to be an essential screening step in drug design and development. The paper is first of its kind, attempting to cover and compare different methodologies employed in predicting and determining toxicophores with an emphasis on their scope and limitations. Such information may prove vital in the appropriate selection of methodology and can be used as screening technology by researchers to discover the toxicophoric potentials of their designed and synthesized moieties. Additionally, it can be utilized in the manipulation of molecules containing toxicophores in such a manner that their toxicities might be eliminated or removed.
NASA Technical Reports Server (NTRS)
Harrison, P. Ann
1993-01-01
All the NASA VEGetation Workbench (VEG) goals except the Learning System provide the scientist with several different techniques. When VEG is run, rules assist the scientist in selecting the best of the available techniques to apply to the sample of cover type data being studied. The techniques are stored in the VEG knowledge base. A new interface that enables the scientist to add techniques to VEG without assistance from the developer was designed and implemented. This interface does not require the scientist to have a thorough knowledge of Knowledge Engineering Environment (KEE) by Intellicorp or a detailed knowledge of the structure of VEG. The interface prompts the scientist to enter the required information about the new technique, including the Common Lisp functions for executing the technique and the left-hand side of the rule that causes the technique to be selected. A template for each function and rule, together with detailed instructions about the arguments of the functions, the values they should return, and the format of the rule, is displayed. Checks are made to ensure that the required data were entered, the functions compiled correctly, and the rule parsed correctly before the new technique is stored. The additional techniques are stored separately from the VEG knowledge base and are not normally loaded when the VEG knowledge base is loaded. The interface gives the scientist the option of adding all the previously defined new techniques before running VEG. When the techniques are added, the units required to store them are created automatically in the correct places in the VEG knowledge base, and the methods file containing the functions required by the additional techniques is loaded.
New rule units are created to store the new rules. The interface that allows the scientist to select which techniques to use is updated automatically to include the new techniques. Task H was completed: the interface that allows the scientist to add techniques to VEG was implemented and comprehensively tested. The Common Lisp code for the Add Techniques system is listed in Appendix A.
NASA Technical Reports Server (NTRS)
Lee, K. W.; Putnam, A. A.; Gieseke, J. A.; Golovin, M. N.; Hale, J. A.
1979-01-01
Techniques for generating monodisperse sprays and information concerning chemical liquids used in agricultural aviation are surveyed. The techniques discussed are periodic dispersion of a liquid jet, the spinning disk method, and ultrasonic atomization. Conceptually designed spray nozzles for generating monodisperse sprays are assessed. These are based on classifying the drops using centrifugal force, on using two opposing liquid-laden air jets, and on operating a spinning disk at an overloaded flow. Performance requirements for the designs are described and estimates of the operational characteristics are presented.
A combination of selected mapping and clipping to increase energy efficiency of OFDM systems
Lee, Byung Moo; Rim, You Seung
2017-01-01
We propose an energy efficient combination design for OFDM systems based on the selected mapping (SLM) and clipping peak-to-average power ratio (PAPR) reduction techniques, and show the related energy efficiency (EE) performance analysis. Combining two different PAPR reduction techniques can provide a significant benefit in increasing EE, because it takes advantage of both techniques. For the combination, we choose the clipping and SLM techniques, since the former is quite simple and effective, and the latter does not cause any signal distortion. We provide the structure and the systematic operating method, and present various analyses to derive the EE gain of the combined technique. Our analysis shows that the combined technique increases the EE by 69% compared to no PAPR reduction, and by 19.34% compared to using only the SLM technique. PMID:29023591
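Both building blocks are straightforward to prototype. The sketch below is a hedged illustration with assumed parameters (64 QPSK subcarriers, 8 SLM candidates, a 3 dB clipping threshold), not the authors' exact system; it applies SLM first and then clips the selected candidate, mirroring the combined structure:

```python
import numpy as np

rng = np.random.default_rng(1)

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def ofdm_symbol(n_sub=64):
    """One OFDM symbol: random QPSK subcarriers taken to the time domain."""
    sym = (rng.choice([-1.0, 1.0], n_sub)
           + 1j * rng.choice([-1.0, 1.0], n_sub)) / np.sqrt(2)
    return np.fft.ifft(sym) * np.sqrt(n_sub), sym

def slm(freq_sym, n_cand=8):
    """Selected mapping: random phase rotations, keep the lowest-PAPR candidate
    (distortionless; the receiver needs the chosen phase sequence as side info)."""
    best = None
    for _ in range(n_cand):
        phases = np.exp(2j * np.pi * rng.random(freq_sym.size))
        cand = np.fft.ifft(freq_sym * phases) * np.sqrt(freq_sym.size)
        if best is None or papr_db(cand) < papr_db(best):
            best = cand
    return best

def clip(x, ratio_db=3.0):
    """Amplitude clipping at ratio_db above the RMS level (introduces distortion)."""
    thresh = np.sqrt(np.mean(np.abs(x) ** 2)) * 10 ** (ratio_db / 20)
    mag = np.abs(x)
    return np.where(mag > thresh, thresh * x / np.maximum(mag, 1e-12), x)

time_sig, freq_sym = ofdm_symbol()
combined = clip(slm(freq_sym))   # SLM first, then clip the selected candidate
```

The clipping stage bounds the residual PAPR near its threshold, which is what drives the power-amplifier efficiency gain the paper quantifies.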
Information management system study results. Volume 1: IMS study results
NASA Technical Reports Server (NTRS)
1971-01-01
The information management system (IMS) special emphasis task was performed as an adjunct to the modular space station study, with the objective of providing extended depth of analysis and design in selected key areas of the information management system. Specific objectives included: (1) in-depth studies of IMS requirements and design approaches; (2) design and fabricate breadboard hardware for demonstration and verification of design concepts; (3) provide a technological base to identify potential design problems and influence long range planning; (4) develop hardware and techniques to permit long duration, low cost, manned space operations; (5) support SR&T areas where techniques or equipment are considered inadequate; and (6) permit an overall understanding of the IMS as an integrated component of the space station.
NASA Technical Reports Server (NTRS)
Matic, Roy M.; Mosley, Judith I.
1994-01-01
Future space-based, remote sensing systems will have data transmission requirements that exceed available downlinks, necessitating the use of lossy compression techniques for multispectral data. In this paper, we describe several algorithms for lossy compression of multispectral data which combine spectral decorrelation techniques with an adaptive, wavelet-based, image compression algorithm to exploit both spectral and spatial correlation. We compare the performance of several different spectral decorrelation techniques including wavelet transformation in the spectral dimension. The performance of each technique is evaluated at compression ratios ranging from 4:1 to 16:1. Performance measures used are visual examination, conventional distortion measures, and multispectral classification results. We also introduce a family of distortion metrics that are designed to quantify and predict the effect of compression artifacts on multispectral classification of the reconstructed data.
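Spectral decorrelation is the step that makes the subsequent 2-D compression effective. A minimal sketch of one such decorrelator follows: a Karhunen-Loève (PCA) transform along the band axis, applied to a synthetic correlated cube standing in for real multispectral data (the paper also evaluates wavelet transforms in the spectral dimension, not shown here):

```python
import numpy as np

def spectral_decorrelate(cube):
    """Karhunen-Loeve (PCA) transform along the spectral axis of an
    (H, W, bands) cube; returns coefficients, band means, and the basis."""
    flat = cube.reshape(-1, cube.shape[-1]).astype(float)
    mean = flat.mean(axis=0)
    cov = np.cov(flat - mean, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    vecs = vecs[:, np.argsort(vals)[::-1]]     # strongest components first
    return (flat - mean) @ vecs, mean, vecs

# Synthetic 3-band cube whose bands are strongly correlated copies of one scene
rng = np.random.default_rng(0)
base = rng.normal(size=(16, 16))
cube = np.stack([base, 0.9 * base, 0.8 * base], axis=-1)
cube = cube + 0.01 * rng.normal(size=cube.shape)
coeffs, band_mean, basis = spectral_decorrelate(cube)
energy = (coeffs ** 2).sum(axis=0)             # energy compacts into component 0
```

After the transform nearly all signal energy sits in the first component, so the spatial coder can spend its bit budget there and quantize the remaining components coarsely.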
Miao, Qing; Zhang, Mingming; Wang, Congzhe; Li, Hongsheng
2018-01-01
This review aims to compare existing robot-assisted ankle rehabilitation techniques in terms of robot design. Included studies mainly consist of selected papers in two published reviews involving a variety of robot-assisted ankle rehabilitation techniques. A free search was also made in Google Scholar and Scopus by using keywords “ankle∗,” and “robot∗,” and (“rehabilitat∗” or “treat∗”). The search is limited to English-language articles published between January 1980 and September 2016. Results show that existing robot-assisted ankle rehabilitation techniques can be classified into wearable exoskeleton and platform-based devices. Platform-based devices are mostly developed for the treatment of a variety of ankle musculoskeletal and neurological injuries, while wearable ones focus more on ankle-related gait training. In terms of robot design, comparative analysis indicates that an ideal ankle rehabilitation robot should have aligned rotation center as the ankle joint, appropriate workspace, and actuation torque, no matter how many degrees of freedom (DOFs) it has. Single-DOF ankle robots are mostly developed for specific applications, while multi-DOF devices are more suitable for comprehensive ankle rehabilitation exercises. Other factors including posture adjustability and sensing functions should also be considered to promote related clinical applications. An ankle rehabilitation robot with reconfigurability to maximize its functions will be a new research point towards optimal design, especially on parallel mechanisms. PMID:29736230
Design and development of a quad copter (UMAASK) using CAD/CAM/CAE
NASA Astrophysics Data System (ADS)
Manarvi, Irfan Anjum; Aqib, Muhammad; Ajmal, Muhammad; Usman, Muhammad; Khurshid, Saqib; Sikandar, Usman
Micro flying vehicles (MFVs) have become a popular area of research due to economy of production, flexibility of launch and variety of applications. A large number of techniques, from pencil sketching to computer-based software, are being used for designing specific geometries and selecting materials to arrive at novel designs for specific requirements. The present research focused on development of a suitable design configuration using CAD/CAM/CAE tools and techniques. A number of designs were reviewed for this purpose. Finally, a rotary-wing quadcopter flying vehicle design was considered appropriate for this research. Performance requirements were planned as approximately 10 meters ceiling, weight less than 500 grams, and the ability to take videos and pictures. Parts were designed using finite element analysis, manufactured using CNC machines and assembled to arrive at the final design, named UMAASK. Flight tests were carried out which confirmed the design requirements.
Distillation Designs for the Lunar Surface
NASA Technical Reports Server (NTRS)
Boul, Peter J.; Lange, Kevin E.; Conger, Bruce; Anderson, Molly
2010-01-01
Gravity-based distillation methods may be applied to the purification of wastewater at a lunar base. These water processing solutions are robust physical separation techniques, which may be more advantageous than many other techniques for their simplicity in design and operation. The two techniques can be used in conjunction with each other to obtain high purity water. The components and feed compositions for modeling wastewater streams are presented in conjunction with the Aspen property system for traditional stage distillation. While the individual components of each waste stream will vary naturally within certain bounds, an analog model for wastewater processing is suggested based on typical concentration ranges for these components. Target purity levels for recycled water are determined for each individual component based on NASA's required maximum contaminant levels for potable water. Optimum parameters such as reflux ratio, feed stage location, and processing rates are determined with respect to the power consumption of the process. Multistage distillation is evaluated for components in wastewater to determine the minimum number of stages necessary for each of 65 components in humidity condensate and urine wastewater mixed streams.
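For a first estimate of the minimum stage count, the Fenske equation for a binary split at total reflux is the standard shortcut. The purity and relative-volatility numbers below are illustrative assumptions, not values from the report's 65-component Aspen model:

```python
import math

def fenske_min_stages(x_dist, x_bot, alpha):
    """Fenske equation: minimum theoretical stages at total reflux for a
    binary separation with constant relative volatility alpha."""
    separation = (x_dist / (1 - x_dist)) * ((1 - x_bot) / x_bot)
    return math.log(separation) / math.log(alpha)

# Example: 99.9 mol% light component overhead, 5 mol% left in the bottoms,
# assuming a relative volatility of 3 (illustrative numbers only)
n_min = fenske_min_stages(x_dist=0.999, x_bot=0.05, alpha=3.0)
```

The actual stage counts in the report come from rigorous multistage simulation, but a shortcut of this form is useful for sanity-checking whether a contaminant is easy or hard to separate before running the full model.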
Laser direct-write for fabrication of three-dimensional paper-based devices.
He, P J W; Katis, I N; Eason, R W; Sones, C L
2016-08-16
We report the use of a laser-based direct-write (LDW) technique that allows the design and fabrication of three-dimensional (3D) structures within a paper substrate that enables implementation of multi-step analytical assays via a 3D protocol. The technique is based on laser-induced photo-polymerisation, and through adjustment of the laser writing parameters such as the laser power and scan speed we can control the depths of hydrophobic barriers that are formed within a substrate which, when carefully designed and integrated, produce 3D flow paths. So far, we have successfully used this depth-variable patterning protocol for stacking and sealing of multi-layer substrates, for assembly of backing layers for two-dimensional (2D) lateral flow devices and finally for fabrication of 3D devices. Since the 3D flow paths can also be formed via a single laser-writing process by controlling the patterning parameters, this is a distinct improvement over other methods that require multiple complicated and repetitive assembly procedures. This technique is therefore suitable for cheap, rapid and large-scale fabrication of 3D paper-based microfluidic devices.
ERIC Educational Resources Information Center
McClain, Lucy R.; Zimmerman, Heather Toomey
2016-01-01
This study describes the implementation of a self-guiding mobile learning tool designed to support youths' engagement with the natural world as they explored the flora and fauna along one nature trail at an environmental center. Using qualitative video-based data collection and analysis techniques, we conducted two design-based research study…
Biosensor-based microRNA detection: techniques, design, performance, and challenges.
Johnson, Blake N; Mutharasan, Raj
2014-04-07
The current state of biosensor-based techniques for amplification-free microRNA (miRNA) detection is critically reviewed. Comparison with non-sensor and amplification-based molecular techniques (MTs), such as polymerase-based methods, is made in terms of transduction mechanism, associated protocol, and sensitivity. Challenges associated with miRNA hybridization thermodynamics which affect assay selectivity and amplification bias are briefly discussed. Electrochemical, electromechanical, and optical classes of miRNA biosensors are reviewed in terms of transduction mechanism, limit of detection (LOD), time-to-results (TTR), multiplexing potential, and measurement robustness. Current trends suggest that biosensor-based techniques (BTs) for miRNA assay will complement MTs due to the advantages of amplification-free detection, LOD being femtomolar (fM)-attomolar (aM), short TTR, multiplexing capability, and minimal sample preparation requirement. Areas of future importance in miRNA BT development are presented which include focus on achieving high measurement confidence and multiplexing capabilities.
Almazyad, Abdulaziz S.; Seddiq, Yasser M.; Alotaibi, Ahmed M.; Al-Nasheri, Ahmed Y.; BenSaleh, Mohammed S.; Obeid, Abdulfattah M.; Qasim, Syed Manzoor
2014-01-01
Anomalies such as leakage and bursts in water pipelines have severe consequences for the environment and the economy. To ensure the reliability of water pipelines, they must be monitored effectively. Wireless Sensor Networks (WSNs) have emerged as an effective technology for monitoring critical infrastructure such as water, oil and gas pipelines. In this paper, we present a scalable design and simulation of a water pipeline leakage monitoring system using Radio Frequency IDentification (RFID) and WSN technology. The proposed design targets long-distance aboveground water pipelines that have special considerations for maintenance, energy consumption and cost. The design is based on deploying a group of mobile wireless sensor nodes inside the pipeline and allowing them to work cooperatively according to a prescheduled order. Under this mechanism, only one node is active at a time, while the other nodes are sleeping. The node whose turn is next wakes up according to one of three wakeup techniques: location-based, time-based and interrupt-driven. In this paper, mathematical models are derived for each technique to estimate the corresponding energy consumption and memory size requirements. The proposed equations are analyzed and the results are validated using simulation. PMID:24561404
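The energy argument behind the single-active-node schedule can be sketched with a simple duty-cycle model. The current draws, supply voltage, and the time-based schedule below are assumed illustrative values, not the paper's derived models:

```python
def node_energy(t_active_s, t_sleep_s, i_active_ma, i_sleep_ua, v=3.0):
    """Energy in joules consumed over one schedule period by a node that is
    awake for t_active_s and asleep for t_sleep_s."""
    e_active = v * (i_active_ma / 1e3) * t_active_s
    e_sleep = v * (i_sleep_ua / 1e6) * t_sleep_s
    return e_active + e_sleep

def time_based_avg_power(n_nodes, period_s, i_active_ma=20.0, i_sleep_ua=10.0, v=3.0):
    """Time-based wakeup: each node is active for 1/n of the schedule period and
    asleep the rest, so average power falls almost linearly with node count."""
    t_active = period_s / n_nodes
    return node_energy(t_active, period_s - t_active,
                       i_active_ma, i_sleep_ua, v) / period_s

avg_p = time_based_avg_power(n_nodes=10, period_s=100.0)   # watts per node
```

Location-based and interrupt-driven wakeup would replace the fixed 1/n active fraction with position- or event-dependent terms, which is where the paper's per-technique energy and memory models differ.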
Macready, Anna L; Fallaize, Rosalind; Butler, Laurie T; Ellis, Judi A; Kuznesof, Sharron; Frewer, Lynn J; Celis-Morales, Carlos; Livingstone, Katherine M; Araújo-Soares, Vera; Fischer, Arnout RH; Stewart-Knox, Barbara J; Mathers, John C
2018-01-01
Background: To determine the efficacy of behavior change techniques applied in dietary and physical activity intervention studies, it is first necessary to record and describe the techniques that have been used during such interventions. Published frameworks used in dietary and smoking cessation interventions undergo continuous development, and most are not adapted for Web-based delivery. The Food4Me study (N=1607) provided the opportunity to use existing frameworks to describe standardized Web-based techniques employed in a large-scale, internet-based intervention to change dietary behavior and physical activity. Objective: The aims of this study were (1) to describe techniques embedded in the Food4Me study design and explain the selection rationale and (2) to demonstrate the use of behavior change technique taxonomies, develop standard operating procedures for training, and identify strengths and limitations of the Food4Me framework that will inform its use in future studies. Methods: The 6-month randomized controlled trial took place simultaneously in seven European countries, with participants receiving one of four levels of personalized advice (generalized, intake-based, intake+phenotype-based, and intake+phenotype+gene-based). A three-phase approach was taken: (1) existing taxonomies were reviewed and techniques were identified a priori for possible inclusion in the Food4Me study, (2) a standard operating procedure was developed to maintain consistency in the use of methods and techniques across research centers, and (3) the Food4Me behavior change technique framework was reviewed and updated post intervention. An analysis of excluded techniques was also conducted. Results: Of 46 techniques identified a priori as being applicable to Food4Me, 17 were embedded in the intervention design; 11 were from a dietary taxonomy, and 6 from a smoking cessation taxonomy. In addition, the four-category smoking cessation framework structure was adopted for clarity of communication.
Smoking cessation texts were adapted for dietary use where necessary. A posteriori, a further 9 techniques were included. Examination of excluded items highlighted the distinction between techniques considered appropriate for face-to-face versus internet-based delivery. Conclusions: The use of existing taxonomies facilitated the description and standardization of techniques used in Food4Me. We recommend that for complex studies of this nature, technique analysis should be conducted a priori to develop standardized procedures and training, and reviewed a posteriori to audit the techniques actually adopted. The present framework description makes a valuable contribution to future systematic reviews and meta-analyses that explore technique efficacy and underlying psychological constructs. This was a novel application of the behavior change taxonomies and the first internet-based personalized nutrition intervention to use such a framework remotely. Trial Registration: ClinicalTrials.gov NCT01530139; https://clinicaltrials.gov/ct2/show/NCT01530139 (Archived by WebCite at http://www.webcitation.org/6y8XYUft1) PMID:29631993
ERIC Educational Resources Information Center
Ghasem, Nayef
2016-01-01
This paper illustrates a teaching technique used in computer applications in chemical engineering employed for designing various unit operation processes, where the students learn about unit operations by designing them. The aim of the course is not to teach design, but rather to teach the fundamentals and the function of unit operation processes…
SDRE controller for motion design of cable-suspended robot with uncertainties and moving obstacles
NASA Astrophysics Data System (ADS)
Behboodi, Ahad; Salehi, Seyedmohammad
2017-10-01
In this paper, an optimal control approach for nonlinear dynamical systems is proposed based on the State-Dependent Riccati Equation (SDRE), and its robustness against uncertainties is shown by simulation results. The proposed method was applied to a spatial six-cable-suspended robot, which was designed to carry loads or perform different tasks in huge workspaces. Motion planning for cable-suspended robots in such a big workspace is subject to uncertainties and obstacles. First, we emphasize the ability of SDRE to provide a systematic basis for the efficient design of controllers for a wide variety of nonlinear dynamical systems. Then we show how this systematic design improves the robustness of the system and facilitates the integration of motion planning techniques with the controller. In particular, an obstacle avoidance technique based on artificial potential fields (APF) can easily be combined with the SDRE controller with efficient performance. Due to the difficulty of solving the SDRE exactly, an approximation method based on power series expansion was used. The efficiency and robustness of the SDRE controller are illustrated on a six-cable-suspended robot with appropriate simulations.
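The APF component is easy to prototype in isolation. Below is a minimal sketch of potential-field navigation: quadratic attraction to the goal plus the classic inverse-distance repulsion inside an influence radius, with gains, radii, and a point-mass motion model all chosen for illustration (the paper couples the field to the SDRE-controlled cable robot instead):

```python
import numpy as np

def apf_force(pos, goal, obstacles, k_att=1.0, k_rep=0.05, d0=1.0):
    """Attractive pull toward the goal plus a repulsive push from each
    obstacle closer than the influence radius d0."""
    f = k_att * (goal - pos)                  # gradient of a quadratic well
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 0.0 < d < d0:
            # classic Khatib-style repulsion, pushing directly away
            f += k_rep * (1.0 / d - 1.0 / d0) / d ** 2 * (diff / d)
    return f

# Steepest-descent motion of a point robot past a single obstacle
pos = np.array([0.0, 0.0])
goal = np.array([5.0, 0.0])
obstacles = [np.array([2.5, 0.5])]
closest = np.inf
for _ in range(1000):
    pos = pos + 0.02 * apf_force(pos, goal, obstacles)
    closest = min(closest, np.linalg.norm(pos - obstacles[0]))
```

In the paper's setting this force would enter the SDRE cost or reference trajectory rather than drive a kinematic point directly, but the avoidance behavior is the same.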
Ground-based deep-space LADAR for satellite detection: A parametric study
NASA Astrophysics Data System (ADS)
Davey, Kevin F.
1989-12-01
The minimum performance requirements of a ground-based infrared LADAR designed to detect deep-space satellites are determined, and a candidate sensor design based on current technology is presented. The research examines LADAR techniques and detection methods to determine the optimum LADAR configuration, and then assesses the effects of atmospheric transmission, background radiance, and turbulence across the infrared region to find the optimum laser wavelengths. Diffraction theory is then used in a parametric analysis of the transmitted laser beam and received signal, using a Cassegrainian telescope design and heterodyne detection. The effects of beam truncation and obscuration, heterodyne misalignment, off-boresight detection, and image-pixel geometry are also included in the analysis. The derived equations are then used to assess the feasibility of several candidate designs under a wide range of detection conditions, including daylight operation through cirrus. The results show that successful detection is theoretically possible under most conditions by transmitting a high power frequency modulated pulse train from an isotopic 13CO2 laser radiating at 11.17 micrometers, and utilizing post-detection integration and pulse compression techniques.
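The scale of the detection problem can be gauged with a simplified coherent range-equation sketch (extended Lambertian target, shot-noise-limited heterodyne detection). The transmit power, aperture, reflectivity, range, and bandwidth below are assumed round numbers, not the thesis's parametric values; only the 11.17 micrometer wavelength comes from the abstract:

```python
import math

H = 6.626e-34   # Planck constant, J*s
C = 3.0e8       # speed of light, m/s

def heterodyne_snr_db(p_tx, range_m, d_rx, rho, wavelength, bandwidth_hz, eta=0.5):
    """Shot-noise-limited heterodyne SNR for an extended Lambertian target:
    P_rx = P_tx * rho * A_rx / (pi * R^2), SNR = eta * P_rx / (h * nu * B)."""
    a_rx = math.pi * (d_rx / 2) ** 2
    p_rx = p_tx * rho * a_rx / (math.pi * range_m ** 2)
    photon_energy = H * C / wavelength
    return 10 * math.log10(eta * p_rx / (photon_energy * bandwidth_hz))

# 1 kW transmitter, 1 m receive aperture, 10% reflectivity,
# geosynchronous range (3.6e7 m), 100 kHz detection bandwidth
snr_db = heterodyne_snr_db(1e3, 3.6e7, 1.0, 0.1, 11.17e-6, 1e5)
```

Even this crude single-pulse estimate lands only a few dB above threshold, which is consistent with the thesis's reliance on post-detection integration and pulse compression to close the link.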
Design of Tactile Sensor Using Dynamic Wafer Technology Based on VLSI Technique
2001-10-25
[4] Charles Noback and Robert Carola, "Human Anatomy and Physiology," 3rd ed., 1995. [5] M. H. Raibert and John E. Tanner, "Design and Implementation of VLSI Tactile Sensing Computer," Robotics Research, vol. 1, 1983.
Low Life Cycle Cost Paratransit Vehicle Design Study
DOT National Transportation Integrated Search
1978-08-01
A preliminary design and cost study was performed for a low life cycle cost paratransit vehicle. The manufacturing technique and cost analysis were based on limited production of 5000 units per year for a ten year period. The vehicle configuration re...
Use of activity theory-based need finding for biomedical device development.
Rismani, Shalaleh; Ratto, Matt; Machiel Van der Loos, H F
2016-08-01
Identifying appropriate needs for biomedical device design is challenging, especially in less structured environments. This paper proposes an alternative need-finding method based on Cultural Historical Activity Theory, expanded to explicitly examine the role of devices within a socioeconomic system. It is compared to a conventional need-finding technique in a preliminary study with engineering student teams. The initial results show that the Activity Theory-based technique allows teams to gain deeper insights into their needs space.
NASA Technical Reports Server (NTRS)
LaValley, Brian W.; Little, Phillip D.; Walter, Chris J.
2011-01-01
This report documents the capabilities of the EDICT tools for error modeling and error propagation analysis when operating with models defined in the Architecture Analysis & Design Language (AADL). We discuss our experience using the EDICT error analysis capabilities on a model of the Scalable Processor-Independent Design for Enhanced Reliability (SPIDER) architecture that uses the Reliable Optical Bus (ROBUS). Based on these experiences we draw some initial conclusions about model-based design techniques for error modeling and analysis of highly reliable computing architectures.
1989-02-01
which capture the knowledge of such experts. These Expert Systems, or Knowledge-Based Systems, differ from the usual computer programming techniques...their applications in the fields of structural design and welding are reviewed. 5.1 Introduction: Expert Systems, or KBES, are computer programs using AI...procedurally constructed as conventional computer programs usually are; the knowledge base of such systems is executable, unlike databases
Design and manufacturing of the CFRP lightweight telescope structure
NASA Astrophysics Data System (ADS)
Stoeffler, Guenter; Kaindl, Rainer
2000-06-01
The design of earthbound telescopes is normally based on conventional steel constructions. Several years ago, thermostable CFRP telescope and reflector structures were developed and manufactured for harsh terrestrial environments. The airborne SOFIA telescope assembly (TA) requires, beyond thermostability, an exceptional stiffness-to-mass ratio for the structure, in order to fulfill performance requirements without exceeding the mass limitations imposed by the Boeing 747 SP aircraft. Integration into the aircraft additionally drives the design of the structure subassemblies. The thickness of the CFRP laminates, whether filament wound or prepreg manufactured, needs special attention and techniques to attain high material quality according to aerospace requirements. Sequential shop assembly of the structure subassemblies minimizes the risk of assembling the TA. Design goals, optimization of the layout, manufacturing techniques, and results are presented.
Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.
2016-01-01
An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
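The one-extra-solve property is easy to see on a toy problem. For a linear model A(p)u = b with output J = cᵀu, the adjoint system Aᵀλ = c yields every component of dJ/dp from a single additional solve. This is a generic adjoint sketch on an assumed toy system, not NASA Langley's CFD implementation:

```python
import numpy as np

def objective_and_gradient(p):
    """Solve A(p) u = b with J = c^T u; the adjoint solve A^T lam = c gives
    dJ/dp_i = -lam^T (dA/dp_i) u for every parameter at once."""
    n = p.size
    A = np.diag(1.0 + p) + 0.1 * np.ones((n, n))
    b = np.ones(n)
    c = np.ones(n)
    u = np.linalg.solve(A, b)          # one forward solve
    lam = np.linalg.solve(A.T, c)      # one adjoint solve, independent of n
    # dA/dp_i is the single-entry matrix E_ii, so the gradient is elementwise
    return c @ u, -lam * u

# Check against one-sided finite differences (n extra solves instead of one)
p = np.array([0.5, 1.0, 1.5])
J, grad = objective_and_gradient(p)
eps = 1e-6
fd = np.array([(objective_and_gradient(p + eps * np.eye(3)[i])[0] - J) / eps
               for i in range(3)])
```

The finite-difference check needs one extra solve per parameter; the adjoint needs one extra solve total, which is the scaling argument the overview makes for CFD problems with thousands of shape parameters.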
DOT National Transportation Integrated Search
2017-06-01
The main objective of this study was to advance the understanding of alternative pavement designs. In particular, potential techniques such as inverted base pavements (IBP) have increased the importance of granular aggregate bases (GAB) in pavement s...
Automated extraction of knowledge for model-based diagnostics
NASA Technical Reports Server (NTRS)
Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.
1990-01-01
The concept of accessing computer aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.
Multivariable PID controller design tuning using bat algorithm for activated sludge process
NASA Astrophysics Data System (ADS)
Atikah Nor’Azlan, Nur; Asmiza Selamat, Nur; Mat Yahya, Nafrizuan
2018-04-01
This project concerns the design of a multivariable PID (MPID) control for a multi-input multi-output activated sludge process, applying four multivariable PID tuning methods: Davison, Penttinen-Koivo, Maciejowski, and a proposed combined method. The aim of this study is to investigate the performance of a selected optimization technique, the Bat Algorithm (BA), in tuning the parameters of the MPID controller. All MPID-BA tuning results are compared and analyzed, and the best MPID-BA is then chosen to determine which technique performs better based on the system's transient response.
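The abstract does not give the authors' cost function or plant model, so the following is a minimal single-loop sketch of how the Bat Algorithm can tune PI gains: candidate gains are "bats" whose velocities pull them toward the best solution found so far, with an occasional local random walk. The first-order plant, the IAE cost, and all algorithm constants are illustrative assumptions, not the paper's setup.

```python
import random

def step_cost(kp, ki, dt=0.02, steps=300):
    # IAE cost of a PI loop around a toy first-order plant dy/dt = -y + u
    y, integ, cost = 0.0, 0.0, 0.0
    for _ in range(steps):
        e = 1.0 - y                 # unit step setpoint
        integ += e * dt
        u = kp * e + ki * integ
        y += dt * (-y + u)
        cost += abs(e) * dt
    return cost

def bat_tune(n_bats=15, iters=60, seed=1):
    rng = random.Random(seed)
    lo, hi = 0.0, 10.0              # gain search bounds (assumed)
    pos = [[rng.uniform(lo, hi), rng.uniform(lo, hi)] for _ in range(n_bats)]
    vel = [[0.0, 0.0] for _ in range(n_bats)]
    fit = [step_cost(*p) for p in pos]
    b = min(range(n_bats), key=lambda i: fit[i])
    best_pos, best_fit = list(pos[b]), fit[b]
    loudness, pulse_rate = 0.9, 0.5
    for _ in range(iters):
        for i in range(n_bats):
            f = rng.random()        # random frequency in [0, 1]
            cand = []
            for d in range(2):
                vel[i][d] += (pos[i][d] - best_pos[d]) * f
                cand.append(min(max(pos[i][d] + vel[i][d], lo), hi))
            if rng.random() > pulse_rate:   # local walk around the best bat
                cand = [min(max(best_pos[d] + 0.05 * rng.gauss(0, 1), lo), hi)
                        for d in range(2)]
            c = step_cost(*cand)
            if c < fit[i] and rng.random() < loudness:
                pos[i], fit[i] = cand, c
            if c < best_fit:
                best_pos, best_fit = list(cand), c
    return best_pos, best_fit
```

The multivariable case in the paper tunes several such loops jointly; the optimization loop itself is unchanged, only the cost function grows.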
NASA Astrophysics Data System (ADS)
Jiang, Xiao-Pan; Zhang, Zi-Liang; Qin, Xiu-Bo; Yu, Run-Sheng; Wang, Bao-Yi
2010-12-01
Positronium time of flight spectroscopy (Ps-TOF) is an effective technique for porous material research. It has advantages over other techniques for analyzing the porosity and pore tortuosity of materials. This paper describes a design for Ps-TOF apparatus based on the Beijing intense slow positron beam, supplying a new material characterization technique. In order to improve the time resolution and increase the count rate of the apparatus, the detector system is optimized. For 3 eV o-Ps, the time broadening is 7.66 ns and the count rate is 3 cps after correction.
A novel surrogate-based approach for optimal design of electromagnetic-based circuits
NASA Astrophysics Data System (ADS)
Hassan, Abdel-Karim S. O.; Mohamed, Ahmed S. A.; Rabie, Azza A.; Etman, Ahmed S.
2016-02-01
A new geometric design centring approach for optimal design of central processing unit-intensive electromagnetic (EM)-based circuits is introduced. The approach uses norms related to the probability distribution of the circuit parameters to find distances from a point to the feasible region boundaries by solving nonlinear optimization problems. Based on these normed distances, the design centring problem is formulated as a max-min optimization problem. A convergent iterative boundary search technique is exploited to find the normed distances. To alleviate the computation cost associated with the EM-based circuits design cycle, space-mapping (SM) surrogates are used to create a sequence of iteratively updated feasible region approximations. In each SM feasible region approximation, the centring process using normed distances is implemented, leading to a better centre point. The process is repeated until a final design centre is attained. Practical examples are given to show the effectiveness of the new design centring method for EM-based circuits.
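The max-min centring idea can be sketched without any EM machinery: for an interior point, measure the distance to the feasible-region boundary along several directions, and pick the point whose smallest such distance is largest. The triangular feasible region, the coordinate search directions, and the grid/march resolution below are all toy assumptions standing in for the paper's normed distances and surrogate-based boundary search.

```python
def feasible(x, y):
    # toy feasible region: intersection of three linear constraints
    return x >= 0 and y >= 0 and x + 2 * y <= 4

def boundary_distance(x, y, dx, dy, step=1e-3, max_r=10.0):
    # march along direction (dx, dy) until the boundary is crossed
    # (a crude stand-in for the paper's convergent boundary search)
    r = 0.0
    while r < max_r and feasible(x + r * dx, y + r * dy):
        r += step
    return r

def design_center(grid=20):
    # max-min formulation: the centre is the interior point whose smallest
    # directional distance to the boundary is largest
    dirs = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    best, best_val = None, -1.0
    for i in range(1, grid):
        for j in range(1, grid):
            x, y = 4.0 * i / grid, 2.0 * j / grid
            if not feasible(x, y):
                continue
            val = min(boundary_distance(x, y, dx, dy) for dx, dy in dirs)
            if val > best_val:
                best, best_val = (x, y), val
    return best, best_val
```

In the paper, the expensive `feasible` test is an EM simulation, which is why a space-mapping surrogate of the feasible region is substituted and iteratively refined.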
Fabric-based active electrode design and fabrication for health monitoring clothing.
Merritt, Carey R; Nagle, H Troy; Grant, Edward
2009-03-01
In this paper, two versions of fabric-based active electrodes are presented to provide a wearable solution for ECG monitoring clothing. The first version of active electrode involved direct attachment of surface-mountable components to a textile screen-printed circuit using polymer thick film techniques. The second version involved attaching a much smaller, thinner, and less obtrusive interposer containing the active electrode circuitry to a simplified textile circuit. These designs explored techniques for electronic textile interconnection, chip attachment to textiles, and packaging of circuits on textiles for durability. The results from ECG tests indicate that the performance of each active electrode is comparable to commercial Ag/AgCl electrodes. The interposer-based active electrodes survived a five-cycle washing test while maintaining good signal integrity.
Compiler-assisted multiple instruction rollback recovery using a read buffer
NASA Technical Reports Server (NTRS)
Alewine, Neal J.; Chen, Shyh-Kwei; Fuchs, W. Kent; Hwu, Wen-Mei W.
1995-01-01
Multiple instruction rollback (MIR) is a technique that has been implemented in mainframe computers to provide rapid recovery from transient processor failures. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow transformations. This paper describes compiler-assisted techniques to achieve multiple instruction rollback recovery. We observe that some data hazards resulting from instruction rollback can be resolved efficiently by providing an operand read buffer while others are resolved more efficiently with compiler transformations. The compiler-assisted scheme presented consists of hardware that is less complex than shadow files, history files, history buffers, or delayed write buffers, while experimental evaluation indicates performance improvement over compiler-based schemes.
NASA Technical Reports Server (NTRS)
Lakshminarayana, B.
1991-01-01
Various computational fluid dynamic techniques are reviewed focusing on the Euler and Navier-Stokes solvers with a brief assessment of boundary layer solutions, and quasi-3D and quasi-viscous techniques. Particular attention is given to a pressure-based method, explicit and implicit time marching techniques, a pseudocompressibility technique for incompressible flow, and zonal techniques. Recommendations are presented with regard to the most appropriate technique for various flow regimes and types of turbomachinery, incompressible and compressible flows, cascades, rotors, stators, liquid-handling, and gas-handling turbomachinery.
A Review of Developments in Computer-Based Systems to Image Teeth and Produce Dental Restorations
Rekow, E. Dianne; Erdman, Arthur G.; Speidel, T. Michael
1987-01-01
Computer-aided design and manufacturing (CAD/CAM) make it possible to automate the creation of dental restorations. Currently practiced techniques are described, and three automated systems under development are described and compared. Advances in CAD/CAM provide a new option for dentistry: an alternative technique for producing dental restorations automatically that meet or exceed current requirements for fit and occlusion.
A high level language for a high performance computer
NASA Technical Reports Server (NTRS)
Perrott, R. H.
1978-01-01
The proposed computational aerodynamic facility will join the ranks of the supercomputers due to its architecture and increased execution speed. At present, the languages used to program these supercomputers have been modifications of programming languages which were designed many years ago for sequential machines. A new programming language should be developed based on the techniques which have proved valuable for sequential programming languages and incorporating the algorithmic techniques required for these supercomputers. The design objectives for such a language are outlined.
Ahmed, Ashik; Al-Amin, Rasheduzzaman; Amin, Ruhul
2014-01-01
This paper proposes the design of a Static Synchronous Series Compensator (SSSC) based damping controller to enhance the stability of a Single Machine Infinite Bus (SMIB) system by means of the Invasive Weed Optimization (IWO) technique. A conventional PI controller is used as the SSSC damping controller, taking the rotor speed deviation as input. The damping controller parameters are tuned with IWO using a cost function based on the time integral of absolute error. The performance of the IWO-based controller is compared to that of a Particle Swarm Optimization (PSO) based controller. Time-domain simulation results are presented, and the performance of the controllers under different loading conditions and fault scenarios is studied to illustrate the effectiveness of the IWO-based design approach.
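IWO mimics weed colonization: fitter weeds spawn more seeds, seeds are scattered with a dispersion that shrinks over generations, and competitive exclusion keeps only the best plants. The sketch below tunes two PI gains against a time-weighted (ITAE-like) cost on an assumed first-order plant; the SMIB dynamics, seed counts, and dispersion schedule of the actual paper are not reproduced here.

```python
import random

def itae(kp, ki, dt=0.02, steps=300):
    # time-weighted absolute error of a PI loop on a toy first-order plant
    y, integ, cost = 0.0, 0.0, 0.0
    for k in range(steps):
        e = 1.0 - y
        integ += e * dt
        y += dt * (-y + kp * e + ki * integ)
        cost += k * dt * abs(e) * dt
    return cost

def iwo_tune(iters=40, pop_max=12, seed=7):
    rng = random.Random(seed)
    pop = [[rng.uniform(0, 10), rng.uniform(0, 10)] for _ in range(5)]
    for t in range(iters):
        costs = [itae(*w) for w in pop]
        lo_c, hi_c = min(costs), max(costs)
        sigma = 2.0 * ((iters - t) / iters) ** 2   # shrinking dispersion
        seeds = []
        for w, c in zip(pop, costs):
            # fitter weeds spawn more seeds (1..4, linearly in relative cost)
            n = 1 + int(3 * (hi_c - c) / (hi_c - lo_c + 1e-12))
            for _ in range(n):
                seeds.append([min(max(w[d] + rng.gauss(0, sigma), 0.0), 10.0)
                              for d in range(2)])
        # competitive exclusion: keep only the pop_max best weeds
        pop = sorted(pop + seeds, key=lambda w: itae(*w))[:pop_max]
    best = pop[0]
    return best, itae(*best)
```

The comparison with PSO in the paper is then simply a matter of swapping this loop for a PSO update while keeping the same cost function.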
Shock and vibration technology with applications to electrical systems
NASA Technical Reports Server (NTRS)
Eshleman, R. L.
1972-01-01
A survey is presented of shock and vibration technology for electrical systems developed by the aerospace programs. The shock environment is surveyed along with new techniques for modeling, computer simulation, damping, and response analysis. Design techniques based on the use of analog computers, shock spectra, optimization, and nonlinear isolation are discussed. Shock mounting of rotors for performance and survival, and vibration isolation techniques are reviewed.
NASA Astrophysics Data System (ADS)
Orsini, S.; di Lellis, A. M.; Milillo, A.; de Angelis, E.; Mura, A.; Selci, S.; Dandouras, I.; Cerulli-Irelli, P.; Leoni, R.; Mangano, V.; Massetti, S.; Mattioli, F.; Orfei, R.; Austin, C.; Medale, J.-L.; Vertolli, N.; di Giulio, D.
2009-06-01
The neutral sensor ELENA (Emitted Low-Energy Neutral Atoms) for the ESA cornerstone BepiColombo mission to Mercury (in the SERENA instrument package) is a new kind of low-energy neutral atom instrument, mostly devoted to sputtering emission from planetary surfaces, from E~20 eV up to E~5 keV, within 1-D (2°×76°). ELENA is a Time-of-Flight (TOF) system based on an oscillating shutter (operated at frequencies up to 100 kHz) and mechanical gratings: the incoming neutral particles directly impinge upon the entrance with a definite timing (START) and arrive at a STOP detector after a flight path. After a brief discussion of the achievable scientific objectives, this paper describes the instrument, the new design techniques adopted for neutral particle identification, and the nano-techniques used for designing and manufacturing the nano-structured shuttering core of the ELENA sensor. The expected count rates, based on the features of the Hermean environment, are briefly presented and discussed. Such design technologies could be fruitfully exported to different applications for planetary exploration.
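The physics a shuttered TOF sensor exploits is just classical kinematics: an energy measurement reduces to a timing measurement over a known flight path. The sketch below shows the relation for hydrogen atoms over an assumed 10 cm path (ELENA's actual geometry and species mix are not given in this abstract), including how the shutter's timing spread maps into energy resolution via dE/E = 2·dt/t.

```python
import math

EV = 1.602176634e-19      # joules per eV
M_H = 1.6735575e-27       # hydrogen atom mass, kg (illustrative species)

def tof(energy_ev, path_m=0.10, mass_kg=M_H):
    # classical time of flight: t = L / v with v = sqrt(2E/m)
    v = math.sqrt(2.0 * energy_ev * EV / mass_kg)
    return path_m / v

def energy_resolution(energy_ev, dt_s, path_m=0.10, mass_kg=M_H):
    # from E = m L^2 / (2 t^2): |dE|/E = 2 * dt / t
    return 2.0 * dt_s / tof(energy_ev, path_m, mass_kg)
```

Slow 20 eV atoms take roughly an order of magnitude longer to cross the gap than 5 keV atoms, which is why a fast (up to 100 kHz) shutter can time-tag the full energy range.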
A data compression technique for synthetic aperture radar images
NASA Technical Reports Server (NTRS)
Frost, V. S.; Minden, G. J.
1986-01-01
A data compression technique is developed for synthetic aperture radar (SAR) imagery. The technique is based on an SAR image model and is designed to preserve the local statistics in the image by an adaptive variable rate modification of block truncation coding (BTC). A data rate of approximately 1.6 bit/pixel is achieved with the technique while maintaining the image quality and cultural (pointlike) targets. The algorithm requires no large data storage and is computationally simple.
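The fixed-rate core that the paper's adaptive variable-rate scheme modifies is classic block truncation coding: each block is reduced to a 1-bit-per-pixel map plus two reconstruction levels chosen so the block's sample mean and standard deviation are preserved. The sketch below shows that core on a flat list of pixels (block shape and the SAR-specific adaptations are omitted).

```python
import math

def btc_block(block):
    # classic BTC: threshold at the block mean, then pick two levels that
    # exactly preserve the block's mean and standard deviation
    m = len(block)
    mean = sum(block) / m
    std = math.sqrt(sum((p - mean) ** 2 for p in block) / m)
    bits = [1 if p >= mean else 0 for p in block]
    q = sum(bits)
    if q in (0, m):                       # flat block: one level suffices
        return bits, mean, mean
    lo = mean - std * math.sqrt(q / (m - q))
    hi = mean + std * math.sqrt((m - q) / q)
    return bits, lo, hi

def btc_decode(bits, lo, hi):
    return [hi if b else lo for b in bits]
```

Preserving the first two local moments is what lets BTC keep local statistics and point-like targets; the paper's contribution is varying the rate per block to reach about 1.6 bit/pixel on SAR imagery.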
EMU Suit Performance Simulation
NASA Technical Reports Server (NTRS)
Cowley, Matthew S.; Benson, Elizabeth; Harvill, Lauren; Rajulu, Sudhakar
2014-01-01
Introduction: Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. To verify that new suit designs meet requirements, full prototypes must be built and tested with human subjects. However, numerous design iterations will occur before the hardware meets those requirements. Traditional draw-prototype-test paradigms for research and development are prohibitively expensive with today's shrinking Government budgets. Personnel at NASA are developing modern simulation techniques that focus on a human-centric design paradigm. These new techniques make use of virtual prototype simulations and fully adjustable physical prototypes of suit hardware. This is extremely advantageous and enables comprehensive design down-selections to be made early in the design process. Objectives: The primary objective was to test modern simulation techniques for evaluating the human performance component of two EMU suit concepts, pivoted and planar style hard upper torso (HUT). Methods: This project simulated variations in EVA suit shoulder joint design and subject anthropometry and then measured the differences in shoulder mobility caused by the modifications. These estimations were compared to human-in-the-loop test data gathered during past suited testing using four subjects (two large males, two small females). Results: Results demonstrated that EVA suit modeling and simulation are feasible design tools for evaluating and optimizing suit design based on simulated performance. The suit simulation model was found to be advantageous in its ability to visually represent complex motions and volumetric reach zones in three dimensions, giving designers a faster and deeper comprehension of suit component performance vs. human performance. 
Suit models were able to discern differing movement capabilities between EMU HUT configurations, generic suit fit concerns, and specific suit fit concerns for crewmembers based on individual anthropometry.
NASA software specification and evaluation system design, part 2
NASA Technical Reports Server (NTRS)
1976-01-01
A survey and analysis of the existing methods, tools, and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for a software specification language and the database verifier are presented.
A Mobile-Based E-Learning System
ERIC Educational Resources Information Center
Ojokoh, Bolanle Adefowoke; Doyeni, Olubimtan Ayo; Adewale, Olumide Sunday; Isinkaye, Folasade Olubusola
2013-01-01
E-learning is an innovative approach for delivering electronically mediated, well-designed, learner-centred interactive learning environments by utilizing internet and digital technologies with respect to instructional design principles. This paper presents the application of Software Development techniques in the development of a Mobile Based…
NASA Astrophysics Data System (ADS)
Baier, S.; Rochet, A.; Hofmann, G.; Kraut, M.; Grunwaldt, J.-D.
2015-06-01
We report on a new modular setup on a silicon-based microreactor designed for correlative spectroscopic, scattering, and analytic on-line gas investigations for in situ studies of heterogeneous catalysts. The silicon microreactor allows a combination of synchrotron radiation based techniques (e.g., X-ray diffraction and X-ray absorption spectroscopy) as well as infrared thermography and Raman spectroscopy. Catalytic performance can be determined simultaneously by on-line product analysis using mass spectrometry. We present the design of the reactor, the experimental setup, and as a first example for an in situ study, the catalytic partial oxidation of methane showing the applicability of this reactor for in situ studies.
Verification and Validation of Autonomy Software at NASA
NASA Technical Reports Server (NTRS)
Pecheur, Charles
2000-01-01
Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis on model-based autonomy, model-checking techniques, and concrete experiments at NASA.
Chang, Yeong-Chan
2005-12-01
This paper addresses the problem of designing adaptive fuzzy-based (or neural-network-based) robust controls for a large class of uncertain nonlinear time-varying systems, which can be perturbed by plant uncertainties, unmodeled perturbations, and external disturbances. A nonlinear H-infinity control technique, incorporated with adaptive control and variable structure control (VSC) techniques, is employed to construct an intelligent robust stabilization controller such that H-infinity control is achieved. The robust tracking control design for uncertain robotic systems is used to demonstrate the effectiveness of the developed stabilization scheme: an intelligent robust tracking controller for uncertain robotic systems in the presence of high-degree uncertainties can easily be implemented. Its solution requires only solving a linear algebraic matrix inequality, and satisfactory transient and asymptotic tracking performance is guaranteed. A simulation example confirms the performance of the developed control algorithms.
NASA Astrophysics Data System (ADS)
Kozhikkottu, Vivek J.
The scaling of integrated circuits into the nanometer regime has led to variations emerging as a primary concern for designers of integrated circuits. Variations are an inevitable consequence of the semiconductor manufacturing process, and also arise due to the side-effects of operation of integrated circuits (voltage, temperature, and aging). Conventional design approaches, which are based on design corners or worst-case scenarios, leave designers with an undesirable choice between the considerable overheads associated with over-design and significantly reduced manufacturing yield. Techniques for variation-tolerant design at the logic, circuit and layout levels of the design process have been developed and are in commercial use. However, with the incessant increase in variations due to technology scaling and design trends such as near-threshold computing, these techniques are no longer sufficient to contain the effects of variations, and there is a need to address variations at all stages of design. This thesis addresses the problem of variation-tolerant design at the earliest stages of the design process, where the system-level design decisions that are made can have a very significant impact. There are two key aspects to making system-level design variation-aware. First, analysis techniques must be developed to project the impact of variations on system-level metrics such as application performance and energy. Second, variation-tolerant design techniques need to be developed to absorb the residual impact of variations (that cannot be contained through lower-level techniques). In this thesis, we address both these facets by developing robust and scalable variation-aware analysis and variation mitigation techniques at the system level. The first contribution of this thesis is a variation-aware system-level performance analysis framework. 
We address the key challenge of translating the per-component clock frequency distributions into a system-level application performance distribution. This task is particularly complex and challenging due to the inter-dependencies between components' execution, indirect effects of shared resources, and interactions between multiple system-level "execution paths". We argue that accurate variation-aware performance analysis requires Monte-Carlo based repeated system execution. Our proposed analysis framework leverages emulation to significantly speedup performance analysis without sacrificing the generality and accuracy achieved by Monte-Carlo based simulations. Our experiments show performance improvements of around 60x compared to state-of-the-art hardware-software co-simulation tools and also underscore the framework's potential to enable variation-aware design and exploration at the system level. Our second contribution addresses the problem of designing variation-tolerant SoCs using recovery based design, a popular circuit design paradigm that addresses variations by eliminating guard-bands and operating circuits at close to "zero margins" while detecting and recovering from timing errors. While previous efforts have demonstrated the potential benefits of recovery based design, we identify several challenges that need to be addressed in order to apply this technique to SoCs. We present a systematic design framework to apply recovery based design at the system level. We propose to partition SoCs into "recovery islands", wherein each recovery island consists of one or more SoC components that can recover independent of the rest of the SoC. We present a variation-aware design methodology that partitions a given SoC into recovery islands and computes the optimal operating points for each island, taking into account the various trade-offs involved. 
Our experiments demonstrate that the proposed design framework achieves an average of 32% energy savings over conventional worst-case designs, with negligible losses in performance. The third contribution of this thesis introduces disproportionate allocation of shared system resources as a means to combat the adverse impact of within-die variations on multi-core platforms. For multi-threaded programs executing on variation-impacted multi-core platforms, we make the key observation that thread performance is not only a function of the frequency of the core on which it is executing, but also depends upon the amount of shared system resources allocated to it. We utilize this insight to design a variation-aware runtime scheme which allocates the ways of a last-level shared L2 cache among the different cores/threads of a multi-core platform, taking into account both application characteristics and chip-specific variation profiles. Our experiments on 100 quad-core chips, each with a distinct variation profile, show an average of 15% performance improvement for a suite of multi-threaded benchmarks. Our final contribution investigates the variation-tolerant design of domain-specific accelerators and demonstrates how the unique architectural properties of these accelerators can be leveraged to create highly effective variation-tolerance mechanisms. We explore this concept through the variation-tolerant design of a vector processor that efficiently executes applications from the domains of recognition, mining and synthesis (RMS). We develop a novel design approach for variation tolerance, which leverages the unique nature of the vector reduction operations performed by this processor to effectively predict and preempt the occurrence of timing errors under variations and subsequently restore the correct output at the end of each vector reduction operation. 
We implement the above predict, preempt and restore operations by suitably enhancing the processor hardware and the application software and demonstrate considerable energy benefits (on an average 32%) across six applications from the domains of RMS. In conclusion, our work provides system designers with powerful tools and mechanisms in their efforts to combat variations, resulting in improved designer productivity and variation-tolerant systems.
NASA Astrophysics Data System (ADS)
Yin, X.; Chen, G.; Li, W.; Huthchins, D. A.
2013-01-01
Previous work indicated that the capacitive imaging (CI) technique is a useful NDE tool which can be used on a wide range of materials, including metals, glass/carbon fibre composite materials and concrete. The imaging performance of the CI technique for a given application is determined by design parameters and characteristics of the CI probe. In this paper, a rapid method for calculating the whole probe sensitivity distribution based on the finite element model (FEM) is presented to provide a direct view of the imaging capabilities of the planar CI probe. Sensitivity distributions of CI probes with different geometries were obtained. Influencing factors on sensitivity distribution were studied. Comparisons between CI probes with point-to-point triangular electrode pair and back-to-back triangular electrode pair were made based on the analysis of the corresponding sensitivity distributions. The results indicated that the sensitivity distribution could be useful for optimising the probe design parameters and predicting the imaging performance.
Coarse-to-fine markerless gait analysis based on PCA and Gauss-Laguerre decomposition
NASA Astrophysics Data System (ADS)
Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Carli, Marco; Neri, Alessandro; D'Alessio, Tommaso
2005-04-01
Human movement analysis is generally performed through the utilization of marker-based systems, which allow reconstructing, with high levels of accuracy, the trajectories of markers allocated on specific points of the human body. Marker based systems, however, show some drawbacks that can be overcome by the use of video systems applying markerless techniques. In this paper, a specifically designed computer vision technique for the detection and tracking of relevant body points is presented. It is based on the Gauss-Laguerre Decomposition, and a Principal Component Analysis Technique (PCA) is used to circumscribe the region of interest. Results obtained on both synthetic and experimental tests provide significant reduction of the computational costs, with no significant reduction of the tracking accuracy.
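How PCA can "circumscribe the region of interest" is easy to see on a 2-D point cloud: the covariance of foreground pixel coordinates yields the principal axes, and a few standard deviations along each axis bound the region. The closed-form 2x2 eigen-decomposition below is a generic sketch, not the authors' pipeline, and the scale factor `k` is an assumed choice.

```python
import math

def pca_roi(points, k=2.5):
    # principal axes of a 2-D point cloud; the ROI is spanned by +/- k
    # standard deviations along each principal direction
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # eigen-decomposition of the 2x2 covariance matrix in closed form
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4.0 - det, 0.0))
    l1, l2 = tr / 2.0 + disc, tr / 2.0 - disc
    theta = 0.5 * math.atan2(2.0 * sxy, sxx - syy)  # major-axis orientation
    half = (k * math.sqrt(max(l1, 0.0)), k * math.sqrt(max(l2, 0.0)))
    return (mx, my), theta, half
```

Restricting the Gauss-Laguerre feature search to this oriented box is what cuts the computational cost without touching tracking accuracy.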
Principles and techniques in the design of ADMS+. [advanced data-base management system
NASA Technical Reports Server (NTRS)
Roussopoulos, Nick; Kang, Hyunchul
1986-01-01
ADMS+/- is an advanced database management system whose architecture integrates the ADMS+ mainframe database system with a large number of workstation database systems, designated ADMS-; no communications exist between these workstations. The use of this system radically decreases the response time of locally processed queries, since the workstation runs in a single-user mode and no dynamic security checking is required for the downloaded portion of the database. The deferred update strategy used reduces the overhead due to update synchronization in message traffic.
A Method for Generating Reduced Order Linear Models of Supersonic Inlets
NASA Technical Reports Server (NTRS)
Chicatelli, Amy; Hartley, Tom T.
1997-01-01
For the modeling of high speed propulsion systems, there are at least two major categories of models. One is based on computational fluid dynamics (CFD), and the other is based on design and analysis of control systems. CFD is accurate and gives a complete view of the internal flow field, but it typically has many states and runs much slower than real-time. Models based on control design typically run near real-time but do not always capture the fundamental dynamics. To provide improved control models, methods are needed that are based on CFD techniques but yield models that are small enough for control analysis and design.
Integrated performance and reliability specification for digital avionics systems
NASA Technical Reports Server (NTRS)
Brehm, Eric W.; Goettge, Robert T.
1995-01-01
This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on development of a language for specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS, which will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.
United States Army Weapon Systems 2010
2009-09-18
cue Soldiers based on how their brains process what they see, hear, and feel. Such neuro-ergonomic designs can exploit how the brain functions...environments, as well as techniques to use them for neuro-ergonomic design. Technology development will focus on solutions to cognition, visual...are accomplishing our mission. It is designed to promote greater understanding of our major acquisition programs. It describes what each is designed
Design and Evaluation of Perceptual-based Object Group Selection Techniques
NASA Astrophysics Data System (ADS)
Dehmeshki, Hoda
Selecting groups of objects is a frequent task in graphical user interfaces. It is required prior to many standard operations such as deletion, movement, or modification. Conventional selection techniques are lasso, rectangle selection, and the selection and de-selection of items through the use of modifier keys. These techniques may become time-consuming and error-prone when target objects are densely distributed or when the distances between target objects are large. Perceptual-based selection techniques can considerably improve selection tasks when targets have a perceptual structure, for example when arranged along a line. Current methods to detect such groups use ad hoc grouping algorithms that are not based on results from perception science. Moreover, these techniques do not allow selecting groups with arbitrary arrangements or permit modifying a selection. This dissertation presents two domain-independent perceptual-based systems that address these issues. Based on established group detection models from perception research, the proposed systems detect perceptual groups formed by the Gestalt principles of good continuation and proximity. The new systems provide gesture-based or click-based interaction techniques for selecting groups with curvilinear or arbitrary structures as well as clusters. Moreover, the gesture-based system is adapted for the graph domain to facilitate path selection. This dissertation includes several user studies that show the proposed systems outperform conventional selection techniques when targets form salient perceptual groups and are still competitive when targets are semi-structured.
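The Gestalt proximity principle the dissertation builds on has a simple computational reading: objects closer than some threshold to any member of a group belong to that group, i.e. single-link clustering. The union-find sketch below is a generic illustration of that grouping step, not the dissertation's perception-research-based detector, and the threshold is an assumed parameter.

```python
def proximity_groups(points, threshold):
    # Gestalt proximity grouping via single-link clustering:
    # objects within `threshold` of any group member join that group
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            dx = points[i][0] - points[j][0]
            dy = points[i][1] - points[j][1]
            if dx * dx + dy * dy <= threshold * threshold:
                parent[find(i)] = find(j)   # union the two groups

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())
```

A perceptual-based selection tool can then map one click or gesture onto a whole detected group instead of requiring a lasso around every member.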
Investigation of advanced phase-shifting projected fringe profilometry techniques
NASA Astrophysics Data System (ADS)
Liu, Hongyu
1999-11-01
The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurement of rough engineering surfaces. Compared with competing techniques, it is notable for its full-field measurement capability, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle, with some new approaches, three important problems that severely limit the capability and accuracy of the PSPFP technique. Chapter 1 briefly introduces background on the PSPFP technique, including its measurement principles, basic features, and related techniques; the objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment of absolute PSPFP measurement. The mathematical formulations and basic requirements of the absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the preceding theoretical analysis, and various fundamental experiments performed for concept verification and accuracy evaluation are introduced together with brief comments. Chapter 4 presents a theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, suggestions on system design are given to improve measurement accuracy. Chapter 5 discusses a new technique for combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process.
Techniques for coping with two major effects of surface reflectivity variations are then introduced, and some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and offers suggestions for future research.
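For context on the phase-shifting step the chapters above build on, the standard four-step algorithm recovers the wrapped phase from four fringe images shifted by 90°. This is the generic textbook form, not necessarily the dissertation's exact formulation:

```python
import math

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four intensity samples taken at phase shifts
    of 0, pi/2, pi and 3*pi/2: phi = atan2(I4 - I2, I1 - I3)."""
    return math.atan2(i4 - i2, i1 - i3)

# synthetic fringe at a known phase phi0: I_k = a + b*cos(phi0 + k*pi/2)
a, b, phi0 = 2.0, 1.0, 0.7
frames = [a + b * math.cos(phi0 + k * math.pi / 2) for k in range(4)]
print(round(four_step_phase(*frames), 6))  # → 0.7
```

The arctangent cancels both the background intensity a and the fringe contrast b, which is what makes the method robust on rough surfaces.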
Rifai, Damhuji; Abdalla, Ahmed N; Razali, Ramdan; Ali, Kharudin; Faraj, Moneer A
2017-03-13
The use of the eddy current technique (ECT) for the non-destructive testing of conducting materials has become increasingly important in the past few years. Non-destructive ECT plays a key role in ensuring the safety and integrity of large industrial structures such as oil and gas pipelines. This paper introduces a novel ECT probe design integrated with a distributed ECT inspection system (DSECT) for crack inspection of inner ferromagnetic pipes. The system consists of an array of giant magneto-resistive (GMR) sensors, a pneumatic system, a rotating magnetic field excitation source, and a host PC acting as the data analysis center. The probe design parameters, namely the probe diameter, the excitation coil, and the number of GMR sensors in the array, are optimized using numerical optimization based on the desirability approach. The main benefits of DSECT are its modularity and flexibility regarding the use of different types of magnetic transducers/sensors, and of signals of a different nature with either digital or analog outputs, making it well suited to an ECT probe design using an array of GMR magnetic sensors. The real-time DSECT system was applied to the inspection of a 70 mm carbon steel pipe. In order to predict axial and circumferential defect detection, a mathematical model is developed based on response surface methodology (RSM). The inspection results for a carbon steel pipe sample with artificial defects indicate that the system design is highly efficient.
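The desirability approach mentioned above maps each response to a [0, 1] score and ranks candidate designs by the geometric mean of the scores. The sketch below is a hypothetical Derringer-style version over a small parameter grid; the response models, ranges, and parameter bounds are invented placeholders, not the paper's fitted RSM.

```python
from itertools import product

def desirability(y, lo, hi):
    """Larger-is-better desirability, linear between lo and hi."""
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return (y - lo) / (hi - lo)

def overall(ds):
    """Geometric mean of the individual desirabilities."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

best = None
for diam_mm, n_sensors in product(range(10, 31, 5), range(4, 17, 4)):
    sensitivity = 0.02 * diam_mm + 0.05 * n_sensors   # placeholder response
    coverage = 0.06 * n_sensors                       # placeholder response
    score = overall([desirability(sensitivity, 0.2, 1.2),
                     desirability(coverage, 0.2, 1.0)])
    if best is None or score > best[0]:
        best = (score, diam_mm, n_sensors)

print(best)
```

A real study would replace the placeholder responses with the RSM polynomials fitted to measured probe data.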
TRAC Innovative Visualization Techniques
2016-11-14
Therefore, TRAC analysts need a way to analyze the effectiveness of their visualization design choices. Currently, TRAC does not have a methodology to analyze visualizations used to support an analysis story. Our research team developed a visualization design methodology to create effective visualizations that support an analysis story. First, we based our methodology on the latest research on design thinking, cognitive learning, and
High-speed engine/component performance assessment using exergy and thrust-based methods
NASA Technical Reports Server (NTRS)
Riggins, D. W.
1996-01-01
This investigation summarizes a comparative study of two high-speed engine performance assessment techniques based on energy (available work) and thrust-potential (thrust availability). Simple flow-fields utilizing Rayleigh heat addition and one-dimensional flow with friction are used to demonstrate the fundamental inability of conventional energy techniques to predict engine component performance, aid in component design, or accurately assess flow losses. The use of the thrust-based method on these same examples demonstrates its ability to yield useful information in all these categories. Energy and thrust are related and discussed from the standpoint of their fundamental thermodynamic and fluid dynamic definitions in order to explain the differences in information obtained using the two methods. The conventional definition of energy is shown to include work which is inherently unavailable to an aerospace Brayton engine. An engine-based energy is then developed which accurately accounts for this inherently unavailable work; performance parameters based on this quantity are then shown to yield design and loss information equivalent to the thrust-based method.
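The thrust-based bookkeeping rests on the stream thrust of a station, momentum flux plus pressure force; per unit mass flow this is Sa = V + R*T/V for an ideal gas. The two states below are invented for illustration and are not taken from the report:

```python
GAMMA, R = 1.4, 287.0  # air, J/(kg*K)

def specific_stream_thrust(T_kelvin, mach):
    """Stream thrust per unit mass flow: Sa = V + R*T/V,
    with V = M * sqrt(gamma * R * T) for an ideal gas."""
    V = mach * (GAMMA * R * T_kelvin) ** 0.5
    return V + R * T_kelvin / V

sa_in = specific_stream_thrust(600.0, 3.0)    # notional combustor inlet
sa_out = specific_stream_thrust(1800.0, 1.8)  # notional post-heat-addition state
print(round(sa_in, 1), round(sa_out, 1))      # heat addition raises Sa
```

Comparing Sa across stations (against an ideal reference process) is what lets the thrust-based method localize losses that a plain energy balance hides.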
Algorithms and architecture for multiprocessor based circuit simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deutsch, J.T.
Accurate electrical simulation is critical to the design of high performance integrated circuits. Logic simulators can verify function and give first-order timing information. Switch level simulators are more effective at dealing with charge sharing than standard logic simulators, but cannot provide accurate timing information or discover DC problems. Delay estimation techniques and cell level simulation can be used in constrained design methods, but must be tuned for each application, and circuit simulation must still be used to generate the cell models. None of these methods has the guaranteed accuracy that many circuit designers desire, and none can provide detailed waveform information. Detailed electrical-level simulation can predict circuit performance if devices and parasitics are modeled accurately. However, the computational requirements of conventional circuit simulators make it impractical to simulate current large circuits. In this dissertation, the implementation of Iterated Timing Analysis (ITA), a relaxation-based technique for accurate circuit simulation, on a special-purpose multiprocessor is presented. The ITA method is an SOR-Newton, relaxation-based method which uses event-driven analysis and selective trace to exploit the temporal sparsity of the electrical network. Because event-driven selective trace techniques are employed, this algorithm lends itself to implementation on a data-driven computer.
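The event-driven, selective-trace idea behind ITA can be sketched on a toy linear node system: only nodes whose neighbours changed beyond a tolerance are re-solved, so quiescent parts of the network cost nothing. The 3-node diagonally dominant system below is an invented stand-in for a resistive network, not the dissertation's formulation:

```python
from collections import deque

# node equations: x_i = (b_i + sum_j a[i,j] * x_j) / d_i
neighbours = {0: [1], 1: [0, 2], 2: [1]}
a = {(0, 1): 1.0, (1, 0): 1.0, (1, 2): 1.0, (2, 1): 1.0}
d = {0: 3.0, 1: 3.0, 2: 3.0}
b = {0: 5.0, 1: 0.0, 2: 0.0}

x = {i: 0.0 for i in d}
queue, tol = deque(d), 1e-10
while queue:
    i = queue.popleft()
    new = (b[i] + sum(a[(i, j)] * x[j] for j in neighbours[i])) / d[i]
    if abs(new - x[i]) > tol:       # "event": the node actually moved
        x[i] = new
        for j in neighbours[i]:     # selective trace: wake the fanout only
            if j not in queue:
                queue.append(j)

print({i: round(v, 6) for i, v in x.items()})
```

In the real simulator each "node solve" is a Newton update of a nonlinear device equation, but the scheduling skeleton is the same.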
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mian, Muhammad Umer, E-mail: umermian@gmail.com; Khir, M. H. Md.; Tang, T. B.
Pre-fabrication behavioural and performance analysis with computer-aided design (CAD) tools is common practice and saves fabrication cost. In light of this, we present a simulation methodology for a dual-mass-oscillator-based 3-degree-of-freedom (3-DoF) MEMS gyroscope. The 3-DoF gyroscope is modeled through lumped parameter models using equivalent circuit elements. These equivalent circuits consist of elementary components that are counterparts of the respective mechanical components used to design and fabricate the 3-DoF MEMS gyroscope. The complete design of the equivalent circuit model, the mathematical modeling, and the simulation are presented in this paper. Behaviors of the equivalent lumped models derived for the proposed device design are simulated in MEMSPRO T-SPICE software. Simulations are carried out with the design specifications following the design rules of the MetalMUMPS fabrication process. Drive-mass resonant frequencies simulated by this technique are 1.59 kHz and 2.05 kHz, respectively, which are close to the resonant frequencies found by the analytical formulation of the gyroscope. The lumped equivalent circuit modeling technique proved to be a time-efficient technique for the analysis of complex MEMS devices such as 3-DoF gyroscopes, and an alternative to the complex and time-consuming coupled-field finite element analysis (FEA) used previously.
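The analytical cross-check mentioned above reduces, for each proof mass, to the textbook resonance formula f = sqrt(k/m)/(2π) of the equivalent mass-spring (or L-C) pair. The mass values below are illustrative assumptions, with stiffnesses backed out to land on the reported drive frequencies:

```python
import math

def resonant_freq_hz(k_n_per_m, m_kg):
    """Natural frequency of a lumped mass-spring pair: sqrt(k/m)/(2*pi)."""
    return math.sqrt(k_n_per_m / m_kg) / (2 * math.pi)

m1, m2 = 2.0e-7, 2.0e-7                 # proof masses, kg (illustrative)
k1 = (2 * math.pi * 1.59e3) ** 2 * m1   # stiffness consistent with 1.59 kHz
k2 = (2 * math.pi * 2.05e3) ** 2 * m2   # stiffness consistent with 2.05 kHz
print(round(resonant_freq_hz(k1, m1)), round(resonant_freq_hz(k2, m2)), "Hz")
```
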
Continual Response Measurement: Design and Validation.
ERIC Educational Resources Information Center
Baggaley, Jon
1987-01-01
Discusses reliability and validity of continual response measurement (CRM), a computer-based measurement technique, and its use in social science research. Highlights include the importance of criterion-referencing the data, guidelines for designing studies using CRM, examples typifying their deductive and inductive functions, and a discussion of…
Computer-oriented synthesis of wide-band non-uniform negative resistance amplifiers
NASA Technical Reports Server (NTRS)
Branner, G. R.; Chan, S.-P.
1975-01-01
This paper presents a synthesis procedure which provides design values for broad-band amplifiers using non-uniform negative resistance devices. Employing a weighted least squares optimization scheme, the technique, based on an extension of procedures for uniform negative resistance devices, is capable of providing designs for a variety of matching network topologies. It also provides, for the first time, quantitative results for predicting the effects of parameter element variations on overall amplifier performance. The technique is also unique in that it employs exact partial derivatives for optimization and sensitivity computation. In comparison with conventional procedures, significantly improved broad-band designs are shown to result.
Top down, bottom up structured programming and program structuring
NASA Technical Reports Server (NTRS)
Hamilton, M.; Zeldin, S.
1972-01-01
New design and programming techniques are presented for shuttle software. Based on previous Apollo experience, recommendations are made to apply top-down structured programming techniques to shuttle software. New software verification techniques for large software systems are recommended. HAL, the higher-order language selected for the shuttle flight code, is discussed and found to be adequate for implementing these techniques. Recommendations are made to apply a workable combination of top-down and bottom-up methods in the management of shuttle software. Program structuring is discussed as relevant to both programming and management techniques.
Herrera-May, Agustín Leobardo; Soler-Balcazar, Juan Carlos; Vázquez-Leal, Héctor; Martínez-Castillo, Jaime; Vigueras-Zuñiga, Marco Osvaldo; Aguilera-Cortés, Luz Antonio
2016-08-24
Microelectromechanical systems (MEMS) resonators have allowed the development of magnetic field sensors with potential applications in biomedicine, the automotive industry, navigation systems, space satellites, telecommunications, and non-destructive testing. We present a review of recent magnetic field sensors based on MEMS resonators that operate with Lorentz force. These sensors have a compact structure, wide measurement range, low energy consumption, high sensitivity, and suitable performance. The design methodology, simulation tools, damping sources, sensing techniques, and future applications of magnetic field sensors are discussed. The design process is fundamental in achieving correct selection of the operation principle, sensing technique, materials, fabrication process, and readout systems of the sensors. In addition, the main sensing systems and challenges of the MEMS sensors are discussed. To develop the best devices, research on their mechanical reliability, vacuum packaging, design optimization, and temperature compensation circuits is needed. Future applications will require multifunctional sensors for monitoring several physical parameters (e.g., magnetic field, acceleration, angular rate, humidity, temperature, and gases).
Heat transfer with hockey-stick steam generator. [LMFBR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moody, E; Gabler, M J
1977-11-01
The hockey-stick modular design concept is a good answer to future needs for reliable, economic LMFBR steam generators. The concept was successfully demonstrated in the 30 MWt MSG test unit; scaled-up versions are currently in fabrication for CRBRP usage, and further scaling has been accomplished for PLBR applications. Design and performance characteristics are presented for the three generations of hockey-stick steam generators. The key features of the design are presented based on extensive analytical effort backed up by extensive ancillary test data. The bases for the performance evaluations, and the actual evaluations, are presented with emphasis on the CRBRP design. The design effort on these units has resulted in the development of analytical techniques that are directly applicable to steam generators for any LMFBR application. In conclusion, the hockey-stick steam generator concept has been proven to perform both thermally and hydraulically as predicted. The heat transfer characteristics are well defined, and proven analytical techniques are available, as are personnel experienced in their use.
Design and Implementation of a New Real-Time Frequency Sensor Used as Hardware Countermeasure
Jiménez-Naharro, Raúl; Gómez-Galán, Juan Antonio; Sánchez-Raya, Manuel; Gómez-Bravo, Fernando; Pedro-Carrasco, Manuel
2013-01-01
A new digital countermeasure against attacks related to the clock frequency is presented. This countermeasure, known as a frequency sensor, consists of a local oscillator, a transition detector, a measurement element, and an output block. The countermeasure has been designed using a full-custom technique implemented in an Application-Specific Integrated Circuit (ASIC), and the implementation has been verified and characterized with an integrated design using a 0.35 μm standard Complementary Metal Oxide Semiconductor (CMOS) technology (a Very Large Scale Integration, VLSI, implementation). The proposed solution is configurable in resolution time and allowed range of period, achieving a minimum resolution time of only 1.91 ns and an initialization time of 5.84 ns. The proposed VLSI implementation shows better results than other solutions, such as digital ones based on semi-custom techniques and analog ones based on band pass filters, all design parameters considered. Finally, a counter has been used to verify the good performance of the countermeasure in avoiding the success of an attack. PMID:24008285
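A behavioural software model (not the ASIC itself) of the counting principle: count whole local-oscillator ticks within one period of the monitored clock and raise an alarm when the count leaves a configured window. The 1.91 ns resolution comes from the abstract; the window bounds and function names are invented for illustration:

```python
def period_in_ticks(clock_period_ns, osc_period_ns):
    """Whole local-oscillator ticks observed during one clock period."""
    return int(clock_period_ns // osc_period_ns)

def frequency_alarm(clock_period_ns, osc_period_ns=1.91,
                    min_ticks=4, max_ticks=8):
    """True when the monitored clock period is outside the allowed window."""
    ticks = period_in_ticks(clock_period_ns, osc_period_ns)
    return not (min_ticks <= ticks <= max_ticks)

print(frequency_alarm(10.0))   # nominal clock: no alarm
print(frequency_alarm(40.0))   # clock slowed by an attacker: alarm
```

Both over- and under-clocking attacks shift the tick count out of the window, which is why a single counter suffices as the detector.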
Initial planetary base construction techniques and machine implementation
NASA Technical Reports Server (NTRS)
Crockford, William W.
1987-01-01
Conceptual designs of (1) initial planetary base structures, and (2) an unmanned machine to perform the construction of these structures using materials local to the planet are presented. Rock melting is suggested as a possible technique to be used by the machine in fabricating roads, platforms, and interlocking bricks. Identification of problem areas in machine design and materials processing is accomplished. The feasibility of the designs is contingent upon favorable results of an analysis of the engineering behavior of the product materials. The analysis requires knowledge of several parameters for solution of the constitutive equations of the theory of elasticity. An initial collection of these parameters is presented which helps to define research needed to perform a realistic feasibility study. A qualitative approach to estimating power and mass lift requirements for the proposed machine is used which employs specifications of currently available equipment. An initial, unmanned mission scenario is discussed with emphasis on identifying uncompleted tasks and suggesting design considerations for vehicles and primitive structures which use the products of the machine processing.
NASA Technical Reports Server (NTRS)
Ippolito, Louis J.
1989-01-01
The NASA Propagation Effects Handbook for Satellite Systems Design provides a systematic compilation of the major propagation effects experienced on space-Earth paths in the 10 to 100 GHz frequency region. It provides both a detailed description of the propagation phenomena and a summary of their impact on communications system design and performance. Chapters 2 through 5 describe the propagation effects, prediction models, and available experimental databases. In Chapter 6, design techniques and prediction methods available for evaluating propagation effects on space-Earth communication systems are presented. Chapter 7 addresses the system design process, how propagation effects on system design and performance should be considered, and how they can be mitigated. Examples of operational and planned Ku-, Ka-, and EHF-band satellite communications systems are given.
Short Duration Base Heating Test Improvements
NASA Technical Reports Server (NTRS)
Bender, Robert L.; Dagostino, Mark G.; Engel, Bradley A.; Engel, Carl D.
1999-01-01
Significant improvements have been made to a short duration space launch vehicle base heating test technique. This technique was first developed during the 1960's to investigate launch vehicle plume induced convective environments. Recent improvements include the use of coiled nitrogen buffer gas lines upstream of the hydrogen / oxygen propellant charge tubes, fast acting solenoid valves, stand alone gas delivery and data acquisition systems, and an integrated model design code. Technique improvements were successfully demonstrated during a 2.25% scale X-33 base heating test conducted in the NASA/MSFC Nozzle Test Facility in early 1999. Cost savings of approximately an order of magnitude over previous tests were realized due in large part to these improvements.
NASA Astrophysics Data System (ADS)
Bandte, Oliver
It has always been the intention of systems engineering to invent or produce the best product possible. Many design techniques have been introduced over the course of decades that try to fulfill this intention. Unfortunately, no technique has succeeded in combining multi-criteria decision making with probabilistic design. The design technique developed in this thesis, the Joint Probabilistic Decision Making (JPDM) technique, successfully overcomes this deficiency by generating a multivariate probability distribution that serves, in conjunction with a criterion value range of interest, as a universally applicable objective function for multi-criteria optimization and product selection. This new objective function constitutes a meaningful metric, called Probability of Success (POS), that allows the customer or designer to make a decision based on the chance of satisfying the customer's goals. In order to incorporate a joint probabilistic formulation into the systems design process, two algorithms are created that allow for an easy implementation into a numerical design framework: the (multivariate) Empirical Distribution Function and the Joint Probability Model. The Empirical Distribution Function estimates the probability that an event occurred by counting how many times it occurred in a given sample. The Joint Probability Model, on the other hand, is an analytical parametric model for the multivariate joint probability. It is comprised of the product of the univariate criterion distributions, generated by the traditional probabilistic design process, multiplied with a correlation function that is based on available correlation information between pairs of random variables. JPDM is an excellent tool for multi-objective optimization and product selection because of its ability to transform disparate objectives into a single figure of merit, the likelihood of successfully meeting all goals, or POS.
The advantage of JPDM over other multi-criteria decision making techniques is that POS constitutes a single optimizable function or metric that enables a comparison of all alternative solutions on an equal basis. Hence, POS allows for the use of any standard single-objective optimization technique available and simplifies a complex multi-criteria selection problem into a simple ordering problem, where the solution with the highest POS is best. By distinguishing between controllable and uncontrollable variables in the design process, JPDM can account for the uncertain values of the uncontrollable variables that are inherent to the design problem, while facilitating an easy adjustment of the controllable ones to achieve the highest possible POS. Finally, JPDM's superiority over current multi-criteria decision making techniques is demonstrated with an optimization of a supersonic transport concept and ten contrived equations as well as a product selection example, determining an airline's best choice among Boeing's B-747, B-777, Airbus' A340, and a Supersonic Transport. The optimization examples demonstrate JPDM's ability to produce a better solution with a higher POS than an Overall Evaluation Criterion or Goal Programming approach. Similarly, the product selection example demonstrates JPDM's ability to produce a better solution with a higher POS and different ranking than the Overall Evaluation Criterion or Technique for Order Preferences by Similarity to the Ideal Solution (TOPSIS) approach.
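The Empirical Distribution Function route to POS can be sketched directly: given joint samples of the criteria from any probabilistic design loop, POS is the fraction of samples that land inside every criterion's target range. The two-criterion Gaussian model below is synthetic, for illustration only:

```python
import random

random.seed(1)
# joint Monte Carlo samples of two criteria (e.g. range and cost), each
# with uncertainty from the underlying design variables
samples = [(random.gauss(100.0, 10.0),
            random.gauss(50.0, 5.0))
           for _ in range(100_000)]

# criterion value ranges of interest: range > 95, cost < 55
targets = [(95.0, float("inf")),
           (float("-inf"), 55.0)]

pos = sum(all(lo <= c <= hi for c, (lo, hi) in zip(s, targets))
          for s in samples) / len(samples)
print(round(pos, 3))
```

Because POS is a single scalar in [0, 1], any standard single-objective optimizer can drive the controllable design variables to maximize it.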
Large Terrain Modeling and Visualization for Planets
NASA Technical Reports Server (NTRS)
Myint, Steven; Jain, Abhinandan; Cameron, Jonathan; Lim, Christopher
2011-01-01
Physics-based simulations are actively used in the design, testing, and operations phases of surface and near-surface planetary space missions. One of the challenges in real-time simulation is handling large multi-resolution terrain data sets, both within models and for visualization. In this paper, we describe special techniques that we have developed for visualization, paging, and data storage for dealing with these large data sets. The visualization technique uses a real-time GPU-based continuous level-of-detail approach that delivers multiple-frames-per-second performance even for planetary-scale terrain models.
Development of evaluation technique of GMAW welding quality based on statistical analysis
NASA Astrophysics Data System (ADS)
Feng, Shengqiang; Terasaki, Hidenri; Komizo, Yuichi; Hu, Shengsun; Chen, Donggao; Ma, Zhihua
2014-11-01
Nondestructive techniques for appraising gas metal arc welding (GMAW) faults play a very important role in on-line quality control and prediction for the GMAW process. Existing on-line approaches suffer from several disadvantages, such as high cost, low efficiency, complexity, and strong sensitivity to the environment. An enhanced, efficient technique for evaluating welding faults based on Mahalanobis distance (MD) and the normal distribution is presented. In addition, a new piece of equipment, designated the weld quality tester (WQT), is developed based on the proposed evaluation technique. MD is superior to other multidimensional distances such as Euclidean distance because the covariance matrix used for calculating MD takes into account correlations and scaling in the data. The values of MD obtained from welding current and arc voltage are assumed to follow a normal distribution with mean μ and standard deviation σ. In the proposed evaluation technique used by the WQT, values of MD in the range from zero to μ + 3σ are regarded as "good". Two experiments, involving changing the flow of shielding gas and smearing paint on the surface of the substrate, are conducted in order to verify the sensitivity of the proposed technique and the feasibility of the WQT. The experimental results demonstrate the usefulness of the WQT for evaluating welding quality. The proposed technique can be applied to implement on-line welding quality control and prediction, which is of great importance for designing novel equipment for weld quality detection.
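The WQT decision rule above can be sketched end to end: estimate the mean and covariance of (current, voltage) from a "good weld" reference set, compute each sample's Mahalanobis distance, and flag samples whose MD exceeds μ + 3σ of the reference MDs. The numbers below are synthetic; the real system uses logged GMAW data.

```python
import random

random.seed(7)
# synthetic "good weld" reference: (current A, voltage V)
ref = [(random.gauss(200, 5), random.gauss(25, 1)) for _ in range(2000)]

n = len(ref)
mean = [sum(c[k] for c in ref) / n for k in range(2)]
# 2x2 sample covariance and its closed-form inverse
cov = [[sum((c[i] - mean[i]) * (c[j] - mean[j]) for c in ref) / (n - 1)
        for j in range(2)] for i in range(2)]
det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
inv = [[cov[1][1] / det, -cov[0][1] / det],
       [-cov[1][0] / det, cov[0][0] / det]]

def md(sample):
    """Mahalanobis distance of a (current, voltage) sample to the reference."""
    dx = [sample[k] - mean[k] for k in range(2)]
    return (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
            + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1])) ** 0.5

mds = [md(c) for c in ref]
mu = sum(mds) / n
sigma = (sum((v - mu) ** 2 for v in mds) / (n - 1)) ** 0.5
limit = mu + 3 * sigma   # the paper's "good" threshold

print(md((201.0, 25.2)) <= limit)   # near-nominal sample: good
print(md((230.0, 31.0)) <= limit)   # far-off sample: flagged
```

Because the covariance whitens current and voltage jointly, a fault that nudges both channels slightly in a correlated way can still be caught even when each channel alone stays in range.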
Mobility based key management technique for multicast security in mobile ad hoc networks.
Madhusudhanan, B; Chitra, S; Rajan, C
2015-01-01
In MANET multicasting, maintaining forward and backward secrecy results in an increased packet drop rate owing to mobility. Frequent rekeying causes large message overhead, which increases energy consumption and end-to-end delay. In particular, the prevailing group key management techniques cope poorly with frequent mobility and disconnections, so there is a need for a multicast key management technique that overcomes these problems. In this paper, we propose a mobility-based key management technique for multicast security in MANETs. Initially, the nodes are categorized according to their stability index, which is estimated based on link availability and mobility. A multicast tree is constructed such that every weak node has a strong parent node. A session key-based encryption technique is utilized to transmit multicast data. The rekeying process is performed periodically by the initiator node, and the rekeying interval is fixed depending on the node category, so the technique greatly minimizes rekeying overhead. Simulation results show that the proposed approach reduces the packet drop rate and improves data confidentiality.
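One way to picture the node categorization is a stability index that combines link availability with (inverse) mobility, with the rekeying interval stretched for stable nodes. The weights, thresholds, and interval values below are invented placeholders, not the paper's parameters:

```python
def stability_index(link_availability, speed_mps, max_speed_mps=20.0):
    """Score in [0, 1]: high link availability and low speed => stable node."""
    mobility = min(speed_mps, max_speed_mps) / max_speed_mps
    return 0.5 * link_availability + 0.5 * (1.0 - mobility)

def rekey_interval(index, base_s=30.0):
    """Strong (stable) nodes rekey half as often as weak ones."""
    return base_s * (2.0 if index >= 0.5 else 1.0)

print(rekey_interval(stability_index(0.9, 2.0)))    # stable node
print(rekey_interval(stability_index(0.3, 18.0)))   # weak node
```

Stretching the interval only for stable nodes keeps the secrecy guarantees where churn is likely while cutting rekeying traffic where it is not.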
A new slit lamp-based technique for anterior chamber angle estimation.
Gispets, Joan; Cardona, Genís; Tomàs, Núria; Fusté, Cèlia; Binns, Alison; Fortes, Miguel A
2014-06-01
To design and test a new noninvasive method for anterior chamber angle (ACA) estimation based on the slit lamp that is accessible to all eye-care professionals. A new technique (slit lamp anterior chamber estimation [SLACE]) that aims to overcome some of the limitations of the van Herick procedure was designed. The technique, which only requires a slit lamp, was applied to estimate the ACA of 50 participants (100 eyes) using two different slit lamp models, and results were compared with gonioscopy as the clinical standard. The Spearman nonparametric correlation between ACA values as determined by gonioscopy and SLACE were 0.81 (p < 0.001) and 0.79 (p < 0.001) for each slit lamp. Sensitivity values of 100 and 87.5% and specificity values of 75 and 81.2%, depending on the slit lamp used, were obtained for the SLACE technique as compared with gonioscopy (Spaeth classification). The SLACE technique, when compared with gonioscopy, displayed good accuracy in the detection of narrow angles, and it may be useful for eye-care clinicians without access to expensive alternative equipment or those who cannot perform gonioscopy because of legal constraints regarding the use of diagnostic drugs.
Design-for-Hardware-Trust Techniques, Detection Strategies and Metrics for Hardware Trojans
2015-12-14
down both rising and falling transitions. For Trojan detection, one fault, slow-to-rise or slow-to-...in Jan. 2016. Through the course of this project we developed novel hardware Trojan detection techniques based on clock sweeping. The technique takes...algorithms to detect minor changes due to Trojans and compared them with those changes made by process variations. This technique was implemented on
Analyzing the effectiveness of a frame-level redundancy scrubbing technique for SRAM-based FPGAs
Tonfat, Jorge; Lima Kastensmidt, Fernanda; Rech, Paolo; ...
2015-12-17
Radiation effects such as soft errors are the major threat to the reliability of SRAM-based FPGAs. This work analyzes the effectiveness in correcting soft errors of a novel scrubbing technique using internal frame redundancy called Frame-level Redundancy Scrubbing (FLR-scrubbing). This correction technique can be implemented in a coarse-grain TMR design. The FLR-scrubbing technique was implemented on a mid-size Xilinx Virtex-5 FPGA device used as a case study. The FLR-scrubbing technique was tested under neutron radiation and fault injection. Implementation results demonstrated minimum area and energy consumption overhead when compared to other techniques. The time to repair the fault is also improved by using the Internal Configuration Access Port (ICAP). Lastly, neutron radiation test results demonstrated that the proposed technique is suitable for correcting accumulated SEUs and MBUs.
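The core repair idea is simple to model: with three copies of a configuration frame (as in a coarse-grain TMR layout), a scrubber can rebuild the correct frame by bitwise 2-of-3 majority vote. This pure-Python sketch models the vote only, not the actual ICAP-based frame readback and writeback:

```python
def vote(frame_a, frame_b, frame_c):
    """Bitwise 2-of-3 majority across three equal-length frame copies."""
    return bytes((a & b) | (a & c) | (b & c)
                 for a, b, c in zip(frame_a, frame_b, frame_c))

golden = bytes([0b10110100, 0b01011010])
upset = bytes([0b10110100 ^ 0b00001000, 0b01011010])  # SEU flips one bit
print(vote(upset, golden, golden) == golden)           # corrupted copy restored
```

Majority voting corrects any number of upsets as long as no bit position is corrupted in two copies at once, which is why the paper evaluates accumulated SEUs and MBUs separately.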
NASA Astrophysics Data System (ADS)
Kim, Ilkyu
Recent developments in mobile communications have led to an increased use of short-range communications and high-data-rate signal transmission. These new technologies create the need for accurate near-field coupling analysis and novel antenna designs. An ability to effectively estimate the coupling within the near-field region is required to realize short-range communications. Currently, two common techniques are applicable to the near-field coupling problem: 1) the integral form of the coupling formula and 2) the generalized Friis formula. These formulas are investigated with an emphasis on straightforward calculation and accuracy for various distances between the two antennas. The coupling formulas are computed for a variety of antennas, and several antenna configurations are evaluated through full-wave simulation and indoor measurement in order to validate these techniques. In addition, this research aims to design multi-functional, high-performance antennas based on MEMS (microelectromechanical systems) switches, EBG (electromagnetic bandgap) structures, and septum polarizers. A MEMS switch is incorporated into a slot-loaded patch antenna to attain frequency reconfigurability. The resonant frequency of the patch antenna can be shifted using the MEMS switch, which is actuated by the integrated bias networks. Furthermore, a high-gain base-station antenna utilizing beam tilting is designed to maximize gain for tilted-beam applications. To realize this base-station antenna, an array of four dipole-EBG elements is constructed to implement a fixed down-tilt main beam with application in base-station arrays. The improvement of the operating range with the EBG-dipole array is evaluated using a simple link-budget analysis. The septum polarizer has been widely used in circularly polarized antenna systems due to its simple and compact design and high quality of circularity.
In this research, the sigmoid function is used to smoothen the edge in the septum design, which makes it suitable for HPM systems. The PSO (Particle Swarm Optimization) technique is applied to the septum design to achieve a high performance antenna design. The electric field intensity above the septum is evaluated through the simulation and its properties are compared to simple half-plane scattering phenomena.
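The PSO step applied to the septum design can be sketched with a minimal particle swarm optimizer. The objective below is a stand-in sphere function, not the septum figure of merit from this work, and the swarm parameters are common textbook values rather than the authors' settings.

```python
import random

random.seed(0)

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
        lo=-5.0, hi=5.0):
    """Minimal particle swarm optimizer with a global-best topology."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # personal bests
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in objective (sphere function), not the septum design objective:
best, val = pso(lambda x: sum(xi * xi for xi in x), dim=3)
```

In the actual design problem, evaluating a particle would require a full-wave simulation of the smoothed septum geometry.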
2004-06-01
Information Systems, Faculty of ICT, International Islamic University, Malaysia. Abstract. Several techniques for evaluating a groupware...inspection-based techniques couldn't be carried out in other parts of Pakistan where the IT industry has mushroomed in the past few years. Nevertheless...there are no set standards for using any particular technique. Evaluating a groupware interface is an evolving process and requires more investigation
A firefly algorithm for optimum design of new-generation beams
NASA Astrophysics Data System (ADS)
Erdal, F.
2017-06-01
This research addresses the minimum weight design of new-generation steel beams with sinusoidal openings using a metaheuristic search technique, namely the firefly method. The proposed algorithm is also used to compare the optimum design results of sinusoidal web-expanded beams with steel castellated and cellular beams. Optimum design problems of all beams are formulated according to the design limitations stipulated by the Steel Construction Institute. The design methods adopted in these publications are consistent with BS 5950 specifications. The formulation of the design problem considering the above-mentioned limitations turns out to be a discrete programming problem. The design algorithms based on the technique select the optimum universal beam sections, dimensional properties of sinusoidal, hexagonal and circular holes, and the total number of openings along the beam as design variables. Furthermore, this selection is also carried out such that the behavioural limitations are satisfied. Numerical examples are presented, where the suggested algorithm is implemented to achieve the minimum weight design of these beams subjected to loading combinations.
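The firefly search mechanism can be sketched on a continuous stand-in objective. The actual beam problem is a discrete selection of universal sections and opening dimensions under SCI/BS 5950 constraints, so this is only an illustration of the metaheuristic itself; all parameter values are assumptions.

```python
import math, random

random.seed(0)

def firefly(f, dim, n=25, iters=150, beta0=1.0, gamma=0.1, alpha=0.3,
            lo=-2.0, hi=2.0):
    """Minimal firefly algorithm for continuous minimization; a lower
    objective value corresponds to a brighter firefly."""
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    I = [f(x) for x in X]
    best_x, best_val = min(zip(X, I), key=lambda p: p[1])
    best_x = best_x[:]
    for t in range(iters):
        a = alpha * (1 - t / iters)              # shrink the random step
        for i in range(n):
            for j in range(n):
                if I[j] < I[i]:                  # move i toward brighter j
                    r2 = sum((X[i][d] - X[j][d]) ** 2 for d in range(dim))
                    beta = beta0 * math.exp(-gamma * r2)
                    for d in range(dim):
                        X[i][d] += (beta * (X[j][d] - X[i][d])
                                    + a * (random.random() - 0.5))
                        X[i][d] = min(hi, max(lo, X[i][d]))
                    I[i] = f(X[i])
                    if I[i] < best_val:
                        best_x, best_val = X[i][:], I[i]
    return best_x, best_val

# Stand-in continuous objective (sphere), not the beam weight function:
best, val = firefly(lambda x: sum(xi * xi for xi in x), dim=2)
```

In the weight-minimization setting, the objective would return the beam weight, with penalty terms for violated behavioural constraints.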
Design Criteria For Networked Image Analysis System
NASA Astrophysics Data System (ADS)
Reader, Cliff; Nitteberg, Alan
1982-01-01
Image systems design is currently undergoing a metamorphosis from the conventional computing systems of the past into a new generation of special-purpose designs. This change is motivated by several factors, notable among which is the increased opportunity for high performance at low cost offered by advances in semiconductor technology. Another key issue is a maturing understanding of the problems and of the applicability of digital processing techniques. These factors allow the design of cost-effective systems that are functionally dedicated to specific applications and used in a utilitarian fashion. Following an overview of the above issues, the paper presents a top-down approach to the design of networked image analysis systems. The requirements for such a system are presented, with orientation toward the hospital environment. The three main areas are image database management, viewing of image data, and image data processing. This is followed by a survey of the current state of the art, covering image display systems, database techniques, communications networks, and software systems control. The paper concludes with a description of the functional subsystems and architectural framework for networked image analysis in a production environment.
A systematic FPGA acceleration design for applications based on convolutional neural networks
NASA Astrophysics Data System (ADS)
Dong, Hao; Jiang, Li; Li, Tianjian; Liang, Xiaoyao
2018-04-01
Most FPGA accelerators for convolutional neural networks are designed to optimize the inner accelerator and ignore the optimization of the data path between the inner accelerator and the outer system. This can lead to poor performance in applications like real-time video object detection. We propose a brand new systematic FPGA acceleration design to solve this problem. This design takes the data path between the inner accelerator and the outer system into consideration and optimizes it using techniques such as hardware format transformation and frame compression. It also applies fixed-point arithmetic and a new pipelining technique to optimize the inner accelerator. Together these make the final system's performance very good, reaching about 10 times the performance of the original system.
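The fixed-point conversion used in such inner accelerators can be sketched as rounding to a signed Q-format grid with saturation. The word length and fraction bits below are illustrative, not the format chosen in this design.

```python
# Hedged sketch of fixed-point quantization, one of the inner-accelerator
# optimizations mentioned above. Q8.8-style format is an assumption.

def to_fixed(x, frac_bits=8, word_bits=16):
    """Round to a signed fixed-point grid with `frac_bits` fractional bits,
    saturating at the representable range of the word."""
    scale = 1 << frac_bits
    q = round(x * scale)
    lo, hi = -(1 << (word_bits - 1)), (1 << (word_bits - 1)) - 1
    return max(lo, min(hi, q))

def from_fixed(q, frac_bits=8):
    """Convert the integer code back to a real value."""
    return q / (1 << frac_bits)

vals = [0.1234, -1.5, 3.9]
rt = [from_fixed(to_fixed(v)) for v in vals]
errs = [abs(a - b) for a, b in zip(vals, rt)]
assert max(errs) <= 1 / (1 << 9)   # in-range error is at most half an LSB
```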
Robust Stability Analysis of the Space Launch System Control Design: A Singular Value Approach
NASA Technical Reports Server (NTRS)
Pei, Jing; Newsome, Jerry R.
2015-01-01
Classical stability analysis consists of breaking the feedback loops one at a time and determining separately how much gain or phase variation would destabilize the stable nominal feedback system. For typical launch vehicle control design, classical control techniques are generally employed. In addition to stability margins, frequency domain Monte Carlo methods are used to evaluate the robustness of the design. However, such techniques were developed for Single-Input-Single-Output (SISO) systems and do not take into consideration the off-diagonal terms in the transfer function matrix of Multi-Input-Multi-Output (MIMO) systems. Robust stability analysis techniques such as H-infinity and mu are applicable to MIMO systems but have not been adopted as standard practices within the launch vehicle controls community. This paper took advantage of a simple singular-value-based MIMO stability margin evaluation method based on work done by Mukhopadhyay and Newsom and applied it to the SLS high-fidelity dynamics model. The method computes a simultaneous multi-loop gain and phase margin that could be related back to classical margins. The results presented in this paper suggest that for the SLS system, traditional SISO stability margins are similar to the MIMO margins. This additional level of verification provides confidence in the robustness of the control design.
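The singular-value margin computation can be sketched as follows: evaluate the smallest singular value of the return difference I + L(jw) over frequency; its minimum bounds the simultaneous multi-loop gain and phase perturbations the closed loop tolerates. The 2x2 open-loop model below is invented for illustration (it is not the SLS model), and the margin formulas are the standard singular-value bounds.

```python
import numpy as np

def min_return_difference_sv(L, freqs):
    """alpha = min over frequency of the smallest singular value of
    I + L(jw), where L(s) returns the open-loop transfer matrix."""
    n = L(1j * freqs[0]).shape[0]
    alpha = np.inf
    for w in freqs:
        sv = np.linalg.svd(np.eye(n) + L(1j * w), compute_uv=False)
        alpha = min(alpha, sv[-1])      # singular values sorted descending
    return alpha

# Invented 2x2 decoupled open loop, standing in for a real MIMO model:
def L(s):
    return np.diag([2.0 / (s * (s + 1.0)), 1.0 / (s + 0.5)])

freqs = np.logspace(-2, 2, 400)
alpha = min_return_difference_sv(L, freqs)
# Guaranteed simultaneous multi-loop margins from the singular-value bound:
gm_lo, gm_hi = 1.0 / (1.0 + alpha), 1.0 / (1.0 - alpha)  # gain factors
pm_deg = 2.0 * np.degrees(np.arcsin(alpha / 2.0))        # phase margin
```

A larger alpha means the Nyquist locus of every loop stays farther from the critical point simultaneously, which is what relates this measure back to classical margins.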
NASA Astrophysics Data System (ADS)
Duan, Libin; Xiao, Ning-cong; Li, Guangyao; Cheng, Aiguo; Chen, Tao
2017-07-01
Tailor-rolled blank thin-walled (TRB-TH) structures have become important vehicle components owing to their advantages of light weight and crashworthiness. The purpose of this article is to provide an efficient lightweight design for improving the energy-absorbing capability of TRB-TH structures under dynamic loading. A finite element (FE) model for TRB-TH structures is established and validated by performing a dynamic axial crash test. Different material properties for individual parts with different thicknesses are considered in the FE model. Then, a multi-objective crashworthiness design of the TRB-TH structure is constructed based on the ɛ-support vector regression (ɛ-SVR) technique and non-dominated sorting genetic algorithm-II. The key parameters (C, ɛ and σ) are optimized to further improve the predictive accuracy of ɛ-SVR under limited sample points. Finally, the technique for order preference by similarity to the ideal solution method is used to rank the solutions in Pareto-optimal frontiers and find the best compromise optima. The results demonstrate that the light weight and crashworthiness performance of the optimized TRB-TH structures are superior to their uniform thickness counterparts. The proposed approach provides useful guidance for designing TRB-TH energy absorbers for vehicle bodies.
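The final ranking step, the technique for order preference by similarity to the ideal solution (TOPSIS), can be sketched directly. The candidate designs, weights, and criteria below (energy absorption as a benefit criterion, mass as a cost criterion) are invented for illustration.

```python
import numpy as np

def topsis(scores, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution.
    scores: m alternatives x n criteria; benefit[j] is True when a larger
    value of criterion j is better."""
    A = np.asarray(scores, dtype=float)
    A = A / np.linalg.norm(A, axis=0)            # vector-normalize criteria
    A = A * np.asarray(weights, dtype=float)     # apply criterion weights
    ideal = np.where(benefit, A.max(axis=0), A.min(axis=0))
    worst = np.where(benefit, A.min(axis=0), A.max(axis=0))
    d_best = np.linalg.norm(A - ideal, axis=1)
    d_worst = np.linalg.norm(A - worst, axis=1)
    return d_worst / (d_best + d_worst)          # higher = closer to ideal

# Three invented Pareto candidates scored on (energy absorption, mass):
c = topsis([[30.0, 12.0], [28.0, 9.0], [22.0, 8.0]],
           weights=[0.5, 0.5], benefit=[True, False])
best = int(np.argmax(c))    # best compromise design
```

Here the middle candidate wins: it trades a little energy absorption for a substantial mass saving, which is exactly the kind of compromise TOPSIS is meant to surface.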
Yu, Quan; Gong, Xin; Wang, Guo-Min; Yu, Zhe-Yuan; Qian, Yu-Fen; Shen, Gang
2011-01-01
To establish a new method of presurgical nasoalveolar molding (NAM) using computer-aided reverse engineering and rapid prototyping techniques in infants with unilateral cleft lip and palate (UCLP). Five infants (2 males and 3 females with mean age of 1.2 w) with complete UCLP were recruited. All patients were subjected to NAM before the cleft lip repair. The upper denture casts were recorded using a three-dimensional laser scanner within 2 weeks after birth in UCLP infants. A digital model was constructed and analyzed to simulate the NAM procedure with reverse engineering software. The digital geometrical data were exported to print the solid model with a rapid prototyping system. The whole set of appliances was fabricated based on these solid models. Laser scanning and digital model construction simplified the NAM procedure and allowed the treatment objective to be estimated. The appliances were fabricated based on the rapid prototyping technique, and for each patient, the complete set of appliances could be obtained at one time. By the end of presurgical NAM treatment, the cleft was narrowed, and the malformation of the nasoalveolar segments was aligned normally. We have developed a novel technique of presurgical NAM based on computer-aided design. The accurate digital denture model of UCLP infants could be obtained with laser scanning. The treatment design and appliance fabrication could be simplified with a computer-aided reverse engineering and rapid prototyping technique.
Multidisciplinary Aerospace Systems Optimization: Computational AeroSciences (CAS) Project
NASA Technical Reports Server (NTRS)
Kodiyalam, S.; Sobieski, Jaroslaw S. (Technical Monitor)
2001-01-01
The report describes a method for performing optimization of a system whose analysis is so expensive that it is impractical to let the optimization code invoke it directly, because excessive computational cost and elapsed time might result. In such situations it is imperative to let the user control the number of times the analysis is invoked. The reported method achieves that with two techniques in the Design of Experiments category: a uniform dispersal of the trial design points over an n-dimensional hypersphere combined with response surface fitting, and the technique of kriging. Analyses of all the trial designs, whose number may be set by the user, are performed before activation of the optimization code and the results are stored as a database. That code is then executed and refers to the above database. Two applications, one involving an airborne laser system and one an aircraft optimization, illustrate the method.
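The sampling-then-surrogate idea can be sketched as: disperse trial points uniformly over an n-dimensional hypersphere, run the expensive analysis once per point, and fit a cheap response surface that the optimizer queries instead. The analysis function and model terms below are invented stand-ins, and the fit is plain least squares rather than kriging.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere_points(n_points, dim, radius=1.0):
    """Trial points dispersed uniformly over a dim-dimensional hypersphere:
    normalized Gaussian vectors are uniformly distributed on the sphere."""
    g = rng.normal(size=(n_points, dim))
    return radius * g / np.linalg.norm(g, axis=1, keepdims=True)

# Stand-in for the expensive analysis (invented closed form):
def analysis(x):
    return 1.0 + 2.0 * x[0] - 0.5 * x[1] + 0.1 * x[0] * x[1]

X = sphere_points(60, 2, radius=0.8)
y = np.array([analysis(x) for x in X])   # one analysis call per trial point

# Response surface fitted once; the optimizer then queries this cheap
# surrogate rather than invoking the analysis again:
feats = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                         X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(feats, y, rcond=None)
```

Because the stand-in analysis lies exactly in the model space, the fitted coefficients recover it; with a real analysis the surface is an approximation whose quality depends on the dispersal of the trial points.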
Chen, Wen; Chowdhury, Fahmida N; Djuric, Ana; Yeh, Chih-Ping
2014-09-01
This paper provides a new design of robust fault detection for turbofan engines with adaptive controllers. The critical issue is that the adaptive controllers can suppress the effects of faults such that the actual system outputs remain at the pre-specified values, making it difficult to detect faults/failures. To solve this problem, a Total Measurable Fault Information Residual (ToMFIR) technique with the aid of system transformation is adopted to detect faults in turbofan engines with adaptive controllers. This design is a ToMFIR-redundancy-based robust fault detection. The ToMFIR is first introduced and existing results are summarized. The detailed design process of the ToMFIRs is presented, and a turbofan engine model is simulated to verify the effectiveness of the proposed ToMFIR-based fault-detection strategy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
Subranging technique using superconducting technology
Gupta, Deepnarayan
2003-01-01
Subranging techniques using "digital SQUIDs" are used to design systems with large dynamic range, high resolution and large bandwidth. Analog-to-digital converters (ADCs) embodying the invention include a first SQUID based "coarse" resolution circuit and a second SQUID based "fine" resolution circuit to convert an analog input signal into "coarse" and "fine" digital signals for subsequent processing. In one embodiment, an ADC includes circuitry for supplying an analog input signal to an input coil having at least a first inductive section and a second inductive section. A first superconducting quantum interference device (SQUID) is coupled to the first inductive section and a second SQUID is coupled to the second inductive section. The first SQUID is designed to produce "coarse" (large amplitude, low resolution) output signals and the second SQUID is designed to produce "fine" (low amplitude, high resolution) output signals in response to the analog input signals.
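The coarse/fine subranging principle can be sketched numerically: quantize the input coarsely, quantize the small residue finely, and sum the two digital words. The step sizes below are illustrative and the quantizers are ideal, unlike the SQUID circuits the patent describes.

```python
def quantize(x, lsb):
    """Ideal mid-tread quantizer with step size lsb."""
    return lsb * round(x / lsb)

def subranging_adc(x, coarse_lsb=0.1, fine_lsb=0.001):
    """Sketch of subranging conversion: a coarse, low-resolution pass
    followed by a fine pass on the residue; summing the two digital
    words recovers the input at fine resolution over the coarse range."""
    coarse = quantize(x, coarse_lsb)     # large amplitude, low resolution
    residue = x - coarse                 # small-amplitude remainder
    fine = quantize(residue, fine_lsb)   # low amplitude, high resolution
    return coarse + fine

y = subranging_adc(0.4237)
assert abs(y - 0.4237) <= 0.0006         # error bounded by the fine LSB
```

The dynamic-range benefit is that neither stage alone needs the full number of bits: the coarse stage sets the range, the fine stage sets the resolution.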
Designing intuitive dialog boxes in Windows environments
NASA Astrophysics Data System (ADS)
Souetova, Natalia
2000-01-01
Some approaches to user interface design were analyzed. Most existing interfaces seem difficult for newcomers to understand and learn. Some guidelines were defined for designing interfaces based on the psychology of computer image perception and on experience gained while working with artists and designers without specialized technical education. Several applications with standard Windows interfaces, based on these results, were developed. The Windows environment was chosen because it is very popular. This increased the quality and speed of users' work and reduced the number of problems and mistakes. Highly qualified employees no longer spend their working time on explanation and help.
General Analytical Schemes for the Characterization of Pectin-Based Edible Gelled Systems
Haghighi, Maryam; Rezaei, Karamatollah
2012-01-01
Pectin-based gelled systems have gained increasing attention for the design of newly developed food products. For this reason, the characterization of such formulas is a necessity in order to present scientific data and to introduce an appropriate finished product to the industry. Various analytical techniques are available for the evaluation of systems formulated on the basis of pectin and the designed gel. In this paper, general analytical approaches for the characterization of pectin-based gelled systems were categorized into several subsections including physicochemical analysis, visual observation, textural/rheological measurement, microstructural image characterization, and psychorheological evaluation. Three-dimensional trials to assess correlations among microstructure, texture, and taste were also discussed. Practical examples of advanced objective techniques, including experimental setups for small and large deformation rheological measurements and microstructural image analysis, were presented in more detail. PMID:22645484
Design of an MSAT-X mobile transceiver and related base and gateway stations
NASA Technical Reports Server (NTRS)
Fang, Russell J. F.; Bhaskar, Udaya; Hemmati, Farhad; Mackenthun, Kenneth M.; Shenoy, Ajit
1987-01-01
This paper summarizes the results of a design study of the mobile transceiver, base station, and gateway station for NASA's proposed Mobile Satellite Experiment (MSAT-X). Major ground segment system design issues such as frequency stability control, modulation method, linear predictive coding vocoder algorithm, and error control technique are addressed. The modular and flexible transceiver design is described in detail, including the core, RF/IF, modem, vocoder, forward error correction codec, amplitude-companded single sideband, and input/output modules, as well as the flexible interface. Designs for a three-carrier base station and a 10-carrier gateway station are also discussed, including the interface with the controllers and with the public-switched telephone networks at the gateway station. Functional specifications are given for the transceiver, the base station, and the gateway station.
A novel shape-based coding-decoding technique for an industrial visual inspection system.
Mukherjee, Anirban; Chaudhuri, Subhasis; Dutta, Pranab K; Sen, Siddhartha; Patra, Amit
2004-01-01
This paper describes a unique single camera-based dimension storage method for image-based measurement. The system has been designed and implemented in one of the integrated steel plants of India. The purpose of the system is to encode the frontal cross-sectional area of an ingot. The encoded data will be stored in a database to facilitate the future manufacturing diagnostic process. The compression efficiency and reconstruction error of the lossy encoding technique have been reported and found to be quite encouraging.
Computer simulations of optimum boost and buck-boost converters
NASA Technical Reports Server (NTRS)
Rahman, S.
1982-01-01
The development of mathematical models suitable for minimum-weight boost and buck-boost converter designs is presented. The facility of an augmented Lagrangian (ALAG) multiplier-based nonlinear programming technique is demonstrated for minimum-weight design optimization of boost and buck-boost power converters. ALAG-based computer simulation results for these two minimum-weight designs are discussed. Certain important features of ALAG are presented in the framework of a comprehensive design example for boost and buck-boost power converter design optimization. The study provides fresh design insight into power converters and presents such information as weight and loss profiles of various semiconductor components and magnetics as a function of the switching frequency.
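An augmented Lagrangian iteration of the kind referred to as ALAG can be sketched on a toy constrained problem. The converter weight models themselves are not reproduced here; the objective, the constraint, and all solver parameters below are invented, and the inner minimizer is plain gradient descent rather than a production nonlinear programming code.

```python
def augmented_lagrangian_min(f, g, x0, rho=10.0, outer=20, inner=500,
                             lr=0.01):
    """Minimize f(x) subject to g(x) = 0 with the augmented Lagrangian
    L = f + lam*g + (rho/2)*g^2; the inner minimization uses gradient
    descent on a central-difference numerical gradient."""
    def grad(h, x, eps=1e-6):
        return [(h(x[:i] + [x[i] + eps] + x[i+1:])
                 - h(x[:i] + [x[i] - eps] + x[i+1:])) / (2 * eps)
                for i in range(len(x))]
    x, lam = list(x0), 0.0
    for _ in range(outer):
        def L(z):
            return f(z) + lam * g(z) + 0.5 * rho * g(z) ** 2
        for _ in range(inner):
            x = [xi - lr * gi for xi, gi in zip(x, grad(L, x))]
        lam += rho * g(x)              # first-order multiplier update
    return x

# Toy stand-in for a converter weight model: minimize x^2 + y^2 subject
# to the design constraint x + y = 1; the optimum is (0.5, 0.5).
x_opt = augmented_lagrangian_min(lambda z: z[0] ** 2 + z[1] ** 2,
                                 lambda z: z[0] + z[1] - 1.0,
                                 x0=[0.0, 0.0])
```

The multiplier update drives the constraint violation to zero without requiring the penalty weight rho to grow unboundedly, which is the practical advantage of the augmented Lagrangian over a pure penalty method.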
Mastery Learning in Physical Education.
ERIC Educational Resources Information Center
Annarino, Anthony
This paper discusses the design of a physical education curriculum to be used in advanced secondary physical education programs and in university basic instructional programs; the design is based on the premise of mastery learning and employs programed instructional techniques. The effective implementation of a mastery learning model necessitates…
ERIC Educational Resources Information Center
National Institute of General Medical Sciences (NIGMS), 2007
2007-01-01
This booklet reveals how structural biology provides insight into health and disease and is useful in developing new medications. It contains a general introduction to proteins, coverage of the techniques used to determine protein structures, and a chapter on structure-based drug design. The booklet features "Student Snapshots," designed to…
USDA-ARS?s Scientific Manuscript database
Substrate integrated waveguide-based sensors balance the performance and well-known design techniques of classical waveguides with the cheaper and more adaptable aspects of planar circuits. Propagation characteristics are similar to waveguides, with the design retaining many positive aspects of wave...
Informal Reading Inventories: Creating Teacher-Designed Literature-Based Assessments
ERIC Educational Resources Information Center
Provost, Mary C.; Lambert, Monica A.; Babkie, Andrea M.
2010-01-01
Mandates emphasizing student achievement have increased the importance of appropriate assessment techniques for students in general and special education classrooms. Informal reading inventories (IRIs), designed by classroom teachers, have been proven to be an efficient and effective way to determine students' strengths, weaknesses, and strategies…
Comparative Analysis of RF Emission Based Fingerprinting Techniques for ZigBee Device Classification
quantify the differences in various RF fingerprinting techniques via comparative analysis of MDA/ML classification results. The findings herein demonstrate...correct classification rates followed by COR-DNA and then RF-DNA in most test cases and especially in low Eb/N0 ranges, where ZigBee is designed to operate.
An Investigative Graduate Laboratory Course for Teaching Modern DNA Techniques
ERIC Educational Resources Information Center
de Lencastre, Alexandre; Torello, A. Thomas; Keller, Lani C.
2017-01-01
This graduate-level DNA methods laboratory course is designed to model a discovery-based research project and engages students in both traditional DNA analysis methods and modern recombinant DNA cloning techniques. In the first part of the course, students clone the "Drosophila" ortholog of a human disease gene of their choosing using…
Integration of Video-Based Demonstrations to Prepare Students for the Organic Chemistry Laboratory
ERIC Educational Resources Information Center
Nadelson, Louis S.; Scaggs, Jonathan; Sheffield, Colin; McDougal, Owen M.
2015-01-01
Consistent, high-quality introductions to organic chemistry laboratory techniques effectively and efficiently support student learning in the organic chemistry laboratory. In this work, we developed and deployed a series of instructional videos to communicate core laboratory techniques and concepts. Using a quasi-experimental design, we tested the…
A Semester-Long Project-Oriented Biochemistry Laboratory Based on "Helicobacter pylori" Urease
ERIC Educational Resources Information Center
Farnham, Kate R.; Dube, Danielle H.
2015-01-01
Here we present the development of a 13 week project-oriented biochemistry laboratory designed to introduce students to foundational biochemical techniques and then enable students to perform original research projects once they have mastered these techniques. In particular, we describe a semester-long laboratory that focuses on a biomedically…
TRUSS: An intelligent design system for aircraft wings
NASA Technical Reports Server (NTRS)
Bates, Preston R.; Schrage, Daniel P.
1989-01-01
Competitive leadership in the international marketplace, superiority in national defense, excellence in productivity, and safety of both private and public systems are all national goals that depend on superior engineering design. In recent years, it has become more evident that early design decisions are critical and, when based only on performance, often result in products which are too expensive, hard to manufacture, or unsupportable. Better use of computer-aided design tools and information-based technologies is required to produce better-quality United States products. A program is outlined here to explore the use of knowledge-based expert systems coupled with numerical optimization, database management techniques, and designer interface methods in a networked design environment to improve and assess design changes due to changing emphasis or requirements. The initial structural design of a tiltrotor aircraft wing is used as a representative example to demonstrate the approach being followed.
An improved sample loading technique for cellular metabolic response monitoring under pressure
NASA Astrophysics Data System (ADS)
Gikunda, Millicent Nkirote
To monitor cellular metabolism under pressure, a pressure chamber designed around a simple-to-construct capillary-based spectroscopic chamber coupled to a microliter-flow perfusion system is used in the laboratory. Although cyanide-induced metabolic responses from Saccharomyces cerevisiae (baker's yeast) could be controllably induced and monitored under pressure, the previously used sample-loading technique was not well controlled. An improved cell-loading technique, based on the use of a secondary inner capillary into which the sample is loaded before insertion into the capillary pressure chamber, has been developed. As validation, we demonstrate the ability to measure chemically induced metabolic responses at pressures of up to 500 bars. This technique is shown to be less prone to sample loss due to perfusive flow than the previous techniques.
Survey of adaptive control using Liapunov design
NASA Technical Reports Server (NTRS)
Lindorff, D. P.; Carroll, R. L.
1972-01-01
A survey was made of the literature devoted to the synthesis of model-tracking adaptive systems based on application of Liapunov's second method. The basic synthesis procedure is introduced, and extensions made to the theory since 1966 are critically reviewed. The extensions relate to design for relative stability, reduction-of-order techniques, design with disturbance, design with time-variable parameters, multivariable systems, identification, and an adaptive observer.
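The basic Liapunov synthesis procedure for a first-order model-tracking adaptive system can be sketched as a simulation. The plant, reference model, and gain values below are invented for illustration and are not taken from the survey; the adaptation laws are the standard ones derived from a quadratic Liapunov function (assuming the plant gain b is positive).

```python
# Plant:      xdot  = a*x + b*u       (a, b unknown to the controller)
# Reference:  xmdot = am*xm + bm*r    (am < 0, stable model)
# Control:    u = th_r*r + th_x*x, with Liapunov-derived update laws
#             th_r_dot = -gamma*e*r,  th_x_dot = -gamma*e*x
a, b = 1.0, 2.0                 # unstable plant (unknown to the adaptor)
am, bm = -2.0, 2.0              # stable reference model
gamma, dt, T = 5.0, 1e-3, 20.0  # adaptation gain, step, horizon (invented)

x = xm = 0.0
th_r = th_x = 0.0               # adaptive feedforward / feedback gains
e_early = e_late = 0.0          # integrated |error| at start vs end of run
t = 0.0
while t < T:
    r = 1.0 if int(t) % 4 < 2 else -1.0   # square-wave reference input
    u = th_r * r + th_x * x
    e = x - xm                            # model-tracking error
    if t < 2.0:
        e_early += abs(e) * dt
    if t >= T - 2.0:
        e_late += abs(e) * dt
    # Euler integration of plant, reference model, and adaptation laws:
    x += dt * (a * x + b * u)
    xm += dt * (am * xm + bm * r)
    th_r += dt * (-gamma * e * r)
    th_x += dt * (-gamma * e * x)
    t += dt
# Tracking improves as th_r and th_x adapt toward bm/b and (am - a)/b.
```

The Liapunov construction guarantees that the tracking error is driven toward zero even though the plant parameters are never identified explicitly.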
A study of multiplex data bus techniques for the space shuttle
NASA Technical Reports Server (NTRS)
Kearney, R. J.; Kalange, M. A.
1972-01-01
A comprehensive technology base for the design of a multiplexed data bus subsystem is provided. Extensive analyses, both analytical and empirical, were performed. Subjects covered are classified under the following headings: requirements identification and analysis; transmission media studies; signal design and detection studies; synchronization, timing, and control studies; user-subsystem interface studies; operational reliability analyses; design of candidate data bus configurations; and evaluation of candidate data bus designs.
Design, evaluation and test of an electronic, multivariable control for the F100 turbofan engine
NASA Technical Reports Server (NTRS)
Skira, C. A.; Dehoff, R. L.; Hall, W. E., Jr.
1980-01-01
A digital, multivariable control design procedure for the F100 turbofan engine is described. The controller is based on locally linear synthesis techniques using linear, quadratic regulator design methods. The control structure uses an explicit model reference form with proportional and integral feedback near a nominal trajectory. Modeling issues, design procedures for the control law and the estimation of poorly measured variables are presented.
1988-09-01
could use the assistance of a microcomputer-based management information system. However, adequate system design and development requires an in-depth...understanding of the Equipment Management Section and the environment in which it functions were asked and answered. Then, a management information system was...designed, developed, and tested. The management information system is called the Equipment Management Information System (EMIS).
Model-Based Trade Space Exploration for Near-Earth Space Missions
NASA Technical Reports Server (NTRS)
Cohen, Ronald H.; Boncyk, Wayne; Brutocao, James; Beveridge, Iain
2005-01-01
We developed a capability for model-based trade space exploration to be used in the conceptual design of Earth-orbiting space missions. We have created a set of reusable software components to model various subsystems and aspects of space missions. Several example mission models were created to test the tools and process. This technique and toolset has demonstrated itself to be valuable for space mission architectural design.
Du, Qi-Shi; Huang, Ri-Bo; Wei, Yu-Tuo; Pang, Zong-Wen; Du, Li-Qin; Chou, Kuo-Chen
2009-01-30
In cooperation with fragment-based design, a new drug design method, the so-called "fragment-based quantitative structure-activity relationship" (FB-QSAR), is proposed. The essence of the new method is that the molecular framework in a family of drug candidates is divided into several fragments according to the substituents being investigated. The bioactivities of molecules are correlated with the physicochemical properties of the molecular fragments through two sets of coefficients in the linear free energy equations. One coefficient set is for the physicochemical properties and the other for the weight factors of the molecular fragments. Meanwhile, an iterative double least square (IDLS) technique is developed to solve the two sets of coefficients in a training data set alternately and iteratively. The IDLS technique is a feedback procedure with machine-learning ability. The standard two-dimensional quantitative structure-activity relationship (2D-QSAR) is a special case of the FB-QSAR in which the whole molecule is treated as one entity. The FB-QSAR approach can remarkably enhance the predictive power and provide more structural insights into rational drug design. As an example, the FB-QSAR is applied to build a predictive model of neuraminidase inhibitors for drug development against H5N1 influenza virus. (c) 2008 Wiley Periodicals, Inc.
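The alternating structure of IDLS can be sketched on synthetic data: with the fragment weight factors fixed, the property coefficients are solved by least squares, and vice versa, iterating until the bilinear model fits. The descriptors below are random stand-ins, not real fragment properties, and the model form is a simplified reading of the two-coefficient-set idea.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic FB-QSAR-style data: X[i, f, p] = property p of fragment f
# in molecule i (invented, not real chemistry).
n_mol, n_frag, n_prop = 40, 3, 4
X = rng.normal(size=(n_mol, n_frag, n_prop))
w_true = np.array([1.0, 0.5, 2.0])           # fragment weight factors
c_true = np.array([0.3, -1.0, 0.7, 0.2])     # property coefficients
y = np.einsum('f,ifp,p->i', w_true, X, c_true)   # noiseless bioactivities

# Iterative double least squares: alternate between the two coefficient
# sets, each step a linear least-squares problem given the other set.
w = np.ones(n_frag)
c = np.ones(n_prop)
for _ in range(50):
    U = np.einsum('f,ifp->ip', w, X)         # w fixed: solve for c
    c, *_ = np.linalg.lstsq(U, y, rcond=None)
    Z = np.einsum('ifp,p->if', X, c)         # c fixed: solve for w
    w, *_ = np.linalg.lstsq(Z, y, rcond=None)

pred = np.einsum('f,ifp,p->i', w, X, c)
```

Note the inherent scale ambiguity (w can absorb a factor that c divides out), so convergence is judged by predictive fit rather than by the individual coefficient values.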
Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Alewine, Neal Jon
1993-01-01
Multiple instruction rollback (MIR) is a technique for providing rapid recovery from transient processor failures and has been implemented in hardware by researchers and in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data-flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy, while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard-removal transformations. Experimental performance evaluations indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.
NASA Astrophysics Data System (ADS)
Mashayekhi, Mohammad Jalali; Behdinan, Kamran
2017-10-01
The increasing demand to minimize undesired vibration and noise levels in several high-tech industries has generated a renewed interest in vibration transfer path analysis. Analyzing vibration transfer paths within a system is of crucial importance in designing an effective vibration isolation strategy. Most existing vibration transfer path analysis techniques are empirical and are suitable for diagnosis and troubleshooting purposes. The lack of an analytical transfer path analysis that can be used in the design stage is the main motivation behind this research. In this paper an analytical transfer path analysis based on the four-pole theory is proposed for multi-energy-domain systems. The bond graph modeling technique, an effective approach to modeling multi-energy-domain systems, is used to develop the system model. An electro-mechanical system is used as a benchmark example to elucidate the effectiveness of the proposed technique. An algorithm to obtain the equivalent four-pole representation of a dynamical system from the corresponding bond graph model is also presented.
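What makes the four-pole representation convenient for transfer path analysis is that elements in series compose by matrix multiplication. A minimal sketch follows; the element values are invented and a series-impedance transmission-matrix convention is assumed, not the paper's bond-graph-derived formulation.

```python
import numpy as np

def cascade(*fourpoles):
    """Overall transmission (four-pole) matrix of elements connected in
    series: the individual 2x2 matrices simply multiply in order."""
    T = np.eye(2, dtype=complex)
    for Tk in fourpoles:
        T = T @ Tk
    return T

# Illustrative series-impedance elements at one frequency (values invented):
w = 2 * np.pi * 50.0
m, k = 0.5, 1.0e5                       # mass [kg], spring stiffness [N/m]
Z_mass, Z_spring = 1j * w * m, k / (1j * w)
T_mass = np.array([[1.0, Z_mass], [0.0, 1.0]])
T_spring = np.array([[1.0, Z_spring], [0.0, 1.0]])

T = cascade(T_mass, T_spring)
# For series impedance elements the cascade reduces to adding impedances:
assert np.allclose(T, [[1.0, Z_mass + Z_spring], [0.0, 1.0]])
```

Because electrical and mechanical elements share this two-port form, a single cascade can span energy domains, which is the property the proposed analysis exploits.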
Reventlov Husted, Gitte; Teilmann, Grete; Hommel, Eva; Olsen, Birthe Susanne; Kensing, Finn
2017-01-01
Background Young people with type 1 diabetes often struggle to self-manage their disease. Mobile health (mHealth) apps show promise in supporting self-management of chronic conditions such as type 1 diabetes. Many health care providers become involved in app development. Unfortunately, limited information is available to guide their selection of appropriate methods, techniques, and tools for a participatory design (PD) project in health care. Objective The aim of our study was to develop an mHealth app to support young people in self-managing type 1 diabetes. This paper presents our methodological recommendations based on experiences and reflections from a 2-year research study. Methods A mixed methods design was used to identify user needs before designing the app and testing it in a randomized controlled trial. App design was based on qualitative, explorative, interventional, and experimental activities within an overall iterative PD approach. Several techniques and tools were used, including workshops, a mail panel, think-aloud tests, and a feasibility study. Results The final mHealth solution was “Young with Diabetes” (YWD). The iterative PD approach supported researchers and designers in understanding the needs of end users (ie, young people, parents, and health care providers) and their assessment of YWD, as well as how to improve app usability and feasibility. It is critical to include all end user groups during all phases of a PD project and to establish a multidisciplinary team to provide the wide range of expertise required to build a usable and useful mHealth app. Conclusions Future research is needed to develop and evaluate more efficient PD techniques. Health care providers need guidance on what tools and techniques to choose for which subgroups of users and guidance on how to introduce an app to colleagues to successfully implement an mHealth app in health care organizations. 
These steps are important for anyone who wants to design an mHealth app for any illness. PMID:29061552
An Analysis of Factors that Inhibit Business Use of User-Centered Design Principles: A Delphi Study
ERIC Educational Resources Information Center
Hilton, Tod M.
2010-01-01
The use of user-centered design (UCD) principles has a positive impact on the use of web-based interactive systems in customer-centric organizations. However, UCD methodologies are not widely adopted in organizations, due to intraorganizational factors. A qualitative study using a modified Delphi technique was conducted to identify the factors…
Determination of Secondary Students' Preferences Regarding Design Features Used in Digital Textbooks
ERIC Educational Resources Information Center
Öngöz, Sakine; Mollamehmetoglu, Mehmet Zülküf
2017-01-01
The aim of this study was to determine secondary school students' preferences regarding design features for digital textbooks. As part of the research, which was conducted using a mixed technique, a literature review was carried out to identify points to consider in the design of digital textbooks, and experts' opinions were obtained. Based on the results,…
Advanced Technologies in Safe and Efficient Operating Rooms
2009-02-01
parameters and sitting strategies to determine car seat design that is both comfortable and ergonomically sound. The study of ergonomics in the surgical... off stereo into a prototype endoscope and developed design concepts for visualization techniques based on principles of cognitive ergonomics... results from simultaneous task performance—is typical of knowledge ascertained through ergonomic clinical research. Ergonomic theory, design, and
NASA Technical Reports Server (NTRS)
Craun, Robert W.; Acosta, Diana M.; Beard, Steven D.; Leonard, Michael W.; Hardy, Gordon H.; Weinstein, Michael; Yildiz, Yildiray
2013-01-01
This paper describes the maturation of a control allocation technique designed to assist pilots in the recovery from pilot induced oscillations (PIOs). The Control Allocation technique to recover from Pilot Induced Oscillations (CAPIO) is designed to enable next generation high efficiency aircraft designs. Energy efficient next generation aircraft require feedback control strategies that will enable lowering the actuator rate limit requirements for optimal airframe design. One of the common issues flying with actuator rate limits is PIOs caused by the phase lag between the pilot inputs and control surface response. CAPIO utilizes real-time optimization for control allocation to eliminate phase lag in the system caused by control surface rate limiting. System impacts of the control allocator were assessed through a piloted simulation evaluation of a non-linear aircraft simulation in the NASA Ames Vertical Motion Simulator. Results indicate that CAPIO helps reduce oscillatory behavior, including the severity and duration of PIOs, introduced by control surface rate limiting.
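The phase lag described in this abstract can be reproduced with a simple saturating rate limiter. The following is a minimal sketch of the phenomenon CAPIO addresses, not of CAPIO's real-time optimization itself; the 30 deg/s actuator limit and the 1 Hz pilot command are illustrative assumptions:

```python
import math

def rate_limit(commands, max_rate, dt):
    """Pass a command sequence through an actuator whose slew rate is capped at max_rate."""
    out = []
    pos = commands[0]
    for cmd in commands:
        step = cmd - pos
        max_step = max_rate * dt          # largest change achievable in one sample
        step = max(-max_step, min(max_step, step))
        pos += step
        out.append(pos)
    return out

# A 1 Hz sinusoidal pilot command whose peak rate (~63 deg/s) exceeds the
# assumed 30 deg/s actuator limit, so the surface response lags and attenuates.
dt = 0.01
cmd = [10.0 * math.sin(2 * math.pi * 1.0 * i * dt) for i in range(500)]
surface = rate_limit(cmd, max_rate=30.0, dt=dt)
```

The rate-limited output turns the sinusoid into a lower-amplitude, phase-lagged triangle wave, which is exactly the lag between pilot input and surface response that drives PIOs.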
Novel neural control for a class of uncertain pure-feedback systems.
Shen, Qikun; Shi, Peng; Zhang, Tianping; Lim, Cheng-Chew
2014-04-01
This paper is concerned with the problem of adaptive neural tracking control for a class of uncertain pure-feedback nonlinear systems. Using the implicit function theorem and the backstepping technique, a practical robust adaptive neural control scheme is proposed to guarantee that the tracking error converges to an adjustable neighborhood of the origin through appropriate choice of design parameters. In contrast to conventional Lyapunov-based design techniques, an alternative Lyapunov function is constructed for the development of the control law and learning algorithms. Unlike existing results in the literature, the control scheme does not need to compute the derivatives of virtual control signals at each step of the backstepping design procedure. Furthermore, the scheme requires only the desired trajectory and its first derivative rather than its first n derivatives. In addition, a useful property of the basis functions of the radial basis function network, which is exploited in the control design, is explored. Simulation results illustrate the effectiveness of the proposed techniques.
Investigation of model-based physical design restrictions (Invited Paper)
NASA Astrophysics Data System (ADS)
Lucas, Kevin; Baron, Stanislas; Belledent, Jerome; Boone, Robert; Borjon, Amandine; Couderc, Christophe; Patterson, Kyle; Riviere-Cazaux, Lionel; Rody, Yves; Sundermann, Frank; Toublan, Olivier; Trouiller, Yorick; Urbani, Jean-Christophe; Wimmer, Karl
2005-05-01
As lithography and other patterning processes become more complex and more nonlinear with each generation, the task of defining physical design rules necessarily increases in complexity as well. The goal of the physical design rules is to define the boundary separating the physical layout structures which will yield well from those which will not. This is essentially a rule-based pre-silicon guarantee of layout correctness. However, the rapid increase in design rule complexity has created logistical problems for both the design and process functions. Therefore, similar to the semiconductor industry's transition from rule-based to model-based optical proximity correction (OPC) due to increased patterning complexity, opportunities are evident for improving physical design restrictions by implementing model-based physical design methods. In this paper we analyze the possible need and applications for model-based physical design restrictions (MBPDR). We first analyze the traditional design rule evolution, development, and usage methodologies of semiconductor manufacturers. Next we discuss examples of specific design rule challenges requiring new solution methods in the patterning regime of low-k1 lithography and highly complex RET. We then evaluate possible working strategies for MBPDR in the process development and product design flows, including examples of recent model-based pre-silicon verification techniques. Finally, we summarize with a proposed flow and key considerations for MBPDR implementation.
Adaptive control of an exoskeleton robot with uncertainties on kinematics and dynamics.
Brahmi, Brahim; Saad, Maarouf; Ochoa-Luna, Cristobal; Rahman, Mohammad H
2017-07-01
In this paper, we propose a new adaptive control technique based on nonlinear sliding mode control (JSTDE), taking into account kinematics and dynamics uncertainties. This approach is applied to an exoskeleton robot with uncertain kinematics and dynamics. The adaptation design is based on Time Delay Estimation (TDE). The proposed strategy does not require well-defined dynamic and kinematic models of the robot system. The update laws are designed using a Lyapunov function to solve the adaptation problem systematically, proving closed-loop stability and ensuring asymptotic convergence of the output tracking errors. Experimental results show the effectiveness and feasibility of the JSTDE technique in dealing with the variation of the unknown nonlinear dynamics and kinematics of the exoskeleton model.
A Comparison of FPGA and GPGPU Designs for Bayesian Occupancy Filters.
Medina, Luis; Diez-Ochoa, Miguel; Correal, Raul; Cuenca-Asensi, Sergio; Serrano, Alejandro; Godoy, Jorge; Martínez-Álvarez, Antonio; Villagra, Jorge
2017-11-11
Grid-based perception techniques in the automotive sector based on fusing information from different sensors and their robust perceptions of the environment are proliferating in the industry. However, one of the main drawbacks of these techniques is the traditionally prohibitive, high computing performance that is required for embedded automotive systems. In this work, the capabilities of new computing architectures that embed these algorithms are assessed in a real car. The paper compares two ad hoc optimized designs of the Bayesian Occupancy Filter; one for General Purpose Graphics Processing Unit (GPGPU) and the other for Field-Programmable Gate Array (FPGA). The resulting implementations are compared in terms of development effort, accuracy and performance, using datasets from a realistic simulator and from a real automated vehicle.
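The full Bayesian Occupancy Filter also tracks cell velocity distributions; the core per-cell belief update, however, is the standard Bayesian occupancy update, sketched here in log-odds form with illustrative inverse-sensor-model probabilities (0.7 and 0.8 are assumptions, not values from the paper):

```python
import math

def logodds(p):
    return math.log(p / (1.0 - p))

def to_prob(l):
    return 1.0 - 1.0 / (1.0 + math.exp(l))

def update_cell(cell_logodds, p_occupied_given_z):
    """Bayesian occupancy update in log-odds form: fuse one inverse-sensor
    reading into a grid cell's occupancy belief by simple addition."""
    return cell_logodds + logodds(p_occupied_given_z)

# Starting from a 0.5 prior, two consistent 'occupied' readings raise the belief.
cell = logodds(0.5)
for reading in (0.7, 0.8):
    cell = update_cell(cell, reading)
belief = to_prob(cell)
```

Because the update is a single addition per cell per reading, it parallelizes naturally across cells, which is what makes both the GPGPU and FPGA mappings compared in the paper attractive.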
Requirements analysis, domain knowledge, and design
NASA Technical Reports Server (NTRS)
Potts, Colin
1988-01-01
Two improvements to current requirements analysis practices are suggested: domain modeling, and the systematic application of analysis heuristics. Domain modeling is the representation of relevant application knowledge prior to requirements specification. Artificial intelligence techniques may eventually be applicable for domain modeling. In the short term, however, restricted domain modeling techniques, such as that in JSD, will still be of practical benefit. Analysis heuristics are standard patterns of reasoning about the requirements. They usually generate questions of clarification or issues relating to completeness. Analysis heuristics can be represented and therefore systematically applied in an issue-based framework. This is illustrated by an issue-based analysis of JSD's domain modeling and functional specification heuristics. They are discussed in the context of the preliminary design of simple embedded systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baier, S.; Rochet, A.; Hofmann, G.
2015-06-15
We report on a new modular setup based on a silicon microreactor designed for correlative spectroscopic, scattering, and on-line gas-analysis investigations for in situ studies of heterogeneous catalysts. The silicon microreactor allows a combination of synchrotron-radiation-based techniques (e.g., X-ray diffraction and X-ray absorption spectroscopy) as well as infrared thermography and Raman spectroscopy. Catalytic performance can be determined simultaneously by on-line product analysis using mass spectrometry. We present the design of the reactor, the experimental setup, and, as a first example of an in situ study, the catalytic partial oxidation of methane, showing the applicability of this reactor for in situ studies.
NASA Astrophysics Data System (ADS)
Ouyang, Qi; Lu, Wenxi; Lin, Jin; Deng, Wenbing; Cheng, Weiguo
2017-08-01
The surrogate-based simulation-optimization techniques are frequently used for optimal groundwater remediation design. When this technique is used, surrogate errors caused by surrogate-modeling uncertainty may lead to generation of infeasible designs. In this paper, a conservative strategy that pushes the optimal design into the feasible region was used to address surrogate-modeling uncertainty. In addition, chance-constrained programming (CCP) was adopted to compare with the conservative strategy in addressing this uncertainty. Three methods, multi-gene genetic programming (MGGP), Kriging (KRG) and support vector regression (SVR), were used to construct surrogate models for a time-consuming multi-phase flow model. To improve the performance of the surrogate model, ensemble surrogates were constructed based on combinations of different stand-alone surrogate models. The results show that: (1) the surrogate-modeling uncertainty was successfully addressed by the conservative strategy, which means that this method is promising for addressing surrogate-modeling uncertainty. (2) The ensemble surrogate model that combines MGGP with KRG showed the most favorable performance, which indicates that this ensemble surrogate can utilize both stand-alone surrogate models to improve the performance of the surrogate model.
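The ensemble step the abstract describes, combining stand-alone surrogates such as MGGP and Kriging, can be sketched as a weighted average of their predictions. The toy surrogates and equal weights below are assumptions for illustration; the paper's ensembles would fit weights from validation error:

```python
def ensemble_predict(surrogates, weights, x):
    """Weighted-average ensemble: combine predictions of stand-alone surrogate models."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to one"
    return sum(w * s(x) for s, w in zip(surrogates, weights))

# Two toy stand-alone surrogates of an expensive simulator f(x) = x**2,
# one biased high and one biased low; equal weighting cancels the bias.
kriging_like = lambda x: x ** 2 + 0.1
mggp_like = lambda x: x ** 2 - 0.1
prediction = ensemble_predict([kriging_like, mggp_like], [0.5, 0.5], 2.0)
```

With complementary biases, the ensemble tracks the true simulator better than either stand-alone surrogate, which is the effect the MGGP+KRG combination exploits.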
NASA Technical Reports Server (NTRS)
Sable, Dan M.; Cho, Bo H.; Lee, Fred C.
1990-01-01
A detailed comparison of a boost converter, a voltage-fed, autotransformer converter, and a multimodule boost converter, designed specifically for the space platform battery discharger, is performed. Computer-based nonlinear optimization techniques are used to facilitate an objective comparison. The multimodule boost converter is shown to be the optimum topology at all efficiencies. The margin is greatest at 97 percent efficiency. The multimodule, multiphase boost converter combines the advantages of high efficiency, light weight, and ample margin on the component stresses, thus ensuring high reliability.
Innovative real CSF leak simulation model for rhinology training: human cadaveric design.
AlQahtani, Abdulaziz A; Albathi, Abeer A; Alhammad, Othman M; Alrabie, Abdulkarim S
2018-04-01
To study the feasibility of designing a human cadaveric simulation model of real CSF leak for rhinology training. The laboratory investigation took place at the surgical academic center of Prince Sultan Military Medical City between 2016 and 2017. Five human cadaveric head specimens were cannulated into the intradural space through two frontal bone holes. Fluorescein-dyed fluid was injected intracranially, then an endoscopic endonasal iatrogenic skull base defect was created with observation of fluid leak, followed by skull base reconstruction. The outcome measures included subjective assessment of the integrity of the design, the ability to create a real CSF leak at multiple sites of the skull base, and the possibility of watertight closure by various surgical techniques. The fluid filled the intradural space in all specimens without spontaneous leak from the skull base or extra-sinus areas. We successfully demonstrated fluid leak from all areas after iatrogenic defects in the cribriform plate, fovea ethmoidalis, planum sphenoidale, sellar, and clival regions. Watertight closure was achieved in all defects using different reconstruction techniques (overlay, underlay, and gasket seal closure). The design simulates the real patient with CSF leak. It has potential in the learning process of acquiring and maintaining the surgical skills of skull base reconstruction before direct involvement with patients. This model needs further evaluation and competence measurement as a training tool in rhinology training.
COST-EFFECTIVE SAMPLING FOR SPATIALLY DISTRIBUTED PHENOMENA
Various measures of sampling plan cost and loss are developed and analyzed as they relate to a variety of multidisciplinary sampling techniques. The sampling choices examined include methods from design-based sampling, model-based sampling, and geostatistics. Graphs and tables ar...
NASA Technical Reports Server (NTRS)
Balas, Gary J.
1992-01-01
The use of active control to attenuate structural vibrations of the NASA Langley Phase Zero Evolutionary Structure due to external disturbance excitations is studied. H-infinity and structured singular value (mu) based control techniques are used to analyze and synthesize control laws for the NASA Langley Controls Structures Interaction (CSI) Evolutionary Model (CEM). The CEM structure experiment provides an excellent test bed for addressing control design issues for large space structures, specifically, control design for structures with numerous lightly damped, coupled flexible modes, collocated and noncollocated sensors and actuators, and stringent performance specifications. The performance objectives are to attenuate the vibration of the structure due to external disturbances and to minimize the actuator control force. The control design problem formulation for the CEM structure uses a mathematical model developed with finite element techniques. A reduced-order state-space model for the control design is formulated from the finite element model. It is noted that there are significant variations between the design model and the experimentally derived transfer function data.
Design for active and passive flutter suppression and gust alleviation. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Karpel, M.
1981-01-01
Analytical design techniques for active and passive control of aeroelastic systems are based on a rational approximation of the unsteady aerodynamic loads in the entire Laplace domain, which yields matrix equations of motion with constant coefficients. Some existing schemes are reviewed, the matrix Pade approximant is modified, and a technique which yields a minimal number of augmented states for a desired accuracy is presented. The state-space aeroelastic model is used to design an active control system for simultaneous flutter suppression and gust alleviation. The design target is for a continuous controller which transfers some measurements taken on the vehicle to a control command applied to a control surface. Structural modifications are formulated in a way which enables the treatment of passive flutter suppression system with the same procedures by which active control systems are designed.
Participatory design for drug-drug interaction alerts.
Luna, Daniel; Otero, Carlos; Almerares, Alfredo; Stanziola, Enrique; Risk, Marcelo; González Bernaldo de Quirós, Fernán
2015-01-01
The utilization of decision support systems at the point of care to alert drug-drug interactions has been shown to improve quality of care. Still, the use of these systems has not been as widespread as expected, it is believed, because of difficulties with their knowledge databases, errors in the generation of alerts, and the lack of a suitable design. This study expands on the development of alerts using participatory design techniques based on a user-centered design process. This work was undertaken in three stages (inquiry, participatory design, and usability testing) and showed that the use of these techniques improves satisfaction, effectiveness, and efficiency in an alert system for drug-drug interactions, a fact that was evident in specific situations such as the decrease in errors in completing the specified task, the time taken, workload optimization, and users' overall satisfaction with the system.
Communications network design and costing model programmers manual
NASA Technical Reports Server (NTRS)
Logan, K. P.; Somes, S. S.; Clark, C. A.
1983-01-01
Optimization algorithms and techniques used in the communications network design and costing model for least cost route and least cost network problems are examined from the programmer's point of view. All system program modules, the data structures within the model, and the files which make up the data base are described.
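The manual names least cost route problems without detailing the algorithms; Dijkstra's shortest-path algorithm is the standard approach to least-cost routing over a link-cost graph, sketched here on a hypothetical four-node network (the topology and costs are illustrative, not from the model):

```python
import heapq

def least_cost_route(graph, src, dst):
    """Dijkstra's algorithm over a cost-weighted adjacency dict; returns (cost, path)."""
    pq = [(0.0, src, [src])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph.get(node, {}).items():
            if nbr not in seen:
                heapq.heappush(pq, (cost + w, nbr, path + [nbr]))
    return float('inf'), []

# Hypothetical network: link costs on directed edges.
net = {'A': {'B': 4, 'C': 2}, 'B': {'D': 5}, 'C': {'B': 1, 'D': 8}, 'D': {}}
cost, path = least_cost_route(net, 'A', 'D')
```

The cheapest route A→C→B→D (cost 8) beats both direct alternatives, illustrating why least-cost routing cannot simply greedily follow the cheapest next hop to the destination.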
ERIC Educational Resources Information Center
Wang, Zhijun; Anderson, Terry; Chen, Li; Barbera, Elena
2017-01-01
Connectivist learning is interaction-centered learning. A framework describing interaction and cognitive engagement in connectivist learning was constructed using logical reasoning techniques. The framework and analysis was designed to help researchers and learning designers understand and adapt the characteristics and principles of interaction in…
ERIC Educational Resources Information Center
Fosmire, Michael
2017-01-01
Engineering designers must make evidence-based decisions when applying the practical tools and techniques of their discipline to human problems. Information literacy provides a structure for determining information gaps, locating appropriate and relevant information, applying that information effectively, and documenting and managing the knowledge…
Microprocessor Based Temperature Control of Liquid Delivery with Flow Disturbances.
ERIC Educational Resources Information Center
Kaya, Azmi
1982-01-01
Discusses analytical design and experimental verification of a PID control value for a temperature controlled liquid delivery system, demonstrating that the analytical design techniques can be experimentally verified by using digital controls as a tool. Digital control instrumentation and implementation are also demonstrated and documented for…
Intelligent Tutors in Immersive Virtual Environments
ERIC Educational Resources Information Center
Yan, Peng; Slator, Brian M.; Vender, Bradley; Jin, Wei; Kariluoma, Matti; Borchert, Otto; Hokanson, Guy; Aggarwal, Vaibhav; Cosmano, Bob; Cox, Kathleen T.; Pilch, André; Marry, Andrew
2013-01-01
Research into virtual role-based learning has progressed over the past decade. Modern issues include gauging the difficulty of designing a goal system capable of meeting the requirements of students with different knowledge levels, and the feasibility of taking advantage of the well-designed formulas and techniques used in other…
Yao, Yongchao; Ju, Xiaodong; Lu, Junqiang; Men, Baiyong
2017-06-10
A logging-while-drilling (LWD) caliper is a tool used for the real-time measurement of a borehole diameter in oil drilling engineering. This study introduces the mechanical structure and working principle of a new LWD caliper based on ultrasonic distance measurement (UDM). The detection range is a major performance index of a UDM system. This index is determined by the blind zone length and remote reflecting interface detection capability of the system. To reduce the blind zone length and detect near the reflecting interface, a full bridge acoustic emission technique based on bootstrap gate driver (BGD) and metal-oxide-semiconductor field effect transistor (MOSFET) is designed by analyzing the working principle and impedance characteristics of a given piezoelectric transducer. To detect the remote reflecting interface and reduce the dynamic range of the received echo signals, the relationships between the echo amplitude and propagation distance of ultrasonic waves are determined. A signal compensation technique based on time-varying amplification theory, which can automatically change the gain according to the echo arrival time is designed. Lastly, the aforementioned techniques and corresponding circuits are experimentally verified. Results show that the blind zone length in the UDM system of the LWD caliper is significantly reduced and the capability to detect the remote reflecting interface is considerably improved.
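The signal compensation technique described, gain that grows with echo arrival time, can be sketched as a time-varying gain (TVG) ramp. The 5 dB/µs slope and the echo amplitudes below are illustrative assumptions, not values from the paper:

```python
def tvg_gain_db(t_us, slope_db_per_us, g0_db=0.0):
    """Time-varying gain ramp: echoes arriving later (from farther interfaces) get more gain."""
    return g0_db + slope_db_per_us * t_us

def compensate(samples, dt_us, slope_db_per_us):
    """Apply the gain ramp sample by sample to flatten the echo dynamic range."""
    return [s * 10 ** (tvg_gain_db(i * dt_us, slope_db_per_us) / 20.0)
            for i, s in enumerate(samples)]

# A near echo and a late echo 20 dB weaker; a 5 dB/us ramp equalizes them.
echoes = [1.0, 0.0, 0.0, 0.0, 0.1]
flattened = compensate(echoes, dt_us=1.0, slope_db_per_us=5.0)
```

After compensation both echoes sit at comparable amplitude, which is what reduces the dynamic range the receiver electronics must handle when detecting the remote reflecting interface.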
Aircraft applications of fault detection and isolation techniques
NASA Astrophysics Data System (ADS)
Marcos Esteban, Andres
In this thesis the problems of fault detection & isolation and fault-tolerant systems are studied from the perspective of LTI frequency-domain, model-based techniques. Emphasis is placed on the applicability of these LTI techniques to nonlinear models, especially to aerospace systems. Two applications of H-infinity LTI fault diagnosis are given using an open-loop (no controller) design approach: one for the longitudinal motion of a Boeing 747-100/200 aircraft, the other for a turbofan jet engine. An algorithm formalizing a robust identification approach based on model validation ideas is also given and applied to the previous jet engine. A general linear fractional transformation formulation is given in terms of the Youla and Dual Youla parameterizations for the integrated (control and diagnosis filter) approach. This formulation provides better insight into the trade-off between the control and diagnosis objectives. It also provides the basic groundwork towards the development of nested schemes for the integrated approach. These nested structures allow iterative improvements of the control/filter Youla parameters based on successive identification of the system uncertainty (as given by the Dual Youla parameter). The thesis concludes with an application of H-infinity LTI techniques to the integrated design for the longitudinal motion of the Boeing 747-100/200 model.
Advanced applications of numerical modelling techniques for clay extruder design
NASA Astrophysics Data System (ADS)
Kandasamy, Saravanakumar
Ceramic materials play a vital role in our day-to-day life. Recent advances in research, manufacture and processing techniques and production methodologies have broadened the scope of ceramic products such as bricks, pipes and tiles, especially in the construction industry. These are mainly manufactured using an extrusion process in auger extruders. During their long history of application in the ceramic industry, most of the design developments of extruder systems have resulted from expensive laboratory-based experimental work and field-based trial and error runs. In spite of these design developments, auger extruders continue to be energy-intensive devices with high operating costs. Limited understanding of the physical process involved and the cost and time requirements of lab-based experiments were found to be the major obstacles to the further development of auger extruders. An attempt has been made herein to use Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) based numerical modelling techniques to reduce the costs and time associated with research into design improvement by experimental trials. These two techniques, although used widely in other engineering applications, have rarely been applied to auger extruder development. This has been due to a number of reasons, including technical limitations of the CFD tools previously available. Modern CFD and FEA software packages have much enhanced capabilities and allow the modelling of the flow of complex fluids such as clay. This research work presents a methodology using a Herschel-Bulkley fluid flow based CFD model to simulate and assess the flow of a clay-water mixture through the extruder and the die of a vacuum de-airing type clay extrusion unit used in ceramic extrusion. The extruder design and the operating parameters were varied to study their influence on the power consumption and the extrusion pressure.
The model results were then validated against experimental trials on a scaled extruder and found to be in reasonable agreement. The modelling methodology was then extended to full-scale industrial extruders. The technical and commercial suitability of using lightweight materials to manufacture extruder components was also investigated. The stress and deformation induced in the components due to extrusion pressure were analysed using FEA, and suitable alternative materials were identified. A cost comparison was then made for different extruder materials. The results show the potential for significant technical and commercial benefits to the ceramic industry.
A knowledge-based system with learning for computer communication network design
NASA Technical Reports Server (NTRS)
Pierre, Samuel; Hoang, Hai Hoc; Tropper-Hausen, Evelyne
1990-01-01
Computer communication network design is well known to be a complex and hard problem. For that reason, the most effective methods used to solve it are heuristic. Weaknesses of these techniques are listed, and a new approach based on artificial intelligence for solving this problem is presented. This approach is particularly recommended for large packet-switched communication networks, in the sense that it permits a high degree of reliability and offers a very flexible environment for dealing with many relevant design parameters such as link cost, link capacity, and message delay.
NASA Technical Reports Server (NTRS)
Thomas, J. M.; Hanagud, S.
1974-01-01
The design criteria and test options for aerospace structural reliability were investigated. A decision methodology was developed for selecting a combination of structural tests and structural design factors. The decision method involves the use of Bayesian statistics and statistical decision theory. Procedures are discussed for obtaining and updating data-based probabilistic strength distributions for aerospace structures when test information is available and for obtaining subjective distributions when data are not available. The techniques used in developing the distributions are explained.
Directed molecular evolution to design advanced red fluorescent proteins.
Subach, Fedor V; Piatkevich, Kiryl D; Verkhusha, Vladislav V
2011-11-29
Fluorescent proteins have become indispensable imaging tools for biomedical research. Continuing progress in fluorescence imaging, however, requires probes with additional colors and properties optimized for emerging techniques. Here we summarize strategies for development of red-shifted fluorescent proteins. We discuss possibilities for knowledge-based rational design based on the photochemistry of fluorescent proteins and the position of the chromophore in protein structure. We consider advances in library design by mutagenesis, protein expression systems and instrumentation for high-throughput screening that should yield improved fluorescent proteins for advanced imaging applications.
Model-based engineering for medical-device software.
Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi
2010-01-01
This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. Using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for i) fast and efficient construction of executable device prototypes, ii) creation of a standard, reusable baseline software architecture for a particular device family, iii) formal verification of the design against safety requirements, and iv) creation of a safety framework that reduces verification costs for future versions of the device software.
NASA Technical Reports Server (NTRS)
Ong, K. M.; Macdoran, P. F.; Thomas, J. B.; Fliegel, H. F.; Skjerve, L. J.; Spitzmesser, D. J.; Batelaan, P. D.; Paine, S. R.; Newsted, M. G.
1976-01-01
A precision geodetic measurement system (Aries, for Astronomical Radio Interferometric Earth Surveying) based on the technique of very long base line interferometry has been designed and implemented through the use of a 9-m transportable antenna and the NASA 64-m antenna of the Deep Space Communications Complex at Goldstone, California. A series of experiments designed to demonstrate the inherent accuracy of a transportable interferometer was performed on a 307-m base line during the period from December 1973 to June 1974. This short base line was chosen in order to obtain a comparison with a conventional survey with a few-centimeter accuracy and to minimize Aries errors due to transmission media effects, source locations, and earth orientation parameters. The base-line vector derived from a weighted average of the measurements, representing approximately 24 h of data, possessed a formal uncertainty of about 3 cm in all components. This average interferometry base-line vector was in good agreement with the conventional survey vector within the statistical range allowed by the combined uncertainties (3-4 cm) of the two techniques.
Moghram, Basem Ameen; Nabil, Emad; Badr, Amr
2018-01-01
T-cell epitope structure identification is a significant challenging immunoinformatic problem within epitope-based vaccine design. Epitopes or antigenic peptides are a set of amino acids that bind with the Major Histocompatibility Complex (MHC) molecules. The aim of this process is presented by Antigen Presenting Cells to be inspected by T-cells. MHC-molecule-binding epitopes are responsible for triggering the immune response to antigens. The epitope's three-dimensional (3D) molecular structure (i.e., tertiary structure) reflects its proper function. Therefore, the identification of MHC class-II epitopes structure is a significant step towards epitope-based vaccine design and understanding of the immune system. In this paper, we propose a new technique using a Genetic Algorithm for Predicting the Epitope Structure (GAPES), to predict the structure of MHC class-II epitopes based on their sequence. The proposed Elitist-based genetic algorithm for predicting the epitope's tertiary structure is based on Ab-Initio Empirical Conformational Energy Program for Peptides (ECEPP) Force Field Model. The developed secondary structure prediction technique relies on Ramachandran Plot. We used two alignment algorithms: the ROSS alignment and TM-Score alignment. We applied four different alignment approaches to calculate the similarity scores of the dataset under test. We utilized the support vector machine (SVM) classifier as an evaluation of the prediction performance. The prediction accuracy and the Area Under Receiver Operating Characteristic (ROC) Curve (AUC) were calculated as measures of performance. The calculations are performed on twelve similarity-reduced datasets of the Immune Epitope Data Base (IEDB) and a large dataset of peptide-binding affinities to HLA-DRB1*0101. The results showed that GAPES was reliable and very accurate. We achieved an average prediction accuracy of 93.50% and an average AUC of 0.974 in the IEDB dataset. 
Also, we achieved an accuracy of 95.125% and an AUC of 0.987 on the HLA-DRB1*0101 allele of the Wang benchmark dataset. The results indicate that the proposed prediction technique, GAPES, is a promising technique that will help researchers and scientists predict protein structure and assist them in the intelligent design of new epitope-based vaccines. Copyright © 2017 Elsevier B.V. All rights reserved.
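The elitist genetic-algorithm loop at the heart of GAPES can be illustrated with a short sketch. This is not the authors' code: the quadratic "energy" standing in for the ECEPP force field, the peptide length, and all GA settings are invented for the example; only the elitist select-and-mutate structure reflects the abstract.

```python
import random

random.seed(1)

N_RES, POP, GENS, ELITE = 9, 40, 60, 4   # toy peptide length and GA settings

def toy_energy(conf):
    # Hypothetical stand-in for the ECEPP force field: favour (phi, psi)
    # angles near an alpha-helical region of the Ramachandran plot.
    return sum((phi + 60) ** 2 + (psi + 45) ** 2 for phi, psi in conf)

def random_conformation():
    return [(random.uniform(-180, 180), random.uniform(-180, 180))
            for _ in range(N_RES)]

def mutate(conf, sigma=15.0):
    # Perturb each backbone dihedral pair with Gaussian noise
    return [(phi + random.gauss(0, sigma), psi + random.gauss(0, sigma))
            for phi, psi in conf]

pop = [random_conformation() for _ in range(POP)]
start_e = min(toy_energy(c) for c in pop)
for _ in range(GENS):
    pop.sort(key=toy_energy)
    elite = pop[:ELITE]                      # elitism: best conformations survive
    pop = elite + [mutate(random.choice(elite)) for _ in range(POP - ELITE)]

best_e = toy_energy(min(pop, key=toy_energy))
```

Because the elite conformations survive each generation unchanged, the best energy found is monotonically non-increasing over generations.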
Machine learning in computational docking.
Khamis, Mohamed A; Gomaa, Walid; Ahmed, Walaa F
2015-03-01
The objective of this paper is to highlight the state-of-the-art machine learning (ML) techniques in computational docking. The use of smart computational methods in the life cycle of drug design is a relatively recent development that has gained much popularity and interest over the last few years. Central to this methodology is the notion of computational docking, which is the process of predicting the best pose (orientation + conformation) of a small molecule (drug candidate) when bound to a target larger receptor molecule (protein) in order to form a stable complex molecule. In computational docking, a large number of binding poses are evaluated and ranked using a scoring function. The scoring function is a mathematical predictive model that produces a score representing the binding free energy, and hence the stability, of the resulting complex molecule. Generally, such a function should produce a set of plausible ligands ranked according to their binding stability, along with their binding poses. In more practical terms, an effective scoring function should produce promising drug candidates which can then be synthesized and physically screened using a high-throughput screening process. Therefore, the key to computer-aided drug design is the design of an efficient, highly accurate scoring function (using ML techniques). The methods presented in this paper are specifically based on ML techniques. Although many traditional techniques have been proposed, their performance has generally been poor. Only in the last few years has ML technology been applied to the design of scoring functions, and the results have been very promising. The ML-based techniques are based on various molecular features extracted from the abundance of protein-ligand information in public molecular databases, e.g., PDBbind.
In this paper, we present this paradigm shift, elaborating on the main constituent elements of the ML approach to molecular docking along with the state-of-the-art research in this area. For instance, the best random forest (RF)-based scoring function on PDBbind v2007 achieves a Pearson correlation coefficient between the predicted and experimentally determined binding affinities of 0.803, while the best conventional scoring function achieves 0.644. The best RF-based ranking power ranks the ligands correctly based on their experimentally determined binding affinities with an accuracy of 62.5% and identifies the top binding ligand with an accuracy of 78.1%. We conclude with open questions and potential future research directions that can be pursued in smart computational docking: using molecular features of different natures (geometrical, energy terms, pharmacophore), applying advanced ML techniques (e.g., deep learning), and combining more than one ML model. Copyright © 2015 Elsevier B.V. All rights reserved.
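The Pearson correlation quoted above (0.803 for the best RF-based scoring function vs. 0.644 for the best conventional one) is the standard yardstick for scoring-function accuracy. A minimal sketch of the computation, with made-up predicted and experimental affinities (pKd values invented for illustration):

```python
import math

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length sequences
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical scoring-function predictions vs. experimental affinities
predicted    = [6.1, 4.8, 7.9, 5.5, 6.7, 3.9]
experimental = [6.4, 5.0, 7.5, 5.9, 6.2, 4.4]
r = pearson(predicted, experimental)
print(round(r, 3))
```

A perfectly linear relationship gives r = 1.0; the RF vs. conventional comparison in the abstract is exactly a comparison of this statistic on held-out complexes.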
CATO: a CAD tool for intelligent design of optical networks and interconnects
NASA Astrophysics Data System (ADS)
Chlamtac, Imrich; Ciesielski, Maciej; Fumagalli, Andrea F.; Ruszczyk, Chester; Wedzinga, Gosse
1997-10-01
Increasing communication speed requirements have created a great interest in very high speed optical and all-optical networks and interconnects. The design of these optical systems is a highly complex task, requiring the simultaneous optimization of various parts of the system, ranging from optical components' characteristics to access protocol techniques. Currently there are no computer aided design (CAD) tools on the market to support the interrelated design of all parts of optical communication systems, thus the designer has to rely on costly and time consuming testbed evaluations. The objective of the CATO (CAD tool for optical networks and interconnects) project is to develop a prototype of an intelligent CAD tool for the specification, design, simulation and optimization of optical communication networks. CATO allows the user to build an abstract, possibly incomplete, model of the system and determine its expected performance. Based on design constraints provided by the user, CATO will automatically complete an optimum design, using mathematical programming techniques, intelligent search methods and artificial intelligence (AI). Initial design and testing of a CATO prototype (CATO-1) has been completed recently. The objective was to prove the feasibility of combining AI techniques, simulation techniques, an optical device library and a graphical user interface into a flexible CAD tool for obtaining optimal communication network designs in terms of system cost and performance. CATO-1 is an experimental tool for designing packet-switching wavelength division multiplexing all-optical communication systems using a LAN/MAN ring topology as the underlying network. The two specific AI algorithms incorporated are simulated annealing and a genetic algorithm. CATO-1 finds the optimal number of transceivers for each network node, using an objective function that includes the cost of the devices and the overall system performance.
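One of CATO-1's two AI algorithms, simulated annealing, can be sketched on a toy version of its objective: device cost plus a per-node performance penalty that grows when a node has too few transceivers. The cost model and all constants below are invented for illustration; only the annealing structure reflects the abstract.

```python
import math
import random

random.seed(0)

N_NODES = 8
DEVICE_COST = 3.0

def cost(tx):
    # Hypothetical objective: total transceiver cost plus a congestion
    # penalty that shrinks as a node gains transceivers.
    return DEVICE_COST * sum(tx) + sum(10.0 / t for t in tx)

state = [1] * N_NODES                 # start with one transceiver per node
best, best_c = state[:], cost(state)
T = 10.0                              # initial temperature
while T > 0.01:
    cand = state[:]
    i = random.randrange(N_NODES)
    cand[i] = max(1, cand[i] + random.choice([-1, 1]))   # local move
    dc = cost(cand) - cost(state)
    # accept improvements always, worsenings with Boltzmann probability
    if dc < 0 or random.random() < math.exp(-dc / T):
        state = cand
        if cost(state) < best_c:
            best, best_c = state[:], cost(state)
    T *= 0.99                         # geometric cooling schedule
```

For this toy cost, two transceivers per node is the global optimum (total cost 88), so the annealer should settle at or near that configuration.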
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.
NASA Technical Reports Server (NTRS)
Taylor, Brian R.; Ratnayake, Nalin A.
2010-01-01
As part of an effort to improve emissions, noise, and performance of next generation aircraft, it is expected that future aircraft will make use of distributed, multi-objective control effectors in a closed-loop flight control system. Correlation challenges associated with parameter estimation will arise with this expected aircraft configuration. Research presented in this paper focuses on addressing the correlation problem with an appropriate input design technique and validating this technique through simulation and flight test of the X-48B aircraft. The X-48B aircraft is an 8.5 percent-scale hybrid wing body aircraft demonstrator designed by The Boeing Company (Chicago, Illinois, USA), built by Cranfield Aerospace Limited (Cranfield, Bedford, United Kingdom) and flight tested at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California, USA). Based on data from flight test maneuvers performed at Dryden Flight Research Center, aerodynamic parameter estimation was performed using linear regression and output error techniques. An input design technique that uses temporal separation for de-correlation of control surfaces is proposed, and simulation and flight test results are compared with the aerodynamic database. This paper will present a method to determine individual control surface aerodynamic derivatives.
NASA Astrophysics Data System (ADS)
Asaithambi, Sasikumar; Rajappa, Muthaiah
2018-05-01
In this paper, an automatic design method based on a swarm intelligence approach for CMOS analog integrated circuit (IC) design is presented. The hybrid meta-heuristic optimization technique, namely the salp swarm algorithm (SSA), is applied to the optimal sizing of a CMOS differential amplifier and a comparator circuit. SSA is a nature-inspired optimization algorithm which mimics the navigating and hunting behavior of salps. The hybrid SSA is applied to optimize the circuit design parameters and to minimize the MOS transistor sizes. The proposed swarm intelligence approach was successfully implemented for the automatic design and optimization of CMOS analog ICs using Generic Process Design Kit (GPDK) 180 nm technology. The circuit design parameters and design specifications are validated using the Simulation Program with Integrated Circuit Emphasis (SPICE) simulator. To investigate the efficiency of the proposed approach, comparisons have been carried out with other simulation-based circuit design methods. The performance of the hybrid SSA-based CMOS analog IC designs is better than that of previously reported studies.
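A minimal sketch of the SSA loop (the leader salp moves around the best solution found so far, followers track the salp ahead of them in the chain) on a toy objective; the sphere function stands in for a real circuit-sizing cost, and the bounds and settings are illustrative, not taken from the paper.

```python
import math
import random

random.seed(42)

DIM, POP, ITERS = 4, 30, 200
LB, UB = -5.0, 5.0

def sphere(x):
    # Toy objective standing in for a circuit-sizing cost function
    return sum(v * v for v in x)

salps = [[random.uniform(LB, UB) for _ in range(DIM)] for _ in range(POP)]
food = min(salps, key=sphere)[:]            # best solution found so far
start_f = sphere(food)

for t in range(1, ITERS + 1):
    c1 = 2 * math.exp(-(4 * t / ITERS) ** 2)    # exploration -> exploitation
    for i in range(POP):
        if i == 0:                               # leader moves around the food
            for j in range(DIM):
                step = c1 * ((UB - LB) * random.random() + LB)
                salps[i][j] = food[j] + step if random.random() < 0.5 else food[j] - step
        else:                                    # followers track the salp ahead
            salps[i] = [(a + b) / 2 for a, b in zip(salps[i], salps[i - 1])]
        salps[i] = [min(UB, max(LB, v)) for v in salps[i]]   # clamp to bounds
        if sphere(salps[i]) < sphere(food):
            food = salps[i][:]
```

The decaying coefficient c1 is what shifts the swarm from wide exploration early on to fine exploitation around the best design later; the best-so-far solution never worsens.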
Rifai, Damhuji; Abdalla, Ahmed N.; Razali, Ramdan; Ali, Kharudin; Faraj, Moneer A.
2017-01-01
The use of the eddy current technique (ECT) for the non-destructive testing of conducting materials has become increasingly important in the past few years. The non-destructive ECT plays a key role in ensuring the safety and integrity of large industrial structures such as oil and gas pipelines. This paper introduces a novel ECT probe design integrated with the distributed ECT inspection system (DSECT), used for crack inspection on inner ferromagnetic pipes. The system consists of an array of giant magneto-resistive (GMR) sensors, a pneumatic system, a rotating magnetic field excitation source and a host PC acting as the data analysis center. The probe design parameters, namely the probe diameter, the excitation coil and the number of GMR sensors in the array, are optimized using numerical optimization based on the desirability approach. The main benefits of DSECT can be seen in terms of its modularity and flexibility for the use of different types of magnetic transducers/sensors, and signals of a different nature with either digital or analog outputs, making it suited to ECT probe design using an array of GMR magnetic sensors. The DSECT system can be exploited for the real-time ECT inspection of a 70 mm carbon steel pipe. In order to predict axial and circumferential defect detection, a mathematical model is developed based on the technique known as response surface methodology (RSM). The inspection results for a carbon steel pipe sample with artificial defects indicate that the system design is highly efficient. PMID:28335399
Synthesis Methods for Robust Passification and Control
NASA Technical Reports Server (NTRS)
Kelkar, Atul G.; Joshi, Suresh M. (Technical Monitor)
2000-01-01
The research effort under this cooperative agreement has been essentially a continuation of the work from previous grants. The ongoing work has primarily focused on developing passivity-based control techniques for Linear Time-Invariant (LTI) systems. During this period, significant progress has been made in the area of passivity-based control of LTI systems, and some preliminary results have been obtained for nonlinear systems as well. The prior work addressed optimal control design for inherently passive as well as non-passive linear systems. To exploit the robustness characteristics of passivity-based controllers, a passification methodology was developed for LTI systems that are not inherently passive. Various methods of passification were first proposed and then developed further. The robustness of passification was addressed for multi-input multi-output (MIMO) systems for certain classes of uncertainties using frequency-domain methods. For MIMO systems, a state-space approach using a Linear Matrix Inequality (LMI)-based formulation was presented for the passification of non-passive LTI systems. An LMI-based robust passification technique was presented for systems with redundant actuators and sensors. The redundancy in actuators and sensors was used effectively for robust passification using the LMI formulation. The passification was designed to be robust to interval-type uncertainties in system parameters. The passification techniques were used to design a robust controller for the Benchmark Active Control Technology wing under parametric uncertainties. The results on passive nonlinear systems, however, are very limited to date. Our recent work in this area was presented, wherein some stability results were obtained for passive nonlinear systems that are affine in control.
A cloud-based X73 ubiquitous mobile healthcare system: design and implementation.
Ji, Zhanlin; Ganchev, Ivan; O'Droma, Máirtín; Zhang, Xin; Zhang, Xueji
2014-01-01
Based on the user-centric paradigm for next generation networks, this paper describes a ubiquitous mobile healthcare (uHealth) system based on the ISO/IEEE 11073 personal health data (PHD) standards (X73) and cloud computing techniques. A number of design issues associated with the system implementation are outlined. The system includes middleware on the user side, providing a plug-and-play environment for heterogeneous wireless sensors and mobile terminals utilizing different communication protocols, and a distributed "big data" processing subsystem in the cloud. The design and implementation of this system are envisaged as an efficient solution for the next generation of uHealth systems.
NASA Technical Reports Server (NTRS)
Seltzer, S. M.
1976-01-01
The problem discussed is to design a digital controller for a typical satellite. The controlled plant is considered to be a rigid body acting in a plane. The controller is assumed to be a digital computer which, when combined with the proposed control algorithm, can be represented as a sampled-data system. The objective is to present a design strategy and technique for selecting numerical values for the control gains (assuming position, integral, and derivative feedback) and the sample rate. The technique is based on the parameter plane method and requires that the system be amenable to z-transform analysis.
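The closed loop described above (a rigid body rotating in a plane, position + integral + derivative feedback, control held by zero-order hold over each sample period) can be simulated directly. The gains and sample rate below are illustrative placeholders, not values obtained via the parameter plane method, and the inertia is normalized to 1.

```python
# Sampled-data control of a rigid body in a plane: J*theta'' = u,
# with PID-style feedback computed from sampled attitude error.
J, T = 1.0, 0.1                 # inertia and sample period (illustrative)
Kp, Ki, Kd = 4.0, 0.5, 3.0      # gains one would place via the parameter plane

theta, omega = 1.0, 0.0         # initial attitude error and rate
integ = 0.0
prev_err = -theta
for _ in range(400):            # 40 seconds of closed-loop operation
    err = -theta                # drive attitude to zero
    integ += err * T
    deriv = (err - prev_err) / T
    u = Kp * err + Ki * integ + Kd * deriv
    prev_err = err
    # exact zero-order-hold update of the double integrator over one period
    theta += omega * T + 0.5 * (u / J) * T * T
    omega += (u / J) * T
```

With these gains the continuous-time characteristic polynomial s^3 + Kd*s^2 + Kp*s + Ki is stable (Routh: Kd*Kp > Ki), and the sample period is fast relative to the closed-loop dynamics, so the attitude error decays toward zero.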
NASA Technical Reports Server (NTRS)
1971-01-01
Computational techniques were developed and assimilated for the design optimization. The resulting computer program was then used to perform initial optimization and sensitivity studies on a typical thermal protection system (TPS) to demonstrate its application to the space shuttle TPS design. The program was developed in Fortran IV for the CDC 6400 but was subsequently converted to the Fortran V language to be used on the Univac 1108. The program allows for improvement and update of the performance prediction techniques. The program logic involves subroutines which handle the following basic functions: (1) a driver which calls for input, output, and communication between program and user and between the subroutines themselves; (2) thermodynamic analysis; (3) thermal stress analysis; (4) acoustic fatigue analysis; and (5) weights/cost analysis. In addition, a system total cost is predicted based on system weight and historical cost data of similar systems. Two basic types of input are provided, both of which are based on trajectory data. These are vehicle attitude (altitude, velocity, and angles of attack and sideslip), for external heat and pressure loads calculation, and heating rates and pressure loads as a function of time.
The Interdependencies of Theory Formation, Revision, and Experimentation
1988-06-01
as a candidate explanation for why evaporation stopped. We conclude this section with a review of the two techniques. An integrated design and the...Experiment design. Typically, there will be a number of different ways to change a theory to explain a new phenomenon. Experiments are designed to...efforts, each approach is designed to perform theory development autonomously. V1AL conjectures explanations of the new behavior based on its similarity to
Automating the design of scientific computing software
NASA Technical Reports Server (NTRS)
Kant, Elaine
1992-01-01
SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.
Progressive Stochastic Reconstruction Technique (PSRT) for cryo electron tomography.
Turoňová, Beata; Marsalek, Lukas; Davidovič, Tomáš; Slusallek, Philipp
2015-03-01
Cryo Electron Tomography (cryoET) plays an essential role in Structural Biology, as it is the only technique that allows the structure of large macromolecular complexes to be studied in situ, in their close-to-native environment. The reconstruction methods currently in use, such as Weighted Back Projection (WBP) or the Simultaneous Iterative Reconstruction Technique (SIRT), deliver noisy and low-contrast reconstructions, which complicates the application of high-resolution protocols such as Subtomogram Averaging (SA). We propose the Progressive Stochastic Reconstruction Technique (PSRT), a novel iterative approach to tomographic reconstruction in cryoET based on Monte Carlo random walks guided by a Metropolis-Hastings sampling strategy. We design a progressive reconstruction scheme to suit the conditions present in cryoET and apply it successfully to reconstructions of macromolecular complexes from both synthetic and experimental datasets. We show how to integrate PSRT into SA, where it provides an elegant solution to the region-of-interest problem and delivers high-contrast reconstructions that significantly improve template-based localization without any loss of high-resolution structural information. Furthermore, the locality of SA is exploited to design an importance sampling scheme which significantly speeds up the otherwise slow Monte Carlo approach. Finally, we design a new memory-efficient solution for the specimen-level interior problem of cryoET, removing all associated artifacts. Copyright © 2015 Elsevier Inc. All rights reserved.
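The core sampling machinery behind PSRT, a Metropolis-Hastings random walk, can be shown in one dimension. The Gaussian-mixture target below is purely illustrative; it plays the role that a distribution derived from the projection data plays in the paper.

```python
import math
import random

random.seed(7)

def density(x):
    # Unnormalized illustrative target: a two-component Gaussian mixture
    return math.exp(-0.5 * (x - 2) ** 2) + 0.5 * math.exp(-0.5 * (x + 2) ** 2)

x, samples = 0.0, []
for _ in range(50000):
    cand = x + random.gauss(0, 1.0)         # symmetric random-walk proposal
    # Metropolis-Hastings acceptance: always accept uphill moves,
    # accept downhill moves with probability density(cand)/density(x)
    if random.random() < min(1.0, density(cand) / density(x)):
        x = cand
    samples.append(x)

burned = samples[5000:]                     # discard burn-in
mean = sum(burned) / len(burned)
```

Because the proposal is symmetric, the acceptance ratio reduces to the plain density ratio; the chain's sample mean converges to the mixture mean (2/3 for these weights).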
Kopp, Sandra L; Smith, Hugh M
2011-01-01
Little is known about the use of Web-based education in regional anesthesia training. Benefits of Web-based education include the ability to standardize learning material quality and content, build appropriate learning progressions, use interactive multimedia technologies, and individualize delivery of course materials. The goals of this investigation were (1) to determine whether module design influences regional anesthesia knowledge acquisition, (2) to characterize learner preference patterns among anesthesia residents, and (3) to determine whether learner preferences play a role in knowledge acquisition. Direct comparison of knowledge assessments, learning styles, and learner preferences will be made between an interactive case-based and a traditional textbook-style module design. Forty-three Mayo Clinic anesthesiology residents completed 2 online modules, a knowledge pretest, a posttest, an Index of Learning Styles assessment, and a participant satisfaction survey. Interscalene and lumbar plexus regional techniques were selected as the learning content for 4 Web modules constructed using the Blackboard Vista coursework application. One traditional textbook-style module and 1 interactive case-based module were designed for each of the interscalene and lumbar plexus techniques. Participants scored higher on the postmodule knowledge assessment for both the interscalene and lumbar plexus modules. Postmodule knowledge performance scores were independent of both module design (interactive case-based versus traditional textbook style) and learning style preferences. However, nearly all participants reported a preference for Web-based learning and believe that it should be used in anesthesia resident education. Participants did not feel that Web-based learning should replace the current lecture-based curriculum. All residents scored higher on the postmodule knowledge assessment, but this improvement was independent of the module design and individual learning styles.
Although residents believe that online learning should be used in anesthesia training, the results of this study do not demonstrate improved learning or justify the time and expense of developing complex case-based training modules. While there may be practical benefits of Web-based education, educators in regional anesthesia should be cautious about developing curricula based on learner preference data.
Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Mcmanus, John William
1992-01-01
Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.
All-dielectric metamaterial frequency selective surface
NASA Astrophysics Data System (ADS)
Wang, Jun; Qu, Shaobo; Li, Liyang; Wang, Jiafu; Feng, Mingde; Ma, Hua; Du, Hongliang; Xu, Zhuo
Frequency selective surfaces (FSSs) have been extensively studied due to their potential applications in radomes, antenna reflectors, high-impedance surfaces and absorbers. Recently, a new principle of designing FSSs has been proposed and studied mainly on two levels. On the level of materials, dielectric materials instead of metallic patterns are capable of achieving more functional performance in FSS design. Moreover, FSSs made of dielectric materials can be used in different extreme environments, depending on their electrical, thermal or mechanical properties. On the level of design principle, the theory of metamaterials can be used to design FSSs in a convenient and concise way. In this review paper, we provide a brief summary of the recent progress in all-dielectric metamaterial frequency selective surfaces (ADM-FSSs). The basic principle of designing an ADM-FSS is summarized. As significant tools, Mie theory and dielectric resonator (DR) theory are presented, illustrating clearly how they are used in FSS design. Then, several design cases, including dielectric particle-based and dielectric network-based ADM-FSSs, are introduced and reviewed. After a discussion of these two types of ADM-FSSs, we review the existing fabrication techniques used to build the experimental samples. Finally, issues and challenges regarding rapid fabrication techniques and further development aspects are discussed.
NASA Astrophysics Data System (ADS)
Bandaru, Sunith; Deb, Kalyanmoy
2011-09-01
In this article, a methodology is proposed for automatically extracting innovative design principles which make a system or process (subject to conflicting objectives) optimal using its Pareto-optimal dataset. Such 'higher knowledge' would not only help designers to execute the system better, but also enable them to predict how changes in one variable would affect other variables if the system has to retain its optimal behaviour. This in turn would help solve other similar systems with different parameter settings easily without the need to perform a fresh optimization task. The proposed methodology uses a clustering-based optimization technique and is capable of discovering hidden functional relationships between the variables, objective and constraint functions and any other function that the designer wishes to include as a 'basis function'. A number of engineering design problems are considered for which the mathematical structure of these explicit relationships exists and has been revealed by a previous study. A comparison with the multivariate adaptive regression splines (MARS) approach reveals the practicality of the proposed approach due to its ability to find meaningful design principles. The success of this procedure for automated innovization is highly encouraging and indicates its suitability for further development in tackling more complex design scenarios.
ERIC Educational Resources Information Center
Sayre, Scott Alan
The purpose of this study was to develop and validate a computer-based system that would allow interactive video developers to integrate and manage the design components prior to production. These components of an interactive video (IVD) program include visual information in a variety of formats, audio information, and instructional techniques,…
Thick section aluminum weldments for SRB structures
NASA Technical Reports Server (NTRS)
Bayless, E.; Sexton, J.
1978-01-01
The Space Shuttle Solid Rocket Booster (SRB) forward and aft skirts were designed with fracture control considerations incorporated in the design data. Fracture control is based on reliance upon nondestructive evaluation (NDE) techniques to detect potentially critical flaws. In the aerospace industry, welds on aluminum in the thicknesses (0.500 to 1.375 in.) such as those encountered on the SRB skirts are normally welded from both sides to minimize distortion. This presents a problem: the potential presence of undefined areas of incomplete fusion, and the inability to detect these potential flaws by NDE techniques. To eliminate the possibility of an undetectable defect, the weld joint design was revised to eliminate blind root penetrations. Weld parameters and mechanical property data were developed to verify the adequacy of the new joint design.
Conditional Random Field-Based Offline Map Matching for Indoor Environments
Bataineh, Safaa; Bahillo, Alfonso; Díez, Luis Enrique; Onieva, Enrique; Bataineh, Ikram
2016-01-01
In this paper, we present an offline map matching technique designed for indoor localization systems based on conditional random fields (CRF). The proposed algorithm can refine the results of existing indoor localization systems and match them with the map, using loose coupling between the existing localization system and the proposed map matching technique. The purpose of this research is to investigate the efficiency of using the CRF technique in offline map matching problems for different scenarios and parameters. The algorithm was applied to several real and simulated trajectories of different lengths. The results were then refined and matched with the map using the CRF algorithm. PMID:27537892
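MAP inference in a linear-chain CRF, the decoding step of such a map matcher, reduces to the Viterbi algorithm over unary (position-likelihood) and pairwise (map-connectivity) scores. A toy sketch with an invented four-node indoor map and noisy position fixes; the scores are simplified stand-ins for trained CRF potentials.

```python
import math

# Hypothetical corridor waypoints and noisy position estimates coming
# from an existing indoor localization system.
nodes = {"A": (0, 0), "B": (5, 0), "C": (10, 0), "D": (10, 5)}
adjacency = {"A": {"A", "B"}, "B": {"A", "B", "C"},
             "C": {"B", "C", "D"}, "D": {"C", "D"}}
observations = [(0.4, -0.3), (4.6, 0.5), (9.7, 0.2), (10.3, 4.6)]

def unary(state, obs):
    # Emission score: closer map node -> higher score (negative sq. distance)
    (x, y), (ox, oy) = nodes[state], obs
    return -((x - ox) ** 2 + (y - oy) ** 2)

def pairwise(prev, cur):
    # Transition score: only staying put or moving to an adjacent node is allowed
    return 0.0 if cur in adjacency[prev] else -math.inf

# Viterbi decoding of the most probable node sequence (CRF MAP inference)
V = [{s: unary(s, observations[0]) for s in nodes}]
back = []
for obs in observations[1:]:
    scores, ptrs = {}, {}
    for cur in nodes:
        best_prev = max(V[-1], key=lambda p: V[-1][p] + pairwise(p, cur))
        scores[cur] = V[-1][best_prev] + pairwise(best_prev, cur) + unary(cur, obs)
        ptrs[cur] = best_prev
    V.append(scores)
    back.append(ptrs)

path = [max(V[-1], key=V[-1].get)]
for ptrs in reversed(back):
    path.append(ptrs[path[-1]])
path.reverse()
```

The connectivity constraint is what "snaps" a physically impossible raw trajectory onto the map: a fix near C can never follow a fix matched to A directly, because that transition scores minus infinity.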
Consideration of techniques to mitigate the unauthorized 3D printing production of keys
NASA Astrophysics Data System (ADS)
Straub, Jeremy; Kerlin, Scott
2016-05-01
The illicit production of 3D printed keys based on remote-sensed imagery is problematic as it allows a would-be intruder to access a secured facility without the attack attempt being as obviously detectable as conventional techniques. This paper considers the problem from multiple perspectives. First, it looks at different attack types and considers the prospective attack from a digital information perspective. Second, based on this, techniques for securing keys are considered. Third, the design of keys is considered from the perspective of making them more difficult to duplicate using visible light sensing and 3D printing. Policy and legal considerations are discussed.
Compact high-power shipborne doppler lidar based on high spectral resolution techniques
NASA Astrophysics Data System (ADS)
Wu, Songhua; Liu, Bingyi; Dai, Guangyao; Qin, Shenguang; Liu, Jintao; Zhang, Kailin; Feng, Changzhong; Zhai, Xiaochun; Song, Xiaoquan
2018-04-01
The Compact High-Power Shipborne Doppler Wind Lidar (CHiPSDWiL), based on the high-spectral-resolution technique, has been built at the Ocean University of China for measurement of the wind field and the properties of aerosols and clouds in the troposphere. The design of the CHiPSDWiL, including the transceiver, injection seeding, frequency locking and frequency measurement, is presented. Preliminary results measured by the CHiPSDWiL are provided.
Securing information display by use of visual cryptography.
Yamamoto, Hirotsugu; Hayasaki, Yoshio; Nishida, Nobuo
2003-09-01
We propose a secure display technique based on visual cryptography. The proposed technique ensures the security of visual information. The display employs a decoding mask based on visual cryptography. Without the decoding mask, the displayed information cannot be viewed. The viewing zone is limited by the decoding mask so that only one person can view the information. We have developed a set of encryption codes to maintain the designed viewing zone and have demonstrated a display that provides a limited viewing zone.
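The classic (2,2) visual cryptography scheme underlying such a display can be sketched in a few lines: each secret pixel is expanded into two subpixels per share, and physically stacking the transparencies acts as a pixelwise OR. The tiny bitmap below is invented for illustration; real encryption codes (as in the paper) additionally shape the viewing zone.

```python
import random

random.seed(3)

# Secret bitmap: 1 = black, 0 = white (a tiny 3x3 "T" for illustration)
secret = [[1, 1, 1],
          [0, 1, 0],
          [0, 1, 0]]

def make_shares(img):
    share1, share2 = [], []
    for row in img:
        r1, r2 = [], []
        for px in row:
            pattern = random.choice([(0, 1), (1, 0)])   # random subpixel pair
            r1.extend(pattern)
            # white pixel: identical patterns (stack shows one black subpixel);
            # black pixel: complementary patterns (stack is fully black)
            r2.extend(pattern if px == 0 else (1 - pattern[0], 1 - pattern[1]))
        share1.append(r1)
        share2.append(r2)
    return share1, share2

def stack(s1, s2):
    # Physically overlaying the transparencies acts as a pixelwise OR
    return [[a | b for a, b in zip(r1, r2)] for r1, r2 in zip(s1, s2)]

s1, s2 = make_shares(secret)
overlay = stack(s1, s2)
```

Each share on its own has exactly one black subpixel per secret pixel regardless of the secret's value, so a single share (e.g. the displayed image without the decoding mask) leaks no information.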
Graphical approach for multiple values logic minimization
NASA Astrophysics Data System (ADS)
Awwal, Abdul Ahad S.; Iftekharuddin, Khan M.
1999-03-01
Multiple-valued logic (MVL) is sought for designing highly complex, highly compact, parallel digital circuits. However, the practical realization of an MVL-based system depends on the optimization of cost, which directly affects the optical setup. We propose a minimization technique for MVL optimization based on graphical visualization, such as a Karnaugh map. The proposed method is utilized to solve signed-digit binary and trinary logic minimization problems. The usefulness of the minimization technique is demonstrated for the optical implementation of MVL circuits.
Active Vibration damping of Smart composite beams based on system identification technique
NASA Astrophysics Data System (ADS)
Bendine, Kouider; Satla, Zouaoui; Boukhoulda, Farouk Benallel; Nouari, Mohammed
2018-03-01
In the present paper, the active vibration control of a composite beam using a piezoelectric actuator is investigated. The state-space equation is determined using a system identification technique based on the structure's input-output response provided by the ANSYS APDL finite element package. A Linear Quadratic Gaussian (LQG) control law is designed and integrated into ANSYS APDL to perform closed-loop simulations. Numerical examples for different types of excitation loads are presented to test the efficiency and accuracy of the proposed model.
Saturation-resolved-fluorescence spectroscopy of Cr3+:mullite glass ceramic
NASA Astrophysics Data System (ADS)
Liu, Huimin; Knutson, Robert; Yen, W. M.
1990-01-01
We present a saturation-based technique designed to isolate and uncouple individual components of inhomogeneously broadened spectra that are simultaneously coupled to each other through spectral overlap and energy-transfer interactions. We have termed the technique saturation-resolved-fluorescence spectroscopy; we demonstrate its usefulness in deconvoluting the complex spectra of Cr3+:mullite glass ceramic.
Monitoring Student Listening Techniques: An Approach to Teaching the Foundations of a Skill.
ERIC Educational Resources Information Center
Swanson, Charles H.
To teach listening as a discrete skill, teachers need a suitable definition of the word "skill." The author suggests defining a skill as a complex of techniques and behaviors from which performers select, depending upon the situation, to fulfill their purposes. The curricular design should be based on four components: (1) establishing attention,…
ERIC Educational Resources Information Center
Dantic, Dennis Emralino
2014-01-01
Objective: To examine and discuss the evidence base behind the effectiveness of the "teach-back" technique as an educational intervention for chronic obstructive pulmonary disease (COPD) patient self-management using respiratory inhalers. Design: A systematic literature review. Method: A search was conducted through Medline, CINAHL…
LQR-Based Optimal Distributed Cooperative Design for Linear Discrete-Time Multiagent Systems.
Zhang, Huaguang; Feng, Tao; Liang, Hongjing; Luo, Yanhong
2017-03-01
In this paper, a novel linear quadratic regulator (LQR)-based optimal distributed cooperative design method is developed for synchronization control of general linear discrete-time multiagent systems on a fixed, directed graph. Sufficient conditions are derived for synchronization, which restrict the graph eigenvalues into a bounded circular region in the complex plane. The synchronizing speed issue is also considered, and it turns out that the synchronizing region reduces as the synchronizing speed becomes faster. To obtain more desirable synchronizing capacity, the weighting matrices are selected by sufficiently utilizing the guaranteed gain margin of the optimal regulators. Based on the developed LQR-based cooperative design framework, an approximate dynamic programming technique is successfully introduced to overcome the (partially or completely) model-free cooperative design for linear multiagent systems. Finally, two numerical examples are given to illustrate the effectiveness of the proposed design methods.
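For the scalar case, the discrete-time LQR design such methods build on reduces to iterating the Riccati recursion to a fixed point. The sketch below is our illustration of that single-mode building block, not the paper's distributed multiagent algorithm, for a system x_{k+1} = a*x_k + b*u_k.

```python
def dlqr_scalar(a, b, q, r, iters=500):
    """Iterate the scalar discrete-time Riccati recursion
    p <- q + a*p*a - (a*p*b)^2 / (r + b*p*b) to a fixed point
    and return the optimal state-feedback gain k for u = -k*x,
    minimizing sum(q*x^2 + r*u^2)."""
    p = q
    for _ in range(iters):
        p = q + a * p * a - (a * p * b) ** 2 / (r + b * p * b)
    return a * p * b / (r + b * p * b)
```

With a = 1.2 (unstable) and b = q = r = 1 this yields a gain that places the closed-loop pole |a - b*k| inside the unit circle; the guaranteed gain margin of such optimal regulators is what the weighting-matrix selection described above exploits.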
Design and analysis of photonic crystal micro-cavity based optical sensor platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goyal, Amit Kumar, E-mail: amitgoyal.ceeri@gmail.com; Dutta, Hemant Sankar, E-mail: hemantdutta97@gmail.com; Pal, Suchandan, E-mail: spal@ceeri.ernet.in
2016-04-13
In this paper, the design of a two-dimensional photonic crystal micro-cavity based integrated-optic sensor platform is proposed. The behaviour of the designed cavity is analyzed using the two-dimensional Finite Difference Time Domain (FDTD) method. The structure is designed by deliberately inserting defects into a photonic crystal waveguide structure. The proposed structure shows a quality factor (Q) of about 10^5 and an average sensitivity of 500 nm/RIU in the wavelength range of 1450-1580 nm. The sensing technique is based on detecting the shift in the upper-edge cut-off wavelength, at a reference signal strength of -10 dB, in accordance with the change in refractive index of the analyte.
Alveolar Ridge Split Technique Using Piezosurgery with Specially Designed Tips
Moro, Alessandro; Foresta, Enrico; Falchi, Marco; De Angelis, Paolo; D'Amato, Giuseppe; Pelo, Sandro
2017-01-01
The treatment of patients with atrophic ridge who need prosthetic rehabilitation is a common problem in oral and maxillofacial surgery. Among the various techniques introduced for the expansion of alveolar ridges with a horizontal bone deficit is the alveolar ridge split technique. The aim of this article is to give a description of some new tips that have been specifically designed for the treatment of atrophic ridges with transversal bone deficit. A two-step piezosurgical split technique is also described, based on specific osteotomies of the vestibular cortex and the use of a mandibular ramus graft as interpositional graft. A total of 15 patients were treated with the proposed new tips by our department. All the expanded areas were successful in providing an adequate width and height to insert implants according to the prosthetic plan and the proposed tips allowed obtaining the most from the alveolar ridge split technique and piezosurgery. These tips have made alveolar ridge split technique simple, safe, and effective for the treatment of horizontal and vertical bone defects. Furthermore the proposed piezosurgical split technique allows obtaining horizontal and vertical bone augmentation. PMID:28246596
Lossless compression techniques for maskless lithography data
NASA Astrophysics Data System (ADS)
Dai, Vito; Zakhor, Avideh
2002-07-01
Future lithography systems must produce more dense chips with smaller feature sizes, while maintaining the throughput of one wafer per sixty seconds per layer achieved by today's optical lithography systems. To achieve this throughput with a direct-write maskless lithography system, using 25 nm pixels for 50 nm feature sizes, requires data rates of about 10 Tb/s. In a previous paper, we presented an architecture which achieves this data rate contingent on consistent 25 to 1 compression of lithography data, and on implementation of a decoder-writer chip with a real-time decompressor fabricated on the same chip as the massively parallel array of lithography writers. In this paper, we examine the compression efficiency of a spectrum of techniques suitable for lithography data, including two industry standards JBIG and JPEG-LS, a wavelet based technique SPIHT, general file compression techniques ZIP and BZIP2, our own 2D-LZ technique, and a simple list-of-rectangles representation RECT. Layouts rasterized both to black-and-white pixels, and to 32 level gray pixels are considered. Based on compression efficiency, JBIG, ZIP, 2D-LZ, and BZIP2 are found to be strong candidates for application to maskless lithography data, in many cases far exceeding the required compression ratio of 25. To demonstrate the feasibility of implementing the decoder-writer chip, we consider the design of a hardware decoder based on ZIP, the simplest of the four candidate techniques. The basic algorithm behind ZIP compression is Lempel-Ziv 1977 (LZ77), and the design parameters of LZ77 decompression are optimized to minimize circuit usage while maintaining compression efficiency.
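The decoder-writer chip hinges on how cheap LZ77 decompression is: the decoder only copies previously emitted pixels from a sliding window and appends literals. A minimal sketch of that decode loop follows; the `(offset, length, literal)` token format here is illustrative, not the actual ZIP/DEFLATE bitstream.

```python
def lz77_decode(tokens):
    """Decode (offset, length, literal) LZ77 tokens: copy `length`
    symbols starting `offset` positions back in the already-decoded
    output, then append `literal` (None means no trailing literal)."""
    out = []
    for offset, length, literal in tokens:
        for _ in range(length):
            out.append(out[-offset])  # copies may overlap themselves
        if literal is not None:
            out.append(literal)
    return ''.join(out)
```

Note that a copy may overlap its own output (offset smaller than length), which is how runs like "aaaa" compress to a single token; the symbol-by-symbol copy handles this naturally.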
Monte Carlo-based Reconstruction in Water Cherenkov Detectors using Chroma
NASA Astrophysics Data System (ADS)
Seibert, Stanley; Latorre, Anthony
2012-03-01
We demonstrate the feasibility of event reconstruction---including position, direction, energy and particle identification---in water Cherenkov detectors with a purely Monte Carlo-based method. Using a fast optical Monte Carlo package we have written, called Chroma, in combination with several variance reduction techniques, we can estimate the value of a likelihood function for an arbitrary event hypothesis. The likelihood can then be maximized over the parameter space of interest using a form of gradient descent designed for stochastic functions. Although slower than more traditional reconstruction algorithms, this completely Monte Carlo-based technique is universal and can be applied to a detector of any size or shape, which is a major advantage during the design phase of an experiment. As a specific example, we focus on reconstruction results from a simulation of the 200 kiloton water Cherenkov far detector option for LBNE.
NASA Technical Reports Server (NTRS)
Garg, Sanjay; Ouzts, Peter J.
1991-01-01
Results are presented from an application of H-infinity control design methodology to a centralized integrated flight propulsion control (IFPC) system design for a supersonic Short Takeoff and Vertical Landing (STOVL) fighter aircraft in transition flight. The emphasis is on formulating the H-infinity control design problem such that the resulting controller provides robustness to modeling uncertainties and model parameter variations with flight condition. Experience gained from a preliminary H-infinity-based IFPC design study performed earlier is used as the basis to formulate the robust H-infinity control design problem and improve upon the previous design. Detailed evaluation results are presented for a reduced-order controller obtained from the improved H-infinity control design, showing that the design meets the specified nominal performance objectives and provides stability robustness for variations in plant system dynamics with changes in aircraft trim speed within the transition flight envelope. A controller scheduling technique which accounts for changes in plant control effectiveness with variation in trim conditions is developed, and off-design model performance results are presented.
NASA Technical Reports Server (NTRS)
Murthy, T. Sreekanta; Kvaternik, Raymond G.
1991-01-01
A NASA/industry rotorcraft structural dynamics program known as Design Analysis Methods for VIBrationS (DAMVIBS) was initiated at Langley Research Center in 1984 with the objective of establishing the technology base needed by the industry for developing an advanced finite-element-based vibrations design analysis capability for airframe structures. As a part of the in-house activities contributing to that program, a study was undertaken to investigate the use of formal, nonlinear programming-based, numerical optimization techniques for airframe vibrations design work. Considerable progress has been made in connection with that study since its inception in 1985. This paper presents a unified summary of the experiences and results of that study. The formulation and solution of airframe optimization problems are discussed. Particular attention is given to describing the implementation of a new computational procedure based on MSC/NASTRAN and CONstrained function MINimization (CONMIN) in a computer program system called DYNOPT for the optimization of airframes subject to strength, frequency, dynamic response, and fatigue constraints. The results from the application of the DYNOPT program to the Bell AH-1G helicopter are presented and discussed.
EXPERIMENTAL MODELLING OF AORTIC ANEURYSMS
Doyle, Barry J; Corbett, Timothy J; Cloonan, Aidan J; O’Donnell, Michael R; Walsh, Michael T; Vorp, David A; McGloughlin, Timothy M
2009-01-01
A range of silicone rubbers was created based on existing commercially available materials. These silicones were designed to be visually different from one another and to have distinct material properties, in particular ultimate tensile strengths and tear strengths. In total, eleven silicone rubbers were manufactured, with the materials designed to have a range of tensile strengths increasing from approximately 2 to 4 MPa and tear strengths increasing from approximately 0.45 to 0.7 N/mm. The variations in the silicones were detected using a standard colour analysis technique. Calibration curves were then created relating colour intensity to individual material properties. All eleven materials were characterised and a first-order Ogden strain energy function was applied. Material coefficients were determined and examined for effectiveness. Six idealised abdominal aortic aneurysm models were also created using the two base materials of the study, with a further model created using a new mixing technique to produce a rubber model with randomly assigned material properties. These models were then examined using videoextensometry and compared to numerical results. Colour analysis revealed a statistically significant linear relationship (p<0.0009) with both tensile strength and tear strength, allowing material strength to be determined using a non-destructive experimental technique. The effectiveness of this technique was assessed by comparing predicted material properties to experimentally measured values, with good agreement in the results. Videoextensometry and numerical modelling revealed minor percentage differences, with all results achieving significance (p<0.0009). This study has successfully designed and developed a range of silicone rubbers that have unique colour intensities and material strengths. Strengths can be readily determined using a non-destructive analysis technique with proven effectiveness.
These silicones may further aid towards an improved understanding of the biomechanical behaviour of aneurysms using experimental techniques. PMID:19595622
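The calibration curves described above are, in essence, least-squares fits relating colour intensity to material strength. A minimal ordinary-least-squares sketch with made-up numbers (the paper's actual measurements are not reproduced here):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares line y = m*x + c, the form of
    calibration curve used to map colour intensity to strength."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

def predict(m, c, x):
    """Read a material property off the calibration line."""
    return m * x + c
```

Once fitted, the line lets strength be estimated from a colour measurement alone, which is what makes the technique non-destructive.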
NASA Technical Reports Server (NTRS)
Kavaya, Michael J.; Spiers, Gary D.; Lobl, Elena S.; Rothermel, Jeff; Keller, Vernon W.
1996-01-01
Innovative designs of a space-based laser remote sensing 'wind machine' are presented. These designs seek compatibility with the traditionally conflicting constraints of high scientific value and low total mission cost. Mission cost is reduced by moving to smaller, lighter, more off-the-shelf instrument designs which can be accommodated on smaller launch vehicles.
Wind Turbine Blade CAD Models Used as Scaffolding Technique to Teach Design Engineers
ERIC Educational Resources Information Center
Irwin, John
2013-01-01
The Siemens PLM CAD software NX is commonly used for designing mechanical systems, and in complex systems such as the emerging area of wind power, the ability to have a model controlled by design parameters is a certain advantage. Formula driven expressions based on the amount of available wind in an area can drive the amount of effective surface…
Alternatives for Developing User Documentation for Applications Software
1991-09-01
style that is designed to match adult reading behaviors, using reader-based writing techniques, developing effective graphics, creating reference aids...involves research, analysis, design, and testing. The writer must have a solid understanding of the technical aspects of the document being prepared, good...ABSTRACT The preparation of software documentation is an iterative process that involves research, analysis, design, and testing. The writer must have
Laser-Etched Designs for Molding Hydrogel-Based Engineered Tissues
Munarin, Fabiola; Kaiser, Nicholas J.; Kim, Tae Yun; Choi, Bum-Rak
2017-01-01
Rapid prototyping and fabrication of elastomeric molds for sterile culture of engineered tissues allow for the development of tissue geometries that can be tailored to different in vitro applications and customized as implantable scaffolds for regenerative medicine. Commercially available molds offer minimal capabilities for adaptation to unique conditions or applications versus those for which they are specifically designed. Here we describe a replica molding method for the design and fabrication of poly(dimethylsiloxane) (PDMS) molds from laser-etched acrylic negative masters with ∼0.2 mm resolution. Examples of the variety of mold shapes, sizes, and patterns obtained from laser-etched designs are provided. We use the patterned PDMS molds for producing and culturing engineered cardiac tissues with cardiomyocytes derived from human-induced pluripotent stem cells. We demonstrate that tight control over tissue morphology and anisotropy results in modulation of cell alignment and tissue-level conduction properties, including the appearance and elimination of reentrant arrhythmias, or circular electrical activation patterns. Techniques for handling engineered cardiac tissues during implantation in vivo in a rat model of myocardial infarction have been developed and are presented herein to facilitate development and adoption of surgical techniques for use with hydrogel-based engineered tissues. In summary, the method presented herein for engineered tissue mold generation is straightforward and low cost, enabling rapid design iteration and adaptation to a variety of applications in tissue engineering. Furthermore, the burden of equipment and expertise is low, allowing the technique to be accessible to all. PMID:28457187
Blended Interaction Design: A Spatial Workspace Supporting HCI and Design Practice
NASA Astrophysics Data System (ADS)
Geyer, Florian
This research investigates novel methods and techniques along with tool support that result from a conceptual blend of human-computer interaction with design practice. Using blending theory with material anchors as a theoretical framework, we frame both input spaces and explore emerging structures within technical, cognitive, and social aspects. Based on our results, we will describe a framework of the emerging structures and will design and evaluate tool support within a spatial, studio-like workspace to support collaborative creativity in interaction design.
Failure Diagnosis for the Holdup Tank System via ISFA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Huijuan; Bragg-Sitton, Shannon; Smidts, Carol
This paper discusses the use of the integrated system failure analysis (ISFA) technique for fault diagnosis in the holdup tank system. ISFA is a simulation-based, qualitative and integrated approach used to study fault propagation in systems containing both hardware and software subsystems. The holdup tank system consists of a tank containing a fluid whose level is controlled by an inlet valve and an outlet valve. We introduce the component and functional models of the system, quantify the main parameters, and simulate possible failure-propagation paths based on the fault propagation approach, ISFA. The results show that most component failures in the holdup tank system can be identified clearly and that ISFA is viable as a technique for fault diagnosis. Since ISFA is a qualitative technique that can be used in the very early stages of system design, this case study provides indications that it can be used early to study design aspects that relate to robustness and fault tolerance.
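Qualitative fault propagation of this kind can be pictured as reachability over a component graph: a failure at one component can affect every component connected downstream of it. The toy breadth-first sketch below uses illustrative component names, not the report's actual models.

```python
from collections import deque

def propagate(graph, fault_origin):
    """Breadth-first reachability over a component graph (adjacency
    dict): returns every component a failure at `fault_origin` can
    qualitatively reach along propagation paths."""
    reached = {fault_origin}
    queue = deque([fault_origin])
    while queue:
        node = queue.popleft()
        for nbr in graph.get(node, ()):
            if nbr not in reached:
                reached.add(nbr)
                queue.append(nbr)
    return reached
```

Because this needs only a qualitative connectivity model, not detailed physics, it can be run in the very early design stages, which is the point made above.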
A comparison of design variables for control theory based airfoil optimization
NASA Technical Reports Server (NTRS)
Reuther, James; Jameson, Antony
1995-01-01
This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work in the area it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using either the potential flow or the Euler equations with either a conformal mapping or a general coordinate system. We have also explored three-dimensional extensions of these formulations recently. The goal of our present work is to demonstrate the versatility of the control theory approach by designing airfoils using both Hicks-Henne functions and B-spline control points as design variables. The research also demonstrates that the parameterization of the design space is an open question in aerodynamic design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ha, Thi Dep, E-mail: hathidep@yahoo.com; Faculty of Electronic Technology, Industrial University of Ho Chi Minh City, Hochiminh City; Bao, JingFu, E-mail: baojingfu@uestc.edu.cn
Phononic crystals (PnCs) and the n-type silicon doping technique have been widely employed in silicon-based MEMS resonators to obtain a high quality factor (Q) as well as temperature-induced frequency stability. For the PnCs, the band gaps play an important role in acoustic wave propagation. Also, the temperature and the dopant introduced into silicon can change its material properties, such as the elastic constants and Young's modulus. Therefore, in order to design simultaneously high-Q and frequency-stable silicon-based MEMS resonators using these two techniques, a careful design should study the effects of temperature and dopant on the band gap characteristics to examine the acoustic wave propagation in the PnC. Based on this, the paper presents (1) a proposed silicon-based PnC strip structure for support-tether applications in low-frequency silicon-based MEMS resonators and (2) the influences of temperature and dopant on the band gap characteristics of the PnC strips. The simulation results show that the largest band gap can reach up to 33.56 at 57.59 MHz, an increase of 1280.13% (and an increase of 131.89% in the ratio of the widest gaps) compared with the counterpart without a hole. The band gap properties of the PnC strips are insignificantly affected by temperature and electron doping concentration. Also, the quality factor of two designed length-extensional-mode MEMS resonators with the proposed PnC-strip-based support tethers is up to 1084.59% and 43846.36% higher than that of the same resonators with a PnC strip without a hole and with circled corners, respectively. This theoretical study uses finite element analysis in COMSOL Multiphysics and MATLAB as simulation tools. These findings provide a background for combining PnC and dopant techniques for high-performance silicon-based MEMS resonators as well as PnC-based MEMS devices.
Macready, Anna L; Fallaize, Rosalind; Butler, Laurie T; Ellis, Judi A; Kuznesof, Sharron; Frewer, Lynn J; Celis-Morales, Carlos; Livingstone, Katherine M; Araújo-Soares, Vera; Fischer, Arnout Rh; Stewart-Knox, Barbara J; Mathers, John C; Lovegrove, Julie A
2018-04-09
To determine the efficacy of behavior change techniques applied in dietary and physical activity intervention studies, it is first necessary to record and describe techniques that have been used during such interventions. Published frameworks used in dietary and smoking cessation interventions undergo continuous development, and most are not adapted for Web-based delivery. The Food4Me study (N=1607) provided the opportunity to use existing frameworks to describe standardized Web-based techniques employed in a large-scale, internet-based intervention to change dietary behavior and physical activity. The aims of this study were (1) to describe techniques embedded in the Food4Me study design and explain the selection rationale and (2) to demonstrate the use of behavior change technique taxonomies, develop standard operating procedures for training, and identify strengths and limitations of the Food4Me framework that will inform its use in future studies. The 6-month randomized controlled trial took place simultaneously in seven European countries, with participants receiving one of four levels of personalized advice (generalized, intake-based, intake+phenotype-based, and intake+phenotype+gene-based). A three-phase approach was taken: (1) existing taxonomies were reviewed and techniques were identified a priori for possible inclusion in the Food4Me study, (2) a standard operating procedure was developed to maintain consistency in the use of methods and techniques across research centers, and (3) the Food4Me behavior change technique framework was reviewed and updated post intervention. An analysis of excluded techniques was also conducted. Of 46 techniques identified a priori as being applicable to Food4Me, 17 were embedded in the intervention design; 11 were from a dietary taxonomy, and 6 from a smoking cessation taxonomy. In addition, the four-category smoking cessation framework structure was adopted for clarity of communication. 
Smoking cessation texts were adapted for dietary use where necessary. A posteriori, a further 9 techniques were included. Examination of excluded items highlighted the distinction between techniques considered appropriate for face-to-face versus internet-based delivery. The use of existing taxonomies facilitated the description and standardization of techniques used in Food4Me. We recommend that for complex studies of this nature, technique analysis should be conducted a priori to develop standardized procedures and training and reviewed a posteriori to audit the techniques actually adopted. The present framework description makes a valuable contribution to future systematic reviews and meta-analyses that explore technique efficacy and underlying psychological constructs. This was a novel application of the behavior change taxonomies and was the first internet-based personalized nutrition intervention to use such a framework remotely. ClinicalTrials.gov NCT01530139; https://clinicaltrials.gov/ct2/show/NCT01530139 (Archived by WebCite at http://www.webcitation.org/6y8XYUft1). ©Anna L Macready, Rosalind Fallaize, Laurie T Butler, Judi A Ellis, Sharron Kuznesof, Lynn J Frewer, Carlos Celis-Morales, Katherine M Livingstone, Vera Araújo-Soares, Arnout RH Fischer, Barbara J Stewart-Knox, John C Mathers, Julie A Lovegrove. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 09.04.2018.
Experiments on Adaptive Techniques for Host-Based Intrusion Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
DRAELOS, TIMOTHY J.; COLLINS, MICHAEL J.; DUGGAN, DAVID P.
2001-09-01
This research explores four experiments with adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their use of reinforcement learning, which allows learning exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which is unable to detect novel exploits, and anomaly detection, which flags too many events, including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural-network-based systems to learn in this application environment.
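A statistical anomaly detection baseline of the kind compared above can be as simple as flagging events that deviate too far from the statistics of clean training data. This z-score sketch is our simplification for illustration, not the report's BSM preprocessing pipeline.

```python
from statistics import mean, stdev

def anomaly_flags(train, test, threshold=3.0):
    """Flag each test point lying more than `threshold` sample
    standard deviations from the mean of clean training data."""
    mu, sigma = mean(train), stdev(train)
    return [abs(x - mu) / sigma > threshold for x in test]
```

The weakness noted in the abstract is visible even here: anything unusual is flagged, whether or not it is an exploit, which is why a generalized signature-based middle ground is recommended.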
Fourier-Mellin moment-based intertwining map for image encryption
NASA Astrophysics Data System (ADS)
Kaur, Manjit; Kumar, Vijay
2018-03-01
In this paper, a robust image encryption technique that utilizes Fourier-Mellin moments and intertwining logistic map is proposed. Fourier-Mellin moment-based intertwining logistic map has been designed to overcome the issue of low sensitivity of an input image. Multi-objective Non-Dominated Sorting Genetic Algorithm (NSGA-II) based on Reinforcement Learning (MNSGA-RL) has been used to optimize the required parameters of intertwining logistic map. Fourier-Mellin moments are used to make the secret keys more secure. Thereafter, permutation and diffusion operations are carried out on input image using secret keys. The performance of proposed image encryption technique has been evaluated on five well-known benchmark images and also compared with seven well-known existing encryption techniques. The experimental results reveal that the proposed technique outperforms others in terms of entropy, correlation analysis, a unified average changing intensity and the number of changing pixel rate. The simulation results reveal that the proposed technique provides high level of security and robustness against various types of attacks.
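The role of the chaotic map in such schemes is to expand a small key (an initial condition and a control parameter) into a long, key-sensitive keystream for the diffusion stage. The sketch below substitutes a plain logistic map for the paper's intertwining map and uses simple XOR diffusion; it is illustrative only, and the function names are ours.

```python
def logistic_keystream(x0, r, n, burn_in=100):
    """Derive n keystream bytes from the logistic map x <- r*x*(1-x);
    the secret key is (x0, r). The burn-in discards the transient so
    the stream starts in the chaotic regime."""
    x = x0
    for _ in range(burn_in):
        x = r * x * (1 - x)
    stream = []
    for _ in range(n):
        x = r * x * (1 - x)
        stream.append(int(x * 256) % 256)
    return stream

def xor_cipher(data, keystream):
    """XOR diffusion; applying it twice with the same key decrypts."""
    return bytes(b ^ k for b, k in zip(data, keystream))
```

Sensitivity to the key comes from the map's chaos: a tiny change in x0 or r yields a completely different keystream after the burn-in, which is the property the moment-based key derivation above strengthens.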
Design of Efficient Mirror Adder in Quantum- Dot Cellular Automata
NASA Astrophysics Data System (ADS)
Mishra, Prashant Kumar; Chattopadhyay, Manju K.
2018-03-01
Lower power consumption is an essential demand for portable multimedia systems using digital signal processing algorithms and architectures. Quantum-dot cellular automata (QCA) is an emerging nanotechnology for the development of high-performance, ultra-dense, low-power digital circuits. Several efficient QCA-based binary and decimal arithmetic circuits have been implemented; however, important improvements are still possible. This paper demonstrates a mirror adder circuit design in QCA. We present a comparative study of mirror adder cells designed using the conventional CMOS technique and mirror adder cells designed using quantum-dot cellular automata. The QCA-based mirror adders are better in terms of area by a factor of three.
Self-Tuning of Design Variables for Generalized Predictive Control
NASA Technical Reports Server (NTRS)
Lin, Chaung; Juang, Jer-Nan
2000-01-01
Three techniques are introduced to determine the order and control weighting for the design of a generalized predictive controller. These techniques are based on the application of fuzzy logic, genetic algorithms, and simulated annealing to conduct an optimal search over specific performance indexes or objective functions. Fuzzy logic is found to be feasible for real-time and on-line implementation due to its smooth and quick convergence. On the other hand, genetic algorithms and simulated annealing are applicable for initial estimation of the model order and control weighting, and for final fine-tuning within a small region of the solution space. Several numerical simulations for a multiple-input, multiple-output system are given to illustrate the techniques developed in this paper.
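Of the three searches, simulated annealing is the easiest to sketch: always accept an improving candidate, accept a worsening one with probability exp(-Δ/T), and cool T so the search settles into a minimum. The generic sketch below uses a placeholder cost and neighborhood, not the paper's predictive-control performance indexes.

```python
import math
import random

def anneal(cost, state, neighbor, t0=1.0, cooling=0.95, steps=200):
    """Generic simulated annealing: accept improvements outright,
    accept a worsening move with probability exp(-delta/T), and cool
    T geometrically so late-stage search can only settle downhill."""
    best = cur = state
    t = t0
    for _ in range(steps):
        cand = neighbor(cur)
        delta = cost(cand) - cost(cur)
        if delta < 0 or random.random() < math.exp(-delta / t):
            cur = cand
        if cost(cur) < cost(best):
            best = cur
        t *= cooling
    return best
```

For controller tuning, `state` would encode the model order and control weighting and `cost` a closed-loop performance index; the occasional uphill moves are what let the search escape local minima before the temperature drops.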
NASA Astrophysics Data System (ADS)
Cicak, Katarina; Lecocq, Florent; Ranzani, Leonardo; Peterson, Gabriel A.; Kotler, Shlomi; Teufel, John D.; Simmonds, Raymond W.; Aumentado, Jose
Recent developments in coupled mode theory have opened the doors to new nonreciprocal amplification techniques that can be directly leveraged to produce high quantum efficiency in current measurements in microwave quantum information. However, taking advantage of these techniques requires flexible multi-mode circuit designs comprised of low-loss materials that can be implemented using common fabrication techniques. In this talk we discuss the design and fabrication of a new class of multi-pole lumped-element superconducting parametric amplifiers based on Nb/Al-AlOx/Nb Josephson junctions on silicon or sapphire. To reduce intrinsic loss in these circuits we utilize PECVD amorphous silicon as a low-loss dielectric (tan δ ~ 5 × 10^-4), resulting in nearly quantum-limited directional amplification.
NetCoDer: A Retransmission Mechanism for WSNs Based on Cooperative Relays and Network Coding
Valle, Odilson T.; Montez, Carlos; Medeiros de Araujo, Gustavo; Vasques, Francisco; Moraes, Ricardo
2016-01-01
Some of the most difficult problems to deal with when using Wireless Sensor Networks (WSNs) are related to the unreliable nature of communication channels. In this context, the use of cooperative diversity techniques and the application of network coding concepts may be promising solutions to improve the communication reliability. In this paper, we propose the NetCoDer scheme to address this problem. Its design is based on merging cooperative diversity techniques and network coding concepts. We evaluate the effectiveness of the NetCoDer scheme through both an experimental setup with real WSN nodes and a simulation assessment, comparing NetCoDer performance against state-of-the-art TDMA-based (Time Division Multiple Access) retransmission techniques: BlockACK, Master/Slave and Redundant TDMA. The obtained results highlight that the proposed NetCoDer scheme clearly improves the network performance when compared with other retransmission techniques. PMID:27258280
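The network-coding idea behind such coded retransmissions can be shown in a toy example: a relay that overheard several packets retransmits a single XOR combination, and a receiver that lost exactly one packet recovers it by XOR-ing out the packets it already holds. This is only the core principle, not NetCoDer's actual protocol or scheduling.

```python
def xor_bytes(a, b):
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Sender transmits p1..p3; suppose the receiver lost p2.
p1, p2, p3 = b"AAAA", b"BBBB", b"CCCC"
received = {1: p1, 3: p3}

# Relay overheard all three and retransmits one coded packet: p1 ^ p2 ^ p3
coded = xor_bytes(xor_bytes(p1, p2), p3)

# Receiver XORs out what it already has to recover the missing packet
recovered = coded
for pkt in received.values():
    recovered = xor_bytes(recovered, pkt)
assert recovered == p2
```

One coded retransmission can therefore repair different single losses at different receivers simultaneously, which is the efficiency gain over retransmitting each lost packet individually.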
Machine intelligence and autonomy for aerospace systems
NASA Technical Reports Server (NTRS)
Heer, Ewald (Editor); Lum, Henry (Editor)
1988-01-01
The present volume discusses progress toward intelligent robot systems in aerospace applications, NASA Space Program automation and robotics efforts, the supervisory control of telerobotics in space, machine intelligence and crew/vehicle interfaces, expert-system terms and building tools, and knowledge-acquisition for autonomous systems. Also discussed are methods for validation of knowledge-based systems, a design methodology for knowledge-based management systems, knowledge-based simulation for aerospace systems, knowledge-based diagnosis, planning and scheduling methods in AI, the treatment of uncertainty in AI, vision-sensing techniques in aerospace applications, image-understanding techniques, tactile sensing for robots, distributed sensor integration, and the control of articulated and deformable space structures.
Frontiers in Chemical Sensors: Novel Principles and Techniques
NASA Astrophysics Data System (ADS)
Orellana, Guillermo; Moreno-Bondi, Maria Cruz
This third volume of Springer Series on Chemical Sensors and Biosensors aims to enable the researcher or technologist to become acquainted with the latest principles and techniques that keep on enlarging the applications in this fascinating field. It deals with the novel luminescence lifetime-based techniques for interrogation of sensor arrays in high-throughput screening, cataluminescence, chemical sensing with hollow waveguides, new ways in sensor design and fabrication by means of either combinatorial methods or engineered indicator/support couples.
Techniques for noise removal and registration of TIMS data
Hummer-Miller, S.
1990-01-01
Extracting subtle differences from highly correlated thermal infrared aircraft data is possible with appropriate noise filters, constructed and applied in the spatial frequency domain. This paper discusses a heuristic approach to designing noise filters for removing high- and low-spatial-frequency striping and banding. Techniques for registering thermal infrared aircraft data to a topographic base using Thematic Mapper data are presented. The noise removal and registration techniques are applied to TIMS thermal infrared aircraft data.
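A destriping filter of the kind described can be sketched as a notch in the 2-D spectrum. The notch geometry below (zeroing the vertical-frequency axis while preserving the DC neighbourhood, to suppress horizontal stripes) is an assumption for illustration, not the paper's specific filter design.

```python
import numpy as np

def destripe(img, width=1):
    """Notch out horizontal striping: stripes that vary only with row number
    concentrate their energy in the centre column of the shifted 2-D
    spectrum, so that column is zeroed while the DC block is preserved."""
    F = np.fft.fftshift(np.fft.fft2(img))
    rows, cols = F.shape
    cr, cc = rows // 2, cols // 2
    mask = np.ones((rows, cols))
    mask[:, cc - width:cc + width + 1] = 0.0          # notch the stripe axis
    mask[cr - width:cr + width + 1,
         cc - width:cc + width + 1] = 1.0             # keep the DC block
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))
```

Vertical striping would instead be notched along the centre row; in practice the notch locations are chosen heuristically by inspecting the image's power spectrum.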
Fitting Prony Series To Data On Viscoelastic Materials
NASA Technical Reports Server (NTRS)
Hill, S. A.
1995-01-01
An improved method of fitting Prony series to data on viscoelastic materials involves the use of least-squares optimization techniques. The method based on optimization techniques yields closer correlation with the data than the traditional method. It involves no assumptions regarding the gamma_i's and higher-order terms, and provides for as many Prony terms as needed to represent higher-order subtleties in the data. The curve-fitting problem is treated as a design-optimization problem and solved by use of partially-constrained-optimization techniques.
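A common least-squares variant of this idea fixes the relaxation times and solves linearly for the Prony coefficients. The sketch below is that simplified variant with synthetic data, not the report's partially constrained optimization.

```python
import numpy as np

def fit_prony(t, g_data, taus):
    """Fit G(t) = g_inf + sum_i g_i * exp(-t / tau_i) by linear least
    squares, with the relaxation times tau_i fixed in advance
    (e.g. one per decade spanned by the data)."""
    A = np.column_stack([np.ones_like(t)] +
                        [np.exp(-t / tau) for tau in taus])
    coeffs, *_ = np.linalg.lstsq(A, g_data, rcond=None)
    return coeffs[0], coeffs[1:]      # g_inf, (g_1, ..., g_n)

# Synthetic relaxation data for a hypothetical two-term material
t = np.logspace(-2, 2, 200)
g_true = 1.0 + 0.5 * np.exp(-t / 0.1) + 0.3 * np.exp(-t / 10.0)
g_inf, g = fit_prony(t, g_true, taus=[0.1, 10.0])
```

Treating the tau_i as free variables as well, as the report does, turns this into a nonlinear (and typically constrained) optimization problem.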
ERIC Educational Resources Information Center
Ismail, Noor Azizi
2010-01-01
Purpose: The purpose of this paper is to discuss how activity-based costing (ABC) technique can be applied in the context of higher education institutions. It also discusses the obstacles and challenges to the successful implementation of activity-based management (ABM) in the higher education environment. Design/methodology/approach: This paper…
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.
1997-01-01
The NASA Lewis Research Center is developing analytical methods and software tools to create a bridge between the controls and computational fluid dynamics (CFD) disciplines. Traditionally, control design engineers have used coarse nonlinear simulations to generate information for the design of new propulsion system controls. However, such traditional methods are not adequate for modeling the propulsion systems of complex, high-speed vehicles like the High Speed Civil Transport. To properly model the relevant flow physics of high-speed propulsion systems, one must use simulations based on CFD methods. Such CFD simulations have become useful tools for engineers that are designing propulsion system components. The analysis techniques and software being developed as part of this effort are an attempt to evolve CFD into a useful tool for control design as well. One major aspect of this research is the generation of linear models from steady-state CFD results. CFD simulations, often used during the design of high-speed inlets, yield high resolution operating point data. Under a NASA grant, the University of Akron has developed analytical techniques and software tools that use these data to generate linear models for control design. The resulting linear models have the same number of states as the original CFD simulation, so they are still very large and computationally cumbersome. Model reduction techniques have been successfully applied to reduce these large linear models by several orders of magnitude without significantly changing the dynamic response. The result is an accurate, easy to use, low-order linear model that takes less time to generate than those generated by traditional means. The development of methods for generating low-order linear models from steady-state CFD is most complete at the one-dimensional level, where software is available to generate models with different kinds of input and output variables. 
One-dimensional methods have been extended somewhat so that linear models can also be generated from two- and three-dimensional steady-state results. Standard techniques are adequate for reducing the order of one-dimensional CFD-based linear models. However, reduction of linear models based on two- and three-dimensional CFD results is complicated by very sparse, ill-conditioned matrices. Some novel approaches are being investigated to solve this problem.
Wang, Dengjiang; Zhang, Weifang; Wang, Xiangyu; Sun, Bo
2016-01-01
This study presents a novel monitoring method for hole-edge corrosion damage in plate structures based on Lamb wave tomographic imaging techniques. An experimental procedure with a cross-hole layout using 16 piezoelectric transducers (PZTs) was designed. The A0 mode of the Lamb wave was selected, which is sensitive to thickness-loss damage. The iterative algebraic reconstruction technique (ART) method was used to locate and quantify the corrosion damage at the edge of the hole. Hydrofluoric acid with a concentration of 20% was used to corrode the specimen artificially. To estimate the effectiveness of the proposed method, the real corrosion damage was compared with the predicted corrosion damage based on the tomographic method. The results show that the Lamb-wave-based tomographic method can be used to monitor the hole-edge corrosion damage accurately. PMID:28774041
Design of refractive laser beam shapers to generate complex irradiance profiles
NASA Astrophysics Data System (ADS)
Li, Meijie; Meuret, Youri; Duerr, Fabian; Vervaeke, Michael; Thienpont, Hugo
2014-05-01
A Gaussian laser beam is reshaped to have specific irradiance distributions in many applications in order to ensure optimal system performance. Refractive optics are commonly used for laser beam shaping. A refractive laser beam shaper is typically formed by either two plano-aspheric lenses or by one thick lens with two aspherical surfaces. Ray mapping is a general optical design technique to design refractive beam shapers based on geometric optics. This design technique in principle allows the generation of any rotationally symmetric irradiance profile, yet in the literature ray mapping has mainly been developed to transform a Gaussian irradiance profile to a uniform profile. For more complex profiles, especially with low intensity in the inner region, like a Dark Hollow Gaussian (DHG) irradiance profile, the ray mapping technique is not directly applicable in practice. In order to generate these complex profiles, the numerical effort of calculating the aspherical surface points and fitting a surface with sufficient accuracy increases considerably. In this work we evaluate different sampling approaches and surface fitting methods. This allows us to propose and demonstrate a comprehensive numerical approach to efficiently design refractive laser beam shapers to generate rotational-symmetric collimated beams with a complex irradiance profile. Ray tracing analysis for several complex irradiance profiles demonstrates excellent performance of the designed lenses and the versatility of our design procedure.
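The ray-mapping condition itself, equating encircled energies between the input and output beams, can be computed numerically as follows. This is a minimal sketch for collimated, rotationally symmetric beams; the lens-surface calculation and surface-fitting steps the paper focuses on are not included.

```python
import numpy as np

def encircled(r, I):
    """Cumulative encircled energy E(r) = integral_0^r I(r') r' dr'
    via the trapezoid rule."""
    f = I * r
    return np.concatenate(([0.0],
                           np.cumsum(0.5 * (f[1:] + f[:-1]) * np.diff(r))))

def ray_map(r_in, I_in, r_out, I_out):
    """Ray mapping R(r) between two rotationally symmetric beams:
    equate normalized encircled energies and invert E_out numerically."""
    E_in = encircled(r_in, I_in)
    E_out = encircled(r_out, I_out)
    E_in /= E_in[-1]
    E_out /= E_out[-1]
    return np.interp(E_in, E_out, r_out)   # R = E_out^{-1}(E_in(r))
```

For a Gaussian input and uniform target this reproduces the classical closed-form mapping; for profiles with very low inner intensity (such as DHG) the inverse becomes nearly flat in places, which is exactly where the sampling and fitting choices discussed in the paper become critical.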
Statistics based sampling for controller and estimator design
NASA Astrophysics Data System (ADS)
Tenne, Dirk
The purpose of this research is the development of statistical design tools for robust feed-forward/feedback controllers and nonlinear estimators. This dissertation is threefold and addresses the aforementioned topics: nonlinear estimation, target tracking, and robust control. To develop statistically robust controllers and nonlinear estimation algorithms, research has been performed to extend existing techniques, which propagate the statistics of the state, to achieve higher order accuracy. The so-called unscented transformation has been extended to capture higher order moments. Furthermore, higher order moment update algorithms based on a truncated power series have been developed. The proposed techniques are tested on various benchmark examples. Furthermore, the unscented transformation has been utilized to develop a three dimensional geometrically constrained target tracker. The proposed planar circular prediction algorithm has been developed in a local coordinate framework, which is amenable to extension of the tracking algorithm to three dimensional space. This tracker combines the predictions of a circular prediction algorithm and a constant velocity filter by utilizing the Covariance Intersection. This combined prediction can be updated with the subsequent measurement using a linear estimator. The proposed technique is illustrated on a 3D benchmark trajectory, which includes coordinated turns and straight line maneuvers. The third part of this dissertation addresses the design of controllers which include knowledge of parametric uncertainties and their distributions. The parameter distributions are approximated by a finite set of points which are calculated by the unscented transformation. This set of points is used to design robust controllers which minimize a statistical performance of the plant over the domain of uncertainty consisting of a combination of the mean and variance. The proposed technique is illustrated on three benchmark problems. 
The first relates to the design of prefilters for a linear and nonlinear spring-mass-dashpot system and the second applies a feedback controller to a hovering helicopter. Lastly, the statistical robust controller design is devoted to a concurrent feed-forward/feedback controller structure for a high-speed low tension tape drive.
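The basic unscented transformation on which this work builds can be sketched as follows: standard symmetric sigma points with a kappa scaling parameter, propagated through a nonlinearity. The higher-order-moment extensions developed in the dissertation are not shown.

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=0.0):
    """Propagate mean/covariance through a nonlinearity f using
    2n+1 symmetric sigma points."""
    n = len(mean)
    L = np.linalg.cholesky((n + kappa) * cov)   # matrix square root
    sigma = [mean] + [mean + L[:, i] for i in range(n)] + \
            [mean - L[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    y = np.array([f(s) for s in sigma])         # push points through f
    y_mean = w @ y
    y_cov = sum(w[i] * np.outer(y[i] - y_mean, y[i] - y_mean)
                for i in range(2 * n + 1))
    return y_mean, y_cov
```

For a linear map the transform is exact; its advantage over linearization appears for nonlinear f, where it matches the true mean and covariance to higher order without computing Jacobians.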
Atomic force microscopy-based characterization and design of biointerfaces
NASA Astrophysics Data System (ADS)
Alsteens, David; Gaub, Hermann E.; Newton, Richard; Pfreundschuh, Moritz; Gerber, Christoph; Müller, Daniel J.
2017-03-01
Atomic force microscopy (AFM)-based methods have matured into a powerful nanoscopic platform, enabling the characterization of a wide range of biological and synthetic biointerfaces ranging from tissues, cells, membranes, proteins, nucleic acids and functional materials. While the unprecedented signal-to-noise ratio of AFM enables the imaging of biological interfaces from the cellular to the molecular scale, AFM-based force spectroscopy allows their mechanical, chemical, conductive or electrostatic, and biological properties to be probed. The combination of AFM-based imaging and spectroscopy structurally maps these properties and allows their 3D manipulation with molecular precision. In this Review, we survey basic and advanced AFM-related approaches and evaluate their unique advantages and limitations in imaging, sensing, parameterizing and designing biointerfaces. It is anticipated that in the next decade these AFM-related techniques will have a profound influence on the way researchers view, characterize and construct biointerfaces, thereby helping to address fundamental challenges that cannot be solved with other techniques.
Design and simulation of GaN based Schottky betavoltaic nuclear micro-battery.
San, Haisheng; Yao, Shulin; Wang, Xiang; Cheng, Zaijun; Chen, Xuyuan
2013-10-01
The current paper presents a theoretical analysis of a Ni-63 nuclear micro-battery based on the wide-band-gap semiconductor GaN: a thin film covered with thin Ni/Au films to form a Schottky barrier for carrier separation. The total energy deposition in GaN was calculated using Monte Carlo methods, taking into account the full beta spectral energy, which provided an optimal design of the Schottky barrier width. The calculated results show that an 8 μm thick Schottky barrier can collect about 95% of the incident beta particle energy. Considering the actual limitations of current GaN growth techniques, a Fe-doped compensation technique by the MOCVD method can be used to realize n-type GaN with a carrier concentration of 1×10^15 cm^-3, by which a GaN-based Schottky betavoltaic micro-battery can achieve an energy conversion efficiency of 2.25% based on theoretical calculations of semiconductor device physics.
Hercules Single-Stage Reusable Vehicle (HSRV) Operating Base
NASA Technical Reports Server (NTRS)
Moon, Michael J.; McCleskey, Carey M.
2017-01-01
Conceptual design for the layout of lunar-planetary surface support systems remains an important area needing further master planning. This paper explores a structured approach to organize the layout of a Mars-based site equipped for routinely flying a human-scale reusable taxi system. The proposed Hercules Transportation System requires a surface support capability to sustain its routine, affordable, and dependable operation. The approach organizes a conceptual Hercules operating base through functional station sets. The station set approach will allow follow-on work to trade design approaches and consider technologies for more efficient flow of material, energy, and information at future Mars bases and settlements. The station set requirements at a Mars site point to specific capabilities needed. By drawing from specific Hercules design characteristics, the technology requirements for surface-based systems will come into greater focus. This paper begins a comprehensive process for documenting functional needs, architectural design methods, and analysis techniques necessary for follow-on concept studies.
Description of the control system design for the SSF PMAD DC testbed
NASA Technical Reports Server (NTRS)
Baez, Anastacio N.; Kimnach, Greg L.
1991-01-01
The Power Management and Distribution (PMAD) DC Testbed Control System for Space Station Freedom was developed using a top down approach based on classical control system and conventional terrestrial power utilities design techniques. The design methodology includes the development of a testbed operating concept. This operating concept describes the operation of the testbed under all possible scenarios. A unique set of operating states was identified and a description of each state, along with state transitions, was generated. Each state is represented by a unique set of attributes and constraints, and its description reflects the degree of system security within which the power system is operating. Using the testbed operating states description, a functional design for the control system was developed. This functional design consists of a functional outline, a text description, and a logical flowchart for all the major control system functions. Described here are the control system design techniques, various control system functions, and the status of the design and implementation.
Automated designation of tie-points for image-to-image coregistration.
R.E. Kennedy; W.B. Cohen
2003-01-01
Image-to-image registration requires identification of common points in both images (image tie-points: ITPs). Here we describe software implementing an automated, area-based technique for identifying ITPs. The ITP software was designed to follow two strategies: (1) capitalize on human knowledge and pattern recognition strengths, and (2) favour robustness in many...
Directed Student Inquiry: Modeling in Roborovsky Hamsters
ERIC Educational Resources Information Center
Elwess, Nancy L.; Bouchard, Adam
2007-01-01
In this inquiry-based activity, Roborovsky hamsters are used to provide students with an opportunity to develop their skills of analysis, inquiry, and design. These hamsters are easy to maintain, yet offer students a means to use conventional techniques and those of their own design to make further observations through measuring, assessing, and…
ERIC Educational Resources Information Center
Nee, John G.; Kare, Audhut P.
1987-01-01
Explores several concepts in computer assisted design/computer assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer based-systems with the CAD/CAM packages evaluated. (CW)
The Design and Realization of Net Testing System on Campus Network
ERIC Educational Resources Information Center
Ren, Zhanying; Liu, Shijie
2005-01-01
According to the requirements of modern teaching theory and technology, and based on software engineering, database theory, techniques of network information security and system integration, a net testing system on a local network was designed and realized. The system facilitates the separation of testing and teaching and settles the problems of random…
Design, Modeling, and Measurement of a Metamaterial Electromagnetic Field Concentrator
2012-03-22
techniques to Kramers-Kronig relationship that do not appear to have this limitation [34, 67]. AFIT’s rapid design method utilizes several of these...extraction of metamaterial parameters based on Kramers-Kronig relationship,” IEEE Theory Tech. Soc., 58(10):2646–2653, 2010. [68] Teixeira, F. L. and W
ERIC Educational Resources Information Center
Murthy, Pushpalatha P. N.; Thompson, Martin; Hungwe, Kedmon
2014-01-01
A semester-long laboratory course was designed and implemented to familiarize students with modern biochemistry and molecular biology techniques. The designed format involved active student participation, evaluation of data, and critical thinking, and guided students to become independent researchers. The first part of the course focused on…
Simplified Estimation and Testing in Unbalanced Repeated Measures Designs.
Spiess, Martin; Jordan, Pascal; Wendt, Mike
2018-05-07
In this paper we propose a simple estimator for unbalanced repeated measures design models where each unit is observed at least once in each cell of the experimental design. The estimator does not require a model of the error covariance structure. Thus, circularity of the error covariance matrix and estimation of correlation parameters and variances are not necessary. Together with a weak assumption about the reason for the varying number of observations, the proposed estimator and its variance estimator are unbiased. As an alternative to confidence intervals based on the normality assumption, a bias-corrected and accelerated bootstrap technique is considered. We also propose the naive percentile bootstrap for Wald-type tests, where the standard Wald test may break down when the number of observations is small relative to the number of parameters to be estimated. In a simulation study we illustrate the properties of the estimator and the bootstrap techniques to calculate confidence intervals and conduct hypothesis tests in small and large samples under normality and non-normality of the errors. The results imply that the simple estimator is only slightly less efficient than an estimator that correctly assumes a block structure of the error correlation matrix, a special case of which is an equi-correlation matrix. Application of the estimator and the bootstrap technique is illustrated using data from a task switch experiment based on a within-subjects experimental design with 32 cells and 33 participants.
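The naive percentile bootstrap mentioned above can be sketched for a simple statistic. The data here are illustrative; the paper applies the technique to Wald-type tests on model parameters rather than to a sample mean.

```python
import numpy as np

def percentile_bootstrap_ci(data, stat=np.mean, n_boot=5000,
                            alpha=0.05, seed=1):
    """Naive percentile bootstrap confidence interval: resample the data
    with replacement, recompute the statistic, take empirical quantiles."""
    rng = np.random.default_rng(seed)
    n = len(data)
    boots = np.array([stat(rng.choice(data, size=n, replace=True))
                      for _ in range(n_boot)])
    lo, hi = np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi

data = np.array([4.1, 5.0, 3.8, 4.6, 5.2, 4.4, 4.9, 4.0])
lo, hi = percentile_bootstrap_ci(data)
```

The bias-corrected and accelerated (BCa) variant adjusts these quantiles for bias and skewness of the bootstrap distribution, which is why the paper prefers it for confidence intervals under non-normal errors.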
Design of new face-centered cubic high entropy alloys by thermodynamic calculation
NASA Astrophysics Data System (ADS)
Choi, Won-Mi; Jung, Seungmun; Jo, Yong Hee; Lee, Sunghak; Lee, Byeong-Joo
2017-09-01
A new face-centered cubic (fcc) high entropy alloy system with non-equiatomic compositions has been designed by utilizing a CALculation of PHAse Diagram (CALPHAD)-type thermodynamic calculation technique. The new alloy system is based on the representative fcc high entropy alloy, the Cantor alloy, an equiatomic Co-Cr-Fe-Mn-Ni five-component alloy, but fully or partly replaces the cobalt with vanadium and uses non-equiatomic compositions. Alloy compositions expected to have an fcc single-phase structure between 700 °C and melting temperatures are proposed. All the proposed alloys are experimentally confirmed, through an X-ray diffraction analysis, to have the fcc single phase during materials processing (> 800 °C). It is shown that there are more chances to find fcc single-phase high entropy alloys if attention is paid to non-equiatomic composition regions, and that the CALPHAD thermodynamic calculation can be an efficient tool for this. An alloy design technique based on thermodynamic calculation is demonstrated, and the applicability and limitations of the approach as a design tool for high entropy alloys are discussed.
[Curricular design of health postgraduate programs: the case of Masters in epidemiology].
Bobadilla, J L; Lozano, R; Bobadilla, C
1991-01-01
This paper discusses the need to create specific programs for the training of researchers in epidemiology, a field that has traditionally been ignored by the graduate programs in public health. This is due, in part, to the emphasis that has been placed on the training of professionals in other areas of public health. The paper also includes the results of a consensus exercise developed during the curricular design of the Masters Program in Epidemiology of the School of Medicine of the National Autonomous University of Mexico. The technique used during the consensus exercise was the TKJ, which allows the presentation of ideas and possible solutions for a specific problem. This is probably the first published experience in the use of such a technique for the design of an academic curriculum. Taking as a base the general characteristics of the students, the substantive, disciplinary and methodological subjects were chosen. The results showed a need for a multidisciplinary approach based on modern methodologies of statistics and epidemiology. The usefulness of the results of the curricular design and the superiority of this method to reach consensus is also discussed.
NASA Astrophysics Data System (ADS)
Martowicz, Adam; Uhl, Tadeusz
2012-10-01
The paper discusses the applicability of a reliability- and performance-based multi-criteria robust design optimization technique for micro-electromechanical systems, considering their technological uncertainties. Nowadays, micro-devices are commonly applied systems, especially in the automotive industry, taking advantage of utilizing both the mechanical structure and the electronic control circuit on one board. Their frequent use motivates the elaboration of virtual prototyping tools that can be applied in design optimization with the introduction of technological uncertainties and reliability. The authors present a procedure for the optimization of micro-devices, which is based on the theory of reliability-based robust design optimization. This takes into consideration the performance of a micro-device and its reliability assessed by means of uncertainty analysis. The procedure assumes that, for each checked design configuration, the assessment of uncertainty propagation is performed with a meta-modeling technique. The described procedure is illustrated with an example of the optimization carried out for a finite element model of a micro-mirror. The multi-physics approach allowed the introduction of several physical phenomena to correctly model the electrostatic actuation and the squeezing effect present between electrodes. The optimization was preceded by sensitivity analysis to establish the design and uncertain domains. The genetic algorithms fulfilled the defined optimization task effectively. The best discovered individuals are characterized by a minimized value of the multi-criteria objective function, simultaneously satisfying the constraint on material strength. The restriction on the maximum equivalent stresses was introduced via a conditionally formulated objective function with a penalty component. The yielded results were successfully verified with a global uniform search through the input design domain.
A Comparison of FPGA and GPGPU Designs for Bayesian Occupancy Filters
Medina, Luis; Diez-Ochoa, Miguel; Correal, Raul; Cuenca-Asensi, Sergio; Godoy, Jorge; Martínez-Álvarez, Antonio
2017-01-01
Grid-based perception techniques in the automotive sector, based on fusing information from different sensors into robust perceptions of the environment, are proliferating in the industry. However, one of the main drawbacks of these techniques is the prohibitively high computing performance that they traditionally require from embedded automotive systems. In this work, the capabilities of new computing architectures that embed these algorithms are assessed in a real car. The paper compares two ad hoc optimized designs of the Bayesian Occupancy Filter; one for General Purpose Graphics Processing Unit (GPGPU) and the other for Field-Programmable Gate Array (FPGA). The resulting implementations are compared in terms of development effort, accuracy and performance, using datasets from a realistic simulator and from a real automated vehicle. PMID:29137137
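A full Bayesian Occupancy Filter also tracks per-cell velocity distributions; the sketch below shows only the static occupancy-grid log-odds update at its core, as a simplified illustration of the per-cell Bayesian fusion being accelerated.

```python
import math

def logodds(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1 - p))

def update_cell(l, meas_p):
    """Fuse one sensor reading (inverse-model probability meas_p)
    into a cell's log-odds occupancy by simple addition."""
    return l + logodds(meas_p)

def prob(l):
    """Convert log-odds back to a probability."""
    return 1 - 1 / (1 + math.exp(l))

l = 0.0                          # prior occupancy 0.5 (log-odds 0)
for p_hit in (0.7, 0.7, 0.6):    # three readings suggesting "occupied"
    l = update_cell(l, p_hit)
```

Because the update is an independent add per cell, the whole grid can be updated in parallel, which is exactly what makes GPGPU and FPGA implementations attractive for this filter family.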
Knowledge based systems: A preliminary survey of selected issues and techniques
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Kavi, Srinu
1984-01-01
It is only recently that research in Artificial Intelligence (AI) is accomplishing practical results. Most of these results can be attributed to the design and use of expert systems (or Knowledge-Based Systems, KBS) - problem-solving computer programs that can reach a level of performance comparable to that of a human expert in some specialized problem domain. But many computer systems designed to see images, hear sounds, and recognize speech are still in a fairly early stage of development. In this report, a preliminary survey of recent work in the KBS is reported, explaining KBS concepts and issues and techniques used to construct them. Application considerations to construct the KBS and potential KBS research areas are identified. A case study (MYCIN) of a KBS is also provided.
Water supply pipe dimensioning using hydraulic power dissipation
NASA Astrophysics Data System (ADS)
Sreemathy, J. R.; Rashmi, G.; Suribabu, C. R.
2017-07-01
Proper sizing of the pipe components of water distribution networks plays an important role in the overall design of any water supply system. Several approaches have been applied for the design of networks from an economical point of view. Traditional optimization techniques and population-based stochastic algorithms are widely used to optimize networks. But the use of these approaches is mostly found to be limited to the research level due to difficulties in understanding by practicing engineers, design engineers and consulting firms. Moreover, due to the non-availability of commercial software for the optimal design of water distribution systems, practicing engineers are forced to adopt either trial-and-error or experience-based design. This paper presents a simple approach based on the power dissipation in each pipeline as a parameter to design the network economically, though not to the level of the global minimum cost.
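The power-dissipation idea can be illustrated with a short sketch: friction head loss from the Hazen-Williams formula converted to hydraulic power P = rho * g * Q * h_f for candidate diameters. The roughness coefficient and the candidate values are assumptions for illustration; the paper's actual sizing criterion is not reproduced here.

```python
RHO, G = 1000.0, 9.81            # water density (kg/m^3), gravity (m/s^2)

def headloss_hazen_williams(q, length, diam, c=130.0):
    """Friction head loss (m) by Hazen-Williams:
    q in m^3/s, length and diam in m, c the roughness coefficient."""
    return 10.67 * length * q ** 1.852 / (c ** 1.852 * diam ** 4.87)

def power_dissipated(q, length, diam, c=130.0):
    """Hydraulic power dissipated in a pipe, P = rho * g * Q * h_f (W)."""
    return RHO * G * q * headloss_hazen_williams(q, length, diam, c)

# Compare candidate diameters for a 500 m pipe carrying 20 L/s
for d in (0.10, 0.15, 0.20):
    print(d, round(power_dissipated(0.020, 500.0, d), 1))
```

Larger diameters dissipate far less power (h_f scales roughly as D^-4.87) but cost more, so dissipation per pipe gives the designer a simple physical quantity to trade against pipe cost.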
2016-06-01
characteristics, experimental design techniques, and analysis methodologies that distinguish each phase of the MBSE MEASA. To ensure consistency...methodology. Experimental design selection, simulation analysis, and trade space analysis support the final two stages. Figure 27 segments the MBSE MEASA...rounding has the potential to increase the correlation between columns of the experimental design matrix. The design methodology presented in Vieira
A Program Manager’s Methodology for Developing Structured Design in Embedded Weapons Systems.
1983-12-01
the hardware selection. This premise has been reiterated and substantiated by numerous case studies performed in recent years, among them Barry ...measures, rules of thumb, and analysis techniques; this method, with early development by De Marco, is the basis for the Pressman design methodology...desired traits of a design based on the specifications generated, but does not include a procedure for realization of the design. Pressman (Ref. 5
1980-11-01
Item 19 Continued: system design, design handbooks, maintenance, manpower, simulation, decision options, cost estimating relationships, prediction...determine the extent to which human resources data (HRD) are used in early system design. The third was to assess the availability and adequacy of...relationships, regression analysis, comparability analysis, expected value techniques) to provide initial data values in the very early stages of weapon system
Stochastic Analysis and Design of Heterogeneous Microstructural Materials System
NASA Astrophysics Data System (ADS)
Xu, Hongyi
Advanced materials system refers to new materials that are composed of multiple traditional constituents but have complex microstructure morphologies, which lead to superior properties over the conventional materials. To accelerate the development of new advanced materials systems, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructural materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, material informatics and uncertainty quantification. Design representation of microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted based on 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity. The properties of Representative Volume Elements (RVE) are predicted by stochastically reassembling SVE elements with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling and optimization to enable parametric optimization of the microstructure for achieving the desired material properties. 
Material informatics is studied to efficiently reduce the dimension of the microstructure design space. This dissertation develops a machine-learning-based methodology to identify the key microstructure descriptors that most strongly affect the properties of interest. In uncertainty quantification, a comparative study on data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification. Two new goodness-of-fit metrics are developed to provide quantitative measurements of random process models' accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling/design methodologies and techniques to accelerate the process of analyzing and designing new microstructural materials systems.
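The descriptor-screening step described in this abstract can be sketched in miniature. The snippet below is a hedged illustration, not the dissertation's actual method: it uses synthetic data and a simple correlation-based ranking in place of the full machine-learning pipeline, and all descriptor names and coefficients are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dataset: 200 microstructures, 5 candidate descriptors
# (e.g., volume fraction, cluster size, nearest-neighbor distance, ...).
X = rng.uniform(size=(200, 5))
# Synthetic property: dominated by descriptors 0 and 3, plus noise.
y = 3.0 * X[:, 0] - 2.0 * X[:, 3] + 0.05 * rng.normal(size=200)

def rank_descriptors(X, y):
    """Rank descriptors by absolute correlation with the property of interest."""
    scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    return np.argsort(scores)[::-1], scores

order, scores = rank_descriptors(X, y)
print(order[:2])  # the two most influential descriptors
```

A ranking like this shrinks the design space before the more expensive DOE, metamodeling, and optimization steps are run on the surviving descriptors.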
Joint Cross-Layer Design for Wireless QoS Content Delivery
NASA Astrophysics Data System (ADS)
Chen, Jie; Lv, Tiejun; Zheng, Haitao
2005-12-01
In this paper, we propose a joint cross-layer design for wireless quality-of-service (QoS) content delivery. Central to our proposed cross-layer design is the concept of adaptation. Adaptation represents the ability to adjust protocol stacks and applications to respond to channel variations. We focus our cross-layer design especially on the application, media access control (MAC), and physical layers. The network is designed based on our proposed fast frequency-hopping orthogonal frequency division multiplexing (OFDM) technique. We also propose a QoS-aware scheduler and a power adaptation transmission scheme operating at both the base station and mobile sides. The proposed MAC scheduler coordinates the transmissions of an IP base station and mobile nodes. The scheduler also selects appropriate transmission formats and packet priorities for individual users based on current channel conditions and the users' QoS requirements. The test results show that our cross-layer design provides an excellent framework for wireless QoS content delivery.
Designing for Temporal Awareness: The Role of Temporality in Time-Critical Medical Teamwork
Kusunoki, Diana S.; Sarcevic, Aleksandra
2016-01-01
This paper describes the role of temporal information in emergency medical teamwork and how time-based features can be designed to support the temporal awareness of clinicians in this fast-paced and dynamic environment. Engagement in iterative design activities with clinicians over the course of two years revealed a strong need for time-based features and mechanisms, including timestamps for tasks based on absolute time and automatic stopclocks that measure elapsed time by counting up from when a task was performed. We describe in detail the aspects of temporal awareness central to clinicians’ awareness needs and then provide examples of how we addressed these needs through the design of a shared information display. As an outcome of this process, we define four types of time representation techniques to facilitate the design of time-based features: (1) timestamps based on absolute time, (2) timestamps relative to the process start time, (3) time since task performance, and (4) time until the next required task. PMID:27478880
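The four time representation techniques enumerated above can be computed directly from a small event log. The sketch below is a minimal illustration with made-up times and task names; it is not the paper's display implementation.

```python
from datetime import datetime, timedelta

# Hypothetical event log for a resuscitation: process start and task times.
process_start = datetime(2016, 1, 1, 10, 0, 0)
last_task_done = datetime(2016, 1, 1, 10, 12, 30)      # e.g., a medication given
next_task_due = last_task_done + timedelta(minutes=5)  # e.g., a repeat dose
now = datetime(2016, 1, 1, 10, 15, 0)

# (1) Timestamp based on absolute time.
absolute = last_task_done.strftime("%H:%M:%S")
# (2) Timestamp relative to the process start time.
relative = last_task_done - process_start
# (3) Time since task performance (count-up stopclock).
since = now - last_task_done
# (4) Time until the next required task (count-down).
until = next_task_due - now

print(absolute, relative, since, until)  # 10:12:30 0:12:30 0:02:30 0:02:30
```

Each representation answers a different awareness question: when something happened, how far into the process it happened, how long ago it happened, and how soon the next task is due.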
Innovative application of virtual display technique in virtual museum
NASA Astrophysics Data System (ADS)
Zhang, Jiankang
2017-09-01
A virtual museum displays and simulates the functions of a real museum on the Internet in the form of 3D virtual reality through interactive programs. Based on the Virtual Reality Modeling Language, building a virtual museum and ensuring its effective interaction with the offline museum lie in making full use of the 3D panorama technique, the virtual reality technique, and the augmented reality technique, and in innovatively taking advantage of dynamic environment modeling, real-time 3D graphics generation, system integration, and other key virtual reality techniques in the overall design of the virtual museum. The 3D panorama technique, also known as panoramic photography or virtual reality, is a technique based on static images of reality. Virtual reality is a kind of computer simulation system that creates an interactive 3D dynamic visual world for the user to experience. Augmented reality, also known as mixed reality, is a technique that simulates and mixes information (visual, sound, taste, touch, etc.) that is difficult for humans to experience in reality. These technologies make the virtual museum possible. It will not only bring better experience and convenience to the public, but also help improve the influence and cultural functions of the real museum.
NASA Technical Reports Server (NTRS)
Pifer, Alburt E.; Hiscox, William L.; Cummins, Kenneth L.; Neumann, William T.
1991-01-01
Gated, wideband, magnetic direction finders (DFs) were originally designed to measure the bearing of cloud-to-ground lightning relative to the sensor. A recent addition to this device uses proprietary waveform discrimination logic to select return stroke signatures and certain range-dependent features in the waveform to provide an estimate of the range of flashes within 50 km. The enhanced ranging techniques, designed and developed for use in a single-station thunderstorm warning sensor, are discussed. Included are the results of ongoing evaluations being conducted under a variety of meteorological and geographic conditions.
Luo, Wei; Chen, Sheng; Chen, Lei; Li, Hualong; Miao, Pengcheng; Gao, Huiyi; Hu, Zelin; Li, Miao
2017-05-29
We describe a theoretical model to analyze temperature effects on the Kretschmann surface plasmon resonance (SPR) sensor, and describe a new double-incident angle technique to simultaneously measure changes in refractive index (RI) and temperature. The method uses the observation that the output signals obtained from two different incident angles each have a linear dependence on RI and temperature, and are independent. A proof-of-concept experiment using different NaCl concentration solutions as analytes demonstrates the ability of the technique. The optical design is as simple and robust as conventional SPR detection, but provides a way to discriminate between RI-induced and temperature-induced SPR changes. This technique provides a way for traditional SPR sensors to detect RI in different temperature environments, and may lead to better design and fabrication of SPR sensors that are robust to temperature variation.
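The core of the double-incident-angle idea is that two independent linear responses in two unknowns form an invertible 2x2 system. The sketch below illustrates that arithmetic only; the sensitivity coefficients are hypothetical placeholders, not the paper's calibration values.

```python
import numpy as np

# Hypothetical calibration: sensor output at each incident angle depends
# linearly (and independently) on refractive-index change dn and
# temperature change dT:  S_i = a_i * dn + b_i * dT.
A = np.array([[3000.0,  0.8],    # angle 1 sensitivities (a1, b1)
              [1200.0, -0.5]])   # angle 2 sensitivities (a2, b2)

def recover(dn_true, dT_true):
    S = A @ np.array([dn_true, dT_true])   # simulated sensor outputs
    return np.linalg.solve(A, S)           # invert the 2x2 system

dn, dT = recover(1e-4, 2.0)
print(dn, dT)  # recovers the RI change and the temperature change
```

As long as the two angle responses are linearly independent (nonzero determinant), a single pair of readings separates the RI-induced shift from the temperature-induced one.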
Li, Zhifei; Qin, Dongliang; Yang, Feng
2014-01-01
In defense-related programs, the use of capability-based analysis, design, and acquisition has been significant. To confront one of the most challenging features of capability-based analysis (CBA), a huge design space, a literature review of design space exploration was first conducted. Then, for design space exploration of an aerospace system of systems, a bilayer mapping method was put forward, based on existing experimental and operating data. Finally, the feasibility of the approach was demonstrated with an illustrative example. With the data mining techniques of rough sets theory (RST) and self-organized mapping (SOM), alternatives for the aerospace system-of-systems architecture were mapped from P-space (performance space) to C-space (configuration space), and then from C-space to D-space (design space), respectively. Ultimately, the performance space was mapped to the design space, which completed the exploration and preliminary reduction of the entire design space. This method provides a computational analysis and implementation scheme for large-scale simulation. PMID:24790572
Joint spectral characterization of photon-pair sources
NASA Astrophysics Data System (ADS)
Zielnicki, Kevin; Garay-Palmett, Karina; Cruz-Delgado, Daniel; Cruz-Ramirez, Hector; O'Boyle, Michael F.; Fang, Bin; Lorenz, Virginia O.; U'Ren, Alfred B.; Kwiat, Paul G.
2018-06-01
The ability to determine the joint spectral properties of photon pairs produced by the processes of spontaneous parametric downconversion (SPDC) and spontaneous four-wave mixing (SFWM) is crucial for guaranteeing the usability of heralded single photons and polarization-entangled pairs for multi-photon protocols. In this paper, we compare six different techniques that yield either a characterization of the joint spectral intensity or of the closely related purity of heralded single photons. These six techniques include: (i) scanning monochromator measurements, (ii) a variant of Fourier transform spectroscopy designed to extract the desired information exploiting a resource-optimized technique, (iii) dispersive fibre spectroscopy, (iv) stimulated-emission-based measurement, (v) measurement of the second-order correlation function g(2) for one of the two photons, and (vi) two-source Hong-Ou-Mandel interferometry. We discuss the relative performance of these techniques for the specific cases of a SPDC source designed to be factorable and SFWM sources of varying purity, and compare the techniques' relative advantages and disadvantages.
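The purity these techniques estimate is tied to the Schmidt decomposition of the joint spectral amplitude (JSA): purity equals the sum of squared normalized Schmidt coefficients, which a singular value decomposition of a discretized JSA yields directly. The sketch below uses two synthetic Gaussian JSAs (one factorable, one spectrally correlated); the grids and widths are illustrative choices, not data from the paper.

```python
import numpy as np

def heralded_purity(jsa):
    """Purity of a heralded photon from a discretized joint spectral
    amplitude, via the Schmidt (singular value) decomposition."""
    s = np.linalg.svd(jsa, compute_uv=False)
    lam = s**2 / np.sum(s**2)         # normalized Schmidt coefficients
    return float(np.sum(lam**2))      # purity = sum of lambda_i^2

w = np.linspace(-3, 3, 200)           # dimensionless detuning grid
ws, wi = np.meshgrid(w, w)            # signal and idler frequencies

# Factorable ("round") JSA: separable in signal and idler.
round_jsa = np.exp(-(ws**2 + wi**2) / 2)
# Spectrally anti-correlated JSA: narrow along the ws + wi direction.
corr_jsa = np.exp(-((ws + wi)**2) / 0.2 - ((ws - wi)**2) / 8)

print(heralded_purity(round_jsa))   # close to 1 (nearly single Schmidt mode)
print(heralded_purity(corr_jsa))    # well below 1 (many Schmidt modes)
```

A factorable source like the SPDC design mentioned in the abstract corresponds to the first case; spectral correlations spread weight over many Schmidt modes and degrade heralded-photon purity.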
NASA Astrophysics Data System (ADS)
Zhao, Xuefeng; Cui, Yanjun; Wei, Heming; Kong, Xianglong; Zhang, Pinglei; Sun, Changsen
2013-06-01
In this paper, a novel steel rebar corrosion monitoring technique for steel reinforced concrete structures is proposed, designed, and tested. The technique is based on the fiber optical white light interferometer (WLI) sensing technique. Firstly, a feasibility test was carried out using an equal-strength beam to compare the strain sensing ability of the WLI and a fiber Bragg grating (FBG). The comparison results showed that the sensitivity of the WLI is sufficient for corrosion expansion strain monitoring. Then, two WLI corrosion sensors (WLI-CSs) were designed, fabricated, and embedded into concrete specimens to monitor expansion strain caused by steel rebar corrosion. Their performance was studied in an accelerated electrochemical corrosion test. Experimental results show that expansion strain along the fiber optical coil winding area can be detected and measured accurately by the proposed sensor. The advantages of the proposed monitoring technique are that quantitative corrosion expansion monitoring can be executed in real time and at low cost for reinforced concrete structures.
NASA Astrophysics Data System (ADS)
Castro-Lopez, Rafael; Fernandez, Francisco V.; Rodriguez Vazquez, Angel
2005-06-01
Accelerating the synthesis of increasingly complex analog integrated circuits is key to bridging the widening gap between what we can integrate and what we can design while meeting ever-tightening time-to-market constraints. It is a well-known fact in the semiconductor industry that this goal can only be attained by means of adequate CAD methodologies, techniques, and accompanying tools. This is particularly important in analog physical synthesis (a.k.a. layout generation), where large sensitivities of the circuit performances to the many subtle details of layout implementation (device matching, loading and coupling effects, reliability, and area are of utmost importance to analog designers) render complete automation a truly challenging task. To approach the problem, two directions have traditionally been considered, knowledge-based and optimization-based, each with its own pros and cons. Besides, recently reported solutions oriented to speeding up the overall design flow by means of reuse-based practices or by cutting off time-consuming, error-prone spins between electrical and layout synthesis (a technique known as layout-aware synthesis) rely on an outstandingly rapid yet efficient layout generation method. This paper analyses the suitability of procedural layout generation based on templates (a knowledge-based approach) by examining the requirements that both layout reuse and layout-aware solutions impose, and how layout templates meet them. The ability to capture the know-how of experienced layout designers and the turnaround times for layout instancing are considered the main comparative aspects in relation to other layout generation approaches. A discussion on the benefit-cost trade-off of using layout templates is also included.
In addition to this analysis, the paper delves deeper into systematic techniques to develop fully reusable layout templates for analog circuits, either for a change of the circuit sizing (i.e., layout retargeting) or a change of the fabrication process (i.e., layout migration). Several examples implemented with the Cadence's Virtuoso tool suite are provided as demonstration of the paper's contributions.
A Darwinian approach to control-structure design
NASA Technical Reports Server (NTRS)
Zimmerman, David C.
1993-01-01
Genetic algorithms (GAs), as introduced by Holland (1975), are one form of directed random search. The form of direction is based on Darwin's 'survival of the fittest' theories. GAs are radically different from the more traditional design optimization techniques. GAs work with a coding of the design variables, as opposed to working with the design variables directly. The search is conducted from a population of designs (i.e., from a large number of points in the design space), unlike the traditional algorithms, which search from a single design point. The GA requires only objective function information, as opposed to gradient or other auxiliary information. Finally, the GA is based on probabilistic transition rules, as opposed to deterministic rules. These features allow the GA to attack problems with local-global minima, discontinuous design spaces, and mixed variable problems, all in a single, consistent framework.
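The four distinguishing features listed above (coded variables, a population, objective-only evaluation, probabilistic rules) can be shown in a few lines. This is a generic minimal GA on a toy bit-counting objective, not the control-structure design method of the report; population size, mutation rate, and the objective are all illustrative choices.

```python
import random

random.seed(1)

# Minimal GA: designs are coded as 16-bit strings; search uses only
# objective values and probabilistic selection/crossover/mutation.
N_BITS, POP, GENS = 16, 30, 60

def fitness(bits):               # toy objective: number of ones
    return sum(bits)

def select(pop):                 # fitness-proportional (roulette) selection
    return random.choices(pop, weights=[fitness(b) + 1 for b in pop], k=2)

def crossover(a, b):             # single-point crossover of two codings
    cut = random.randrange(1, N_BITS)
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.02):     # probabilistic bit flips
    return [1 - g if random.random() < rate else g for g in bits]

pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]
for _ in range(GENS):
    elite = max(pop, key=fitness)                       # keep the best design
    children = [mutate(crossover(*select(pop))) for _ in range(POP - 1)]
    pop = [elite] + children

best = max(pop, key=fitness)
print(fitness(best))
```

Note that no gradients are computed anywhere: the same loop works unchanged for discontinuous or mixed-variable objectives, which is the point of the abstract's comparison with traditional optimizers.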
USDA-ARS?s Scientific Manuscript database
The molecular biological techniques for plasmid-based assembly and cloning of synthetic assembled gene open reading frames are essential for elucidating the function of the proteins encoded by the genes. These techniques involve the production of full-length cDNA libraries as a source of plasmid-bas...
Distributed Space Mission Design for Earth Observation Using Model-Based Performance Evaluation
NASA Technical Reports Server (NTRS)
Nag, Sreeja; LeMoigne-Stewart, Jacqueline; Cervantes, Ben; DeWeck, Oliver
2015-01-01
Distributed Space Missions (DSMs) are gaining momentum in their application to earth observation missions owing to their unique ability to increase observation sampling in multiple dimensions. DSM design is a complex problem with many design variables, multiple objectives determining performance and cost, and emergent, often unexpected, behaviors. There are very few open-access tools available to explore the tradespace of variables, minimize cost and maximize performance for pre-defined science goals, and thereby select the optimal design. This paper presents a software tool that can generate multiple DSM architectures based on pre-defined design variable ranges and size those architectures in terms of predefined science and cost metrics. The tool will help a user select Pareto-optimal DSM designs based on design of experiments techniques. The tool will be applied to some earth observation examples to demonstrate its applicability in making some key decisions between different performance metrics and cost metrics early in the design lifecycle.
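The Pareto-selection step mentioned above reduces to keeping the non-dominated architectures once each candidate has been scored. The sketch below is a generic illustration with invented architecture names and metric values, not output from the tool itself.

```python
# Given candidate DSM architectures scored on a science metric (to
# maximize) and a cost metric (to minimize), keep non-dominated designs.

def pareto_front(designs):
    """designs: list of (name, science, cost); science up, cost down."""
    front = []
    for name, sci, cost in designs:
        dominated = any(
            s2 >= sci and c2 <= cost and (s2 > sci or c2 < cost)
            for _, s2, c2 in designs
        )
        if not dominated:
            front.append(name)
    return front

candidates = [
    ("A", 0.9, 120), ("B", 0.7, 60), ("C", 0.6, 80),  # C dominated by B
    ("D", 0.9, 150),                                  # D dominated by A
]
print(pareto_front(candidates))  # ['A', 'B']
```

Designs on the front represent the genuine performance-versus-cost trade-offs a mission designer must choose among; everything else can be discarded early in the design lifecycle.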
Banach, Marzena; Wasilewska, Agnieszka; Dlugosz, Rafal; Pauk, Jolanta
2018-05-18
Due to the problem of aging societies, there is a need for smart buildings to monitor and support people with various disabilities, including rheumatoid arthritis. The aim of this paper is to elaborate novel techniques for wireless motion capture systems for the monitoring and rehabilitation of disabled people, for application in smart buildings. The proposed techniques are based on cross-verification of distance measurements between markers and transponders in an environment with highly variable parameters. To verify them, algorithms were developed that enable comprehensive investigation of a system with different numbers of transponders and varying ambient parameters (temperature and noise). In estimating the real positions of markers, various linear and nonlinear filters were used. Several thousand tests were carried out for various system parameters and different marker locations. The results show that localization error may be reduced by as much as 90%. It was observed that repeating a measurement reduces localization error by as much as one order of magnitude. The proposed system, based on wireless techniques, offers high commercial potential. However, it requires extensive cooperation between teams, including hardware and software design, system modelling, and architectural design.
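The claim that repeated measurement reduces localization error can be illustrated with a one-line statistical argument: averaging n independent noisy readings shrinks the error roughly as 1/sqrt(n). The distances and noise level below are hypothetical, chosen only to make the effect visible.

```python
import random
import statistics

random.seed(42)

# Sketch: a noisy marker-to-transponder distance measurement, repeated
# and averaged to reduce the localization error.
TRUE_DIST = 4.0    # metres (hypothetical)
NOISE_SD = 0.30    # per-measurement noise (hypothetical)

def measure():
    return random.gauss(TRUE_DIST, NOISE_SD)

single_err = abs(measure() - TRUE_DIST)
avg_100 = statistics.mean(measure() for _ in range(100))
avg_err = abs(avg_100 - TRUE_DIST)

print(f"single: {single_err:.3f} m, averaged(100): {avg_err:.3f} m")
```

With 100 repetitions the expected error drops by a factor of about ten, consistent with the order-of-magnitude improvement reported in the abstract.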
An additional study and implementation of tone calibrated technique of modulation
NASA Technical Reports Server (NTRS)
Rafferty, W.; Bechtel, L. K.; Lay, N. E.
1985-01-01
The Tone Calibrated Technique (TCT) was shown to be theoretically free from an error floor, and is only limited, in practice, by implementation constraints. The concept of the TCT transmission scheme along with a baseband implementation of a suitable demodulator is introduced. Two techniques for the generation of the TCT signal are considered: a Manchester source encoding scheme (MTCT) and a subcarrier based technique (STCT). The results are summarized for the TCT link computer simulation. The hardware implementation of the MTCT system is addressed and the digital signal processing design considerations involved in satisfying the modulator/demodulator requirements are outlined. The program findings are discussed and future directions are suggested based on conclusions made regarding the suitability of the TCT system for the transmission channel presently under consideration.
Two-layer wireless distributed sensor/control network based on RF
NASA Astrophysics Data System (ADS)
Feng, Li; Lin, Yuchi; Zhou, Jingjing; Dong, Guimei; Xia, Guisuo
2006-11-01
A project for an embedded Wireless Distributed Sensor/Control Network (WDSCN) based on RF is presented after analyzing the disadvantages of traditional measurement and control systems. Because of their high cost and complexity, wireless techniques such as Bluetooth and WiFi cannot meet the needs of the WDSCN. The two-layer WDSCN is designed based on an RF technique that operates in the free ISM frequency band with low power and high transmission speed. The network is also low cost, portable, and movable, integrating technologies from computer networking, sensors, microprocessors, and wireless communications. A two-layer network topology is selected for the system, and a simple but efficient self-organization protocol is designed to support periodic data collection, event-driven reporting, and store-and-forward operation. Furthermore, an adaptive frequency-hopping technique is adopted for anti-jamming. The problems of power reduction and data synchronization in the wireless system are solved efficiently. Based on the discussion above, a measurement and control network is set up to control typical instruments and sensors, such as temperature sensors and signal converters, collect data, and monitor surrounding environmental parameters. This system works well in different rooms. Experimental results show that the system provides an efficient solution to the WDSCN through wireless links, with high efficiency, low power, high stability, flexibility, and wide working range.
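The adaptive frequency-hopping idea can be sketched as a hop-sequence generator that excludes channels observed to be jammed. This is a generic illustration with an invented 16-channel set and error-reporting interface, not the protocol described in the paper.

```python
import random

random.seed(7)

# Sketch of adaptive frequency hopping: hop pseudo-randomly over the ISM
# channels, dropping any channel whose error reports suggest jamming.
CHANNELS = list(range(16))     # hypothetical ISM channel set
jammed = {3, 7, 11}            # channels currently observed as bad

class AdaptiveHopper:
    def __init__(self, channels):
        self.good = list(channels)

    def report_errors(self, channel):
        """Called when transmissions on a channel repeatedly fail."""
        if channel in self.good:
            self.good.remove(channel)

    def next_channel(self):
        return random.choice(self.good)

hopper = AdaptiveHopper(CHANNELS)
for ch in jammed:
    hopper.report_errors(ch)

sequence = [hopper.next_channel() for _ in range(20)]
print(all(ch not in jammed for ch in sequence))  # True
```

Both ends must share the pruned channel list (e.g., via the self-organization protocol) so transmitter and receiver stay on the same hop sequence.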
Data-Adaptable Modeling and Optimization for Runtime Adaptable Systems
2016-06-08
execution scenarios e. Enables model-guided optimization algorithms that outperform the state of the art f. Understands the overhead of system…the Data-Adaptable System Model (DASM), which facilitates design by enabling the designer to: 1) specify both an application's task flow as well as…systems. The MILAN [3] framework specializes in the design, simulation, and synthesis of System-on-Chip (SoC) applications using model-based techniques
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivek
1999-01-01
The benchmark active controls technology and wind tunnel test program at NASA Langley Research Center was started with the objective of investigating the nonlinear, unsteady aerodynamics and active flutter suppression of wings in transonic flow. The paper presents the flutter suppression control law design process, numerical nonlinear simulation, and wind tunnel test results for the NACA 0012 benchmark active control wing model. The flutter suppression control law design processes using (1) classical, (2) linear quadratic Gaussian (LQG), and (3) minimax techniques are described. A unified general formulation and solution for the LQG and minimax approaches, based on steady-state differential game theory, is presented. Design considerations for improving the control law robustness and digital implementation are outlined. It was shown that simple control laws, when properly designed based on physical principles, can suppress flutter with limited control power even in the presence of transonic shocks and flow separation. In wind tunnel tests in air and heavy gas medium, the closed-loop flutter dynamic pressure was increased to the tunnel upper limit of 200 psf. The control law robustness and performance predictions were verified in highly nonlinear flow conditions, gain and phase perturbations, and spoiler deployment. A non-design plunge instability condition was also successfully suppressed.
Android Based Mobile Environment for Moodle Users
ERIC Educational Resources Information Center
de Clunie, Gisela T.; Clunie, Clifton; Castillo, Aris; Rangel, Norman
2013-01-01
This paper is about the development of a platform that eases, through Android-based mobile devices, the mobility of users of virtual courses at the Technological University of Panama. The platform deploys computational techniques such as "web services," design patterns, ontologies and mobile technologies to allow mobile devices to communicate…
Alternative Constraint Handling Technique for Four-Bar Linkage Path Generation
NASA Astrophysics Data System (ADS)
Sleesongsom, S.; Bureerat, S.
2018-03-01
This paper proposes an extension of a new concept for path generation from our previous work by adding a new constraint-handling technique. The proposed technique was initially designed for problems without prescribed timing by avoiding the timing constraint, while the remaining constraints are handled with a new constraint-handling technique, a kind of penalty technique. In a comparative study, path generation optimisation problems are solved using self-adaptive population size teaching-learning-based optimization (SAP-TLBO) and the original TLBO. Two traditional path generation test problems are used to test the proposed technique. The results show that the new technique can be applied to path generation problems without prescribed timing and gives better results than the previous technique. Furthermore, SAP-TLBO outperforms the original algorithm.
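The penalty idea behind the constraint handling can be shown in isolation: infeasible designs are not rejected, their objective is inflated in proportion to the constraint violation, so any unconstrained optimizer (TLBO in the paper, a crude grid search here) can be applied. The toy objective, constraint, and penalty weight below are illustrative, not the paper's formulation.

```python
# Penalty-based constraint handling: add weight * (violation) to the
# objective so constrained minima attract an unconstrained search.

def penalized(objective, constraints, weight=1e3):
    """constraints: functions g(x) that must satisfy g(x) <= 0."""
    def f(x):
        violation = sum(max(0.0, g(x)) for g in constraints)
        return objective(x) + weight * violation
    return f

# Toy problem: minimize (x - 3)^2 subject to x <= 2.
obj = lambda x: (x - 3.0) ** 2
g = lambda x: x - 2.0            # g(x) <= 0  <=>  x <= 2

f = penalized(obj, [g])
# Crude search over candidate designs (stands in for TLBO here):
best = min((i * 0.001 for i in range(5000)), key=f)
print(round(best, 2))  # ~2.0: the optimum sits on the constraint boundary
```

The penalty weight must dominate the objective's gradient near the boundary; too small a weight lets infeasible designs win, too large a weight can stiffen the search landscape.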
DeMAID/GA an Enhanced Design Manager's Aid for Intelligent Decomposition
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1996-01-01
Many companies are looking for new tools and techniques to aid a design manager in making decisions that can reduce the time and cost of a design cycle. One tool is the Design Manager's Aid for Intelligent Decomposition (DeMAID). Since the initial public release of DeMAID in 1989, much research has been done in the areas of decomposition, concurrent engineering, parallel processing, and process management; many new tools and techniques have emerged. Based on these recent research and development efforts, numerous enhancements have been added to DeMAID to further aid the design manager in saving both cost and time in a design cycle. The key enhancement, a genetic algorithm (GA), will be available in the next public release called DeMAID/GA. The GA sequences the design processes to minimize the cost and time in converging a solution. The major enhancements in the upgrade of DeMAID to DeMAID/GA are discussed in this paper. A sample conceptual design project is used to show how these enhancements can be applied to improve the design cycle.
Liu, Langechuan; Antonuk, Larry E.; El-Mohri, Youcef; Zhao, Qihua; Jiang, Hao
2014-01-01
Purpose: Active matrix flat-panel imagers (AMFPIs) incorporating thick, segmented scintillators have demonstrated order-of-magnitude improvements in detective quantum efficiency (DQE) at radiotherapy energies compared to systems based on conventional phosphor screens. Such improved DQE values facilitate megavoltage cone-beam CT (MV CBCT) imaging at clinically practical doses. However, the MV CBCT performance of such AMFPIs is highly dependent on the design parameters of the scintillators. In this paper, optimization of the design of segmented scintillators was explored using a hybrid modeling technique which encompasses both radiation and optical effects. Methods: Imaging performance in terms of the contrast-to-noise ratio (CNR) and spatial resolution of various hypothetical scintillator designs was examined through a hybrid technique involving Monte Carlo simulation of radiation transport in combination with simulation of optical gain distributions and optical point spread functions. The optical simulations employed optical parameters extracted from a best fit to measurement results reported in a previous investigation of a 1.13 cm thick, 1016 μm pitch prototype BGO segmented scintillator. All hypothetical designs employed BGO material with a thickness and element-to-element pitch ranging from 0.5 to 6 cm and from 0.508 to 1.524 mm, respectively. In the CNR study, for each design, full tomographic scans of a contrast phantom incorporating various soft-tissue inserts were simulated at a total dose of 4 cGy. Results: Theoretical values for contrast, noise, and CNR were found to be in close agreement with empirical results from the BGO prototype, strongly supporting the validity of the modeling technique. CNR and spatial resolution for the various scintillator designs demonstrate complex behavior as scintillator thickness and element pitch are varied—with a clear trade-off between these two imaging metrics up to a thickness of ∼3 cm. 
Based on these results, an optimization map indicating the regions of design that provide a balance between these metrics was obtained. The map shows that, for a given set of optical parameters, scintillator thickness and pixel pitch can be judiciously chosen to maximize performance without resorting to thicker, more costly scintillators. Conclusions: Modeling radiation and optical effects in thick, segmented scintillators through use of a hybrid technique can provide a practical way to gain insight as to how to optimize the performance of such devices in radiotherapy imaging. Assisted by such modeling, the development of practical designs should greatly facilitate low-dose, soft tissue visualization employing MV CBCT imaging in external beam radiotherapy. PMID:24877827
NASA Technical Reports Server (NTRS)
DeBonis, J. R.; Trefny, C. J.; Steffen, C. J., Jr.
1999-01-01
Design and analysis of the inlet for a rocket-based combined cycle engine is discussed. Computational fluid dynamics was used in both the design and subsequent analysis. Reynolds-averaged Navier-Stokes simulations were performed using both perfect gas and real gas assumptions. An inlet design that operates over the required Mach number range from 0 to 12 was produced. Performance data for cycle analysis were post-processed using a stream thrust averaging technique. A detailed performance database for cycle analysis is presented. The effect of vehicle forebody compression on air capture is also examined.
The SAMEX Vector Magnetograph: A Design Study for a Space-Based Solar Vector Magnetograph
NASA Technical Reports Server (NTRS)
Hagyard, M. J.; Gary, G. A.; West, E. A.
1988-01-01
This report presents the results of a pre-phase A study performed by the Marshall Space Flight Center (MSFC) for the Air Force Geophysics Laboratory (AFGL) to develop a design concept for a space-based solar vector magnetograph and hydrogen-alpha telescope. These are two of the core instruments for a proposed Air Force mission, the Solar Activities Measurement Experiments (SAMEX). This mission is designed to study the processes which give rise to activity in the solar atmosphere and to develop techniques for predicting solar activity and its effects on the terrestrial environment.
Machine-Learning Approach for Design of Nanomagnetic-Based Antennas
NASA Astrophysics Data System (ADS)
Gianfagna, Carmine; Yu, Huan; Swaminathan, Madhavan; Pulugurtha, Raj; Tummala, Rao; Antonini, Giulio
2017-08-01
We propose a machine-learning approach for design of planar inverted-F antennas with a magneto-dielectric nanocomposite substrate. It is shown that machine-learning techniques can be efficiently used to characterize nanomagnetic-based antennas by accurately mapping the particle radius and volume fraction of the nanomagnetic material to antenna parameters such as gain, bandwidth, radiation efficiency, and resonant frequency. A modified mixing rule model is also presented. In addition, the inverse problem is addressed through machine learning as well, where given the antenna parameters, the corresponding design space of possible material parameters is identified.
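Both directions of the mapping described above (material parameters to antenna parameters, and back) can be sketched with a surrogate model. The snippet below substitutes an invented analytic formula for the real electromagnetic response, and inverts it by nearest-neighbor lookup on a design grid; the formula, ranges, and coefficients are all hypothetical, standing in for a trained machine-learning model.

```python
import numpy as np

# Synthetic stand-in for the forward map from (particle radius, volume
# fraction) to resonant frequency; a real model would be trained on
# EM simulations or measurements, not this formula.
def synthetic_resonance(radius_nm, vol_frac):
    return 2.4 - 1.0 * vol_frac - 0.004 * radius_nm   # GHz (hypothetical)

# Forward map sampled on a design grid (the "training data").
radii = np.linspace(5, 50, 46)        # particle radius, nm
fracs = np.linspace(0.05, 0.5, 46)    # volume fraction
R, V = np.meshgrid(radii, fracs)
F = synthetic_resonance(R, V)

def inverse_design(target_ghz):
    """Inverse problem: find material parameters giving a target frequency."""
    i, j = np.unravel_index(np.argmin(np.abs(F - target_ghz)), F.shape)
    return R[i, j], V[i, j]

r, v = inverse_design(2.2)
print(round(synthetic_resonance(r, v), 2))  # close to the 2.2 GHz target
```

As in the paper's inverse problem, several (radius, volume fraction) pairs may hit the same target frequency; a grid search like this naturally exposes that whole feasible region rather than a single answer.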
[Veneer computer aided design based on reverse engineering technology].
Liu, Ming-li; Chen, Xiao-dong; Wang, Yong
2012-03-01
To explore a computer-aided design (CAD) method for veneer restoration and to assess whether the solution can help the prosthesis meet morphological esthetic standards. A volunteer's upper right central incisor needed to be restored with a veneer. Super-hard stone models of the patient's dentition (before and after tooth preparation) were scanned with a three-dimensional laser scanner. The veneer margin was designed as a butt-to-butt type. The veneer was constructed using reverse engineering (RE) software. A technique guideline for veneer CAD based on RE software was explored; the resulting veneer was smooth, continuous, and symmetrical, meeting esthetic construction needs. Reconstructing veneer restorations based on RE technology is a feasible method.
A Cloud-Based X73 Ubiquitous Mobile Healthcare System: Design and Implementation
Ji, Zhanlin; O'Droma, Máirtín; Zhang, Xin; Zhang, Xueji
2014-01-01
Based on the user-centric paradigm for next generation networks, this paper describes a ubiquitous mobile healthcare (uHealth) system based on the ISO/IEEE 11073 personal health data (PHD) standards (X73) and cloud computing techniques. A number of design issues associated with the system implementation are outlined. The system includes a middleware on the user side, providing a plug-and-play environment for heterogeneous wireless sensors and mobile terminals utilizing different communication protocols and a distributed “big data” processing subsystem in the cloud. The design and implementation of this system are envisaged as an efficient solution for the next generation of uHealth systems. PMID:24737958
Nonlinear ship waves and computational fluid dynamics
MIYATA, Hideaki; ORIHARA, Hideo; SATO, Yohei
2014-01-01
Research undertaken in the first author's laboratory at the University of Tokyo over the past 30 years is highlighted. The discovery of nonlinear waves (named free-surface shock waves) in the vicinity of a ship advancing at constant speed provided the starting point for the development of innovative technologies in ship hull-form design. Based on these findings, a multitude of computational fluid dynamics (CFD) techniques have been developed over this period and are highlighted in this paper. The TUMMAC code has been developed for wave problems, based on a rectangular grid system, while the WISDAM code treats both wave and viscous-flow problems in the framework of a boundary-fitted grid system. These two techniques can cope with almost all fluid-dynamical problems relating to ships, including resistance, ship motion, and ride-comfort issues. Consequently, the two codes have contributed significantly to progress in ship-design technology and now form an integral part of the ship-designing process. PMID:25311139
NASA Technical Reports Server (NTRS)
Klarer, P.
1994-01-01
An alternative methodology for designing an autonomous navigation and control system is discussed. This generalized hybrid system is based on a less sequential and less anthropomorphic approach than that used in the more traditional artificial intelligence (AI) technique. The architecture is designed to allow both synchronous and asynchronous operations between various behavior modules. This is accomplished by intertask communications channels which implement each behavior module and each interconnection node as a stand-alone task. The proposed design architecture allows for construction of hybrid systems which employ both subsumption and traditional AI techniques as well as providing for a teleoperator's interface. Implementation of the architecture is planned for the prototype Robotic All Terrain Lunar Explorer Rover (RATLER) which is described briefly.
Iris unwrapping using the Bresenham circle algorithm for real-time iris recognition
NASA Astrophysics Data System (ADS)
Carothers, Matthew T.; Ngo, Hau T.; Rakvic, Ryan N.; Broussard, Randy P.
2015-02-01
An efficient parallel architecture design for the iris unwrapping process in a real-time iris recognition system using the Bresenham circle algorithm is presented in this paper. Based on the characteristics of the model parameters, this algorithm was chosen over the widely used polar conversion technique as the iris unwrapping model. The architecture design is parallelized to increase the throughput of the system and is suitable for processing an input image of 320 × 240 pixels in real time using field-programmable gate array (FPGA) technology. Quartus software is used to implement, verify, and analyze the design's performance using the VHSIC Hardware Description Language. The system's predicted processing time is faster than that of the modern iris unwrapping technique used today.
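As a minimal sketch of the circle-tracing step underlying such an unwrapping model, the midpoint/Bresenham circle algorithm below is the generic textbook form (not the paper's FPGA implementation); it generates the integer perimeter points of a circle using only integer arithmetic:

```python
def bresenham_circle(cx, cy, r):
    """Integer perimeter points of a circle via the midpoint/Bresenham method.

    Each step uses only integer additions and comparisons, which is what
    makes the algorithm attractive for a parallel FPGA datapath.
    """
    pts = set()
    x, y, d = 0, r, 3 - 2 * r
    while x <= y:
        # Mirror the computed octant point into all eight octants.
        for dx, dy in ((x, y), (y, x), (-x, y), (-y, x),
                       (x, -y), (y, -x), (-x, -y), (-y, -x)):
            pts.add((cx + dx, cy + dy))
        if d < 0:
            d += 4 * x + 6
        else:
            d += 4 * (x - y) + 10
            y -= 1
        x += 1
    return pts
```

For iris unwrapping, circles of increasing radius between the pupil and limbus boundaries would be traced this way and their pixels written into rows of a rectangular image.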
NASA Technical Reports Server (NTRS)
Shoji, J. M.; Larson, V. R.
1976-01-01
Advanced liquid-bipropellant rocket engine analysis techniques have been applied to predict the potential delivered performance and to design thruster wall-cooling schemes for laser-heated rocket thrusters. Delivered specific impulse values greater than 1000 lbf-sec/lbm are potentially achievable, based on calculations for thrusters designed for 10-kW and 5000-kW laser beam power levels. A thruster wall-cooling technique utilizing a combination of regenerative cooling and a carbon-seeded hydrogen boundary layer is presented. The flowing carbon-seeded hydrogen boundary layer absorbs the heat radiated from the high-temperature plasma. Also described is a forced-convection wall-cooling design for an experimental test thruster.
An efficient auto TPT stitch guidance generation for optimized standard cell design
NASA Astrophysics Data System (ADS)
Samboju, Nagaraj C.; Choi, Soo-Han; Arikati, Srini; Cilingir, Erdem
2015-03-01
As technology continues to shrink below 14 nm, triple patterning lithography (TPT) is a worthwhile lithography methodology for printing dense layers such as Metal1. However, it increases the complexity of standard cell design, as it is very difficult to develop a TPT-compliant layout without compromising on area. An accurate stitch-generation methodology is therefore essential to meet the standard cell area requirement defined by the technology shrink factor. In this paper, we present an efficient automatic TPT stitch guidance generation technique for optimized standard cell design. The basic idea is to first identify the conflicting polygons based on the Fix Guidance [1] solution developed by Synopsys. Fix Guidance is a reduced sub-graph containing a minimum set of edges along with the connecting polygons; by eliminating these edges in a design, 3-color conflicts can be resolved. Once the conflicting polygons are identified using this method, they are categorized into four types [2] (Type 1 to 4). The categorization is based on the number of interactions a polygon has with the coloring links and the triangle loops of the fix guidance. For each type, a criterion for the keep-out region is defined, based on which the final stitch guidance locations are generated. This technique provides various possible stitch locations and helps the user select the best one considering both design flexibility (maximum pin access, small area) and process preferences. Based on this technique, a standard cell library for place and route (P and R) can be developed with colorless data and a stitch marker defined by the designer using our proposed method. After P and R, the full chip (block) contains only the colorless data and the standard cell stitch markers. These stitch markers are considered "must-be-stitch" candidates.
Hence during full chip decomposition it is not required to generate and select the stitch markers again for the complete data; therefore, the proposed method reduces the decomposition time significantly.
Wang, Rui-Rong; Yu, Xiao-Qing; Zheng, Shu-Wang; Ye, Yang
2016-01-01
Location-based services (LBS) provided by wireless sensor networks have garnered a great deal of attention from researchers and developers in recent years. Chirp spread spectrum (CSS) signaling with time difference of arrival (TDOA) ranging is an effective LBS technique with regard to positioning accuracy, cost, and power consumption. The design and implementation of the location engine and location management based on TDOA location algorithms were the focus of this study; as the core of the system, the location engine was designed as a series of location algorithms and smoothing algorithms. To enhance location accuracy, a Kalman filter algorithm and a moving weighted average technique were applied to smooth the TDOA range measurements and the location results, respectively, which are calculated by the cooperation of a Kalman TDOA algorithm and a Taylor TDOA algorithm. The location management server, the information center of the system, was designed with Data Server and Mclient. To evaluate the performance of the location algorithms and the stability of the system software, we used a Nanotron nanoLOC Development Kit 3.0 to conduct indoor and outdoor location experiments. The results indicated that the location system runs stably with high accuracy, with absolute error below 0.6 m.
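The moving-weighted-average smoothing step mentioned above can be sketched as follows; the window length and weight values are illustrative assumptions, not those used in the study:

```python
def moving_weighted_average(samples, weights):
    """Smooth a sequence with a sliding weighted window.

    `weights` applies oldest-to-newest within the window; the window is
    truncated at the start of the sequence so early samples still get output.
    """
    out = []
    n = len(weights)
    for i in range(len(samples)):
        window = samples[max(0, i - n + 1): i + 1]
        w = weights[-len(window):]          # keep the newest-sample weights
        out.append(sum(s * wi for s, wi in zip(window, w)) / sum(w))
    return out
```

Weighting recent fixes more heavily (e.g., weights `[1, 2, 3]`) reduces jitter in the reported position while limiting the lag introduced on a moving target.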
Fabrication of strain gauge based sensors for tactile skins
NASA Astrophysics Data System (ADS)
Baptist, Joshua R.; Zhang, Ruoshi; Wei, Danming; Saadatzi, Mohammad Nasser; Popa, Dan O.
2017-05-01
Fabricating cost-effective, reliable, and functional sensors for electronic skins has been a challenging undertaking for the last several decades. Applications of such skins include haptic interfaces, robotic manipulation, and physical human-robot interaction. Much of our recent work has focused on producing compliant sensors that can be easily formed around objects to sense normal, tension, or shear forces. Our past designs have involved the use of flexible sensors and interconnects fabricated on Kapton substrates, and piezoresistive inks that are 3D printed using electrohydrodynamic (EHD) jetting onto interdigitated electrode (IDE) structures. However, EHD print heads require a specialized nozzle and the application of a high-voltage electric field, for which tuning process parameters can be difficult depending on the choice of inks and substrates. Therefore, in this paper we explore sensor fabrication techniques using a novel wet lift-off photolithographic technique for patterning the base polymer piezoresistive material, specifically poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate), or PEDOT:PSS. Fabricated sensors are electrically and thermally characterized, and temperature-compensated designs are proposed and validated. Packaging techniques for sensors in polymer encapsulants are proposed and demonstrated to produce a tactile interface device for a robot.
NASA Astrophysics Data System (ADS)
Lin, Yuan; Choudhury, Kingshuk R.; McAdams, H. Page; Foos, David H.; Samei, Ehsan
2014-03-01
We previously proposed a novel image-based quality assessment technique [1] to assess the perceptual quality of clinical chest radiographs. In this paper, an observer study was designed and conducted to systematically validate this technique. Ten metrics were involved in the observer study, i.e., lung grey level, lung detail, lung noise, rib-lung contrast, rib sharpness, mediastinum detail, mediastinum noise, mediastinum alignment, subdiaphragm-lung contrast, and subdiaphragm area. For each metric, three tasks were successively presented to the observers. In each task, six ROI images were randomly presented in a row, and observers were asked to rank the images based only on a designated quality, disregarding the other qualities. A range slider above the images was used by observers to indicate the acceptable range based on the corresponding perceptual attribute. Five board-certified radiologists from Duke participated in this observer study on a DICOM-calibrated diagnostic display workstation under low ambient lighting conditions. The observer data were analyzed in terms of the correlations between the observer ranking orders and the algorithmic ranking orders. Based on the collected acceptable ranges, quality consistency ranges were statistically derived. The observer study showed that, for each metric, the averaged ranking orders of the participating observers were strongly correlated with the algorithmic orders. For the lung grey level, the observer ranking orders completely accorded with the algorithmic ranking orders. The quality consistency ranges derived from this observer study were close to those derived from our previous study. The observer study indicates that the proposed image-based quality assessment technique provides a robust reflection of the perceptual image quality of clinical chest radiographs. The derived quality consistency ranges can be used to automatically predict the acceptability of a clinical chest radiograph.
The potential application of the blackboard model of problem solving to multidisciplinary design
NASA Technical Reports Server (NTRS)
Rogers, James L.
1989-01-01
The potential application of the blackboard model of problem solving to multidisciplinary design is discussed. Multidisciplinary design problems are complex, poorly structured, and lack a predetermined decision path from the initial starting point to the final solution. The final solution is achieved using data from different engineering disciplines. Ideally, for the final solution to be the optimum solution, there must be a significant amount of communication among the different disciplines plus intradisciplinary and interdisciplinary optimization. In reality, this is not what happens in today's sequential approach to multidisciplinary design. Therefore it is highly unlikely that the final solution is the true optimum solution from an interdisciplinary optimization standpoint. A multilevel decomposition approach is suggested as a technique to overcome the problems associated with the sequential approach, but no tool currently exists with which to fully implement this technique. A system based on the blackboard model of problem solving appears to be an ideal tool for implementing this technique because it offers an incremental problem solving approach that requires no a priori determined reasoning path. Thus it has the potential of finding a more optimum solution for the multidisciplinary design problems found in today's aerospace industries.
Nguyen, Dung C; Ma, Dongsheng Brian; Roveda, Janet M W
2012-01-01
As one of the key clinical imaging methods, computed X-ray tomography can be further improved using new nanometer-scale CMOS sensors. This will enhance the current technique's ability in terms of cancer detection size, position, and detection accuracy on anatomical structures. This paper reviews designs of SOI-based CMOS sensors and their architectural design in mammography systems. Based on existing experimental results, SOI technology can provide a low-noise (SNR around 87.8 dB) and high-gain (30 V/V) CMOS imager. It is also expected that, together with fast data-acquisition designs, the new type of imager may play an important role in near-future high-dimensional imaging, in addition to today's 2D imagers.
New Vistas in Chemical Product and Process Design.
Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul
2016-06-07
Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates.
Lightweight Thermoformed Structural Components and Optics
NASA Technical Reports Server (NTRS)
Zeiders, Glenn W.; Bradford, Larry J.
2004-01-01
A technique that involves the use of thermoformed plastics has been developed to enable the design and fabrication of ultra-lightweight structural components and mirrors for use in outer space. The technique could also be used to produce items for special terrestrial uses in which minimization of weight is a primary design consideration. Although the inherent strengths of thermoplastics are clearly inferior to those of metals and composite materials, thermoplastics offer a distinct advantage in that they can be shaped, at elevated temperatures, to replicate surfaces (e.g., prescribed mirror surfaces) precisely. Furthermore, multiple elements can be bonded into structures of homogeneous design that display minimal thermal deformation aside from simple expansion. The design aspect of the present technique is based on the principle that the deflection of a plate that has internal structure depends far more on the overall thickness than on the internal details; thus, a very stiff, light structure can be made from thin plastic that is heat-formed to produce a sufficiently high moment of inertia. General examples of such structures include I-beams and egg crates.
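The design principle cited here, that deflection is governed by overall thickness rather than internal detail, follows from the parallel-axis theorem. A minimal sketch comparing per-unit-width second moments of area for a solid plate and for two thin face sheets (core contribution neglected; the numbers in the usage note are illustrative):

```python
def second_moment_solid(t):
    """Second moment of area per unit width of a solid plate about its midplane."""
    return t ** 3 / 12.0

def second_moment_sandwich(t_face, h):
    """Two thin face sheets of thickness t_face whose midplanes are h apart.

    Parallel-axis theorem; the light core's own bending contribution is
    neglected, as is usual for thin-face sandwich estimates.
    """
    return 2.0 * (t_face ** 3 / 12.0 + t_face * (h / 2.0) ** 2)
```

Two 1 mm sheets spaced 10 mm apart use the same material as a 2 mm solid plate but are roughly 75 times stiffer in bending, which is the effect the egg-crate-style thermoformed structures exploit.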
Zhang, Xiaoliang; Martin, Alastair; Jordan, Caroline; Lillaney, Prasheel; Losey, Aaron; Pang, Yong; Hu, Jeffrey; Wilson, Mark; Cooke, Daniel; Hetts, Steven W
2017-04-01
It is technically challenging to design compact yet sensitive miniature catheter radio frequency (RF) coils for endovascular interventional MR imaging. In this work, a new design method for catheter RF coils is proposed based on the coaxial transmission line resonator (TLR) technique. Due to its distributed circuit, the TLR catheter coil does not need any lumped capacitors to support its resonance, which simplifies the practical design and construction and provides a straightforward technique for designing miniature catheter-mounted imaging coils that are appropriate for interventional neurovascular procedures. The outer conductor of the TLR serves as an RF shield, which prevents electromagnetic energy loss, and improves coil Q factors. It also minimizes interaction with surrounding tissues and signal losses along the catheter coil. To investigate the technique, a prototype catheter coil was built using the proposed coaxial TLR technique and evaluated with standard RF testing and measurement methods and MR imaging experiments. Numerical simulation was carried out to assess the RF electromagnetic field behavior of the proposed TLR catheter coil and the conventional lumped-element catheter coil. The proposed TLR catheter coil was successfully tuned to 64 MHz for proton imaging at 1.5 T. B1 fields were numerically calculated, showing improved magnetic field intensity of the TLR catheter coil over the conventional lumped-element catheter coil. MR images were acquired from a dedicated vascular phantom using the TLR catheter coil and also the system body coil. The TLR catheter coil is able to provide a significant signal-to-noise ratio (SNR) increase (a factor of 200 to 300) over its imaging volume relative to the body coil. Catheter imaging RF coil design using the proposed coaxial TLR technique is feasible and advantageous in endovascular interventional MR imaging applications.
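Assuming the resonator is operated as a quarter-wavelength line (an assumption for illustration; the abstract does not state the electrical length), its physical length at a given frequency follows directly from the line's effective permittivity:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def quarter_wave_length(freq_hz, eps_eff):
    """Physical length of a quarter-wavelength transmission-line resonator.

    eps_eff is the effective relative permittivity of the line's dielectric;
    the wave slows by a factor of sqrt(eps_eff) relative to free space.
    """
    return C / (4.0 * freq_hz * eps_eff ** 0.5)
```

At 64 MHz in air (eps_eff = 1) the quarter wavelength is about 1.17 m; a higher-permittivity dielectric shortens the resonator by sqrt(eps_eff), which is one route toward catheter-scale dimensions.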
NASA Astrophysics Data System (ADS)
Zhu, Li; Najafizadeh, Laleh
2017-06-01
We investigate the problem related to the averaging procedure in functional near-infrared spectroscopy (fNIRS) brain imaging studies. Typically, to reduce noise and to empower the signal strength associated with task-induced activities, recorded signals (e.g., in response to repeated stimuli or from a group of individuals) are averaged through a point-by-point conventional averaging technique. However, due to the existence of variable latencies in recorded activities, the use of the conventional averaging technique can lead to inaccuracies and loss of information in the averaged signal, which may result in inaccurate conclusions about the functionality of the brain. To improve averaging accuracy in the presence of variable latencies, we present an averaging framework that employs dynamic time warping (DTW) to account for the temporal variation in the alignment of the fNIRS signals to be averaged. As a proof of concept, we focus on the problem of localizing task-induced active brain regions. The framework is extensively tested on experimental data (obtained from both block-design and event-related-design experiments) as well as on simulated data. In all cases, it is shown that the DTW-based averaging technique outperforms the conventional averaging technique in estimating the location of task-induced active regions in the brain, suggesting that such advanced averaging methods should be employed in fNIRS brain imaging studies.
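A minimal sketch of DTW-based averaging for two one-dimensional signals; this is the generic textbook DTW recurrence, not the authors' exact framework:

```python
def dtw_path(a, b):
    """Optimal DTW alignment path between two 1-D sequences (O(n*m) table)."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    # Backtrack from (n, m); tuple comparison makes ties prefer the diagonal.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        _, i, j = min((D[i - 1][j - 1], i - 1, j - 1),
                      (D[i - 1][j], i - 1, j),
                      (D[i][j - 1], i, j - 1))
    return path[::-1]

def dtw_average(ref, sig):
    """Warp sig onto ref's time axis along the DTW path, then average."""
    warped = [[] for _ in ref]
    for i, j in dtw_path(ref, sig):
        warped[i].append(sig[j])
    return [(r + sum(w) / len(w)) / 2.0 for r, w in zip(ref, warped)]
```

Averaging two latency-shifted copies of the same waveform this way tends to preserve the peak amplitude, whereas point-by-point averaging flattens and broadens it.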
Model building techniques for analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald
2009-09-01
The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others who contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for analysis results needs to be decreased to have an impact on overall product development. This effort can be reduced immensely through simple Pro/ENGINEER modeling techniques that amount to the method by which features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.
Li, Linlin; Ding, Steven X; Qiu, Jianbin; Yang, Ying
2017-02-01
This paper is concerned with a real-time observer-based fault detection (FD) approach for a general type of nonlinear systems in the presence of external disturbances. To this end, in the first part of this paper, we deal with the definition of and the design condition for an L∞/L2 type of nonlinear observer-based FD system. This analytical framework is fundamental for the development of real-time nonlinear FD systems with the aid of some well-established techniques. In the second part, we address the integrated design of the L∞/L2 observer-based FD systems by applying the Takagi-Sugeno (T-S) fuzzy dynamic modeling technique as the solution tool. This fuzzy observer-based FD approach is developed via piecewise Lyapunov functions and can be applied to the case in which the premise variables of the FD system are nonsynchronous with the premise variables of the fuzzy model of the plant. In the end, a case study on a laboratory three-tank system setup is given to show the efficiency of the proposed results.
NASA Technical Reports Server (NTRS)
Stephan, Amy; Erikson, Carol A.
1991-01-01
As an initial attempt to introduce expert system technology into an onboard environment, a model based diagnostic system using the TRW MARPLE software tool was integrated with prototype flight hardware and its corresponding control software. Because this experiment was designed primarily to test the effectiveness of the model based reasoning technique used, the expert system ran on a separate hardware platform, and interactions between the control software and the model based diagnostics were limited. While this project met its objective of showing that model based reasoning can effectively isolate failures in flight hardware, it also identified the need for an integrated development path for expert system and control software for onboard applications. In developing expert systems that are ready for flight, artificial intelligence techniques must be evaluated to determine whether they offer a real advantage onboard, identify which diagnostic functions should be performed by the expert systems and which are better left to the procedural software, and work closely with both the hardware and the software developers from the beginning of a project to produce a well designed and thoroughly integrated application.
NASA Astrophysics Data System (ADS)
Stranieri, Andrew; Yearwood, John; Pham, Binh
1999-07-01
The development of data warehouses for the storage and analysis of very large corpora of medical image data represents a significant trend in health care and research. Amongst other benefits, the trend toward warehousing enables the use of techniques for automatically discovering knowledge from large and distributed databases. In this paper, we present an application design for knowledge discovery from databases (KDD) techniques that enhance the performance of the problem-solving strategy known as case-based reasoning (CBR) for the diagnosis of radiological images. The problem of diagnosing abnormality of the cervical spine is used to illustrate the method. The design of a case-based medical image diagnostic support system has three essential characteristics. The first is a case representation that comprises textual descriptions of the image, visual features that are known to be useful for indexing images, and additional visual features to be discovered by data mining many existing images. The second characteristic of the approach presented here involves the development of a case base that comprises an optimal number and distribution of cases. The third characteristic involves the automatic discovery, using KDD techniques, of adaptation knowledge to enhance the performance of the case-based reasoner. Together, the three characteristics of our approach can overcome the real-time efficiency obstacles that otherwise militate against the application of CBR to the domain of medical image analysis.
Kranz, Christine
2014-01-21
In recent years, major developments in scanning electrochemical microscopy (SECM) have significantly broadened the application range of this electroanalytical technique from high-resolution electrochemical imaging via nanoscale probes to large scale mapping using arrays of microelectrodes. A major driving force in advancing the SECM methodology is based on developing more sophisticated probes beyond conventional micro-disc electrodes usually based on noble metals or carbon microwires. This critical review focuses on the design and development of advanced electrochemical probes particularly enabling combinations of SECM with other analytical measurement techniques to provide information beyond exclusively measuring electrochemical sample properties. Consequently, this critical review will focus on recent progress and new developments towards multifunctional imaging.
Study on a novel laser target detection system based on software radio technique
NASA Astrophysics Data System (ADS)
Song, Song; Deng, Jia-hao; Wang, Xue-tian; Gao, Zhen; Sun, Ji; Sun, Zhi-hui
2008-12-01
This paper presents the application of software radio techniques to a laser target detection system with pseudo-random code modulation. Based on the theory of software radio, the basic framework of the system, the hardware platform, and the implementation of the software system are detailed. The block diagram of the system, the DSP circuit, the block diagram of the pseudo-random code generator, and the software flow diagram of the signal processing are also designed. Experimental results have shown that the application of software radio techniques provides a novel method to realize the modularization, miniaturization, and intelligence of the laser target detection system, and makes upgrading and improving the system simpler, more convenient, and cheaper.
Material Encounters with Mathematics: The Case for Museum Based Cross-Curricular Integration
ERIC Educational Resources Information Center
de Freitas, Elizabeth; Bentley, Sean J.
2012-01-01
This paper reports on research from a network of high school and museum partnerships designed to explore techniques for integrating mathematics and physics learning experiences during the first year of high school. The foundation of the curriculum is a problem-based, museum-based, and hands-on approach to mathematics and physics. In this paper, we…
ART/Ada design project, phase 1. Task 2 report: Detailed design
NASA Technical Reports Server (NTRS)
Allen, Bradley P.
1988-01-01
Various issues are studied in the context of the design of an Ada-based expert system building tool. Using an existing successful design as a starting point, the impact of the Ada language and Ada development methodologies on that design is analyzed, the system is redesigned in Ada, and its performance is analyzed using both complexity-theoretic and empirical techniques. The algorithms specified in the overall design are refined, resolving and documenting any open design issues, identifying each system module, documenting the internal architecture and control logic, and describing the primary data structures involved in each module.
Rapid Prototyping of High Performance Signal Processing Applications
NASA Astrophysics Data System (ADS)
Sane, Nimish
Advances in embedded systems for digital signal processing (DSP) are enabling many scientific projects and commercial applications. At the same time, these applications are key to driving advances in many important kinds of computing platforms. In this region of high performance DSP, rapid prototyping is critical for faster time-to-market (e.g., in the wireless communications industry) or time-to-science (e.g., in radio astronomy). DSP system architectures have evolved from being based on application specific integrated circuits (ASICs) to incorporate reconfigurable off-the-shelf field programmable gate arrays (FPGAs), the latest multiprocessors such as graphics processing units (GPUs), or heterogeneous combinations of such devices. We, thus, have a vast design space to explore based on performance trade-offs, and expanded by the multitude of possibilities for target platforms. In order to allow systematic design space exploration, and develop scalable and portable prototypes, model based design tools are increasingly used in design and implementation of embedded systems. These tools allow scalable high-level representations, model based semantics for analysis and optimization, and portable implementations that can be verified at higher levels of abstractions and targeted toward multiple platforms for implementation. The designer can experiment using such tools at an early stage in the design cycle, and employ the latest hardware at later stages. In this thesis, we have focused on dataflow-based approaches for rapid DSP system prototyping. This thesis contributes to various aspects of dataflow-based design flows and tools as follows: 1. We have introduced the concept of topological patterns, which exploits commonly found repetitive patterns in DSP algorithms to allow scalable, concise, and parameterizable representations of large scale dataflow graphs in high-level languages. 
We have shown how an underlying design tool can systematically exploit a high-level application specification consisting of topological patterns in various aspects of the design flow.
2. We have formulated the core functional dataflow (CFDF) model of computation, which can be used to model a wide variety of deterministic dynamic dataflow behaviors. We have also presented key features of the CFDF model and tools based on these features. These tools provide support for heterogeneous dataflow behaviors, an intuitive and common framework for functional specification, support for functional simulation, portability from several existing dataflow models to CFDF, an integrated emphasis on minimally restricted specification of actor functionality, and support for efficient static, quasi-static, and dynamic scheduling techniques.
3. We have developed a generalized scheduling technique for CFDF graphs based on decomposing a CFDF graph into static graphs that interact at run time. Furthermore, we have refined this generalized scheduling technique using a new notion of "mode grouping," which better exposes the underlying static behavior. We have also developed a scheduling technique for a class of dynamic applications that generates parameterized looped schedules (PLSs), which can handle dynamic dataflow behavior without major limitations on compile-time predictability.
4. We have demonstrated the use of dataflow-based approaches for the design and implementation of radio astronomy DSP systems using the application example of a tunable digital downconverter (TDD) for spectrometers. The design and implementation of this module have been an integral part of this thesis work. The thesis demonstrates a design flow that consists of a high-level software prototype, analysis and simulation using the dataflow interchange format (DIF) tool, and integration of this design with the existing tool flow for the target implementation on an FPGA platform, the Interconnect Break-out Board (IBOB). We have also explored the trade-off between the low hardware cost of fixed digital downconverter configurations and the flexibility offered by TDD designs.
5. This thesis has also contributed significantly to the development and release of the latest version of a graph package oriented toward models of computation (MoCGraph). Our enhancements to this package include support for tree data structures and for generalized schedule trees (GSTs), which provide a useful data structure for a wide variety of schedule representations. Our extensions to the MoCGraph package provided key support for the CFDF model and for the functional simulation capabilities in the DIF package.
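The enable/invoke contract at the heart of CFDF-style actors can be sketched in a few lines: each actor is always in a well-defined mode with fixed token rates, so a scheduler can test firability by a simple count. The class and method names below are illustrative stand-ins, not the DIF or MoCGraph APIs.

```python
# Minimal sketch of CFDF-style enable/invoke actor semantics.
# Names are illustrative; they do not reproduce the DIF package's API.
from collections import deque

class Fifo:
    def __init__(self):
        self.q = deque()
    def population(self):
        return len(self.q)

class Switch:
    """Dynamic actor: routes a data token to one of two outputs
    depending on a control token (classic dynamic-dataflow behavior)."""
    def __init__(self, ctrl, data, out_true, out_false):
        self.ctrl, self.data = ctrl, data
        self.out_true, self.out_false = out_true, out_false
        self.mode = "read_control"        # current CFDF mode

    def enable(self):
        # Each mode has fixed consumption rates, so enabling reduces
        # to a token-count check -- the key CFDF property.
        if self.mode == "read_control":
            return self.ctrl.population() >= 1
        return self.data.population() >= 1

    def invoke(self):
        if self.mode == "read_control":
            self.branch = self.ctrl.q.popleft()
            self.mode = "route_data"
        else:
            token = self.data.q.popleft()
            (self.out_true if self.branch else self.out_false).q.append(token)
            self.mode = "read_control"

# Simple dynamic schedule: fire the actor while it remains enabled.
ctrl, data, t_out, f_out = Fifo(), Fifo(), Fifo(), Fifo()
ctrl.q.extend([True, False]); data.q.extend([10, 20])
sw = Switch(ctrl, data, t_out, f_out)
while sw.enable():
    sw.invoke()
routed = (list(t_out.q), list(f_out.q))
```

A static or quasi-static scheduler would exploit the fixed per-mode rates to group firings at compile time; the dynamic loop above is the fallback that works for any CFDF graph.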
NASA Astrophysics Data System (ADS)
Lopez Lopez, Roberto
2013-02-01
This work describes the concept, design, development, evolution, and application of the FastCam instrument. FastCam is an astronomical imaging photometer that captures diffraction-limited images at high frame rates, in order to apply the Lucky Imaging technique to medium- and large-sized (1.5 to 4 m) telescopes. Under suitable conditions, the Lucky Imaging technique allows ground-based telescopes to reach the resolution limit for astronomical images. This work describes the atmospheric problems and the active and adaptive optics techniques used to address them, as well as the fundamentals of Lucky Imaging. The considerations behind the project's development and its design parameters are then discussed. Next, the optical design and its adaptations to several telescopes are reviewed, followed by some of the scientific results obtained with this project, both in positional astronomy and in complex structures in globular clusters and binary systems. Different designs arising from the basic idea, and the instruments now in development that are expanding the system's capabilities and the technique, are explained. Other possible applications in fields where image sharpness is necessary despite fluctuations or instabilities of the observing system are also pointed out: ophthalmology, video monitoring, etc.
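The core of the Lucky Imaging technique (rank many short exposures by a sharpness metric, keep only the best fraction, register them, and co-add) can be sketched as follows. This is a toy illustration with synthetic frames, not FastCam's actual pipeline.

```python
# Toy sketch of Lucky Imaging: select the sharpest short exposures,
# register on the brightest pixel, and co-add. Illustrative only.
import numpy as np

def lucky_stack(frames, keep_fraction=0.1):
    frames = np.asarray(frames, dtype=float)
    # Sharpness proxy: peak brightness relative to total flux.
    scores = frames.max(axis=(1, 2)) / frames.sum(axis=(1, 2))
    n_keep = max(1, int(len(frames) * keep_fraction))
    best = np.argsort(scores)[::-1][:n_keep]

    h, w = frames.shape[1:]
    stack = np.zeros((h, w))
    for i in best:
        # Register by shifting the brightest pixel to the image center.
        py, px = np.unravel_index(np.argmax(frames[i]), (h, w))
        shifted = np.roll(frames[i], (h // 2 - py, w // 2 - px), axis=(0, 1))
        stack += shifted
    return stack / n_keep

# Tiny demo: 20 noisy frames with a randomly wandering point source,
# mimicking speckle motion between short exposures.
rng = np.random.default_rng(0)
frames = []
for _ in range(20):
    img = rng.normal(0, 0.1, (32, 32))
    y, x = rng.integers(8, 24, size=2)
    img[y, x] += rng.uniform(1.0, 5.0)   # the shifting "star"
    frames.append(img)
result = lucky_stack(frames, keep_fraction=0.25)
```

A real pipeline would use sub-pixel registration and a more robust sharpness estimator, but the selection-and-shift-and-add structure is the same.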
Safeguards by Design Challenge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alwin, Jennifer Louise
The International Atomic Energy Agency (IAEA) defines safeguards as a system of inspection and verification of the peaceful uses of nuclear materials as part of the Nuclear Nonproliferation Treaty. The IAEA oversees safeguards worldwide. Safeguards by Design (SBD) involves incorporating safeguards technologies, techniques, and instrumentation during the design phase of a facility, rather than after the fact. The design challenge goals are the following: design a system of safeguards technologies, techniques, and instrumentation for inspection and verification of the peaceful uses of nuclear materials. Cost should be minimized to work within the IAEA's limited budget. Dose to workers should always be as low as reasonably achievable (ALARA). Time is of the essence in operating facilities, and the flow of material should not be interrupted significantly. Proprietary process information in facilities may need to be protected, so the amount of information obtained by inspectors should be the minimum required to achieve the measurement goal. Three different design challenges are then detailed: Plutonium Waste Item Measurement System, Marine-based Modular Reactor, and Floating Nuclear Power Plant (FNPP).
Flat-plate photovoltaic array design optimization
NASA Technical Reports Server (NTRS)
Ross, R. G., Jr.
1980-01-01
An analysis is presented which integrates the results of specific studies in the areas of photovoltaic structural design optimization, optimization of array series/parallel circuit design, thermal design optimization, and optimization of environmental protection features. The analysis is based on minimizing the total photovoltaic system life-cycle energy cost including repair and replacement of failed cells and modules. This approach is shown to be a useful technique for array optimization, particularly when time-dependent parameters such as array degradation and maintenance are involved.
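The life-cycle approach described above can be illustrated with a toy calculation: compare designs by total discounted cost per discounted kilowatt-hour delivered, including repair spending and array degradation over time. All numbers below are invented for illustration, not taken from the report.

```python
# Hedged sketch of life-cycle energy-cost comparison for PV designs.
# Parameter values are invented for illustration only.
def lifecycle_energy_cost(capital, annual_repair, annual_kwh,
                          degradation, years=30, discount=0.05):
    cost = float(capital)
    energy = 0.0
    for t in range(1, years + 1):
        d = (1 + discount) ** t
        cost += annual_repair / d                      # repairs/replacement
        energy += annual_kwh * (1 - degradation) ** t / d  # degraded output
    return cost / energy        # $ per discounted kWh

# Compare two hypothetical module designs:
cheap = lifecycle_energy_cost(capital=1000, annual_repair=40,
                              annual_kwh=1500, degradation=0.02)
durable = lifecycle_energy_cost(capital=1400, annual_repair=10,
                                annual_kwh=1500, degradation=0.005)
```

With these invented numbers, the higher capital cost of the durable design is repaid by lower repair spending and slower degradation, which is exactly the kind of trade-off the life-cycle metric makes visible.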
Design of Composite Structures for Reliability and Damage Tolerance
NASA Technical Reports Server (NTRS)
Rais-Rohani, Masoud
1999-01-01
A summary of research conducted during the first year is presented. The research objectives were sought by conducting two tasks: (1) investigation of probabilistic design techniques for reliability-based design of composite sandwich panels, and (2) examination of strain energy density failure criterion in conjunction with response surface methodology for global-local design of damage tolerant helicopter fuselage structures. This report primarily discusses the efforts surrounding the first task and provides a discussion of some preliminary work involving the second task.
Mechanical design of a power-adjustable spectacle lens frame.
Zapata, Asuncion; Barbero, Sergio
2011-05-01
Power-adjustable spectacle lenses, based on the Alvarez-Lohmann principle, can provide affordable spectacles for the measurement and correction of subjective refractive errors. A new mechanical frame has been designed to maximize the advantages of this technology. The design includes a mechanism to match the interpupillary distance with that of the optical centers of the lenses. The frame can be manufactured using low-cost plastic injection molding techniques. A prototype has been built to test the functioning of this mechanical design.
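As a rough illustration of the Alvarez-Lohmann principle behind such lenses: two complementary plates with cubic thickness profile t(x, y) = A(xy^2 + x^3/3), displaced laterally by +/-d, combine into a paraboloid, i.e. a thin spherical lens whose power grows linearly with the displacement. Under the usual thin-lens approximation this gives P = 4(n-1)Ad. The coefficient values below are assumptions for illustration, not the paper's design parameters.

```python
# Sketch of the Alvarez-Lohmann variable-power principle.
# Parameter values are illustrative, not the paper's design.
def alvarez_power(A, d, n=1.5):
    """Approximate power (diopters) of an Alvarez plate pair:
    P = 4 * (n - 1) * A * d, with the cubic coefficient A in 1/m^2
    and relative lateral shift d in meters (thin-lens approximation)."""
    return 4.0 * (n - 1.0) * A * d

# A plate coefficient of 1000 m^-2 and a 1 mm relative shift:
P = alvarez_power(A=1000.0, d=1e-3)
```

The linear power-versus-shift relation is what makes a simple sliding mechanism in the frame sufficient to tune the prescription continuously.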
Design reuse experience of space and hazardous operations robots
NASA Technical Reports Server (NTRS)
Oneil, P. Graham
1994-01-01
A comparison of design drivers for space and hazardous nuclear waste operating robots details similarities and differences in operations, performance and environmental parameters for these critical environments. The similarities are exploited to provide low risk system components based on reuse principles and design knowledge. Risk reduction techniques are used for bridging areas of significant differences. As an example, risk reduction of a new sensor design for nuclear environment operations is employed to provide upgradeable replacement units in a reusable architecture for significantly higher levels of radiation.
Chen, Patty H; White, Charles E
2006-01-01
This study compared rabbit rectal thermometry with 4 other thermometry techniques: an implantable microchip temperature transponder, an environmental noncontact infrared thermometer, a tympanic infrared thermometer designed for use on humans, and a tympanic infrared thermometer designed for use on animals. The microchip transponder was implanted between the shoulder blades; the environmental noncontact infrared thermometer recorded results from the base of the right pinna and the left inner thigh, and the tympanic infrared thermometer temperatures were taken from the right ear. Results from each technique were compared with the rectal temperature to determine agreement between the test modality and the reference. The practicality and reliability of the modalities were also reviewed. According to this study, the implantable microchip transponder measurements agreed most closely with the rectal temperature.
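A standard way to quantify the kind of agreement this study examines is a Bland-Altman analysis: the mean bias between a test method and the reference, plus 95% limits of agreement. The temperature readings below are invented for illustration, not data from the study.

```python
# Bland-Altman style agreement check between a test thermometry method
# and a reference (rectal) temperature. Readings are invented.
import statistics

def bland_altman(reference, test):
    diffs = [t - r for r, t in zip(reference, test)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    # 95% limits of agreement: bias +/- 1.96 * SD of the differences.
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

rectal =    [39.1, 39.4, 38.8, 39.0, 39.6, 39.2]   # hypothetical, deg C
microchip = [39.0, 39.5, 38.7, 39.1, 39.5, 39.1]   # hypothetical, deg C
bias, (lo, hi) = bland_altman(rectal, microchip)
```

A narrow limits-of-agreement interval around a bias near zero is what "agreed most closely with the rectal temperature" would look like numerically.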
Protocol Design Challenges in the Detection of Awareness in Aware Subjects Using EEG Signals.
Henriques, J; Gabriel, D; Grigoryeva, L; Haffen, E; Moulin, T; Aubry, R; Pazart, L; Ortega, J-P
2016-10-01
Recent studies have revealed serious difficulties in detecting covert awareness with electroencephalography-based techniques, both in unresponsive patients and in healthy control subjects. This work reproduces the protocol design of two recent mental imagery studies with a larger group comprising 20 healthy volunteers. The main goal is to assess whether modifications in the signal extraction techniques, the training-testing/cross-validation routines, and the hypotheses evoked in the statistical analysis can resolve the serious difficulties documented in the literature. The lack of robustness in the results argues for a further search for alternative protocols more suitable for machine learning classification and for better-performing signal treatment techniques. Specific recommendations are made using the findings in this work. © EEG and Clinical Neuroscience Society (ECNS) 2014.
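A leakage-free training-testing/cross-validation routine of the kind such studies depend on can be sketched as follows. The nearest-centroid classifier and the synthetic features are illustrative stand-ins, not the study's actual signal extraction or classification methods.

```python
# Sketch of k-fold cross-validation with all model fitting confined to
# the training fold -- the discipline EEG classification studies need.
# Classifier and data are illustrative stand-ins.
import numpy as np

def kfold_accuracy(X, y, k=5, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        # Fit (here: class centroids) on training data only; estimating
        # anything from the test fold would leak information.
        c0 = X[train][y[train] == 0].mean(axis=0)
        c1 = X[train][y[train] == 1].mean(axis=0)
        d0 = np.linalg.norm(X[test] - c0, axis=1)
        d1 = np.linalg.norm(X[test] - c1, axis=1)
        pred = (d1 < d0).astype(int)
        accs.append(np.mean(pred == y[test]))
    return float(np.mean(accs))

# Synthetic two-class "features": well-separated clusters, so a sound
# protocol should score far above the 0.5 chance level.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(3, 1, (50, 4))])
y = np.array([0] * 50 + [1] * 50)
acc = kfold_accuracy(X, y)
```

The robustness question the abstract raises is, in this framing, whether accuracies stay significantly above chance across folds, subjects, and protocol variations.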
Development of dry coal feeders
NASA Technical Reports Server (NTRS)
Bonin, J. H.; Cantey, D. E.; Daniel, A. D., Jr.; Meyer, J. W.
1977-01-01
Design and fabrication of equipment to feed coal into pressurized environments were investigated. Concepts were selected based on feeder system performance and economic projections. These systems include two approaches using rotating components, a gas- or steam-driven ejector, and a modified standpipe feeder concept. Results of development testing of critical components, design procedures, and performance prediction techniques are reviewed.
Craig W. Johnson; Susan Buffler
2008-01-01
Intermountain West planners, designers, and resource managers are looking for science-based procedures for determining buffer widths and management techniques that will optimize the benefits riparian ecosystems provide. This study reviewed the riparian buffer literature, including protocols used to determine optimum buffer widths for water quality and wildlife habitat...
Identification of Highly Prized Commercial Fish Using a PCR-Based Methodology
ERIC Educational Resources Information Center
Moran, Paloma; Garcia-Vasquez, Eva
2006-01-01
We report a practical class designed for undergraduate students of Marine Sciences as a part of the Genetics course. The class can also be included in undergraduate studies of food technology. The exercise was designed to emphasize the application of molecular biology techniques to fish species authentication and traceability. After a simple and…
A Psychoeducational School-Based Group Intervention for Socially Anxious Children
ERIC Educational Resources Information Center
Vassilopoulos, Stephanos P.; Brouzos, Andreas; Damer, Diana E.; Mellou, Angeliki; Mitropoulou, Alexandra
2013-01-01
This study investigated the impact of a psychoeducational group for social anxiety aimed at elementary children. An 8-week psychoeducational program based on empirically validated risk factors was designed. Interventions included cognitive restructuring, anxiety management techniques, and social skills training. Pre-and posttest data from 3 groups…
A 13-Week Research-Based Biochemistry Laboratory Curriculum
ERIC Educational Resources Information Center
Lefurgy, Scott T.; Mundorff, Emily C.
2017-01-01
Here, we present a 13-week research-based biochemistry laboratory curriculum designed to provide the students with the experience of engaging in original research while introducing foundational biochemistry laboratory techniques. The laboratory experience has been developed around the directed evolution of an enzyme chosen by the instructor, with…
ERIC Educational Resources Information Center
Verschaffel, Lieven; Van Dooren, W.; Star, J.
2017-01-01
This special issue comprises contributions that address the breadth of current lines of recent research from cognitive psychology that appear promising for positively impacting students' learning of mathematics. More specifically, we included contributions (a) that refer to cognitive psychology based principles and techniques, such as explanatory…
Unraveling Evidence-Based Practices in Special Education
ERIC Educational Resources Information Center
Cook, Bryan G.; Cook, Sara Cothren
2013-01-01
Evidence-based practices (EBPs) are instructional techniques that meet prescribed criteria related to the research design, quality, quantity, and effect size of supporting research, which have the potential to help bridge the research-to-practice gap and improve student outcomes. In this article, the authors (a) discuss the importance of clear…
Building lab-scale x-ray tube based irradiators
USDA-ARS?s Scientific Manuscript database
The construction of economical x-ray tube based irradiators in a variety of configurations is described using 1000 Watt x-ray tubes. Single tube, double tube, and four tube designs are described, as well as various cabinet construction techniques. Relatively high dose rates were achieved for small s...
Development of Multiobjective Optimization Techniques for Sonic Boom Minimization
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.
1996-01-01
A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes solver. Aerodynamic design sensitivities for high-speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high-speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables, and a hybrid optimization technique based on a simulated annealing algorithm for discrete design variables, have been used to solve the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions.
The multidisciplinary design optimization procedures for high-speed wing-body configurations simultaneously improve the aerodynamic, sonic boom, and structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing's load-carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin-wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
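The Kreisselmeier-Steinhauser (KS) function mentioned above folds several objectives or constraints into one smooth, differentiable envelope that nonlinear programming methods can handle. A minimal sketch, with invented objective values:

```python
# Sketch of Kreisselmeier-Steinhauser (KS) aggregation of several
# objective/constraint values f_i into one smooth envelope:
#   KS(f) = f_max + (1/rho) * ln( sum_i exp(rho * (f_i - f_max)) )
# Shifting by f_max keeps the exponentials numerically well-behaved.
# The objective values below are invented for illustration.
import math

def ks(values, rho=50.0):
    f_max = max(values)
    s = sum(math.exp(rho * (f - f_max)) for f in values)
    return f_max + math.log(s) / rho

objectives = [0.8, 1.2, 1.1]   # e.g. normalized drag, boom, weight metrics
aggregate = ks(objectives)
```

By construction KS lies between max(f_i) and max(f_i) + ln(n)/rho, so as the draw-down factor rho grows the envelope tightens onto the worst objective while staying differentiable, which is the property the multiobjective formulation exploits.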
NASA Technical Reports Server (NTRS)
Coen, Peter G.
1991-01-01
A new computer technique for the analysis of transport aircraft sonic boom signature characteristics was developed. This new technique, based on linear theory methods, combines the previously separate equivalent area and F function development with a signature propagation method using a single geometry description. The new technique was implemented in a stand-alone computer program and was incorporated into an aircraft performance analysis program. Through these implementations, both configuration designers and performance analysts are given new capabilities to rapidly analyze an aircraft's sonic boom characteristics throughout the flight envelope.
Ultrasonic Array for Obstacle Detection Based on CDMA with Kasami Codes
Diego, Cristina; Hernández, Álvaro; Jiménez, Ana; Álvarez, Fernando J.; Sanz, Rebeca; Aparicio, Joaquín
2011-01-01
This paper presents the design of an ultrasonic array for obstacle detection based on Phased Array (PA) techniques, which steers the acoustic beam through the environment electronically rather than mechanically. The transmission of every element in the array has been encoded according to Code Division for Multiple Access (CDMA), which allows multiple beams to be transmitted simultaneously. Together, these features enable a parallel scanning system that not only improves the image rate but also achieves longer inspection distances in comparison with conventional PA techniques. PMID:22247675
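The CDMA principle behind the array can be illustrated with a matched-filter sketch: each emitter transmits its own code, and cross-correlating the received mixture against each code recovers that emitter's echo delay even though the transmissions overlap. Random bipolar codes stand in here for the Kasami codes used in the paper.

```python
# Matched-filter sketch of CDMA echo separation. Random bipolar codes
# stand in for Kasami codes; delays and noise level are illustrative.
import numpy as np

rng = np.random.default_rng(42)
L = 255
code_a = rng.choice([-1.0, 1.0], size=L)
code_b = rng.choice([-1.0, 1.0], size=L)

# Received signal: both coded emissions overlap at different delays,
# plus additive noise -- a simultaneous two-beam transmission.
rx = np.zeros(600)
rx[50:50 + L] += code_a
rx[120:120 + L] += code_b
rx += rng.normal(0, 0.3, rx.size)

# Matched filtering: each correlation peak marks that code's delay,
# i.e. the time of flight of the corresponding echo.
corr_a = np.correlate(rx, code_a, mode="valid")
corr_b = np.correlate(rx, code_b, mode="valid")
delay_a = int(np.argmax(corr_a))
delay_b = int(np.argmax(corr_b))
```

Kasami codes are preferred in practice because their cross-correlation sidelobes are provably bounded, whereas random codes only keep them small on average.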
A High-Level Language for Modeling Algorithms and Their Properties
NASA Astrophysics Data System (ADS)
Akhtar, Sabina; Merz, Stephan; Quinson, Martin
Designers of concurrent and distributed algorithms usually express them using pseudo-code. In contrast, most verification techniques are based on more mathematically oriented formalisms such as state transition systems. This conceptual gap hinders the use of formal verification techniques. Leslie Lamport introduced PlusCal, a high-level algorithmic language that has the "look and feel" of pseudo-code but is equipped with a precise semantics and includes a high-level expression language based on set theory. PlusCal models can be compiled to TLA+ and verified using the TLC model checker.
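The state-transition formalism that tools like TLC operate on can be illustrated with a tiny explicit-state reachability check; this generic sketch is neither PlusCal nor TLC, just the underlying idea of exhaustively exploring reachable states and testing an invariant in each.

```python
# Generic explicit-state invariant check over a toy state transition
# system (two processes with a guarded mutual-exclusion protocol).
# Illustrative only -- not PlusCal or the TLC model checker.
from collections import deque

def next_states(state):
    """Each process cycles idle -> trying -> critical -> idle, but may
    enter 'critical' only if the other process is not already there."""
    moves = {"idle": "trying", "trying": "critical", "critical": "idle"}
    for i in (0, 1):
        nxt = moves[state[i]]
        if nxt == "critical" and state[1 - i] == "critical":
            continue                      # guarded transition
        yield state[:i] + (nxt,) + state[i + 1:]

def check(init, invariant):
    """Breadth-first exploration of all reachable states; returns
    (True, None) if the invariant holds everywhere, else (False, s)
    with a violating state s."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if not invariant(s):
            return False, s
        for t in next_states(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True, None

mutex = lambda s: s != ("critical", "critical")
ok, counterexample = check(("idle", "idle"), mutex)
```

A PlusCal model of the same protocol would express the guard declaratively and let TLC perform an equivalent (but far more scalable) exploration, which is exactly the gap between pseudo-code and verifiable formalism the paragraph describes.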