NASA Technical Reports Server (NTRS)
Young, G.
1982-01-01
A design methodology capable of dealing with nonlinear systems containing parameter uncertainty, such as a controlled ecological life support system (CELSS), is discussed. The methodology was applied to the design of discrete-time nonlinear controllers, which can be used to control either linear or nonlinear systems. Several controller strategies are presented to illustrate the design procedure.
Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD) Manual v.1.2
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Directed design of experiments for validating the probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology, implemented in software, that serves as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies on observed occurrences rather than assumed distributions. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal-amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches are not required to generate a POD curve. Applications of DOEPOD to support inspector qualification are also included.
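As a concrete illustration of the 90/95 criterion (not the DOEPOD software itself), the sketch below computes an exact one-sided lower confidence bound on observed POD from hit-miss data; the function name and the 29-of-29 example are illustrative.

```python
# Minimal sketch of the 90/95 POD criterion on hit/miss data using a
# one-sided Clopper-Pearson (exact binomial) lower confidence bound.
from scipy.stats import beta

def pod_lower_bound(hits: int, trials: int, confidence: float = 0.95) -> float:
    """Exact one-sided lower confidence bound on POD for hits out of trials."""
    if hits == 0:
        return 0.0
    return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

# Classic result: 29 hits in 29 trials just meets the 90/95 criterion.
lb = pod_lower_bound(29, 29)
print(f"observed POD = {29/29:.3f}, 95% lower bound = {lb:.4f}")
print("meets 90/95 POD" if lb > 0.90 else "does not meet 90/95 POD")
```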
Haptic Technologies for MEMS Design
NASA Astrophysics Data System (ADS)
Calis, Mustafa; Desmulliez, Marc P. Y.
2006-04-01
This paper presents, for the first time, a design methodology for MEMS/NEMS based on haptic sensing technologies. The software tool created as a result of this methodology will enable designers to model and interact in real time with their virtual prototypes. One of the main advantages of haptic sensing is the ability to bring unusual microscopic forces back to the designer's world. Other significant benefits of developing such a methodology include gains in productivity and the capability to include manufacturing costs within the design cycle.
NASA Astrophysics Data System (ADS)
Tangen, Steven Anthony
Due to the complexities of modern military operations and the technologies employed on today's military systems, acquisition costs and development times are becoming increasingly large. Meanwhile, the transformation of the global security environment is driving the U.S. military's own transformation. In order to meet the required capabilities of the next generation without buying prohibitively costly new systems, it is necessary for the military to evolve across the spectrum of doctrine, organization, training, materiel, leadership and education, personnel, and facilities (DOTMLPF). However, the methods for analyzing DOTMLPF approaches within the early acquisition phase of a capability-based assessment (CBA) are not as well established as the traditional technology design techniques. This makes it difficult for decision makers to decide whether investments should be made in materiel or non-materiel solutions. This research develops an agent-based constructive simulation to quantitatively assess doctrine alongside materiel approaches. Additionally, life-cycle cost techniques are provided to enable a cost-effectiveness trade. These techniques are wrapped together in a decision-making environment that brings crucial information forward so informed and appropriate acquisition choices can be made. The methodology is tested on a future unmanned aerial vehicle design problem. Through the implementation of this quantitative methodology on the proof-of-concept study, it is shown that doctrinal changes, including fleet composition, asset allocation, and patrol pattern, were capable of dramatic improvements in system effectiveness at a much lower cost than the incorporation of candidate technologies. Additionally, this methodology was able to quantify the precise nature of strong doctrine-doctrine and doctrine-technology interactions which have been observed only qualitatively throughout military history. This dissertation outlines the methodology and demonstrates how potential approaches to capability gaps can be identified with respect to effectiveness, cost, and time. When implemented, this methodology offers the opportunity to achieve system capabilities in a new way, improve the design of acquisition programs, and field the right combination of ways and means to address future challenges to national security.
NASA Astrophysics Data System (ADS)
Huang, Xiao
2006-04-01
Today's and especially tomorrow's competitive launch vehicle design environment requires the development of a dedicated generic Space Access Vehicle (SAV) design methodology. A total of 115 industrial, research, and academic aircraft, helicopter, missile, and launch vehicle design synthesis methodologies have been evaluated. As the survey indicates, each synthesis methodology tends to focus on a specific flight vehicle configuration, thus precluding the key capability to systematically compare flight vehicle design alternatives. The aim of the research investigation is to provide decision-making bodies and practicing engineers with a design process and toolbox for robust modeling and simulation of flight vehicles where the ultimate performance characteristics may hinge on numerical subtleties. This will enable the designer of a SAV for the first time to consistently compare different classes of SAV configurations on an impartial basis. This dissertation presents the development steps required towards a generic (configuration-independent) hands-on flight vehicle conceptual design synthesis methodology. This process is developed such that it can be applied to any flight vehicle class if desired. In the present context, the methodology has been put into operation for the conceptual design of a tourist Space Access Vehicle. The case study illustrates elements of the design methodology and algorithm for the class of Horizontal Takeoff and Horizontal Landing (HTHL) SAVs. The HTHL SAV design application clearly outlines how the conceptual design process can be centrally organized, executed, and documented with a focus on design transparency, physical understanding, and the capability to reproduce results. This approach offers the project lead and creative design team a management process and tool which iteratively refines the individual design logic chosen, leading to mature design methods and algorithms. As illustrated, the HTHL SAV hands-on design methodology offers growth potential in that the same methodology can be continually updated and extended to other SAV configuration concepts, such as the Vertical Takeoff and Vertical Landing (VTVL) SAV class. Having developed, validated, and calibrated the methodology for HTHL designs in the 'hands-on' mode, the report provides an outlook on how the methodology will be integrated into a prototype computerized design synthesis software, AVDS-PrADOSAV, in a follow-on step.
[Strengthening the methodology of study designs in scientific research].
Ren, Ze-qin
2010-06-01
Many problems in study designs have seriously affected the validity of scientific research. We must understand the methodology of research, especially clinical epidemiology and biostatistics, and recognize the urgency of selecting and implementing the right study design. Only then can we strengthen research capability and improve the overall quality of scientific research.
Electronic Design Automation: Integrating the Design and Manufacturing Functions
NASA Technical Reports Server (NTRS)
Bachnak, Rafic; Salkowski, Charles
1997-01-01
As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.
NASA Technical Reports Server (NTRS)
1979-01-01
Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support design and implementation of coal-based synthetic fuels complexes are identified. The potential market in the southeastern United States for coal-based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for development of the technology and data needed to improve gasification feasibility and economics are examined.
An engineering methodology for implementing and testing VLSI (Very Large Scale Integrated) circuits
NASA Astrophysics Data System (ADS)
Corliss, Walter F., II
1989-03-01
The engineering methodology for producing a fully tested VLSI chip from a design layout is presented. A 16-bit correlator, NPS CORN88, that had previously been designed was used as a vehicle to demonstrate this methodology. The study of the design and simulation tools, MAGIC and MOSSIM II, was the focus of the design and validation process. The design was then implemented and the chip was fabricated by MOSIS. The fabricated chip was then used to develop a testing methodology for using the digital test facilities at NPS. NPS CORN88 was the first full-custom VLSI chip designed at NPS to be tested with the NPS digital analysis system, the Tektronix DAS 9100 series tester. The capabilities and limitations of these test facilities are examined. NPS CORN88 test results are included to demonstrate the capabilities of the digital test system. A translator, MOS2DAS, was developed to convert the MOSSIM II simulation program to the input files required by the DAS 9100 device verification software, 91DVS. Finally, a tutorial for using the digital test facilities, including the DAS 9100 and associated support equipment, is included as an appendix.
Research and development activities in unified control-structure modeling and design
NASA Technical Reports Server (NTRS)
Nayak, A. P.
1985-01-01
Results of work to develop a unified control/structures modeling and design capability for large space structures are presented. Recent analytical results are presented to demonstrate the significant interdependence between structural and control properties. A new design methodology is suggested in which the structure, material properties, dynamic model, and control design are all optimized simultaneously. Parallel research done by other researchers is reviewed. The development of a methodology for global design optimization is recommended as a long-term goal. It is suggested that this methodology should be incorporated into computer-aided engineering programs, which eventually will be supplemented by an expert system to aid design optimization.
Connecting Curriculum, Capabilities and Careers
ERIC Educational Resources Information Center
Thomas, Ian; Depasquale, James
2016-01-01
Purpose: The reported research aims to examine the extent to which sustainability capabilities have been delivered by a specific example of Education for Sustainability (EfS) and Education for Sustainable Development (ESD), and how important the capabilities have been in the workplace. Design/methodology/approach: Students who participated in an…
Threshold Capability Development in Intensive Mode Business Units
ERIC Educational Resources Information Center
Crispin, Stuart; Hancock, Phil; Male, Sally Amanda; Baillie, Caroline; MacNish, Cara; Leggoe, Jeremy; Ranmuthugala, Dev; Alam, Firoz
2016-01-01
Purpose: The purpose of this paper is to explore: student perceptions of threshold concepts and capabilities in postgraduate business education, and the potential impacts of intensive modes of teaching on student understanding of threshold concepts and development of threshold capabilities. Design/Methodology/Approach: The student experience of…
Experimental Methodology for Measuring Combustion and Injection-Coupled Responses
NASA Technical Reports Server (NTRS)
Cavitt, Ryan C.; Frederick, Robert A.; Bazarov, Vladimir G.
2006-01-01
A Russian scaling methodology for liquid rocket engines utilizing a single, full-scale element is reviewed. The scaling methodology exploits the supercritical phase of the full-scale propellants to simplify scaling requirements. Many assumptions are utilized in the derivation of the scaling criteria. A test apparatus design is presented to implement the Russian methodology and consequently verify the assumptions. This test apparatus will allow researchers to assess the usefulness of the scaling procedures and possibly enhance the methodology. A matrix of the apparatus capabilities for an RD-170 injector is also presented. Several methods to enhance the methodology were generated through the design process.
Tungsten fiber reinforced superalloy composite high temperature component design considerations
NASA Technical Reports Server (NTRS)
Winsa, E. A.
1982-01-01
Tungsten fiber reinforced superalloy composites (TFRS) are intended for use in high-temperature turbine components. Current turbine component design methodology is based on applying the experience, sometimes semiempirical, gained from over 30 years of superalloy component design. Current composite component design capability is generally limited to the methodology for low-temperature resin matrix composites. Often the tendency is to treat TFRS as just another superalloy or low-temperature composite. However, TFRS behavior is significantly different from that of superalloys, and the high-temperature environment adds considerations not common in low-temperature composite component design. The methodology used for the preliminary design of TFRS components is described. Considerations unique to TFRS are emphasized.
NASA Technical Reports Server (NTRS)
Kimmel, William M. (Technical Monitor); Bradley, Kevin R.
2004-01-01
This paper describes the development of a methodology for sizing Blended-Wing-Body (BWB) transports and how the capabilities of the Flight Optimization System (FLOPS) have been expanded using that methodology. In this approach, BWB transports are sized based on the number of passengers in each class that must fit inside the centerbody or pressurized vessel. Weight estimation equations for this centerbody structure were developed using Finite Element Analysis (FEA). This paper shows how the sizing methodology has been incorporated into FLOPS to enable the design and analysis of BWB transports. Previous versions of FLOPS did not have the ability to accurately represent or analyze BWB configurations in any reliable, logical way. The expanded capabilities allow the design and analysis of a 200 to 450-passenger BWB transport or the analysis of a BWB transport for which the geometry is already known. The modifications to FLOPS resulted in differences of less than 4 percent for the ramp weight of a BWB transport in this range when compared to previous studies performed by NASA and Boeing.
Development of a weight/sizing design synthesis computer program. Volume 1: Program formulation
NASA Technical Reports Server (NTRS)
Garrison, J. M.
1973-01-01
The development of a weight/sizing design synthesis methodology for use in support of the main-line space shuttle program is discussed. The methodology has a minimum number of data inputs and quick-turnaround capability. The methodology makes it possible to: (1) make weight comparisons between current shuttle configurations and proposed changes, (2) determine the effects of various subsystem trades on total system weight, and (3) determine the effects of weight on performance and performance on weight.
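The interplay of items (2) and (3), weight driving performance and performance driving weight back, is the classic sizing fixed point. A minimal sketch of such an iteration is given below; all fractions and weights are hypothetical placeholders, not shuttle data.

```python
# Hedged sketch of a weight/sizing fixed-point iteration: subsystem weights
# scale with gross weight, and required propellant (a performance effect)
# scales with gross weight too, so the sizing closes by iteration.
# All coefficients are illustrative, not shuttle values.
def size_vehicle(payload_lb: float,
                 structure_fraction: float = 0.18,    # hypothetical
                 propellant_fraction: float = 0.55,   # hypothetical
                 fixed_equipment_lb: float = 12_000,  # hypothetical
                 tol: float = 1.0) -> float:
    gross = payload_lb * 4.0  # initial guess
    while True:
        new_gross = (payload_lb + fixed_equipment_lb
                     + structure_fraction * gross
                     + propellant_fraction * gross)
        if abs(new_gross - gross) < tol:
            return new_gross
        gross = new_gross

print(f"converged gross weight: {size_vehicle(30_000):,.0f} lb")
```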
ERIC Educational Resources Information Center
Palos, Ramona; Veres Stancovici, Vesna
2016-01-01
Purpose: This study aims at identifying the presence of the dimensions of learning capabilities and the characteristics of a learning organization within two companies in the field of services, as well as identifying the relationships between their learning capability and the organizational culture. Design/methodology/approach: This has been a…
Formulation of a parametric systems design framework for disaster response planning
NASA Astrophysics Data System (ADS)
Mma, Stephanie Weiya
The occurrence of devastating natural disasters in the past several years has prompted communities, responding organizations, and governments to seek ways to improve disaster preparedness capabilities locally, regionally, nationally, and internationally. A holistic approach to design used in the aerospace and industrial engineering fields enables efficient allocation of resources through applied parametric changes within a particular design to improve performance metrics to selected standards. In this research, this methodology is applied to disaster preparedness, using a community's time to restoration after a disaster as the response metric. A review of the responses to Hurricane Katrina and the 2010 Haiti earthquake, among other prominent disasters, provides observations leading to some current capability benchmarking. A need for holistic assessment and planning exists for communities, but the current response planning infrastructure lacks a standardized framework and standardized assessment metrics. Within the humanitarian logistics community, several different metrics exist, enabling quantification and measurement of a particular area's vulnerability. These metrics, combined with design and planning methodologies from related fields, such as engineering product design, military response planning, and business process redesign, provide insight and a framework from which to begin developing a methodology to enable holistic disaster response planning. The developed methodology was applied to the communities of Shelby County, TN, and pre-Hurricane-Katrina Orleans Parish, LA. Available literature and reliable media sources provided the values of system parameters within the decomposition of the community aspects and information about relationships among the parameters. The community was modeled as a system dynamics model and was tested in the implementation of two-, five-, and ten-year improvement plans for Preparedness, Response, and Development capabilities, and combinations of these capabilities. For Shelby County and for Orleans Parish, the Response improvement plan reduced restoration time the most. For the combined capabilities, Shelby County experienced the greatest reduction in restoration time with the implementation of Development and Response capability improvements, and for Orleans Parish it was the Preparedness and Response capability improvements. Optimization of restoration time with community parameters was tested using a Particle Swarm Optimization algorithm. Fifty different optimized restoration times were generated using the Particle Swarm Optimization algorithm and ranked using the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). The optimization results indicate that the greatest reduction in restoration time for a community is achieved with a particular combination of different parameter values rather than by maximizing each parameter.
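For readers unfamiliar with the ranking step, the sketch below implements a generic TOPSIS closeness calculation over hypothetical optimizer outputs; the criteria, weights, and scores are illustrative assumptions, not values from the dissertation.

```python
import numpy as np

def topsis(decision_matrix: np.ndarray, weights: np.ndarray,
           benefit: np.ndarray) -> np.ndarray:
    """Rank alternatives by closeness to the ideal solution (TOPSIS).

    decision_matrix: alternatives x criteria scores
    weights: criterion weights summing to 1
    benefit: True where larger is better, False where smaller is better
    """
    # Vector-normalize each criterion column, then apply weights.
    norm = decision_matrix / np.linalg.norm(decision_matrix, axis=0)
    v = norm * weights
    # Ideal and anti-ideal points, respecting each criterion's direction.
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)  # closeness in [0, 1]; larger is better

# Hypothetical optimizer outputs: columns = restoration time (minimize),
# plan cost (minimize), preparedness score (maximize).
candidates = np.array([[120.0, 4.0, 0.7],
                       [150.0, 2.5, 0.8],
                       [100.0, 5.0, 0.6]])
closeness = topsis(candidates, np.array([0.5, 0.3, 0.2]),
                   np.array([False, False, True]))
print("ranking (best first):", np.argsort(-closeness))
```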
Rivera, José; Carrillo, Mariano; Chacón, Mario; Herrera, Gilberto; Bojorquez, Gilberto
2007-01-01
The development of smart sensors involves the design of reconfigurable systems capable of working with different input sensors. Reconfigurable systems should ideally spend the least possible amount of time on their calibration. An autocalibration algorithm for intelligent sensors should be able to correct major problems such as offset, gain variation, and lack of linearity as accurately as possible. This paper describes a new autocalibration methodology for nonlinear intelligent sensors based on artificial neural networks (ANN). The methodology involves analysis of several network topologies and training algorithms. The proposed method was compared against piecewise and polynomial linearization methods. The comparison was carried out using different numbers of calibration points and several nonlinearity levels of the input signal. The paper also shows that the proposed method achieved better overall accuracy than the other two methods. In addition to the experimental results and analysis of the complete study, the paper describes the implementation of the ANN in a microcontroller unit (MCU). To illustrate the method's capability to build autocalibrated and reconfigurable systems, a temperature measurement system was designed and tested. The proposed method improves on classic autocalibration methodologies because it impacts the design process of intelligent sensors, autocalibration methodologies, and their associated factors, such as time and cost.
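A minimal sketch of the underlying idea, training a small ANN to map raw nonlinear readings back to true values, is shown below; the sensor model, network size, and calibration points are illustrative assumptions, not the paper's experimental setup.

```python
# Hedged sketch: train a small ANN to invert a sensor's offset, gain error,
# and nonlinearity using a few calibration points. Sensor model and network
# topology here are illustrative, not the paper's.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
true_temp = np.linspace(0.0, 100.0, 25)              # calibration points
raw = 0.3 + 0.9 * true_temp + 0.002 * true_temp**2   # offset+gain+nonlinearity
raw += rng.normal(0.0, 0.05, raw.shape)              # measurement noise

net = MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                   solver="lbfgs", max_iter=5000, random_state=0)
net.fit(raw.reshape(-1, 1), true_temp)               # raw reading -> true value

test_raw = np.array([[10.0], [50.0], [90.0]])
print("corrected readings:", net.predict(test_raw).round(2))
```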
NASA Technical Reports Server (NTRS)
1981-01-01
The software package evaluation was designed to analyze commercially available, field-proven production control and manufacturing resource planning software packages. The analysis was conducted by comparing SRB production control software requirements and the conceptual system design to software package capabilities. The methodology of evaluation and the findings at each stage of evaluation are described. Topics covered include: vendor listing; request for information (RFI) document; RFI response rate and quality; RFI evaluation process; and capabilities versus requirements.
Automated software development workstation
NASA Technical Reports Server (NTRS)
1986-01-01
Engineering software development was automated using an expert system (rule-based) approach. The use of this technology offers benefits not available from current software development and maintenance methodologies. A workstation was built with a library, or program database, with methods for browsing the stored designs; a system for graphical specification of designs, including a capability for hierarchical refinement and definition in a graphical design system; and an automated code generation capability in FORTRAN. The workstation was then used in a demonstration with examples from an attitude control subsystem design for the space station. Documentation and recommendations are presented.
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.
1994-01-01
This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN, which provides a unified environment for structural and control design.
Aero-Mechanical Design Methodology for Subsonic Civil Transport High-Lift Systems
NASA Technical Reports Server (NTRS)
vanDam, C. P.; Shaw, S. G.; VanderKam, J. C.; Brodeur, R. R.; Rudolph, P. K. C.; Kinney, D.
2000-01-01
In today's highly competitive and economically driven commercial aviation market, the trend is to make aircraft systems simpler and to shorten their design cycle, which reduces recurring, non-recurring, and operating costs. One such system is the high-lift system. A methodology has been developed which merges aerodynamic data with a kinematic analysis of the trailing-edge flap mechanism, requiring minimal mechanism definition. This methodology provides quick and accurate aerodynamic performance prediction for a given flap deployment mechanism early in the high-lift system preliminary design stage. Sample analysis results for four different deployment mechanisms are presented, as well as descriptions of the aerodynamic and mechanism data required for evaluation. Extensions to interactive design capabilities are also discussed.
Multidisciplinary analysis and design of printed wiring boards
NASA Astrophysics Data System (ADS)
Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin
1991-04-01
Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic structured methodology.
Cost-effectiveness methodology for computer systems selection
NASA Technical Reports Server (NTRS)
Vallone, A.; Bajaj, K. S.
1980-01-01
A new approach to the problem of selecting a computer system design has been developed. The purpose of this methodology is to identify a system design that is capable of fulfilling system objectives in the most economical way. The methodology characterizes each system design by the cost of the system life cycle and by the system's effectiveness in reaching objectives. Cost is measured by a 'system cost index' derived from an analysis of all expenditures and possible revenues over the system life cycle. Effectiveness is measured by a 'system utility index' obtained by combining the impact that each selection factor has on the system objectives, each assessed through a 'utility curve'. A preestablished algorithm combines cost and utility and provides a ranking of the alternative system designs, from which the 'best' design is selected.
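The sketch below illustrates the general scheme of combining a utility index with a cost index to rank alternatives; the factors, weights, and the simple utility/cost combining rule are placeholder assumptions, since the paper's preestablished algorithm is not reproduced here.

```python
# Illustrative sketch of the selection scheme: a utility index from weighted
# per-factor utility scores and a life-cycle cost index, combined by a simple
# ratio rule. Factors, weights, and the combining rule are placeholders.
designs = {
    "A": {"cost_index": 1.00, "factors": {"throughput": 0.80, "reliability": 0.90}},
    "B": {"cost_index": 0.85, "factors": {"throughput": 0.60, "reliability": 0.95}},
}
weights = {"throughput": 0.6, "reliability": 0.4}

def utility_index(factors: dict) -> float:
    # Each factor score is assumed already mapped through its utility curve to [0, 1].
    return sum(weights[k] * v for k, v in factors.items())

ranked = sorted(designs.items(),
                key=lambda kv: utility_index(kv[1]["factors"]) / kv[1]["cost_index"],
                reverse=True)
for name, d in ranked:
    u = utility_index(d["factors"])
    print(f"design {name}: utility={u:.2f}, cost={d['cost_index']:.2f}, "
          f"utility/cost={u / d['cost_index']:.2f}")
```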
Cyber-Informed Engineering: The Need for a New Risk Informed and Design Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Joseph Daniel; Anderson, Robert Stephen
Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary's capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator's knowledge. These threats can bypass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective where neither equipment nor human operation can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.
Research and development activities in unified control-structure modeling and design
NASA Technical Reports Server (NTRS)
Nayak, A. P.
1985-01-01
Results of work sponsored by JPL and other organizations to develop a unified control/structures modeling and design capability for large space structures are presented. Recent analytical results are presented to demonstrate the significant interdependence between structural and control properties. A new design methodology is suggested in which the structure, material properties, dynamic model, and control design are all optimized simultaneously. The development of a methodology for global design optimization is recommended as a long-term goal. It is suggested that this methodology should be incorporated into computer-aided engineering programs, which eventually will be supplemented by an expert system to aid design optimization. Recommendations are also presented for near-term research activities at JPL. The key recommendation is to continue the development of integrated dynamic modeling/control design techniques, with special attention given to the development of structural models specially tailored to support design.
NASA Technical Reports Server (NTRS)
Garg, Sanjay; Mattern, Duane
1994-01-01
An advanced methodology for integrated flight propulsion control (IFPC) design for future aircraft, which will use propulsion system generated forces and moments for enhanced maneuver capabilities, is briefly described. This methodology has the potential to address in a systematic manner the coupling between the airframe and the propulsion subsystems typical of such enhanced maneuverability aircraft. Application of the methodology to a short take-off vertical landing (STOVL) aircraft in the landing approach to hover transition flight phase is presented with brief description of the various steps in the IFPC design methodology. The details of the individual steps have been described in previous publications and the objective of this paper is to focus on how the components of the control system designed at each step integrate into the overall IFPC system. The full nonlinear IFPC system was evaluated extensively in nonreal-time simulations as well as piloted simulations. Results from the nonreal-time evaluations are presented in this paper. Lessons learned from this application study are summarized in terms of areas of potential improvements in the STOVL IFPC design as well as identification of technology development areas to enhance the applicability of the proposed design methodology.
OVERMODED HIGH-POWER RF MAGNETIC SWITCHES AND CIRCULATORS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tantawi, Sami
2002-08-20
We present a design methodology for active RF magnetic components which are suitable for pulse compression systems of future X-band linear colliders. These components comprise an array of active elements arranged together so that the total electromagnetic field is reduced and the power handling capabilities are increased. The active element of choice is a magnetic material (garnet), which can be switched by changing a biasing magnetic field. A novel design allows these components to operate in the low-loss circular waveguide mode TE01. We describe the design methodology, the switching elements, and the circuits.
Complementary Role of Organizational Learning Capability in New Service Development (NSD) Process
ERIC Educational Resources Information Center
Limpibunterng, Tharinee; Johri, Lalit M.
2009-01-01
Purpose: The purpose of this paper is to investigate the role of organizational learning capability in relation to leadership tasks performed by executives and organizational performance by bridging the concepts of organizational learning and NSD. Design/methodology/approach: The NSD processes of seven telecom service providers in Thailand are…
A design and implementation methodology for diagnostic systems
NASA Technical Reports Server (NTRS)
Williams, Linda J. F.
1988-01-01
A methodology for the design and implementation of diagnostic systems is presented. Also discussed are the advantages of embedding a diagnostic system in a host system environment. The methodology utilizes an architecture for diagnostic system development that is hierarchical and makes use of object-oriented representation techniques. Additionally, qualitative models are used to describe the host system components and their behavior. The architecture includes a diagnostic engine that utilizes heuristic knowledge to control the sequence of diagnostic reasoning. The methodology provides an integrated approach to developing diagnostic system requirements that is more rigorous than standard systems engineering techniques. The advantages of using this methodology during various life-cycle phases of host systems (e.g., the National Aerospace Plane (NASP)) include: the capability to analyze diagnostic instrumentation requirements during the host system design phase, a ready software architecture for implementing diagnostics in the host system, and the opportunity to analyze instrumentation for failure coverage in safety-critical host system operations.
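A toy sketch of the hierarchical, object-oriented diagnostic idea follows; the class structure and the match-or-descend rule are illustrative assumptions rather than the paper's architecture.

```python
# Toy sketch: components carry a qualitative model of expected behavior, and
# the diagnostic engine descends the hierarchy only where observations
# disagree with it. Names and the pruning rule are illustrative assumptions.
class Component:
    def __init__(self, name, expected_state, children=()):
        self.name = name
        self.expected_state = expected_state  # qualitative: "nominal", etc.
        self.children = list(children)

    def diagnose(self, observations: dict, depth: int = 0) -> None:
        observed = observations.get(self.name, self.expected_state)
        if observed == self.expected_state:
            return  # behavior matches the qualitative model; prune subtree
        print("  " * depth + f"anomaly in {self.name}: {observed}")
        for child in self.children:
            child.diagnose(observations, depth + 1)

fuel = Component("fuel_system", "nominal",
                 [Component("pump", "nominal"), Component("injector", "nominal")])
fuel.diagnose({"fuel_system": "low_pressure", "pump": "degraded"})
```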
Critical Issues in Research Design in Action Research in an SME Development Context
ERIC Educational Resources Information Center
McGrath, Helen; O'Toole, Thomas
2012-01-01
Purpose: The main aim of this paper is to develop guidelines on the critical issues to consider in research design in an action research (AR) environment for SME network capability development. Design/methodology/approach: The issues in research design for AR studies are developed from the authors' experience in running learning sets but, in…
IntelliTable: Inclusively-Designed Furniture with Robotic Capabilities.
Prescott, Tony J; Conran, Sebastian; Mitchinson, Ben; Cudd, Peter
2017-01-01
IntelliTable is a new proof-of-principle assistive technology system with robotic capabilities, in the form of an elegant universal cantilever table able to move around by itself or under user control. We describe the design and current capabilities of the table and the human-centered design methodology used in its development and initial evaluation. The IntelliTable study has delivered a robotic platform, programmed via smartphone, that can navigate around a typical home or care environment, avoiding obstacles and positioning itself at the user's command. It can also be configured to navigate itself to pre-ordained positions within an environment using ceiling tracking, responsive optical guidance, and object-based sonar navigation.
NASA Technical Reports Server (NTRS)
Pieper, Jerry L.; Walker, Richard E.
1993-01-01
During the past three decades, an enormous amount of resources was expended in the design and development of Liquid Oxygen/Hydrocarbon and Liquid Oxygen/Hydrogen (LOX/HC and LOX/H2) rocket engines. A significant portion of these resources was used to develop and demonstrate the performance and combustion stability of each new engine. During these efforts, many analytical and empirical models were developed that characterize design parameters and combustion processes that influence performance and stability. Many of these models are suitable as design tools, but they have not been assembled into a usable, industry-wide analytical design methodology. The objective of this program was to assemble existing performance and combustion stability models into a usable methodology capable of producing high-performing and stable LOX/hydrocarbon and LOX/hydrogen propellant booster engines.
NASA Technical Reports Server (NTRS)
Evers, Ken H.; Bachert, Robert F.
1987-01-01
The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.
ERIC Educational Resources Information Center
Bhatnagar, Jyotsna
2006-01-01
Purpose: The purpose of this research is to measure Organizational Learning Capability (OLC) perception in the managers of public, private and multinational organizations and establish the link between OLC and firm performance. Design/methodology/approach: The data were collected from a sample of 612 managers randomly drawn from Indian industry,…
ERIC Educational Resources Information Center
Martins, Jorge Tiago
2016-01-01
Purpose: Focusing on the specific context of two European old industrial regions--South Yorkshire (UK) and North Region of Portugal--this paper aims to identify and conceptualise a set of relational capabilities that business leaders perceive to play a key role in industrial rejuvenation. Design/Methodology/Approach: A qualitative research design…
ERIC Educational Resources Information Center
Goh, Swee C.; Elliott, Catherine; Quon, Tony K.
2012-01-01
Purpose: The purpose of this paper is to present a meta-analysis of a subset of published empirical research papers that measure learning capability and link it to organizational performance. It also seeks to examine both financial and non-financial performance. Design/methodology/approach: In a search of published research on learning capability…
ERIC Educational Resources Information Center
Khandekar, Aradhana; Sharma, Anuradha
2005-01-01
Purpose: The purpose of this article is to examine the role of human resource capability (HRC) in organisational performance and sustainable competitive advantage (SCA) in Indian global organisations. Design/Methodology/Approach: To carry out the present study, an empirical research on a random sample of 300 line or human resource managers from…
Schedule Risks Due to Delays in Advanced Technology Development
NASA Technical Reports Server (NTRS)
Reeves, John D. Jr.; Kayat, Kamal A.; Lim, Evan
2008-01-01
This paper discusses a methodology and modeling capability that probabilistically evaluates the likelihood and impacts of delays in advanced technology development prior to the start of design, development, test, and evaluation (DDT&E) of complex space systems. The challenges of understanding and modeling advanced technology development considerations are first outlined, followed by a discussion of the problem in the context of lunar surface architecture analysis. The current and planned methodologies to address the problem are then presented along with sample analyses and results. The methodology discussed herein provides decision-makers with a thorough understanding of the schedule impacts resulting from the inclusion of various enabling advanced technology assumptions within system design.
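A minimal Monte Carlo sketch of this kind of analysis appears below; the technologies, slip probabilities, and delay distributions are invented placeholders, not the paper's lunar-architecture data.

```python
# Hedged sketch of the probabilistic idea: sample a maturation delay for each
# enabling technology and propagate the worst one into the DDT&E start date.
import numpy as np

rng = np.random.default_rng(1)
n_trials = 100_000
# Hypothetical technologies: (name, prob. of slip, mean extra months if slipped)
technologies = [("regenerative life support", 0.4, 18.0),
                ("cryo fluid management", 0.3, 12.0),
                ("dust-tolerant mechanisms", 0.2, 9.0)]

delays = np.zeros(n_trials)
for _, p_slip, mean_slip in technologies:
    slipped = rng.random(n_trials) < p_slip
    # DDT&E cannot start until the last enabling technology matures.
    delays = np.maximum(delays, slipped * rng.exponential(mean_slip, n_trials))

print(f"P(any DDT&E start delay) = {np.mean(delays > 0):.2f}")
print(f"mean delay = {delays.mean():.1f} months, "
      f"95th percentile = {np.percentile(delays, 95):.1f} months")
```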
Methods to Enhance Students' Entrepreneurial Mindset: A Swedish Example
ERIC Educational Resources Information Center
Lindberg, Erik; Bohman, Håkan; Hultén, Peter
2017-01-01
Purpose: The purpose of this study is to examine the effects of intervention methods in an entrepreneurship education (EE) course that was designed to enhance the students' entrepreneurial mindset by targeting their opportunity identification, creativity and risk management capabilities (RMC). Design/methodology/approach: The authors formulate…
Numerical aerodynamic simulation facility preliminary study: Executive study
NASA Technical Reports Server (NTRS)
1977-01-01
A computing system was designed with the capability of providing an effective throughput of one billion floating-point operations per second for three-dimensional Navier-Stokes codes. The methodology used in defining the baseline design and the major elements of the numerical aerodynamic simulation facility are described.
RT 24 - Architecture, Modeling & Simulation, and Software Design
2010-11-01
Focus on tool extensions (UPDM, SysML, SoaML, BPMN). Leverage "best of breed" architecture methodologies. Provide tooling to support the methodology. DoDAF ... capability. Example: BPMN. DoDAF 2.0 metamodel / BPMN metamodel mapping. Mapping SysML to DoDAF 2.0: DoDAF V2.0 models (e.g., OV-2) to SysML diagrams (e.g., Requirement).
2015 Army Science Planning and Strategy Meeting Series: Outcomes and Conclusions
2017-12-21
• modeling and nanoscale characterization tools to enable efficient design of hybridized manufacturing; real-time, multiscale computational capability ... to enable predictive analytics for expeditionary on-demand manufacturing
• discovery of design principles to enable programming advanced genetic ...
... goals, significant research is needed to mature the fundamental materials science, processing and manufacturing sciences, design methodologies, data
Development of a 3D numerical methodology for fast prediction of gun blast induced loading
NASA Astrophysics Data System (ADS)
Costa, E.; Lagasco, F.
2014-05-01
In this paper, the development of a methodology based on semi-empirical models from the literature for fast 3D prediction of pressure loading on surfaces adjacent to a weapon system during firing is presented. This loading results from the impact of the blast wave generated by the projectile exiting the muzzle bore. When it exceeds a threshold pressure level, the loading can induce unwanted damage to nearby hard structures as well as to frangible panels or electronic equipment. The implemented model can quickly predict the distribution of blast wave parameters over three-dimensional complex geometry surfaces when the weapon design and emplacement data as well as propellant and projectile characteristics are available. Given these capabilities, the proposed methodology is intended for use in the preliminary design phase of the combat system to predict adverse effects and identify the most appropriate countermeasures. By providing a preliminary but sensitive estimate of the operative environmental loading, this numerical tool represents a good alternative to more powerful but time-consuming advanced computational fluid dynamics tools, whose use can thus be limited to the final phase of the design.
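The sketch below shows the flavor of such a fast semi-empirical evaluation: peak overpressure from a Hopkinson-Cranz scaled-distance fit, checked against a damage threshold at surface points. The fit coefficients, charge mass, and threshold are hypothetical, not the paper's models.

```python
# Hedged sketch of a fast semi-empirical evaluation over surface points:
# a scaled-distance peak-overpressure fit plus a threshold check.
# Fit coefficients and the threshold are placeholders, not the paper's models.
import numpy as np

def peak_overpressure_kpa(r_m: np.ndarray, charge_kg: float) -> np.ndarray:
    z = r_m / charge_kg ** (1.0 / 3.0)              # Hopkinson-Cranz scaled distance
    return 180.0 / z + 120.0 / z**2 + 50.0 / z**3   # hypothetical fit

muzzle = np.array([0.0, 0.0, 2.0])
# Surface mesh points near the weapon (e.g., a deck panel), in metres.
points = np.array([[1.0, 0.5, 1.8], [3.0, 1.0, 2.0], [6.0, 2.0, 2.2]])
r = np.linalg.norm(points - muzzle, axis=1)
p = peak_overpressure_kpa(r, charge_kg=1.5)   # propellant charge mass, assumed

threshold_kpa = 20.0   # illustrative damage threshold for frangible panels
for pt, pi in zip(points, p):
    flag = "EXCEEDS" if pi > threshold_kpa else "ok"
    print(f"point {pt}: peak overpressure {pi:6.1f} kPa [{flag}]")
```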
Threshold Concepts in Business School Curriculum--A Pedagogy for Public Trust
ERIC Educational Resources Information Center
Bajada, Christopher; Jarvis, Walter; Trayler, Rowan; Bui, Anh Tuan
2016-01-01
Purpose: The purpose of this paper is to explore some of the implications for curriculum design by operationalizing threshold concepts and capabilities (TCC) in subject delivery. The motivation for undertaking this exploration is directly related to addressing public concerns for the business school curriculum. Design/Methodology/Approach: A…
NASA Astrophysics Data System (ADS)
Echavarria, E.; Tomiyama, T.; van Bussel, G. J. W.
2007-07-01
The objective of this ongoing research is to develop a design methodology to increase the availability of offshore wind farms by means of an intelligent maintenance system capable of responding to faults by reconfiguring the system or subsystems, without increasing service visits, complexity, or costs. The idea is to make use of the existing functional redundancies within the system and subsystems to keep the wind turbine operational, even at reduced capacity if necessary. Reconfiguration is intended to be a built-in capability to be used as a repair strategy, based on the existing functionalities provided by the components. Possible solutions range from using information from adjacent wind turbines, such as wind speed and direction, to setting up different operational modes, for instance re-wiring, re-connecting, changing parameters, or changing the control strategy. The methodology described in this paper is based on qualitative physics and consists of a fault diagnosis system based on a model-based reasoner (MBR) and a functional redundancy designer (FRD). Both design tools make use of a function-behaviour-state (FBS) model. A design methodology based on the reconfiguration concept to achieve self-maintained wind turbines is an interesting and promising approach to reduce stoppage rates, failure events, and maintenance visits, and to maintain energy output, possibly at a reduced rate, until the next scheduled maintenance.
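A heavily simplified sketch of the reconfiguration concept follows: a function-to-provider map standing in for the FBS model, with a redundancy search that substitutes an adjacent turbine's wind-direction data when the local vane fails. All names and structures are illustrative assumptions, not the paper's models.

```python
# Toy sketch of reconfiguration via functional redundancy: when a turbine's
# wind vane fails, the reasoner flags the lost function and a redundancy
# search substitutes data from an adjacent turbine.
functions = {  # function -> components able to provide it (FBS stand-in)
    "measure_wind_direction": ["own_wind_vane", "neighbor_turbine_vane"],
    "yaw_control": ["yaw_drive"],
}
component_state = {"own_wind_vane": "failed",
                   "neighbor_turbine_vane": "ok",
                   "yaw_drive": "ok"}

def reconfigure(function: str) -> str | None:
    """Return a healthy provider for a function, preferring the nominal one."""
    for provider in functions[function]:
        if component_state[provider] == "ok":
            return provider
    return None

provider = reconfigure("measure_wind_direction")
if provider:
    print(f"function restored via {provider}; turbine stays operational")
else:
    print("no redundancy left: schedule a service visit")
```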
Discrete Adjoint-Based Design Optimization of Unsteady Turbulent Flows on Dynamic Unstructured Grids
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.; Diskin, Boris; Yamaleev, Nail K.
2009-01-01
An adjoint-based methodology for design optimization of unsteady turbulent flows on dynamic unstructured grids is described. The implementation relies on an existing unsteady three-dimensional unstructured grid solver capable of dynamic mesh simulations and discrete adjoint capabilities previously developed for steady flows. The discrete equations for the primal and adjoint systems are presented for the backward-difference family of time-integration schemes on both static and dynamic grids. The consistency of sensitivity derivatives is established via comparisons with complex-variable computations. The current work is believed to be the first verified implementation of an adjoint-based optimization methodology for the true time-dependent formulation of the Navier-Stokes equations in a practical computational code. Large-scale shape optimizations are demonstrated for turbulent flows over a tiltrotor geometry and a simulated aeroelastic motion of a fighter jet.
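The sketch below reproduces the two key ingredients on a scalar model problem: a discrete adjoint for a backward-Euler (BDF1) time march, and a complex-step check of the sensitivity in the spirit of the paper's complex-variable verification. The model ODE and objective are illustrative stand-ins for the Navier-Stokes solver.

```python
# Minimal sketch: discrete adjoint of a backward-Euler march for
# du/dt = -alpha*u + sin(t), with J = integral of u^2, verified by a
# complex-step derivative. Model problem only; not the paper's solver.
import numpy as np

def solve(alpha, n=200, dt=0.05):
    """March the ODE with backward Euler (BDF1); return all states."""
    u = np.zeros(n + 1, dtype=type(alpha))
    for k in range(1, n + 1):
        u[k] = (u[k - 1] + dt * np.sin(k * dt)) / (1.0 + alpha * dt)
    return u, dt

def objective(u, dt):
    return np.sum(u[1:] ** 2) * dt

def adjoint_gradient(alpha):
    u, dt = solve(alpha)
    n = len(u) - 1
    lam = np.zeros(n + 2)
    grad = 0.0
    for k in range(n, 0, -1):                # backward-in-time adjoint sweep
        lam[k] = (lam[k + 1] - 2.0 * u[k] * dt) / (1.0 + alpha * dt)
        grad += lam[k] * u[k] * dt           # accumulate dJ/dalpha
    return grad

alpha, h = 0.8, 1e-30                        # design variable, complex step
u_c, dt = solve(alpha + 1j * h)
print("adjoint      dJ/dalpha:", adjoint_gradient(alpha))
print("complex-step dJ/dalpha:", objective(u_c, dt).imag / h)
```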
An unstructured-grid software system for solving complex aerodynamic problems
NASA Technical Reports Server (NTRS)
Frink, Neal T.; Pirzadeh, Shahyar; Parikh, Paresh
1995-01-01
A coordinated effort has been underway over the past four years to elevate unstructured-grid methodology to a mature level. The goal of this endeavor is to provide a validated capability to non-expert users for performing rapid aerodynamic analysis and design of complex configurations. The Euler component of the system is well developed, and is impacting a broad spectrum of engineering needs with capabilities such as rapid grid generation and inviscid flow analysis, inverse design, interactive boundary layers, and propulsion effects. Progress is also being made in the more tenuous Navier-Stokes component of the system. A robust grid generator is under development for constructing quality thin-layer tetrahedral grids, along with a companion Navier-Stokes flow solver. This paper presents an overview of this effort, along with a perspective on the present and future status of the methodology.
ERIC Educational Resources Information Center
Kearney, Arthur; Harrington, Denis; Kelliher, Felicity
2014-01-01
Purpose: The paper has been developed from a critical review of available literature drawn from the micro firm, managerial capability and innovation management fields. The paper aims to address these issues. Design/methodology/approach: The paper has been developed from a critical review of available literature drawn from the micro firm,…
Multirate flutter suppression system design for the Benchmark Active Controls Technology Wing
NASA Technical Reports Server (NTRS)
Berg, Martin C.; Mason, Gregory S.
1994-01-01
To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies will be applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing (also called the PAPA wing). Eventually, the designs will be implemented in hardware and tested on the BACT wing in a wind tunnel. This report describes a project at the University of Washington to design a multirate flutter suppression system for the BACT wing. The objective of the project was twofold: first, to develop a methodology for designing robust multirate compensators, and second, to demonstrate the methodology by applying it to the design of a multirate flutter suppression system for the BACT wing. The contributions of this project are (1) development of an algorithm for synthesizing robust low-order multirate control laws (the algorithm is capable of synthesizing a single compensator which stabilizes both the nominal plant and multiple plant perturbations); (2) development of a multirate design methodology, and supporting software, for modeling, analyzing, and synthesizing multirate compensators; and (3) design of a multirate flutter suppression system for NASA's BACT wing which satisfies the specified design criteria. This report describes each of these contributions in detail. Section 2.0 discusses our design methodology. Section 3.0 details the results of our multirate flutter suppression system design for the BACT wing. Finally, Section 4.0 presents our conclusions and suggestions for future research. The body of the report focuses primarily on the results. The associated theoretical background appears in the three technical papers that are included as Attachments 1-3. Attachment 4 is a user's manual for the software that is key to our design methodology.
IMPAC: An Integrated Methodology for Propulsion and Airframe Control
NASA Technical Reports Server (NTRS)
Garg, Sanjay; Ouzts, Peter J.; Lorenzo, Carl F.; Mattern, Duane L.
1991-01-01
The National Aeronautics and Space Administration is actively involved in the development of enabling technologies that will lead towards aircraft with new/enhanced maneuver capabilities such as Short Take-Off Vertical Landing (STOVL) and high angle of attack performance. Because of the high degree of dynamic coupling between the airframe and propulsion systems of these types of aircraft, one key technology is the integration of the flight and propulsion control. The NASA Lewis Research Center approach to developing Integrated Flight Propulsion Control (IFPC) technologies is an in-house research program referred to as IMPAC (Integrated Methodology for Propulsion and Airframe Control). The goals of IMPAC are to develop a viable alternative to the existing integrated control design methodologies that will allow for improved system performance and simplicity of control law synthesis and implementation, and to demonstrate the applicability of the methodology to a supersonic STOVL fighter aircraft. Based on some preliminary control design studies that included evaluation of the existing methodologies, the IFPC design methodology that is emerging at the Lewis Research Center consists of considering the airframe and propulsion system as one integrated system for an initial centralized controller design and then partitioning the centralized controller into separate airframe and propulsion system subcontrollers to ease implementation and to set meaningful design requirements for detailed subsystem control design and evaluation. An overview of IMPAC is provided and detailed discussion of the various important design and evaluation steps in the methodology are included.
Candidate control design metrics for an agile fighter
NASA Technical Reports Server (NTRS)
Murphy, Patrick C.; Bailey, Melvin L.; Ostroff, Aaron J.
1991-01-01
Success in the fighter combat environment of the future will certainly demand increasing capability from aircraft technology. These advanced capabilities in the form of superagility and supermaneuverability will require special design techniques which translate advanced air combat maneuvering requirements into design criteria. Control design metrics can provide some of these techniques for the control designer. This study presents an overview of control design metrics and investigates metrics for advanced fighter agility. The objectives of various metric users, such as airframe designers and pilots, are differentiated from the objectives of the control designer. Using an advanced fighter model, metric values are documented over a portion of the flight envelope through piloted simulation. These metric values provide a baseline against which future control system improvements can be compared and against which a control design methodology can be developed. Agility is measured for the axial, pitch, and roll axes. Axial metrics highlight acceleration and deceleration capabilities under different flight loads and include specific excess power measurements to characterize energy maneuverability. Pitch metrics cover both body-axis and wind-axis pitch rates and accelerations. Included in the pitch metrics are nose-pointing metrics which highlight displacement capability between the nose and the velocity vector. Roll metrics (or torsion metrics) focus on rotational capability about the wind axis.
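As a worked example of one axial metric, the sketch below evaluates specific excess power, Ps = (T - D)V/W, across load factors; the thrust, weight, and drag numbers are illustrative, not the study's fighter model.

```python
# Worked example of an axial agility metric: specific excess power,
# Ps = (T - D) * V / W. Drag model and all numbers are illustrative.
def specific_excess_power(thrust_lb, drag_lb, velocity_fps, weight_lb):
    return (thrust_lb - drag_lb) * velocity_fps / weight_lb   # ft/s

V, W, T = 800.0, 30_000.0, 25_000.0           # airspeed, weight, thrust (assumed)
zero_lift_drag, k_induced = 3_000.0, 2_500.0  # hypothetical drag components
for n in (1.0, 3.0, 5.0):                     # load factor in g
    drag = zero_lift_drag + k_induced * n ** 2  # induced drag grows with n^2
    ps = specific_excess_power(T, drag, V, W)
    print(f"load factor {n:.0f}g: Ps = {ps:8.1f} ft/s")
```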
NASA Technical Reports Server (NTRS)
Myers, Thomas T.; Mcruer, Duane T.
1988-01-01
The development of a comprehensive and eclectic methodology for the conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages, starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. The FCX expert system as presently developed is only a limited prototype capable of supporting basic lateral-directional FCS design activities related to the design example used. FCX presently supports design of only one FCS architecture (yaw damper plus roll damper) and the rules are largely focused on Class IV (highly maneuverable) aircraft. Despite this limited scope, the major elements which appear necessary for application of knowledge-based software concepts to flight control design were assembled, and thus FCX represents a prototype which can be tested, critiqued, and evolved in an ongoing process of development.
A quality-based cost model for new electronic systems and products
NASA Astrophysics Data System (ADS)
Shina, Sammy G.; Saigal, Anil
1998-04-01
This article outlines a method for developing a quality-based cost model for the design of new electronic systems and products. The model incorporates a methodology for determining a cost-effective design margin allocation for electronic products and systems and its impact on manufacturing quality and cost. A spreadsheet-based cost estimating tool was developed to help implement this methodology in order for the system design engineers to quickly estimate the effect of design decisions and tradeoffs on the quality and cost of new products. The tool was developed with automatic spreadsheet connectivity to current process capability and with provisions to consider the impact of capital equipment and tooling purchases to reduce the product cost.
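A minimal sketch of the core margin-to-cost trade such a spreadsheet tool might encode is shown below; the Cpk-to-defect-rate relation for a centered process is standard, while the margins, costs, and volume are placeholder assumptions.

```python
# Hedged sketch of the tool's core trade: wider design margins raise process
# capability (Cpk), which lowers defect rate and rework cost. Cost figures,
# margins, and volume are illustrative placeholders.
from scipy.stats import norm

def defects_ppm(cpk: float) -> float:
    """Defect rate in ppm for a centered process at a given Cpk (both tails)."""
    return 2.0 * norm.sf(3.0 * cpk) * 1e6

unit_rework_cost = 45.0      # dollars per defective unit, assumed
volume = 100_000             # production volume, assumed
for margin, cpk in [("tight", 1.00), ("moderate", 1.33), ("wide", 1.67)]:
    ppm = defects_ppm(cpk)
    rework = ppm * 1e-6 * volume * unit_rework_cost
    print(f"{margin:>8} margin: Cpk={cpk:.2f}, {ppm:8.1f} ppm, rework ${rework:,.0f}")
```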
RiskSOAP: Introducing and applying a methodology of risk self-awareness in road tunnel safety.
Chatzimichailidou, Maria Mikela; Dokas, Ioannis M
2016-05-01
Complex socio-technical systems, such as road tunnels, can be designed and developed with more or fewer elements that can either positively or negatively affect the capability of their agents to recognise imminent threats or vulnerabilities that could lead to accidents. This capability is called risk Situation Awareness (SA) provision. Motivated by the need for better tools for designing and developing systems that are aware of their own vulnerabilities and react to prevent accidents and losses, this paper introduces the Risk Situation Awareness Provision (RiskSOAP) methodology to the field of road tunnel safety as a means to measure this capability in such systems. The main objective is to test the soundness and applicability of RiskSOAP to infrastructure that is advanced in terms of technology, human integration, and the minimum number of safety requirements imposed by international bodies. RiskSOAP is applied to a specific road tunnel in Greece, and the accompanying indicator is calculated twice: once for the tunnel design as defined by updated European safety standards and once for the 'as-is' tunnel composition, which complies with the necessary safety requirements but calls for enhanced safety according to what the EU and PIARC further suggest. The derived values indicate the extent to which each tunnel version is capable of comprehending its threats and vulnerabilities based on its elements. The former tunnel version appears stronger in terms of both risk awareness capability and safety. Another interesting finding is that despite the advanced tunnel safety specifications, there is still room for enriching the safe design and maintenance of the road tunnel.
Rain erosion considerations for launch vehicle insulation systems
NASA Technical Reports Server (NTRS)
Daniels, D. J.; Sieker, W. D.
1977-01-01
In recent years the Delta launch vehicle has incorporated the capability to be launched through rain. This capability was developed to eliminate a design constraint which could result in a costly launch delay. This paper presents the methodology developed to implement rain erosion protection for the insulated exterior vehicle surfaces. The effect of the interaction between insulation material rain erosion resistance, rainstorm models, surface geometry and trajectory variations is examined. It is concluded that rain erosion can significantly impact the performance of launch vehicle insulation systems and should be considered in their design.
A CAD approach to magnetic bearing design
NASA Technical Reports Server (NTRS)
Jeyaseelan, M.; Anand, D. K.; Kirk, J. A.
1988-01-01
A design methodology has been developed at the Magnetic Bearing Research Laboratory for designing magnetic bearings using a CAD approach. This is used in the algorithm of an interactive design software package. The package is a design tool developed to enable the designer to simulate the entire process of design and analysis of the system. Its capabilities include interactive input/modification of geometry, finding any possible saturation at critical sections of the system, and the design and analysis of a control system that stabilizes and maintains magnetic suspension.
An automated methodology development. [software design for combat simulation]
NASA Technical Reports Server (NTRS)
Hawley, L. R.
1985-01-01
The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development effort. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance code efficiency by, e.g., eliminating unused subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.
NASA Technical Reports Server (NTRS)
Morgenstern, John; Norstrud, Nicole; Sokhey, Jack; Martens, Steve; Alonso, Juan J.
2013-01-01
Lockheed Martin Aeronautics Company (LM), working in conjunction with General Electric Global Research (GE GR), Rolls-Royce Liberty Works (RRLW), and Stanford University, herein presents results from the "N+2 Supersonic Validations" contract's initial 22-month phase, addressing the NASA solicitation "Advanced Concept Studies for Supersonic Commercial Transports Entering Service in the 2018 to 2020 Period." This report version adds documentation of an additional three-month low boom test task. The key technical objective of this effort was to validate integrated airframe and propulsion technologies and design methodologies capable of producing a viable supersonic vehicle design with environmental and performance characteristics consistent with NASA's N+2 goals. Supersonic testing of both airframe and propulsion technologies (including LM3: 97-023 low boom testing and April-June nozzle acoustic testing) verified LM's supersonic low-boom design methodologies and both GE's and RRLW's nozzle technologies for future implementation. The N+2 program is aligned with NASA's Supersonic Project and is focused on providing system-level solutions capable of overcoming the environmental and performance/efficiency barriers to practical supersonic flight. NASA proposed "Initial Environmental Targets and Performance Goals for Future Supersonic Civil Aircraft". The LM N+2 studies build upon LM's prior N+3 100-passenger design studies. The LM N+2 program addresses low boom design and methodology validations with wind tunnel testing, performance and efficiency goals with system-level analysis, and low noise validations with two nozzle (GE and RRLW) acoustic tests.
Energy management and vehicle synthesis
NASA Technical Reports Server (NTRS)
Czysz, P.; Murthy, S. N. B.
1995-01-01
The major drivers in the development of launch vehicles for the twenty-first century are reduction in cost of vehicles and operations, continuous reusability, mission abort capability with vehicle recovery, and readiness. One approach to the design of such vehicles is to emphasize energy management and propulsion as being the principal means of improvements given the available industrial capability and the required freedom in selecting configuration concept geometries. A methodology has been developed for the rational synthesis of vehicles based on the setting up and utilization of available data and projections, and a reference vehicle. The application of the methodology is illustrated for a single stage to orbit (SSTO) with various limits for the use of airbreathing propulsion.
NASA Technical Reports Server (NTRS)
Jones, Thomas C.; Dorsey, John T.; Doggett, William R.
2015-01-01
The Tendon-Actuated Lightweight In-Space MANipulator (TALISMAN) is a versatile long-reach robotic manipulator that is currently being tested at NASA Langley Research Center. TALISMAN is designed to be highly mass-efficient and multi-mission capable, with applications including asteroid retrieval and manipulation, in-space servicing, and astronaut and payload positioning. The manipulator uses a modular, periodic, tension-compression design that lends itself well to analytical modeling. Given the versatility of application for TALISMAN, a structural sizing methodology was developed that could rapidly assess mass and configuration sensitivities for any specified operating work space, applied loads and mission requirements. This methodology allows the systematic sizing of the key structural members of TALISMAN, which include the truss arm links, the spreaders and the tension elements. This paper summarizes the detailed analytical derivations and methodology that support the structural sizing approach and provides results from some recent TALISMAN designs developed for current and proposed mission architectures.
Design requirements for operational earth resources ground data processing
NASA Technical Reports Server (NTRS)
Baldwin, C. J.; Bradford, L. H.; Burnett, E. S.; Hutson, D. E.; Kinsler, B. A.; Kugle, D. R.; Webber, D. S.
1972-01-01
Realistic tradeoff data and evaluation techniques were studied that permit conceptual design of operational earth resources ground processing systems. Methodology for determining user requirements that utilize the limited information available from users is presented along with definitions of sensor capabilities projected into the shuttle/station era. A tentative method is presented for synthesizing candidate ground processing concepts.
NASA Astrophysics Data System (ADS)
Lee, Dae Young
The design of a small satellite is challenging since it is constrained by mass, volume, and power. To mitigate these constraints, designers adopt deployable configurations on the spacecraft, which result in an interesting and difficult optimization problem. The resulting optimization problem is challenging due to the computational complexity caused by the large number of design variables and the model complexity created by the deployables. Adding to these complexities, there is a lack of integration of design optimization systems into operational optimization and the utility maximization of spacecraft in orbit. The developed methodology enables satellite Multidisciplinary Design Optimization (MDO) that is extendable to on-orbit operation. Optimization of on-orbit operations is possible with MDO since the model predictive controller developed in this dissertation guarantees the achievement of the on-ground design behavior in orbit. To enable the design optimization of highly constrained and complex-shaped space systems, the spherical coordinate analysis technique, called the "Attitude Sphere", is extended and merged with additional engineering tools such as OpenGL. OpenGL's graphic acceleration facilitates the accurate estimation of the shadow-degraded photovoltaic cell area. This technique is applied to the design optimization of the satellite Electric Power System (EPS), and the design result shows that the amount of photovoltaic power generation can be increased by more than 9%. Based on this initial methodology, the goal of this effort is extended from Single Discipline Optimization to Multidisciplinary Optimization, which includes the design and also operation of the EPS, Attitude Determination and Control System (ADCS), and communication system. The geometry optimization satisfies the conditions of the ground development phase; however, the operation optimization may not be as successful as expected in orbit due to disturbances. To address this issue, for the ADCS operations, controllers based on Model Predictive Control that are effective for constraint handling were developed and implemented. All the suggested design and operation methodologies are applied to the mission "CADRE", a space weather mission scheduled for operation in 2016. This application demonstrates the usefulness and capability of the methodology to enhance CADRE's capabilities, and its ability to be applied to a variety of missions.
NPAC-Nozzle Performance Analysis Code
NASA Technical Reports Server (NTRS)
Barnhart, Paul J.
1997-01-01
A simple and accurate nozzle performance analysis methodology has been developed. The geometry modeling requirements are minimal and very flexible, thus allowing rapid design evaluations. The solution techniques accurately couple continuity, momentum, energy, state, and other relations, which permits fast and accurate calculations of nozzle gross thrust. The control volume and internal flow analyses are capable of accounting for the effects of over/under expansion, flow divergence, wall friction, heat transfer, and mass addition/loss across surfaces. The results from the nozzle performance methodology are shown to be in excellent agreement with experimental data for a variety of nozzle designs over a range of operating conditions.
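For readers who want the thrust bookkeeping made concrete, here is a minimal ideal-nozzle sketch of gross thrust as momentum flux plus the pressure-area term, assuming a choked throat and isentropic expansion of a calorically perfect gas. It illustrates the physics the abstract names; it is not NPAC's implementation, and the input values are hypothetical.

```python
import math

def gross_thrust(p_c, T_c, p_e, p_amb, A_t, A_e, gamma=1.3, R=287.0):
    """Ideal-nozzle gross thrust: momentum flux plus pressure-area term.

    p_c, T_c : chamber (total) pressure [Pa] and temperature [K]
    p_e      : static pressure at the exit plane [Pa]
    p_amb    : ambient pressure [Pa]
    A_t, A_e : throat and exit areas [m^2]
    """
    # Choked mass flow through the throat (isentropic, calorically perfect gas)
    m_dot = (p_c * A_t / math.sqrt(T_c)
             * math.sqrt(gamma / R)
             * ((gamma + 1) / 2) ** (-(gamma + 1) / (2 * (gamma - 1))))
    # Exit velocity from isentropic expansion, chamber to exit plane
    v_e = math.sqrt(2 * gamma * R * T_c / (gamma - 1)
                    * (1 - (p_e / p_c) ** ((gamma - 1) / gamma)))
    return m_dot * v_e + (p_e - p_amb) * A_e

# Hypothetical over-expanded case: exit pressure below ambient
print(f"F_g = {gross_thrust(3.0e6, 3400.0, 5.0e4, 1.0e5, 0.05, 0.6):.0f} N")
```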
Application of numerical methods to heat transfer and thermal stress analysis of aerospace vehicles
NASA Technical Reports Server (NTRS)
Wieting, A. R.
1979-01-01
The paper describes a thermal-structural design analysis study of a fuel-injection strut for a hydrogen-cooled scramjet engine for a supersonic transport, utilizing finite-element methodology. Applications of finite-element and finite-difference codes to the thermal-structural design-analysis of space transports and structures are discussed. The interaction between the thermal and structural analyses has led to development of finite-element thermal methodology to improve the integration between these two disciplines. The integrated thermal-structural analysis capability developed within the framework of a computer code is outlined.
A method for the design of transonic flexible wings
NASA Technical Reports Server (NTRS)
Smith, Leigh Ann; Campbell, Richard L.
1990-01-01
A methodology was developed for designing airfoils and wings at transonic speeds which includes a technique that can account for static aeroelastic deflections. This procedure is capable of designing either supercritical or more conventional airfoil sections. Methods for including viscous effects are also illustrated and are shown to give accurate results. The methodology developed is an interactive system containing three major parts. A design module was developed which modifies airfoil sections to achieve a desired pressure distribution. This design module works in conjunction with an aerodynamic analysis module, which for this study is a small perturbation transonic flow code. Additionally, an aeroelastic module is included which determines the wing deformation due to the calculated aerodynamic loads. Because of the modular nature of the method, it can be easily coupled with any aerodynamic analysis code.
Design Optimization of Gas Generator Hybrid Propulsion Boosters
NASA Technical Reports Server (NTRS)
Weldon, Vincent; Phillips, Dwight; Fink, Larry
1990-01-01
A methodology used in support of a study for NASA/MSFC to optimize the design of a gas generator hybrid propulsion booster for uprating the National Space Transportation System (NSTS) is presented. The objective was to compare alternative configurations for this booster approach, optimizing each candidate concept on different bases, in order to develop data for a trade table on which a final decision was based. The methodology is capable of processing a large number of independent and dependent variables, adjusting the overall subsystem characteristics to arrive at a best compromise integrated design to meet various specific optimization criteria subject to selected constraints. For each system considered, a detailed weight statement was generated along with preliminary cost and reliability estimates.
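The "best compromise subject to constraints" search described above can be illustrated with a toy constrained minimization; the objective, constraint, and coefficients below are hypothetical stand-ins, not the study's actual booster model.

```python
# Toy illustration of a constrained "best compromise" search. Both the
# objective and the constraint are hypothetical placeholder relations.
from scipy.optimize import minimize

def total_mass(x):
    length, diameter = x
    return 1200.0 * length * diameter               # notional structural mass [kg]

def delta_v_margin(x):
    length, diameter = x
    return 40.0 * length - 25.0 * diameter - 180.0  # >= 0 means enough impulse

result = minimize(
    total_mass,
    x0=[10.0, 3.0],
    method="SLSQP",
    bounds=[(5.0, 30.0), (2.0, 6.0)],               # geometric limits
    constraints=[{"type": "ineq", "fun": delta_v_margin}],
)
print(result.x, result.fun)   # best-compromise geometry and its mass
```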
Cross-Cultural Group Performance
ERIC Educational Resources Information Center
Mitchell, Rebecca; Boyle, Brendan; Nicholas, Stephen
2011-01-01
Purpose: This paper aims to explore the assumption that the impact of cultural diversity on knowledge creating capability is consequent to associated differences in knowledge and perspectives, and suggests that these knowledge differences produce their effect by triggering deliberative, collaborative behaviours. Design/methodology/approach: To…
NASA Technical Reports Server (NTRS)
Noll, Thomas E.
1990-01-01
The paper describes recent accomplishments and current research projects along four main thrusts in aeroservoelasticity at NASA Langley. One activity focuses on enhancing the modeling and analysis procedures to accurately predict aeroservoelastic interactions. Improvements to the minimum-state method of approximating unsteady aerodynamics are shown to provide precise low-order models for design and simulation tasks. Recent extensions in aerodynamic correction-factor methodology are also described. With respect to analysis procedures, the paper reviews novel enhancements to matched filter theory and random process theory for predicting the critical gust profile and the associated time-correlated gust loads for structural design considerations. Two research projects leading towards improved design capability are also summarized: (1) an integrated structure/control design capability and (2) procedures for obtaining low-order robust digital control laws for aeroelastic applications.
1975-06-01
the Air Force Flight Dynamics Laboratory for use in conceptual and preliminary design phases of weapon system development. The methods are a ... trade study method provides an iterative capability stemming from a direct interface with design synthesis programs. A detailed cost data base and ... system for data expansion is provided. The methods are designed for ease in changing cost estimating relationships and estimating coefficients.
Aircraft flight test trajectory control
NASA Technical Reports Server (NTRS)
Menon, P. K. A.; Walker, R. A.
1988-01-01
Two control law design techniques are compared and the performance of the resulting controllers evaluated. The design requirement is for a flight test trajectory controller (FTTC) capable of closed-loop, outer-loop control of an F-15 aircraft performing high-quality research flight test maneuvers. The maneuver modeling, linearization, and design methodologies utilized in this research, are detailed. The results of applying these FTTCs to a nonlinear F-15 simulation are presented.
Integrating reliability and maintainability into a concurrent engineering environment
NASA Astrophysics Data System (ADS)
Phillips, Clifton B.; Peterson, Robert R.
1993-02-01
This paper describes the results of a reliability and maintainability study conducted at the University of California, San Diego and supported by private industry. Private industry considered the study important and provided the university access to innovative tools under a cooperative agreement. The current capability of reliability and maintainability tools, and how they fit into the design process, is investigated. The evolution of design methodologies leading up to today's capability is reviewed for ways to enhance the design process while keeping cost under control. A method for measuring the consequences of reliability and maintainability policy for design configurations in an electronic environment is provided. The interaction of selected modern computer tool sets is described for reliability, maintainability, operations, and other elements of the engineering design process. These tools provide a robust system evaluation capability that brings life cycle performance improvement information to engineers and their managers before systems are deployed, and allows them to monitor and track performance during operation.
Lyon, Aaron R; Lewis, Cara C; Melvin, Abigail; Boyd, Meredith; Nicodimos, Semret; Liu, Freda F; Jungbluth, Nathaniel
2016-09-22
Health information technologies (HIT) have become nearly ubiquitous in the contemporary healthcare landscape, but information about HIT development, functionality, and implementation readiness is frequently siloed. Theory-driven methods of compiling, evaluating, and integrating information from the academic and commercial sectors are necessary to guide stakeholder decision-making surrounding HIT adoption and to develop pragmatic HIT research agendas. This article presents the Health Information Technologies-Academic and Commercial Evaluation (HIT-ACE) methodology, a structured, theory-driven method for compiling and evaluating information from multiple sectors. As an example demonstration of the methodology, we apply HIT-ACE to mental and behavioral health measurement feedback systems (MFS). MFS are a specific class of HIT that support the implementation of routine outcome monitoring, an evidence-based practice. HIT-ACE is guided by theories and frameworks related to user-centered design and implementation science. The methodology involves four phases: (1) coding academic and commercial materials, (2) developer/purveyor interviews, (3) linking putative implementation mechanisms to HIT capabilities, and (4) experimental testing of capabilities and mechanisms. In the current demonstration, phase 1 included a systematic process to identify MFS in mental and behavioral health using academic literature and commercial websites. Using user-centered design, implementation science, and feedback frameworks, the HIT-ACE coding system was developed, piloted, and used to review each identified system for the presence of 38 capabilities and 18 additional characteristics via a consensus coding process. Bibliometric data were also collected to examine the representation of the systems in the scientific literature. As an example, results are presented for the application of HIT-ACE phase 1 to MFS, wherein 49 separate MFS were identified, reflecting a diverse array of characteristics and capabilities. Preliminary findings demonstrate the utility of HIT-ACE to represent the scope and diversity of a given class of HIT beyond what can be identified in the academic literature. Phase 2 data collection is expected to confirm and expand the information presented, and phases 3 and 4 will provide more nuanced information about the impact of specific HIT capabilities. In all, HIT-ACE is expected to support adoption decisions and additional HIT development and implementation research.
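A minimal sketch of what phase 1 coding output can look like: a presence/absence matrix of systems by capabilities, summarized as prevalence. The system and capability names below are hypothetical placeholders, not the 49 MFS actually reviewed.

```python
# Hypothetical presence/absence coding matrix (1 = capability present)
coding = {
    "SystemA": {"progress_graphing": 1, "alerting": 1, "batch_reports": 0},
    "SystemB": {"progress_graphing": 1, "alerting": 0, "batch_reports": 0},
    "SystemC": {"progress_graphing": 0, "alerting": 1, "batch_reports": 1},
}

capabilities = sorted(next(iter(coding.values())))
for cap in capabilities:
    prevalence = sum(codes[cap] for codes in coding.values()) / len(coding)
    print(f"{cap}: present in {prevalence:.0%} of coded systems")
```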
NASA Technical Reports Server (NTRS)
Niiya, Karen E.; Walker, Richard E.; Pieper, Jerry L.; Nguyen, Thong V.
1993-01-01
This final report includes a discussion of the work accomplished during the period from Dec. 1988 through Nov. 1991. The objective of the program was to assemble existing performance and combustion stability models into a usable design methodology capable of designing and analyzing high-performance and stable LOX/hydrocarbon booster engines. The methodology was then used to design a validation engine. The capabilities and validity of the methodology were demonstrated using this engine in an extensive hot fire test program. The engine used LOX/RP-1 propellants and was tested over a range of mixture ratios, chamber pressures, and acoustic damping device configurations. This volume contains time domain and frequency domain stability plots which indicate the pressure perturbation amplitudes and frequencies from approximately 30 tests of a 50K thrust rocket engine using LOX/RP-1 propellants over a range of chamber pressures from 240 to 1750 psia with mixture ratios from 1.2 to 7.5. The data are from test configurations which used both bitune and monotune acoustic cavities and from tests with no acoustic cavities. The engine had a length of 14 inches and a contraction ratio of 2.0, using a 7.68 inch diameter injector. The data were taken from both stable and unstable tests. All combustion instabilities were spontaneous in the first tangential mode. Although stability bombs were used and generated overpressures of approximately 20 percent, no tests were driven unstable by the bombs. The stability instrumentation included six high-frequency Kistler transducers in the combustion chamber, a high-frequency Kistler transducer in each propellant manifold, and tri-axial accelerometers. Performance data are presented, both characteristic velocity efficiencies and energy release efficiencies, for those tests of sufficient duration to record steady state values.
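The frequency-domain reduction mentioned above can be illustrated by taking the spectrum of a chamber-pressure trace and picking out the dominant perturbation frequency; the synthetic signal below stands in for a first-tangential-mode instability and is illustrative only.

```python
import numpy as np

# Synthetic chamber-pressure trace: broadband noise plus a tone standing in
# for a first-tangential-mode instability. Values are illustrative only.
fs = 50_000.0                          # sample rate [Hz]
t = np.arange(0.0, 0.1, 1.0 / fs)
p = 30.0 * np.sin(2 * np.pi * 2800.0 * t) + 5.0 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(p)) / t.size
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(f"dominant perturbation frequency: {peak:.0f} Hz")
```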
Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2
NASA Technical Reports Server (NTRS)
Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)
1998-01-01
The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.
Control/structure interaction design methodology
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.; Layman, William E.
1989-01-01
The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm), and their verification in ground and possibly flight tests. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme, and analysts will be closely integrated with the CSI Test Bed laboratory. Components, concepts, tools and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.
Human perception testing methodology for evaluating EO/IR imaging systems
NASA Astrophysics Data System (ADS)
Graybeal, John J.; Monfort, Samuel S.; Du Bosq, Todd W.; Familoni, Babajide O.
2018-04-01
The U.S. Army's RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) Perception Lab is tasked with supporting the development of sensor systems for the U.S. Army by evaluating human performance of emerging technologies. Typical research questions involve detection, recognition and identification as a function of range, blur, noise, spectral band, image processing techniques, image characteristics, and human factors. NVESD's Perception Lab provides an essential bridge between the physics of the imaging systems and the performance of the human operator. In addition to quantifying sensor performance, perception test results can also be used to generate models of human performance and to drive future sensor requirements. The Perception Lab seeks to develop and employ scientifically valid and efficient perception testing procedures within the practical constraints of Army research, including rapid development timelines for critical technologies, unique guidelines for ethical testing of Army personnel, and limited resources. The purpose of this paper is to describe NVESD Perception Lab capabilities, recent methodological improvements designed to align our methodology more closely with scientific best practice, and to discuss goals for future improvements and expanded capabilities. Specifically, we discuss modifying our methodology to improve training, to account for human fatigue, to improve assessments of human performance, and to increase experimental design consultation provided by research psychologists. Ultimately, this paper outlines a template for assessing human perception and overall system performance related to EO/IR imaging systems.
Fitting methods to paradigms: are ergonomics methods fit for systems thinking?
Salmon, Paul M; Walker, Guy H; M Read, Gemma J; Goode, Natassia; Stanton, Neville A
2017-02-01
The issues being tackled within ergonomics problem spaces are shifting. Although existing paradigms appear relevant for modern day systems, it is worth questioning whether our methods are. This paper asks whether the complexities of systems thinking, a currently ubiquitous ergonomics paradigm, are outpacing the capabilities of our methodological toolkit. This is achieved through examining the contemporary ergonomics problem space and the extent to which ergonomics methods can meet the challenges posed. Specifically, five key areas within the ergonomics paradigm of systems thinking are focused on: normal performance as a cause of accidents, accident prediction, system migration, systems concepts and ergonomics in design. The methods available for pursuing each line of inquiry are discussed, along with their ability to respond to key requirements. In doing so, a series of new methodological requirements and capabilities are identified. It is argued that further methodological development is required to provide researchers and practitioners with appropriate tools to explore both contemporary and future problems. Practitioner Summary: Ergonomics methods are the cornerstone of our discipline. This paper examines whether our current methodological toolkit is fit for purpose given the changing nature of ergonomics problems. The findings provide key research and practice requirements for methodological development.
Data mining of text as a tool in authorship attribution
NASA Astrophysics Data System (ADS)
Visa, Ari J. E.; Toivonen, Jarmo; Autio, Sami; Maekinen, Jarno; Back, Barbro; Vanharanta, Hannu
2001-03-01
It is common that text documents are characterized and classified by the keywords that their authors use. Visa et al. have developed a new methodology based on prototype matching. The prototype is an interesting document or an extracted part of an interesting text. This prototype is matched against the document database of the monitored document flow. The new methodology is capable of extracting the meaning of a document to a certain degree. Our claim is that the new methodology is also capable of authenticating authorship. To verify this claim, two tests were designed. The test hypothesis was that the words and the word order in the sentences could authenticate the author. In the first test three authors were selected: William Shakespeare, Edgar Allan Poe, and George Bernard Shaw. Three texts from each author were examined. Each text was in turn used as a prototype, and the two nearest matches with the prototype were noted. The second test uses the Reuters-21578 financial news database; a group of 25 short financial news reports from five different authors is examined. Our new methodology and the interesting results from the two tests are reported in this paper. In the first test, all cases were successful for Shakespeare and for Poe; for Shaw, one text was confused with Poe. In the second test the Reuters-21578 financial news reports were attributed to their authors relatively well. The conclusion is that our text mining methodology seems to be capable of authorship attribution.
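A minimal sketch of prototype matching in the spirit described: represent the prototype and each candidate author's text as word-frequency vectors and rank by cosine similarity. This is a simplified stand-in, not the authors' actual matching algorithm, and the miniature corpus is hypothetical.

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def nearest_author(prototype: str, candidates: dict) -> str:
    """Return the candidate author whose text best matches the prototype."""
    proto_vec = Counter(prototype.lower().split())
    scores = {author: cosine(proto_vec, Counter(text.lower().split()))
              for author, text in candidates.items()}
    return max(scores, key=scores.get)

# Hypothetical miniature corpus; the real tests used full literary texts.
candidates = {
    "Shakespeare": "thou art more lovely and more temperate",
    "Poe": "once upon a midnight dreary while I pondered weak and weary",
}
print(nearest_author("deep into that darkness peering long I stood there", candidates))
```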
NASA Technical Reports Server (NTRS)
Celaya, Jose R.; Saha, Sankalita; Goebel, Kai
2011-01-01
Accelerated aging methodologies for electrolytic components have been designed and accelerated aging experiments have been carried out. The methodology is based on imposing electrical and/or thermal overstresses via electrical power cycling in order to mimic the real world operation behavior. Data are collected in-situ and offline in order to periodically characterize the devices' electrical performance as it ages. The data generated through these experiments are meant to provide capability for the validation of prognostic algorithms (both model-based and data-driven). Furthermore, the data allow validation of physics-based and empirical based degradation models for this type of capacitor. A first set of models and algorithms has been designed and tested on the data.
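A sketch of how such periodic characterization data can feed degradation-model validation: fit an assumed empirical law (here, exponential capacitance loss) and extrapolate to a failure threshold. The model form, the 80% end-of-life convention, and the data are all illustrative assumptions, not the experiment's actual measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed empirical degradation model: capacitance decays exponentially
# with aging time. Model form and data below are hypothetical.
def cap_model(t_hours, c0, k):
    return c0 * np.exp(-k * t_hours)

t = np.array([0.0, 50.0, 100.0, 150.0, 200.0])           # aging time [h]
c = np.array([2200.0, 2150.0, 2090.0, 2040.0, 1990.0])   # capacitance [uF]

(c0, k), _ = curve_fit(cap_model, t, c, p0=(2200.0, 1e-4))

# Remaining useful life: time until capacitance crosses a failure threshold,
# here taken as 80% of the initial value (a common convention, assumed).
rul = np.log(c0 / (0.8 * c0)) / k - t[-1]
print(f"fitted k = {k:.2e} per hour, estimated RUL = {rul:.0f} h")
```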
Design optimization of gas generator hybrid propulsion boosters
NASA Technical Reports Server (NTRS)
Weldon, Vincent; Phillips, Dwight U.; Fink, Lawrence E.
1990-01-01
A methodology used in support of a contract study for NASA/MSFC to optimize the design of a gas generator hybrid propulsion booster for uprating the National Space Transportation System (NSTS) is presented. The objective was to compare alternative configurations for this booster approach, optimizing each candidate concept on different bases, in order to develop data for a trade table on which a final decision was based. The methodology is capable of processing a large number of independent and dependent variables, adjusting the overall subsystem characteristics to arrive at a best compromise integrated design to meet various specified optimization criteria subject to selected constraints. For each system considered, a detailed weight statement was generated along with preliminary cost and reliability estimates.
Estimating the Reliability of Electronic Parts in High Radiation Fields
NASA Technical Reports Server (NTRS)
Everline, Chester; Clark, Karla; Man, Guy; Rasmussen, Robert; Johnston, Allan; Kohlhase, Charles; Paulos, Todd
2008-01-01
Radiation effects on materials and electronic parts constrain the lifetime of flight systems visiting Europa. Understanding mission lifetime limits is critical to the design and planning of such a mission. Therefore, the operational aspects of radiation dose are a mission success issue. To predict and manage mission lifetime in a high radiation environment, system engineers need capable tools to trade radiation design choices against system design and reliability, and science achievements. Conventional tools and approaches provided past missions with conservative designs without the ability to predict their lifetime beyond the baseline mission.This paper describes a more systematic approach to understanding spacecraft design margin, allowing better prediction of spacecraft lifetime. This is possible because of newly available electronic parts radiation effects statistics and an enhanced spacecraft system reliability methodology. This new approach can be used in conjunction with traditional approaches for mission design. This paper describes the fundamentals of the new methodology.
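One common radiation-hardness-assurance assumption, a lognormal dose-to-failure distribution per part, makes the lifetime-prediction idea concrete. The sketch below is illustrative with hypothetical parameters; it is not the paper's methodology.

```python
import math

def p_fail(dose_krad, d50_krad, sigma_ln=0.6):
    """Probability a part has failed by the given total dose, assuming
    dose-to-failure is lognormally distributed (a common RHA assumption)."""
    z = math.log(dose_krad / d50_krad) / sigma_ln
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical part: median dose-to-failure of 100 krad behind given shielding
for mission_dose in (25.0, 50.0, 100.0, 150.0):
    print(f"{mission_dose:5.0f} krad -> P(fail) = {p_fail(mission_dose, 100.0):.3f}")
# Part-level probabilities like these can then be folded into a spacecraft
# system reliability model to predict lifetime beyond the baseline mission.
```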
Simulation Enabled Safeguards Assessment Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robert Bean; Trond Bjornard; Thomas Larson
2007-09-01
It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime, will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design, will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment MEthodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed.
A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1
NASA Technical Reports Server (NTRS)
Rais-Rohani, Masoud
1998-01-01
This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis; and (3) optimization. The product and process definitions are part of the input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depends largely on the available data on specific methods of manufacture and assembly and associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into the design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.
Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.
1989-01-01
The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory to provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized. The first program is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization and analytical sensitivities. The second program provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application the current activities associated with the Active Flexible Wing project are reviewed.
Facilitating Employees' and Students' Process towards Nascent Entrepreneurship
ERIC Educational Resources Information Center
Hietanen, Lenita
2015-01-01
Purpose: The purpose of this paper is to investigate a model for facilitating employees' and full-time, non-business students' entrepreneurial capabilities during their optional entrepreneurship studies at one Finnish Open University. Design/methodology/approach: The case study investigates the course in which transitions from employees or…
Capabilities, methodologies, and use of the Cambio file-translation application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lasche, George P.
2007-03-01
This report describes the capabilities, methodologies, and uses of the Cambio computer application, designed to automatically read and display nuclear spectral data files of any known format in the world and to convert spectral data to one of several commonly used analysis formats. To further assist responders, Cambio incorporates an analysis method based on non-linear fitting techniques found in open literature and implemented in openly published source code in the late 1980s. A brief description is provided of how Cambio works, of what basic formats it can currently read, and how it can be used. Cambio was developed at Sandia National Laboratories and is provided as a free service to assist nuclear emergency response analysts anywhere in the world in the fight against nuclear terrorism.
Designing for Annual Spacelift Performance
NASA Technical Reports Server (NTRS)
McCleskey, Carey M.; Zapata, Edgar
2017-01-01
This paper presents a methodology for approaching space launch system design from a total architectural point of view. This different approach to conceptual design is contrasted with traditional approaches that focus on a single set of metrics for flight system performance, i.e., payload lift per flight, vehicle mass, specific impulse, etc. The approach presented works with a larger set of metrics, including annual system lift, or "spacelift" performance. Spacelift performance is more inclusive of the flight production capability of the total architecture, i.e., the flight and ground systems working together as a whole to produce flights on a repeated basis. In the proposed methodology, spacelift performance becomes an important design-for-support parameter for flight system concepts and truly advanced spaceport architectures of the future. The paper covers examples of existing system spacelift performance as benchmarks, points out specific attributes of space transportation systems that must be greatly improved over these existing designs, and outlines current activity in this area.
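The annual-spacelift metric can be made concrete with simple arithmetic: the flight rate a fleet can produce, bounded by ground turnaround and launch opportunities, times payload per flight. All inputs below are hypothetical illustration values, not the paper's benchmarks.

```python
def annual_spacelift(fleet_size, turnaround_days, payload_t, max_pad_slots=52):
    """Annual lift [t/yr]: flights per year, limited by vehicle turnaround
    and by available launch opportunities, times payload per flight."""
    flights_per_vehicle = 365.0 / turnaround_days
    flights_per_year = min(fleet_size * flights_per_vehicle, max_pad_slots)
    return flights_per_year * payload_t

# Same per-flight lift, very different annual lift once turnaround improves:
print(annual_spacelift(fleet_size=3, turnaround_days=120, payload_t=20.0))  # ~182 t/yr
print(annual_spacelift(fleet_size=3, turnaround_days=15,  payload_t=20.0))  # capped by pad slots: 1040 t/yr
```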
Functional Mobility Testing: A Novel Method to Establish Human System Interface Design Requirements
NASA Technical Reports Server (NTRS)
England, Scott A.; Benson, Elizabeth A.; Rajulu, Sudhakar
2008-01-01
Across all fields of human-system interface design it is vital to possess a sound methodology dictating the constraints on the system based on the capabilities of the human user. These limitations may be based on strength, mobility, dexterity, cognitive ability, etc., and combinations thereof. Data collected in an isolated environment to determine, for example, maximal strength or maximal range of motion would indeed be adequate for establishing not-to-exceed type design limitations; however, these constraints on the system may exceed what is minimally needed. Resources may potentially be saved by having a technique to determine the minimum measurements a system must accommodate. This paper specifically deals with the creation of a novel methodology for establishing mobility requirements for a new generation of space suit design concepts. Historically, the Space Shuttle and the International Space Station vehicle and space hardware design requirements documents, such as the Man-Systems Integration Standards and International Space Station Flight Crew Integration Standard, explicitly stated that designers should strive to provide the maximum joint range of motion capabilities exhibited by a minimally clothed human subject. In the course of developing the Human-Systems Integration Requirements (HSIR) for the new space exploration initiative (Constellation), an effort was made to redefine the mobility requirements in the interest of safety and cost. Systems designed for manned space exploration can receive compounded gains from simplified designs that are both initially less expensive to produce and lighter, and thereby cheaper to launch.
1991-09-01
... capability; 3. a flexible, well-planned overall architecture; 4. a plan for incremental achievement of full capability; 5. early definition, funding ... 2. a system architecture and design that will satisfy the requirements; 3. a development team that communicates effectively and has previous ...
Learning at Work: Organisational Affordances and Individual Engagement
ERIC Educational Resources Information Center
Bryson, Jane; Pajo, Karl; Ward, Robyn; Mallon, Mary
2006-01-01
Purpose: The purpose of this research is to explore the interaction between organisational affordances for the development of individuals' capability, and the engagement of workers at various levels with those opportunities. Design/methodology/approach: A case study of a large New Zealand wine company, using in-depth interviews. Interviews were…
Strategic Learning Capability: Through the Lens of Environmental Jolts
ERIC Educational Resources Information Center
Moon, Hanna; Lee, Chan
2015-01-01
Purpose: This paper aims to deepen the understanding of strategic learning through the lens of environmental jolts. Design/methodology/approach: Strategic learning is explained from the three paradigms of organizational learning. Findings: Organizational learning provides a firm foundation to develop and elaborate the concept of strategic learning…
Management Education's Blind Spot: Management of Workplace Relations
ERIC Educational Resources Information Center
Clydesdale, Greg
2009-01-01
Purpose: Developing interpersonal relationships is widely recognised as a key managerial capability, but business schools have been criticised for the limited attention given to the subject. The purpose of this paper is to attempt to address this deficiency in the area of teaching workplace relationships. Design/methodology/approach: The paper…
NASA Technical Reports Server (NTRS)
Madrid, G. A.; Westmoreland, P. T.
1983-01-01
A progress report is presented on a program to upgrade the existing NASA Deep Space Network in terms of a redesigned computer-controlled data acquisition system for channelling tracking, telemetry, and command data between a California-based control center and three signal processing centers in Australia, California, and Spain. The methodology for the improvements is oriented towards single subsystem development with consideration for a multi-system and multi-subsystem network of operational software. Details of the existing hardware configurations and data transmission links are provided. The program methodology includes data flow design, interface design and coordination, incremental capability availability, increased inter-subsystem developmental synthesis and testing, system and network level synthesis and testing, and system verification and validation. The software has thus far been implemented to a 65-percent completion level, and the methodology used to effect the changes, which will permit enhanced tracking of and communication with spacecraft, has proven effective.
Three tenets for secure cyber-physical system design and assessment
NASA Astrophysics Data System (ADS)
Hughes, Jeff; Cybenko, George
2014-06-01
This paper presents a threat-driven quantitative mathematical framework for secure cyber-physical system design and assessment. Called The Three Tenets, this originally empirical approach has been used by the US Air Force Research Laboratory (AFRL) for secure system research and development. The Tenets were first documented in 2005 as a teachable methodology. The Tenets are motivated by a system threat model that itself consists of three elements which must exist for successful attacks to occur: - system susceptibility; - threat accessibility and; - threat capability. The Three Tenets arise naturally by countering each threat element individually. Specifically, the tenets are: Tenet 1: Focus on What's Critical - systems should include only essential functions (to reduce susceptibility); Tenet 2: Move Key Assets Out-of-Band - make mission essential elements and security controls difficult for attackers to reach logically and physically (to reduce accessibility); Tenet 3: Detect, React, Adapt - confound the attacker by implementing sensing system elements with dynamic response technologies (to counteract the attackers' capabilities). As a design methodology, the Tenets mitigate reverse engineering and subsequent attacks on complex systems. Quantified by a Bayesian analysis and further justified by analytic properties of attack graph models, the Tenets suggest concrete cyber security metrics for system assessment.
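Treating the three threat elements as independent probabilities (a simplification) makes the leverage of each tenet explicit. The sketch below is illustrative only, and all numbers are hypothetical.

```python
# The Tenets' threat model: a successful attack needs susceptibility,
# accessibility, and capability simultaneously.
def p_attack(susceptibility, accessibility, capability):
    return susceptibility * accessibility * capability

baseline = p_attack(0.8, 0.7, 0.9)
after_tenets = p_attack(
    0.4,   # Tenet 1: only essential functions  -> less susceptibility
    0.2,   # Tenet 2: key assets out-of-band    -> less accessibility
    0.5,   # Tenet 3: detect, react, adapt      -> degraded attacker capability
)
print(f"baseline {baseline:.3f} -> with tenets {after_tenets:.3f}")
```

Because the factors multiply, a moderate reduction in each produces a large reduction in the product, which is the intuition behind countering each threat element individually.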
Rocketdyne PSAM: In-house enhancement/application
NASA Technical Reports Server (NTRS)
Newell, J. F.; Rajagopal, K. R.; Ohara, K.
1991-01-01
Development of the Probabilistic Design Analysis (PDA) process for rocket engines was initiated. This will give engineers a quantitative assessment of calculated reliability during the design process. The PDA will help choose better designs, make them more robust, and help select critical tests to demonstrate key reliability issues, improving confidence in the engine's capabilities. Rocketdyne's involvement with the Composite Loads Spectra (CLS) and Probabilistic Structural Analysis Methodology (PSAM) contracts started this effort, and these are key elements in the ongoing developments. Internal development efforts and hardware applications complement and extend the CLS and PSAM efforts. The completion of the CLS option work and the follow-on PSAM developments will also be integral parts of this methodology. A brief summary of these efforts is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harben, P E; Harris, D; Myers, S
Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design and utilization, in full 3D finite difference modeling, and in statistical characterization of geological heterogeneity. Such capabilities, coupled with a rapid field analysis methodology based on matched field processing, are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project was active, numerous and varied developments and milestones were accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking, based on a field calibration to characterize geological heterogeneity, was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Test Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications.
Advanced Capabilities for Wind Tunnel Testing in the 21st Century
NASA Technical Reports Server (NTRS)
Kegelman, Jerome T.; Danehy, Paul M.; Schwartz, Richard J.
2010-01-01
Wind tunnel testing methods and test technologies for the 21st century using advanced capabilities are presented. These capabilities are necessary to capture more accurate and high quality test results by eliminating the uncertainties in testing and to facilitate verification of computational tools for design. This paper discusses near term developments underway in ground testing capabilities, which will enhance the quality of information of both the test article and airstream flow details. Also discussed is a selection of new capability investments that have been made to accommodate such developments. Examples include advanced experimental methods for measuring the test gas itself; using efficient experiment methodologies, including quality assurance strategies within the test; and increasing test result information density by using extensive optical visualization together with computed flow field results. These points could be made for both major investments in existing tunnel capabilities or for entirely new capabilities.
Goranitis, Ilias; Coast, Joanna; Day, Ed; Copello, Alex; Freemantle, Nick; Frew, Emma
2017-07-01
Conventional practice within the United Kingdom and beyond is to conduct economic evaluations with "health" as evaluative space and "health maximization" as the decision-making rule. However, there is increasing recognition that this evaluative framework may not always be appropriate, and this is particularly the case within public health and social care contexts. This article presents a methodological case study designed to explore the impact of changing the evaluative space within an economic evaluation from health to capability well-being and the decision-making rule from health maximization to the maximization of sufficient capability. Capability well-being is an evaluative space grounded on Amartya Sen's capability approach and assesses well-being based on individuals' ability to do and be the things they value in life. Sufficient capability is an egalitarian approach to decision making that aims to ensure everyone in society achieves a normatively sufficient level of capability well-being. The case study is treatment for drug addiction, and the cost-effectiveness of 2 psychological interventions relative to usual care is assessed using data from a pilot trial. Analyses are undertaken from a health care and a government perspective. For the purpose of the study, quality-adjusted life years (measured using the EQ-5D-5L) and years of full capability equivalent and years of sufficient capability equivalent (both measured using the ICECAP-A [ICEpop CAPability measure for Adults]) are estimated. The study concludes that different evaluative spaces and decision-making rules have the potential to offer opposing treatment recommendations. The implications for policy makers are discussed.
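A sketch of how the two outcome metrics can differ, assuming capability scores anchored on 0-1 (ICECAP-A style) and a normatively chosen sufficiency threshold above which further gains stop counting; the threshold and scores are hypothetical, and the paper's exact scoring may differ.

```python
# "Full capability" counts the whole 0-1 score; "sufficient capability"
# caps scores at a sufficiency threshold and rescales, so gains above the
# threshold do not count. Threshold and scores are hypothetical.
def years_full_capability(scores, years_per_period):
    return sum(s * years_per_period for s in scores)

def years_sufficient_capability(scores, years_per_period, threshold=0.8):
    return sum(min(s, threshold) / threshold * years_per_period for s in scores)

patient = [0.55, 0.70, 0.95]   # capability score at three annual periods
print(years_full_capability(patient, 1.0))        # 2.20 full-capability years
print(years_sufficient_capability(patient, 1.0))  # 2.5625 sufficient-capability years
```

Because the two metrics weight the same trajectory differently, an intervention that mostly lifts people already above the threshold can look worse under sufficiency-based decision-making, which is the opposing-recommendation effect the article reports.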
A transonic-small-disturbance wing design methodology
NASA Technical Reports Server (NTRS)
Phillips, Pamela S.; Waggoner, Edgar G.; Campbell, Richard L.
1988-01-01
An automated transonic design code has been developed which modifies an initial airfoil or wing in order to generate a specified pressure distribution. The design method uses an iterative approach that alternates between a potential-flow analysis and a design algorithm that relates changes in surface pressure to changes in geometry. The analysis code solves an extended small-disturbance potential-flow equation and can model a fuselage, pylons, nacelles, and a winglet in addition to the wing. A two-dimensional option is available for airfoil analysis and design. Several two- and three-dimensional test cases illustrate the capabilities of the design code.
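The analyze-compare-update alternation can be sketched as follows, with a hypothetical stand-in for the potential-flow analysis and a deliberately simplified linear update relating the pressure miss to a geometry change; it illustrates the loop, not the code's actual design algorithm.

```python
import numpy as np

def redesign(z, cp_target, analyze, relax=0.05, iters=50):
    """Iteratively nudge a surface toward a target pressure distribution.
    `analyze(z) -> cp` stands in for the potential-flow analysis module;
    the linear update is an illustrative simplification of a design
    algorithm relating changes in surface pressure to changes in geometry."""
    for _ in range(iters):
        cp = analyze(z)
        # Thin the section where suction exceeds the target, thicken where
        # it falls short (sign convention matches the toy model below).
        z = z - relax * (cp_target - cp)
    return z

# Hypothetical toy "analysis": Cp responds locally and linearly to thickness.
analyze = lambda z: -2.0 * z
x = np.linspace(0.0, 1.0, 21)
z0 = 0.05 * np.sin(np.pi * x)          # initial section shape
cp_target = -0.3 * np.sin(np.pi * x)   # desired surface Cp
z_final = redesign(z0, cp_target, analyze)
print(np.max(np.abs(analyze(z_final) - cp_target)))   # residual approaches 0
```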
NASA Technical Reports Server (NTRS)
Micol, John R.
2001-01-01
Description, capabilities, initiatives, and utilization of the NASA Langley Research Center's Unitary Plan Wind Tunnel are presented. A brief overview of the facility's operational capabilities and testing techniques is provided. A recent Construction of Facilities (CoF) project to improve facility productivity and efficiency through facility automation has been completed and is discussed. Several new and maturing thrusts are underway that include systematic efforts to provide credible assessment for data quality, modifications to the new automation control system for increased compatibility with the Modern Design Of Experiments (MDOE) testing methodology, and process improvements for better test coordination, planning, and execution.
NASA Technical Reports Server (NTRS)
Hanna, Stephen G.; Jones, David L.; Creech, Stephen D.; Lawrence, Thomas D.
2012-01-01
In support of the National Aeronautics and Space Administration's (NASA) Human Exploration and Operations Mission Directorate (HEOMD), the Space Launch System (SLS) is being designed for safe, affordable, and sustainable human and scientific exploration missions beyond Earth orbit (BEO). The SLS Team is tasked with developing a system capable of safely and repeatedly lofting a new fleet of spaceflight vehicles beyond Earth orbit. The Cryogenic Propulsion Stage (CPS) is a key enabler for evolving the SLS capability for BEO missions. This paper reports on the methodology and initial recommendations relative to the CPS, giving a brief retrospective of early studies on this promising propulsion hardware. This paper provides an overview of the requirements development and CPS configuration in support of NASA's multiple Design Reference Missions (DRMs).
Transport composite fuselage technology: Impact dynamics and acoustic transmission
NASA Technical Reports Server (NTRS)
Jackson, A. C.; Balena, F. J.; Labarge, W. L.; Pei, G.; Pitman, W. A.; Wittlin, G.
1986-01-01
A program was performed to develop and demonstrate the impact dynamics and acoustic transmission technology for a composite fuselage which meets the design requirements of a 1990 large transport aircraft without substantial weight and cost penalties. The program developed the analytical methodology for the prediction of the acoustic transmission behavior of advanced composite stiffened shell structures. The methodology predicted that the interior noise level in a composite fuselage due to the turbulent boundary layer will be lower than that in a comparable aluminum fuselage. The verification of these analyses will be performed by NASA Langley Research Center using a composite fuselage shell fabricated by filament winding. The program also developed analytical methodology for the prediction of the impact dynamics behavior of lower fuselage structure constructed with composite materials. Development tests were performed to demonstrate that composite structure designed to the same operating load requirement can have at least the same energy absorption capability as aluminum structure.
Development of task network models of human performance in microgravity
NASA Technical Reports Server (NTRS)
Diaz, Manuel F.; Adam, Susan
1992-01-01
This paper discusses the utility of task-network modeling for quantifying human performance variability in microgravity. The data are gathered for: (1) improving current methodologies for assessing human performance and workload in the operational space environment; (2) developing tools for assessing alternative system designs; and (3) developing an integrated set of methodologies for the evaluation of performance degradation during extended duration spaceflight. The evaluation entailed an analysis of the Remote Manipulator System payload-grapple task performed on many shuttle missions. Task-network modeling can be used as a tool for assessing and enhancing human performance in man-machine systems, particularly for modeling long-duration manned spaceflight. Task-network modeling can be directed toward improving system efficiency by increasing the understanding of basic capabilities of the human component in the system and the factors that influence these capabilities.
Heuristic decomposition for non-hierarchic systems
NASA Technical Reports Server (NTRS)
Bloebaum, Christina L.; Hajela, P.
1991-01-01
Design and optimization is substantially more complex in multidisciplinary and large-scale engineering applications due to the existing inherently coupled interactions. The paper introduces a quasi-procedural methodology for multidisciplinary optimization that is applicable for nonhierarchic systems. The necessary decision-making support for the design process is provided by means of an embedded expert systems capability. The method employs a decomposition approach whose modularity allows for implementation of specialized methods for analysis and optimization within disciplines.
Design technology co-optimization for 14/10nm metal1 double patterning layer
NASA Astrophysics Data System (ADS)
Duan, Yingli; Su, Xiaojing; Chen, Ying; Su, Yajuan; Shao, Feng; Zhang, Recco; Lei, Junjiang; Wei, Yayi
2016-03-01
Design and technology co-optimization (DTCO) can satisfy the needs of the design, generate robust design rules, and avoid unfriendly patterns at an early stage of design, ensuring a high level of manufacturability of the product within the technical capability of the present process. The DTCO methodology in this paper mainly includes design rule translation, layout analysis, model validation, hotspot classification, and design rule optimization. Combining DTCO with double patterning technology (DPT) can optimize the related design rules and generate a friendlier layout that meets the requirements of the 14/10nm technology node. The experiment demonstrates the DPT-compliant DTCO methodology applied to a metal1 layer at the 14/10nm node. The DTCO workflow proposed in this work is an efficient solution for optimizing the design rules of the 14/10nm metal1 layer. The paper also discusses and verifies how to tune the design rules of the U-shape and L-shape structures in a DPT-aware metal layer.
An Expert System-Driven Method for Parametric Trajectory Optimization During Conceptual Design
NASA Technical Reports Server (NTRS)
Dees, Patrick D.; Zwack, Mathew R.; Steffens, Michael; Edwards, Stephen; Diaz, Manuel J.; Holt, James B.
2015-01-01
During the early phases of engineering design, the costs committed are high, costs incurred are low, and the design freedom is high. It is well documented that decisions made in these early design phases drive the entire design's life cycle cost. In a traditional paradigm, key design decisions are made when little is known about the design. As the design matures, design changes become more difficult in both cost and schedule to enact. The current capability-based paradigm, which has emerged because of the constrained economic environment, calls for the infusion of knowledge usually acquired during later design phases into earlier design phases, i.e., bringing knowledge acquired during preliminary and detailed design into pre-conceptual and conceptual design. An area of critical importance to launch vehicle design is the optimization of its ascent trajectory, as the optimal trajectory will be able to take full advantage of the launch vehicle's capability to deliver a maximum amount of payload into orbit. Hence, the optimal ascent trajectory plays an important role in the vehicle's affordability posture, yet little of the information required to successfully optimize a trajectory is known early in the design phase. Thus, the current paradigm of optimizing ascent trajectories involves generating point solutions for every change in a vehicle's design parameters. This is often a very tedious, manual, and time-consuming task for the analysts. Moreover, the trajectory design space is highly non-linear and multi-modal due to the interaction of various constraints. When these obstacles are coupled with the Program to Optimize Simulated Trajectories (POST), an industry-standard program to optimize ascent trajectories that is difficult to use, expert trajectory analysts are required to effectively optimize a vehicle's ascent trajectory. Over the course of this paper, the authors discuss a methodology developed at NASA Marshall's Advanced Concepts Office to address these issues. The methodology is two-fold: first, capture the heuristics developed by human analysts over their many years of experience; and second, leverage the power of modern computing to evaluate multiple trajectories simultaneously and thereby enable the exploration of the trajectory's design space early during the pre-conceptual and conceptual phases of design. This methodology is coupled with design of experiments in order to train surrogate models, which enables trajectory design space visualization and makes parametric optimal ascent trajectory information available when early design decisions are being made.
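As a sketch of the surrogate-modeling step, the following trains a simple quadratic response surface on a full-factorial design of experiments. Here `evaluate_trajectory` is a hypothetical analytic stand-in for a POST run, and the factor names and levels are assumptions for illustration.

```python
import numpy as np
from itertools import product

def evaluate_trajectory(x):
    # Stand-in for an expensive POST run; an analytic payload model so
    # the sketch executes. Real evaluations would come from the solver.
    tw, isp = x
    return 25.0 * tw * (isp / 300.0) ** 2

def full_factorial(levels_per_factor):
    # Simple full-factorial design of experiments over the factor levels
    return np.array(list(product(*levels_per_factor)), dtype=float)

def fit_quadratic_surrogate(X, y):
    # Response surface with intercept, linear, and pure quadratic terms
    A = np.hstack([np.ones((len(X), 1)), X, X ** 2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda x: float(np.hstack([1.0, np.asarray(x, float),
                                      np.asarray(x, float) ** 2]) @ coef)

doe = full_factorial([[0.8, 1.0, 1.2],         # e.g. initial thrust-to-weight
                      [250.0, 300.0, 350.0]])  # e.g. vacuum Isp, s
payload = np.array([evaluate_trajectory(x) for x in doe])  # expensive runs
surrogate = fit_quadratic_surrogate(doe, payload)
print(surrogate([1.1, 320.0]))  # instant estimate at a new design point
```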
Constraint Force Equation Methodology for Modeling Multi-Body Stage Separation Dynamics
NASA Technical Reports Server (NTRS)
Toniolo, Matthew D.; Tartabini, Paul V.; Pamadi, Bandu N.; Hotchko, Nathaniel
2008-01-01
This paper discusses a generalized approach to the multi-body separation problems in a launch vehicle staging environment based on constraint force methodology and its implementation into the Program to Optimize Simulated Trajectories II (POST2), a widely used trajectory design and optimization tool. This development facilitates the inclusion of stage separation analysis into POST2 for seamless end-to-end simulations of launch vehicle trajectories, thus simplifying the overall implementation and providing a range of modeling and optimization capabilities that are standard features in POST2. Analysis and results are presented for two test cases that validate the constraint force equation methodology in a stand-alone mode and its implementation in POST2.
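The abstract does not reproduce the paper's equations, but constraint-force formulations of this kind typically rest on the standard Lagrange-multiplier form of constrained multibody dynamics; the notation below is generic rather than taken from the paper.

```latex
% Joint constraints \Phi(q, t) = 0 between the mated stages, differentiated
% twice in time: \Phi_q \ddot{q} = \gamma, where \gamma collects the
% velocity-dependent terms. The constraint forces enter as \Phi_q^T \lambda:
\begin{bmatrix} M & -\Phi_q^{\mathsf T} \\ \Phi_q & 0 \end{bmatrix}
\begin{bmatrix} \ddot{q} \\ \lambda \end{bmatrix}
=
\begin{bmatrix} F_{\mathrm{ext}} \\ \gamma \end{bmatrix}
% At staging, releasing a joint amounts to dropping its rows from \Phi,
% which lets one simulation run seamlessly through separation.
```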
Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid
2017-01-01
Background Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessarily rapid product development life cycles associated with the competitive connected health industry. Objective We wanted to derive a structured methodology that followed the principles of human-centered design and would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with falls. Methods We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. Results We report a successful implementation of the methodology for the design and development of a system for detecting and predicting falls in older adults. We describe in detail the testing and evaluation activities we carried out to effectively test the system and overcome usability and human factors problems. Conclusions We feel this methodology can be applied to a wide variety of connected health devices and systems. We consider this a methodology that can be scaled to different-sized projects accordingly. PMID:28302594
Magnetic resonance imaging-compatible tactile sensing device based on a piezoelectric array.
Hamed, Abbi; Masamune, Ken; Tse, Zion Tsz Ho; Lamperth, Michael; Dohi, Takeyoshi
2012-07-01
Minimally invasive surgery is a widely used medical technique, one of the drawbacks of which is the loss of the direct sense of touch during the operation. Palpation is the use of fingertips to explore and make fast assessments of tissue morphology. Although technologies have been developed to equip minimally invasive surgery tools with haptic feedback capabilities, the majority focus on tissue stiffness profiling and tool-tissue interaction force measurement. For greatly increased diagnostic capability, a magnetic resonance imaging-compatible tactile sensor design is proposed, which allows minimally invasive surgery to be performed under image guidance, combining the strong soft-tissue imaging capability of magnetic resonance imaging with intuitive palpation. The sensing unit is based on a piezoelectric sensor methodology, which conforms to the stringent mechanical and electrical design requirements imposed by the magnetic resonance environment. The sensor's mechanical design and the device's integration into a 0.2 Tesla open magnetic resonance imaging scanner are described, together with the device's magnetic resonance compatibility testing. Its design limitations and potential future improvements are also discussed. In summary, a tactile sensing unit based on a piezoelectric sensor principle is proposed, designed for magnetic resonance imaging-guided interventions.
NASA Astrophysics Data System (ADS)
Neumann, Jay; Parlato, Russell; Tracy, Gregory; Randolph, Max
2015-09-01
Focal plane alignment for large format arrays and faster optical systems requires enhanced precision methodology and stability over temperature. The increase in focal plane array size continues to drive the alignment capability. Depending on the optical system, focal plane flatness of less than 25 μm (0.001 in) is required over transition temperatures from ambient to cooled operating temperatures. The focal plane flatness requirement must also be maintained in airborne or launch vibration environments. This paper addresses the challenge of detector integration into the focal plane module and housing assemblies, the methodology to reduce error terms during integration, and the evaluation of thermal effects. The driving factors influencing the alignment accuracy include datum transfers, material effects over temperature, alignment stability over test, adjustment precision, and traceability to NIST standards. The FPA module design and alignment methodology reduce the error terms by minimizing the measurement transfers to the housing. In the design, proper material selection requires materials with matched coefficients of thermal expansion, which minimizes both the physical shift over temperature and the stress induced into the detector. When required, the co-registration of focal planes and filters can achieve submicron relative positioning by applying precision equipment, interferometry, and piezoelectric positioning stages. All measurements and characterizations maintain traceability to NIST standards. The metrology characterizes the equipment's accuracy, repeatability, and precision of measurement.
Additive Manufacturing in Production: A Study Case Applying Technical Requirements
NASA Astrophysics Data System (ADS)
Ituarte, Iñigo Flores; Coatanea, Eric; Salmi, Mika; Tuomi, Jukka; Partanen, Jouni
Additive manufacturing (AM) is expanding manufacturing capabilities. However, the quality of AM-produced parts depends on a number of machine, geometry, and process parameters. The variability of these parameters affects manufacturing drastically, and therefore standardized processes and harmonized methodologies need to be developed to characterize the technology for end-use applications and enable the technology for manufacturing. This research proposes a composite methodology integrating Taguchi design of experiments, multi-objective optimization, and statistical process control to optimize the manufacturing process and fulfil multiple requirements imposed on an arbitrary geometry. The proposed methodology aims to characterize AM technology as a function of the manufacturing process variables, as well as to perform a comparative assessment of three AM technologies (Selective Laser Sintering, Laser Stereolithography and Polyjet). Results indicate that only one machine, the laser-based Stereolithography system, was able to fulfil the macro- and micro-level geometrical requirements simultaneously, although its mechanical properties were not at the required level. Future research will study a single AM system at a time to characterize AM machine technical capabilities and stimulate pre-normative initiatives for the technology in end-use applications.
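One standard ingredient of such a Taguchi-based composite methodology is the signal-to-noise ratio used to rank orthogonal-array runs; a minimal sketch with invented replicate data follows (the larger-the-better form shown is one of several standard variants).

```python
import numpy as np

def sn_larger_is_better(y):
    """Larger-the-better Taguchi S/N = -10 * log10(mean(1 / y^2));
    maximized by runs that push the response up while staying
    insensitive to noise across replicates."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Invented tensile-strength replicates (MPa) for two orthogonal-array runs:
print(sn_larger_is_better([48.1, 47.5, 49.0]))  # consistent run
print(sn_larger_is_better([52.0, 40.2, 55.3]))  # scattered run, lower S/N
```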
Cedar-a large scale multiprocessor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gajski, D.; Kuck, D.; Lawrie, D.
1983-01-01
This paper presents an overview of Cedar, a large scale multiprocessor being designed at the University of Illinois. This machine is designed to accommodate several thousand high performance processors which are capable of working together on a single job, or they can be partitioned into groups of processors where each group of one or more processors can work on separate jobs. Various aspects of the machine are described including the control methodology, communication network, optimizing compiler and plans for construction. 13 references.
Convective Heating Predictions of Apollo IV Flight Data
NASA Technical Reports Server (NTRS)
White, Molly E.
2012-01-01
It has been more than 50 years since NASA engineers last attempted to design a manned space vehicle with the capability to return from beyond low Earth orbit. In this interval, our methodologies for designing the thermal protection system (TPS) that protects humans from the extremely high temperatures of re-entry have changed significantly. With these considerations in mind, we return to the Apollo IV (AS-501) flight data. This incredible data set allows us to assess the current tools and methodologies being used to design the Orion MPCV. In particular, our ability to predict the aftbody separated-region convective heating environments for the MPCV is critical. The design uses reusable TPS in this area, whereas the Apollo designers used ablative TPS, which can withstand much more severe environments. This presentation will revisit the flight data, summarize the assumptions going into the analysis, present the results, and draw conclusions regarding how accurately we can currently predict the heating in the aftbody separated region of a re-entry capsule.
Semi-Supervised Learning of Lift Optimization of Multi-Element Three-Segment Variable Camber Airfoil
NASA Technical Reports Server (NTRS)
Kaul, Upender K.; Nguyen, Nhan T.
2017-01-01
This chapter describes a new intelligent platform for learning optimal designs of morphing wings based on Variable Camber Continuous Trailing Edge Flaps (VCCTEF) in conjunction with a leading-edge flap called the Variable Camber Krueger (VCK). The new platform consists of a Computational Fluid Dynamics (CFD) methodology coupled with a semi-supervised learning methodology. The CFD component of the intelligent platform comprises a full Navier-Stokes solution capability (NASA OVERFLOW solver with the Spalart-Allmaras turbulence model) that computes flow over a tri-element inboard NASA Generic Transport Model (GTM) wing section. Various VCCTEF/VCK settings and configurations were considered to explore optimal designs for high-lift flight during take-off and landing. To determine the globally optimal design of such a system, an extremely large set of CFD simulations is needed, which is not feasible to achieve in practice. To alleviate this problem, recourse was taken to a semi-supervised learning (SSL) methodology, which is based on manifold regularization techniques. A reasonable space of CFD solutions was populated and then the SSL methodology was used to fit this manifold in its entirety, including the gaps in the manifold where no CFD solutions were available. The SSL methodology, in conjunction with an elastodynamic solver (FiDDLE), was demonstrated in an earlier study involving structural health monitoring. These CFD-SSL methodologies define the new intelligent platform that forms the basis for our search for optimal designs of wings. Although the present platform can be used in various other design and operational problems in engineering, this chapter focuses on the high-lift study of the VCK-VCCTEF system. The top few candidate design configurations were identified by solving the CFD problem in a small subset of the design space. The SSL component was trained on the design space and was then used in a predictive mode to populate a selected set of test points outside of the given design space. The new design test space thus populated was evaluated with the CFD component by determining the error between the SSL predictions and the true (CFD) solutions, which was found to be small. This demonstrates the proposed CFD-SSL methodologies for isolating the best design of the VCK-VCCTEF system, and it holds promise for quantitatively identifying the best designs of flight systems in general.
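A minimal sketch of the graph-based semi-supervised idea: fill gaps in a manifold of expensive solutions by solving a Laplacian-regularized least-squares problem. The design points, the few 'labeled' values standing in for CFD lift results, and the neighborhood graph are all invented for illustration; this is not the OVERFLOW/FiDDLE pipeline itself.

```python
import numpy as np

# Design points in an assumed 2-D flap-setting space; every sixth point
# carries a 'CFD' value (a synthetic function standing in for a lift
# coefficient). The rest of the manifold is filled in smoothly.
rng = np.random.default_rng(0)
n = 60
X = rng.uniform(0.0, 1.0, size=(n, 2))
labeled = np.arange(0, n, 6)
y = np.zeros(n)
y[labeled] = np.sin(3.0 * X[labeled, 0]) + X[labeled, 1]

# 5-nearest-neighbor graph and its Laplacian L = D - W
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.zeros((n, n))
for i in range(n):
    for j in np.argsort(d2[i])[1:6]:
        W[i, j] = W[j, i] = 1.0
L = np.diag(W.sum(axis=1)) - W

# Solve (J + gamma * L) f = J y: fit the labels, stay smooth on the graph
J = np.zeros((n, n))
J[labeled, labeled] = 1.0
gamma = 0.1
f = np.linalg.solve(J + gamma * L + 1e-8 * np.eye(n), J @ y)
print(f[:5])  # values now defined at unlabeled points too
```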
Subsonic Wing Optimization for Handling Qualities Using ACSYNT
NASA Technical Reports Server (NTRS)
Soban, Danielle Suzanne
1996-01-01
The capability to accurately and rapidly predict aircraft stability derivatives using one comprehensive analysis tool has been created. The PREDAVOR tool has the following capabilities: rapid estimation of stability derivatives using a vortex lattice method, calculation of a longitudinal handling qualities metric, and inherent methodology to optimize a given aircraft configuration for longitudinal handling qualities, including an intuitive graphical interface. The PREDAVOR tool may be applied to both subsonic and supersonic designs, as well as conventional and unconventional, symmetric and asymmetric configurations. The workstation-based tool uses as its model a three-dimensional model of the configuration generated using a computer aided design (CAD) package. The PREDAVOR tool was applied to a Lear Jet Model 23 and the North American XB-70 Valkyrie.
Space Station communications system design and analysis
NASA Technical Reports Server (NTRS)
Ratliff, J. E.
1986-01-01
Attention is given to the methodologies currently being used as the framework within which the NASA Space Station's communications system is to be designed and analyzed. A key aspect of the CAD/analysis system being employed is its potential growth in size and capabilities, since Space Station design requirements will continue to be defined and modified. The Space Station is expected to furnish communications between itself and astronauts on EVA, Orbital Maneuvering Vehicles, Orbital Transfer Vehicles, Space Shuttle orbiters, free-flying spacecraft, coorbiting platforms, and the Space Shuttle's own Mobile Service Center.
Knowledge representation to support reasoning based on multiple models
NASA Technical Reports Server (NTRS)
Gillam, April; Seidel, Jorge P.; Parker, Alice C.
1990-01-01
Model Based Reasoning is a powerful tool used to design and analyze systems, which are often composed of numerous interactive, interrelated subsystems. Models of the subsystems are written independently and may be used together while they are still under development. Thus the models are not static. They evolve as information becomes obsolete, as improved artifact descriptions are developed, and as system capabilities change. Researchers are using three methods to support knowledge/data base growth, to track the model evolution, and to handle knowledge from diverse domains. First, the representation methodology is based on having pools, or types, of knowledge from which each model is constructed. Second, information is made explicit: this includes the interactions between components, the description of the artifact structure, and the constraints and limitations of the models. The third principle we have followed is the separation of the data and knowledge from the inferencing and equation-solving mechanisms. This methodology is used in two distinct knowledge-based systems: one for the design of space systems and another for the synthesis of VLSI circuits. It has facilitated the growth and evolution of our models, made accountability of results explicit, and provided credibility for the user community. These capabilities have been implemented and are being used in actual design projects.
Save money by understanding variance and tolerancing.
Stuart, K
2007-01-01
Manufacturing processes are inherently variable, which results in component and assembly variance. Unless process capability, variance and tolerancing are fully understood, incorrect design tolerances may be applied, which will lead to more expensive tooling, inflated production costs, high reject rates, product recalls and excessive warranty costs. A methodology is described for correctly allocating tolerances and performing appropriate analyses.
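A minimal sketch of the point about tolerancing: worst-case stacking of the same tolerance chain is roughly twice as tight as the statistical (root-sum-square) stack, which is what drives unnecessarily expensive tooling when process variance is not understood. The dimensions below are invented.

```python
import math

# Symmetric +/- tolerances (mm) for a four-component stack, illustrative only
tolerances = [0.05, 0.02, 0.03, 0.04]

worst_case = sum(tolerances)                          # every part at its limit
rss = math.sqrt(sum(t ** 2 for t in tolerances))      # statistical stack

print(f"worst case: +/-{worst_case:.3f} mm")  # +/-0.140 mm
print(f"RSS stack:  +/-{rss:.3f} mm")         # ~ +/-0.073 mm
```

The RSS figure is only justified when the contributing processes are independent and statistically capable; otherwise the worst-case number is the safer basis for the drawing.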
Supporting the Research Process through Expanded Library Data Services
ERIC Educational Resources Information Center
Wang, Minglu
2013-01-01
Purpose: The purpose of this paper is to describe how the authors gained a better understanding of the variety of library users' data needs, and how gradually some new data services were established based on current capabilities. Design/methodology/approach: This paper uses a case study of the new data services at the John Cotton Dana Library, at…
The Long and Winding Road: Problems in Developing Capabilities in an Undergraduate Commerce Degree
ERIC Educational Resources Information Center
Calma, Angelito
2017-01-01
Purpose: The purpose of this paper is to provide an analysis of specific learning outcomes in an undergraduate commerce degree in a large research-intensive university in Australia. Design/methodology/approach: It uses data collected from assurance of learning activities as part of Association to Advance Collegiate Schools of Business…
ERIC Educational Resources Information Center
Moss, Gloria; Daunton, Lyn
2006-01-01
Purpose: This research aims to fill a gap in the literature concerning the extent to which recruitment interviewers may substitute leadership capability sets (CSs) differing from those in the job specification (JS). Design/methodology/approach: Semi-structured interviews were conducted with senior personnel involved in senior staff selection in a…
The main objective of this study is to evaluate the performance of candidate sampling methods for potential use as a Federal Reference Method (FRM) capable of providing an estimate of coarse particle (PMc: particulate matter with an aerodynamic diameter between 2.5 um and 10 um...
NASA Technical Reports Server (NTRS)
Ebeling, Charles
1993-01-01
This report documents the work accomplished during the first two years of research to provide support to NASA in predicting operational and support parameters and costs of proposed space systems. The first year's research developed a methodology for deriving reliability and maintainability (R&M) parameters based upon the use of regression analysis to establish empirical relationships between performance and design specifications and the corresponding mean times to failure and repair. The second year focused on enhancements to the methodology, increased scope of the model, and software improvements. This follow-on effort expands the prediction of R&M parameters and their effect on the operations and support of space transportation vehicles to include other system components such as booster rockets and external fuel tanks. It also increases the scope of the methodology and the capabilities of the model as implemented by the software. The focus is on the failure and repair of major subsystems and their impact on vehicle reliability, turn times, maintenance manpower, and repairable spares requirements. The report documents the data utilized in this study, outlines the general methodology for estimating and relating R&M parameters, presents the analyses and results of application to the initial data base, and describes the implementation of the methodology through the use of a computer model. The report concludes with a discussion on validation and a summary of the research findings and results.
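A sketch of the kind of empirical relationship described, regressing a log-transformed mean time between failures on design and performance specifications; the vehicles, regressors, values, and log-linear form are illustrative assumptions, not the report's actual model or data.

```python
import numpy as np

# Invented vehicle data: dry mass (t), thrust (kN), reusable flag
X = np.array([[78.0, 2090.0, 1.0],
              [35.0, 1200.0, 0.0],
              [110.0, 3100.0, 1.0],
              [60.0, 1750.0, 0.0]])
mtbf_hours = np.array([55.0, 120.0, 40.0, 90.0])

A = np.hstack([np.ones((len(X), 1)), X])          # add intercept column
beta, *_ = np.linalg.lstsq(A, np.log(mtbf_hours), rcond=None)

def predict_mtbf(x):
    """Predicted mean time between failures (hours) for new design specs."""
    return float(np.exp(np.hstack([1.0, x]) @ beta))

print(predict_mtbf([70.0, 1900.0, 1.0]))
```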
Reliability and maintainability assessment factors for reliable fault-tolerant systems
NASA Technical Reports Server (NTRS)
Bavuso, S. J.
1984-01-01
A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10-year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. This paper presents the numerous factors that potentially have a degrading effect on system reliability, and the ways in which these factors, which are peculiar to highly reliable fault-tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Schreckenghost, Debra L.; Woods, David D.; Potter, Scott S.; Johannesen, Leila; Holloway, Matthew; Forbus, Kenneth D.
1991-01-01
Initial results are reported from a multi-year, interdisciplinary effort to provide guidance and assistance for designers of intelligent systems and their user interfaces. The objective is to achieve more effective human-computer interaction (HCI) for systems with real-time fault management capabilities. Intelligent fault management systems within NASA were evaluated for insight into the design of systems with complex HCI. Preliminary results include: (1) a description of real-time fault management in aerospace domains; (2) recommendations and examples for improving intelligent systems design and user interface design; (3) identification of issues requiring further research; and (4) recommendations for a development methodology integrating HCI design into intelligent system design.
Formal and heuristic system decomposition methods in multidisciplinary synthesis. Ph.D. Thesis, 1991
NASA Technical Reports Server (NTRS)
Bloebaum, Christina L.
1991-01-01
The multidisciplinary interactions which exist in large scale engineering design problems provide a unique set of difficulties. These difficulties are associated primarily with unwieldy numbers of design variables and constraints, and with the interdependencies of the discipline analysis modules. Such obstacles require design techniques which account for the inherent disciplinary couplings in the analyses and optimizations. The objective of this work was to develop an efficient holistic design synthesis methodology that takes advantage of the synergistic nature of integrated design. A general decomposition approach for optimization of large engineering systems is presented. The method is particularly applicable for multidisciplinary design problems which are characterized by closely coupled interactions among discipline analyses. The advantage of subsystem modularity allows for implementation of specialized methods for analysis and optimization, computational efficiency, and the ability to incorporate human intervention and decision making in the form of an expert systems capability. The resulting approach is not a method applicable to only a specific situation, but rather, a methodology which can be used for a large class of engineering design problems in which the system is non-hierarchic in nature.
An Interoperability Framework and Capability Profiling for Manufacturing Software
NASA Astrophysics Data System (ADS)
Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.
ISO/TC184/SC5/WG4 is working on ISO 16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology that were proposed and developed through this international standardization activity. Within the context of a manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
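A minimal sketch of what a capability profile and a matching rule might look like in code; the fields and the subset-based matching are illustrative assumptions, since ISO 16100 defines its own profile templates and schemas.

```python
from dataclasses import dataclass, field

@dataclass
class CapabilityProfile:
    """Illustrative profile: unit name, manufacturing functions, and
    other class properties, echoing the elements named in the abstract."""
    unit_name: str
    manufacturing_functions: set = field(default_factory=set)
    properties: dict = field(default_factory=dict)

    def satisfies(self, required: "CapabilityProfile") -> bool:
        # A unit can interoperate for a requirement if it covers all
        # required functions (property matching elided for brevity).
        return required.manufacturing_functions <= self.manufacturing_functions

scheduler = CapabilityProfile("CellScheduler",
                              {"dispatching", "sequencing", "monitoring"})
need = CapabilityProfile("requirement", {"dispatching", "sequencing"})
print(scheduler.satisfies(need))  # True
```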
FAME, a microprocessor based front-end analysis and modeling environment
NASA Technical Reports Server (NTRS)
Rosenbaum, J. D.; Kutin, E. B.
1980-01-01
Higher order software (HOS) is a methodology for the specification and verification of large scale, complex, real time systems. The HOS methodology was implemented as FAME (front end analysis and modeling environment), a microprocessor based system for interactively developing, analyzing, and displaying system models in a low cost user-friendly environment. The nature of the model is such that when completed it can be the basis for projection to a variety of forms such as structured design diagrams, Petri-nets, data flow diagrams, and PSL/PSA source code. The user's interface with the analyzer is easily recognized by any current user of a structured modeling approach; therefore extensive training is unnecessary. Furthermore, when all the system capabilities are used one can check on proper usage of data types, functions, and control structures thereby adding a new dimension to the design process that will lead to better and more easily verified software designs.
NASA Astrophysics Data System (ADS)
Kouloumentas, Christos
2011-09-01
The concept of the all-fiberized multi-wavelength regenerator is analyzed, and the design methodology for operation at 40 Gb/s is presented. The specific methodology has been applied in the past for the experimental proof-of-principle of the technique, but it has never been reported in detail. The regenerator is based on a strong dispersion map that is implemented using alternating dispersion compensating fibers (DCF) and single-mode fibers (SMF), and minimizes the nonlinear interaction between the wavelength-division multiplexing (WDM) channels. The optimized regenerator design with +0.86 ps/nm/km average dispersion of the nonlinear fiber section is further investigated. The specific design is capable of simultaneously processing five WDM channels with 800 GHz channel spacing and providing Q-factor improvement higher than 1 dB for each channel. The cascadeability of the regenerator is also indicated using a 6-node metropolitan network simulation model.
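The quoted path-average dispersion is simply the length-weighted mean of the map's segments; a small sketch with typical textbook fiber values (not the paper's actual map parameters) shows the computation.

```python
# (D in ps/nm/km, length in km) per segment of one dispersion-map period;
# the values below are illustrative, not those of the reported design.
spans = [(-100.0, 2.0),   # dispersion compensating fiber (DCF)
         (17.0, 12.0)]    # single-mode fiber (SMF)

total_len = sum(length for _, length in spans)
avg_D = sum(D * length for D, length in spans) / total_len
print(f"average dispersion: {avg_D:+.2f} ps/nm/km")
```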
System Level Aerothermal Testing for the Adaptive Deployable Entry and Placement Technology (ADEPT)
NASA Technical Reports Server (NTRS)
Cassell, Alan; Gorbunov, Sergey; Yount, Bryan; Prabhu, Dinesh; de Jong, Maxim; Boghozian, Tane; Hui, Frank; Chen, Y.-K.; Kruger, Carl; Poteet, Carl;
2016-01-01
The Adaptive Deployable Entry and Placement Technology (ADEPT), a mechanically deployable entry vehicle technology, has been under development at NASA since 2011. As part of the technical maturation of ADEPT, designs capable of delivering small payloads (10 kg) are being considered to rapidly mature sub-1 m deployed-diameter designs. The unique capability of ADEPT for small payloads comes from its ability to stow within a slender volume and deploy to achieve a mass-efficient drag surface with a high heat rate capability. The low ballistic coefficient results in entry heating and mechanical loads that can be met by a revolutionary three-dimensionally woven carbon fabric supported by a deployable skeleton structure. This carbon fabric has test-proven capability as both primary structure and payload thermal protection system. In order to rapidly advance ADEPT's technical maturation, the project is developing test methods that enable thermostructural design requirement verification of ADEPT designs at the system level using ground test facilities. Results from these tests are also relevant to larger-class missions and help define areas of focused component-level testing in order to mature material and thermal response design codes. The ability to ground test sub-1 m diameter ADEPT configurations at or near full scale provides significant value to the rapid maturation of this class of deployable entry vehicles. This paper will summarize arc jet test results, highlight design challenges, provide a summary of lessons learned, and discuss future test approaches based upon this methodology.
NASA Technical Reports Server (NTRS)
Ivanco, Thomas G.; Sekula, Martin K.; Piatak, David J.; Simmons, Scott A.; Babel, Walter C.; Collins, Jesse G.; Ramey, James M.; Heald, Dean M.
2016-01-01
A data acquisition system upgrade project, known as AB-DAS, is underway at the NASA Langley Transonic Dynamics Tunnel. AB-DAS will soon serve as the primary data system and will substantially increase the scan-rate capabilities and analog channel count while maintaining other unique aeroelastic and dynamic test capabilities required of the facility. AB-DAS is configurable, adaptable, and enables buffet and aeroacoustic tests by synchronously scanning all analog channels and recording the high scan-rate time history values for each data quantity. AB-DAS is currently available for use as a stand-alone data system with limited capabilities while development continues. This paper describes AB-DAS, the design methodology, and the current features and capabilities. It also outlines the future work and projected capabilities following completion of the data system upgrade project.
Electronic Systems for Spacecraft Vehicles: Required EDA Tools
NASA Technical Reports Server (NTRS)
Bachnak, Rafic
1999-01-01
The continuous increase in complexity of electronic systems is making the design and manufacturing of such systems more challenging than ever before. As a result, designers are finding it impossible to design efficient systems without the use of sophisticated Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and lead to a correct-by-design methodology. This report identifies the EDA tools that would be needed to design, analyze, simulate, and evaluate electronic systems for spacecraft vehicles. In addition, the report presents recommendations to enhance the current JSC electronic design capabilities. This includes cost information and a discussion as to the impact, both positive and negative, of implementing the recommendations.
The development of a test methodology for the evaluation of EVA gloves
NASA Technical Reports Server (NTRS)
O'Hara, John M.; Cleland, John; Winfield, Dan
1988-01-01
This paper describes the development of a standardized set of tests designed to assess EVA-gloved hand capabilities in six measurement domains: range of motion, strength, tactile perception, dexterity, fatigue, and comfort. Based upon an assessment of general human-hand functioning and EVA task requirements, several tests within each measurement domain were developed to provide a comprehensive evaluation. All tests were designed to be conducted in a glove box with the bare hand as a baseline and the EVA glove at operating pressure.
A methodology to derive Synthetic Design Hydrographs for river flood management
NASA Astrophysics Data System (ADS)
Tomirotti, Massimo; Mignosa, Paolo
2017-12-01
The design of flood protection measures requires in many cases not only the estimation of the peak discharges, but also of the volume of the floods and its time distribution. A typical solution to this kind of problem is the formulation of Synthetic Design Hydrographs (SDHs). In this paper a methodology to derive SDHs is proposed on the basis of the estimation of the Flow Duration Frequency (FDF) reduction curve and of a Peak-Duration (PD) relationship, furnishing, respectively, the quantiles of the maximum average discharge and the average peak position for each duration. The methodology is intended to synthesize the main features of the historical floods in a unique SDH for each return period. The shape of the SDH is not selected a priori but results from the behaviour of the FDF and PD curves, allowing the variability of the shapes of the observed hydrographs to be accounted for in a very convenient way at the local time scale. The validation of the methodology is performed with reference to flood routing problems in reservoirs, lakes and rivers. The results obtained demonstrate the capability of the SDHs to describe the effects of different hydraulic systems on the statistical regime of floods, even in the presence of strong modifications induced on the probability distribution of peak flows.
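A sketch of the FDF/PD construction described: the flood volume within the most intense duration D is W(D) = D·Q(D,T), so dW/dD gives the discharge level at the two ends of that duration, and the peak position r splits D into portions before and after the peak. The FDF quantile curve and peak-position value below are invented stand-ins for curves estimated from flow records.

```python
import numpy as np

D = np.linspace(0.25, 48.0, 192)             # durations, h
Q = 500.0 * (1.0 + D / 6.0) ** -0.7          # assumed FDF quantiles, m^3/s
r = 0.35                                     # assumed mean peak position

W = D * Q                                    # volume of the most intense D
q = np.gradient(W, D)                        # ordinate at both ends of D

# Place each ordinate q(D) at a time -r*D before the peak and (1-r)*D
# after it, then assemble rising and falling limbs into one hydrograph:
t_sdh = np.concatenate([(-r * D)[::-1], (1.0 - r) * D])
q_sdh = np.concatenate([q[::-1], q])         # one SDH per return period
```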
Fault Injection and Monitoring Capability for a Fault-Tolerant Distributed Computation System
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo; Yates, Amy M.; Malekpour, Mahyar R.
2010-01-01
The Configurable Fault-Injection and Monitoring System (CFIMS) is intended for the experimental characterization of effects caused by a variety of adverse conditions on a distributed computation system running flight control applications. A product of research collaboration between NASA Langley Research Center and Old Dominion University, the CFIMS is the main research tool for generating actual fault response data with which to develop and validate analytical performance models and design methodologies for the mitigation of fault effects in distributed flight control systems. Rather than a fixed design solution, the CFIMS is a flexible system that enables the systematic exploration of the problem space and can be adapted to meet the evolving needs of the research. The CFIMS has the capabilities of system-under-test (SUT) functional stimulus generation, fault injection and state monitoring, all of which are supported by a configuration capability for setting up the system as desired for a particular experiment. This report summarizes the work accomplished so far in the development of the CFIMS concept and documents the first design realization.
NASA Astrophysics Data System (ADS)
Iacobucci, Joseph V.
The research objective of this manuscript is to develop a Rapid Architecture Alternative Modeling (RAAM) methodology to enable traceable Pre-Milestone A decision making during the conceptual design phase of a system of systems. Rather than following current trends that place an emphasis on adding more analysis, which tends to increase the complexity of the decision-making problem, RAAM improves on current methods by reducing both runtime and model creation complexity. RAAM draws upon principles from computer science, system architecting, and domain-specific languages to enable the automatic generation and evaluation of architecture alternatives. For example, both mission-dependent and mission-independent metrics are considered. Mission-dependent metrics are determined by the performance of systems accomplishing a task, such as Probability of Success. In contrast, mission-independent metrics, such as acquisition cost, are solely determined and influenced by the other systems in the portfolio. RAAM also leverages advances in parallel computing to significantly reduce runtime by defining executable models that are readily amenable to parallelization. This allows the use of cloud computing infrastructures such as Amazon's Elastic Compute Cloud and the PASTEC cluster operated by the Georgia Institute of Technology Research Institute (GTRI). Also, the amount of data that can be generated when fully exploring the design space can quickly exceed the typical capacity of computational resources at the analyst's disposal. To counter this, specific algorithms and techniques are employed. Streaming algorithms and recursive architecture alternative evaluation algorithms are used to reduce computer memory requirements. Lastly, a domain-specific language is created to reduce the computational time of executing the system of systems models. A domain-specific language is a small, usually declarative language that offers expressive power focused on a particular problem domain by establishing an effective means to communicate the semantics of the RAAM framework. These techniques make it possible to include diverse multi-metric models within the RAAM framework in addition to system- and operational-level trades. A canonical example was used to explore the uses of the methodology. The canonical example contains all of the features of a full system of systems architecture analysis study but uses fewer tasks and systems. Using RAAM with the canonical example, it was possible to consider both system- and operational-level trades in the same analysis. Once the methodology had been tested with the canonical example, a Suppression of Enemy Air Defenses (SEAD) capability model was developed. Due to the sensitive nature of analyses on that subject, notional data were developed. The notional data have trends and properties similar to realistic Suppression of Enemy Air Defenses data. RAAM was shown to be traceable and provided a mechanism for a unified treatment of a variety of metrics. The SEAD capability model demonstrated lower computer runtimes and reduced model creation complexity as compared to methods currently in use. To determine the usefulness of the implementation of the methodology on current computing hardware, RAAM was tested with system of systems architecture studies of different sizes. This was necessary since a system of systems may be called upon to accomplish thousands of tasks.
It has been clearly demonstrated that RAAM is able to enumerate and evaluate the types of large, complex design spaces usually encountered in capability-based design, oftentimes providing the ability to efficiently search the entire decision space. The core algorithms for generation and evaluation of alternatives scale linearly with expected problem sizes. The SEAD capability model outputs prompted the discovery of a new issue: the data storage and manipulation requirements for an analysis. Two strategies were developed to counter large data sizes, the use of portfolio views and top 'n' analysis. This proved the usefulness of the RAAM framework and methodology during Pre-Milestone A capability-based analysis. (Abstract shortened by UMI.)
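In the spirit of RAAM, a minimal sketch of enumerating architecture alternatives as assignments of systems to tasks and streaming over them so that only the running top-'n' portfolios stay in memory; the tasks, systems, and scores are toy stand-ins, not the notional SEAD data.

```python
from itertools import product

tasks = ["find", "fix", "engage"]
systems = {"find": ["sat", "uav"], "fix": ["uav", "fighter"],
           "engage": ["fighter", "missile"]}
p_success = {"sat": 0.7, "uav": 0.8, "fighter": 0.9, "missile": 0.85}
cost = {"sat": 900, "uav": 120, "fighter": 400, "missile": 60}

def evaluate(alt):
    p = 1.0
    for s in alt:
        p *= p_success[s]                  # mission-dependent metric
    c = sum(cost[s] for s in set(alt))     # mission-independent metric
    return p, c

top_n, n = [], 5
for alt in product(*(systems[t] for t in tasks)):   # streaming enumeration
    p, c = evaluate(alt)
    top_n.append((p / c, alt))
    top_n = sorted(top_n, reverse=True)[:n]         # keep only the top 'n'
print(top_n)
```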
Kukec, Andreja; Boznar, Marija Z; Mlakar, Primoz; Grasic, Bostjan; Herakovic, Andrej; Zadnik, Vesna; Zaletel-Kragelj, Lijana; Farkas, Jerneja; Erzen, Ivan
2014-05-01
Atmospheric air pollution research in complex terrain is challenged by the lack of appropriate methodology supporting the analysis of the spatial relationship between phenomena affected by a multitude of factors. The key is the optimal design of a meaningful approach based on small spatial units of observation. The Zasavje region, Slovenia, was chosen as the study area, with the main objective of investigating in practice the role of such units in a test environment. The process consisted of three steps: modelling of pollution in the atmosphere with dispersion models, transfer of the results to geographical information system software, and final determination of the function of the small spatial units. A methodology capable of designing useful units for atmospheric air pollution research in highly complex terrain was created, and the results were deemed useful in offering starting points for further research in the field of geospatial health.
Space Station man-machine automation trade-off analysis
NASA Technical Reports Server (NTRS)
Zimmerman, W. F.; Bard, J.; Feinberg, A.
1985-01-01
The man-machine automation tradeoff methodology presented here is one of four research tasks comprising the autonomous spacecraft system technology (ASST) project. ASST was established to identify and study system-level design problems for autonomous spacecraft. Using the Space Station as an example spacecraft system requiring a certain level of autonomous control, a system-level, man-machine automation tradeoff methodology is presented that: (1) optimizes man-machine mixes for different ground and on-orbit crew functions subject to cost, safety, weight, power, and reliability constraints, and (2) plots the best incorporation plan for new, emerging technologies by weighing cost, relative availability, reliability, safety, importance to out-year missions, and ease of retrofit. Although the methodology takes a fairly straightforward approach to valuing human productivity, it is still sensitive to the important subtleties associated with designing a well-integrated man-machine system. These subtleties include considerations such as crew preference to retain certain spacecraft control functions, or valuing human integration/decision capabilities over equivalent hardware/software where appropriate.
NASA Technical Reports Server (NTRS)
Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.; Wolfe, W. J.; Nguyen, T.
1986-01-01
The purpose of the Robotic Simulation (ROBSIM) program is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotic systems. ROBSIM is programmed in FORTRAN 77 and implemented on a VAX 11/750 computer using the VMS operating system. The programmer's guide describes the ROBSIM implementation and program logic flow, and the functions and structures of the different subroutines. With the manual and the in-code documentation, an experienced programmer can incorporate additional routines and modify existing ones to add desired capabilities.
Space Shuttle Orbiter oxygen partial pressure sensing and control system improvements
NASA Technical Reports Server (NTRS)
Frampton, Robert F.; Hoy, Dennis M.; Kelly, Kevin J.; Walleshauser, James J.
1992-01-01
A program aimed at developing a new PPO2 oxygen sensor and a replacement amplifier for the Space Shuttle Orbiter is described. Experimental design methodologies used in the test and modeling process made it possible to enhance the effectiveness of the program and to reduce its cost. Significant cost savings are due to the increased lifetime of the basic sensor cell, the maximization of useful sensor life through an increased amplifier gain adjustment capability, the use of streamlined production processes for the manufacture of the assemblies, and the refurbishment capability of the replacement sensor.
NASA Technical Reports Server (NTRS)
Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.
1984-01-01
The purpose of the Robotics Simulation (ROBSIM) program is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotic systems. ROBSIM is programmed in FORTRAN 77 and implemented on a VAX 11/750 computer using the VMS operating system. This programmer's guide describes the ROBSIM implementation and program logic flow, and the functions and structures of the different subroutines. With this manual and the in-code documentation, an experienced programmer can incorporate additional routines and modify existing ones to add desired capabilities.
High-performance radial AMTEC cell design for ultra-high-power solar AMTEC systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendricks, T.J.; Huang, C.
1999-07-01
Alkali Metal Thermal to Electric Conversion (AMTEC) technology is rapidly maturing for potential application in ultra-high-power solar AMTEC systems required by potential future US Air Force (USAF) spacecraft missions in medium-earth and geosynchronous orbits (MEO and GEO). Solar thermal AMTEC power systems potentially have several important advantages over current solar photovoltaic power systems in ultra-high-power spacecraft applications for USAF MEO and GEO missions. This work presents key aspects of radial AMTEC cell design for achieving high cell performance in solar AMTEC systems delivering more than 50 kW(e) to support high-power USAF missions. These missions typically require AMTEC cell conversion efficiency greater than 25%. A sophisticated design parameter methodology is described and demonstrated which establishes optimum design parameters in any radial cell design to satisfy high-power mission requirements. Specific relationships, which are distinct functions of cell temperatures and pressures, define critical dependencies between key cell design parameters, particularly the impact of parasitic thermal losses on Beta Alumina Solid Electrolyte (BASE) area requirements, voltage, number of BASE tubes, and system power production for both maximum power-per-BASE-area and optimum efficiency conditions. Finally, some high-level system tradeoffs are demonstrated using the design parameter methodology to establish high-power radial cell design requirements and philosophy. The discussion highlights how to incorporate this methodology with sophisticated SINDA/FLUINT AMTEC cell modeling capabilities to determine optimum radial AMTEC cell designs.
Rotorcraft Brownout: Advanced Understanding, Control and Mitigation
2008-12-31
...the Gauss-Seidel iterative method. The overall steps of the SIMPLER algorithm can be summarized as: 1. guess the velocity field; 2. calculate the momentum... techniques and numerical methods, and the team will begin to develop a methodology that is capable of integrating these solutions and highlighting... rotorcraft design optimization techniques will then be undertaken using the validated computational methods.
The Effects of University Mergers in China since 1990s: From the Perspective of Knowledge Production
ERIC Educational Resources Information Center
Mao, Ya-qing; Du, Yuan; Liu, Jing-juan
2009-01-01
Purpose: The purpose of this paper is to discover and better understand the efficiency of university mergers from the perspective of knowledge production, with the research capability as the point of contact. Design/methodology/approach: In total, 20 colleges and universities directly under the central ministries that merged in 2000 were taken as…
NASA Technical Reports Server (NTRS)
Mojarradi, Mohammad M.; Kolawa, Elizabeth; Blalock, Benjamin; Johnson, R. Wayne
2005-01-01
Next generation space-based robotics systems will be constructed using distributed architectures where electronics capable of working in the extreme environments of the planets of the solar system are integrated with the sensors and actuators in plug-and-play modules and are connected through common multiple redundant data and power buses.
New methods and materials for molding and casting ice formations
NASA Technical Reports Server (NTRS)
Reehorst, Andrew L.; Richter, G. Paul
1987-01-01
This study was designed to find improved materials and techniques for molding and casting natural or simulated ice shapes that could replace the wax and plaster method. By utilizing modern molding and casting materials and techniques, a new methodology was developed that provides excellent reproduction, low-temperature capability, and reasonable turnaround time. The resulting casts are accurate and tough.
Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel
2013-06-01
Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.
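A minimal sketch of the pooling step behind the k-best idea: take the k best solutions of each single-objective problem, pool them, and keep the non-dominated ones. The paper's theoretical conditions govern how large k must be for the pooled set to contain the complete Pareto front; here k is simply assumed large enough, and the candidates are toy (cost, time) pairs.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and differs
    somewhere (minimization in all objectives assumed)."""
    return all(x <= y for x, y in zip(a, b)) and a != b

def pareto_front(candidates):
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates)]

# Invented candidates: the k best solutions per single-objective problem
k_best_cost = [(3, 9), (4, 7), (5, 8)]   # best by objective 1 (cost)
k_best_time = [(8, 2), (6, 4), (4, 7)]   # best by objective 2 (time)
print(pareto_front(list(set(k_best_cost + k_best_time))))
```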
Multifunctional Collaborative Modeling and Analysis Methods in Engineering Science
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.; Broduer, Steve (Technical Monitor)
2001-01-01
Engineers are challenged to produce better designs in less time and for less cost. Hence, to investigate novel and revolutionary design concepts, accurate, high-fidelity results must be assimilated rapidly into the design, analysis, and simulation process. This assimilation should consider diverse mathematical modeling and multi-discipline interactions necessitated by concepts exploiting advanced materials and structures. Integrated high-fidelity methods with diverse engineering applications provide the enabling technologies to assimilate these high-fidelity, multi-disciplinary results rapidly at an early stage in the design. These integrated methods must be multifunctional, collaborative, and applicable to the general field of engineering science and mechanics. Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as most robust. The multiple-method approach is advantageous when interfacing diverse disciplines in which each method's strengths are utilized. The multifunctional methodology presented provides an effective mechanism by which domains with diverse idealizations are interfaced. This capability rapidly provides the high-fidelity results needed in the early design phase. Moreover, the capability is applicable to the general field of engineering science and mechanics. Hence, it provides a collaborative capability that accounts for interactions among engineering analysis methods.
The development of capability measures in health economics: opportunities, challenges and progress.
Coast, Joanna; Kinghorn, Philip; Mitchell, Paul
2015-04-01
Recent years have seen increased engagement amongst health economists with the capability approach developed by Amartya Sen and others. This paper focuses on the capability approach in relation to the evaluative space used for analysis within health economics. It considers the opportunities that the capability approach offers in extending this space, but also the methodological challenges associated with moving from the theoretical concepts to practical empirical applications. The paper then examines three 'families' of measures, Oxford Capability instruments (OxCap), Adult Social Care Outcome Toolkit (ASCOT) and ICEpop CAPability (ICECAP), in terms of the methodological choices made in each case. The paper concludes by discussing some of the broader issues involved in making use of the capability approach in health economics. It also suggests that continued exploration of the impact of different methodological choices will be important in moving forward.
A preliminary study of solar powered aircraft and associated power trains
NASA Technical Reports Server (NTRS)
Hall, D. W.; Fortenbach, C. D.; Dimiceli, E. V.; Parks, R. W.
1983-01-01
The feasibility of regeneratively powered solar high altitude powered platform (HAPP) remotely piloted vehicles was assessed. Those technologies which must be pursued to make long duration solar HAPPs feasible are recommended. A methodology which involved characterization and parametric analysis of roughly two dozen variables to determine vehicles capable of fulfilling the primary mission is defined. One of these vehicles was then conceptually designed. Variations of each major design parameter were investigated along with state-of-the-art changes in power train component capabilities. The midlatitude mission studied would be attainable by a solar HAPP if fuel cell, electrolyzer and photovoltaic technologies are pursued. Vehicles will be very large and have very lightweight structures in order to attain the combinations of altitude and duration required by the primary mission.
Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul Ma; Scharf, Thomas; Quinlan, Leo R; ÓLaighin, Gearóid
2017-03-16
Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. We report a successful implementation of the methodology for the design and development of a system for detecting and predicting falls in older adults. We describe in detail what testing and evaluation activities we carried out to effectively test the system and overcome usability and human factors problems. We feel this methodology can be applied to a wide variety of connected health devices and systems. We consider this a methodology that can be scaled to different-sized projects accordingly.
New mission requirements methodologies for services provided by the Office of Space Communications
NASA Technical Reports Server (NTRS)
Holmes, Dwight P.; Hall, J. R.; Macoughtry, William; Spearing, Robert
1993-01-01
The Office of Space Communications, NASA Headquarters, has recently revised its methodology for receiving, accepting and responding to customer requests for use of that office's tracking and communications capabilities. This revision is the result of a process which has become over-burdened by the size of the currently active and proposed mission set, requirements reviews that focus on single missions rather than on mission sets, and negotiations most often not completed early enough to effect needed additions to capacity or capability prior to launch. The requirements-coverage methodology described is more responsive to project/program needs and provides integrated input into the NASA budget process early enough to effect change; the mechanisms and tools in place to ensure a value-added process benefiting both NASA and its customers are also described. Key features of the requirements methodology include the establishment of a mechanism for early identification of and system trades with new customers, and the delegation of review and approval of requirements documents from Headquarters to the NASA centers, thus empowering the system design teams to establish and negotiate the detailed requirements with the user. A Mission Requirements Request (MRR) is introduced to facilitate early customer interaction. The expected result is that the time to achieve an approved set of implementation requirements which meet the customer's needs can be greatly reduced. Finally, by increasing the discipline in requirements management, through the use of baselining procedures, a tighter coupling between customer requirements and the budget is provided. A twice-yearly projection of customer requirements accommodation, designated as the Capacity Projection Plan (CPP), provides customer feedback allowing the entire mission set to be serviced.
ARCHITECT: The architecture-based technology evaluation and capability tradeoff method
NASA Astrophysics Data System (ADS)
Griendling, Kelly A.
The use of architectures for the design, development, and documentation of system-of-systems engineering has become a common practice in recent years. This practice became mandatory in the defense industry in 2004 when the Department of Defense Architecture Framework (DoDAF) Promulgation Memo mandated that all Department of Defense (DoD) architectures must be DoDAF compliant. Despite this mandate, there has been significant confusion and a lack of consistency in the creation and the use of the architecture products. Products are typically created as static documents used for communication and documentation purposes that are difficult to change and do not support engineering design activities and acquisition decision making. At the same time, acquisition guidance has been recently reformed to move from the bottom-up approach of the Requirements Generation System (RGS) to the top-down approach mandated by the Joint Capabilities Integration and Development System (JCIDS), which requires the use of DoDAF to support acquisition. Defense agencies have had difficulty adjusting to this new policy, and are struggling to determine how to meet new acquisition requirements. This research has developed the Architecture-based Technology Evaluation and Capability Tradeoff (ARCHITECT) Methodology to respond to these challenges and address concerns raised about the defense acquisition process, particularly the time required to implement parts of the process, the need to evaluate solutions across capability and mission areas, and the need to use a rigorous, traceable, repeatable method that utilizes modeling and simulation to better substantiate early-phase acquisition decisions. The objective is to create a capability-based systems engineering methodology for the early phases of design and acquisition (specifically Pre-Milestone A activities) which improves agility in defense acquisition by (1) streamlining the development of key elements of JCIDS and DoDAF, (2) moving the creation of DoDAF products forward in the defense acquisition process, and (3) using DoDAF products for more than documentation by integrating them into the problem definition and analysis of alternatives phases and applying executable architecting. This research proposes and demonstrates the plausibility of a prescriptive methodology for developing executable DoDAF products which will explicitly support decision-making in the early phases of JCIDS. A set of criteria by which CBAs should be judged is proposed, and the methodology is developed with these criteria in mind. The methodology integrates existing tools and techniques for systems engineering and system of systems engineering with several new modeling and simulation tools and techniques developed as part of this research to fill gaps noted in prior CBAs. A suppression of enemy air defenses (SEAD) mission is used to demonstrate the application of ARCHITECT and to show the plausibility of the approach. For the SEAD study, metrics are derived and a gap analysis is performed. The study then identifies and quantitatively compares system and operational architecture alternatives for performing SEAD. A series of down-selections is performed to identify promising architectures, and these promising solutions are subject to further analysis where the impacts of force structure and network structure are examined.
While the numerical results of the SEAD study are notional and could not be applied to an actual SEAD CBA, the example served to highlight many of the salient features of the methodology. The SEAD study presented enabled pre-Milestone A tradeoffs to be performed quantitatively across a large number of architectural alternatives in a traceable and repeatable manner. The alternatives considered included variations on operations, systems, organizational responsibilities (through the assignment of systems to tasks), network (or collaboration) structure, interoperability level, and force structure. All of the information used in the study is preserved in the environment, which is dynamic and allows for on-the-fly analysis. The assumptions used were consistent, which was assured through the use of a single file documenting all inputs, shared across all models. Furthermore, a model was made of the ARCHITECT methodology itself and was used to demonstrate that even if the steps took twice as long to perform as they did in the SEAD example, the methodology still provides the ability to conduct CBA analyses in less time than prior CBAs. Overall, it is shown that the ARCHITECT methodology results in an improvement over current CBAs in the criteria developed here.
RF power harvesting: a review on designing methodologies and applications
NASA Astrophysics Data System (ADS)
Tran, Le-Giang; Cha, Hyouk-Kyu; Park, Woo-Tae
2017-12-01
Wireless power transmission was conceptualized nearly a century ago. Certain achievements made to date have made power harvesting a reality, capable of providing alternative sources of energy. This review provides a summary of radio frequency (RF) power harvesting technologies in order to serve as a guide for the design of RF energy harvesting units. Since energy harvesting circuits are designed to operate with relatively small voltages and currents, they rely on state-of-the-art electrical technology for obtaining high efficiency. Thus, comprehensive analysis and discussions of various designs and their tradeoffs are included. Finally, recent applications of RF power harvesting are outlined.
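As a hedged back-of-the-envelope sketch of the small input powers such harvesting circuits must work with, the free-space Friis equation gives the available RF power (the numbers below are assumed for illustration, not taken from the review):

```python
import math

def friis_received_power(p_tx_w, g_tx, g_rx, freq_hz, dist_m):
    """Free-space received power: Pr = Pt * Gt * Gr * (lambda / (4*pi*d))**2."""
    wavelength = 3.0e8 / freq_hz
    return p_tx_w * g_tx * g_rx * (wavelength / (4 * math.pi * dist_m)) ** 2

# Assumed example: 1 W source at 915 MHz, 10 m away, ~2 dBi harvesting antenna
p_rx = friis_received_power(p_tx_w=1.0, g_tx=1.0, g_rx=1.58, freq_hz=915e6, dist_m=10.0)
print(f"received power ~ {p_rx * 1e6:.1f} microwatts")
```

At roughly ten microwatts for this scenario, rectifier and power-management efficiency clearly dominates the usable output, which is why the review emphasizes circuit-level tradeoffs.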
Pyrolysis Model Development for a Multilayer Floor Covering
McKinnon, Mark B.; Stoliarov, Stanislav I.
2015-01-01
Comprehensive pyrolysis models that are integral to computational fire codes have improved significantly over the past decade as the demand for improved predictive capabilities has increased. High fidelity pyrolysis models may improve the design of engineered materials for better fire response, the design of the built environment, and may be used in forensic investigations of fire events. A major limitation to widespread use of comprehensive pyrolysis models is the large number of parameters required to fully define a material and the lack of effective methodologies for measurement of these parameters, especially for complex materials. The work presented here details a methodology used to characterize the pyrolysis of a low-pile carpet tile, an engineered composite material that is common in commercial and institutional occupancies. The studied material includes three distinct layers of varying composition and physical structure. The methodology utilized a comprehensive pyrolysis model (ThermaKin) to conduct inverse analyses on data collected through several experimental techniques. Each layer of the composite was individually parameterized to identify its contribution to the overall response of the composite. The set of properties measured to define the carpet composite were validated against mass loss rate curves collected at conditions outside the range of calibration conditions to demonstrate the predictive capabilities of the model. The mean error between the predicted curve and the mean experimental mass loss rate curve was calculated as approximately 20% on average for heat fluxes ranging from 30 to 70 kW·m−2, which is within the mean experimental uncertainty.
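The inverse-analysis step, i.e. tuning model parameters until simulated mass loss rate matches experiment, can be sketched generically. The toy example below fits a single first-order decomposition to a synthetic curve with SciPy; ThermaKin's actual multi-layer, multi-reaction formulation is far richer:

```python
import numpy as np
from scipy.optimize import minimize

def mlr_model(t, k, m0):
    """Mass loss rate of a single first-order decomposition reaction,
    dm/dt = -k*m, so MLR(t) = m0 * k * exp(-k*t)."""
    return m0 * k * np.exp(-k * t)

# Synthetic "measured" curve standing in for a bench-scale test
t = np.linspace(0.0, 300.0, 61)
rng = np.random.default_rng(0)
measured = mlr_model(t, k=0.02, m0=1.0) + rng.normal(0.0, 2e-4, t.size)

def cost(params):
    k, m0 = params
    return np.sum((mlr_model(t, k, m0) - measured) ** 2)

fit = minimize(cost, x0=[0.05, 0.5], method="Nelder-Mead")
print(fit.x)  # ~ [0.02, 1.0], the values used to generate the data
```

The paper's validation logic follows the same pattern: parameters calibrated at one set of conditions are judged by prediction error at conditions outside the calibration range.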
Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide
NASA Astrophysics Data System (ADS)
Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.
1993-03-01
A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960's and early 1970's and are highly unique to that design or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with the NESS, presents an example problem, and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.
Vicentini, Federico; Pedrocchi, Nicola; Malosio, Matteo; Molinari Tosatti, Lorenzo
2014-09-01
Robot-assisted neurorehabilitation often involves networked systems of sensors ("sensory rooms") and powerful devices in physical interaction with weak users. Safety is unquestionably a primary concern. Some lightweight robot platforms and devices designed for this purpose include safety properties using redundant sensors or intrinsically safe design (e.g. compliance and backdrivability, limited exchange of energy). Nonetheless, the entire "sensory room" is required to be fail-safe and safely monitored as a system at large. Yet, the sensor capabilities and control algorithms used in functional therapies require, in general, frequent updates or re-configurations, making a safety-grade release of such devices hardly sustainable in terms of cost-effectiveness and development time. As a result, promising integrated platforms for human-in-the-loop therapies have not found clinical application and manufacturing support because global fail-safe properties could not be maintained. Within the general context of cross-machinery safety standards, the paper presents a methodology called SafeNet for helping extend the safety of Human Robot Interaction (HRI) systems that use unsafe components, including sensors and controllers. SafeNet considers, in fact, the robotic system as a device at large and applies the principles of functional safety (as in ISO 13849-1) through a set of architectural procedures and implementation rules. The enabled capability of monitoring a network of unsafe devices through redundant computational nodes allows the use of any custom sensors and algorithms, usually planned and assembled at therapy planning-time rather than at platform design-time. A case study is presented with an actual implementation of the proposed methodology. A specific architectural solution is applied to an example of robot-assisted upper-limb rehabilitation with online motion tracking.
An Assessment Methodology to Evaluate In-Flight Engine Health Management Effectiveness
NASA Astrophysics Data System (ADS)
Maggio, Gaspare; Belyeu, Rebecca; Pelaccio, Dennis G.
2002-01-01
This study addresses the in-flight effectiveness of candidate engine health management system concepts. A next generation engine health management system will be required to be both reliable and robust in terms of anomaly detection capability. The system must be able to operate successfully in the hostile, high-stress engine system environment. This implies that its system components, such as the instrumentation, process and control, and vehicle interface and support subsystems, must be highly reliable. Additionally, the system must be able to address a vast range of possible engine operation anomalies through a host of different types of measurements supported by a fast algorithm/architecture processing capability that can identify "true" (real) engine operation anomalies. False anomaly condition reports for such a system must be essentially eliminated. The accuracy of identifying only real anomaly conditions has been an issue with the Space Shuttle Main Engine (SSME) in the past. Much improvement in many of the technologies to address these areas is required. The objectives of this study were to identify and demonstrate a consistent assessment methodology that can evaluate the capability of next generation engine health management system concepts to respond in a correct, timely manner to alleviate an operational engine anomaly condition during flight. Science Applications International Corporation (SAIC), with support from NASA Marshall Space Flight Center, identified a probabilistic modeling approach, built on deterministic anomaly-time event assessment, that can be applied in the engine preliminary design stage to assess engine health management system concept effectiveness. Much discussion in this paper focuses on the formulation and application approach in performing this assessment. This includes detailed discussion of key modeling assumptions, the overall assessment methodology approach identified, and the identification of key supporting engine health management system concept design/operation and fault mode information required to utilize this methodology. At the paper's conclusion, discussion focuses on a demonstration benchmark study that applied this methodology to the current SSME health management system. A summary of study results and lessons learned are provided. Recommendations for future work in this area are also identified at the conclusion of the paper.
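A toy Monte Carlo sketch conveys the flavor of such a probabilistic, anomaly-time-based assessment (the distributions and probabilities below are assumptions for illustration, not values from the study): an anomaly is mitigated only if detection plus response completes before the anomaly propagates to failure.

```python
import random

def mitigation_probability(n_trials=100_000, p_true_detect=0.95, seed=1):
    """Toy Monte Carlo: fraction of anomalies detected and acted on
    before propagation to failure. All distributions are assumed."""
    rng = random.Random(seed)
    mitigated = 0
    for _ in range(n_trials):
        t_failure = rng.lognormvariate(0.0, 0.5)   # s, anomaly-to-failure time
        t_detect = rng.uniform(0.05, 0.5)          # s, sensing + algorithm latency
        t_respond = rng.uniform(0.1, 0.3)          # s, actuation/shutdown time
        detected = rng.random() < p_true_detect    # missed detections included
        if detected and (t_detect + t_respond) < t_failure:
            mitigated += 1
    return mitigated / n_trials

print(f"estimated mitigation probability: {mitigation_probability():.3f}")
```

In a real assessment, the anomaly-time event data would come from engine fault-mode analyses rather than assumed distributions, which is exactly the supporting information the paper identifies as a prerequisite.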
Synthetic Gene Expression Circuits for Designing Precision Tools in Oncology
Re, Angela
2017-01-01
Precision medicine in oncology needs to enhance its capabilities to match diagnostic and therapeutic technologies to individual patients. Synthetic biology streamlines the design and construction of functionalized devices through standardization and rational engineering of basic biological elements decoupled from their natural context. Remarkable improvements have opened the prospects for the availability of synthetic devices of enhanced mechanism clarity, robustness, sensitivity, as well as scalability and portability, which might bring new capabilities in precision cancer medicine implementations. In this review, we begin by presenting a brief overview of some of the major advances in the engineering of synthetic genetic circuits aimed at the control of gene expression and operating at the transcriptional, post-transcriptional/translational, and post-translational levels. We then focus on engineering synthetic circuits as an enabling methodology for the successful establishment of precision technologies in oncology. We describe significant advancements in our capabilities to tailor synthetic genetic circuits to specific applications in tumor diagnosis, tumor cell- and gene-based therapy, and drug delivery.
Numerical study of external burning flowfields
NASA Technical Reports Server (NTRS)
Bittner, Robert D.; Mcclinton, Charles R.
1991-01-01
This paper demonstrates the successful application of CFD to modeling an external burning flowfield. The study used the 2D, 3D, and PNS versions of the SPARK code. Various grids, boundary conditions, and ignition methodologies have been employed. Flameholding was achieved through the use of a subsonic outflow condition and a hot block located behind the step to ignite the fuel. Since the resulting burning produces a large subsonic region downstream of the cowl, this entire surface can be pressurized to the level of the back pressure. An evaluation of interactions between the ramjet exhaust and the external burning products demonstrates the complexity of this design issue. The code is now capable of evaluating the external burning effectiveness for flight vehicles using simple injector schemes, and the methodology can be readily applied to other external burning designs.
pysimm: A Python Package for Simulation of Molecular Systems
NASA Astrophysics Data System (ADS)
Fortunato, Michael; Colina, Coray
pysimm, short for python simulation interface for molecular modeling, is a python package designed to facilitate the structure generation and simulation of molecular systems through convenient and programmatic access to object-oriented representations of molecular system data. This poster presents core features of pysimm and design philosophies that highlight a generalized methodology for incorporation of third-party software packages through API interfaces. The integration with the LAMMPS simulation package is explained to demonstrate this methodology. pysimm began as a back-end python library that powered a cloud-based application on nanohub.org for amorphous polymer simulation. The extension from a specific application library to a general-purpose simulation interface is explained. Additionally, this poster highlights the rapid development of new applications to construct polymer chains capable of controlling chain morphology such as molecular weight distribution and monomer composition.
The Defense Threat Reduction Agency's Technical Nuclear Forensics Research and Development Program
NASA Astrophysics Data System (ADS)
Franks, J.
2015-12-01
The Defense Threat Reduction Agency (DTRA) Technical Nuclear Forensics (TNF) Research and Development (R&D) Program's overarching goal is to design, develop, demonstrate, and transition advanced technologies and methodologies that improve the interagency operational capability to provide forensics conclusions after the detonation of a nuclear device. This goal is attained through the execution of three focus areas covering the span of the TNF process to enable strategic decision-making (attribution): Nuclear Forensic Materials Exploitation - development of targeted technologies, methodologies and tools enabling the timely collection, analysis and interpretation of detonation materials. Prompt Nuclear Effects Exploitation - improvement of ground-based capabilities to collect prompt nuclear device outputs and effects data for rapid, complementary and corroborative information. Nuclear Forensics Device Characterization - development of a validated and verified capability to reverse model a nuclear device with high confidence from observables (e.g., prompt diagnostics, sample analysis, etc.) seen after an attack. This presentation will outline DTRA's TNF R&D strategy and current investments, with efforts focusing on: (1) introducing new technical data collection capabilities (e.g., ground-based prompt diagnostics sensor systems; innovative debris collection and analysis); (2) developing new TNF process paradigms and concepts of operations to decrease timelines and uncertainties, and increase results confidence; (3) enhanced validation and verification (V&V) of capabilities through technology evaluations and demonstrations; and (4) updated weapon output predictions to account for the modern threat environment. A key challenge to expanding these efforts to a global capability is the need for increased post-detonation TNF international cooperation, collaboration and peer reviews.
NASA Astrophysics Data System (ADS)
Canfield, Shawn; Edinger, Ben; Frecker, Mary I.; Koopmann, Gary H.
1999-06-01
Recent advances in robotics, tele-robotics, smart material actuators, and mechatronics raise new possibilities for innovative developments in millimeter-scale robotics capable of manipulating objects only fractions of a millimeter in size. These advances can have a wide range of applications in the biomedical community. A potential application of this technology is in minimally invasive surgery (MIS). The focus of this paper is the development of a single degree of freedom prototype to demonstrate the viability of smart materials, force feedback and compliant mechanisms for minimally invasive surgery. The prototype is a compliant gripper that is 7-mm by 17-mm, made from a single piece of titanium that is designed to function as a needle driver for small scale suturing. A custom designed piezoelectric 'inchworm' actuator drives the gripper. The integrated system is computer controlled providing a user interface device capable of force feedback. The design methodology described draws from recent advances in three emerging fields in engineering: design of innovative tools for MIS, design of compliant mechanisms, and design of smart materials and actuators. The focus of this paper is on the design of a millimeter-scale inchworm actuator for use with a compliant end effector in MIS.
The design and methodology of premature ejaculation interventional studies
2016-01-01
Large well-designed clinical efficacy and safety randomized clinical trials (RCTs) are required to achieve regulatory approval of new drug treatments. The objective of this article is to make recommendations for the criteria for defining and selecting the clinical trial study population, design and efficacy outcomes measures which comprise ideal premature ejaculation (PE) interventional trial methodology. Data on clinical trial design, epidemiology, definitions, dimensions and psychological impact of PE was reviewed, critiqued and incorporated into a series of recommendations for standardisation of PE clinical trial design, outcome measures and reporting using the principles of evidence based medicine. Data from PE interventional studies are only reliable, interpretable and capable of being generalised to patients with PE, when study populations are defined by the International Society for Sexual Medicine (ISSM) multivariate definition of PE. PE intervention trials should employ a double-blind RCT methodology and include placebo control, active standard drug control, and/or dose comparison trials. Ejaculatory latency time (ELT) and subject/partner outcome measures of control, personal/partner/relationship distress and other study-specific outcome measures should be used as outcome measures. There is currently no published literature which identifies a clinically significant threshold response to intervention. The ISSM definition of PE reflects the contemporary understanding of PE and represents the state-of-the-art multi-dimensional definition of PE and is recommended as the basis of diagnosis of PE for all PE clinical trials.
Benchmark Tests for Stirling Convertor Heater Head Life Assessment Conducted
NASA Technical Reports Server (NTRS)
Krause, David L.; Halford, Gary R.; Bowman, Randy R.
2004-01-01
A new in-house test capability has been developed at the NASA Glenn Research Center, where a critical component of the Stirling Radioisotope Generator (SRG) is undergoing extensive testing to aid the development of analytical life prediction methodology and to experimentally aid in verification of the flight-design component's life. The new facility includes two test rigs that are performing creep testing of the SRG heater head pressure vessel test articles at design temperature and with wall stresses ranging from operating level to seven times that level.
Turbine Seal Research at NASA GRC
NASA Technical Reports Server (NTRS)
Proctor, Margaret P.; Steinetz, Bruce M.; Delgado, Irebert R.; Hendricks, Robert C.
2011-01-01
Low-leakage, long-life turbomachinery seals are important to both Space and Aeronautics Missions: (1) increased payload capability, (2) decreased specific fuel consumption and emissions, and (3) decreased direct operating costs. NASA GRC has a history of significant accomplishments and collaboration with industry and academia in seals research. NASA's unique, state-of-the-art High Temperature, High Speed Turbine Seal Test Facility is an asset to the U.S. Engine / Seal Community. Current focus is on developing experimentally validated compliant, non-contacting, high temperature seal designs, analysis, and design methodologies to enable commercialization.
Conceptual Design of the Nuclear Electric Xenon Ion System (NEXIS)
NASA Technical Reports Server (NTRS)
Monheiser, Jeff; Polk, Jay; Randolph, Tom
2004-01-01
In support of the NEXIS program, Aerojet-Redmond Operations, with review and input from JPL and Boeing, has completed the design for a development model (DM) discharge chamber assembly and main discharge cathode assembly. These efforts, along with the work by JPL to develop the carbon-carbon-composite ion optics assembly, have resulted in a complete ion engine design. The goal of the NEXIS program is to significantly advance the current state of the art by developing an ion engine capable of operating at an input power of 20 kW and an Isp of 7500 sec, with a total xenon throughput capability of 2000 kg. In this paper we describe the methodology used to design the discharge chamber and cathode assemblies and describe the resulting final design. Specifics include the concepts used for the mounting of the ion optics along with the concepts used for the gimbal mounts. In addition, we present results of a vibrational analysis showing how the engine will respond to a typical Delta IV Heavy vibration spectrum.
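For orientation, the quoted design points imply the rough performance figures computed below (the thruster efficiency is an assumed placeholder, not a NEXIS value):

```python
G0 = 9.80665          # m/s^2, standard gravity
ISP = 7500.0          # s, quoted NEXIS design point
POWER = 20.0e3        # W, quoted input power
ETA = 0.7             # assumed total thruster efficiency (illustrative only)
M_PROP = 2000.0       # kg, quoted xenon throughput capability

v_e = ISP * G0                        # effective exhaust velocity ~73.5 km/s
thrust = 2.0 * ETA * POWER / v_e      # T = 2*eta*P / v_e
mdot = thrust / v_e                   # propellant mass flow rate
total_impulse = M_PROP * v_e          # N*s delivered over thruster life
burn_time_yr = (M_PROP / mdot) / (3600 * 24 * 365.25)

print(f"thrust ~ {thrust * 1e3:.0f} mN, mdot ~ {mdot * 1e6:.1f} mg/s")
print(f"total impulse ~ {total_impulse:.2e} N*s, full-power life ~ {burn_time_yr:.1f} yr")
```

Under these assumptions the design point corresponds to a thrust of a few hundred millinewtons sustained for on the order of a decade of full-power operation, which is why component life (cathodes, optics) dominates the design effort described above.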
Widefield quantitative multiplex surface enhanced Raman scattering imaging in vivo
NASA Astrophysics Data System (ADS)
McVeigh, Patrick Z.; Mallia, Rupananda J.; Veilleux, Israel; Wilson, Brian C.
2013-04-01
In recent years numerous studies have shown the potential advantages of molecular imaging in vitro and in vivo using contrast agents based on surface enhanced Raman scattering (SERS); however, the low throughput of traditional point-scanned imaging methodologies has limited their use in biological imaging. In this work we demonstrate that direct widefield Raman imaging based on a tunable filter is capable of quantitative multiplex SERS imaging in vivo, and that this imaging is possible with acquisition times that are orders of magnitude lower than achievable with comparable point-scanned methodologies. The system, designed for small animal imaging, has a linear response from 0.01 to 100 pM, acquires typical in vivo images in <10 s, and with suitable SERS reporter molecules is capable of multiplex imaging without compensation for spectral overlap. To demonstrate the utility of widefield Raman imaging in biological applications, we show quantitative imaging of four simultaneous SERS reporter molecules in vivo with resulting probe quantification that is in excellent agreement with known quantities (R² > 0.98).
Boiocchi, Riccardo; Gernaey, Krist V; Sin, Gürkan
2016-10-01
A methodology is developed to systematically design the membership functions of fuzzy-logic controllers for multivariable systems. The methodology consists of a systematic derivation of the critical points of the membership functions as a function of predefined control objectives. Several constrained optimization problems corresponding to different qualitative operation states of the system are defined and solved to identify, in a consistent manner, the critical points of the membership functions for the input variables. The consistently identified critical points, together with the linguistic rules, determine the long-term reachability of the control objectives by the fuzzy-logic controller. The methodology is highlighted using a single-stage side-stream partial nitritation/Anammox reactor as a case study. As a result, a new fuzzy-logic controller for high and stable total nitrogen (TN) removal efficiency is designed. Rigorous simulations are carried out to evaluate and benchmark the performance of the controller. The results demonstrate that the novel control strategy is capable of rejecting long-term influent disturbances and can achieve a stable and high TN removal efficiency. Additionally, the controller was tested against measurement noise levels typical of wastewater sensors and showed robustness. A feedforward-feedback configuration using the present controller would give even better performance. In comparison, a previously developed fuzzy-logic controller using merely expert and intuitive knowledge performed worse, demonstrating the importance of using a systematic methodology for the derivation of the membership functions for multivariable systems. These results are promising for future applications of the controller in real full-scale plants. Furthermore, the methodology can be used as a tool to help systematically design fuzzy-logic control applications for other biological processes.
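A minimal sketch of the object being designed, a triangular membership function whose critical (break) points are exactly what the methodology derives from the control objectives (the break points and variable below are placeholders, not the paper's values):

```python
def tri_membership(x, a, b, c):
    """Triangular membership function with critical points a <= b <= c:
    0 outside [a, c], 1 at the apex b, linear in between."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Placeholder critical points for a hypothetical 'effluent ammonium' input
low    = lambda x: tri_membership(x, -1.0, 0.0, 5.0)
medium = lambda x: tri_membership(x,  2.0, 6.0, 10.0)
high   = lambda x: tri_membership(x,  8.0, 15.0, 30.0)

x = 7.0  # mg N/L, example measurement
print({name: f(x) for name, f in [("low", low), ("medium", medium), ("high", high)]})
```

The paper's contribution is that the break points (a, b, c) are not hand-tuned but obtained by solving constrained optimization problems tied to the control objectives, so the rule base behaves consistently across operating states.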
Design and ergonomics. Methods for integrating ergonomics at hand tool design stage.
Marsot, Jacques; Claudon, Laurent
2004-01-01
As a marked increase in the number of musculoskeletal disorders was noted in many industrialized countries, and more specifically in companies that require the use of hand tools, the French National Research and Safety Institute (INRS) launched a research project in 1999 on integrating ergonomics into hand tool design, and more particularly into the design of a boning knife. After a brief review of the difficulties of integrating ergonomics at the design stage, the present paper shows how three methodological design tools (Functional Analysis, Quality Function Deployment and TRIZ) have been applied to the design of a boning knife. Implementation of these tools enabled us to demonstrate the extent to which they are capable of responding to the difficulties of integrating ergonomics into product design.
Nuclear thermal propulsion engine system design analysis code development
NASA Astrophysics Data System (ADS)
Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.; Ivanenok, Joseph F.
1992-01-01
A Nuclear Thermal Propulsion (NTP) Engine System Design Analysis Code has recently been developed to characterize key NTP engine system design features. Such a versatile, standalone NTP system performance and engine design code is required to support ongoing and future engine system and vehicle design efforts associated with proposed Space Exploration Initiative (SEI) missions of interest. Key areas of interest in the engine system modeling effort were the reactor, shielding, and inclusion of an engine multi-redundant propellant pump feed system design option. A solid-core nuclear thermal reactor and internal shielding code model was developed to estimate the reactor's thermal-hydraulic and physical parameters based on a prescribed thermal output; this model was integrated into a state-of-the-art engine system design model. The reactor code module has the capability to model graphite, composite, or carbide fuels. Key output from the model consists of reactor parameters such as thermal power, pressure drop, thermal profile, and heat generation in cooled structures (reflector, shield, and core supports), as well as engine system parameters such as weight, dimensions, pressures, temperatures, mass flows, and performance. The model's overall analysis methodology and its key assumptions and capabilities are summarized in this paper.
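For orientation, the kind of performance figure such a code ultimately trades against reactor temperature can be approximated from ideal isentropic-nozzle relations for hydrogen propellant (the values below are generic assumptions, not code output):

```python
import math

def ideal_isp(T_chamber, gamma=1.35, molar_mass=2.016e-3, pe_over_pc=0.01):
    """Ideal-nozzle specific impulse from isentropic expansion:
    v_e = sqrt(2*g/(g-1) * (Ru/M) * Tc * (1 - (pe/pc)^((g-1)/g)))."""
    Ru, g0 = 8.314, 9.80665
    v_e = math.sqrt(2 * gamma / (gamma - 1) * (Ru / molar_mass) * T_chamber
                    * (1 - pe_over_pc ** ((gamma - 1) / gamma)))
    return v_e / g0

for Tc in (2500, 2700, 2900):  # K, representative solid-core fuel temperatures
    print(f"Tc = {Tc} K -> ideal Isp ~ {ideal_isp(Tc):.0f} s")
```

This simple relation shows why solid-core NTP performance is dominated by achievable fuel temperature, and hence why the reactor and fuel-type modules (graphite, composite, carbide) sit at the center of the design code.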
Torsional Ultrasound Sensor Optimization for Soft Tissue Characterization
Melchor, Juan; Muñoz, Rafael; Rus, Guillermo
2017-01-01
Torsion mechanical waves have the capability to characterize shear stiffness moduli of soft tissue. Under this hypothesis, a computational methodology is proposed to design and optimize a piezoelectrics-based transmitter and receiver to generate and measure the response of torsional ultrasonic waves. The procedure employed is divided into two steps: (i) a finite element method (FEM) is developed to obtain a transmitted and received waveform as well as a resonance frequency of a previous geometry validated with a semi-analytical simplified model, and (ii) a probabilistic optimality criterion for the design, based on an inverse problem estimating the robust probability of detection (RPOD), is applied to maximize the detection of the pathology defined in terms of changes of shear stiffness. This study collects different design options in two separate models, in transmission and contact, respectively. The main contribution of this work is a framework establishing the forward, inverse and optimization procedures used to choose a set of appropriate transducer parameters. This methodological framework may be generalizable to other applications.
Improvements in analysis techniques for segmented mirror arrays
NASA Astrophysics Data System (ADS)
Michels, Gregory J.; Genberg, Victor L.; Bisson, Gary R.
2016-08-01
The employment of actively controlled segmented mirror architectures has become increasingly common in the development of current astronomical telescopes. Optomechanical analysis of such hardware presents unique issues compared to that of monolithic mirror designs. The work presented here is a review of current capabilities and improvements in the methodology of the analysis of mechanically induced surface deformation of such systems. The recent improvements include the capability to differentiate surface deformation at the array and segment levels. This differentiation, allowing surface deformation analysis at the individual segment level, offers useful insight into the mechanical behavior of the segments that is unavailable from analysis solely at the parent array level. In addition, the capability to characterize the full displacement vector deformation of collections of points allows analysis of mechanical disturbance predictions of assembly interfaces relative to other assembly interfaces. This capability, called racking analysis, allows engineers to develop designs for segment-to-segment phasing performance in assembly integration, 0g release, and thermal stability of operation. The performance predicted by racking analysis has the advantage of being comparable to the measurements used in assembly of hardware. Approaches to all of the above issues are presented and demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
Aerothermodynamic Flight Simulation Capabilities for Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Miller, Charles G.
1998-01-01
Aerothermodynamics, encompassing aerodynamics, aeroheating, and fluid dynamics and physical processes, is the genesis for the design and development of advanced space transportation vehicles and provides crucial information to other disciplines such as structures, materials, propulsion, avionics, and guidance, navigation and control. Sources of aerothermodynamic information are ground-based facilities, Computational Fluid Dynamic (CFD) and engineering computer codes, and flight experiments. Utilization of this aerothermodynamic triad provides the optimum aerothermodynamic design to safely satisfy mission requirements while reducing design conservatism, risk and cost. The iterative aerothermodynamic process for initial screening/assessment of aerospace vehicle concepts, optimization of aerolines to achieve/exceed mission requirements, and benchmark studies for final design and establishment of the flight data book are reviewed. Aerothermodynamic methodology centered on synergism between ground-based testing and CFD predictions is discussed for various flow regimes encountered by a vehicle entering the Earth's atmosphere from low Earth orbit. An overview of the resources/infrastructure required to provide accurate/credible aerothermodynamic information in a timely manner is presented. Impacts on Langley's aerothermodynamic capabilities due to recent programmatic changes such as Center reorganization, downsizing, outsourcing, industry-led (as opposed to NASA-led) programs, and so forth are discussed. Sample applications of these capabilities to high Agency priority, fast-paced programs such as Reusable Launch Vehicle (RLV)/X-33 Phases I and II, X-34, Hyper-X and X-38 are presented and lessons learned discussed. Lastly, enhancements in ground-based testing/CFD capabilities necessary to partially/fully satisfy future requirements are addressed.
Initial Assessment of Open Rotor Propulsion Applied to an Advanced Single-Aisle Aircraft
NASA Technical Reports Server (NTRS)
Guynn, Mark D.; Berton, Jeffrey J.; Hendricks, Eric S.; Tong, Michael T.; Haller, William J.; Thurman, Douglas R.
2011-01-01
Application of high speed, advanced turboprops, or propfans, to subsonic transport aircraft received significant attention and research in the 1970s and 1980s when fuel efficiency was the driving focus of aeronautical research. Recent volatility in fuel prices and concern for aviation's environmental impact have renewed interest in unducted, open rotor propulsion, and revived research by NASA and a number of engine manufacturers. Unfortunately, in the two decades that have passed since open rotor concepts were thoroughly investigated, NASA has lost experience and expertise in this technology area. This paper describes initial efforts to re-establish NASA's capability to assess aircraft designs with open rotor propulsion. Specifically, methodologies for aircraft-level sizing, performance analysis, and system-level noise analysis are described. Propulsion modeling techniques have been described in a previous paper. Initial results from application of these methods to an advanced single-aisle aircraft using open rotor engines based on historical blade designs are presented. These results indicate open rotor engines have the potential to provide large reductions in fuel consumption and emissions. Initial noise analysis indicates that current noise regulations can be met with old blade designs and modern, noise-optimized blade designs are expected to result in even lower noise levels. Although an initial capability has been established and initial results obtained, additional development work is necessary to make NASA's open rotor system analysis capability on par with existing turbofan analysis capabilities.
Overlay Tolerances For VLSI Using Wafer Steppers
NASA Astrophysics Data System (ADS)
Levinson, Harry J.; Rice, Rory
1988-01-01
In order for VLSI circuits to function properly, the masking layers used in the fabrication of those devices must overlay each other to within the manufacturing tolerance incorporated in the circuit design. The capabilities of the alignment tools used in the masking process determine the overlay tolerances to which circuits can be designed. It is therefore of considerable importance that these capabilities be well characterized. Underestimation of the overlay accuracy results in unnecessarily large devices, resulting in poor utilization of wafer area and possible degradation of device performance. Overestimation will result in significant yield loss because of the failure to conform to the tolerances of the design rules. The proper methodology for determining the overlay capabilities of wafer steppers, the most commonly used alignment tool for the production of VLSI circuits, is the subject of this paper. Because cost-effective manufacturing process technology has been the driving force of VLSI, the impact on productivity is a primary consideration in all discussions. Manufacturers of alignment tools advertise the capabilities of their equipment. It is notable that no manufacturer currently characterizes his aligners in a manner consistent with the requirements of producing very large integrated circuits, as will be discussed. This has resulted in the situation in which the evaluation and comparison of the capabilities of alignment tools require the attention of a lithography specialist. Unfortunately, lithographic capabilities must be known by many other people, particularly the circuit designers and the managers responsible for the financial consequences of the high prices of modern alignment tools. All too frequently, the designer or manager is confronted with contradictory data, one set coming from his lithography specialist, and the other coming from a sales representative of an equipment manufacturer. Since the latter generally attempts to make his merchandise appear as attractive as possible, the lithographer is frequently placed in the position of having to explain subtle issues in order to justify his decisions. It is the purpose of this paper to provide that explanation.
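One widely used convention for reducing measured registration data to a design-rule overlay tolerance is a mean-plus-three-sigma summary per axis; the sketch below illustrates the arithmetic (a generic metrology convention, not necessarily the exact characterization procedure argued for in the paper):

```python
import numpy as np

def overlay_capability(errors_nm):
    """Summarize signed registration errors (one axis) as
    |mean| + 3*sigma, a conservative single-number overlay capability."""
    errors = np.asarray(errors_nm, dtype=float)
    return abs(errors.mean()) + 3.0 * errors.std(ddof=1)

rng = np.random.default_rng(2)
x_errors = rng.normal(15.0, 40.0, size=200)   # nm, synthetic stepper data
print(f"x overlay capability ~ {overlay_capability(x_errors):.0f} nm")
```

The paper's central point is that the sampling plan behind such numbers matters: a vendor figure computed from a favorable sample can differ substantially from the capability observed across real production lots, which is why designers and managers see contradictory data.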
Selecting a software development methodology. [of digital flight control systems
NASA Technical Reports Server (NTRS)
Jones, R. E.
1981-01-01
State-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology are assessed both technically and financially. Technical assessments analyze the error preventing and detecting capabilities of the chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible are documented.
Advanced Control Considerations for Turbofan Engine Design
NASA Technical Reports Server (NTRS)
Connolly, Joseph W.; Csank, Jeffrey T.; Chicatelli, Amy
2016-01-01
This paper covers the application of a model-based engine control (MBEC) methodology featuring a self-tuning on-board model for an aircraft turbofan engine simulation. The nonlinear engine model is capable of modeling realistic engine performance, allowing for a verification of the advanced control methodology over a wide range of operating points and life cycle conditions. The on-board model is a piece-wise linear model derived from the nonlinear engine model and updated using an optimal tuner Kalman Filter estimation routine, which enables the on-board model to self-tune to account for engine performance variations. MBEC is used here to show how advanced control architectures can improve efficiency during the design phase of a turbofan engine by reducing conservative operability margins. Reducing operability margins such as stall margin expands the engine design space and offers potential for efficiency improvements. Application of the MBEC architecture to a nonlinear engine simulation is shown to reduce the thrust specific fuel consumption by approximately 1% over the baseline design, while maintaining safe operation of the engine across the flight envelope.
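The self-tuning step, correcting the piecewise-linear on-board model so it tracks the deteriorating engine, reduces to a standard Kalman measurement update; a generic sketch follows (textbook filter equations with notional dimensions, not the optimal-tuner formulation itself):

```python
import numpy as np

def kf_update(x, P, z, H, R):
    """One Kalman measurement update: fuse sensed engine outputs z
    into the on-board model's health-parameter estimate x."""
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Two notional tuners (e.g., efficiency deltas), three sensed outputs
x = np.zeros(2); P = np.eye(2) * 0.1
H = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
R = np.eye(3) * 0.01
z = np.array([-0.02, 0.01, -0.005])        # residuals vs. nominal model
x, P = kf_update(x, P, z, H, R)
print(x)  # updated tuner estimates
```

The "optimal tuner" aspect in the paper concerns choosing which reduced set of tuning parameters the filter estimates; the update mechanics themselves are the standard step shown here.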
Landing Gear Integration in Aircraft Conceptual Design. Revision
NASA Technical Reports Server (NTRS)
Chai, Sonny T.; Mason, William H.
1997-01-01
The design of the landing gear is one of the more fundamental aspects of aircraft design. The design and integration process encompasses numerous engineering disciplines, e.g., structure, weights, runway design, and economics, and has become extremely sophisticated in the last few decades. Although the design process is well-documented, no attempt has been made until now to develop a design methodology that can be used within an automated environment. As a result, the process remains a key responsibility of the configuration designer and is largely experience-based and graphically oriented. However, as industry and government try to incorporate multidisciplinary design optimization (MDO) methods in the conceptual design phase, the need for a more systematic procedure has become apparent. The development of an MDO-capable design methodology as described in this work is focused on providing the conceptual designer with tools to help automate the disciplinary analyses, i.e., geometry, kinematics, flotation, and weight. Documented design procedures and analyses were examined to determine their applicability, and to ensure compliance with current practices and regulations. Using the latest information obtained from industry during the initial industry survey, the analyses were in turn modified and expanded to accommodate the design criteria associated with advanced large subsonic transports. Algorithms were then developed based on the updated analysis procedures to be incorporated into existing MDO codes.
Extended cooperative control synthesis
NASA Technical Reports Server (NTRS)
Davidson, John B.; Schmidt, David K.
1994-01-01
This paper reports on research for extending the Cooperative Control Synthesis methodology to include a more accurate modeling of the pilot's controller dynamics. Cooperative Control Synthesis (CCS) is a methodology that addresses the problem of how to design control laws for piloted, high-order, multivariate systems and/or non-conventional dynamic configurations in the absence of flying qualities specifications. This is accomplished by emphasizing the parallel structure inherent in any pilot-controlled, augmented vehicle. The original CCS methodology is extended to include the Modified Optimal Control Model (MOCM), which is based upon the optimal control model of the human operator developed by Kleinman, Baron, and Levison in 1970. This model provides a modeling of the pilot's compensation dynamics that is more accurate than the simplified pilot dynamic representation currently in the CCS methodology. Inclusion of the MOCM into the CCS also enables the modeling of pilot-observation perception thresholds and pilot-observation attention allocation effects. This Extended Cooperative Control Synthesis (ECCS) allows for the direct calculation of pilot and system open- and closed-loop transfer functions in pole/zero form and is readily implemented in current software capable of analysis and design for dynamic systems. Example results based upon synthesizing an augmentation control law for an acceleration command system in a compensatory tracking task using the ECCS are compared with a similar synthesis performed using the original CCS methodology. The ECCS is shown to provide augmentation control laws that yield more favorable predicted closed-loop flying qualities and tracking performance than those synthesized using the original CCS methodology.
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision-making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.
NASA Technical Reports Server (NTRS)
Koltai, Kolina; Ho, Nhut; Masequesmay, Gina; Niedober, David; Skoog, Mark; Cacanindin, Artemio; Johnson, Walter; Lyons, Joseph
2014-01-01
This paper discusses a case study that examined the influence of culture, organization, and automation capability upon human trust in, and reliance on, automation. In particular, this paper focuses on the design and application of an extended case study methodology, and on the foundational lessons revealed by it. Experimental test pilots involved in the research and development of the US Air Force's newly developed Automatic Ground Collision Avoidance System served as the context for this examination. An eclectic, multi-pronged approach was designed to conduct this case study, and proved effective in addressing the challenges associated with the case's politically sensitive and military environment. Key results indicate that the system design was in alignment with pilot culture and organizational mission, indicating the potential for appropriate trust development in operational pilots. Contributing factors include the low-vulnerability/high-risk nature of the pilot profession, automation transparency and suspicion, system reputation, and the setup of and communications among the organizations involved in the system development.
Threshold concepts in finance: conceptualizing the curriculum
NASA Astrophysics Data System (ADS)
Hoadley, Susan; Tickle, Leonie; Wood, Leigh N.; Kyng, Tim
2015-08-01
Graduates with well-developed capabilities in finance are invaluable to our society and in increasing demand. Universities face the challenge of designing finance programmes to develop these capabilities and the essential knowledge that underpins them. Our research responds to this challenge by identifying threshold concepts that are central to the mastery of finance and by exploring their potential for informing curriculum design and pedagogical practices to improve student outcomes. In this paper, we report the results of an online survey of finance academics at multiple institutions in Australia, Canada, New Zealand, South Africa and the United Kingdom. The outcomes of our research are recommendations for threshold concepts in finance endorsed by quantitative evidence, as well as a model of the finance curriculum incorporating finance, modelling and statistics threshold concepts. In addition, we draw conclusions about the application of threshold concept theory supported by both quantitative and qualitative evidence. Our methodology and findings have general relevance to the application of threshold concept theory as a means to investigate and inform curriculum design and delivery in higher education.
ERIC Educational Resources Information Center
Rynders, John E.; And Others
1978-01-01
For many years, the educational capabilities of Down's syndrome persons have been underestimated because a large number of studies purporting to give an accurate picture of Down's syndrome persons' developmental capabilities have had serious methodological flaws. (Author)
NASA Astrophysics Data System (ADS)
Nemeth, Noel N.; Jadaan, Osama M.; Palfi, Tamas; Baker, Eric H.
Brittle materials today are being used, or considered, for a wide variety of high-tech applications that operate in harsh environments, including static and rotating turbine parts, thermal protection systems, dental prosthetics, fuel cells, oxygen transport membranes, radomes, and MEMS. Designing brittle material components to sustain repeated load without fracturing while using the minimum amount of material requires the use of a probabilistic design methodology. The NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structure/Life) code provides a general-purpose analysis tool that predicts the probability of failure of a ceramic component as a function of its time in service. This capability includes predicting the time-dependent failure probability of ceramic components against catastrophic rupture when subjected to transient thermomechanical loads (including cyclic loads). The developed methodology allows for changes in material response that can occur with temperature or time (i.e., changing fatigue and Weibull parameters with temperature or time). This article gives an overview of the transient reliability methodology and describes how it is extended to account for proof testing. The CARES/Life code has been modified to interface with commercially available finite element analysis (FEA) codes executed for transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
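The Weibull strength statistics underlying this kind of probabilistic design can be illustrated with a minimal sketch. The modulus and characteristic strength below are invented values, not CARES/Life material data, and the real code integrates over component stress fields and load histories rather than a single uniform stress.

```python
import numpy as np

# Two-parameter Weibull failure probability for a uniformly stressed
# unit volume: P_f = 1 - exp(-(sigma/sigma0)^m). Illustrative values only.
m = 10.0          # Weibull modulus (scatter in strength)
sigma0 = 400.0    # characteristic strength, MPa

def failure_probability(sigma):
    """Probability of failure under applied stress sigma (MPa)."""
    return 1.0 - np.exp(-(sigma / sigma0) ** m)

for s in (200.0, 300.0, 400.0):
    print(f"stress {s:5.0f} MPa -> P_f = {failure_probability(s):.4f}")
```

At sigma equal to the characteristic strength the failure probability is 1 - 1/e, about 63.2%, which is the defining property of sigma0.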
A comprehensive plan for helicopter drag reduction
NASA Technical Reports Server (NTRS)
Williams, R. M.; Montana, P. S.
1975-01-01
Current helicopters have parasite drag levels 6 to 10 times as great as fixed-wing aircraft. The commensurate poor cruise efficiency results in a substantial degradation of potential mission capability. The paper traces the origins of helicopter drag and shows that the problem (primarily due to bluff-body flow separation) can be solved by the adoption of a comprehensive research and development plan. This plan, known as the Fuselage Design Methodology, comprises both nonaerodynamic and aerodynamic aspects. The aerodynamics are discussed in detail, and experimental and analytical programs are described which will lead to a solution of the bluff-body problem. Some recent results of work conducted at the Naval Ship Research and Development Center (NSRDC) are presented to illustrate these programs. It is concluded that a 75-percent reduction of helicopter drag is possible through full implementation of the Fuselage Design Methodology.
An evolving-requirements technology assessment process for advanced propulsion concepts
NASA Astrophysics Data System (ADS)
McClure, Erin Kathleen
The following dissertation investigates the development of a methodology suitable for the evaluation of advanced propulsion concepts. At early stages of development, both the future performance of these concepts and their requirements are highly uncertain, making it difficult to forecast their future value. Developing advanced propulsion concepts requires a huge investment of resources. The methodology was developed to enhance the decision-makers' understanding of the concepts, so that they could mitigate the risks associated with developing such concepts. A systematic methodology to identify potential advanced propulsion concepts and assess their robustness is necessary to reduce the risk of developing advanced propulsion concepts. Existing advanced design methodologies have evaluated the robustness of technologies or concepts to variations in requirements, but they are not suitable for evaluating a large number of dissimilar concepts. Variations in requirements have been shown to impact the development of advanced propulsion concepts, and any method designed to evaluate these concepts must incorporate the possible variations of the requirements into the assessment. In order to do so, a methodology was formulated to be capable of accounting for two aspects of the problem. First, it had to systematically identify a probabilistic distribution for the future requirements. Such a distribution would allow decision-makers to quantify the uncertainty introduced by variations in requirements. Second, the methodology must be able to assess the robustness of the propulsion concepts as a function of that distribution. This dissertation describes these enabling elements in depth and proceeds to synthesize them into a new method, the Evolving Requirements Technology Assessment (ERTA). As a proof of concept, the ERTA method was used to evaluate and compare advanced propulsion systems capable of powering a hurricane-tracking, High Altitude, Long Endurance (HALE) unmanned aerial vehicle (UAV). The use of the ERTA methodology to assess HALE UAV propulsion concepts demonstrated that potential variations in requirements do significantly impact the assessment and selection of propulsion concepts. The proof of concept also demonstrated that traditional forecasting techniques, such as cross-impact analysis, can be used to forecast the requirements for advanced propulsion concepts probabilistically. "Fitness", a measure of relative goodness, was used to evaluate the concepts. Finally, stochastic optimizations were used to evaluate the propulsion concepts across the range of requirement sets that were considered.
High-Fidelity Multidisciplinary Design Optimization of Aircraft Configurations
NASA Technical Reports Server (NTRS)
Martins, Joaquim R. R. A.; Kenway, Gaetan K. W.; Burdette, David; Jonsson, Eirikur; Kennedy, Graeme J.
2017-01-01
To evaluate new airframe technologies we need design tools based on high-fidelity models that consider multidisciplinary interactions early in the design process. The overarching goal of this NRA is to develop tools that enable high-fidelity multidisciplinary design optimization of aircraft configurations, and to apply these tools to the design of high-aspect-ratio flexible wings. We developed a geometry engine that is capable of quickly generating conventional and unconventional aircraft configurations, including the internal structure. This geometry engine features adjoint derivative computation for efficient gradient-based optimization. We also added overset capability to a computational fluid dynamics solver, complete with an adjoint implementation and semiautomatic mesh generation. We also developed an approach to constraining buffet and started the development of an approach for constraining flutter. On the applications side, we developed a new common high-fidelity model for aeroelastic studies of high-aspect-ratio wings. We performed optimal design trade-offs between fuel burn and aircraft weight for metal, conventional composite, and carbon nanotube composite wings. We also assessed a continuous morphing trailing edge technology applied to high-aspect-ratio wings. This research resulted in the publication of 26 manuscripts so far, and the developed methodologies were used in two other NRAs.
NASA Astrophysics Data System (ADS)
Sindiy, Oleg V.
This dissertation presents a model-based system-of-systems engineering (SoSE) approach as a design philosophy for architecting in system-of-systems (SoS) problems. SoS refers to a special class of systems in which numerous systems with operational and managerial independence interact to generate new capabilities that satisfy societal needs. Design decisions are more complicated in a SoS setting. A revised Process Model for SoSE is presented to support three phases in SoS architecting: defining the scope of the design problem, abstracting key descriptors and their interrelations in a conceptual model, and implementing computer-based simulations for architectural analyses. The Process Model enables improved decision support considering multiple SoS features and develops computational models capable of highlighting configurations of organizational, policy, financial, operational, and/or technical features. Further, processes for verification and validation of SoS models and simulations are also important due to potential impact on critical decision-making and, thus, are addressed. Two research questions frame the research efforts described in this dissertation. The first concerns how the four key sources of SoS complexity---heterogeneity of systems, connectivity structure, multi-layer interactions, and the evolutionary nature---influence the formulation of SoS models and simulations, trade space, and solution performance and structure evaluation metrics. The second question pertains to the implementation of SoSE architecting processes to inform decision-making for a subset of SoS problems concerning the design of information exchange services in space-based operations domain. These questions motivate and guide the dissertation's contributions. A formal methodology for drawing relationships within a multi-dimensional trade space, forming simulation case studies from applications of candidate architecture solutions to a campaign of notional mission use cases, and executing multi-purpose analysis studies is presented. These efforts are coupled to the generation of aggregate and time-dependent solution performance metrics via the hierarchical decomposition of objectives and the analytical recomposition of multi-attribute qualitative program drivers from quantifiable measures. This methodology was also applied to generate problem-specific solution structure evaluation metrics that facilitate the comparison of alternate solutions at a high level of aggregation, at lower levels of abstraction, and to relate options for design variables with associated performance values. For proof-of-capability demonstration, the selected application problem concerns the design of command, control, communication, and information (C3I) architecture services for a notional campaign of crewed and robotic lunar surface missions. The impetus for the work was the demonstration of using model-based SoSE for design of sustainable interoperability capabilities between all data and communication assets in extended lunar campaigns. A comprehensive Lunar C3I simulation tool was developed by a team of researchers at Purdue University in support of NASA's Constellation Program; the author of this dissertation was a key contributor to the creation of this tool and made modifications and extensions to key components relevant to the methodological concepts presented in this dissertation. The dissertation concludes with a presentation of example results based on the interrogation of the constructed Lunar C3I computational model. 
The results are based on a family of studies, structured around a trade-tree of architecture options, which were conducted to test the hypothesis that the SoSE approach is efficacious for information-exchange architecture design in the space exploration domain. Included in the family of proof-of-capability studies is a simulation of the Apollo 17 mission, which not only allows for partial verification and validation of the model, but also provides insights for prioritizing future model design iterations to make it a more realistic representation of the "real world." A caveat within the results presented is that they serve within the capacity of a proof-of-capability demonstration, and as such, they are a product of models and analyses that need further development before the tool's results can be employed for decision-making. Additional discussion is provided on how to further develop and validate the Lunar C3I tool and how to make it extensible to other SoS design problems of a similar nature in space exploration and other problem application domains.
Getting the big picture in community science: methods that capture context.
Luke, Douglas A
2005-06-01
Community science has a rich tradition of using theories and research designs that are consistent with its core value of contextualism. However, a survey of empirical articles published in the American Journal of Community Psychology shows that community scientists utilize a narrow range of statistical tools that are not well suited to assess contextual data. Multilevel modeling, geographic information systems (GIS), social network analysis, and cluster analysis are recommended as useful tools to address contextual questions in community science. An argument for increased methodological consilience is presented, where community scientists are encouraged to adopt statistical methodology that is capable of modeling a greater proportion of the data than is typical with traditional methods.
An Approach to Improved Credibility of CFD Simulations for Rocket Injector Design
NASA Technical Reports Server (NTRS)
Tucker, Paul K.; Menon, Suresh; Merkle, Charles L.; Oefelein, Joseph C.; Yang, Vigor
2007-01-01
Computational fluid dynamics (CFD) has the potential to improve the historical rocket injector design process by simulating the sensitivity of performance and injector-driven thermal environments to the details of the injector geometry and key operational parameters. Methodical verification and validation efforts on a range of coaxial injector elements have shown that the current production CFD capability must be improved in order to quantitatively impact the injector design process. This paper documents the status of an effort to understand and compare the predictive capabilities and resource requirements of a range of CFD methodologies on a set of model problem injectors. Preliminary results from a steady Reynolds-Averaged Navier-Stokes (RANS), an unsteady Reynolds-Averaged Navier-Stokes (URANS), and three different Large Eddy Simulation (LES) techniques used to model a single-element coaxial injector using gaseous oxygen and gaseous hydrogen propellants are presented. Initial observations are made comparing instantaneous results and corresponding time-averaged and steady-state solutions in the near-injector flow field. Significant differences in the flow fields exist, as expected, and are discussed. An important preliminary result is the identification of a fundamental mixing mechanism, accounted for by URANS and LES, but missing in the steady RANS methodology. Since propellant mixing is the core injector function, this mixing process may prove to have a profound effect on the ability to more correctly simulate injector performance and resulting thermal environments. Issues important to unifying the basis for future comparison, such as solution initialization, required run time, and grid resolution, are addressed.
The Personal Satellite Assistant: An Internal Spacecraft Autonomous Mobile Monitor
NASA Technical Reports Server (NTRS)
Dorais, Gregory A.; Gawdiak, Yuri; Clancy, Daniel (Technical Monitor)
2002-01-01
This paper presents an overview of the research and development effort at the NASA Ames Research Center to create an internal spacecraft autonomous mobile monitor capable of performing intra-vehicular sensing activities by autonomously navigating onboard the International Space Station. We describe the capabilities, mission roles, rationale, high-level functional requirements, and design challenges for an autonomous mobile monitor. The rapid prototyping design methodology used, in which five prototypes of increasing fidelity are designed, is described, as well as the status of these prototypes, of which two are operational and being tested, and one is actively being designed. The physical test facilities used to perform ground testing are briefly described, including a micro-gravity test facility that permits a prototype to propel itself in 3 dimensions with 6 degrees of freedom as if it were in a micro-gravity environment. We also give an overview of the autonomy framework and its components, including the software simulators used in the development process. Sample mission test scenarios are also described. The paper concludes with a discussion of future and related work, followed by the summary.
The New Meteor Radar at Penn State: Design and First Observations
NASA Technical Reports Server (NTRS)
Urbina, J.; Seal, R.; Dyrud, L.
2011-01-01
In an effort to provide new and improved meteor radar sensing capabilities, Penn State has been developing advanced instruments and technologies for future meteor radars, with the primary objectives of making such instruments more capable and more cost effective in order to study the basic properties of the global meteor flux, such as average mass, velocity, and chemical composition. Using low-cost field programmable gate arrays (FPGAs) combined with open source software tools, we describe a design methodology enabling one to develop state-of-the-art radar instrumentation by building a generalized instrumentation core that can be customized using specialized output-stage hardware. Furthermore, using object-oriented programming (OOP) techniques and open-source tools, we illustrate a technique to provide a cost-effective, generalized software framework that uniquely defines an instrument's functionality through a customizable interface implemented by the designer. The new instrument is intended to provide instantaneous profiles of atmospheric parameters and climatology on a daily basis throughout the year. An overview of the instrument design concepts and some of the emerging technologies developed for this meteor radar are presented.
The use of hierarchical clustering for the design of optimized monitoring networks
NASA Astrophysics Data System (ADS)
Soares, Joana; Makar, Paul Andrew; Aklilu, Yayne; Akingunola, Ayodeji
2018-05-01
Associativity analysis is a powerful tool to deal with large-scale datasets by clustering the data on the basis of (dis)similarity and can be used to assess the efficacy and design of air quality monitoring networks. We describe here our use of Kolmogorov-Zurbenko filtering and hierarchical clustering of NO2 and SO2 passive and continuous monitoring data to analyse and optimize air quality networks for these species in the province of Alberta, Canada. The methodology applied in this study assesses dissimilarity between monitoring station time series based on two metrics: 1 - R, R being the Pearson correlation coefficient, and the Euclidean distance; we find that both should be used in evaluating monitoring site similarity. We have combined the analytic power of hierarchical clustering with the spatial information provided by deterministic air quality model results, using the gridded time series of model output as potential station locations, as a proxy for assessing monitoring network design and for network optimization. We demonstrate that clustering results depend on the air contaminant analysed, reflecting the difference in the respective emission sources of SO2 and NO2 in the region under study. Our work shows that much of the signal identifying the sources of NO2 and SO2 emissions resides in shorter timescales (hourly to daily) due to short-term variation of concentrations and that longer-term averages in data collection may lose the information needed to identify local sources. However, the methodology identifies stations mainly influenced by seasonality, if larger timescales (weekly to monthly) are considered. We have performed the first dissimilarity analysis based on gridded air quality model output and have shown that the methodology is capable of generating maps of subregions within which a single station will represent the entire subregion, to a given level of dissimilarity. We have also shown that our approach is capable of identifying different sampling methodologies as well as outliers (stations' time series which are markedly different from all others in a given dataset).
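A minimal sketch of the clustering step described above, assuming each row of `series` is one station's filtered concentration time series; the data here are synthetic, and the Kolmogorov-Zurbenko filtering and the Euclidean-distance variant of the metric are omitted.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Three synthetic "stations": A and B share a common signal, C does not.
rng = np.random.default_rng(0)
base = rng.normal(size=500)
series = np.vstack([base + 0.1 * rng.normal(size=500),   # station A
                    base + 0.1 * rng.normal(size=500),   # station B (like A)
                    rng.normal(size=500)])               # station C (unlike)

# Dissimilarity metric 1 - R, with R the Pearson correlation between pairs,
# taken in condensed (upper-triangle) order for scipy's linkage().
corr = np.corrcoef(series)
dissim = 1.0 - corr[np.triu_indices(len(series), k=1)]

# Average-linkage hierarchical clustering, then cut the tree at a chosen
# dissimilarity level to obtain station groups.
Z = linkage(dissim, method='average')
print(fcluster(Z, t=0.5, criterion='distance'))  # e.g., [1 1 2]
```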
NASA DOE POD NDE Capabilities Data Book
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate that inspection systems, personnel, and protocols demonstrate 0.90 POD with 95% confidence at critical flaw sizes (a90/95). The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
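The fixed-sample binomial bound behind the 90/95 criterion can be checked in a few lines. DOEPOD's actual sequential procedure is more elaborate; this sketch only shows where the familiar "29 of 29" demonstration point comes from: if every one of n inspections is a hit, the hypothesis "POD < 0.90" can be rejected at 95% confidence once 0.90**n falls below 0.05.

```python
# Smallest n such that n hits in n trials demonstrates POD >= 0.90
# with 95% confidence (i.e., P(all hits | POD = 0.90) < 0.05).
def min_hits_for_90_95():
    n = 1
    while 0.90 ** n >= 0.05:
        n += 1
    return n

print(min_hits_for_90_95())  # 29 -> the familiar "29 of 29" criterion
```

A sequential design can terminate earlier or require more trials depending on the observed hits and misses, which is the efficiency gain the abstract attributes to Wald's method.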
Upwind MacCormack Euler solver with non-equilibrium chemistry
NASA Technical Reports Server (NTRS)
Sherer, Scott E.; Scott, James N.
1993-01-01
A computer code, designated UMPIRE, is currently under development to solve the Euler equations in two dimensions with non-equilibrium chemistry. UMPIRE employs an explicit MacCormack algorithm with dissipation introduced via Roe's flux-difference split upwind method. The code also has the capability to employ a point-implicit methodology for flows where stiffness is introduced through the chemical source term. A technique consisting of diagonal sweeps across the computational domain from each corner is presented, which is used to reduce storage and execution requirements. Results depicting one-dimensional shock-tube flow for both a calorically perfect gas and thermally perfect, dissociating nitrogen are presented to verify current capabilities of the program. Also, computational results from a chemical reactor vessel with no fluid dynamic effects are presented to check the chemistry capability and to verify the point-implicit strategy.
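For readers unfamiliar with the base scheme, here is a minimal MacCormack predictor-corrector applied to 1-D linear advection. It is a toy analogue only, without the upwind flux splitting, Euler fluxes, or chemistry source terms of UMPIRE; the grid size and CFL number are arbitrary.

```python
import numpy as np

# MacCormack scheme for u_t + a u_x = 0 on a periodic domain:
# forward-difference predictor followed by backward-difference corrector.
nx, a, cfl = 200, 1.0, 0.8
dx = 1.0 / nx
dt = cfl * dx / a
x = np.linspace(0.0, 1.0, nx)
u = np.exp(-200.0 * (x - 0.3) ** 2)          # initial Gaussian pulse

for _ in range(100):
    up = u - a * dt / dx * (np.roll(u, -1) - u)               # predictor
    u = 0.5 * (u + up - a * dt / dx * (up - np.roll(up, 1)))  # corrector
```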
NASA Technical Reports Server (NTRS)
Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelley, J. H.; Depkovich, T. M.; Wolfe, W. J.; Nguyen, T.
1986-01-01
The purpose of the Robotics Simulation Program is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotics systems. ROBSIM is a program written in FORTRAN 77 for use on a VAX 11/750 computer under the VMS operating system. This user's guide describes the capabilities of the ROBSIM programs, including the system definition function, the analysis tools function, and the postprocessor function. The options a user may encounter with each of these executables are explained in detail, and the different program prompts appearing to the user are included. Some useful suggestions concerning the appropriate answers to be given by the user are provided. An example interactive run is enclosed for each of the main program services, and some of the capabilities are illustrated.
Concurrent airline fleet allocation and aircraft design with profit modeling for multiple airlines
NASA Astrophysics Data System (ADS)
Govindaraju, Parithi
A "System of Systems" (SoS) approach is particularly beneficial in analyzing complex large scale systems comprised of numerous independent systems -- each capable of independent operations in their own right -- that when brought in conjunction offer capabilities and performance beyond the constituents of the individual systems. The variable resource allocation problem is a type of SoS problem, which includes the allocation of "yet-to-be-designed" systems in addition to existing resources and systems. The methodology presented here expands upon earlier work that demonstrated a decomposition approach that sought to simultaneously design a new aircraft and allocate this new aircraft along with existing aircraft in an effort to meet passenger demand at minimum fleet level operating cost for a single airline. The result of this describes important characteristics of the new aircraft. The ticket price model developed and implemented here enables analysis of the system using profit maximization studies instead of cost minimization. A multiobjective problem formulation has been implemented to determine characteristics of a new aircraft that maximizes the profit of multiple airlines to recognize the fact that aircraft manufacturers sell their aircraft to multiple customers and seldom design aircraft customized to a single airline's operations. The route network characteristics of two simple airlines serve as the example problem for the initial studies. The resulting problem formulation is a mixed-integer nonlinear programming problem, which is typically difficult to solve. A sequential decomposition strategy is applied as a solution methodology by segregating the allocation (integer programming) and aircraft design (non-linear programming) subspaces. After solving a simple problem considering two airlines, the decomposition approach is then applied to two larger airline route networks representing actual airline operations in the year 2005. The decomposition strategy serves as a promising technique for future detailed analyses. Results from the profit maximization studies favor a smaller aircraft in terms of passenger capacity due to its higher yield generation capability on shorter routes while results from the cost minimization studies favor a larger aircraft due to its lower direct operating cost per seat mile.
VASSAR: Value assessment of system architectures using rules
NASA Astrophysics Data System (ADS)
Selva, D.; Crawley, E. F.
A key step of the mission development process is the selection of a system architecture, i.e., the layout of the major high-level system design decisions. This step typically involves the identification of a set of candidate architectures and a cost-benefit analysis to compare them. Computational tools have been used in the past to bring rigor and consistency into this process. These tools can automatically generate architectures by enumerating different combinations of decisions and options. They can also evaluate these architectures by applying cost models and simplified performance models. Current performance models are purely quantitative tools that are best fit for the evaluation of the technical performance of a mission design. However, assessing the relative merit of a system architecture is a much more holistic task than evaluating the performance of a mission design. Indeed, the merit of a system architecture comes from satisfying a variety of stakeholder needs, some of which are easy to quantify, and some of which are harder to quantify (e.g., elegance, scientific value, political robustness, flexibility). Moreover, assessing the merit of a system architecture at these very early stages of design often requires dealing with a mix of quantitative and semi-qualitative data, and of objective and subjective information. Current computational tools are poorly suited for these purposes. In this paper, we propose a general methodology that can be used to assess the relative merit of several candidate system architectures under the presence of objective, subjective, quantitative, and qualitative stakeholder needs. The methodology is called VASSAR (Value ASsessment for System Architectures using Rules). The major underlying assumption of the VASSAR methodology is that the merit of a system architecture can be assessed by comparing the capabilities of the architecture with the stakeholder requirements. Hence, for example, a candidate architecture that fully satisfies all critical stakeholder requirements is a good architecture. The assessment process is thus fundamentally seen as a pattern matching process where capabilities match requirements, which motivates the use of rule-based expert systems (RBES). This paper describes the VASSAR methodology and shows how it can be applied to a large complex space system, namely an Earth observation satellite system. Companion papers show its applicability to the NASA space communications and navigation program and the joint NOAA-DoD NPOESS program.
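The capability-to-requirement matching idea can be caricatured in a few lines. The requirements, capabilities, and weights below are invented, and the real methodology uses a rule-based expert system rather than this simple loop.

```python
# Toy illustration of scoring an architecture by matching its capabilities
# against weighted stakeholder requirements. All names and weights invented.
requirements = {               # requirement -> (needed capability, weight)
    "global-coverage": ("wide-swath-imager", 0.5),
    "soil-moisture":   ("L-band-radiometer", 0.3),
    "data-latency-3h": ("crosslink-relay",   0.2),
}

def score(architecture_capabilities):
    # Accumulate the weight of every requirement whose needed capability
    # is present in the candidate architecture.
    return sum(w for cap, w in requirements.values()
               if cap in architecture_capabilities)

print(score({"wide-swath-imager", "L-band-radiometer"}))  # 0.8
```

A rule-based formulation generalizes this loop: each requirement becomes a rule whose conditions can combine several capabilities, partial satisfaction, and qualitative judgments.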
Aircraft integrated design and analysis: A classroom experience
NASA Technical Reports Server (NTRS)
Weisshaar, Terrence A.
1989-01-01
AAE 451 is the capstone course required of all senior undergraduates in the School of Aeronautics and Astronautics at Purdue University. During the past year the first steps of a long evolutionary process were taken to change the content and expectations of this course. These changes are the result of the availability of advanced computational capabilities and sophisticated electronic media at Purdue. This presentation describes both the long-range objectives and this year's experience using the High Speed Commercial Transport design, the AIAA Long Duration Aircraft design, and an RPV design proposal as project objectives. The central goal of these efforts is to provide a user-friendly, computer-software-based environment to supplement traditional design course methodology. The Purdue University Computer Center (PUCC), the Engineering Computer Network (ECN), and stand-alone PCs are being used for this development. This year's accomplishments center primarily on aerodynamics software obtained from NASA/Langley and its integration into the classroom. Word processor capability for oral and written work and computer graphics were also blended into the course. A total of ten HSCT designs were generated, ranging from twin-fuselage and forward-swept-wing aircraft to the more traditional delta and double-delta wing aircraft. Four Long Duration Aircraft designs were submitted, together with one RPV design tailored for photographic surveillance.
Structural Equation Modeling: Applications in ecological and evolutionary biology research
Pugesek, Bruce H.; von Eye, Alexander; Tomer, Adrian
2003-01-01
This book presents an introduction to the methodology of structural equation modeling, illustrates its use, and goes on to argue that it has revolutionary implications for the study of natural systems. A major theme of this book is that we have, up to this point, attempted to study systems primarily using methods (such as the univariate model) that were designed only for considering individual processes. Understanding systems requires the capacity to examine simultaneous influences and responses. Structural equation modeling (SEM) has such capabilities. It also possesses many other traits that add strength to its utility as a means of making scientific progress. In light of the capabilities of SEM, it can be argued that much of ecological theory is currently locked in an immature state that impairs its relevance. It is further argued that the principles of SEM are capable of leading to the development and evaluation of multivariate theories of the sort vitally needed for the conservation of natural systems. Supplementary information can be found at the author's website, http://www.jamesbgrace.com/. The book details why multivariate analyses should be used to study ecological systems, exposes unappreciated weaknesses in many current popular analyses, and emphasizes the future methodological developments needed to advance our understanding of ecological systems.
Operational Design Cognitive Methodology: An Analysis of COMISAF 30 August 2009 Initial Assessment
2010-04-01
Efficient evaluation of wireless real-time control networks.
Horvath, Peter; Yampolskiy, Mark; Koutsoukos, Xenofon
2015-02-11
In this paper, we present a system simulation framework for the design and performance evaluation of complex wireless cyber-physical systems. We describe the simulator architecture and the specific developments that are required to simulate cyber-physical systems relying on multi-channel, multi-hop mesh networks. We introduce realistic and efficient physical layer models and a system simulation methodology which provides statistically significant performance evaluation results with low computational complexity. The capabilities of the proposed framework are illustrated with the example of WirelessHART, a centralized, real-time, multi-hop mesh network designed for industrial control and monitoring applications.
A survey of decision tree classifier methodology
NASA Technical Reports Server (NTRS)
Safavian, S. R.; Landgrebe, David
1991-01-01
Decision tree classifiers (DTCs) are used successfully in many diverse areas such as radar signal classification, character recognition, remote sensing, medical diagnosis, expert systems, and speech recognition. Perhaps the most important feature of DTCs is their capability to break down a complex decision-making process into a collection of simpler decisions, thus providing a solution which is often easier to interpret. A survey of current methods for DTC design and of the various existing issues is presented. After considering potential advantages of DTCs over single-stage classifiers, the subjects of tree structure design, feature selection at each internal node, and decision and search strategies are discussed.
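A compact example of the interpretability property the survey highlights, using the scikit-learn library on the Iris dataset as an illustrative stand-in (the survey itself predates this library):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

# A shallow tree breaks the classification into a sequence of
# single-feature threshold tests that a human can read directly.
X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree))   # each internal node is a simple comparison
```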
Small UAV Research and Evolution in Long Endurance Electric Powered Vehicles
NASA Technical Reports Server (NTRS)
Logan, Michael J.; Chu, Julio; Motter, Mark A.; Carter, Dennis L.; Ol, Michael; Zeune, Cale
2007-01-01
This paper describes recent research into the advancement of small, electric powered unmanned aerial vehicle (UAV) capabilities. Specifically, topics include the improvements made in battery technology, design methodologies, avionics architectures and algorithms, materials and structural concepts, propulsion system performance prediction, and others. The results of prototype vehicle designs and flight tests are discussed in the context of their usefulness in defining and validating progress in the various technology areas. Further areas of research need are also identified. These include the need for more robust operating regimes (wind, gust, etc.), and continued improvement in payload fraction vs. endurance.
A survey of decision tree classifier methodology
NASA Technical Reports Server (NTRS)
Safavian, S. Rasoul; Landgrebe, David
1990-01-01
Decision Tree Classifiers (DTC's) are used successfully in many diverse areas such as radar signal classification, character recognition, remote sensing, medical diagnosis, expert systems, and speech recognition. Perhaps the most important feature of DTC's is their capability to break down a complex decision-making process into a collection of simpler decisions, thus providing a solution which is often easier to interpret. A survey of current methods for DTC design and of the various existing issues is presented. After considering potential advantages of DTC's over single-stage classifiers, the subjects of tree structure design, feature selection at each internal node, and decision and search strategies are discussed.
Lightning protection technology for small general aviation composite material aircraft
NASA Technical Reports Server (NTRS)
Plumer, J. A.; Setzer, T. E.; Siddiqi, S.
1993-01-01
An ongoing NASA Small Business Innovative Research (SBIR) Phase II design and development program will produce the first lightning-protected, fiberglass, General Aviation aircraft that is available as a kit. The results obtained so far in development testing of typical components of the aircraft kit, such as the wing and fuselage panels, indicate that the lightning protection design methodology and materials chosen are capable of protecting such small composite airframes from lightning puncture and structural damage associated with severe-threat lightning strikes. The primary objective of the program has been to develop a lightning protection design for a full-scale test airframe and verify its adequacy with full-scale laboratory testing, thus enabling production and sale of owner-built, lightning-protected, Stoddard-Hamilton Aircraft, Inc. Glasair II airplanes. A second objective has been to provide lightning protection design guidelines for the General Aviation industry, and to enable these airplanes to meet lightning protection requirements for certification of small airplanes. This paper describes the protection design approaches and development testing results obtained thus far in the program, together with the design methodology which can achieve the design goals listed above. The presentation of this paper will also include results of some of the full-scale verification tests, which will have been completed by the time of this conference.
NASA Astrophysics Data System (ADS)
Bandaru, Sunith; Deb, Kalyanmoy
2011-09-01
In this article, a methodology is proposed for automatically extracting innovative design principles which make a system or process (subject to conflicting objectives) optimal using its Pareto-optimal dataset. Such 'higher knowledge' would not only help designers to execute the system better, but also enable them to predict how changes in one variable would affect other variables if the system has to retain its optimal behaviour. This in turn would help solve other similar systems with different parameter settings easily without the need to perform a fresh optimization task. The proposed methodology uses a clustering-based optimization technique and is capable of discovering hidden functional relationships between the variables, objective and constraint functions and any other function that the designer wishes to include as a 'basis function'. A number of engineering design problems are considered for which the mathematical structure of these explicit relationships exists and has been revealed by a previous study. A comparison with the multivariate adaptive regression splines (MARS) approach reveals the practicality of the proposed approach due to its ability to find meaningful design principles. The success of this procedure for automated innovization is highly encouraging and indicates its suitability for further development in tackling more complex design scenarios.
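A hypothetical flavor of such automated relationship extraction: fit a candidate power-law design principle across a Pareto-optimal dataset by regression in log space. The synthetic data and the simple least-squares fit stand in for the article's clustering-based technique, which discovers the form of the relationship rather than assuming it.

```python
import numpy as np

# Synthetic "Pareto-optimal" data obeying x2 ~ c * x1**b with small noise;
# the fit recovers the hidden design principle (c ~ 2, b ~ -0.5).
rng = np.random.default_rng(1)
x1 = np.linspace(1.0, 5.0, 50)
x2 = 2.0 * x1 ** -0.5 * (1 + 0.01 * rng.normal(size=50))

# Linear regression in log space: log x2 = b * log x1 + log c
b, log_c = np.polyfit(np.log(x1), np.log(x2), 1)
print(f"x2 = {np.exp(log_c):.2f} * x1^{b:.2f}")  # ~ 2.00 * x1^-0.50
```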
Contingency theoretic methodology for agent-based web-oriented manufacturing systems
NASA Astrophysics Data System (ADS)
Durrett, John R.; Burnell, Lisa J.; Priest, John W.
2000-12-01
The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, and computing and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change the way they view the software design process: from the solution of a problem to the dynamic creation of teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory, the task-uncertainty of organizational-information-processing relationships from information processing theory, and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software are discussed.
Informational ergonomics and design: signage design for Monte Sião Camp.
Marques, Luiz Guilherme Oliveira; Cardoso, Vânia Maria Batalha
2012-01-01
This paper describes the development of a signage system, driven by the vectors of Signage Design and Informational Ergonomics associated with Regulatory Standards. The methodology of the Ergonomic Intervention of Moraes and Mont'Alvão (2003), in its early stages, and the Method of the Signage Pyramid of Calori (2007) were used to develop the research, data collection, and analysis, and to guide the design of the signaling system. The site contemplated by this work is called Mount Zion, a site of 19 hectares with deficient signage. As a result, we obtained a signaling system with graphic features that refer to the site's formal characteristics, capable of meeting the orientation and movement needs inherent to the site.
Interactive multi-mode blade impact analysis
NASA Technical Reports Server (NTRS)
Alexander, A.; Cornell, R. W.
1978-01-01
The theoretical methodology used in developing an analysis for the response of turbine engine fan blades subjected to soft-body (bird) impacts is reported, and the computer program developed using this methodology as its basis is described. This computer program is an outgrowth of two programs that were previously developed for the purpose of studying problems of a similar nature (a 3-mode beam impact analysis and a multi-mode beam impact analysis). The present program utilizes an improved missile model that is interactively coupled with blade motion which is more consistent with actual observations. It takes into account local deformation at the impact area, blade camber effects, and the spreading of the impacted missile mass on the blade surface. In addition, it accommodates plate-type mode shapes. The analysis capability in this computer program represents a significant improvement in the development of the methodology for evaluating potential fan blade materials and designs with regard to foreign object impact resistance.
A probabilistic methodology for radar cross section prediction in conceptual aircraft design
NASA Astrophysics Data System (ADS)
Hines, Nathan Robert
System effectiveness has increasingly become the prime metric for the evaluation of military aircraft. As such, it is the decision maker's/designer's goal to maximize system effectiveness. Industry and government research documents indicate that all future military aircraft will incorporate signature reduction as an attempt to improve system effectiveness and reduce the cost of attrition. Today's operating environments demand low observable aircraft which are able to reliably take out valuable, time-critical targets. Thus it is desirable to be able to design vehicles that are balanced for increased effectiveness. Previous studies have shown that shaping of the vehicle is one of the most important contributors to radar cross section, a measure of radar signature, and must be considered from the very beginning of the design process. Radar cross section estimation should be incorporated into conceptual design to develop more capable systems. This research strives to meet these needs by developing a conceptual design tool that predicts radar cross section for parametric geometries. This tool predicts the absolute radar cross section of the vehicle as well as the impact of geometry changes, allowing for the simultaneous tradeoff of the aerodynamic, performance, and cost characteristics of the vehicle with the radar cross section. Furthermore, this tool can be linked to a campaign theater analysis code to demonstrate the changes in system and system-of-systems effectiveness due to changes in aircraft geometry. A general methodology was developed and implemented, and sample computer codes were applied to prototype the proposed process. Studies utilizing this radar cross section tool were subsequently performed to demonstrate the capabilities of this method and show the impact that various inputs have on the outputs of these models. The F/A-18 aircraft configuration was chosen as a case study vehicle to perform a design space exercise and to investigate the relative impact of shaping parameters on radar cross section. Finally, two unique low observable configurations were analyzed to examine the impact of shaping for stealthiness.
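One of the simplest shaping-driven relations such a conceptual tool builds on is the physical-optics broadside return of a flat plate, sigma = 4*pi*A^2/lambda^2, which shows why large flat surfaces normal to a radar are so costly in signature. The frequency and plate area below are illustrative values, not data from the dissertation.

```python
import numpy as np

# Physical-optics broadside RCS of a flat plate: sigma = 4*pi*A^2/lambda^2.
c = 3.0e8                       # speed of light, m/s
f = 10.0e9                      # X-band radar frequency, Hz
lam = c / f                     # wavelength, m
A = 0.5                         # plate area, m^2

sigma = 4.0 * np.pi * A ** 2 / lam ** 2
print(f"sigma = {sigma:.1f} m^2 = {10 * np.log10(sigma):.1f} dBsm")
```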
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.; Olariu, Stephen
1995-01-01
The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies. These include two major software systems currently used in the conceptual-level design of aerospace vehicles. These tools are SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front-end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer-aided design systems consisting of diverse, stand-alone analysis codes, streamlining the exchange of data between programs, reducing errors, and improving efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.
Ball, David A; Lux, Matthew W; Graef, Russell R; Peterson, Matthew W; Valenti, Jane D; Dileo, John; Peccoud, Jean
2010-01-01
The concept of co-design is common in engineering, where it is necessary, for example, to determine the optimal partitioning between hardware and software of the implementation of a system's features. Here we propose to adapt co-design methodologies for synthetic biology. As a test case, we have designed an environmental sensing device that detects the presence of three chemicals, and returns an output only if at least two of the three chemicals are present. We show that the logical operations can be implemented in three different design domains: (1) the transcriptional domain using synthetically designed hybrid promoters, (2) the protein domain using bi-molecular fluorescence complementation, and (3) the fluorescence domain using spectral unmixing and relying on electronic processing. We discuss how these heterogeneous design strategies could be formalized to develop co-design algorithms capable of identifying optimal designs meeting user specifications.
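The device's decision rule is a 2-of-3 majority function; a tiny truth-table sketch makes the target behavior explicit (the biological implementations described above realize this same logic in different design domains):

```python
from itertools import product

# 2-of-3 majority: report a signal only when at least two of the three
# input chemicals are detected (1 = present, 0 = absent).
def two_of_three(a, b, c):
    return (a + b + c) >= 2

for a, b, c in product((0, 1), repeat=3):
    print(a, b, c, "->", two_of_three(a, b, c))
```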
NASA Astrophysics Data System (ADS)
Halil, F. M.; Nasir, N. M.; Shukur, A. S.; Hashim, H.
2018-02-01
Design and Build construction projects involve the biggest scale of investment cost compared to the traditional approach. In Design and Build, the client hires design professionals who design according to the client's needs and specifications. The aim of this research is to explore the concept of partnering as practiced in the design and build procurement approach. The selection of design professionals, such as contractors and consultants, is therefore crucial to ensure successful project completion on time, cost, and quality. The methodology adopted a quantitative approach: questionnaires were administered to public clients by postal survey. The results show that public clients agreed that project management capability and commitment to budget are crucial elements of partnering with design professionals in design and build construction projects.
Level-Set Methodology on Adaptive Octree Grids
NASA Astrophysics Data System (ADS)
Gibou, Frederic; Guittet, Arthur; Mirzadeh, Mohammad; Theillard, Maxime
2017-11-01
Numerical simulations of interfacial problems in fluids require a methodology capable of tracking surfaces that can undergo changes in topology and capable of imposing jump boundary conditions in a sharp manner. In this talk, we will discuss recent advances in the level-set framework, in particular one that is based on adaptive grids.
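The topology-change robustness that makes level sets attractive can be seen in a minimal sketch: the interface is the zero contour of a signed-distance function, and the union of two shapes is just a pointwise minimum, so merging requires no special handling. A uniform grid is used here; the talk's octree adaptivity is not shown.

```python
import numpy as np

# Two circles represented implicitly as signed-distance functions; their
# union (a merged interface once they overlap) is the pointwise minimum.
x, y = np.meshgrid(np.linspace(-1, 1, 200), np.linspace(-1, 1, 200))
phi1 = np.hypot(x + 0.3, y) - 0.25     # circle centered at (-0.3, 0)
phi2 = np.hypot(x - 0.3, y) - 0.25     # circle centered at (+0.3, 0)
phi = np.minimum(phi1, phi2)           # interface is the contour phi == 0
```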
Multidisciplinary design and optimization (MDO) methodology for the aircraft conceptual design
NASA Astrophysics Data System (ADS)
Iqbal, Liaquat Ullah
An integrated design and optimization methodology has been developed for the conceptual design of an aircraft. The methodology brings higher-fidelity Computer Aided Design, Engineering, and Manufacturing (CAD, CAE, and CAM) tools such as CATIA, FLUENT, ANSYS, and SURFCAM into the conceptual design by utilizing Excel as the integrator and controller. The approach is demonstrated to integrate with many of the existing low-to-medium-fidelity codes, such as the aerodynamic panel code CMARC and sizing and constraint analysis codes, thus providing multi-fidelity capabilities to the aircraft designer. The higher-fidelity design information from the CAD and CAE tools for the geometry, aerodynamics, structural, and environmental performance is provided for the application of structured design methods such as Quality Function Deployment (QFD) and Pugh's Method. The higher-fidelity tools bring the quantitative aspects of a design, such as precise measurements of weight, volume, surface areas, center of gravity (CG) location, lift over drag ratio, and structural weight, as well as the qualitative aspects, such as external geometry definition, internal layout, and coloring scheme, early into the design process. The performance and safety risks involved with new technologies can be reduced by modeling and assessing their impact more accurately on the performance of the aircraft. The methodology also enables the design and evaluation of novel concepts such as the blended wing body (BWB) and hybrid wing body (HWB). Higher-fidelity computational fluid dynamics (CFD) and finite element analysis (FEA) allow verification of the claims for performance gains in aerodynamics and ascertain risks of structural failure due to the different pressure distribution in the fuselage as compared with the tube-and-wing design. The higher-fidelity aerodynamics and structural models can lead to better cost estimates that help reduce the financial risks as well. This helps in achieving better designs with reduced risk in less time and at lower cost. The approach is shown to eliminate the traditional boundary between the conceptual and preliminary design stages, combining the two into one consolidated preliminary design phase. Several examples of the validation and utilization of the Multidisciplinary Design and Optimization (MDO) tool are presented using missions for Medium and High Altitude Long Range/Endurance Unmanned Aerial Vehicles (UAVs).
Methodologies and systems for heterogeneous concurrent computing
NASA Technical Reports Server (NTRS)
Sunderam, V. S.
1994-01-01
Heterogeneous concurrent computing is gaining increasing acceptance as an alternative or complementary paradigm to multiprocessor-based parallel processing as well as to conventional supercomputing. While algorithmic and programming aspects of heterogeneous concurrent computing are similar to their parallel processing counterparts, system issues, partitioning and scheduling, and performance aspects are significantly different. In this paper, we discuss critical design and implementation issues in heterogeneous concurrent computing, and describe techniques for enhancing its effectiveness. In particular, we highlight the system level infrastructures that are required, aspects of parallel algorithm development that most affect performance, system capabilities and limitations, and tools and methodologies for effective computing in heterogeneous networked environments. We also present recent developments and experiences in the context of the PVM system and comment on ongoing and future work.
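Since PVM itself is a C/Fortran message-passing library, the sketch below illustrates only the task-farming pattern relevant to the scheduling discussion (a pool of tasks dynamically pulled by workers, which is how heterogeneous processor speeds get balanced), using Python's standard library rather than the PVM API; the workload and worker count are hypothetical.

```python
from concurrent.futures import ProcessPoolExecutor, as_completed

def work(task_id: int) -> tuple[int, int]:
    # Stand-in compute kernel; under PVM this would execute on a
    # remote host spawned into the virtual machine.
    return task_id, sum(i * i for i in range(10_000 + task_id))

if __name__ == "__main__":
    # Dynamic (pool-of-tasks) scheduling: faster workers simply pull
    # more tasks, balancing load across heterogeneous resources.
    with ProcessPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(work, t) for t in range(32)]
        for fut in as_completed(futures):
            tid, result = fut.result()
            print(f"task {tid}: {result}")
```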
2017-11-01
The Under-body Blast Methodology (UBM) for the Test and Evaluation (T&E) program was established to provide a capability for the US Army Test and Evaluation Command to assess the vulnerability of vehicles to under-body blast. Finite element (FE) models are part of the current UBM for T&E methodology.
NASA Astrophysics Data System (ADS)
1992-10-01
The overall objective of the DARPA/Tri-Service RASSP program is to demonstrate a capability to rapidly specify, produce, and yield domain-specific, affordable signal processors for use in Department of Defense systems such as automatic target acquisition, tracking, and recognition, electronic countermeasures, communications, and SIGINT. The objective of the study phase is to specify a recommended program plan for the government to use as a template for procurement of the RASSP design system and demonstration program. To accomplish that objective, the study phase program tasks are to specify a development methodology for signal processors (adaptable to various organizational design styles and application areas), to analyze the CAD/CAE tool requirements needed to support the development methodology, to identify the state and development plans of the industry relative to this area, and to identify the additional developments not currently being addressed by the industry that are recommended as RASSP developments. In addition, the RASSP study phase will define an approach for electronically linking design centers to manufacturing centers so that a complete prototyping cycle can be accomplished with significantly reduced cycle time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindell, M.A.; Grape, S.; Haekansson, A.
The sustainability criterion for Gen IV nuclear energy systems inherently presumes the availability of efficient fuel recycling capabilities. One area for research on advanced fuel recycling concerns the safeguards aspects of this type of facility. Since a recycling facility may be considered sensitive from a non-proliferation perspective, it is important to address these issues early in the design process, according to the principle of Safeguards By Design. Presented in this paper is a mode of procedure in which assessments of the proliferation resistance (PR) of a recycling facility for fast reactor fuel have been performed so as to identify the weakest barriers to proliferation of nuclear material. Two complementary established methodologies have been applied: TOPS (Technological Opportunities to increase Proliferation resistance of nuclear power Systems) and PR-PP (Proliferation Resistance and Physical Protection evaluation methodology). The chosen fuel recycling facility belongs to a small Gen IV lead-cooled fast reactor system that is under study in Sweden. A schematic design of the recycling facility, where actinides are separated using solvent extraction, has been examined. The PR assessment methodologies make it possible to pinpoint areas in which the facility can be improved in order to reduce the risk of diversion. The initial facility design may then be slightly modified and/or safeguards measures may be introduced to reduce the total identified proliferation risk. After each modification of design and/or safeguards implementation, a new PR assessment of the revised system can then be carried out. This way, each modification can be evaluated and new ways to further enhance the proliferation resistance can be identified. This type of iterative procedure may support Safeguards By Design in the planning of new recycling plants and other nuclear facilities. (authors)
The flight telerobotic servicer Tinman concept: System design drivers and task analysis
NASA Technical Reports Server (NTRS)
Andary, J. F.; Hewitt, D. R.; Hinkal, S. W.
1989-01-01
A study was conducted to develop a preliminary definition of the Flight Telerobotic Servicer (FTS) that could be used to understand the operational concepts and scenarios for the FTS. Called the Tinman, this design concept was also used to begin the process of establishing resources and interfaces for the FTS on Space Station Freedom, the National Space Transportation System shuttle orbiter, and the Orbital Maneuvering vehicle. Starting with an analysis of the requirements and task capabilities as stated in the Phase B study requirements document, the study identified eight major design drivers for the FTS. Each of these design drivers and their impacts on the Tinman design concept are described. Next, the planning that is currently underway for providing resources for the FTS on Space Station Freedom is discussed, including up to 2000 W of peak power, up to four color video channels, and command and data rates up to 500 kbps between the telerobot and the control station. Finally, an example is presented to show how the Tinman design concept was used to analyze task scenarios and explore the operational capabilities of the FTS. A structured methodology using a standard terminology consistent with the NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) was developed for this analysis.
SU-E-I-43: Pediatric CT Dose and Image Quality Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, G; Singh, R
2014-06-01
Purpose: To design an approach to optimize radiation dose and image quality for pediatric CT imaging, and to evaluate expected performance. Methods: A methodology was designed to quantify relative image quality as a function of CT image acquisition parameters. Image contrast and image noise were used to indicate expected conspicuity of objects, and a wide-cone system was used to minimize scan time for motion avoidance. A decision framework was designed to select acquisition parameters as a weighted combination of image quality and dose. Phantom tests were used to acquire images at multiple techniques to demonstrate expected contrast, noise and dose. Anthropomorphic phantoms with contrast inserts were imaged on a 160mm CT system with tube voltage capabilities as low as 70kVp. Previously acquired clinical images were used in conjunction with simulation tools to emulate images at different tube voltages and currents to assess human observer preferences. Results: Examination of image contrast, noise, dose and tube/generator capabilities indicates a clinical task and object-size dependent optimization. Phantom experiments confirm that system modeling can be used to achieve the desired image quality and noise performance. Observer studies indicate that clinical utilization of this optimization requires a modified approach to achieve the desired performance. Conclusion: This work indicates the potential to optimize radiation dose and image quality for pediatric CT imaging. In addition, the methodology can be used in an automated parameter selection feature that can suggest techniques given a limited number of user inputs. G Stevens and R Singh are employees of GE Healthcare.
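A minimal sketch of the kind of decision framework described, scoring candidate techniques by a weighted combination of an image-quality penalty and dose, is given below; the noise and dose models, parameter grids, and weights are hypothetical placeholders rather than the system models used in this work.

```python
import itertools

# Candidate acquisition parameters (hypothetical grid).
kvps = [70, 80, 100, 120]
mas = [20, 40, 80, 160]

def image_noise(kvp, ma):
    # Placeholder: noise decreases with a dose-like quantity kvp^2 * mA.
    return 1e4 / (kvp**2 * ma) ** 0.5

def dose(kvp, ma):
    # Placeholder CTDI-style model; real values come from system tables.
    return (kvp / 100.0) ** 2.5 * ma * 0.05

w_quality, w_dose = 1.0, 0.5  # task- and patient-size-dependent weights

def score(kvp, ma):
    return w_quality * image_noise(kvp, ma) + w_dose * dose(kvp, ma)

best = min(itertools.product(kvps, mas), key=lambda p: score(*p))
print("selected technique (kVp, mA):", best)
```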
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klymenko, M. V.; Remacle, F., E-mail: fremacle@ulg.ac.be
2014-10-28
A methodology is proposed for designing a low-energy consuming ternary-valued full adder based on a quantum dot (QD) electrostatically coupled with a single electron transistor operating as a charge sensor. The methodology is based on design optimization: the values of the physical parameters of the system required for implementing the logic operations are optimized using a multiobjective genetic algorithm. The searching space is determined by elements of the capacitance matrix describing the electrostatic couplings in the entire device. The objective functions are defined as the maximal absolute error over actual device logic outputs relative to the ideal truth tables for the sum and the carry-out in base 3. The logic units are implemented on the same device: a single dual-gate quantum dot and a charge sensor. Their physical parameters are optimized to compute either the sum or the carry-out outputs and are compatible with current experimental capabilities. The outputs are encoded in the value of the electric current passing through the charge sensor, while the logic inputs are supplied by the voltage levels on the two gate electrodes attached to the QD. The complex ternary logic operations are directly implemented on an extremely simple device, characterized by small size and low energy consumption compared to devices based on switching single-electron transistors. The design methodology is general and provides a rational approach for realizing non-switching logic operations on QD devices.
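To make the optimization step concrete, here is a toy sketch of an evolutionary search that minimizes the maximal absolute truth-table error, mirroring the objective definition above; the device_output function is a purely hypothetical stand-in for the electrostatic model of the coupled QD/charge-sensor device, and a real run would use a multiobjective GA over the capacitance-matrix elements.

```python
import numpy as np

rng = np.random.default_rng(0)

def ideal_sum(a, b, cin):
    # Ideal base-3 sum digit for ternary inputs a, b and binary carry-in.
    return (a + b + cin) % 3

inputs = [(a, b, c) for a in range(3) for b in range(3) for c in range(2)]

def device_output(p, a, b, cin):
    # Toy stand-in: maps gate-voltage-encoded inputs and a parameter
    # vector p to a "logic level"; the paper instead evaluates the
    # full electrostatic model parameterized by the capacitance matrix.
    return (p[0] * a + p[1] * b + p[2] * cin + p[3]) % 3

def fitness(p):
    # Maximal absolute error over the truth table (to be minimized).
    return max(abs(device_output(p, *i) - ideal_sum(*i)) for i in inputs)

# Simple elitist evolutionary loop over the parameter vector.
pop = rng.normal(size=(40, 4))
for _ in range(200):
    scores = np.array([fitness(p) for p in pop])
    elite = pop[scores.argsort()[:10]]
    children = elite[rng.integers(0, 10, size=30)] + 0.1 * rng.normal(size=(30, 4))
    pop = np.vstack([elite, children])

best = min(pop, key=fitness)
print("best max-abs truth-table error:", fitness(best))
```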
NASA Astrophysics Data System (ADS)
Domercant, Jean Charles
The combination of today's national security environment and mandated acquisition policies makes it necessary for military systems to interoperate with each other to greater degrees. This growing interdependency results in complex Systems-of-Systems (SoS) that only continue to grow in complexity to meet evolving capability needs. Thus, timely and affordable acquisition becomes more difficult, especially in the face of mounting budgetary pressures. To counter this, architecting principles must be applied to SoS design. The research objective is to develop an Architecture Real Options Complexity-Based Valuation Methodology (ARC-VM) suitable for acquisition-level decision making, where there is a stated desire for more informed tradeoffs between cost, schedule, and performance during the early phases of design. First, a framework is introduced to measure architecture complexity as it directly relates to military SoS. Development of the framework draws upon a diverse set of disciplines, including Complexity Science, software architecting, measurement theory, and utility theory. Next, a Real Options based valuation strategy is developed using techniques established for financial stock options that have recently been adapted for use in business and engineering decisions. The derived complexity measure provides architects with an objective measure of complexity that focuses on relevant complex system attributes. These attributes are related to the organization and distribution of SoS functionality and the sharing and processing of resources. The use of Real Options provides the necessary conceptual and visual framework to quantifiably and traceably combine measured architecture complexity, time-valued performance levels, as well as programmatic risks and uncertainties. An example suppression of enemy air defenses (SEAD) capability demonstrates the development and usefulness of the resulting architecture complexity & Real Options based valuation methodology. Different portfolios of candidate system types are used to generate an array of architecture alternatives that are then evaluated using an engagement model. This performance data is combined with both measured architecture complexity and programmatic data to assign an acquisition value to each alternative. This proves useful when selecting alternatives most likely to meet current and future capability needs.
NASA Astrophysics Data System (ADS)
Javier Romualdez, Luis
Scientific balloon-borne instrumentation offers an attractive, competitive, and effective alternative to space-borne missions when considering the overall scope, cost, and development timescale required to design and launch scientific instruments. In particular, the balloon-borne environment provides a near-space regime that is suitable for a number of modern astronomical and cosmological experiments, where the atmospheric interference suffered by ground-based instrumentation is negligible at stratospheric altitudes. This work is centered around the analytical strategies and implementation considerations for the attitude determination and control of SuperBIT, a scientific balloon-borne payload capable of meeting the strict sub-arcsecond pointing and image stability requirements demanded by modern cosmological experiments. Broadly speaking, the designed stability specifications of SuperBIT coupled with its observational efficiency, image quality, and accessibility rivals state-of-the-art astronomical observatories such as the Hubble Space Telescope. To this end, this work presents an end-to-end design methodology for precision pointing balloon-borne payloads such as SuperBIT within an analytical yet implementationally grounded context. Simulation models of SuperBIT are analytically derived to aid in pre-assembly trade-off and case studies that are pertinent to the dynamic balloon-borne environment. From these results, state estimation techniques and control methodologies are extensively developed, leveraging the analytical framework of simulation models and design studies. This pre-assembly design phase is physically validated during assembly, integration, and testing through implementation in real-time hardware and software, which bridges the gap between analytical results and practical application. SuperBIT attitude determination and control is demonstrated throughout two engineering test flights that verify pointing and image stability requirements in flight, where the post-flight results close the overall design loop by suggesting practical improvements to pre-design methodologies. Overall, the analytical and practical results presented in this work, though centered around the SuperBIT project, provide generically useful and implementationally viable methodologies for high precision balloon-borne instrumentation, all of which are validated, justified, and improved both theoretically and practically. As such, the continuing development of SuperBIT, built from the work presented in this thesis, strives to further the potential for scientific balloon-borne astronomy in the near future.
Adaptation of Mesoscale Weather Models to Local Forecasting
NASA Technical Reports Server (NTRS)
Manobianco, John T.; Taylor, Gregory E.; Case, Jonathan L.; Dianic, Allan V.; Wheeler, Mark W.; Zack, John W.; Nutter, Paul A.
2003-01-01
Methodologies have been developed for (1) configuring mesoscale numerical weather-prediction models for execution on high-performance computer workstations to make short-range weather forecasts for the vicinity of the Kennedy Space Center (KSC) and the Cape Canaveral Air Force Station (CCAFS) and (2) evaluating the performances of the models as configured. These methodologies have been implemented as part of a continuing effort to improve weather forecasting in support of operations of the U.S. space program. The models, methodologies, and results of the evaluations also have potential value for commercial users who could benefit from tailoring their operations and/or marketing strategies based on accurate predictions of local weather. More specifically, the purpose of developing the methodologies for configuring the models to run on computers at KSC and CCAFS is to provide accurate forecasts of winds, temperature, and such specific thunderstorm-related phenomena as lightning and precipitation. The purpose of developing the evaluation methodologies is to maximize the utility of the models by providing users with assessments of the capabilities and limitations of the models. The models used in this effort thus far include the Mesoscale Atmospheric Simulation System (MASS), the Regional Atmospheric Modeling System (RAMS), and the National Centers for Environmental Prediction Eta Model (Eta for short). The configuration of the MASS and RAMS is designed to run the models at very high spatial resolution and incorporate local data to resolve fine-scale weather features. Model preprocessors were modified to incorporate surface, ship, buoy, and rawinsonde data as well as data from local wind towers, wind profilers, and conventional or Doppler radars. The overall evaluation of the MASS, Eta, and RAMS was designed to assess the utility of these mesoscale models for satisfying the weather-forecasting needs of the U.S. space program. The evaluation methodology includes objective and subjective verification methodologies. Objective (e.g., statistical) verification of point forecasts is a stringent measure of model performance, but when used alone, it is not usually sufficient for quantifying the value of the overall contribution of the model to the weather-forecasting process. This is especially true for mesoscale models with enhanced spatial and temporal resolution that may be capable of predicting meteorologically consistent, though not necessarily accurate, fine-scale weather phenomena. Therefore, subjective (phenomenological) evaluation, focusing on selected case studies and specific weather features, such as sea breezes and precipitation, has been performed to help quantify the added value that cannot be inferred solely from objective evaluation.
Advanced Machine Learning Emulators of Radiative Transfer Models
NASA Astrophysics Data System (ADS)
Camps-Valls, G.; Verrelst, J.; Martino, L.; Vicent, J.
2017-12-01
Physically-based model inversion methodologies are based on physical laws and established cause-effect relationships. A plethora of remote sensing applications rely on the physical inversion of a Radiative Transfer Model (RTM), which leads to physically meaningful bio-geo-physical parameter estimates. The process is, however, computationally expensive and needs expert knowledge for the selection of the RTM, its parametrization, the look-up table generation, and its inversion. Mimicking complex codes with nonlinear statistical machine learning algorithms has recently become a natural alternative. Emulators are statistical constructs able to approximate the RTM at a fraction of the computational cost, while providing an estimation of uncertainty and estimations of the gradient or finite integral forms. We review the field and recent advances in the emulation of RTMs with machine learning models. We posit Gaussian processes (GPs) as the proper framework to tackle the problem. Furthermore, we introduce an automatic methodology to construct emulators for costly RTMs. The Automatic Gaussian Process Emulator (AGAPE) methodology combines the interpolation capabilities of GPs with the accurate design of an acquisition function that favours sampling in low density regions and flatness of the interpolation function. We illustrate the good capabilities of our emulators on toy examples, on the leaf- and canopy-level PROSPECT and PROSAIL RTMs, and for the construction of an optimal look-up table for atmospheric correction based on MODTRAN5.
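As a minimal sketch of the emulator idea, assuming scikit-learn's Gaussian process regressor and a cheap one-dimensional toy function standing in for a costly RTM run such as PROSPECT, an emulator is trained on a small look-up table and then queried cheaply, with a per-point uncertainty that an AGAPE-style acquisition function could use to place new samples.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(1)

def rtm(x):
    # Toy stand-in for a costly radiative transfer run.
    return np.sin(3 * x) + 0.5 * x

X_train = rng.uniform(0, 2, size=(25, 1))  # look-up-table nodes
y_train = rtm(X_train).ravel()

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X_train, y_train)

X_test = np.linspace(0, 2, 200).reshape(-1, 1)
mean, std = gp.predict(X_test, return_std=True)  # prediction + uncertainty
# An AGAPE-style acquisition function would add new RTM runs where the
# predictive std is large, favouring low-density regions of input space.
print("max predictive std:", std.max())
```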
[Methodology of Screening New Antibiotics: Present Status and Prospects].
Trenin, A S
2015-01-01
Due to the extensive distribution of pathogen resistance to available pharmaceuticals and serious problems in the treatment of various infections and tumor diseases, the need for new antibiotics is urgent. The basic methodological approaches to the chemical synthesis of antibiotics and to screening for new antibiotics among natural products, mainly microbial secondary metabolites, are considered in the review. Since natural compounds are highly diverse, screening such substances offers a good opportunity to discover antibiotics of various chemical structures and mechanisms of action. Such an approach, followed by chemical or biological transformation, can provide health care with new effective pharmaceuticals. The review mainly concentrates on the screening of natural products and associated methodological problems, such as isolation of microbial producers from their habitats, cultivation of microorganisms producing appropriate substances, isolation and chemical characterization of microbial metabolites, and identification of the biological activity of the metabolites. The main attention is paid to the problems of microbial secondary metabolism and the design of new models for screening biologically active compounds. The latest achievements in the field of antibiotics and the most promising approaches for future investigations are discussed. The main methodological approach to the isolation and cultivation of producers remains relevant and needs constant improvement. The efficiency of screening can be increased by more rapid chemical identification of antibiotics and by the design of new screening models based on biological activity detection.
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Toniolo, Matthew D.; Tartabini, Paul V.; Roithmayr, Carlos M.; Albertson, Cindy W.; Karlgaard, Christopher D.
2016-01-01
The objective of this report is to develop and implement a physics based method for analysis and simulation of multi-body dynamics including launch vehicle stage separation. The constraint force equation (CFE) methodology discussed in this report provides such a framework for modeling constraint forces and moments acting at joints when the vehicles are still connected. Several stand-alone test cases involving various types of joints were developed to validate the CFE methodology. The results were compared with ADAMS(Registered Trademark) and Autolev, two different industry standard benchmark codes for multi-body dynamic analysis and simulations. However, these two codes are not designed for aerospace flight trajectory simulations. After this validation exercise, the CFE algorithm was implemented in Program to Optimize Simulated Trajectories II (POST2) to provide a capability to simulate end-to-end trajectories of launch vehicles including stage separation. The POST2/CFE methodology was applied to the STS-1 Space Shuttle solid rocket booster (SRB) separation and Hyper-X Research Vehicle (HXRV) separation from the Pegasus booster as a further test and validation for its application to launch vehicle stage separation problems. Finally, to demonstrate end-to-end simulation capability, POST2/CFE was applied to the ascent, orbit insertion, and booster return of a reusable two-stage-to-orbit (TSTO) vehicle concept. With these validation exercises, POST2/CFE software can be used for performing conceptual level end-to-end simulations, including launch vehicle stage separation, for problems similar to those discussed in this report.
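For readers unfamiliar with constraint-force formulations, the standard Lagrange-multiplier form of constrained multi-body dynamics is sketched below; this is the generic textbook structure, and the CFE formulation in the report may differ in its details.

```latex
% Equations of motion with joint constraints \Phi(q,t) = 0:
%   M(q)\,\ddot{q} + \Phi_q^{T}\lambda = Q(q,\dot{q},t)
% Differentiating the constraints twice gives \Phi_q\,\ddot{q} = \gamma(q,\dot{q},t),
% so each time step solves the augmented linear system
\begin{equation*}
\begin{bmatrix} M & \Phi_q^{T} \\ \Phi_q & 0 \end{bmatrix}
\begin{bmatrix} \ddot{q} \\ \lambda \end{bmatrix}
=
\begin{bmatrix} Q \\ \gamma \end{bmatrix},
\qquad
F_{\mathrm{constraint}} = -\Phi_q^{T}\lambda ,
\end{equation*}
% where the multipliers \lambda yield the joint constraint forces and
% moments while the vehicles remain connected.
```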
Design of a prototype device for remote patient care with mild cognitive impairment
NASA Astrophysics Data System (ADS)
Sanchez-Ocampo, M.; Segura-Giraldo, B.; Floréz-Hurtado, R.; Cortés-Aguirre, C.
2016-04-01
This paper describes the design of a prototype telecare system that makes it possible to provide home care to patients with mild cognitive impairment and thus allows them to remain in their usual environment. Telecare is oriented towards people who require constant attention due to advanced age, illness, physical risk or limited capabilities, and it offers these people a greater degree of independence. The QFD methodology is used to develop electronic devices intended to continuously monitor the environment and the physiological state of the user, providing communication between the telecare system and a monitoring center so that the most appropriate actions can be taken in the event of any abnormality.
A product-service system approach to telehealth application design.
Flores-Vaquero, Paul; Tiwari, Ashutosh; Alcock, Jeffrey; Hutabarat, Windo; Turner, Chris
2016-06-01
A considerable proportion of current point-of-care devices do not offer a wide enough set of capabilities if they are to function in any telehealth system. There is a need for intermediate devices that lie between healthcare devices and service networks. The development of an application is proposed that allows a smartphone to take the role of such an intermediate device. This research seeks to identify the telehealth service requirements for long-term condition management using a product-service system approach. The product-service system approach has proven to be a suitable methodology for the design and development of telehealth smartphone applications. © The Author(s) 2014.
Xu, Wei
2014-01-01
This paper first discusses the major inefficiencies faced in current human factors and ergonomics (HFE) approaches: (1) delivering an optimal end-to-end user experience (UX) to users of a solution across its solution lifecycle stages; (2) strategically influencing the product business and technology capability roadmaps from a UX perspective and (3) proactively identifying new market opportunities and influencing the platform architecture capabilities on which the UX of end products relies. In response to these challenges, three case studies are presented to demonstrate how enhanced ergonomics design approaches have effectively addressed the challenges faced in current HFE approaches. Then, the enhanced ergonomics design approaches are conceptualised by a user-experience ecosystem (UXE) framework, from a UX ecosystem perspective. Finally, evidence supporting the UXE, the advantage and the formalised process for executing UXE and methodological considerations are discussed. Practitioner Summary: This paper presents enhanced ergonomics approaches to product design via three case studies to effectively address current HFE challenges by leveraging a systematic end-to-end UX approach, UX roadmaps and emerging UX associated with prioritised user needs and usages. Thus, HFE professionals can be more strategic, creative and influential.
Lunar Surface Habitat Configuration Assessment: Methodology and Observations
NASA Technical Reports Server (NTRS)
Carpenter, Amanda
2008-01-01
The Lunar Habitat Configuration Assessment evaluated the major habitat approaches that were conceptually developed during the Lunar Architecture Team II Study. The objective of the configuration assessment was to identify desired features, operational considerations, and risks from which habitat requirements could be derived. This assessment only considered operations pertaining to the lunar surface and did not consider all of the habitat conceptual designs developed. To examine multiple architectures, the Habitation Focus Element Team defined several candidate concepts, which created the need for a method to assess the various configurations. The fundamental requirement designed into each concept was the functional and operational capability to support a crew of four on a six-month lunar surface mission; however, the concepts were otherwise diverse. The methodology utilized for this assessment consisted of defining figures of merit, providing relevant information, and establishing a scoring system. In summary, the assessment considered the geometric configuration of each concept to determine the complexity of unloading, handling, mobility, leveling, aligning, mating to other elements, and accessibility to the lunar surface. The assessment was designed to derive habitat requirements and potential technology development needs and to identify risks associated with living and working on the lunar surface. Although the results were subjective rather than objective, the assessment provided insightful observations for further assessments and trade studies of lunar surface habitats. The overall methodology and the resulting observations are described in detail, and illustrative examples are discussed.
A preliminary design for the GMT-Consortium Large Earth Finder (G-CLEF)
NASA Astrophysics Data System (ADS)
Szentgyorgyi, Andrew; Barnes, Stuart; Bean, Jacob; Bigelow, Bruce; Bouchez, Antonin; Chun, Moo-Young; Crane, Jeffrey D.; Epps, Harland; Evans, Ian; Evans, Janet; Frebel, Anna; Furesz, Gabor; Glenday, Alex; Guzman, Dani; Hare, Tyson; Jang, Bi-Ho; Jang, Jeong-Gyun; Jeong, Ueejong; Jordan, Andres; Kim, Kang-Min; Kim, Jihun; Li, Chih-Hao; Lopez-Morales, Mercedes; McCracken, Kenneth; McLeod, Brian; Mueller, Mark; Nah, Jakyung; Norton, Timothy; Oh, Heeyoung; Oh, Jae Sok; Ordway, Mark; Park, Byeong-Gon; Park, Chan; Park, Sung-Joon; Phillips, David; Plummer, David; Podgorski, William; Rodler, Florian; Seifahrt, Andreas; Tak, Kyung-Mo; Uomoto, Alan; Van Dam, Marcos A.; Walsworth, Ronald; Yu, Young Sam; Yuk, In-Soo
2014-08-01
The GMT-Consortium Large Earth Finder (G-CLEF) is an optical-band echelle spectrograph that has been selected as the first light instrument for the Giant Magellan Telescope (GMT). G-CLEF is a general-purpose, high dispersion spectrograph that is fiber fed and capable of extremely precise radial velocity measurements. The G-CLEF Concept Design (CoD) was selected in Spring 2013. Since then, G-CLEF has undergone science requirements and instrument requirements reviews and will be the subject of a preliminary design review (PDR) in March 2015. Since CoD review (CoDR), the overall G-CLEF design has evolved significantly as we have optimized the constituent designs of the major subsystems, i.e. the fiber system, the telescope interface, the calibration system and the spectrograph itself. These modifications have been made to enhance G-CLEF's capability to address frontier science problems, as well as to respond to the evolution of the GMT itself and developments in the technical landscape. G-CLEF has been designed by applying rigorous systems engineering methodology to flow Level 1 Scientific Objectives to Level 2 Observational Requirements and thence to Level 3 and Level 4. The rigorous systems approach applied to G-CLEF establishes a well defined science requirements framework for the engineering design. By adopting this formalism, we may flexibly update and analyze the capability of G-CLEF to respond to new scientific discoveries as we move toward first light. G-CLEF will exploit numerous technological advances and features of the GMT itself to deliver an efficient, high performance instrument, e.g. exploiting the adaptive optics secondary system to increase both throughput and radial velocity measurement precision.
Adaptive tracking for complex systems using reduced-order models
NASA Technical Reports Server (NTRS)
Carignan, Craig R.
1990-01-01
Reduced-order models are considered in the context of parameter adaptive controllers for tracking workspace trajectories. A dual-arm manipulation task is used to illustrate the methodology and provide simulation results. A parameter adaptive controller is designed to track the desired position trajectory of a payload using a four-parameter model instead of a full-order, nine-parameter model. Several simulations with different payload-to-arm mass ratios are used to illustrate the capabilities of the reduced-order model in tracking the desired trajectory.
NASA Technical Reports Server (NTRS)
Prokhorov, Kimberlee; Shkedi, Brienne
2006-01-01
The current International Space Station (ISS) Environmental Control and Life Support (ECLS) system is designed to support an ISS crew size of three people. The capability to expand that system to support nine crew members during a Contingency Shuttle Crew Support (CSCS) scenario has been evaluated. This paper describes how the ISS ECLS systems may be operated for supporting CSCS, and the durations expected for the oxygen supply and carbon dioxide control subsystems.
Study of techniques for redundancy verification without disrupting systems, phases 1-3
NASA Technical Reports Server (NTRS)
1970-01-01
The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.
A system-of-systems modeling methodology for strategic general aviation design decision-making
NASA Astrophysics Data System (ADS)
Won, Henry Thome
General aviation has long been studied as a means of providing an on-demand "personal air vehicle" that bypasses the traffic at major commercial hubs. This thesis continues this research through development of a system of systems modeling methodology applicable to the selection of synergistic product concepts, market segments, and business models. From the perspective of the conceptual design engineer, the design and selection of future general aviation aircraft is complicated by the definition of constraints and requirements, and the tradeoffs among performance and cost aspects. Qualitative problem definition methods have been utilized, although their accuracy in determining specific requirement and metric values is uncertain. In industry, customers are surveyed, and business plans are created through a lengthy, iterative process. In recent years, techniques have developed for predicting the characteristics of US travel demand based on travel mode attributes, such as door-to-door time and ticket price. As of yet, these models treat the contributing systems---aircraft manufacturers and service providers---as independently variable assumptions. In this research, a methodology is developed which seeks to build a strategic design decision making environment through the construction of a system of systems model. The demonstrated implementation brings together models of the aircraft and manufacturer, the service provider, and most importantly the travel demand. Thus represented is the behavior of the consumers and the reactive behavior of the suppliers---the manufacturers and transportation service providers---in a common modeling framework. The results indicate an ability to guide the design process---specifically the selection of design requirements---through the optimization of "capability" metrics. Additionally, results indicate the ability to find synergetic solutions, that is solutions in which two systems might collaborate to achieve a better result than acting independently. Implementation of this methodology can afford engineers a more autonomous perspective in the concept exploration process, providing dynamic feedback about a design's potential success in specific market segments. The method also has potential to strengthen the connection between design and business departments, as well as between manufacturers, service providers, and infrastructure planners---bringing information about how the respective systems interact, and what might be done to improve synergism of systems.
Employment of Personnel at the Tucson Border Patrol Station
2017-06-09
RESEARCH METHODOLOGY: How should the Tucson Border Patrol Station optimally employ personnel? Using a case study research methodology provided... BORSTAR provides better capabilities to respond and greater mobility in risk management. The methodologies of the case study comparatives include the...
Status of the Flooding Fragility Testing Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, C. L.; Savage, B.; Bhandari, B.
2016-06-01
This report provides an update on research addressing nuclear power plant component reliability under flooding conditions. The research includes use of the Component Flooding Evaluation Laboratory (CFEL) where individual components and component subassemblies will be tested to failure under various flooding conditions. The resulting component reliability data can then be incorporated with risk simulation strategies to provide a more thorough representation of overall plant risk. The CFEL development strategy consists of four interleaved phases. Phase 1 addresses design and application of CFEL with water rise and water spray capabilities allowing testing of passive and active components including fully electrified components. Phase 2 addresses research into wave generation techniques followed by the design and addition of the wave generation capability to CFEL. Phase 3 addresses methodology development activities including small scale component testing, development of full scale component testing protocol, and simulation techniques including Smoothed Particle Hydrodynamic (SPH) based computer codes. Phase 4 involves full scale component testing including work on full scale component testing in a surrogate CFEL testing apparatus.
Activity-Centered Domain Characterization for Problem-Driven Scientific Visualization
Marai, G. Elisabeta
2018-01-01
Although visualization design models exist in the literature in the form of higher-level methodological frameworks, these models do not present a clear methodological prescription for the domain characterization step. This work presents a framework and end-to-end model for requirements engineering in problem-driven visualization application design. The framework and model are based on the activity-centered design paradigm, which is an enhancement of human-centered design. The proposed activity-centered approach focuses on user tasks and activities, and allows an explicit link between the requirements engineering process and the abstraction stage—and its evaluation—of existing, higher-level visualization design models. In a departure from existing visualization design models, the resulting model: assigns value to a visualization based on user activities; ranks user tasks before the user data; partitions requirements into activity-related capabilities and nonfunctional characteristics and constraints; and explicitly incorporates the user workflows into the requirements process. A further merit of this model is its explicit integration of functional specifications, a concept this work adapts from the software engineering literature, into the visualization design nested model. A quantitative evaluation using two sets of interdisciplinary projects supports the merits of the activity-centered model. The result is a practical roadmap to the domain characterization step of visualization design for problem-driven data visualization. Following this domain characterization model can help remove a number of pitfalls that have been identified multiple times in the visualization design literature. PMID:28866550
Dvornikov, M V; Medenkov, A A
2015-04-01
This paper discusses problems of marine and aerospace medicine and psychophysiology addressed by Georgii Zarakovskii (1925-2014), a prominent Russian expert in the field of military medicine, psychology, and ergonomics. The authors focus on methodological approaches to, and results of, the study of psychophysiological characteristics and human capabilities taken into account in the design of equipment and in organizing the activities of flight crews, astronauts, and military specialists. They highlight his contribution to the creation of a system integrating the psychophysiological features and characteristics of the person that are necessary for the development, testing, and maintenance of aerospace engineering and for the organization of its professional activities. The possibilities of using the methodology of psychophysiological activity analysis to improve the psychophysiological reliability of military specialists are shown.
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2014-01-01
Unknown risks are introduced into failure critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways and therefore, the very existence of the statements identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of adequacy of the POD estimates. Risks are further enhanced due to the wide range of statistical methodologies used for determining the POD capability. Receiver/Relative Operating Characteristics (ROC) Display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on Hit-Miss data to reveal the framework of the interrelationships between Receiver/Relative Operating Characteristics Display, simple binomial, logistic regression, and Bayes' Rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural and validation requirements for establishing credible POD estimates.
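As a concrete illustration of the simple binomial method mentioned above, the sketch below computes the exact (Clopper-Pearson) one-sided lower confidence bound on POD from hit/miss counts; the 29-of-29 case is the familiar single-flaw-size demonstration that just clears the 90/95 criterion.

```python
from scipy.stats import beta

def pod_lower_bound(hits: int, trials: int, confidence: float = 0.95) -> float:
    """Exact (Clopper-Pearson) one-sided lower confidence bound on POD."""
    if hits == 0:
        return 0.0
    # Lower bound of the binomial proportion at the given confidence.
    return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

# 29 hits in 29 trials at a single flaw size gives a 95% lower bound
# of about 0.902, i.e. it just demonstrates 90/95 POD.
print(pod_lower_bound(29, 29))
```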
Structural Technology and Analysis Program (STAP) Delivery Order 0004: Durability Patch
NASA Astrophysics Data System (ADS)
Ikegami, Roy; Haugse, Eric; Trego, Angela; Rogers, Lynn; Maly, Joe
2001-06-01
Structural cracks in secondary structure, resulting from a high cycle fatigue (HCF) environment, are often referred to as nuisance cracks. This type of damage can result in costly inspections and repair. The repairs often do not last long because the repaired structure continues to respond in a resonant fashion to the environment. Although the use of materials for passive damping applications is well understood, there are few applications to high-cycle fatigue problems. This is because the necessary design information (characterization temperature, resonant response frequency, and strain levels) is difficult to determine. The Durability Patch and Damage Dosimeter Program addressed these problems by: (1) Developing a damped repair design process, which includes a methodology for designing the material and application characteristics required to optimally damp the repair. (2) Designing and developing a rugged, small, and lightweight data acquisition unit called the damage dosimeter. This is a battery-operated, single-board computer capable of collecting three channels of strain and one channel of temperature, processing these data with user-developed algorithms written in the C programming language, and storing the processed data in resident memory. The dosimeter is used to provide the flight data needed to characterize the vibration environment. The vibration environment is then used to design the damping material characteristics and the repair. The repair design methodology and the dosimeter were demonstrated on B-52, C-130, and F-15 aircraft applications.
Mateo, B; Porcar-Seder, R; Solaz, J S; Dürsteler, J C
2010-07-01
This study demonstrates that appropriate measurement procedures can detect differences in head movement in a near reading task when using three different progressive addition lenses (PALs). The movements were measured using an anatomical reference system with a biomechanical rationale. This reference system was capable of representing rotations for comparing head flexion relative to trunk, head flexion relative to neck, head rotation relative to trunk and trunk flexion. The subject sample comprised 31 volunteers and three PAL designs with different viewing zones were selected. Significant differences were found between the lenses for three of the seven movement parameters examined. The differences occurred for both vertical and horizontal head movements and could be attributed to aspects of the PAL design. The measurement of the complete kinematic trunk-neck-head chain improved the number of differences that were found over those in previous studies. STATEMENT OF RELEVANCE: The study proposes a methodology based on a biomechanical rationale able to differentiate head-neck-trunk posture and movements caused by different progressive addition lens designs with minimum invasiveness. This methodology could also be applied to analyse the ergonomics of other devices that restrict the user's field of view, such as helmets, personal protective equipment or helmet-mounted displays for pilots. This analysis will allow designers to optimise designs offering higher comfort and performance.
NASA Technical Reports Server (NTRS)
Ebeling, Charles; Beasley, Kenneth D.
1992-01-01
The first year of research to provide NASA support in predicting operational and support parameters and costs of proposed space systems is reported. Some of the specific research objectives were (1) to develop a methodology for deriving reliability and maintainability parameters and, based upon their estimates, determine the operational capability and support costs, and (2) to identify data sources and establish an initial data base to implement the methodology. Implementation of the methodology is accomplished through the development of a comprehensive computer model. While the model appears to work reasonably well when applied to aircraft systems, it was not accurate when used for space systems. The model is dynamic and should be updated as new data become available. It is particularly important to integrate the current aircraft data base with data obtained from the Space Shuttle and other space systems since subsystems unique to a space vehicle require data not available from aircraft. This research only addressed the major subsystems on the vehicle.
NASA Technical Reports Server (NTRS)
Przekwas, A. J.; Athavale, M. M.; Hendricks, R. C.; Steinetz, B. M.
2006-01-01
Detailed information of the flow-fields in the secondary flowpaths and their interaction with the primary flows in gas turbine engines is necessary for successful designs with optimized secondary flow streams. Present work is focused on the development of a simulation methodology for coupled time-accurate solutions of the two flowpaths. The secondary flowstream is treated using SCISEAL, an unstructured adaptive Cartesian grid code developed for secondary flows and seals, while the mainpath flow is solved using TURBO, a density based code with capability of resolving rotor-stator interaction in multi-stage machines. An interface is being tested that links the two codes at the rim seal to allow data exchange between the two codes for parallel, coupled execution. A description of the coupling methodology and the current status of the interface development is presented. Representative steady-state solutions of the secondary flow in the UTRC HP Rig disc cavity are also presented.
Ada and the rapid development lifecycle
NASA Technical Reports Server (NTRS)
Deforrest, Lloyd; Gref, Lynn
1991-01-01
JPL is under contract, through NASA, with the US Army to develop a state-of-the-art Command Center System for the US European Command (USEUCOM). The Command Center System will receive, process, and integrate force status information from various sources and provide this integrated information to staff officers and decision makers in a format designed to enhance user comprehension and utility. The system is based on distributed workstation-class microcomputers, VAX- and SUN-based data servers, and interfaces to existing military mainframe systems and communication networks. JPL is developing the Command Center System using an incremental delivery methodology called the Rapid Development Methodology, with adherence to government and industry standards including the UNIX operating system, X Windows, OSF/Motif, and the Ada programming language. Through a combination of software engineering techniques specific to the Ada programming language and the Rapid Development Approach, JPL was able to deliver capability to the military user incrementally, with quality comparable to, and economics better than, projects developed under more traditional software-intensive system implementation methodologies.
A Computational Framework for Automation of Point Defect Calculations
NASA Astrophysics Data System (ADS)
Goyal, Anuj; Gorai, Prashun; Peng, Haowei; Lany, Stephan; Stevanovic, Vladan; National Renewable Energy Laboratory, Golden, Colorado 80401 Collaboration
A complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory has been developed. The framework provides an effective and efficient method for defect structure generation and the creation of simple yet customizable workflows to analyze defect calculations. The package provides the capability to compute widely accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band-filling correction for shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology. We believe that a robust automated tool like this will enable the materials-by-design community to assess the impact of point defects on materials performance.
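For context, the quantity such frameworks automate is the defect formation energy in the standard supercell formalism; a common form of the expression, with the finite-size corrections named above collected in E_corr, is:

```latex
\begin{equation*}
E_f[X^q] = E_{\mathrm{tot}}[X^q] - E_{\mathrm{tot}}[\mathrm{bulk}]
- \sum_i n_i \mu_i + q\left(E_F + E_{\mathrm{VBM}} + \Delta V\right)
+ E_{\mathrm{corr}},
\end{equation*}
% n_i > 0 (< 0): atoms of species i with chemical potential \mu_i added
% (removed) to create defect X in charge state q; E_F is the Fermi level
% referenced to the bulk valence-band maximum; \Delta V is the potential
% alignment; E_corr collects image-charge and band-filling corrections.
```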
Effects of Solar Array Shadowing on the Power Capability of the Interim Control Module
NASA Technical Reports Server (NTRS)
Fincannon, James; Hojnicki, Jeffrey S.; Garner, James Christopher
1999-01-01
The Interim Control Module (ICM) is being built by the US Naval Research Laboratory (NRL) for NASA as a propulsion module for the International Space Station (ISS). Originally developed as a spinning spacecraft used to move payloads to their final orbit, for ISS the ICM will be in a fixed orientation and location for long periods, resulting in substantial solar panel shadowing. This paper describes the methods used to determine the incident energy on the ICM solar panels and the power capability of the electric power system (EPS). Applying this methodology has resulted in analyses and assessments used to identify ICM early design changes/options, placements, and orientations that enable successful operation of the EPS under a wide variety of anticipated conditions.
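A highly simplified sketch of how shadowing can enter an incident-energy estimate is given below; the flux, area, incidence angle, shadow fraction, and efficiency values are hypothetical, and a real EPS analysis resolves shadow geometry per solar-cell string and per orbit position rather than applying a single linear shadowing factor.

```python
import math

def array_power(flux=1367.0, area=12.0, theta_deg=25.0,
                shadow_frac=0.3, efficiency=0.14):
    """Illustrative solar-array power with a shadowing factor.

    flux [W/m^2], area [m^2], incidence angle [deg], shadowed fraction
    of the panel, and cell efficiency are hypothetical, not ICM data.
    """
    return (flux * area * math.cos(math.radians(theta_deg))
            * (1.0 - shadow_frac) * efficiency)

print(f"{array_power():.0f} W")
```

Note that treating power as linear in the shadowed fraction is itself a simplification: partial shadowing of a series string can depress its output disproportionately.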
Direct measurements of local bed shear stress in the presence of pressure gradients
NASA Astrophysics Data System (ADS)
Pujara, Nimish; Liu, Philip L.-F.
2014-07-01
This paper describes the development of a shear plate sensor capable of directly measuring the local mean bed shear stress in small-scale and large-scale laboratory flumes. The sensor is capable of measuring bed shear stress in the range of 200 Pa with an accuracy of up to 1%. Its size, 43 mm in the flow direction, is designed to be small enough to give spatially local measurements, and its bandwidth, 75 Hz, is high enough to resolve time-varying forcing. Typically, shear plate sensors are restricted to use in zero-pressure-gradient flows because secondary forces on the edges of the shear plate caused by pressure gradients can introduce large errors. However, by analyzing the pressure distribution at the edges of the shear plate in mild pressure gradients, we introduce a new methodology for correcting for the pressure gradient force. The developed sensor includes pressure tappings to measure the pressure gradient in the flow, and the methodology for correction is applied to obtain accurate measurements of bed shear stress under solitary waves in a small-scale wave flume. The sensor is also validated by measurements in a turbulent flat-plate boundary layer in open channel flow.
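The usual form of such an edge correction, assuming a plate of area A and effective edge height h in a streamwise pressure gradient, can be written as below; this generic expression is offered for illustration and is not necessarily the exact correction derived in the paper.

```latex
\begin{equation*}
\tau_b = \frac{F_{\mathrm{meas}}}{A} - h\,\frac{\partial p}{\partial x},
\end{equation*}
% F_meas: total streamwise force on the floating element of area A;
% the second term removes the net pressure force on the plate's
% upstream and downstream edges (height h) due to the gradient dp/dx,
% which is measured here via the sensor's pressure tappings.
```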
Ameer, Kashif; Bae, Seong-Woo; Jo, Yunhee; Lee, Hyun-Gyu; Ameer, Asif; Kwon, Joong-Ho
2017-08-15
Stevia rebaudiana (Bertoni) contains stevioside and rebaudioside-A (Reb-A). We compared response surface methodology (RSM) and artificial neural network (ANN) modelling for their estimation and predictive capabilities in building effective models with maximum responses. A 5-level, 3-factor central composite design was used to optimize microwave-assisted extraction (MAE) to obtain maximum yield of target responses as a function of extraction time (X1: 1-5 min), ethanol concentration (X2: 0-100%) and microwave power (X3: 40-200 W). Maximum values of the three output parameters, 7.67% total extract yield, 19.58 mg/g stevioside yield, and 15.3 mg/g Reb-A yield, were obtained under optimum extraction conditions of 4 min (X1), 75% ethanol (X2), and 160 W (X3). The ANN model demonstrated higher efficiency than did the RSM model. Hence, RSM can demonstrate the interaction effects of inherent MAE parameters on target responses, whereas ANN can reliably model the MAE process with better predictive and estimation capabilities. Copyright © 2017. Published by Elsevier Ltd.
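For illustration, a second-order response surface of the kind fit to a central composite design can be constructed as below; the design points and yield values are synthetic placeholders rather than the paper's data, and only the factor bounds follow the ranges quoted above.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)

# Placeholder design points: (time [min], ethanol [%], power [W]).
X = rng.uniform([1, 0, 40], [5, 100, 200], size=(20, 3))
# Placeholder response (total extract yield, %); real values come from MAE runs.
y = (5 + 0.4 * X[:, 0] + 0.02 * X[:, 1] + 0.01 * X[:, 2]
     - 0.0002 * X[:, 1]**2 + rng.normal(0, 0.1, 20))

# Full quadratic response surface: intercept, linear, interaction,
# and squared terms -- the standard RSM model for a CCD.
rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
rsm.fit(X, y)

# Predicted yield at the reported optimum (4 min, 75% ethanol, 160 W).
print(rsm.predict([[4, 75, 160]]))
```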
NASA Technical Reports Server (NTRS)
Smith, Jeffrey H.; Gyanfi, Max; Volkmer, Kent; Zimmerman, Wayne
1988-01-01
The efforts of a recent study aimed at identifying key issues and trade-offs associated with using a Flight Telerobotic Servicer (FTS) to aid in Space Station assembly-phase tasks are described. The use of automation and robotics (A and R) technologies for large space systems would involve a substitution of automation capabilities for human extravehicular or intravehicular activities (EVA, IVA). A methodology is presented that incorporates assessment of candidate assembly-phase tasks, telerobotic performance capabilities, development costs, and the effects of operational constraints (space transportation system (STS), attached payload, and proximity operations). Changes in the region of cost-effectiveness are examined under a variety of system design assumptions. A discussion of issues is presented with focus on three roles the FTS might serve: (1) as a research-oriented testbed to learn more about space usage of telerobotics; (2) as a research-based testbed with an experimental demonstration orientation and limited assembly and servicing applications; or (3) as an operational system to augment EVA, to aid the construction of the Space Station, and to reduce programmatic (schedule) risk by increasing the flexibility of mission operations.
Current trends in the design of scaffolds for computer-aided tissue engineering.
Giannitelli, S M; Accoto, D; Trombetta, M; Rainer, A
2014-02-01
Advances introduced by additive manufacturing have significantly improved the ability to tailor scaffold architecture, enhancing the control over microstructural features. This has led to a growing interest in the development of innovative scaffold designs, as testified by the increasing amount of research activities devoted to the understanding of the correlation between topological features of scaffolds and their resulting properties, in order to find architectures capable of optimal trade-off between often conflicting requirements (such as biological and mechanical ones). The main aim of this paper is to provide a review and propose a classification of existing methodologies for scaffold design and optimization in order to address key issues and help in deciphering the complex link between design criteria and resulting scaffold properties. Copyright © 2013 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Enabling Parametric Optimal Ascent Trajectory Modeling During Early Phases of Design
NASA Technical Reports Server (NTRS)
Holt, James B.; Dees, Patrick D.; Diaz, Manuel J.
2015-01-01
During the early phases of engineering design, the costs committed are high, the costs incurred are low, and the design freedom is high. It is well documented that decisions made in these early design phases drive the entire design's life cycle. In the traditional paradigm, key design decisions are made when little is known about the design; as the design matures, design changes become more difficult -- in both cost and schedule -- to enact. Indeed, the current capability-based paradigm that has emerged from the constrained economic environment calls for the infusion of knowledge acquired during later design phases into earlier ones, i.e., bringing knowledge acquired during preliminary and detailed design into pre-conceptual and conceptual design. An area of critical importance to launch vehicle design is the optimization of the ascent trajectory, as the optimal trajectory takes full advantage of the launch vehicle's capability to deliver the maximum payload to orbit. Hence, the optimal ascent trajectory plays an important role in the vehicle's affordability posture, as more economically viable access-to-space solutions are needed in today's constrained economic environment. The problem of ascent trajectory optimization is not a new one. Several programs widely used in industry allow trajectory analysts to determine the optimal ascent trajectory from detailed vehicle and insertion-orbit parameters. Yet little is known about a launch vehicle early in the design phase -- information that is required from many different disciplines in order to successfully optimize the ascent trajectory. Thus, the current paradigm of optimizing ascent trajectories involves generating point solutions for every change in a vehicle's design parameters, which is often a tedious, manual, and time-consuming task for the analysts. Moreover, the trajectory design space is highly non-linear and multi-modal due to the interaction of various constraints. When these obstacles are coupled with the Program to Optimize Simulated Trajectories [1] (POST), an industry-standard ascent trajectory optimization program that is difficult to use, expert trajectory analysts are required to effectively optimize a vehicle's ascent trajectory. The paradigm of trajectory optimization thus remains a very manual one, because applying modern computational resources to POST is still a challenging problem: the nuances and difficulties involved in correctly utilizing, and therefore automating, the program present a large obstacle. To address these issues, the authors discuss a two-fold methodology: first, a set of heuristics, captured while working with expert analysts to replicate the current state of the art, is introduced and discussed; second, the power of modern computing is leveraged to evaluate multiple trajectories simultaneously, enabling exploration of the trajectory design space early during the pre-conceptual and conceptual phases of design. When this methodology is coupled with design of experiments to train surrogate models, the trajectory design space can be visualized, enabling parametric optimal ascent trajectory information to be integrated with other pre-conceptual and conceptual design tools.
The potential impact of this methodology's success would be a fully automated POST evaluation suite for conceptual and preliminary design trade studies. This will enable engineers to characterize the ascent trajectory's sensitivity to design changes in an arbitrary number of dimensions and to find settings for trajectory-specific variables that result in optimal performance for a "dialed-in" launch vehicle design. The effort described in this paper was developed for the Advanced Concepts Office [2] at NASA Marshall Space Flight Center.
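As a rough illustration of the surrogate-modeling step described above, the sketch below trains a Gaussian-process surrogate on trajectory evaluations sampled over two design variables. The function `evaluate_trajectory`, its inputs, and its outputs are hypothetical stand-ins for an automated POST run, not the authors' tool chain.

```python
# Minimal sketch: DOE samples -> Gaussian-process surrogate of payload to orbit.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def evaluate_trajectory(x):
    # Hypothetical stand-in for a POST case: x = (thrust/weight, pitch-over rate)
    tw, pitch = x
    return 25000.0 + 4000.0 * tw - 1500.0 * (pitch - 0.8) ** 2  # payload, kg

# Random DOE over the design variables (bounds are illustrative)
rng = np.random.default_rng(1)
X = rng.uniform([1.1, 0.2], [1.6, 1.4], size=(40, 2))
y = np.array([evaluate_trajectory(x) for x in X])

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True).fit(X, y)
payload, std = gp.predict([[1.35, 0.8]], return_std=True)
print(f"predicted payload: {payload[0]:.0f} kg (+/- {std[0]:.0f})")
```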
NASA Astrophysics Data System (ADS)
Unger, Johannes; Hametner, Christoph; Jakubek, Stefan; Quasthoff, Marcus
2014-12-01
An accurate state-of-charge (SoC) estimation for a traction battery in hybrid electric non-road vehicles, which see higher dynamics and power densities than on-road vehicles, requires a precise battery cell terminal voltage model. This paper presents a novel methodology for non-linear system identification of battery cells to obtain precise battery models. The methodology comprises the architecture of local model networks (LMN) and optimal model-based design of experiments (DoE). Three main novelties are proposed: 1) optimal model-based DoE, which aims at highly dynamic excitation of the battery cells at load ranges frequently used in operation; 2) the integration of corresponding inputs in the LMN to capture the non-linearities of SoC, relaxation, and hysteresis as well as temperature effects; 3) enhancements to the local linear model tree (LOLIMOT) construction algorithm to achieve a physically appropriate interpretation of the LMN. The framework is applicable to different battery cell chemistries and temperatures and is real-time capable, which is shown on an industrial PC. The accuracy of the obtained non-linear battery model is demonstrated on cells with different chemistries and at different temperatures. The results show significant improvement due to optimal experiment design and the integration of the battery non-linearities within the LMN structure.
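A minimal sketch of the LMN idea, under the assumption of Gaussian validity functions scheduled on SoC: the terminal voltage is a validity-weighted blend of local affine models. All coefficients here are illustrative, not identified from cell data.

```python
# Minimal sketch: local model network (LMN) terminal-voltage evaluation.
import numpy as np

centers = np.array([0.2, 0.5, 0.8])      # SoC operating points of local models
widths = np.array([0.15, 0.15, 0.15])
# Each local affine model: U = a * I + b (illustrative coefficients)
a = np.array([-0.012, -0.009, -0.008])   # ohmic terms per local model
b = np.array([3.45, 3.65, 3.95])         # open-circuit offsets per local model

def lmn_voltage(soc, current):
    phi = np.exp(-0.5 * ((soc - centers) / widths) ** 2)
    phi /= phi.sum()                      # normalized validity functions
    return float(phi @ (a * current + b)) # validity-weighted local models

print(lmn_voltage(soc=0.6, current=20.0))  # terminal voltage at 20 A discharge
```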
Jaraíz, Martín; Enríquez, Lourdes; Pinacho, Ruth; Rubio, José E; Lesarri, Alberto; López-Pérez, José L
2017-04-07
A novel DFT-based Reaction Kinetics (DFT-RK) simulation approach, employed in combination with real-time data from reaction monitoring instrumentation (like UV-vis, FTIR, Raman, and 2D NMR benchtop spectrometers), is shown to provide a detailed methodology for the analysis and design of complex synthetic chemistry schemes. As an example, it is applied to the opening of epoxides by titanocene in THF, a catalytic system with abundant experimental data available. Through a DFT-RK analysis of real-time IR data, we have developed a comprehensive mechanistic model that opens new perspectives to understand previous experiments. Although derived specifically from the opening of epoxides, the prediction capabilities of the model, built on elementary reactions, together with its practical side (reaction kinetics simulations of real experimental conditions) make it a useful simulation tool for the design of new experiments, as well as for the conception and development of improved versions of the reagents. From the perspective of the methodology employed, because both the computational (DFT-RK) and the experimental (spectroscopic data) components can follow the time evolution of several species simultaneously, it is expected to provide a helpful tool for the study of complex systems in synthetic chemistry.
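The kinetics side of a DFT-RK analysis amounts to integrating rate equations for the elementary steps and comparing the concentration traces against spectroscopic data. The sketch below shows that pattern for a hypothetical two-step mechanism; the species and rate constants are placeholders, not the titanocene/epoxide model of the paper.

```python
# Minimal sketch: integrate elementary-step kinetics (hypothetical mechanism).
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 0.35, 0.08  # placeholder rate constants (1/s and L/mol/s)

def rhs(t, c):
    A, B, C = c                 # A -> B (k1); B + B -> C (k2)
    return [-k1 * A, k1 * A - 2 * k2 * B * B, k2 * B * B]

sol = solve_ivp(rhs, (0.0, 60.0), [0.1, 0.0, 0.0], dense_output=True)
print(sol.y[:, -1])             # concentrations (mol/L) at t = 60 s
```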
A generalized methodology to characterize composite materials for pyrolysis models
NASA Astrophysics Data System (ADS)
McKinnon, Mark B.
The predictive capabilities of computational fire models have improved in recent years such that models have become an integral part of many research efforts. Models improve the understanding of the fire risk of materials and may decrease the number of expensive experiments required to assess the fire hazard of a specific material or designed space. A critical component of a predictive fire model is the pyrolysis sub-model that provides a mathematical representation of the rate of gaseous fuel production from condensed phase fuels given a heat flux incident to the material surface. The modern, comprehensive pyrolysis sub-models that are common today require the definition of many model parameters to accurately represent the physical description of materials that are ubiquitous in the built environment. Coupled with the increase in the number of parameters required to accurately represent the pyrolysis of materials is the increasing prevalence in the built environment of engineered composite materials that have never been measured or modeled. The motivation behind this project is to develop a systematic, generalized methodology to determine the requisite parameters to generate pyrolysis models with predictive capabilities for layered composite materials that are common in industrial and commercial applications. This methodology has been applied to four common composites in this work that exhibit a range of material structures and component materials. The methodology utilizes a multi-scale experimental approach in which each test is designed to isolate and determine a specific subset of the parameters required to define a material in the model. Data collected in simultaneous thermogravimetry and differential scanning calorimetry experiments were analyzed to determine the reaction kinetics, thermodynamic properties, and energetics of decomposition for each component of the composite. Data collected in microscale combustion calorimetry experiments were analyzed to determine the heats of complete combustion of the volatiles produced in each reaction. Inverse analyses were conducted on sample temperature data collected in bench-scale tests to determine the thermal transport parameters of each component through degradation. Simulations of quasi-one-dimensional bench-scale gasification tests generated from the resultant models using the ThermaKin modeling environment were compared to experimental data to independently validate the models.
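As an illustration of the kinetics-extraction step, the sketch below fits a single first-order Arrhenius reaction to a synthetic thermogravimetric mass-loss curve; real composite components generally require several reactions plus the thermodynamic and transport parameters described above.

```python
# Minimal sketch: fit first-order Arrhenius kinetics to a TGA-style curve.
import numpy as np
from scipy.optimize import curve_fit
from scipy.integrate import odeint

R = 8.314            # J/mol-K
beta = 10.0 / 60.0   # heating rate, K/s (10 K/min)

def mass_fraction(T, logA, E):
    # Integrate dm/dT = -(A/beta) * exp(-E/RT) * m along the temperature ramp
    def dmdT(m, Ti):
        return -(10.0 ** logA / beta) * np.exp(-E / (R * Ti)) * m
    return odeint(dmdT, 1.0, T)[:, 0]

T = np.linspace(400.0, 800.0, 200)
m_true = mass_fraction(T, 7.0, 120e3)               # synthetic "measurement"
(logA, E), _ = curve_fit(mass_fraction, T, m_true, p0=[6.0, 100e3])
print(f"A = 1e{logA:.2f} 1/s, E = {E/1e3:.1f} kJ/mol")
```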
Abidi, Mustufa Haider; Al-Ahmari, Abdulrahman; Ahmad, Ali
2018-01-01
Advanced graphics capabilities have enabled the use of virtual reality as an efficient design technique. The integration of virtual reality in the design phase still faces impediments because of issues linked to the integration of CAD and virtual reality software. A set of empirical tests using the selected conversion parameters was found to yield properly represented virtual reality models. For one response, the reduced model yields an R-sq (pred) value of 72.71% and an R-sq (adjusted) value of 86.64%, indicating that 86.64% of the response variability can be explained by the model. For another, the R-sq (pred) is 67.45%, which is not very high, indicating that the model should be further reduced by eliminating insignificant terms; the reduced model yields an R-sq (pred) value of 73.32% and an R-sq (adjusted) value of 79.49%, indicating that 79.49% of the response variability can be explained by the model. Using the optimization software MODE Frontier (Optimization, MOGA-II, 2014), four types of response surfaces for the three considered response variables were tested on the DOE data. The parameter values obtained using the proposed experimental design methodology result in better graphics quality and the other necessary design attributes.
A novel integrated framework and improved methodology of computer-aided drug design.
Chen, Calvin Yu-Chian
2013-01-01
Computer-aided drug design (CADD) is a critical initiating step of drug development, but a single model capable of covering all design aspects remains to be elucidated. Hence, we developed a drug design modeling framework that integrates multiple approaches, including machine learning based quantitative structure-activity relationship (QSAR) analysis, 3D-QSAR, Bayesian networks, pharmacophore modeling, and a structure-based docking algorithm. Restrictions for each model were defined for improved individual and overall accuracy. An integration method was applied to join the results from each model to minimize bias and errors. In addition, the integrated model adopts both static and dynamic analysis to validate the intermolecular stabilities of the receptor-ligand conformation. The proposed protocol was applied to identifying HER2 inhibitors from traditional Chinese medicine (TCM) as an example for validating the new protocol. Eight potent leads were identified from six TCM sources. A joint validation system comprising comparative molecular field analysis, comparative molecular similarity indices analysis, and molecular dynamics simulation further characterized the candidates into three potential binding conformations and validated the binding stability of each protein-ligand complex. A ligand-pathway analysis was also performed to predict ligand entry into and exit from the binding site. In summary, we propose a novel systematic CADD methodology for the identification, analysis, and characterization of drug-like candidates.
3D image processing architecture for camera phones
NASA Astrophysics Data System (ADS)
Atanassov, Kalin; Ramachandra, Vikas; Goma, Sergio R.; Aleksic, Milivoje
2011-03-01
Putting high-quality and easy-to-use 3D technology into the hands of regular consumers has become a recent challenge as interest in 3D technology has grown. Making 3D technology appealing to the average user requires that it be made fully automatic and foolproof. Designing a fully automatic 3D capture and display system requires: 1) identifying critical 3D technology issues like camera positioning, disparity control rationale, and screen geometry dependency; 2) designing a methodology to automatically control them. Implementing 3D capture functionality on phone cameras necessitates designing algorithms to fit within the processing capabilities of the device. Various constraints like sensor position tolerances, sensor 3A tolerances, post-processing, 3D video resolution, and frame rate should be carefully considered for their influence on the 3D experience. Issues with migrating functions such as zoom and pan from the 2D usage model (both during capture and display) to 3D need to be resolved to ensure the highest level of user experience. It is also very important that the 3D usage scenario (including interactions between the user and the capture/display device) be carefully considered. Finally, both the processing power of the device and the practicality of the scheme need to be taken into account while designing the calibration and processing methodology.
The CORSAGE Programme: Continuous Orbital Remote Sensing of Archipelagic Geochemical Effects
NASA Technical Reports Server (NTRS)
Acker, J. G.; Brown, C. W.; Hine, A. C.
1997-01-01
Current and pending oceanographic remote sensing technology allows the conceptualization of a programme designed to investigate ocean island interactions that could induce short-term nearshore fluxes of particulate organic carbon and biogenic calcium carbonate from pelagic island archipelagoes. These events will influence the geochemistry of adjacent waters, particularly the marine carbon system. Justification and design are provided for a study that would combine oceanographic satellite remote sensing (visible and infrared radiometry, altimetry and scatterometry) with shore-based facilities. A programme incorporating the methodology outlined here would seek to identify the mechanisms that cause such events, assess their geochemical significance, and provide both analytical and predictive capabilities for observations on greater temporal and spatial scales.
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2011-01-01
The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that, for a minimum flaw size and all greater flaw sizes, there is 0.90 probability of detection with 95% confidence (90/95 POD). Directed design of experiments for probability of detection (DOEPOD) has been developed to provide an efficient and accurate methodology that yields estimates of POD and confidence bounds for both Hit-Miss and signal-amplitude testing, where signal amplitudes are reduced to Hit-Miss by applying a signal threshold. Directed DOEPOD uses a nonparametric approach for the analysis of inspection data that does not require any assumptions about the particular functional form of a POD function. The DOEPOD procedure identifies, for a given sample set, whether or not the minimum requirement of 0.90 probability of detection with 95% confidence is demonstrated for a minimum flaw size and for all greater flaw sizes (90/95 POD). The DOEPOD procedures are executed sequentially in order to minimize the number of samples needed to demonstrate that there is a 90/95 POD lower confidence bound at a given flaw size and that the POD is monotonic for flaw sizes exceeding that 90/95 POD flaw size. The conservativeness of the DOEPOD methodology results is discussed. Validated guidelines for binomial estimation of POD for fracture-critical inspection are established.
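The nonparametric binomial bound at the heart of a 90/95 demonstration can be computed directly. The sketch below evaluates the one-sided 95% lower confidence bound on POD via the exact (Clopper-Pearson) interval; the classic result that 29 hits in 29 trials just demonstrates 90/95 POD falls out of it.

```python
# Minimal sketch: exact one-sided lower confidence bound on POD (Clopper-Pearson).
from scipy.stats import beta

def pod_lower_bound(hits, trials, confidence=0.95):
    if hits == 0:
        return 0.0
    # Lower bound is the (1 - confidence) quantile of Beta(hits, trials - hits + 1)
    return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

# 29 hits in 29 trials is the classic minimum demonstrating 90/95 POD
print(pod_lower_bound(29, 29))   # ~0.902 -> exceeds 0.90 at 95% confidence
```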
Structural equation modeling and natural systems
Grace, James B.
2006-01-01
This book, first published in 2006, presents an introduction to the methodology of structural equation modeling, illustrates its use, and goes on to argue that it has revolutionary implications for the study of natural systems. A major theme of this book is that we have, up to this point, attempted to study systems primarily using methods (such as the univariate model) that were designed only for considering individual processes. Understanding systems requires the capacity to examine simultaneous influences and responses. Structural equation modeling (SEM) has such capabilities. It also possesses many other traits that add strength to its utility as a means of making scientific progress. In light of the capabilities of SEM, it can be argued that much of ecological theory is currently locked in an immature state that impairs its relevance. It is further argued that the principles of SEM are capable of leading to the development and evaluation of multivariate theories of the sort vitally needed for the conservation of natural systems.
Dielectric Metasurface Optics: A New Platform for Compact Optical Sensing
NASA Astrophysics Data System (ADS)
Colburn, Shane
Metasurfaces, the 2D analogue of bulk metamaterials, show incredible promise for achieving nanoscale optical components that could support the growing infrastructure for the Internet of Things (IoT) and future sensing technologies. Consisting of quasiperiodic arrays of subwavelength scattering elements, metasurfaces apply spatial transfer functions to incident wavefronts, abruptly altering properties of light over a wavelength-scale thickness. By appropriately patterning scatterers on the structure, arbitrary functions can be implemented up to the limitations on the scattering properties of the particular elements. This thesis details theoretical work and simulations on the design of scattering elements with advanced capabilities for dielectric metasurfaces, showing polarization-multiplexed operation in the visible regime, multiwavelength capability in the visible regime along with a general methodology for eliminating chromatic aberrations at discrete wavelengths, and compact and tunable elements for 1550 nm operation inspired by an asymmetric Fabry-Perot cavity. These advancements enhance the capabilities of metasurfaces in the visible regime and help move toward the goal of achieving reconfigurable metasurfaces for compact and efficient optical sensors.
Practical Loop-Shaping Design of Feedback Control Systems
NASA Technical Reports Server (NTRS)
Kopasakis, George
2010-01-01
An improved methodology for designing feedback control systems has been developed, based on systematically shaping the loop gain of the system to meet performance requirements such as stability margins, disturbance attenuation, and transient response, while taking into account actuation system limitations such as actuation rates and range. Loop-shaping for controls design is not new, but past techniques do not directly address how to systematically design the controller to maximize its performance. As a result, classical feedback control systems are designed predominantly using ad hoc approaches such as proportional-integral-derivative (PID) control, with designers normally satisfied when a workable solution is achieved, without a good understanding of how to maximize the effectiveness of the control design with respect to competing performance requirements and the limitations of the plant design. The conception of this improved methodology was motivated by the challenges of designing control systems of the types needed for supersonic propulsion, but the methodology is generally applicable to any classical control-system design where the transfer function of the plant is known or can be evaluated. In the case of a supersonic aerospace vehicle, a major challenge is to design the system to attenuate anticipated external and internal disturbances, using such actuators as fuel injectors and valves, bypass doors, and ramps, all of which are subject to limitations in actuator response, rates, and ranges. Also, for supersonic vehicles with long, slender structures, coupling between the engine and the structural dynamics can produce undesirable effects that could adversely affect vehicle stability and ride quality. In order to design distributed controls that can suppress these potential adverse effects within the full capabilities of the actuation system, it is important to employ a systematic control design methodology such as this one, which can maximize the effectiveness of the control design in a methodical and quantifiable way. The emphasis is on generating simple but powerful design techniques that will allow even designers with a layman's knowledge of controls to develop effective feedback control designs. Unlike conventional ad hoc methodologies of feedback control design, this approach incorporates actuator rates into the design right from the start: the relation between actuator speeds and the desired control bandwidth of the system is established explicitly. The technique is demonstrated via design examples in a step-by-step, tutorial way. Given the actuation system rate and range limits together with design specifications in terms of stability margins, disturbance rejection, and transient response, the procedure involves designing the feedback loop gain to meet the requirements and maximize control system effectiveness without exceeding the actuation system limits and saturating the controller. Then, knowing the plant transfer function, the controller is designed so that the controller transfer function together with the plant transfer function equals the designed loop gain. The technique also shows what the limitations of the controller design are and how to trade competing design requirements such as stability margins and disturbance rejection. Finally, the technique is contrasted against more familiar control design techniques, like PID control, to show its advantages.
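A minimal numeric sketch of checking a shaped loop gain against margin requirements: evaluate L(jw) = C(jw)P(jw) on a frequency grid, locate the gain crossover, and read off the phase margin. The plant and controller below are illustrative, not taken from the report.

```python
# Minimal sketch: numeric gain-crossover and phase-margin check of a loop gain.
import numpy as np

w = np.logspace(-2, 3, 20000)
s = 1j * w
P = 1.0 / (s * (s + 2.0))            # illustrative plant
C = 40.0 * (s + 1.0) / (s + 10.0)    # illustrative lead controller
L = P * C                            # shaped loop gain

mag, phase = np.abs(L), np.unwrap(np.angle(L))
i_gc = np.argmin(np.abs(mag - 1.0))              # gain-crossover index
pm = 180.0 + np.degrees(phase[i_gc])             # phase margin, deg
print(f"crossover {w[i_gc]:.2f} rad/s, phase margin {pm:.1f} deg")
```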
NASA Technical Reports Server (NTRS)
Erickson, Gary E.
2010-01-01
Response surface methodology was used to estimate the longitudinal stage separation aerodynamic characteristics of a generic, bimese, winged multi-stage launch vehicle configuration at supersonic speeds in the NASA LaRC Unitary Plan Wind Tunnel. The Mach 3 staging was dominated by shock wave interactions between the orbiter and booster vehicles throughout the relative spatial locations of interest. The inference space was partitioned into several contiguous regions within which the separation aerodynamics were presumed to be well-behaved and estimable using central composite designs capable of fitting full second-order response functions. The underlying aerodynamic response surfaces of the booster vehicle in belly-to-belly proximity to the orbiter vehicle were estimated using piecewise-continuous lower-order polynomial functions. The quality of fit and prediction capabilities of the empirical models were assessed in detail, and the issue of subspace boundary discontinuities was addressed. Augmenting the central composite designs to full third-order using computer-generated D-optimality criteria was evaluated. The usefulness of central composite designs, the subspace sizing, and the practicality of fitting lower-order response functions over a partitioned inference space dominated by highly nonlinear and possibly discontinuous shock-induced aerodynamics are discussed.
Miniaturized unified imaging system using bio-inspired fluidic lens
NASA Astrophysics Data System (ADS)
Tsai, Frank S.; Cho, Sung Hwan; Qiao, Wen; Kim, Nam-Hyong; Lo, Yu-Hwa
2008-08-01
Miniaturized imaging systems have become ubiquitous as they are found in an ever-increasing number of devices, such as cellular phones, personal digital assistants, and web cameras. Until now, the design and fabrication methodology of such systems has not been significantly different from that of conventional cameras. The only established method to achieve focusing is by varying the lens distance. On the other hand, the variable-shape crystalline lens found in animal eyes offers inspiration for a more natural way of achieving an optical system with high functionality. Learning from the working concepts of the optics in the animal kingdom, we developed bio-inspired fluidic lenses for a miniature universal imager with auto-focusing, macro, and super-macro capabilities. Because of the enormous dynamic range of fluidic lenses, the miniature camera can even function as a microscope. To compensate for the image quality difference between central vision and peripheral vision, and the shape difference between a solid-state image sensor and a curved retina, we adopted a hybrid design consisting of fluidic lenses for tunability and fixed lenses for aberration and color dispersion correction. A design of the world's smallest surgical camera with 3X optical zoom capability is also demonstrated using the hybrid lens approach.
NASA Astrophysics Data System (ADS)
Bolon, Kevin M.
The lack of multi-day data for household travel and vehicle capability requirements is an impediment to evaluations of energy savings strategies, since (1) travel requirements vary from day-to-day, and (2) energy-saving transportation options often have reduced capability. This work demonstrates a survey methodology and modeling system for evaluating the energy-savings potential of household travel, considering multi-day travel requirements and capability constraints imposed by the available transportation resources. A stochastic scheduling model is introduced---the multi-day Household Activity Schedule Estimator (mPHASE)---which generates synthetic daily schedules based on "fuzzy" descriptions of activity characteristics using a finite-element representation of activity flexibility, coordination among household members, and scheduling conflict resolution. Results of a thirty-household pilot study are presented in which responses to an interactive computer assisted personal interview were used as inputs to the mPHASE model in order to illustrate the feasibility of generating complex, realistic multi-day household schedules. Study vehicles were equipped with digital cameras and GPS data acquisition equipment to validate the model results. The synthetically generated schedules captured an average of 60 percent of household travel distance, and exhibited many of the characteristics of complex household travel, including day-to-day travel variation, and schedule coordination among household members. Future advances in the methodology may improve the model results, such as encouraging more detailed and accurate responses by providing a selection of generated schedules during the interview. Finally, the Constraints-based Transportation Resource Assignment Model (CTRAM) is introduced. Using an enumerative optimization approach, CTRAM determines the energy-minimizing vehicle-to-trip assignment decisions, considering trip schedules, occupancy, and vehicle capability. Designed to accept either actual or synthetic schedules, results of an application of the optimization model to the 2001 and 2009 National Household Travel Survey data show that U.S. households can reduce energy use by 10 percent, on average, by modifying the assignment of existing vehicles to trips. Households in 2009 show a higher tendency to assign vehicles optimally than in 2001, and multi-vehicle households with diverse fleets have greater savings potential, indicating that fleet modification strategies may be effective, particularly under higher energy price conditions.
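A toy version of the CTRAM assignment step: enumerate vehicle-to-trip assignments, discard those that put one vehicle on two concurrent trips, and keep the energy-minimizing assignment. Trip distances, vehicle consumption rates, and the overlap rule are illustrative placeholders.

```python
# Minimal sketch: enumerative energy-minimizing vehicle-to-trip assignment.
from itertools import product

trips = {"commute": 30.0, "errand": 8.0, "school": 5.0}    # miles
vehicles = {"sedan": 0.030, "suv": 0.055}                  # gallons per mile
overlaps = [("commute", "errand")]                         # concurrent trips

best_cost, best = float("inf"), None
for combo in product(vehicles, repeat=len(trips)):         # all assignments
    assign = dict(zip(trips, combo))
    if any(assign[a] == assign[b] for a, b in overlaps):
        continue                                           # same car, same time
    cost = sum(trips[t] * vehicles[v] for t, v in assign.items())
    if cost < best_cost:
        best_cost, best = cost, assign

print(best, f"{best_cost:.2f} gal")
```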
NASA Astrophysics Data System (ADS)
Aleva, D.; McCracken, J.
This paper will overview a Cognitive Task Analysis (CTA) of the tasks accomplished by space operators in the Combat Operations Division (COD) of the Joint Space Operations Center (JSpOC). The methodology used to collect data will be presented. The work was performed in support of the AFRL Space Situation Awareness Fusion Intelligent Research Environment (SAFIRE) effort. SAFIRE is a multi-directorate program led by Air Force Research Laboratory (AFRL), Space Vehicles Directorate (AFRL/RV) and supporting Future Long Term Challenge 2.6.5. It is designed to address research areas identified from completion of a Core Process 3 effort for Joint Space Operations Center (JSpOC). The report is intended to be a resource for those developing capability in support of SAFIRE, the Joint Functional Component Command (JFCC) Space Integrated Prototype (JSIP) User-Defined Operating Picture (UDOP), and other related projects. The report is under distribution restriction; our purpose here is to expose its existence to a wider audience so that qualified individuals may access it. The report contains descriptions of the organization, its most salient products, tools, and cognitive tasks. Tasks reported are derived from the data collected and presented at multiple levels of abstraction. Recommendations for leveraging the findings of the report are presented. The report contains a number of appendices that amplify the methodology, provide background or context support, and includes references in support of cognitive task methodology. In a broad sense, the CTA is intended to be the foundation for relevant, usable capability in support of space warfighters. It presents, at an unclassified level, introductory material to familiarize inquirers with the work of the COD; this is embedded in a description of the broader context of the other divisions of the JSpOC. It does NOT provide guidance for the development of Tactics, Techniques, and Procedures (TT&Ps) in the development of JSpOC processes. However, the TT&Ps are a part of the structure of work, and are, therefore, a factor in developing future capability. The authors gratefully acknowledge the cooperation and assistance from the warfighters at the JSpOC as well as the personnel of the JSpOC Capabilities Integration Office (JCIO). Their input to the process created the value of this effort.
1985-08-01
training would again be required, include work on peripheral nerves, craniotomy and craniectomy (although approximately one-fifth of the general surgeons...said never to craniotomy and craniectomy), and closed and open reductions of fractures of facial bones. Surgical subspecialty examinations can be... [Table fragment: response counts for procedures including free skin grafts to sites excluding the face, free skin grafts to the face, craniotomy/craniectomy, and burr holes.]
2016-01-01
user, the model will use surplus inventory in one category to fill shortfalls in other categories. The TFBL model also has the capability to allow...the total force. As shown in Table 2.1, we used five-character AFSCs (four digits plus the suffix) to break out pilots in the current force as... [Table fragment: rules matching the 4th digit and suffix of an AFSC to specific aircraft designations for operational and training units.]
Direct use of linear time-domain aerodynamics in aeroservoelastic analysis: Aerodynamic model
NASA Technical Reports Server (NTRS)
Woods, J. A.; Gilbert, Michael G.
1990-01-01
The work presented here is the first part of a continuing effort to expand existing capabilities in aeroelasticity by developing the methodology which is necessary to utilize unsteady time-domain aerodynamics directly in aeroservoelastic design and analysis. The ultimate objective is to define a fully integrated state-space model of an aeroelastic vehicle's aerodynamics, structure and controls which may be used to efficiently determine the vehicle's aeroservoelastic stability. Here, the current status of developing a state-space model for linear or near-linear time-domain indicial aerodynamic forces is presented.
Shuttle TPS thermal performance and analysis methodology
NASA Technical Reports Server (NTRS)
Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.
1983-01-01
Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.
Event-Based Tone Mapping for Asynchronous Time-Based Image Sensor
Simon Chane, Camille; Ieng, Sio-Hoi; Posch, Christoph; Benosman, Ryad B.
2016-01-01
The asynchronous time-based neuromorphic image sensor ATIS is an array of autonomously operating pixels able to encode luminance information with an exceptionally high dynamic range (>143 dB). This paper introduces an event-based methodology to display data from this type of event-based imager, taking into account the large dynamic range and high temporal accuracy that go beyond available mainstream display technologies. We introduce an event-based tone mapping methodology for asynchronously acquired, time-encoded gray-level data. A global and a local tone mapping operator are proposed; both are designed to operate on a stream of incoming events rather than on time-frame windows. Experimental results on real outdoor scenes are presented to evaluate the performance of the tone mapping operators in terms of quality, temporal stability, adaptation capability, and computational time. PMID:27642275
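A rough sketch of a global, event-driven tone mapping operator in the spirit described above: each incoming gray-level event is remapped through a logarithmic curve whose range adapts from running luminance estimates, with no frame buffer involved. The adaptation rule and constants are illustrative assumptions, not the operators of the paper.

```python
# Minimal sketch: per-event global tone mapping with adaptive log range.
import numpy as np

lo, hi = 1.0, 2.0            # running luminance range estimates
decay = 0.999                # slow relaxation keeps the mapping stable

def tone_map(gray):
    """Map one event's gray level (linear, high dynamic range) to 8 bits."""
    global lo, hi
    lo = min(gray, lo * decay + gray * (1 - decay))
    hi = max(gray, hi * decay + gray * (1 - decay))
    v = (np.log(gray) - np.log(lo)) / max(np.log(hi) - np.log(lo), 1e-9)
    return int(np.clip(255 * v, 0, 255))

for g in (2.0, 50.0, 1200.0, 90000.0):   # events spanning a wide dynamic range
    print(g, "->", tone_map(g))
```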
Engineering and programming manual: Two-dimensional kinetic reference computer program (TDK)
NASA Technical Reports Server (NTRS)
Nickerson, G. R.; Dang, L. D.; Coats, D. E.
1985-01-01
The Two Dimensional Kinetics (TDK) computer program is a primary tool in applying the JANNAF liquid rocket thrust chamber performance prediction methodology. The development of a methodology that includes all aspects of rocket engine performance from analytical calculation to test measurements, that is physically accurate and consistent, and that serves as an industry and government reference is presented. Recent interest in rocket engines that operate at high expansion ratio, such as most Orbit Transfer Vehicle (OTV) engine designs, has required an extension of the analytical methods used by the TDK computer program. Thus, the version of TDK that is described in this manual is in many respects different from the 1973 version of the program. This new material reflects the new capabilities of the TDK computer program, the most important of which are described.
NASA Astrophysics Data System (ADS)
Horstemeyer, M. F.
This review of multiscale modeling covers a brief history of the various multiscale methodologies related to solid materials and the associated experimental influences, the influence of multiscale modeling on different disciplines, and some examples of multiscale modeling in the design of structural components. Although computational multiscale modeling methodologies were developed in the late twentieth century, the fundamental notions of multiscale modeling have been around since da Vinci studied different sizes of ropes. The recent rapid growth in multiscale modeling is the result of the confluence of parallel computing power, experimental capabilities to characterize structure-property relations down to the atomic level, and theories that admit multiple length scales. The now-ubiquitous research focused on multiscale modeling has spanned different disciplines (solid mechanics, fluid mechanics, materials science, physics, mathematics, biology, and chemistry), different regions of the world (most continents), and different length scales (from atoms to autos).
NASA Astrophysics Data System (ADS)
Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.
2012-12-01
Uncertainty Quantification (UQ) is a critical field within 21st-century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations by allowing uncertainties to be quantified and bounded. This powerful capability will yield new insights into scientific predictions (e.g., climate) of great impact in both national and international arenas, allow informed decisions on the design of critical experiments (e.g., ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g., nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification at LLNL, focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics, multi-scale codes; (b) incorporation of these advancements into a global UQ Pipeline (i.e., a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms; and (c) use of laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. To make advancements in several of these UQ grand challenges, I will focus in this talk on the following three research areas in our Strategic Initiative: error estimation in multi-physics and multi-scale codes; tackling the "curse of high dimensionality"; and development of an advanced UQ Computational Pipeline to enable complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g., exascale) with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
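At its simplest, the forward-propagation step such a UQ pipeline automates looks like the sketch below: sample uncertain inputs, run the model ensemble, and report confidence bounds on the output. The "model" and its parameter distributions are illustrative stand-ins.

```python
# Minimal sketch: Monte Carlo forward propagation of input uncertainty.
import numpy as np

rng = np.random.default_rng(42)

def model(climate_sensitivity, aerosol_forcing):
    # Hypothetical scalar response, e.g., warming at some horizon
    return 1.5 * climate_sensitivity + 0.8 * aerosol_forcing

n = 10000
cs = rng.normal(3.0, 0.5, n)        # uncertain parameter 1
af = rng.normal(-1.0, 0.3, n)       # uncertain parameter 2
y = model(cs, af)                   # ensemble of model outputs

lo, hi = np.percentile(y, [2.5, 97.5])
print(f"mean {y.mean():.2f}, 95% bounds [{lo:.2f}, {hi:.2f}]")
```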
Application of hybrid methodology to rotors in steady and maneuvering flight
NASA Astrophysics Data System (ADS)
Rajmohan, Nischint
Helicopters are versatile flying machines with capabilities that are unparalleled by fixed-wing aircraft, such as operating in hover and performing vertical takeoff and landing on unprepared sites. This makes their use especially desirable in military and search-and-rescue operations. However, modern helicopters still suffer from high levels of noise and vibration caused by the physical phenomena occurring in the vicinity of the rotor blades. Improving rotorcraft design to reduce noise and vibration levels therefore requires understanding of the underlying physical phenomena and accurate prediction capabilities for the resulting rotorcraft aeromechanics. The goal of this research is to study the aeromechanics of rotors in steady and maneuvering flight using a hybrid Computational Fluid Dynamics (CFD) methodology. The hybrid CFD methodology uses the Navier-Stokes equations to solve the flow near the blade surface, while the effect of the far wake is computed through a wake model. The hybrid CFD methodology is computationally efficient, and its wake modeling approach is nondissipative, making it an attractive tool to study rotorcraft aeromechanics. Several enhancements were made to the CFD methodology, and it was coupled to a Computational Structural Dynamics (CSD) methodology to perform a trimmed aeroelastic analysis of a rotor in forward flight. The coupling analyses, both loose and tight, were used to identify the key physical phenomena that affect rotors in different steady flight regimes. The modeling enhancements improved the airloads predictions for a variety of flight conditions. It was found that, for steady flight conditions, the tightly coupled method did not impact the loads significantly compared to the loosely coupled method. The coupling methodology was extended to maneuvering flight analysis by enhancing the computational and structural models to handle non-periodic flight conditions and vehicle motions in a time-accurate mode. Flight test control angles were employed to enable the maneuvering flight analysis. The fully coupled model predicted the presence of three dynamic stall cycles on the rotor in maneuver. It is important to mention that analysis of maneuvering flight requires knowledge of the pilot's control pitch settings and the vehicle states. As a result, these computational tools cannot be used to analyze loads in a maneuver that has not been duplicated in a real flight -- a significant limitation if these tools are to be used during the design phase of a helicopter, where its handling qualities are evaluated in different trajectories. Therefore, a methodology was developed to couple the CFD/CSD simulation with an inverse flight mechanics simulation to perform the maneuver analysis without using the flight test control input. The methodology showed reasonable convergence in the steady flight regime, and control angle predictions compared fairly well with test data. In the maneuvering flight regions, convergence was slower due to the relaxation techniques used for numerical stability. The computed control angles for the maneuvering flight regions nevertheless compared well with test data. Further, enhancement of the rotor inflow computations in the inverse simulation through implementation of a Lagrangian wake model improved the convergence of the coupling methodology.
Integrating automated structured analysis and design with Ada programming support environments
NASA Technical Reports Server (NTRS)
Hecht, Alan; Simmons, Andy
1986-01-01
Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification, derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance; it can also lead to the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct and aid in finding obscure coding errors; however, they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.
DOT National Transportation Integrated Search
1974-08-01
Volume 3 describes the methodology for man-machine task allocation. It contains a description of man and machine performance capabilities and an explanation of the methodology employed to allocate tasks to human or automated resources. It also presen...
Intravascular Neural Interface with Nanowire Electrode
Watanabe, Hirobumi; Takahashi, Hirokazu; Nakao, Masayuki; Walton, Kerry; Llinás, Rodolfo R.
2010-01-01
Summary: A minimally invasive electrical recording and stimulating technique capable of simultaneously monitoring the activity of a significant number (e.g., 10^3 to 10^4) of neurons is an absolute prerequisite for developing an effective brain-machine interface. Although there are many excellent methodologies for recording single or multiple neurons, there has been no methodology for accessing large numbers of cells in a behaving experimental animal or human individual. The brain vascular parenchyma is a promising candidate for addressing this problem. It has been proposed [1, 2] that a multitude of nanowire electrodes introduced into the central nervous system through the vascular system, addressing any brain area, may be a possible solution. In this study we implement a design for such a microcatheter for ex vivo experiments. Using Wollaston platinum wire, we design a submicron-scale electrode and develop a fabrication method. We then evaluate the mechanical properties of the electrode in a flow when passing through the intricacies of the capillary bed in ex vivo Xenopus laevis experiments. Furthermore, we demonstrate the feasibility of intravascular recording in the spinal cord of Xenopus laevis. PMID:21572940
Abu, Mary Ladidi; Nooh, Hisham Mohd; Oslan, Siti Nurbaya; Salleh, Abu Bakar
2017-11-10
Pichia guilliermondii was found capable of expressing a recombinant thermostable lipase without methanol under the control of the methanol-dependent alcohol oxidase 1 promoter (AOXp 1). In this study, statistical approaches were employed for the screening and optimisation of physical conditions for T1 lipase production in P. guilliermondii. Screening of six physical conditions by Plackett-Burman Design identified pH, inoculum size, and incubation time as exerting significant effects on lipase production. These three conditions were further optimised using a Box-Behnken Design of Response Surface Methodology, which predicted an optimum medium comprising pH 6, 24 h incubation time, and 2% inoculum size. A T1 lipase activity of 2.0 U/mL was produced with a biomass of OD600 23.0. Optimisation using RSM yielded a 3-fold increase of T1 lipase over the medium before optimisation. This result proves that T1 lipase can be produced at a higher yield in P. guilliermondii.
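For reference, the 12-run Plackett-Burman design used for this kind of six-factor screening can be generated from the standard cyclic generator row, as in the sketch below; factor-to-column assignment and the response data are up to the experimenter.

```python
# Minimal sketch: 12-run Plackett-Burman screening design from the cyclic generator.
import numpy as np

gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])  # standard PB generator
rows = [np.roll(gen, i) for i in range(11)]
design = np.vstack(rows + [-np.ones(11, dtype=int)])    # add the all-minus run

X = design[:, :6]        # six factors, e.g., pH, inoculum size, incubation time...
print(X)                 # 12 runs x 6 factors in coded (+1/-1) units

# Main-effect estimate for factor j from responses y: effect = X[:, j] @ y / 6
```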
The Aeronautical Data Link: Decision Framework for Architecture Analysis
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Goode, Plesent W.
2003-01-01
A decision analytic approach that develops optimal data link architecture configuration and behavior to meet multiple conflicting objectives of concurrent and different airspace operations functions has previously been developed. The approach, premised on a formal taxonomic classification that correlates data link performance with operations requirements, information requirements, and implementing technologies, provides a coherent methodology for data link architectural analysis from top-down and bottom-up perspectives. This paper follows the previous research by providing more specific approaches for mapping and transitioning between the lower levels of the decision framework. The goal of the architectural analysis methodology is to assess the impact of specific architecture configurations and behaviors on the efficiency, capacity, and safety of operations. This necessarily involves understanding the various capabilities, system level performance issues and performance and interface concepts related to the conceptual purpose of the architecture and to the underlying data link technologies. Efficient and goal-directed data link architectural network configuration is conditioned on quantifying the risks and uncertainties associated with complex structural interface decisions. Deterministic and stochastic optimal design approaches will be discussed that maximize the effectiveness of architectural designs.
NASA Astrophysics Data System (ADS)
Alemany, Kristina
Electric propulsion has recently become a viable technology for spacecraft, enabling shorter flight times, fewer required planetary gravity assists, larger payloads, and/or smaller launch vehicles. With the maturation of this technology, however, comes a new set of challenges in the area of trajectory design. Because low-thrust trajectory optimization has historically required long run-times and significant user-manipulation, mission design has relied on expert-based knowledge for selecting departure and arrival dates, times of flight, and/or target bodies and gravitational swing-bys. These choices are generally based on known configurations that have worked well in previous analyses or simply on trial and error. At the conceptual design level, however, the ability to explore the full extent of the design space is imperative to locating the best solutions in terms of mass and/or flight times. Beginning in 2005, the Global Trajectory Optimization Competition posed a series of difficult mission design problems, all requiring low-thrust propulsion and visiting one or more asteroids. These problems all had large ranges on the continuous variables---launch date, time of flight, and asteroid stay times (when applicable)---as well as being characterized by millions or even billions of possible asteroid sequences. Even with recent advances in low-thrust trajectory optimization, full enumeration of these problems was not possible within the stringent time limits of the competition. This investigation develops a systematic methodology for determining a broad suite of good solutions to the combinatorial, low-thrust, asteroid tour problem. The target application is for conceptual design, where broad exploration of the design space is critical, with the goal being to rapidly identify a reasonable number of promising solutions for future analysis. The proposed methodology has two steps. The first step applies a three-level heuristic sequence developed from the physics of the problem, which allows for efficient pruning of the design space. The second phase applies a global optimization scheme to locate a broad suite of good solutions to the reduced problem. The global optimization scheme developed combines a novel branch-and-bound algorithm with a genetic algorithm and an industry-standard low-thrust trajectory optimization program to solve for the following design variables: asteroid sequence, launch date, times of flight, and asteroid stay times. The methodology is developed based on a small sample problem, which is enumerated and solved so that all possible discretized solutions are known. The methodology is then validated by applying it to a larger intermediate sample problem, which also has a known solution. Next, the methodology is applied to several larger combinatorial asteroid rendezvous problems, using previously identified good solutions as validation benchmarks. These problems include the 2nd and 3rd Global Trajectory Optimization Competition problems. The methodology is shown to be capable of achieving a reduction in the number of asteroid sequences of 6-7 orders of magnitude, in terms of the number of sequences that require low-thrust optimization as compared to the number of sequences in the original problem. More than 70% of the previously known good solutions are identified, along with several new solutions that were not previously reported by any of the competitors. 
Overall, the methodology developed in this investigation provides an organized search technique for the low-thrust mission design of asteroid rendezvous problems.
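As a cartoon of the global-optimization layer, the sketch below runs a small genetic algorithm over (launch date, leg times of flight) for a fixed asteroid sequence. The cost function is a hypothetical placeholder for a call to a low-thrust trajectory optimizer, and the bounds and GA settings are illustrative.

```python
# Minimal sketch: genetic algorithm over continuous trajectory variables.
import numpy as np

rng = np.random.default_rng(7)
LB = np.array([0.0, 100.0, 100.0])      # launch day, TOF leg 1, TOF leg 2 (days)
UB = np.array([1500.0, 600.0, 600.0])

def cost(x):
    # Hypothetical propellant-mass proxy; a real run would call the optimizer
    t0, tof1, tof2 = x
    return (np.sin(t0 / 120.0) + 1.5) * (800.0 / tof1 + 900.0 / tof2)

pop = rng.uniform(LB, UB, size=(40, 3))
for _ in range(100):
    f = np.array([cost(x) for x in pop])
    parents = pop[np.argsort(f)[:20]]                      # truncation selection
    kids = (parents[rng.integers(0, 20, 20)] + parents[rng.integers(0, 20, 20)]) / 2
    kids += rng.normal(0, 0.02, kids.shape) * (UB - LB)    # Gaussian mutation
    pop = np.clip(np.vstack([parents, kids]), LB, UB)

best = pop[np.argmin([cost(x) for x in pop])]
print("best (t0, tof1, tof2):", np.round(best, 1))
```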
T-Epitope Designer: A HLA-peptide binding prediction server.
Kangueane, Pandjassarame; Sakharkar, Meena Kishore
2005-05-15
The current challenge in synthetic vaccine design is the development of a methodology to identify and test short antigen peptides as potential T-cell epitopes. Recently, we described a HLA-peptide binding model (using structural properties) capable of predicting peptides binding to any HLA allele. Consequently, we have developed a web server named T-EPITOPE DESIGNER to facilitate HLA-peptide binding prediction. The prediction server is based on a model that defines peptide-binding pockets using information gleaned from X-ray crystal structures of HLA-peptide complexes, followed by the estimation of peptide binding to those pockets. Thus, the prediction server enables the calculation of peptide binding to HLA alleles. This model is superior to many existing methods because of its potential application to any given HLA allele whose sequence is clearly defined. The web server finds potential application in T-cell epitope vaccine design. http://www.bioinformation.net/ted/
Statistical analysis and yield management in LED design through TCAD device simulation
NASA Astrophysics Data System (ADS)
Létay, Gergö; Ng, Wei-Choon; Schneider, Lutz; Bregy, Adrian; Pfeiffer, Michael
2007-02-01
This paper illustrates how technology computer-aided design (TCAD), which is nowadays an essential part of CMOS technology, can be applied to LED development and manufacturing. In the first part, the essential electrical and optical models inherent to LED modeling are reviewed. The second part of the work describes a methodology to improve the efficiency of the simulation procedure by using the concept of process compact models (PCMs). The last part demonstrates the capabilities of PCMs using the example of a blue InGaN LED. In particular, a parameter screening is performed to find the most important parameters, an optimization task incorporating the robustness of the design is carried out, and finally the impact of manufacturing tolerances on yield is investigated. It is indicated how the concept of PCMs can contribute to efficient design-for-manufacturing (DFM)-aware development.
Methodology for Conducting Analyses of Army Capabilities
1992-06-01
[Table of contents fragment: Determine Sensitivity of Operations to Functions; Generate Capability Issues; Package and Prioritize Issues; Identify and Assess Capability Improvements.] ...identify critical issues, and make force modernization recommendations to Headquarters, Department of the Army (HQDA). The work described in this report
Cognitive task analysis of network analysts and managers for network situational awareness
NASA Astrophysics Data System (ADS)
Erbacher, Robert F.; Frincke, Deborah A.; Wong, Pak Chung; Moody, Sarah; Fink, Glenn
2010-01-01
The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The situational-awareness capabilities being developed focus on novel visualization techniques as well as data analysis techniques designed to improve the comprehensibility of the visualizations. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understanding what their needs truly are. This paper discusses the cognitive task analysis methodology we followed to acquire feedback from the analysts. This paper also provides the details we acquired from the analysts on their processes, goals, concerns, etc. A final result we describe is the generation of a task-flow diagram.
A 3DHZETRN Code in a Spherical Uniform Sphere with Monte Carlo Verification
NASA Technical Reports Server (NTRS)
Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.
2014-01-01
The computationally efficient HZETRN code has been used in recent trade studies for lunar and Martian exploration and is currently being used in the engineering development of the next generation of space vehicles, habitats, and extravehicular activity equipment. A new version (3DHZETRN), capable of transporting high charge and energy (HZE) ions and light ions (including neutrons) under space-like boundary conditions with enhanced neutron and light ion propagation, is under development. In the present report, new algorithms for light ion and neutron propagation with well-defined convergence criteria in 3D objects are developed and tested against Monte Carlo simulations to verify the solution methodology. The code will be available through the software system OLTARIS for shield design and validation, and it provides a basis for personal computer software capable of space shield analysis and optimization.
Control system software, simulation, and robotic applications
NASA Technical Reports Server (NTRS)
Frisch, Harold P.
1991-01-01
All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculo-skeletal dynamics modeling. The program JACK will be used for estimating and animating whole-body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminate muscular load-sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capability is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, Richard G.; Maniaci, David Charles; Naughton, Jonathan W.
2015-09-01
A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations, including the Department of Energy, the National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.
NASA Astrophysics Data System (ADS)
Leuchter, S.; Reinert, F.; Müller, W.
2014-06-01
Procurement and design of system architectures capable of network-centric operations demand an assessment scheme for comparing alternative realizations. In this contribution, an assessment method for system architectures targeted at the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex and distributed software system perspective, focusing on communication, interfaces, and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. The method uses approaches from software architecture quality assessment and applies them at the system architecture level. It features a specific goal tree of several dimensions that are relevant for enterprise integration. These dimensions have to be weighed against each other and aggregated using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model-based view of systems, networks, and the enterprise. This means the method is applicable to system-of-systems specifications based on enterprise architectural frameworks relying on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context we use the NATO Architecture Framework (NAF) to ground the respective system models. The proposed assessment method allows evaluating and comparing competing system designs with regard to their future integration potential. It is a contribution to the system-of-systems engineering methodology.
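A minimal sketch of the weighted aggregation step follows, assuming a simple weighted-sum rule from normative decision theory; the dimensions, indicators, weights, and scores are invented placeholders, not values from the paper.

# Toy goal-tree aggregation: dimension -> (weight, {indicator: (weight, score)})
GOAL_TREE = {
    "interfaces":    (0.4, {"std_protocols": (0.7, 0.8), "documentation": (0.3, 0.5)}),
    "communication": (0.3, {"latency": (0.5, 0.6), "bandwidth": (0.5, 0.9)}),
    "software":      (0.3, {"modularity": (1.0, 0.7)}),
}

def integration_score(tree) -> float:
    """Weighted sum over dimensions of weighted indicator scores in [0, 1]."""
    total = 0.0
    for _, (w_dim, indicators) in tree.items():
        dim_score = sum(w * s for w, s in indicators.values())
        total += w_dim * dim_score
    return total

print(f"integration capability score: {integration_score(GOAL_TREE):.3f}")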
Analysis of the impact of safeguards criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mullen, M.F.; Reardon, P.T.
As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; and there are important interactions between variables. That is, the effect of a given variable often depends on the level or value of some other variable. With the methodology used in this study, these interactions can be quantitatively analyzed, and reasonably good approximate prediction equations can be developed.
NASA Technical Reports Server (NTRS)
Zhu, Dongming
2017-01-01
Environmental barrier coatings (EBCs) are considered technologically important because of the critical need to effectively protect turbine hot-section SiC/SiC ceramic matrix composite (CMC) components in harsh engine combustion environments. The development of NASA's advanced environmental barrier coatings has been aimed at significantly improving the coating system's temperature capability, stability, erosion-impact resistance, and CMAS resistance for SiC/SiC turbine airfoil and combustor component applications. The NASA environmental barrier coating developments have also emphasized thermo-mechanical creep and fatigue resistance in simulated engine heat flux and environments. Experimental results and models for advanced EBC systems will be presented to help establish advanced EBC composition design methodologies, performance modeling, and life predictions for achieving prime-reliant, durable environmental coating systems for 2700-3000 F engine component applications. Major technical barriers in developing environmental barrier coating systems and in integrating the coatings with next-generation composites having further improved temperature capability, environmental stability, and EBC-CMC fatigue-environment system durability will be discussed.
The space station assembly phase: System design trade-offs for the flight telerobotic servicer
NASA Technical Reports Server (NTRS)
Smith, Jeffrey H.; Gyamfi, Max; Volkmer, Kent; Zimmerman, Wayne
1988-01-01
The results of a recent study aimed at identifying key issues and trade-offs associated with using a Flight Telerobotic Servicer (FTS) to aid in Space Station assembly-phase tasks are described. The use of automation and robotics (A and R) technologies for large space systems often involves a substitution of automation capabilities for human EVA or IVA activities. A methodology is presented that incorporates assessment of candidate assembly-phase tasks, telerobotic performance capabilities, development costs, and effects of operational constraints. Changes in the region of cost-effectiveness are examined under a variety of system design assumptions. A discussion of issues is presented with focus on three roles the FTS might serve: as a research-oriented test bed to learn more about space usage of telerobotics; as a research-based test bed having an experimental demonstration orientation with limited assembly and servicing applications; or as an operational system to augment EVA, aid construction of the Space Station, and reduce program (schedule) risk by increasing the flexibility of mission operations.
Critical Technology Determination for Future Human Space Flight
NASA Technical Reports Server (NTRS)
Mercer, Carolyn R.; Vangen, Scott D.; Williams-Byrd, Julie A.; Stecklein, Jonette M.; Rahman, Shamim A.; Rosenthal, Matthew E.; Hornyak, David M.; Alexander, Leslie; Korsmeyer, David J.; Tu, Eugene L.;
2012-01-01
As the National Aeronautics and Space Administration (NASA) prepares to extend human presence throughout the solar system, technical capabilities must be developed to enable long duration flights to destinations such as near Earth asteroids, Mars, and extended stays on the Moon. As part of the NASA Human Spaceflight Architecture Team, a Technology Development Assessment Team has identified a suite of critical technologies needed to support this broad range of missions. Dialog between mission planners, vehicle developers, and technologists was used to identify a minimum but sufficient set of technologies, noting that needs are created by specific mission architecture requirements, yet specific designs are enabled by technologies. Further consideration was given to the re-use of underlying technologies to cover multiple missions to effectively use scarce resources. This suite of critical technologies is expected to provide the needed base capability to enable a variety of possible destinations and missions. This paper describes the methodology used to provide an architecture-driven technology development assessment ("technology pull"), including technology advancement needs identified by trade studies encompassing a spectrum of flight elements and destination design reference missions.
Adjoint-Based Mesh Adaptation for the Sonic Boom Signature Loudness
NASA Technical Reports Server (NTRS)
Rallabhandi, Sriram K.; Park, Michael A.
2017-01-01
The mesh adaptation functionality of FUN3D is utilized to obtain a mesh optimized to calculate sonic boom ground signature loudness. During this process, the coupling between the discrete adjoints of the computational fluid dynamics tool FUN3D and the atmospheric propagation tool sBOOM is exploited to form the error estimate. This new mesh adaptation methodology allows generation of suitable meshes adapted to reduce the estimated errors in the ground loudness, which is an optimization metric employed in supersonic aircraft design. This new output-based adaptation could allow new insights into meshing for sonic boom analysis and design, and it complements existing output-based adaptation techniques such as adaptation to reduce estimated errors in the off-body pressure functional. This effort could also have implications for other coupled multidisciplinary adjoint capabilities (e.g., aeroelasticity), as well as for the inclusion of propagation-specific parameters such as prevailing winds or non-standard atmospheric conditions. Results are discussed in the context of existing methods, and appropriate conclusions are drawn as to the efficacy and efficiency of the developed capability.
Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design
NASA Technical Reports Server (NTRS)
Wuerer, J. E.; Gran, M.; Held, T. W.
1994-01-01
The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.
NASA Technical Reports Server (NTRS)
Hall, Edward; Isaacs, James; Henriksen, Steve; Zelkin, Natalie
2011-01-01
This report is provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract NNC05CA85C, Task 7: New ATM Requirements-Future Communications, C-Band and L-Band Communications Standard Development, and was based on direction provided by FAA project-level agreements for New ATM Requirements-Future Communications. Task 7 included two subtasks. Subtask 7-1 addressed C-band (5091- to 5150-MHz) airport surface data communications standards development, systems engineering, test bed and prototype development, and tests and demonstrations to establish operational capability for the Aeronautical Mobile Airport Communications System (AeroMACS). Subtask 7-2 focused on systems engineering and development support of the L-band digital aeronautical communications system (L-DACS). Subtask 7-1 consisted of two phases. Phase I included development of AeroMACS concepts of use, requirements, architecture, and initial high-level safety risk assessment. Phase II builds on Phase I results and is presented in two volumes. Volume I (this document) is devoted to concepts of use, system requirements, and architecture, including AeroMACS design considerations. Volume II describes an AeroMACS prototype evaluation and presents final AeroMACS recommendations. This report also describes airport categorization and channelization methodologies. The purposes of the airport categorization task were (1) to facilitate initial AeroMACS architecture designs and enable budgetary projections by creating a set of airport categories based on common airport characteristics and design objectives, and (2) to offer high-level guidance to potential AeroMACS technology and policy development sponsors and service providers. A channelization plan methodology was developed because a common global methodology is needed to assure seamless interoperability among diverse AeroMACS services potentially supplied by multiple service providers.
Berti, Federico; Frecer, Vladimir; Miertus, Stanislav
2014-01-01
Despite the fact that HIV protease has been a target for over 20 years, computational approaches to the rational design of its inhibitors still have great potential to stimulate the synthesis of new compounds and the discovery of new, potent derivatives capable of overcoming the problem of drug resistance. This review deals with successful examples of inhibitors identified by computational approaches rather than by knowledge-based design. Such methodologies include the development of energy and scoring functions, docking protocols, statistical models, and virtual combinatorial chemistry. Computations addressing drug resistance, and the development of related models such as the substrate envelope hypothesis, are also reviewed. In some cases, the identified structures required the development of synthetic approaches in order to obtain the desired target molecules; several examples are reported.
Doing our best: optimization and the management of risk.
Ben-Haim, Yakov
2012-08-01
Tools and concepts of optimization are widespread in decision-making, design, and planning. There is a moral imperative to "do our best." Optimization underlies theories in physics and biology, and economic theories often presume that economic agents are optimizers. We argue that in decisions under uncertainty, what should be optimized is robustness rather than performance. We discuss the equity premium puzzle from financial economics, and explain that the puzzle can be resolved by using the strategy of satisficing rather than optimizing. We discuss design of critical technological infrastructure, showing that satisficing of performance requirements--rather than optimizing them--is a preferable design concept. We explore the need for disaster recovery capability and its methodological dilemma. The disparate domains--economics and engineering--illuminate different aspects of the challenge of uncertainty and of the significance of robust-satisficing. © 2012 Society for Risk Analysis.
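To make the robust-satisficing argument concrete, the toy calculation below compares a nominally optimal design with a nominally inferior but less sensitive one: the latter tolerates more uncertainty before violating the performance requirement. The model and all numbers are invented for illustration and are not taken from the article.

import numpy as np

REQUIREMENT = 10.0   # minimum acceptable performance (satisficing threshold)

def performance(design, u):
    """Toy model: nominal performance degraded by an uncertain disturbance u."""
    nominal = {"optimized": 15.0, "conservative": 12.5}[design]
    sensitivity = {"optimized": 4.0, "conservative": 1.5}[design]
    return nominal - sensitivity * u

def robustness(design):
    """Largest uncertainty h such that performance >= REQUIREMENT for all
    |u| <= h (the worst case here is u = h)."""
    hs = np.linspace(0.0, 5.0, 5001)
    ok = performance(design, hs) >= REQUIREMENT
    return hs[ok][-1] if ok.any() else 0.0

for design in ("optimized", "conservative"):
    print(design, "robustness:", robustness(design))
# The nominally better design is less robust: under deep uncertainty,
# satisficing the requirement favors the conservative design.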
LSD (Landing System Development) Impact Simulation
NASA Astrophysics Data System (ADS)
Ullio, R.; Riva, N.; Pellegrino, P.; Deloo, P.
2012-07-01
In the frame of the Exploration Programs, a soft landing on the planet surface is foreseen. To ensure a successful final landing phase, a landing system using tripod-design landing legs with an adequate crushable damping system was selected, capable of absorbing the residual velocities (vertical, horizontal, and angular) at touchdown and ensuring stability. TAS-I developed a numerical nonlinear dynamic methodology for the landing impact simulation of the Lander system using a commercial explicit finite element analysis code (Altair RADIOSS). In this paper the most significant FE modeling approaches and results of the analytical simulation of landing impact are reported, especially with respect to the definition of leg dimensioning loads and the design update of selected parts (where necessary).
Usability Testing and Analysis Facility (UTAF)
NASA Technical Reports Server (NTRS)
Wong, Douglas T.
2010-01-01
This slide presentation reviews the work of the Usability Testing and Analysis Facility (UTAF) at NASA Johnson Space Center, one of the Space Human Factors Laboratories in the Habitability and Human Factors Branch (SF3). The primary focus of the UTAF is to perform human factors evaluation and usability testing of crew/vehicle interfaces. The presentation reviews the UTAF expertise and capabilities, its processes and methodologies, and the equipment available. It also reviews the programs the facility has supported, detailing the human engineering activities in support of the design of the Orion spacecraft, testing of the EVA integrated spacesuit, and work done for the design of the lunar projects of the Constellation Program: Altair, the Lunar Electric Rover, and Outposts.
Design of a 0.13-μm CMOS cascade expandable ΣΔ modulator for multi-standard RF telecom systems
NASA Astrophysics Data System (ADS)
Morgado, Alonso; del Río, Rocío; de la Rosa, José M.
2007-05-01
This paper reports a 130-nm CMOS programmable cascade ΣΔ modulator for multi-standard wireless terminals, capable of operating on three standards: GSM, Bluetooth, and UMTS. The modulator is reconfigured at both the architecture and circuit levels in order to adapt its performance to the different standards' specifications with optimized power consumption. The design of the building blocks is based upon a top-down CAD methodology that combines simulation and statistical optimization at different levels of the system hierarchy. Transistor-level simulations show correct operation for all standards, featuring 13-bit, 11.3-bit, and 9-bit effective resolution within 200-kHz, 1-MHz, and 4-MHz bandwidth, respectively.
NASA Astrophysics Data System (ADS)
Villanueva Perez, Carlos Hernan
Computational design optimization provides designers with automated techniques to develop novel and non-intuitive optimal designs. Topology optimization is a design optimization technique that allows for the evolution of a broad variety of geometries in the optimization process. Traditional density-based topology optimization methods often lack a sufficient resolution of the geometry and physical response, which prevents direct use of the optimized design in manufacturing and the accurate modeling of the physical response of boundary conditions. The goal of this thesis is to introduce a unified topology optimization framework that uses the Level Set Method (LSM) to describe the design geometry and the eXtended Finite Element Method (XFEM) to solve the governing equations and measure the performance of the design. The methodology is presented as an alternative to density-based optimization approaches, and is able to accommodate a broad range of engineering design problems. The framework presents state-of-the-art methods for immersed boundary techniques to stabilize the systems of equations and enforce the boundary conditions, and is studied with applications in 2D and 3D linear elastic structures, incompressible flow, and energy and species transport problems to test the robustness and the characteristics of the method. A comparison of the framework against density-based topology optimization approaches is studied with regard to convergence, performance, and the capability to manufacture the designs. Furthermore, the ability to control the shape of the design to operate within manufacturing constraints is developed and studied. The analysis capability of the framework is validated quantitatively through comparison against previous benchmark studies, and qualitatively through its application to topology optimization problems. The design optimization problems converged to intuitive designs that closely resembled the results of previous 2D and density-based studies.
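For readers unfamiliar with the level-set description, the sketch below shows the core idea in a few lines: the sign of a level-set function classifies each grid point as material or void, and the zero isocontour is the design boundary that an XFEM analysis would resolve exactly. The geometry and grid are arbitrary illustrations, not part of the thesis framework.

import numpy as np

n = 201
x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))

# Signed-distance-like level set: a plate with a circular hole
phi = np.sqrt((x - 0.5) ** 2 + (y - 0.5) ** 2) - 0.25  # phi < 0 inside hole

material = phi > 0.0              # solid where phi is positive
print(f"solid area fraction: {material.mean():.3f}")   # ~ 1 - pi * 0.25**2

# In an XFEM analysis, elements cut by the phi = 0 isocontour would be
# subdivided so the boundary is represented exactly, rather than smeared
# as in density-based approaches.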
AGT (Advanced Gas Turbine) technology project
NASA Technical Reports Server (NTRS)
1988-01-01
An overall summary documentation is provided for the Advanced Gas Turbine Technology Project conducted by the Allison Gas Turbine Division of General Motors. This advanced, high-risk work was initiated in October 1979 under charter from the U.S. Congress to promote an engine for transportation that would provide an alternative to reciprocating spark ignition (SI) engines for the U.S. automotive industry and simultaneously establish the feasibility of advanced ceramic materials for hot-section components to be used in an automotive gas turbine. As this program evolved, dictates of available funding, Government charter, and technical developments caused program emphases to focus on the development and demonstration of the ceramic turbine hot section and away from the development of engine and powertrain technologies and subsequent vehicular demonstrations. Program technical performance concluded in June 1987. The AGT 100 program successfully achieved project objectives with significant technology advances. Specific AGT 100 program achievements are: (1) Ceramic component feasibility for use in gas turbine engines has been demonstrated; (2) A new, 100 hp engine was designed, fabricated, and tested for 572 hours at operating temperatures up to 2200 F, uncooled; (3) Statistical design methodology has been applied and correlated to experimental data acquired from over 5500 hours of rig and engine testing; (4) Ceramic component processing capability has progressed from a rudimentary level able to fabricate simple parts to a sophisticated level able to provide complex geometries such as rotors and scrolls; (5) Required improvements for monolithic and composite ceramic gas turbine components to meet automotive reliability, performance, and cost goals have been identified; (6) The combustor design demonstrated lower emissions than 1986 Federal Standards on methanol, JP-5, and diesel fuel, demonstrating the potential for meeting emission standards with multifuel capability; (7) Small turbine engine aerodynamic and mechanical design capability has been initiated; and (8) An infrastructure of manpower, facilities, materials, and fabrication capabilities has been established which is available for continued development of ceramic component technology in gas turbine and other heat engines.
Engineering Concepts in Stem Cell Research.
Narayanan, Karthikeyan; Mishra, Sachin; Singh, Satnam; Pei, Ming; Gulyas, Balazs; Padmanabhan, Parasuraman
2017-12-01
The field of regenerative medicine integrates advancements made in stem cells, molecular biology, engineering, and clinical methodologies. Stem cells serve as a fundamental ingredient for therapeutic application in regenerative medicine. Apart from stem cells, engineering concepts have equally contributed to the success of stem cell based applications in improving human health. The purpose of various engineering methodologies is to develop regenerative and preventive medicine to combat various diseases and deformities. The explosion of stem cell discoveries and their implementation in clinical settings warrants new engineering concepts and new biomaterials. Biomaterials, microfluidics, and nanotechnology are the major engineering concepts used for the implementation of stem cells in regenerative medicine. Many of these engineering technologies target the specific niche of the cell for better functional capability. Controlling the niche is the key for various developmental activities leading to organogenesis and tissue homeostasis. Biomimetic understanding has not only helped to improve the design of matrices or scaffolds by incorporating suitable biological and physical components, but has also ultimately aided the adoption of designs that give these materials and devices better function. Adoption of engineering concepts in stem cell research has improved overall achievement; however, several important issues, such as long-term effects with respect to systems biology, need to be addressed. Here, in this review, the authors highlight some interesting breakthroughs in stem cell biology that use engineering methodologies. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Numerical Simulation of Flow in a Whirling Annular Seal and Comparison with Experiments
NASA Technical Reports Server (NTRS)
Athavale, M. M.; Hendricks, R. C.; Steinetz, B. M.
1995-01-01
The turbulent flow field in a simulated annular seal with a large clearance/radius ratio (0.015) and a whirling rotor was simulated using an advanced 3D CFD code, SCISEAL. A circular whirl orbit with synchronous whirl was imposed on the rotor center. The flow field was rendered quasi-steady by a transformation to a rotating frame. A standard k-epsilon model with wall functions was used to treat the turbulence. Experimentally measured values of flow parameters were used to specify the seal inlet and exit boundary conditions. The computed flow field, in terms of velocity and pressure, is compared with the experimental measurements inside the seal. The agreement between the numerical results and the corrected experimental data is fair to good. The capability of current advanced CFD methodology to analyze this complex flow field is demonstrated. The methodology can also be extended to other whirl frequencies. Half- (or sub-) synchronous (fluid film unstable motion) and synchronous (rotor centrifugal force unbalance) whirls are the most unstable whirl modes in turbomachinery seals, and the flow code's capability of simulating the flows in steady as well as whirling seals will prove extremely useful in the design, analysis, and performance prediction of annular as well as other types of seals.
NASA Astrophysics Data System (ADS)
van Rooij, Michael P. C.
Current turbomachinery design systems increasingly rely on multistage Computational Fluid Dynamics (CFD) as a means to assess performance of designs. However, design weaknesses attributed to improper stage matching are addressed using often ineffective strategies involving a costly iterative loop between blading modification, revision of design intent, and evaluation of aerodynamic performance. A design methodology is presented which greatly improves the process of achieving design-point aerodynamic matching. It is based on a three-dimensional viscous inverse design method which generates the blade camber surface based on prescribed pressure loading, thickness distribution and stacking line. This inverse design method has been extended to allow blading analysis and design in a multi-blade row environment. Blade row coupling was achieved through a mixing plane approximation. Parallel computing capability in the form of MPI has been implemented to reduce the computational time for multistage calculations. Improvements have been made to the flow solver to reach the level of accuracy required for multistage calculations. These include inclusion of heat flux, temperature-dependent treatment of viscosity, and improved calculation of stress components and artificial dissipation near solid walls. A validation study confirmed that the obtained accuracy is satisfactory at design point conditions. Improvements have also been made to the inverse method to increase robustness and design fidelity. These include the possibility to exclude spanwise sections of the blade near the endwalls from the design process, and a scheme that adjusts the specified loading area for changes resulting from the leading and trailing edge treatment. Furthermore, a pressure loading manager has been developed. Its function is to automatically adjust the pressure loading area distribution during the design calculation in order to achieve a specified design objective. Possible objectives are overall mass flow and compression ratio, and radial distribution of exit flow angle. To supplement the loading manager, mass flow inlet and exit boundary conditions have been implemented. Through appropriate combination of pressure or mass flow inflow/outflow boundary conditions and loading manager objectives, increased control over the design intent can be obtained. The three-dimensional multistage inverse design method with pressure loading manager was demonstrated to offer greatly enhanced blade row matching capabilities. Multistage design allows for simultaneous design of blade rows in a mutually interacting environment, which permits the redesigned blading to adapt to changing aerodynamic conditions resulting from the redesign. This ensures that the obtained blading geometry and performance implied by the prescribed pressure loading distribution are consistent with operation in the multi-blade row environment. The developed methodology offers high aerodynamic design quality and productivity, and constitutes a significant improvement over existing approaches used to address design-point aerodynamic matching.
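As a conceptual illustration of the loading-manager idea, the sketch below adjusts a single loading-scale parameter until a target mass flow is met. The "solver" is a toy monotone relation, and the target, sensitivity, and relaxation factor are assumptions; the actual method wraps a 3D viscous multistage inverse design calculation, not this simple update.

TARGET_MDOT = 50.0   # kg/s, assumed design objective
SENSITIVITY = 10.0   # assumed d(mdot)/d(scale), kg/s per unit scale
RELAX = 0.5          # under-relaxation factor

def evaluate_mass_flow(loading_scale):
    # Toy stand-in for a multistage flow solution, monotone in loading
    return 42.0 + SENSITIVITY * loading_scale

scale = 1.0
for iteration in range(50):
    mdot = evaluate_mass_flow(scale)
    error = TARGET_MDOT - mdot
    if abs(error) < 1e-9:
        break
    scale += RELAX * error / SENSITIVITY   # proportional loading update
print(f"converged: scale = {scale:.4f}, mass flow = {mdot:.4f} kg/s")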
Generalized Subset Designs in Analytical Chemistry.
Surowiec, Izabella; Vikström, Ludvig; Hector, Gustaf; Johansson, Erik; Vikström, Conny; Trygg, Johan
2017-06-20
Design of experiments (DOE) is an established methodology in research, development, manufacturing, and production for screening, optimization, and robustness testing. Two-level fractional factorial designs remain the preferred approach due to high information content while keeping the number of experiments low. These types of designs, however, have never been extended to a generalized multilevel reduced design type capable of including both qualitative and quantitative factors. In this Article we describe a novel generalized fractional factorial design. In addition, it also provides complementary and balanced subdesigns analogous to a fold-over in two-level reduced factorial designs. We demonstrate how this design type can be applied with good results in three different applications in analytical chemistry, including (a) multivariate calibration using microwave resonance spectroscopy for the determination of water in tablets, (b) a stability study in drug product development, and (c) representative sample selection in clinical studies. This demonstrates the potential of generalized fractional factorial designs to be applied in many other areas of analytical chemistry where representative, balanced, and complementary subsets are required, especially when a combination of quantitative and qualitative factors at multiple levels exists.
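For context, the classical two-level construction that this work generalizes can be written in a few lines: a 2^(3-1) half-fraction with generator C = AB and its fold-over, whose union recovers the full factorial. This sketch shows only the standard two-level case, not the article's multilevel, mixed-factor algorithm.

from itertools import product

# Half-fraction of a 2^3 factorial: generator C = AB (4 of 8 runs)
design = [(a, b, a * b) for a, b in product([-1, 1], repeat=2)]
fold_over = [tuple(-x for x in run) for run in design]   # C = -AB half

print("half-fraction:", design)
print("fold-over:    ", fold_over)
# Together, the two complementary subsets recover the full factorial —
# the balance property that generalized subset designs extend.
assert sorted(design + fold_over) == sorted(product([-1, 1], repeat=3))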
Supporting Space Systems Design via Systems Dependency Analysis Methodology
NASA Astrophysics Data System (ADS)
Guariniello, Cesare
The increasing size and complexity of space systems and space missions pose severe challenges to space systems engineers. When complex systems and Systems-of-Systems are involved, the behavior of the whole entity is not only due to that of the individual systems involved but also to the interactions and dependencies between the systems. Dependencies can be varied and complex, and designers usually do not perform analysis of the impact of dependencies at the level of complex systems, or this analysis involves excessive computational cost, or occurs at a later stage of the design process, after designers have already set detailed requirements, following a bottom-up approach. While classical systems engineering attempts to integrate the perspectives involved across the variety of engineering disciplines and the objectives of multiple stakeholders, there is still a need for more effective tools and methods capable to identify, analyze and quantify properties of the complex system as a whole and to model explicitly the effect of some of the features that characterize complex systems. This research describes the development and usage of Systems Operational Dependency Analysis and Systems Developmental Dependency Analysis, two methods based on parametric models of the behavior of complex systems, one in the operational domain and one in the developmental domain. The parameters of the developed models have intuitive meaning, are usable with subjective and quantitative data alike, and give direct insight into the causes of observed, and possibly emergent, behavior. The approach proposed in this dissertation combines models of one-to-one dependencies among systems and between systems and capabilities, to analyze and evaluate the impact of failures or delays on the outcome of the whole complex system. The analysis accounts for cascading effects, partial operational failures, multiple failures or delays, and partial developmental dependencies. The user of these methods can assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. 
A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at mid and high levels, in the design process of architectures for the exploration of Mars. The case study also shows that the methods do not replace classical systems engineering methodologies, but support and improve them.
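A heavily simplified sketch of the operational-dependency idea is given below: each system's operability is capped by the operability of the systems it depends on, scaled by a strength of dependency. The linear propagation rule, the example systems, and all numbers are assumptions for illustration and do not reproduce the dissertation's actual SODA formulation.

INTERNAL = {"power": 1.0, "comms": 1.0, "nav": 1.0, "rover": 1.0}

# (consumer, producer, strength of dependency in [0, 1]),
# listed in topological order so one pass suffices
DEPENDENCIES = [
    ("comms", "power", 0.9),
    ("nav",   "comms", 0.6),
    ("rover", "power", 0.8),
    ("rover", "nav",   0.5),
]

def operability(internal):
    """Propagate degradation through the dependency graph."""
    op = dict(internal)
    for consumer, producer, strength in DEPENDENCIES:
        # Consumer capped by producer performance, scaled by dependency
        op[consumer] = min(op[consumer],
                           1.0 - strength * (1.0 - op[producer]))
    return op

# Failure scenario: power degraded to 40% capability cascades downstream
print(operability({**INTERNAL, "power": 0.4}))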
Methodology for the systems engineering process. Volume 3: Operational availability
NASA Technical Reports Server (NTRS)
Nelson, J. H.
1972-01-01
A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
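For orientation, the familiar steady-state form of operational availability can be stated in one line; the report treats the parameter in more generality (availability states and probability-time capability), so the function below is only the textbook special case.

def operational_availability(mtbm_hours: float, mdt_hours: float) -> float:
    # A_o = uptime / (uptime + downtime), expressed with mean time
    # between maintenance (MTBM) and mean downtime (MDT)
    return mtbm_hours / (mtbm_hours + mdt_hours)

print(operational_availability(mtbm_hours=500.0, mdt_hours=25.0))  # ~0.952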
Challenges in evaluating cancer as a clinical outcome in postapproval studies of drug safety
Pinheiro, Simone P.; Rivera, Donna R.; Graham, David J.; Freedman, Andrew N.; Major, Jacqueline M.; Penberthy, Lynne; Levenson, Mark; Bradley, Marie C.; Wong, Hui-Lee; Ouellet-Hellstrom, Rita
2017-01-01
Pharmaceuticals approved in the United States are largely not known human carcinogens. However, cancer signals associated with pharmaceuticals may be hypothesized or arise after product approval. There are many study designs that can be used to evaluate cancer as an outcome in the postapproval setting. Because prospective systematic collection of cancer outcomes from a large number of individuals may be lengthy, expensive, and challenging, leveraging data from large existing databases is an integral approach. Such studies have the capability to evaluate the clinical experience of a large number of individuals, yet there are unique methodological challenges involved in their use to evaluate cancer outcomes. To discuss methodological challenges and potential solutions, the Food and Drug Administration and the National Cancer Institute convened a two-day public meeting in 2014. This commentary summarizes the most salient issues discussed at the meeting. PMID:27663208
CARES/Life Ceramics Durability Evaluation Software Enhanced for Cyclic Fatigue
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.
1999-01-01
The CARES/Life computer program predicts the probability of a monolithic ceramic component's failure as a function of time in service. The program has many features and options for materials evaluation and component design. It couples commercial finite element programs--which resolve a component's temperature and stress distribution--to reliability evaluation and fracture mechanics routines for modeling strength-limiting defects. The capability, flexibility, and uniqueness of CARES/Life have attracted many users representing a broad range of interests and have resulted in numerous awards for technological achievements and technology transfer. Recent work with CARES/Life was directed at enhancing the program's capabilities with regard to cyclic fatigue. Only in the last few years have ceramics been recognized to be susceptible to enhanced degradation from cyclic loading. To account for cyclic loads, researchers at the NASA Lewis Research Center developed a crack growth model that combines the Power Law (time-dependent) and the Walker Law (cycle-dependent) crack growth models. This combined model has the characteristics of Power Law behavior (decreased damage) at high R ratios (minimum load/maximum load) and of Walker Law behavior (increased damage) at low R ratios. In addition, a parameter estimation methodology for constant-amplitude, steady-state cyclic fatigue experiments was developed using nonlinear least squares and a modified Levenberg-Marquardt algorithm. This methodology is used to give best estimates of parameter values from cyclic fatigue specimen rupture data (usually tensile or flexure bar specimens) for a relatively small number of specimens. Methodology to account for runout data (unfailed specimens over the duration of the experiment) was also included.
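The sketch below illustrates the parameter-estimation step on a simplified, Walker-law-only lifetime model: synthetic rupture data are generated and the parameters recovered by linear least squares in log space. This is a stand-in for the nonlinear least-squares/Levenberg-Marquardt procedure described above; the model form, variable names, and data are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)
A_true, n_true, m_true = 1e12, 10.0, 0.6

s_max = rng.uniform(150.0, 300.0, 40)        # peak stress, MPa (synthetic)
R = rng.uniform(0.0, 0.8, 40)                # R ratio = min/max load
walker = s_max * (1.0 - R) ** m_true         # Walker effective stress
N_f = A_true * walker ** (-n_true) * rng.lognormal(0.0, 0.1, 40)  # cycles

# log N = log A - n log s_max - n m log(1 - R): linear in the unknowns
X = np.column_stack([np.ones_like(s_max), np.log(s_max), np.log(1.0 - R)])
beta, *_ = np.linalg.lstsq(X, np.log(N_f), rcond=None)
logA, n_est, m_est = beta[0], -beta[1], beta[2] / beta[1]
print("A ~", np.exp(logA), " n ~", n_est, " m ~", m_est)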
Design optimum frac jobs using virtual intelligence techniques
NASA Astrophysics Data System (ADS)
Mohaghegh, Shahab; Popa, Andrei; Ameri, Sam
2000-10-01
Designing optimal frac jobs is a complex and time-consuming process. It usually involves the use of a two- or three-dimensional computer model. For the computer models to perform as intended, a wealth of input data is required. The input data include wellbore configuration and reservoir characteristics such as porosity, permeability, stress, and thickness profiles of the pay layers as well as the overburden layers. Other essential information required for the design process includes fracturing fluid type and volume, proppant type and volume, injection rate, proppant concentration, and frac job schedule. Some parameters, such as fluid and proppant types, have discrete possible choices. Other parameters, such as fluid and proppant volumes, assume values from within a range of minimum and maximum values. A potential frac design for a particular pay zone is a combination of all of these parameters. Finding the optimum combination is not a trivial process; it usually requires an experienced engineer and a considerable amount of time to tune the parameters in order to achieve a desirable outcome. This paper introduces a new methodology that integrates two virtual intelligence techniques, namely artificial neural networks and genetic algorithms, to automate and simplify the optimum frac job design process. This methodology requires little input from the engineer beyond the reservoir characterization and wellbore configuration. The software tool developed on the basis of this methodology uses the reservoir characteristics and an optimization criterion indicated by the engineer, for example a certain propped frac length, and provides the details of the optimum frac design that will satisfy that criterion. An ensemble of neural networks is trained to mimic the two- or three-dimensional frac simulator. Once successfully trained, these networks are capable of providing instantaneous results in response to any set of input parameters. These networks are used as the fitness function for a genetic algorithm routine that searches for the best combination of design parameters for the frac job. The genetic algorithm searches through the entire solution space and identifies the optimal combination of parameters to be used in the design process. Considering the complexity of the task, this methodology converges relatively fast, providing the engineer with several near-optimum scenarios for the frac job design. These scenarios, which can be generated in just a minute or two, can be valuable starting points for the engineer's design work, saving hours of runs on the simulator.
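A compact sketch of the surrogate-plus-search idea follows: a cheap regression model stands in for the frac simulator, and an evolutionary loop uses it as the fitness function. The "simulator", the random-feature surrogate (used here instead of a trained neural network ensemble for brevity), and all parameter ranges are invented for illustration.

import numpy as np

rng = np.random.default_rng(2)

def fake_simulator(x):
    # Stand-in for a 2D/3D frac model: "propped length" as a function of
    # normalized fluid and proppant volumes in [0, 1]
    return np.sin(3.0 * x[..., 0]) * x[..., 1] + 0.5 * x[..., 0]

# Train a cheap surrogate on simulator samples (random-feature regression)
X = rng.uniform(0.0, 1.0, (500, 2))
y = fake_simulator(X)
W = rng.normal(size=(2, 50))
coef, *_ = np.linalg.lstsq(np.tanh(X @ W), y, rcond=None)

def surrogate(x):
    # Instantaneous prediction in place of a full simulator run
    return np.tanh(x @ W) @ coef

# Evolutionary search using the surrogate as the fitness function
pop = rng.uniform(0.0, 1.0, (60, 2))
for generation in range(100):
    fitness = surrogate(pop)
    parents = pop[np.argsort(fitness)[-30:]]             # keep the best half
    offspring = parents[rng.integers(0, 30, size=60)]    # resample parents
    pop = np.clip(offspring + rng.normal(0.0, 0.05, (60, 2)), 0.0, 1.0)

best = pop[np.argmax(surrogate(pop))]
print("near-optimum design:", best, "predicted length:", surrogate(best))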
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Lu, Shuai
2008-07-15
This report presents a methodology developed to study the future impact of wind on BPA power system load following and regulation requirements. The methodology uses historical data and stochastic processes to simulate the load balancing processes in the BPA power system, mimicking actual power system operations. The results are therefore close to reality, yet a study based on this methodology is convenient to conduct. Existing methodologies for similar analyses include dispatch model simulation and standard deviation evaluation on load and wind data. Dispatch model simulation is constrained by the design of the dispatch program, and standard deviation evaluation is artificial in separating the load following and regulation requirements; neither usually reflects actual operational practice. The methodology used in this study provides not only capacity requirement information; it also analyzes the ramp rate requirements for system load following and regulation processes. The ramp rate data can be used to evaluate generator response/maneuverability requirements, another necessary capability of the generation fleet for the smooth integration of wind energy. The study results are presented in an innovative way such that the increased generation capacity or ramp requirements are compared for two different years, across 24 hours a day. Therefore, the impact of different levels of wind energy on generation requirements at different times can be easily visualized.
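One common way to separate the balancing requirements, sketched below on synthetic data, is to treat the hourly schedule as the slow component, the within-hour ramp between schedule points as load following, and the fast residual as regulation. The signal, window choices, and decomposition rule are illustrative assumptions, not BPA's actual procedure.

import numpy as np

rng = np.random.default_rng(3)
minutes = np.arange(24 * 60)
load = (3500.0 + 500.0 * np.sin(2 * np.pi * minutes / (24 * 60))
        + rng.normal(0.0, 20.0, minutes.size))            # MW, synthetic

hourly = load.reshape(24, 60).mean(axis=1)                 # hourly schedule
schedule = np.repeat(hourly, 60)                           # stepwise hold
ramped = np.interp(minutes, np.arange(30, 24 * 60, 60), hourly)
following = ramped - schedule                              # within-hour ramp
regulation = load - ramped                                 # fast residual

print("load-following capacity (MW):", np.ptp(following).round(1))
print("regulation capacity (MW):", np.ptp(regulation).round(1))
print("max regulation ramp (MW/min):",
      np.abs(np.diff(regulation)).max().round(1))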
Advanced Technologies and Methodology for Automated Ultrasonic Testing Systems Quantification
DOT National Transportation Integrated Search
2011-04-29
For automated ultrasonic testing (AUT) detection and sizing accuracy, this program developed a methodology for quantification of AUT systems, advancing and quantifying AUT systems' image-capture capabilities, quantifying the performance of multiple AUT...
Calibration of CORSIM models under saturated traffic flow conditions.
DOT National Transportation Integrated Search
2013-09-01
This study proposes a methodology to calibrate microscopic traffic flow simulation models. : The proposed methodology has the capability to calibrate simultaneously all the calibration : parameters as well as demand patterns for any network topology....
A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions
NASA Technical Reports Server (NTRS)
Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver
2016-01-01
Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.
Spacecraft Conceptual Design Compared to the Apollo Lunar Lander
NASA Technical Reports Server (NTRS)
Young, C.; Bowie, J.; Rust, R.; Lenius, J.; Anderson, M.; Connolly, J.
2011-01-01
Future human exploration of the Moon will require an optimized spacecraft design with each sub-system achieving the required minimum capability and maintaining high reliability. The objective of this study was to trade capability against reliability and minimize mass for the lunar lander spacecraft. The NASA parametric concept for a 3-person vehicle to the lunar surface with a 30% mass margin was considerably heavier than the Apollo 15 Lunar Module "as flown" mass of 16.4 metric tons. The additional mass was attributed to mission requirements and system design choices that were made to meet the realities of modern spaceflight. The parametric tool used to size the current concept, Envision, accounts for primary and secondary mass requirements. For example, adding an astronaut increases the mass requirements for suits, water, food, and oxygen, as well as the required volume. The environmental control sub-systems become heavier with the increased requirements, and more structure is needed to support the additional mass. There is also an increase in propellant usage. For comparison, an "Apollo-like" vehicle was created by removing these additional requirements. Utilizing the Envision parametric mass calculation tool and a quantitative reliability estimation tool designed by Valador Inc., it was determined that with today's technology a Lunar Module (LM) with Apollo capability could be built with less mass and similar reliability. The reliability of this new lander was compared to the Apollo Lunar Module utilizing the same methodology, adjusting for mission timeline changes as well as component differences. Interestingly, the parametric concept's overall estimated risk for loss of mission (LOM) and loss of crew (LOC) did not significantly improve when compared to Apollo.
Assessment, development, and application of combustor aerothermal models
NASA Technical Reports Server (NTRS)
Holdeman, J. D.; Mongia, H. C.; Mularz, E. J.
1989-01-01
The gas turbine combustion system design and development effort is an engineering exercise to obtain an acceptable solution to the conflicting design trade-offs between combustion efficiency, gaseous emissions, smoke, ignition, restart, lean blowout, burner exit temperature quality, structural durability, and life cycle cost. For many years, these combustor design trade-offs have been carried out with the help of fundamental reasoning and extensive component and bench testing, backed by empirical and experience correlations. Recent advances in the capability of computational fluid dynamics codes have led to their application to complex 3-D flows such as those in the gas turbine combustor. A number of U.S. Government and industry sponsored programs have made significant contributions to the formulation, development, and verification of an analytical combustor design methodology which will better define the aerothermal loads in a combustor, and be a valuable tool for design of future combustion systems. The contributions made by NASA Hot Section Technology (HOST) sponsored Aerothermal Modeling and supporting programs are described.
Design and validation of the eyesafe ladar testbed (ELT) using the LadarSIM system simulator
NASA Astrophysics Data System (ADS)
Neilsen, Kevin D.; Budge, Scott E.; Pack, Robert T.; Fullmer, R. Rees; Cook, T. Dean
2009-05-01
The development of an experimental full-waveform LADAR system has been enhanced with the assistance of the LadarSIM system simulation software. The Eyesafe LADAR Test-bed (ELT) was designed as a raster scanning, single-beam, energy-detection LADAR with the capability of digitizing and recording the return pulse waveform at up to 2 GHz for 3D off-line image formation research in the laboratory. To assist in the design phase, the full-waveform LADAR simulation in LadarSIM was used to simulate the expected return waveforms for various system design parameters, target characteristics, and target ranges. Once the design was finalized and the ELT constructed, the measured specifications of the system and experimental data captured from the operational sensor were used to validate the behavior of the system as predicted during the design phase. This paper presents the methodology used, and lessons learned from this "design, build, validate" process. Simulated results from the design phase are presented, and these are compared to simulated results using measured system parameters and operational sensor data. The advantages of this simulation-based process are also presented.
NASA Technical Reports Server (NTRS)
Fogel, L. J.; Calabrese, P. G.; Walsh, M. J.; Owens, A. J.
1982-01-01
Ways in which autonomous behavior of spacecraft can be extended to treat situations wherein closed-loop control by a human may not be appropriate or even possible are explored. Predictive models that minimize mean squared error and arbitrary cost functions are discussed. A methodology for extracting cyclic components for an arbitrary environment with respect to usual and arbitrary criteria is developed. An approach to prediction and control based on evolutionary programming is outlined. A computer program capable of predicting time series is presented. The design of a control system for a robotic device with partially unknown physical properties is presented.
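The flavor of an evolutionary-programming time-series predictor of the kind described above can be sketched as follows: candidate linear (AR) predictors are mutated with Gaussian noise and the better half survives each generation. This is a generic reconstruction of the technique, not the 1982 program itself; the series, population size, and mutation scale are all illustrative.

```python
# Evolutionary-programming sketch: evolve coefficients of a linear (AR)
# predictor by Gaussian mutation, scored by one-step prediction error.
import math
import random

def mse(coeffs, series, order):
    err = 0.0
    for t in range(order, len(series)):
        pred = sum(c * series[t - 1 - i] for i, c in enumerate(coeffs))
        err += (series[t] - pred) ** 2
    return err / (len(series) - order)

def evolve_predictor(series, order=3, pop=20, gens=200):
    population = [[random.uniform(-1, 1) for _ in range(order)] for _ in range(pop)]
    for _ in range(gens):
        offspring = [[c + random.gauss(0, 0.05) for c in p] for p in population]
        scored = sorted(population + offspring, key=lambda p: mse(p, series, order))
        population = scored[:pop]          # keep the better half
    return population[0]

# Noisy periodic series as a stand-in for the "cyclic components" above.
data = [math.sin(0.4 * t) + random.gauss(0, 0.05) for t in range(200)]
best = evolve_predictor(data)
print("evolved AR coefficients:", [round(c, 3) for c in best])
```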
Communications processor for C3 analysis and wargaming
NASA Astrophysics Data System (ADS)
Clark, L. N.; Pless, L. D.; Rapp, R. L.
1982-03-01
This thesis developed the software capability to allow the investigation of C3 problems, procedures, and methodologies. The resultant communications model, while independent of a specific wargame, is currently implemented in conjunction with the McClintic Theater Model. It provides a computerized message handling system (C3 Model) which allows simulation of communication links (circuits) with user-definable delays; garble and loss rates; and multiple circuit types, addresses, and levels of command. It is designed to be used for test and evaluation of command and control problems in the areas of organizational relationships, communication networks and procedures, and combat doctrine or tactics.
Programmed LWR metrology by multi-techniques approach
NASA Astrophysics Data System (ADS)
Reche, Jérôme; Besacier, Maxime; Gergaud, Patrice; Blancquaert, Yoann; Freychet, Guillaume; Labbaye, Thibault
2018-03-01
Nowadays, roughness control presents a major challenge for the lithography step. For advanced nodes, this morphological aspect reaches the same order of magnitude as the Critical Dimension. Hence, the control of roughness needs an adapted metrology. In this study, specific samples with designed roughness have been manufactured using e-beam lithography. These samples have been characterized with three different methodologies: CD-SEM, OCD, and SAXS. The main goal of the project is to compare the capability of each of these techniques in terms of reliability, type of information obtained, time to obtain the measurements, and level of maturity for the industry.
Aerodynamic and acoustic test of a United Technologies model scale rotor at DNW
NASA Technical Reports Server (NTRS)
Yu, Yung H.; Liu, Sandy R.; Jordan, Dave E.; Landgrebe, Anton J.; Lorber, Peter F.; Pollack, Michael J.; Martin, Ruth M.
1990-01-01
The UTC model scale rotors, the DNW wind tunnel, the AFDD rotary wing test stand, the UTRC and AFDD aerodynamic and acoustic data acquisition systems, and the scope of test matrices are discussed and an introduction to the test results is provided. It is pointed out that a comprehensive aero/acoustic database of several configurations of the UTC scaled model rotor has been created. The data is expected to improve understanding of rotor aerodynamics, acoustics, and dynamics, and lead to enhanced analytical methodology and design capabilities for the next generation of rotorcraft.
NASA Astrophysics Data System (ADS)
Pini, Giovanni; Tuci, Elio
2008-06-01
In biology/psychology, the capability of natural organisms to learn from observation of and interaction with conspecifics is referred to as social learning. Roboticists have recently developed an interest in social learning, since it might represent an effective strategy to enhance the adaptivity of a team of autonomous robots. In this study, we show that a methodological approach based on artificial neural networks shaped by evolutionary computation techniques can be successfully employed to synthesise the individual and social learning mechanisms for robots required to learn a desired action (i.e. phototaxis or antiphototaxis).
Implementation of a production Ada project: The GRODY study
NASA Technical Reports Server (NTRS)
Godfrey, Sara; Brophy, Carolyn Elizabeth
1989-01-01
The use of the Ada language and design methodologies that encourage full use of its capabilities have a strong impact on all phases of the software development project life cycle. At the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC), the Software Engineering Laboratory (SEL) conducted an experiment in parallel development of two flight dynamics systems in FORTRAN and Ada. The differences observed during the implementation, unit testing, and integration phases of the two projects are described and the lessons learned during the implementation phase of the Ada development are outlined. Included are recommendations for future Ada development projects.
NASA Astrophysics Data System (ADS)
Alzbutas, Robertas
2015-04-01
In general, the Emergency Planning Zones (EPZ) are defined, and plant site and arrangement structures are designed, to minimize the potential for natural and manmade hazards external to the plant to affect the plant safety related functions, which can in turn affect the nearby population and environment. This may include consideration of extreme winds, fires, flooding, aircraft crash, seismic activity, etc. Thus the design basis for the plant and site is deeply related to the effects of any postulated external events and to the limits of the plant's capability to cope with accidents, i.e., to perform safety functions. It has been observed that the Probabilistic Safety Assessment (PSA) methodologies to deal with EPZ and extreme external events have not reached the same level of maturity as those for severe internal events. As a prime example of an advanced reactor and new Nuclear Power Plant (NPP) with enhanced safety, the International Reactor Innovative and Secure (IRIS) design and the site selection for a new NPP in Lithuania were considered in this work. In the Safety-by-Design™ approach used, the PSA obviously played a key role; therefore a preliminary IRIS PSA was developed along with the design. For the design and pre-licensing process of IRIS, the external events analysis included both qualitative evaluation and quantitative assessment. As a result of preliminary qualitative analyses, the external events that were chosen for more detailed quantitative scoping evaluation were high winds and tornadoes, aircraft crash, and seismic events. For the site selection in Lithuania, a detailed site evaluation process was performed and related to the EPZ and risk zoning considerations. In general, applying the quantitative assessment, bounding site characteristics could be used in order to optimize potential redefinition of, or future restrictions on, plant siting and risk zoning. It must be noticed that the use of existing regulations and installations as the basis for this redefinition will not in any way impact the high degree of conservatism inherent in current regulations. Moreover, the remapping process makes this methodology partially independent of the uncertainties still affecting probabilistic techniques. Notwithstanding these considerations, it is still expected that applying this methodology to advanced plant designs with improved safety features will allow significant changes in the emergency planning requirements, and specifically the size of the EPZ. In particular, in the case of IRIS it is expected that taking full credit of the Safety-by-Design™ approach of the IRIS reactor will allow dramatic changes in the EPZ, while still maintaining a level of protection to the public fully consistent with existing regulations.
Sun, Guilin; Muneer, Badar; Li, Ying; Zhu, Qi
2018-04-01
This paper presents an ultracompact design of biomedical implantable devices with integrated wireless power transfer (WPT) and RF transmission capabilities for implantable medical applications. By reusing the spiral coil in an implantable device, both RF transmission and WPT are realized without the performance degradation of both functions in ultracompact size. The complete theory of WPT based on magnetic resonant coupling is discussed and the design methodology of an integrated structure is presented in detail, which can guide the design effectively. A system with an external power transmitter and implantable structure is fabricated to validate the proposed approach. The experimental results show that the implantable structure can receive power wirelessly at 39.86 MHz with power transfer efficiency of 47.2% and can also simultaneously radiate at 2.45 GHz with an impedance bandwidth of 10.8% and a gain of -15.71 dBi in the desired direction. Furthermore, sensitivity analyses are carried out with the help of experiment and simulation. The results reveal that the system has strong tolerance to nonideal conditions. Additionally, the specific absorption rate distribution is evaluated in the light of strict IEEE standards. The results reveal that the implantable structure can receive up to 115 mW power from an external transmitter and radiate 6.4 dBm of power safely.
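For context, the maximum link efficiency of a resonantly coupled two-coil system is commonly written in terms of the coupling coefficient k and the coil quality factors Q1 and Q2 (a standard textbook result, not an equation taken from this paper):

```latex
% Maximum achievable link efficiency of a resonantly coupled two-coil system
\eta_{\max} \;=\; \frac{k^{2} Q_{1} Q_{2}}{\left(1 + \sqrt{1 + k^{2} Q_{1} Q_{2}}\right)^{2}}
```

Under this idealized model, the reported 47.2% efficiency would correspond to a figure of merit k^2 Q_1 Q_2 of roughly 7.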
1978-09-01
This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies ... of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a
NASA Technical Reports Server (NTRS)
Onwubiko, Chinyere; Onyebueke, Landon
1996-01-01
This program report is the final report covering all the work done on this project. The goal of this project is technology transfer of methodologies to improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to either undergraduate or graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology to be included in a graduate-level design methodology course. 5. To study the relationship between probabilistic design methodology and axiomatic design methodology.
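The core idea behind the probabilistic design analysis taught with NESSUS can be conveyed with a generic Monte Carlo probability-of-failure estimate. The snippet below is a classroom-style sketch with invented stress/strength distributions; it is not the NESSUS API, only the underlying concept.

```python
# Generic Monte Carlo probability-of-failure sketch: sample random strength
# and load, and count limit-state violations g(x) < 0.
import random

def limit_state(strength, load):
    return strength - load          # g < 0 means failure

def prob_failure(n=100_000):
    failures = 0
    for _ in range(n):
        strength = random.gauss(mu=500.0, sigma=40.0)   # e.g., yield stress, MPa
        load = random.gauss(mu=350.0, sigma=50.0)       # applied stress, MPa
        if limit_state(strength, load) < 0:
            failures += 1
    return failures / n

print(f"Estimated P_f = {prob_failure():.4f}")
```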
An Overview of Controls and Flying Qualities Technology on the F/A-18 High Alpha Research Vehicle
NASA Technical Reports Server (NTRS)
Pahle, Joseph W.; Wichman, Keith D.; Foster, John V.; Bundick, W. Thomas
1996-01-01
The NASA F/A-18 High Alpha Research Vehicle (HARV) has been the flight test bed of a focused technology effort to significantly increase maneuvering capability at high angles of attack. Development and flight test of control law design methodologies, handling qualities metrics, performance guidelines, and flight evaluation maneuvers are described. The HARV has been modified to include two research control effectors, thrust vectoring and actuated forebody strakes, in order to provide increased control power at high angles of attack. A research flight control system has been used to provide a flexible, easily modified capability for high-angle-of-attack research controls. Different control law design techniques have been implemented and flight-tested, including eigenstructure assignment, variable gain output feedback, pseudo controls, and model-following. Extensive piloted simulation has been used to develop nonlinear performance guidelines and handling qualities criteria for high angles of attack. This paper reviews the development and evaluation of technologies useful for high-angle-of-attack control. Design, development, and flight test of the research flight control system, control laws, flying qualities specifications, and flight test maneuvers are described. Flight test results are used to illustrate some of the lessons learned during flight test and handling qualities evaluations.
Methodologies for processing plant material into acceptable food on a small scale
NASA Technical Reports Server (NTRS)
Parks, Thomas R.; Bindon, John N.; Bowles, Anthony J. G.; Golbitz, Peter; Lampi, Rauno A.; Marquardt, Robert F.
1994-01-01
Based on the Controlled Environment Life Support System (CELSS) production of only four crops (wheat, white potatoes, soybeans, and sweet potatoes); a crew size of twelve; a daily planting/harvesting regimen; and zero-gravity conditions, estimates were made of the quantity of food that would need to be grown to provide adequate nutrition and the corresponding amount of biomass that would result. Projections were made of the various types of products that could be made from these crops, the unit operations that would be involved, and what menu capability these products could provide. Equipment requirements to perform these unit operations were screened to identify commercially available units capable of operating (or being modified to operate) under CELSS/zero-gravity conditions. Concept designs were developed for those equipment needs for which no suitable units were commercially available. Prototypes of selected concept designs were constructed and tested on a laboratory scale, as were selected commercially available units. This report discusses the practical considerations taken into account in the various design alternatives, some of the many product/process factors that relate to equipment development, and automation alternatives. Recommendations are made on both general and specific areas in which it was felt additional investigation would benefit CELSS missions.
Reinventing The Design Process: Teams and Models
NASA Technical Reports Server (NTRS)
Wall, Stephen D.
1999-01-01
The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.
NASA Astrophysics Data System (ADS)
Moffitt, Blake Almy
Unmanned Aerial Vehicles (UAVs) are the most dynamic growth sector of the aerospace industry today. The need to provide persistent intelligence, surveillance, and reconnaissance for military operations is driving the planned acquisition of over 5,000 UAVs over the next five years. The most pressing need is for quiet, small UAVs with endurance beyond what is possible with advanced batteries or small internal combustion propulsion systems. Fuel cell systems demonstrate high efficiency, high specific energy, low noise, low temperature operation, modularity, and rapid refuelability, making them a promising enabler of the small, quiet, and persistent UAVs that military planners are seeking. Despite the perceived benefits, the actual near-term performance of fuel cell powered UAVs is unknown. Until the auto industry began spending billions of dollars in research, fuel cell systems were too heavy for useful flight applications. However, the last decade has seen rapid development, with fuel cell gravimetric and volumetric power density nearly doubling every 2 to 3 years. As a result, a few design studies and demonstrator aircraft have appeared, but overall the design methodology and vehicles are still in their infancy. The design of fuel cell aircraft poses many challenges. Fuel cells differ fundamentally from combustion-based propulsion in how they generate power and interact with other aircraft subsystems. As a result, traditional multidisciplinary analysis (MDA) codes are inappropriate. Building new MDAs is difficult since fuel cells are rapidly changing in design, and various competitive architectures exist for balance of plant, hydrogen storage, and all-electric aircraft subsystems. In addition, fuel cell design and performance data is closely protected, which makes validation difficult and uncertainty significant. Finally, low specific power and high volumes compared to traditional combustion-based propulsion result in more highly constrained design spaces that are problematic for design space exploration. To begin addressing the current gaps in fuel cell aircraft development, a methodology has been developed to explore and characterize the near-term performance of fuel cell powered UAVs. The first step of the methodology is the development of a valid MDA. This is accomplished by using propagated uncertainty estimates to guide the decomposition of an MDA into key contributing analyses (CAs) that can be individually refined and validated to increase the overall accuracy of the MDA. To assist in MDA development, a flexible framework for simultaneously solving the CAs is specified. This enables the MDA to be easily adapted to changes in technology and the changes in data that occur throughout a design process. Various CAs that model a polymer electrolyte membrane fuel cell (PEMFC) UAV are developed, validated, and shown to be in agreement with hardware-in-the-loop simulations of a fully developed fuel cell propulsion system. After creating a valid MDA, the final step of the methodology is the synthesis of the MDA with an uncertainty propagation analysis, an optimization routine, and a chance constrained problem formulation. This synthesis allows an efficient calculation of the probabilistic constraint boundaries and Pareto frontiers that will govern the design space and influence design decisions relating to optimization and uncertainty mitigation. A key element of the methodology is uncertainty propagation.
The methodology uses Systems Sensitivity Analysis (SSA) to estimate the uncertainty of key performance metrics due to uncertainties in design variables and uncertainties in the accuracy of the CAs. A summary of SSA is given, along with key rules for properly decomposing an MDA for use with SSA. Verification of SSA uncertainty estimates via Monte Carlo simulations is provided for both an example problem and a detailed MDA of a fuel cell UAV. Implementation of the methodology was performed on a small fuel cell UAV designed to carry a 2.2 kg payload with 24 hours of endurance. Uncertainty distributions for both the design variables and the CAs were estimated based on experimental results and were found to dominate the design space. To reduce uncertainty and test the flexibility of the MDA framework, CAs were replaced with either empirical or semi-empirical relationships during the optimization process. The final design was validated via a hardware-in-the-loop simulation. Finally, the fuel cell UAV probabilistic design space was studied. A graphical representation of the design space was generated and the optima due to deterministic and probabilistic constraints were identified. The methodology was used to identify Pareto frontiers of the design space, which were shown on contour plots of the design space. Unanticipated discontinuities of the Pareto fronts were observed as different constraints became active, providing useful information on which to base design and development decisions.
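A minimal illustration of the sensitivity-based propagation that SSA formalizes, checked against Monte Carlo as the abstract describes, might look like the following. The endurance model and all distributions are toy assumptions, not the dissertation's MDA.

```python
# First-order (sensitivity-based) uncertainty propagation vs. Monte Carlo
# on a toy endurance model: endurance = tank mass * specific energy / power.
import math
import random

def endurance_hr(tank_kg, specific_energy_wh_kg, cruise_power_w):
    return tank_kg * specific_energy_wh_kg / cruise_power_w

x0 = (3.0, 1500.0, 180.0)           # nominal hydrogen mass, Wh/kg, W
sigmas = (0.1, 120.0, 15.0)         # assumed 1-sigma uncertainties

# First-order propagation: var(y) ~= sum (df/dx_i)^2 * sigma_i^2
h = 1e-6
grads = []
for i in range(3):
    xp = list(x0)
    xp[i] += h
    grads.append((endurance_hr(*xp) - endurance_hr(*x0)) / h)
var_fo = sum(g * g * s * s for g, s in zip(grads, sigmas))

# Monte Carlo check of the same variance estimate.
samples = [endurance_hr(*(random.gauss(m, s) for m, s in zip(x0, sigmas)))
           for _ in range(50_000)]
mean = sum(samples) / len(samples)
var_mc = sum((y - mean) ** 2 for y in samples) / (len(samples) - 1)
print(f"first-order sigma = {math.sqrt(var_fo):.3f} hr, MC sigma = {math.sqrt(var_mc):.3f} hr")
```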
NASA Technical Reports Server (NTRS)
Pototzky, Anthony; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek
1991-01-01
Described here is the development and implementation of an on-line, near-real-time controller performance evaluation (CPE) capability. Briefly discussed are the structure of data flow, the signal processing methods used to process the data, and the software developed to generate the transfer functions. This methodology is generic in nature and can be used in any type of multi-input/multi-output (MIMO) digital controller application, including digital flight control systems, digitally controlled spacecraft structures, and actively controlled wind tunnel models. Results of applying the CPE methodology to evaluate (in near real time) MIMO digital flutter suppression systems being tested on the Rockwell Active Flexible Wing (AFW) wind tunnel model are presented to demonstrate the CPE capability.
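The heart of such a CPE capability is estimating transfer functions from excitation and response records. A generic H1 spectral estimate of that kind, using an invented single-channel plant rather than the AFW hardware, could look like this:

```python
# Generic H1 transfer-function estimate from input/output records:
# H(f) = Pxy / Pxx from Welch cross- and auto-spectra.
import numpy as np
from scipy.signal import csd, welch, lfilter

fs = 200.0                                   # sample rate, Hz
t = np.arange(0, 60, 1 / fs)
x = np.random.randn(t.size)                  # broadband excitation
# Toy single-input/single-output plant: a lightly damped mode near 10 Hz.
b = [0.05]
a = [1.0, -2 * 0.97 * np.cos(2 * np.pi * 10 / fs), 0.97 ** 2]
y = lfilter(b, a, x)

f, pxy = csd(x, y, fs=fs, nperseg=1024)      # input/output cross-spectrum
_, pxx = welch(x, fs=fs, nperseg=1024)       # input auto-spectrum
H = pxy / pxx                                # H1 frequency-response estimate
print(f"estimated resonance near {f[np.argmax(np.abs(H))]:.1f} Hz")
```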
Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
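For readers unfamiliar with the baseline against which the unilevel method is measured, the standard nested (double-loop) RBDO formulation can be written as follows (textbook form, not notation specific to this dissertation):

```latex
% Nested (double-loop) RBDO: the outer loop optimizes the design d,
% the inner loop evaluates each probabilistic constraint.
\begin{aligned}
\min_{d}\quad & f(d) \\
\text{s.t.}\quad & P\left[g_i(d, X) \le 0\right] \le \Phi(-\beta_i^{t}),
\qquad i = 1,\dots,m
\end{aligned}
```

Here X collects the random variables, g_i are the limit-state functions (failure when g_i <= 0), beta_i^t are target reliability indices, and Phi is the standard normal CDF. The nesting is what makes the approach expensive: every outer design iteration triggers an inner reliability analysis per constraint, which is the cost the unilevel and decoupled formulations described above avoid.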
Aircraft integrated design and analysis: A classroom experience
NASA Technical Reports Server (NTRS)
1988-01-01
AAE 451 is the capstone course required of all senior undergraduates in the School of Aeronautics and Astronautics at Purdue University. During the past year the first steps of a long evolutionary process were taken to change the content and expectations of this course. These changes are the result of the advanced computational capabilities and sophisticated electronic media available at Purdue. This presentation will describe both the long-range objectives and this year's experience using the High Speed Commercial Transport (HSCT) design, the AIAA Long Duration Aircraft design, and a Remotely Piloted Vehicle (RPV) design proposal as project objectives. The central goal of these efforts was to provide a user-friendly, computer-software-based environment to supplement traditional design course methodology. The Purdue University Computer Center (PUCC), the Engineering Computer Network (ECN), and stand-alone PCs were used for this development. This year's accomplishments centered primarily on aerodynamics software obtained from the NASA Langley Research Center and its integration into the classroom. Word processor capability for oral and written work and computer graphics were also blended into the course. A total of 10 HSCT designs were generated, ranging from twin-fuselage and forward-swept wing aircraft to the more traditional delta and double-delta wing aircraft. Four Long Duration Aircraft designs were submitted, together with one RPV design tailored for photographic surveillance. Supporting these activities were three video satellite lectures beamed from NASA/Langley to Purdue. These lectures covered diverse areas such as an overview of HSCT design, supersonic-aircraft stability and control, and optimization of aircraft performance. Plans for next year's effort will be reviewed, including dedicated computer workstation utilization, remote satellite lectures, and university/industrial cooperative efforts.
Object-oriented Approach to High-level Network Monitoring and Management
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi
2000-01-01
An absolute prerequisite for the management of large computer networks is the ability to measure their performance: unless we monitor a system, we cannot hope to manage and control its performance. In this paper, we describe a network monitoring system that we are currently designing and implementing, and we investigate methods to build high-level monitoring systems on top of existing monitoring tools. Keeping in mind the complexity of the task and the required flexibility for future changes, and due to the heterogeneous nature of the underlying systems at NASA Langley Research Center, we use an object-oriented approach for the design. First, we use UML (Unified Modeling Language) to model users' requirements. Second, we identify the existing capabilities of the underlying monitoring system. Third, we try to map the former with the latter. The system is built using the APIs offered by the HP OpenView system.
Integrated orbital servicing study for low-cost payload programs. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Derocher, W. L., Jr.
1975-01-01
Various operating methodologies to achieve low-cost space operations were investigated as part of the Space Transportation System (STS) planning. The emphasis was to show that the development investment, initial fleet costs, and supporting facilities for the STS could be effectively offset by exploiting the capabilities of the STS to satisfy mission requirements and reduce the cost of payload programs. The following major conclusions were reached: (1) the development of an on-orbit servicer maintenance system is compatible with many spacecraft programs and is recommended as the most cost-effective system, (2) spacecraft can be designed to be serviceable with acceptable design, weight, volume, and cost effects, (3) use of on-orbit servicing over a 12-year period results in savings ranging between four and nine billion dollars, (4) the pivoting arm on-orbit servicer was selected and a preliminary design was prepared, (5) orbital maintenance has no significant impact on the STS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mathieu, Johanna L.; Gadgil, Ashok J.; Kowolik, Kristin
2009-09-14
Researchers have invented a material called ARUBA (Arsenic Removal Using Bottom Ash) that effectively and affordably removes arsenic from Bangladesh groundwater. Through analysis of studies across a range of disciplines, observations, and informal interviews conducted over three trips to Bangladesh, we have applied mechanical engineering design methodology to develop eight key design strategies, which were used in the development of a low-cost, community-scale water treatment system that uses ARUBA to remove arsenic from drinking water. We have constructed, tested, and analysed a scale version of the system. Experiments have shown that the system is capable of reducing high levels of arsenic (nearly 600 ppb) to below the Bangladesh standard of 50 ppb, while remaining affordable to people living on less than US$2/day. The system could be sustainably implemented as a public-private partnership in rural Bangladesh.
Numerical methods for engine-airframe integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murthy, S.N.B.; Paynter, G.C.
1986-01-01
Various papers on numerical methods for engine-airframe integration are presented. The individual topics considered include: scientific computing environment for the 1980s, overview of prediction of complex turbulent flows, numerical solutions of the compressible Navier-Stokes equations, elements of computational engine/airframe integrations, computational requirements for efficient engine installation, application of CAE and CFD techniques to complete tactical missile design, CFD applications to engine/airframe integration, and application of second-generation low-order panel methods to powerplant installation studies. Also addressed are: three-dimensional flow analysis of turboprop inlet and nacelle configurations, application of computational methods to the design of large turbofan engine nacelles, comparison of full potential and Euler solution algorithms for aeropropulsive flow field computations, subsonic/transonic, supersonic nozzle flows and nozzle integration, subsonic/transonic prediction capabilities for nozzle/afterbody configurations, three-dimensional viscous design methodology of supersonic inlet systems for advanced technology aircraft, and a user's technology assessment.
Hyper-X Mach 7 Scramjet Design, Ground Test and Flight Results
NASA Technical Reports Server (NTRS)
Ferlemann, Shelly M.; McClinton, Charles R.; Rock, Ken E.; Voland, Randy T.
2005-01-01
The successful Mach 7 flight test of the Hyper-X (X-43) research vehicle has provided the major, essential demonstration of the capability of the airframe integrated scramjet engine. This flight was a crucial first step toward realizing the potential for airbreathing hypersonic propulsion for application to space launch vehicles. However, it is not sufficient to have just achieved a successful flight. The more useful knowledge gained from the flight is how well the prediction methods matched the actual test results in order to have confidence that these methods can be applied to the design of other scramjet engines and powered vehicles. The propulsion predictions for the Mach 7 flight test were calculated using the computer code, SRGULL, with input from computational fluid dynamics (CFD) and wind tunnel tests. This paper will discuss the evolution of the Mach 7 Hyper-X engine, ground wind tunnel experiments, propulsion prediction methodology, flight results and validation of design methods.
APPROACHES TO GEOMETRIC DATA ANALYSIS ON BIG AREA ADDITIVELY MANUFACTURED (BAAM) PARTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dreifus, Gregory D; Ally, Nadya R; Post, Brian K
The promise of additive manufacturing is that a user can design and print complex geometries that are very difficult, if not impossible, to machine. The capabilities of 3D printing are restricted by a number of factors, including properties of the build material, time constraints, and geometric design restrictions. In this paper, a thorough accounting and study of the geometric restrictions that exist in the current iteration of additive manufacturing (AM) fused deposition modeling (FDM) technologies are discussed. Offline and online methodologies for collecting data sets for qualitative analysis of large scale AM, in particular Oak Ridge National Laboratory's (ORNL) big area additive manufacturing (BAAM) system, are summarized. In doing so, a survey of tools for designers and software developers is provided. In particular, strategies in which geometric data can be used as training sets for smarter AM technologies in the future are explained as well.
NASA Technical Reports Server (NTRS)
Foster, John V.; Ross, Holly M.; Ashley, Patrick A.
1993-01-01
Designers of the next-generation fighter and attack airplanes are faced with the requirements of good high-angle-of-attack maneuverability as well as efficient high speed cruise capability with low radar cross section (RCS) characteristics. As a result, they are challenged with the task of making critical design trades to achieve the desired levels of maneuverability and performance. This task has highlighted the need for comprehensive, flight-validated lateral-directional control power design guidelines for high angles of attack. A joint NASA/U.S. Navy study has been initiated to address this need and to investigate the complex flight dynamics characteristics and controls requirements for high-angle-of-attack lateral-directional maneuvering. A multi-year research program is underway which includes ground-based piloted simulation and flight validation. This paper will give a status update of this program that will include a program overview, description of test methodology and preliminary results.
NASA Astrophysics Data System (ADS)
Bearden, David A.; Duclos, Donald P.; Barrera, Mark J.; Mosher, Todd J.; Lao, Norman Y.
1997-12-01
Emerging technologies and micro-instrumentation are changing the way remote sensing spacecraft missions are developed and implemented. Government agencies responsible for procuring space systems are increasingly requesting analyses to estimate cost, performance and design impacts of advanced technology insertion for both state-of-the-art systems as well as systems to be built 5 to 10 years in the future. Numerous spacecraft technology development programs are being sponsored by Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) agencies with the goal of enhancing spacecraft performance, reducing mass, and reducing cost. However, it is often the case that technology studies, in the interest of maximizing subsystem-level performance and/or mass reduction, do not anticipate synergistic system-level effects. Furthermore, even though technical risks are often identified as one of the largest cost drivers for space systems, many cost/design processes and models ignore effects of cost risk in the interest of quick estimates. To address these issues, the Aerospace Corporation developed a concept analysis methodology and associated software tools. These tools, collectively referred to as the concept analysis and design evaluation toolkit (CADET), facilitate system architecture studies and space system conceptual designs focusing on design heritage, technology selection, and associated effects on cost, risk and performance at the system and subsystem level. CADET allows: (1) quick response to technical design and cost questions; (2) assessment of the cost and performance impacts of existing and new designs/technologies; and (3) estimation of cost uncertainties and risks. These capabilities aid mission designers in determining the configuration of remote sensing missions that meet essential requirements in a cost-effective manner. This paper discusses the development of CADET modules and their application to several remote sensing satellite mission concepts.
Design and Analysis of Morpheus Lander Flight Control System
NASA Technical Reports Server (NTRS)
Jang, Jiann-Woei; Yang, Lee; Fritz, Mathew; Nguyen, Louis H.; Johnson, Wyatt R.; Hart, Jeremy J.
2014-01-01
The Morpheus Lander is a vertical takeoff and landing test bed vehicle developed to demonstrate the system performance of the Guidance, Navigation and Control (GN&C) system capability for the integrated autonomous landing and hazard avoidance system hardware and software. The Morpheus flight control system design must be robust to various mission profiles. This paper presents a design methodology for employing numerical optimization to develop the Morpheus flight control system. The design objectives include attitude tracking accuracy and robust stability with respect to rigid body dynamics and propellant slosh. Under the assumption that the Morpheus time-varying dynamics and control system can be frozen over a short period of time, the flight controllers are designed to stabilize all selected frozen-time control systems in the presence of parametric uncertainty. Both control gains in the inner attitude control loop and guidance gains in the outer position control loop are designed to maximize the vehicle performance while ensuring robustness. The flight control system designs provided herein have been demonstrated to provide stable control systems in both Draper Ares Stability Analysis Tool (ASAT) and the NASA/JSC Trick-based Morpheus time domain simulation.
NASA Technical Reports Server (NTRS)
Abihana, Osama A.; Gonzalez, Oscar R.
1993-01-01
The main objectives of our research are to present a self-contained overview of fuzzy sets and fuzzy logic, develop a methodology for control system design using fuzzy logic controllers, and to design and implement a fuzzy logic controller for a real system. We first present the fundamental concepts of fuzzy sets and fuzzy logic. Fuzzy sets and basic fuzzy operations are defined. In addition, for control systems, it is important to understand the concepts of linguistic values, term sets, fuzzy rule base, inference methods, and defuzzification methods. Second, we introduce a four-step fuzzy logic control system design procedure. The design procedure is illustrated via four examples, showing the capabilities and robustness of fuzzy logic control systems. This is followed by a tuning procedure that we developed from our design experience. Third, we present two Lyapunov based techniques for stability analysis. Finally, we present our design and implementation of a fuzzy logic controller for a linear actuator to be used to control the direction of the Free Flight Rotorcraft Research Vehicle at LaRC.
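The four-step procedure summarized above (fuzzification, rule base, inference, defuzzification) can be captured in a few lines. The membership functions and rules below are illustrative placeholders, not the controller designed for the LaRC vehicle.

```python
# Minimal single-input fuzzy controller: triangular fuzzification, a
# three-rule base, and centroid defuzzification over singleton outputs.
def tri(x, a, b, c):
    """Triangular membership with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_control(error):
    # Fuzzify the error into three linguistic values.
    neg = tri(error, -2.0, -1.0, 0.0)
    zero = tri(error, -1.0, 0.0, 1.0)
    pos = tri(error, 0.0, 1.0, 2.0)
    # Rule base: IF error is NEG THEN output +1; ZERO -> 0; POS -> -1.
    rules = [(neg, 1.0), (zero, 0.0), (pos, -1.0)]
    # Centroid defuzzification weighted by rule firing strengths.
    num = sum(w * u for w, u in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0

for e in (-1.5, -0.3, 0.0, 0.8):
    print(f"error={e:+.1f} -> command={fuzzy_control(e):+.2f}")
```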
Design Optimization and Residual Strength Assessment of a Cylindrical Composite Shell Structure
NASA Technical Reports Server (NTRS)
Rais-Rohani, Masoud
2000-01-01
A summary of research conducted during the specified period is presented. The research objectives included the investigation of an efficient technique for the design optimization and residual strength assessment of a semi-monocoque cylindrical shell structure made of composite materials. The response surface methodology is used in modeling the buckling response of individual skin panels under the combined axial compression and shear loading. These models are inserted into the MSC/NASTRAN code for design optimization of the cylindrical structure under a combined bending-torsion loading condition. The comparison between the monolithic and sandwich skin design cases indicated a 35% weight saving in using sandwich skin panels. In addition, the residual strength of the optimum design was obtained by identifying the most critical region of the structure and introducing a damage in the form of skin-stringer and skin-stringer-frame detachment. The comparison between the two skin design concepts indicated that the sandwich skin design is capable of retaining a higher residual strength than its monolithic counterpart. The results of this investigation are presented and discussed in this report.
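The response-surface step described above amounts to fitting a low-order polynomial surrogate to sampled buckling responses. A sketch with synthetic data (not the report's panel model) is:

```python
# Fit a quadratic response surface for a panel buckling load factor as a
# function of axial and shear load factors (synthetic data for illustration).
import numpy as np

rng = np.random.default_rng(0)
Nx, Nxy = rng.uniform(0, 1, 50), rng.uniform(0, 1, 50)        # load factors
lam = 2.0 - 1.1 * Nx - 0.6 * Nxy - 0.4 * Nx * Nxy + rng.normal(0, 0.02, 50)

# Design matrix for lam ~ b0 + b1*Nx + b2*Nxy + b3*Nx^2 + b4*Nxy^2 + b5*Nx*Nxy
A = np.column_stack([np.ones(50), Nx, Nxy, Nx**2, Nxy**2, Nx * Nxy])
coef, *_ = np.linalg.lstsq(A, lam, rcond=None)
print("fitted coefficients:", np.round(coef, 3))
```

A surrogate of this form is cheap enough to embed in an optimizer (here, the MSC/NASTRAN-driven design loop the abstract describes) in place of repeated buckling analyses.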
An Approach for Performance Based Glove Mobility Requirements
NASA Technical Reports Server (NTRS)
Aitchison, Lindsay; Benson, Elizabeth; England, Scott
2016-01-01
The Space Suit Assembly (SSA) Development Team at NASA Johnson Space Center has invested heavily in the advancement of rear-entry planetary exploration suit design but largely deferred development of extravehicular activity (EVA) glove designs, and accepted the risk of using the current flight gloves, Phase VI, for exploration missions. However, as design reference missions mature, the risks of using heritage hardware have highlighted the need for developing robust new glove technologies. To address the technology gap, the NASA Space Technology Mission Directorate's Game-Changing Development Program provided start-up funding for the High Performance EVA Glove (HPEG) Element as part of the Next Generation Life Support (NGLS) Project in the fall of 2013. The overarching goal of the HPEG Element is to develop a robust glove design that increases human performance during EVA and creates a pathway for implementation of emergent technologies, with specific aims of increasing pressurized mobility to 60% of barehanded capability, increasing the durability in non-pristine environments, and decreasing the potential of gloves to cause injury during use. The HPEG Element focused initial efforts on developing quantifiable and repeatable methodologies for assessing glove performance with respect to mobility, injury potential, thermal conductivity, and abrasion resistance. The team used these methodologies to establish requirements against which emerging technologies and glove designs can be assessed at both the component and assembly levels. The mobility performance testing methodology was an early focus for the HPEG team as it stems from collaborations between the SSA Development team and the JSC Anthropometry and Biomechanics Facility (ABF) that began investigating new methods for suited mobility and fit early in the Constellation Program. The combined HPEG and ABF team used lessons learned from the previous efforts as well as additional reviews of methodologies in physical and occupational therapy arenas to develop a protocol that assesses gloved range of motion, strength, dexterity, tactility, and fit in comparative quantitative terms and also provides qualitative insight to direct hardware design iterations. The protocol was evaluated using five experienced test subjects wearing the EMU pressurized to 4.3 psid with three different glove configurations. The results of the testing are presented to illustrate where the protocol is and is not valid for benchmark comparisons. The process for requirements development based upon the results is also presented along with suggested performance values for the High Performance EVA Gloves currently in development.
High Temperature, Permanent Magnet Biased, Fault Tolerant, Homopolar Magnetic Bearing Development
NASA Technical Reports Server (NTRS)
Palazzolo, Alan; Tucker, Randall; Kenny, Andrew; Kang, Kyung-Dae; Ghandi, Varun; Liu, Jinfang; Choi, Heeju; Provenza, Andrew
2008-01-01
This paper summarizes the development of a magnetic bearing designed to operate at 1,000 F. A novel feature of this high temperature magnetic bearing is its homopolar construction, which incorporates state-of-the-art high temperature (1,000 F) permanent magnets. A second feature is its fault tolerance capability, which provides the desired control forces with over one-half of the coils failed. The construction and design methodology of the bearing are outlined and test results are shown. A 3D finite element, magnetic-field-based force prediction is shown to be in good agreement with measurements at room and high temperature. A 5-axis test rig will be completed soon to provide a means to test the magnetic bearings at high temperature and speed.
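One generic way to realize the kind of fault tolerance described above is to drop failed coils from a linearized force-allocation matrix and redistribute currents over the survivors with a pseudoinverse. The sketch below is a toy model of that idea, not the paper's bias-linearized bearing scheme.

```python
# Fault-tolerant force allocation for a redundant coil set: delete failed
# coils' columns from the (linearized) force matrix and redistribute
# currents with a minimum-norm pseudoinverse solution. Toy numbers only.
import numpy as np

# Force influence matrix: 2 force axes x 8 coils (notional N per amp).
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
B = np.vstack([np.cos(angles), np.sin(angles)])

def coil_currents(f_desired, failed=()):
    alive = [j for j in range(B.shape[1]) if j not in failed]
    i_alive = np.linalg.pinv(B[:, alive]) @ f_desired   # min-norm currents
    i = np.zeros(B.shape[1])
    i[alive] = i_alive
    return i

f = np.array([10.0, -4.0])
print("all coils:   ", np.round(coil_currents(f), 2))
print("5 coils dead:", np.round(coil_currents(f, failed=(0, 2, 4, 5, 6)), 2))
```

As long as the surviving columns still span both force axes, the commanded force is met exactly, mirroring the over-half-failed capability the abstract reports.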
Novel folding device for manufacturing aerospace composite structures
NASA Astrophysics Data System (ADS)
Tewfic, Tarik; Sarhadi, M.
2000-10-01
A new manufacturing methodology, termed shape-inclusive lay-up, has been applied that allows the generation of three-dimensional preforms for the resin transfer molding (RTM) process. A novel, flexible folding device for forming dry fabrics, including non-crimp fabric (NCF) preforms, is designed and integrated with a Material Delivery System (MDS) into a robotic cell for manufacturing dry fiber composite aerospace components. The paper describes the detailed design, implementation, and operational performance of a prototype device. The proposed folding device has been implemented and tested by manufacturing a range of reinforcement structure preforms (C, T, J, and I reinforcement preforms) normally used in aerostructure applications. A key advantage of the proposed device is its flexibility. The system is capable of manufacturing a wide range of components of various sizes without the need for reconfiguration.
Noguero, Adrián; Calvo, Isidro; Pérez, Federico; Almeida, Luis
2013-01-01
There is an increasing number of Ambient Intelligence (AmI) systems that are time-sensitive and resource-aware. From healthcare to building and even home/office automation, it is now common to find systems combining interactive and sensing multimedia traffic with relatively simple sensors and actuators (door locks, presence detectors, RFIDs, HVAC, information panels, etc.). Many of these are today known as Cyber-Physical Systems (CPS). Quite frequently, these systems must be capable of (1) prioritizing different traffic flows (process data, alarms, non-critical data, etc.), (2) synchronizing actions in several distributed devices and, to a certain degree, (3) easing resource management (e.g., detecting faulty nodes, managing battery levels, handling overloads, etc.). This work presents FTT-MA, a high-level middleware architecture aimed at easing the design, deployment and operation of such AmI systems. FTT-MA ensures that both functional and non-functional aspects of the applications are met even during reconfiguration stages. The paper also proposes a methodology, together with a design tool, to create this kind of system. Finally, a sample case study is presented that illustrates the use of the middleware and the methodology proposed in the paper.
NASA Technical Reports Server (NTRS)
Fridge, Ernest M., III
1991-01-01
Programs in use today generally have all of the function and information processing capabilities required to do their specified job. However, older programs usually use obsolete technology, are not integrated properly with other programs, and are difficult to maintain. Reengineering is becoming a prominent discipline as organizations try to move their systems to more modern and maintainable technologies. The Johnson Space Center (JSC) Software Technology Branch (STB) is researching and developing a system to support reengineering older FORTRAN programs into more maintainable forms that can also be more readily translated to modern languages such as FORTRAN 8x, Ada, or C. This activity has led to the development of maintenance strategies for design recovery and reengineering. These strategies include a set of standards, methodologies, and the concepts for a software environment to support design recovery and reengineering. A brief description of the problem being addressed and the approach that is being taken by the STB toward providing an economic solution to the problem is provided. A statement of the maintenance problems, the benefits and drawbacks of three alternative solutions, and a brief history of the STB experience in software reengineering are followed by the STB's new FORTRAN standards, methodology, and the concepts for a software environment.
NASA Astrophysics Data System (ADS)
Bunge, Mario
2011-05-01
Pseudoscience is error, substantive or methodological, parading as science. Obvious examples are parapsychology, "intelligent design," and homeopathy. Psychoanalysis and pop evolutionary psychology are less obvious, yet no less flawed in both method and doctrine. The fact that science can be faked to the point of deceiving science lovers suggests the need for a rigorous sifting device, one capable of revealing the worm in the apple. This device is needed to evaluate research proposals as well as new fashions. Such a device can be designed only with the help of a correct definition of science, one attending not only to methodological aspects, such as testability and predictive power, but also to other features of scientific knowledge, such as intelligibility, corrigibility, and compatibility with the bulk of antecedent knowledge. The aim of this paper is to suggest such a criterion, to illustrate it with a handful of topical examples, and to emphasize the role of philosophy in either promoting or blocking scientific progress. This article is a revised version of a chapter in the author's forthcoming book Matter and Mind (Springer). [The Appendix on inductive logic was written at the request of the editors in order to elaborate claims made in #10 (4).]
Technology Assessment for Powertrain Components Final Report CRADA No. TC-1124-95
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tokarz, F.; Gough, C.
LLNL utilized its defense technology assessment methodologies in combination with its capabilities in the energy, manufacturing, and transportation technologies to demonstrate a methodology that synthesized available but incomplete information on advanced automotive technologies into a comprehensive framework.
Knowledge-based decision support for Space Station assembly sequence planning
NASA Astrophysics Data System (ADS)
1991-04-01
A complete Personal Analysis Assistant (PAA) for Space Station Freedom (SSF) assembly sequence planning consists of three software components: the system infrastructure, intra-flight value added, and inter-flight value added. The system infrastructure is the substrate on which software elements providing inter-flight and intra-flight value-added functionality are built. It provides the capability for building representations of assembly sequence plans and specification of constraints and analysis options. Intra-flight value-added provides functionality that will, given the manifest for each flight, define cargo elements, place them in the National Space Transportation System (NSTS) cargo bay, compute performance measure values, and identify violated constraints. Inter-flight value-added provides functionality that will, given major milestone dates and capability requirements, determine the number and dates of required flights and develop a manifest for each flight. The current project is Phase 1 of a projected two-phase program and delivers the system infrastructure. Intra- and inter-flight value-added were to be developed in Phase 2, which has not been funded. Based on experience derived from hundreds of projects conducted over the past seven years, ISX developed an Intelligent Systems Engineering (ISE) methodology that combines the methods of systems engineering and knowledge engineering to meet the special systems development requirements posed by intelligent systems, systems that blend artificial intelligence and other advanced technologies with more conventional computing technologies. The ISE methodology defines a phased program process that begins with an application assessment designed to provide a preliminary determination of the relative technical risks and payoffs associated with a potential application, and then moves through requirements analysis, system design, and development.
NASA Astrophysics Data System (ADS)
Agostoni, S.; Cheli, F.; Leo, E.; Pezzola, M.
2012-08-01
Motor vehicle ride comfort is mainly affected by reciprocating engine inertia unbalances. These forces are transmitted to the driver through the main frame, the engine mounts, and the auxiliary sub systems—all components with which the driver physically comes into contact. On-road traction vehicle engines are mainly characterized by transient operation. Thus, an excitation frequency range from 800 RPM (≈15 Hz for stationary vehicles) up to 15,000 RPM (≈250 Hz as a cut-off condition) occurs. Several structural resonances are induced by the unbalancing forces spectrum, thus exposing the driver to amplified vibrations. The aim of this research is to reduce driver vibration exposure, by acting on the modal response of structures with which the driver comes into contact. An experimental methodology, capable of identifying local vibration modes, was developed. The application of this methodology on a reference vehicle allows us to detect if/when/how the above mentioned resonances are excited. Numerical models were used to study structural modifications. In this article, a handlebar equipped with an innovative multi-reciprocating tuned mass damper was optimized. All structural modifications were designed, developed and installed on a vehicle. Modal investigations were then performed in order to predict modification efficiency. Furthermore, functional solution efficiency was verified during sweep tests performed on a target vehicle, by means of a roller bench capable of replicating on-road loads. Three main investigation zones of the vehicle were detected and monitored using accelerometers: (1) engine mounts, to characterize vibration emissions; (2) bindings connecting the engine to the frame, in order to detect vibration transfer paths, with particular attention being paid to local dynamic amplifications due to compliances and (3) the terminal components with which the driver comes into contact.
Two-Scale 13C Metabolic Flux Analysis for Metabolic Engineering.
Ando, David; Garcia Martin, Hector
2018-01-01
Accelerating the Design-Build-Test-Learn (DBTL) cycle in synthetic biology is critical to achieving rapid and facile bioengineering of organisms for the production of, e.g., biofuels and other chemicals. The Learn phase involves using data obtained from the Test phase to inform the next Design phase. As part of the Learn phase, mathematical models of metabolic fluxes give a mechanistic level of comprehension to cellular metabolism, isolating the principal drivers of metabolic behavior from the peripheral ones, and directing future experimental designs and engineering methodologies. Furthermore, the measurement of intracellular metabolic fluxes is specifically noteworthy as providing a rapid and easy-to-understand picture of how carbon and energy flow throughout the cell. Here, we present a detailed guide to performing metabolic flux analysis in the Learn phase of the DBTL cycle, where we show how one can take the isotope labeling data from a 13C labeling experiment and immediately turn it into a determination of cellular fluxes that points in the direction of genetic engineering strategies that will advance the metabolic engineering process. For our modeling purposes we use the Joint BioEnergy Institute (JBEI) Quantitative Metabolic Modeling (jQMM) library, which provides an open-source, python-based framework for modeling internal metabolic fluxes and making actionable predictions on how to modify cellular metabolism for specific bioengineering goals. It presents a complete toolbox for performing different types of flux analysis such as Flux Balance Analysis and 13C Metabolic Flux Analysis, and it introduces the capability to use 13C labeling experimental data to constrain comprehensive genome-scale models through a technique called two-scale 13C Metabolic Flux Analysis (2S-13C MFA) [1]. In addition to several other capabilities, the jQMM is also able to predict the effects of knockouts using the MoMA and ROOM methodologies. The use of the jQMM library is illustrated through a step-by-step demonstration, which is also contained in a digital Jupyter Notebook format that enhances reproducibility and provides the capability to be adapted to the user's specific needs. As an open-source software project, users can modify and extend the code base and make improvements at will, providing a base for future modeling efforts.
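The abstract names Flux Balance Analysis among the jQMM's capabilities. The sketch below illustrates the FBA concept itself on a toy three-reaction network using generic linear programming; it does not use the jQMM API, and the network is invented for the example.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A, A -> B, B -> export (biomass proxy).
# Rows = metabolites (A, B), columns = reactions (uptake, A->B, export).
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])

bounds = [(0, 10), (0, 1000), (0, 1000)]  # uptake capped at 10 flux units
c = np.array([0.0, 0.0, -1.0])            # maximize export; linprog minimizes

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal biomass flux:", res.x[2])  # -> 10.0, pinned by the uptake limit
```

FBA maximizes a target flux subject to the steady-state mass balance S v = 0 and capacity bounds on each reaction; here the optimum is set entirely by the uptake constraint.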
Towards a Methodology for the Design of Multimedia Public Access Interfaces.
ERIC Educational Resources Information Center
Rowley, Jennifer
1998-01-01
Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…
Overview of the production of sintered SiC optics and optical sub-assemblies
NASA Astrophysics Data System (ADS)
Williams, S.; Deny, P.
2005-08-01
The following is an overview of sintered silicon carbide (SSiC) material properties and processing requirements for the manufacturing of components for advanced technology optical systems. The overview will compare SSiC material properties to typical materials used for optics and optical structures. In addition, it will review manufacturing processes required to produce optical components in detail by process step. The process overview will illustrate the current manufacturing process and concepts for expanding process size capability. The overview will include information on the substantial capital equipment employed in the manufacturing of SSiC. This paper will also review common in-process inspection methodology and design rules. The design rules are used to improve production yield, minimize cost, and maximize the inherent benefits of SSiC for optical systems. Optimizing optical system designs for a SSiC manufacturing process will allow systems designers to utilize SSiC as a low risk, cost competitive, and fast cycle time technology for next generation optical systems.
NASA Technical Reports Server (NTRS)
Padilla, Peter A.
1991-01-01
An investigation was made in AIRLAB of the fault handling performance of the Fault Tolerant MultiProcessor (FTMP). Fault handling errors detected during fault injection experiments were characterized. In these fault injection experiments, the FTMP disabled a working unit instead of the faulted unit once in every 500 faults, on the average. System design weaknesses allow active faults to exercise a part of the fault management software that handles Byzantine or lying faults. Byzantine faults behave such that the faulted unit points to a working unit as the source of errors. The design's problems involve: (1) the design and interface between the simplex error detection hardware and the error processing software, (2) the functional capabilities of the FTMP system bus, and (3) the communication requirements of a multiprocessor architecture. These weak areas in the FTMP's design increase the probability that, for any hardware fault, a good line replacement unit (LRU) is mistakenly disabled by the fault management software.
Senior-driven design and development of tablet-based cognitive games.
Marques, João; Vasconcelos, Ana; Teixeira, Luís F
2013-01-01
This paper describes the design and development of a tablet-based gaming platform targeting the senior population, aiming at improving their overall wellbeing by stimulating their cognitive capabilities and promoting social interaction between players. To achieve these goals, we started by performing a study of the specific characteristics of the senior user as well as what makes a game appealing to the player. Furthermore, we investigated why the tablet proves to be an advantageous device for our target audience. Based on the results of our research, we developed a solution that incorporates cognitive and social mechanisms into its games, while performing iterative evaluations together with the final user by adopting a user-centered design methodology. In each design phase, a pre-selected group of senior participants experimented with the game platform and provided feedback to improve its features and usability. Through a series of short-term evaluations and a long-term evaluation, the game platform proved to be appealing to its intended users, providing an enjoyable gaming experience.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Driscoll, Frederick R.
The University of Washington (UW) - Northwest National Marine Renewable Energy Center (UW-NNMREC) and the National Renewable Energy Laboratory (NREL) will collaborate to advance research and development (R&D) of Marine Hydrokinetic (MHK) renewable energy technology, specifically renewable energy captured from ocean tidal currents. UW-NNMREC is endeavoring to establish infrastructure, capabilities and tools to support in-water testing of marine energy technology. NREL is leveraging its experience and capabilities in field testing of wind systems to develop protocols and instrumentation to advance field testing of MHK systems. Under this work, UW-NNMREC and NREL will work together to develop a common instrumentation system and testing methodologies, standards and protocols. UW-NNMREC is also establishing simulation capabilities for MHK turbines and turbine arrays. NREL has extensive experience in wind turbine array modeling and is developing several computer-based numerical simulation capabilities for MHK systems. Under this CRADA, UW-NNMREC and NREL will work together to augment single device and array modeling codes. As part of this effort UW-NNMREC will also work with NREL to run simulations on NREL's high performance computer system.
A methodology for secure recovery of spacecrafts based on a trusted hardware platform
NASA Astrophysics Data System (ADS)
Juliato, Marcio; Gebotys, Catherine
2017-02-01
This paper proposes a methodology for the secure recovery of spacecraft and of their cryptographic capabilities in emergency scenarios resulting from major unintentional failures and malicious attacks. The proposed approach employs trusted modules to achieve higher reliability and security levels in space missions due to the presence of integrity check capabilities as well as secure recovery mechanisms. Additionally, several recovery protocols are thoroughly discussed and analyzed against a wide variety of attacks. Exhaustive search attacks are analyzed in a wide variety of contexts and shown to be infeasible regardless of the computational power of attackers. Experimental results have shown that the proposed methodology allows for the fast and secure recovery of spacecraft, demanding minimum implementation area, power consumption and bandwidth.
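A quick back-of-envelope calculation supports the claim that exhaustive search is infeasible regardless of attacker resources; the key size and trial rate below are assumptions for illustration, not figures from the paper.

```python
# Even at an optimistic 10^12 key trials per second, a 128-bit key space
# is far beyond reach. Rates here are assumed, not taken from the paper.
key_bits = 128
trials_per_second = 1e12
seconds_per_year = 3.15576e7

expected_years = (2**key_bits / 2) / trials_per_second / seconds_per_year
print(f"expected time to recover one key: {expected_years:.2e} years")
# -> ~5.4e18 years, vastly exceeding the age of the universe (~1.4e10 years)
```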
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myers, S; Larsen, S; Wagoner, J
Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization of full three-dimensional (3D) finite difference modeling, as well as statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project, in support of LLNL's national-security mission, benefits the U.S. military and intelligence community. Fiscal year (FY) 2003 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A three-seismic-array vehicle tracking testbed was installed on site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications. In FY03 specifically, a large and complex simulation experiment was conducted that tested the full modeling-based approach to geological characterization using E2D, the K-L statistical methodology, and matched field processing applied to tunnel detection with surface seismic sensors. The simulation validated the full methodology and the need for geological heterogeneity to be accounted for in the overall approach. The Lake Lynn site area was geologically modeled using the code Earthvision to produce a 32 million node 3D model grid for E3D. Model linking issues were resolved and a number of full 3D model runs were accomplished using shot locations that matched the data. E3D-generated wavefield movies showed the reflection signal would be too small to be observed in the data due to trapped and attenuated energy in the weathered layer.
An analysis using the few sensors coupled to bedrock showed that the reflection signal strength was not sufficiently improved, because the shots, though buried, were within the surface layer and hence attenuated. The ability to model a complex 3D geological structure and calculate synthetic seismograms that are in good agreement with actual data (especially for surface waves and below the complex weathered layer) was demonstrated. We conclude that E3D is a powerful tool for assessing the conditions under which a tunnel could be detected in a specific geological setting. Finally, the Lake Lynn tunnel explosion data were analyzed using standard array processing techniques. The results showed that single detonations could be detected and located but simultaneous detonations would require a strategic placement of arrays.
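For readers unfamiliar with matched field processing, the sketch below shows its core step in heavily simplified form: measured sensor data are correlated against a dictionary of modeled replica vectors, one per candidate source location, and the best-matching location wins. The replica vectors here are random stand-ins; in the project they would come from E3D propagation modeling.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_candidates = 8, 50

# Stand-in replica vectors: modeled sensor responses for each candidate
# source location (random here, E3D-derived in practice).
replicas = (rng.standard_normal((n_candidates, n_sensors))
            + 1j * rng.standard_normal((n_candidates, n_sensors)))
replicas /= np.linalg.norm(replicas, axis=1, keepdims=True)

true_loc = 23
data = replicas[true_loc] + 0.1 * (rng.standard_normal(n_sensors)
                                   + 1j * rng.standard_normal(n_sensors))

# Bartlett power: correlation of the measured field with each replica.
power = np.abs(replicas.conj() @ data) ** 2 / np.linalg.norm(data) ** 2
print("estimated source location:", int(np.argmax(power)))  # -> 23
```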
NASA Technical Reports Server (NTRS)
Hall, Edward; Magner, James
2011-01-01
This report is provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract NNC05CA85C, Task 7: New ATM Requirements-Future Communications, C-Band and L-Band Communications Standard Development, and was based on direction provided by FAA project-level agreements for New ATM Requirements-Future Communications. Task 7 included two subtasks. Subtask 7-1 addressed C-band (5091- to 5150-MHz) airport surface data communications standards development, systems engineering, test bed and prototype development, and tests and demonstrations to establish operational capability for the Aeronautical Mobile Airport Communications System (AeroMACS). Subtask 7-2 focused on systems engineering and development support of the L-band digital aeronautical communications system (L-DACS). Subtask 7-1 consisted of two phases. Phase I included development of AeroMACS concepts of use, requirements, architecture, and initial high-level safety risk assessment. Phase II builds on Phase I results and is presented in two volumes. Volume I is devoted to concepts of use, system requirements, and architecture, including AeroMACS design considerations. Volume II (this document) describes an AeroMACS prototype evaluation and presents final AeroMACS recommendations. This report also describes airport categorization and channelization methodologies. The purposes of the airport categorization task were (1) to facilitate initial AeroMACS architecture designs and enable budgetary projections by creating a set of airport categories based on common airport characteristics and design objectives, and (2) to offer high-level guidance to potential AeroMACS technology and policy development sponsors and service providers. A channelization plan methodology was developed because a common global methodology is needed to assure seamless interoperability among diverse AeroMACS services potentially supplied by multiple service providers.
Human factor engineering based design and modernization of control rooms with new I and C systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larraz, J.; Rejas, L.; Ortega, F.
2012-07-01
Instrumentation and Control (I and C) systems of the latest nuclear power plants are based on the use of digital technology, distributed control systems and the integration of information in data networks (Distributed Control and Instrumentation Systems). This has a repercussion on Control Rooms (CRs), where the operations and monitoring interfaces correspond to these systems. These technologies are also used in modernizing I and C systems in currently operative nuclear power plants. The new interfaces provide additional capabilities for operation and supervision, as well as a high degree of flexibility, versatility and reliability. An example of this is the implementation of solutions such as compact stations, high level supervision screens, overview displays, computerized procedures, new operational support systems or intelligent alarms processing systems in the modernized Man-Machine Interface (MMI). These changes in the MMI are accompanied by newly added Software (SW) controls and new solutions in automation. Tecnatom has been leading various projects in this area for several years, both in Asian countries and in the United States, using in all cases international standards from which Tecnatom's own methodologies have been developed and optimized. The experience acquired in applying this methodology to the design of new control rooms is to a large extent applicable also to the modernization of current control rooms. An adequate design of the interface between the operator and the systems will facilitate safe operation, contribute to the prompt identification of problems and help in the distribution of tasks and communications between the different members of the operating shift. Based on Tecnatom's experience in the field, this article presents the methodological approach used as well as the most relevant aspects of this kind of project. (authors)
Biosonar-inspired technology: goals, challenges and insights.
Müller, Rolf; Kuc, Roman
2007-12-01
Bioinspired engineering based on biosonar systems in nature is reviewed and discussed in terms of the merits of different approaches and their results: biosonar systems are attractive technological paragons because of their capabilities, built-in task-specific knowledge, intelligent system integration and diversity. Insights from the diverse set of sensing tasks solved by bats are relevant to a wide range of application areas such as sonar, biomedical ultrasound, non-destructive testing, sensors for autonomous systems and wireless communication. Challenges in the design of bioinspired sonar systems are posed by transducer performance, actuation for sensor mobility, design, actuation and integration of beamforming baffle shapes, echo encoding for signal processing, estimation algorithms and their implementations, as well as system integration and feedback control. The discussed examples of experimental systems have capabilities that include localization and tracking using binaural and multiple-band hearing as well as self-generated dynamic cues, classification of small deterministic and large random targets, beamforming with bioinspired baffle shapes, neuromorphic spike processing, artifact rejection in sonar maps and passing range estimation. In future research, bioinspired engineering could capitalize on some of its strengths to serve as a model system for basic automation methodologies for the bioinspired engineering process.
Thermophysics modeling of an infrared detector cryochamber for transient operational scenario
NASA Astrophysics Data System (ADS)
Singhal, Mayank; Singhal, Gaurav; Verma, Avinash C.; Kumar, Sushil; Singh, Manmohan
2016-05-01
An infrared (IR) detector is essentially a transducer capable of converting radiant energy in the infrared regime into a measurable form. The benefit of infrared radiation is that it facilitates viewing objects in dark or obscured conditions by detecting the infrared energy emitted by them. One of the most significant applications of IR detector systems is target acquisition and tracking for projectile systems. IR detectors also find widespread applications in the industrial and commercial market. The performance of an infrared detector is sensitive to temperature; the detector performs best when cooled to cryogenic temperatures in the range of roughly 120 K. However, the necessity to operate in such cryogenic regimes increases the complexity of applying IR detectors. This entails a need for detailed thermophysics analysis to determine the actual cooling load, which is specific to the application and to the detector's interaction with its environment. Such analysis enables the design of the most appropriate cooling methodologies for specific scenarios. The focus of the present work is to develop a robust thermophysical numerical methodology for predicting IR cryochamber behavior under transient conditions, which is the most critical scenario, taking into account all relevant heat loads, including radiation in its original form. The advantage of the developed code over existing commercial software (COMSOL, ANSYS, etc.) is that it is capable of handling gas conduction together with radiation terms effectively, employing ubiquitous software such as MATLAB. It also requires much smaller computational resources and is significantly less time-intensive. It provides physically correct results enabling thermal characterization of the cryochamber geometry in conjunction with an appropriate cooling methodology. The code has been validated experimentally: the observed cooling characteristics are in close agreement with the results predicted using the developed model, thereby proving its efficacy.
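A minimal lumped-capacitance version of such a transient cooldown model is sketched below, with radiative and residual-gas conduction loads balanced against a constant cryocooler lift. All parameter values are assumptions for the sketch, not the paper's cryochamber data, and the real code resolves geometry rather than a single lumped mass.

```python
import numpy as np
from scipy.integrate import solve_ivp

SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
C = 15.0              # cold-mass heat capacity, J/K (assumed)
A, EPS = 1e-3, 0.05   # radiating area (m^2) and effective emissivity (assumed)
G_GAS = 2e-4          # residual-gas conduction, W/K (assumed)
T_ENV = 300.0         # chamber wall temperature, K
Q_LIFT = 0.5          # cryocooler lift, W (taken constant for simplicity)

def dTdt(t, T):
    q_rad = EPS * SIGMA * A * (T_ENV**4 - T[0]**4)   # radiative load
    q_gas = G_GAS * (T_ENV - T[0])                   # gas-conduction load
    return [(q_rad + q_gas - Q_LIFT) / C]

sol = solve_ivp(dTdt, (0.0, 6000.0), [300.0], max_step=5.0)
# The cold mass approaches the ~120 K operating range within a couple of
# hours and keeps cooling toward the cooler's ultimate temperature.
print(f"temperature after {sol.t[-1]:.0f} s: {sol.y[0, -1]:.1f} K")
```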
2017-01-01
Chemiluminescence probes are considered to be among the most sensitive diagnostic tools that provide high signal-to-noise ratio for various applications such as DNA detection and immunoassays. We have developed a new molecular methodology to design and foresee light-emission properties of turn-ON chemiluminescence dioxetane probes suitable for use under physiological conditions. The methodology is based on incorporation of a substituent on the benzoate species obtained during the chemiexcitation pathway of Schaap’s adamantylidene–dioxetane probe. The substituent effect was initially evaluated on the fluorescence emission generated by the benzoate species and then on the chemiluminescence of the dioxetane luminophores. A striking substituent effect on the chemiluminescence efficiency of the probes was obtained when acrylate and acrylonitrile electron-withdrawing groups were installed. The chemiluminescence quantum yield of the best probe was more than 3 orders of magnitude higher than that of a standard, commercially available adamantylidene–dioxetane probe. These are the most powerful chemiluminescence dioxetane probes synthesized to date that are suitable for use under aqueous conditions. One of our probes was capable of providing high-quality chemiluminescence cell images based on endogenous activity of β-galactosidase. This is the first demonstration of cell imaging achieved by a non-luciferin small-molecule probe with direct chemiluminescence mode of emission. We anticipate that the strategy presented here will lead to development of efficient chemiluminescence probes for various applications in the field of sensing and imaging. PMID:28470053
Polyphony: superposition independent methods for ensemble-based drug discovery.
Pitt, William R; Montalvão, Rinaldo W; Blundell, Tom L
2014-09-30
Structure-based drug design is an iterative process, following cycles of structural biology, computer-aided design, synthetic chemistry and bioassay. In favorable circumstances, this process can lead to hundreds of protein-ligand crystal structures. In addition, molecular dynamics simulations are increasingly being used to further explore the conformational landscape of these complexes. Currently, methods capable of the analysis of ensembles of crystal structures and MD trajectories are limited and usually rely upon least squares superposition of coordinates. Novel methodologies are described for the analysis of multiple structures of a protein. Statistical approaches that rely upon residue equivalence, but not superposition, are developed. Tasks that can be performed include the identification of hinge regions, allosteric conformational changes and transient binding sites. The approaches are tested on crystal structures of CDK2 and other CMGC protein kinases and a simulation of p38α. Known relationships between interactions and conformational changes are highlighted, and new ones are revealed. A transient but druggable allosteric pocket in CDK2 is predicted to occur under the CMGC insert. Furthermore, an evolutionarily-conserved conformational link from the location of this pocket, via the αEF-αF loop, to phosphorylation sites on the activation loop is discovered. New methodologies are described and validated for the superposition-independent conformational analysis of large collections of structures or simulation snapshots of the same protein. The methodologies are encoded in a Python package called Polyphony, which is released as open source to accompany this paper [http://wrpitt.bitbucket.org/polyphony/].
NASA Astrophysics Data System (ADS)
Mancuso, Peter Timothy
Fixed-wing unmanned aerial vehicles (UAVs) that offer vertical takeoff and landing (VTOL) and forward flight capability suffer from sub-par performance in both flight modes. Achieving the next generation of efficient hybrid aircraft requires innovations in: (i) power management, (ii) efficient structures, and (iii) control methodologies. Existing hybrid UAVs generally utilize one of three transitioning mechanisms: an external power mechanism to tilt the rotor-propulsion pod, separate propulsion units and rotors during hover and forward flight, or a tilt-body craft (smaller scale). Thus, hybrid concepts require more energy compared to dedicated fixed-wing or rotorcraft UAVs. Moreover, design trade-offs to reinforce the wing structure (typically to accommodate the propulsion systems and enable hover, i.e. tilt-rotor concepts) adversely impact the aerodynamics, controllability and efficiency of the aircraft in both hover and forward flight modes. The goal of this research is to develop more efficient VTOL/hover and forward flight UAVs. In doing so, the transition sequence, transition mechanism, and actuator performance are heavily considered. A design and control methodology was implemented to address these issues through a series of computer simulations and prototype benchtop tests to verify the proposed solution. Finally, preliminary field testing with a first-generation prototype was conducted. The methods used in this research offer guidelines and a new dual-arm rotor UAV concept for designing more efficient hybrid UAVs in both hover and forward flight.
An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications.
d'Acierno, Antonio; Esposito, Massimo; De Pietro, Giuseppe
2013-01-01
The diagnosis of many diseases can often be formulated as a decision problem; uncertainty affects these problems, so many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus to improve the quality of the whole process. Fuzzy logic, a well established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on top of fuzzy knowledge extracted from data. We carefully refine and formalize our methodology, which includes six stages, where the first three stages work with crisp rules, whereas the last three are employed on fuzzy models. Its strength lies in its generality and modularity, since it supports the integration of alternative techniques in each of its stages. The methodology is designed and implemented in the form of a modular and portable software architecture according to a component-based approach. The architecture is described in depth, and a summary inspection of the main components in terms of UML diagrams is outlined as well. A first implementation of the architecture has been realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. The results prove the feasibility of the whole methodology implemented in terms of the architecture proposed.
An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications
2013-01-01
Background The diagnosis of many diseases can often be formulated as a decision problem; uncertainty affects these problems, so many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus to improve the quality of the whole process. Fuzzy logic, a well established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on top of fuzzy knowledge extracted from data. Methods We carefully refine and formalize our methodology, which includes six stages, where the first three stages work with crisp rules, whereas the last three are employed on fuzzy models. Its strength lies in its generality and modularity, since it supports the integration of alternative techniques in each of its stages. Results The methodology is designed and implemented in the form of a modular and portable software architecture according to a component-based approach. The architecture is described in depth, and a summary inspection of the main components in terms of UML diagrams is outlined as well. A first implementation of the architecture has been realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. Conclusions The results prove the feasibility of the whole methodology implemented in terms of the architecture proposed. PMID:23368970
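To make the fuzzy-model stages concrete, the sketch below evaluates two illustrative fuzzy rules for a breast-mass risk score with triangular memberships and weighted-average defuzzification. The rules, membership parameters, and inputs are invented for the example and are not the paper's extracted knowledge.

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def diagnose(mass_size_mm, margin_irregularity):
    # Rule 1: IF size is large AND margin is irregular THEN risk is high
    # Rule 2: IF size is small AND margin is smooth   THEN risk is low
    large  = tri(mass_size_mm, 15, 30, 60)
    small  = tri(mass_size_mm, 0, 5, 20)
    irreg  = margin_irregularity           # already in [0, 1]
    smooth = 1.0 - margin_irregularity

    w_high = min(large, irreg)             # AND as minimum
    w_low  = min(small, smooth)
    # Weighted-average (Sugeno-style) defuzzification to a score in [0, 1]
    return (w_high * 0.9 + w_low * 0.1) / (w_high + w_low + 1e-9)

print(f"risk score: {diagnose(mass_size_mm=28, margin_irregularity=0.8):.2f}")
# -> 0.90 for a large, irregular mass
```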
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.; Cruz, Jose A.; Johnson, Stephen B.; Lo, Yunnhon
2015-01-01
This paper describes a quantitative methodology for bounding the false positive (FP) and false negative (FN) probabilities associated with a human-rated launch vehicle abort trigger (AT) that includes sensor data qualification (SDQ). In this context, an AT is a hardware and software mechanism designed to detect the existence of a specific abort condition. Also, SDQ is an algorithmic approach used to identify sensor data suspected of being corrupt so that suspect data does not adversely affect an AT's detection capability. The FP and FN methodologies presented here were developed to support estimation of the probabilities of loss of crew and loss of mission for the Space Launch System (SLS) which is being developed by the National Aeronautics and Space Administration (NASA). The paper provides a brief overview of system health management as being an extension of control theory; and describes how ATs and the calculation of FP and FN probabilities relate to this theory. The discussion leads to a detailed presentation of the FP and FN methodology and an example showing how the FP and FN calculations are performed. This detailed presentation includes a methodology for calculating the change in FP and FN probabilities that result from including SDQ in the AT architecture. To avoid proprietary and sensitive data issues, the example incorporates a mixture of open literature and fictitious reliability data. Results presented in the paper demonstrate the effectiveness of the approach in providing quantitative estimates that bound the probability of a FP or FN abort determination.
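The flavor of such a calculation can be seen in the toy sketch below, which bounds FP and FN probabilities for a k-of-n sensor vote and shows how excluding a suspect sensor (a stand-in for SDQ) shifts the balance. Per-sensor error rates are fictitious, echoing the paper's own use of notional reliability data.

```python
from math import comb

def vote_prob(p, n, k):
    """Probability that at least k of n independent sensors err (rate p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p_fp, p_fn = 1e-4, 1e-3   # assumed per-sensor false-positive / false-negative rates

# 2-of-3 vote without SDQ
print("FP (2-of-3):", vote_prob(p_fp, 3, 2))
print("FN (2-of-3):", vote_prob(p_fn, 3, 2))

# SDQ flags one sensor as suspect and excludes it -> 2-of-2 vote on the rest:
# a false abort now needs both remaining sensors to err, while a missed abort
# needs only one of them to miss.
print("FP (2-of-2 after SDQ):", vote_prob(p_fp, 2, 2))
print("FN (2-of-2 after SDQ):", vote_prob(p_fn, 2, 1))
```

Note the trade: dropping to a 2-of-2 vote lowers the false-positive bound but raises the false-negative bound, which is exactly the kind of shift the methodology quantifies.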
Climate Model Diagnostic Analyzer Web Service System
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.
2015-12-01
Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented the methodology as a web-service-based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. We have implemented the new methodology as web services and incorporated the system into the Cloud. We have also developed a provenance management system for CMDA where CMDA service semantics modeling, service search and recommendation, and service execution history management are designed and implemented.
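Of the four diagnostic techniques listed, random forest feature importance ranking is the easiest to illustrate compactly. The sketch below ranks three synthetic candidate drivers of a model bias; variable names and data are stand-ins, not CMDA services or real CMIP5/Obs4MIPs data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 500
# Synthetic stand-ins: three candidate drivers of a target bias.
sst, humidity, aerosol = rng.standard_normal((3, n))
cloud_bias = 2.0 * sst + 0.5 * humidity + 0.1 * rng.standard_normal(n)

X = np.column_stack([sst, humidity, aerosol])
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, cloud_bias)

for name, imp in zip(["sst", "humidity", "aerosol"], rf.feature_importances_):
    print(f"{name:8s} importance: {imp:.3f}")  # sst should dominate the ranking
```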
Blade Testing Trends (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Desmond, M.
2014-08-01
As an invited guest speaker, Michael Desmond presented on NREL's NWTC structural testing methods and capabilities at the 2014 Sandia Blade Workshop held on August 26-28, 2014 in Albuquerque, NM. Although dynamometer and field testing capabilities were mentioned, the presentation focused primarily on wind turbine blade testing, including descriptions and capabilities for accredited certification testing, historical methodology and technology deployment, and current research and development activities.
Aiming at Tobacco Harm Reduction: A survey comparing smokers differing in readiness to quit
Loumakou, Maria; Brouskeli, Vasiliki; Sarafidou, Jasmin-Olga
2006-01-01
Background Greece has the highest smoking rates in Europe's 15-nation bloc. The purpose of this study was to investigate Greek smokers' intention and appraisal of capability to quit, employing the theoretical frameworks of Decisional Balance (DB) and Cognitive Dissonance (CD). Methods A cross-sectional study including 401 Greek habitual smokers (205 men and 195 women), falling into four groups according to their intention and self-appraised capability to quit smoking, was carried out. Participants completed a questionnaire recording their attitude towards smoking, intention and self-appraised capability to quit smoking, socio-demographic information, as well as a DB and a CD scale. Results The most numerous group of smokers (38%) consisted of those who neither intended nor felt capable to quit, and these smokers perceived more benefits of smoking than negatives. DB changed gradually according to smokers' "readiness" to quit: the more ready they felt to quit, the less the pros of smoking outnumbered the cons. Regarding relief of CD, smokers who intended but did not feel capable to quit employed more "excuses" compared to those who felt capable. Additionally, smokers with a past history of unsuccessful quit attempts employed fewer "excuses" even though they were more frequently found among those who intended but did not feel capable to quit. Conclusion Findings provide support for the DB theory. On the other hand, "excuses" do not appear to be extensively employed to reduce the conflict between smoking and concern for health. There is much heterogeneity regarding smokers' intention and appraised capability to quit, reflecting theoretical and methodological problems with the distinction among stages of change. Harm reduction programs and interventions designed to increase the implementation of smoking cessation should take into account the detrimental effect of past unsuccessful quit attempts. PMID:16569250
System Dynamics Aviation Readiness Modeling Demonstration
2005-08-31
requirements. It is recommended that the Naval Aviation Enterprise take a close look at the requirements, i.e., performance measures, methodology ... unit's capability to perform specific Joint Mission Essential Task List (JMETL) requirements now and in the future. This assessment methodology must ... the time-associated costs. The new methodology must base decisions on currently available data and databases. A "useful" readiness model should be
Nanosatellite and Plug-and-Play Architecture 2 (NAPA 2)
2017-02-28
potentially other militarily relevant roles. The "i-Missions" focus area studies the kinetics of rapid mission development. The methodology involves ... the US and Sweden in the Nanosatellite and Plug-and-play Architecture or "NAPA" program) is to pioneer a methodology for creating mission-capable 6U ... spacecraft. The methodology involves interchangeable blackbox (self-describing) components, software (middleware and applications), advanced
CORSSTOL: Cylinder Optimization of Rings, Skin, and Stringers with Tolerance sensitivity
NASA Technical Reports Server (NTRS)
Finckenor, J.; Bevill, M.
1995-01-01
Cylinder Optimization of Rings, Skin, and Stringers with Tolerance sensitivity (CORSSTOL) is a design optimization program incorporating a method to examine the effects of user-provided manufacturing tolerances on weight and failure. CORSSTOL gives designers a tool to determine tolerances based on need. This is a decisive way to choose the best design among several manufacturing methods with differing capabilities and costs. CORSSTOL initially optimizes a stringer-stiffened cylinder for weight without tolerances. The skin and stringer geometry are varied, subject to stress and buckling constraints. Then the same analysis and optimization routines are used to minimize the maximum material condition weight subject to the least favorable combination of tolerances. The adjusted optimum dimensions are provided with the weight and constraint sensitivities of each design variable. The designer can immediately identify critical tolerances. The safety of parts made out of tolerance can also be determined. During design and development of weight-critical systems, design/analysis tools that provide product-oriented results are of vital significance. The development of this program and methodology provides designers with an effective cost- and weight-saving design tool. The tolerance sensitivity method can be applied to any system defined by a set of deterministic equations.
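The core tolerance idea can be illustrated with a one-variable sketch: size the nominal skin thickness so that the least-material condition (nominal minus tolerance) still satisfies the stress constraint, and observe the weight penalty of looser tolerances. All loads and properties below are assumed numbers, not CORSSTOL's equations.

```python
# One-variable tolerance illustration; all values are assumptions.
RHO = 2.7e3            # aluminum density, kg/m^3
LOAD, WIDTH = 2.0e5, 1.0   # running load (N) on a unit-width panel (m)
SIGMA_ALLOW = 2.5e8    # allowable stress, Pa

def min_feasible_thickness(tolerance):
    t_req = LOAD / (WIDTH * SIGMA_ALLOW)  # thickness needed at worst case
    return t_req + tolerance              # nominal must absorb the tolerance

for tol in (0.0, 0.1e-3, 0.5e-3):         # manufacturing tolerance, m
    t_nom = min_feasible_thickness(tol)
    weight = RHO * t_nom                  # kg per m^2 of skin
    print(f"tol={tol*1e3:.1f} mm -> nominal t={t_nom*1e3:.2f} mm, "
          f"weight={weight:.2f} kg/m^2")
```

Tighter tolerances buy weight directly, which is exactly the cost-versus-capability trade the program exposes.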
Improving the Unsteady Aerodynamic Performance of Transonic Turbines using Neural Networks
NASA Technical Reports Server (NTRS)
Rai, Man Mohan; Madavan, Nateri K.; Huber, Frank W.
1999-01-01
A recently developed neural net-based aerodynamic design procedure is used in the redesign of a transonic turbine stage to improve its unsteady aerodynamic performance. The redesign procedure used incorporates the advantages of both traditional response surface methodology and neural networks by employing a strategy called parameter-based partitioning of the design space. Starting from the reference design, a sequence of response surfaces based on both neural networks and polynomial fits are constructed to traverse the design space in search of an optimal solution that exhibits improved unsteady performance. The procedure combines the power of neural networks and the economy of low-order polynomials (in terms of number of simulations required and network training requirements). A time-accurate, two-dimensional, Navier-Stokes solver is used to evaluate the various intermediate designs and provide inputs to the optimization procedure. The procedure yielded a modified design that improves the aerodynamic performance through small changes to the reference design geometry. These results demonstrate the capabilities of the neural net-based design procedure, and also show the advantages of including high-fidelity unsteady simulations that capture the relevant flow physics in the design optimization process.
Neural Net-Based Redesign of Transonic Turbines for Improved Unsteady Aerodynamic Performance
NASA Technical Reports Server (NTRS)
Madavan, Nateri K.; Rai, Man Mohan; Huber, Frank W.
1998-01-01
A recently developed neural net-based aerodynamic design procedure is used in the redesign of a transonic turbine stage to improve its unsteady aerodynamic performance. The redesign procedure used incorporates the advantages of both traditional response surface methodology (RSM) and neural networks by employing a strategy called parameter-based partitioning of the design space. Starting from the reference design, a sequence of response surfaces based on both neural networks and polynomial fits are constructed to traverse the design space in search of an optimal solution that exhibits improved unsteady performance. The procedure combines the power of neural networks and the economy of low-order polynomials (in terms of number of simulations required and network training requirements). A time-accurate, two-dimensional, Navier-Stokes solver is used to evaluate the various intermediate designs and provide inputs to the optimization procedure. The optimization procedure yields a modified design that improves the aerodynamic performance through small changes to the reference design geometry. The computed results demonstrate the capabilities of the neural net-based design procedure, and also show the tremendous advantages that can be gained by including high-fidelity unsteady simulations that capture the relevant flow physics in the design optimization process.
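The response-surface strategy shared by these two papers can be miniaturized as follows: sample an expensive objective (a cheap stand-in here for the unsteady Navier-Stokes evaluations), fit both a low-order polynomial and a neural-net surface, and query each for a candidate optimum. The objective and all settings are invented for the example.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
def objective(x):                 # stand-in for an unsteady-performance metric
    return (x - 0.3) ** 2

x_s = rng.uniform(0, 1, 30)       # 30 "simulation" samples of the design space
y_s = objective(x_s)

# Polynomial response surface: the vertex is recovered analytically.
c2, c1, c0 = np.polyfit(x_s, y_s, 2)
print(f"polynomial surface minimum: x = {-c1 / (2 * c2):.3f}")

# Neural-net response surface: queried on a dense grid instead.
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                   random_state=0).fit(x_s.reshape(-1, 1), y_s)
x_grid = np.linspace(0, 1, 1001).reshape(-1, 1)
x_best = x_grid[np.argmin(net.predict(x_grid))][0]
print(f"neural-net surface minimum: x = {x_best:.3f}")
# Both should land near the true minimum at x = 0.3
```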
Re-Engineering Complex Legacy Systems at NASA
NASA Technical Reports Server (NTRS)
Ruszkowski, James; Meshkat, Leila
2010-01-01
The Flight Production Process (FPP) Re-engineering project has established a Model-Based Systems Engineering (MBSE) methodology and the technological infrastructure for the design and development of a reference, product-line architecture as well as an integrated workflow model for the Mission Operations System (MOS) for human space exploration missions at NASA Johnson Space Center. The design and architectural artifacts have been developed based on the expertise and knowledge of numerous Subject Matter Experts (SMEs). The technological infrastructure developed by the FPP Re-engineering project has enabled the structured collection and integration of this knowledge and further provides simulation and analysis capabilities for optimization purposes. A key strength of this strategy has been the judicious combination of COTS products with custom coding. The lean management approach that has led to the success of this project is based on having a strong vision for the whole lifecycle of the project and its progress over time, a goal-based design and development approach, a small team of highly specialized people in areas that are critical to the project, and an interactive approach for infusing new technologies into existing processes. This project, which has had a relatively small amount of funding, is on the cutting edge with respect to the utilization of model-based design and systems engineering. An overarching challenge that was overcome by this project was to convince upper management of the needs and merits of giving up more conventional design methodologies (such as paper-based documents and unwieldy and unstructured flow diagrams and schedules) in favor of advanced model-based systems engineering approaches.
Molina-Viedma, Ángel Jesús; López-Alba, Elías; Felipe-Sesé, Luis; Díaz, Francisco A; Rodríguez-Ahlquist, Javier; Iglesias-Vallejo, Manuel
2018-02-02
In real aircraft structures the comfort and the occupational performance of crewmembers and passengers are affected by the presence of noise. In this sense, special attention is focused on mechanical and material design for isolation and vibration control. Experimental characterization and, in particular, experimental modal analysis, provides information for adequate cabin noise control. Traditional sensors employed in the aircraft industry for this purpose are invasive and provide a low spatial resolution. This paper presents a methodology for experimental modal characterization of a front fuselage full-scale demonstrator using high-speed 3D digital image correlation, which is non-invasive, ensuring that the structural response is unperturbed by the instrumentation mass. Specifically, full-field measurements on the passenger window area were conducted when the structure was excited using an electrodynamic shaker. The spectral analysis of the measured time-domain displacements made it possible to identify natural frequencies and full-field operational deflection shapes. Changes in the modal parameters due to cabin pressurization and the behavior of different local structural modifications were assessed using this methodology. The proposed full-field methodology allowed the characterization of relevant dynamic response patterns, complementing the capabilities provided by accelerometers.
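The step from time-domain displacements to natural frequencies is ordinary spectral peak-picking, sketched below on a synthetic two-mode signal. Sampling rate, mode frequencies, and noise level are assumptions; the paper applies the same idea to full-field DIC displacement data.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 2000.0                              # sampling rate, Hz (assumed)
t = np.arange(0, 4.0, 1 / fs)
rng = np.random.default_rng(3)
# Two synthetic "structural modes" at 35 Hz and 82 Hz plus noise
x = (1.0 * np.sin(2 * np.pi * 35 * t) + 0.4 * np.sin(2 * np.pi * 82 * t)
     + 0.05 * rng.standard_normal(t.size))

spec = np.abs(np.fft.rfft(x * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Local maxima at least 5 Hz apart and above 10% of the strongest line
peak_idx, _ = find_peaks(spec, height=spec.max() * 0.1, distance=20)
print("estimated natural frequencies (Hz):", np.round(freqs[peak_idx], 1))
# -> approximately [35. 82.]
```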
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pointer, William David; Shaver, Dillon; Liu, Yang
The U.S. Department of Energy, Office of Nuclear Energy charges participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with the development of advanced modeling and simulation capabilities that can be used to address design, performance and safety challenges in the development and deployment of advanced reactor technology. The NEAMS has established a high impact problem (HIP) team to demonstrate the applicability of these tools to identification and mitigation of sources of steam generator flow induced vibration (SGFIV). The SGFIV HIP team is working to evaluate vibration sources in an advanced helical coil steam generator using computational fluid dynamics (CFD) simulations of the turbulent primary coolant flow over the outside of the tubes and CFD simulations of the turbulent multiphase boiling secondary coolant flow inside the tubes integrated with high resolution finite element method assessments of the tubes and their associated structural supports. This report summarizes the demonstration of a methodology for the multiphase boiling flow analysis inside the helical coil steam generator tube. A helical coil steam generator configuration has been defined based on the experiments completed by Polytecnico di Milano in the SIET helical coil steam generator tube facility. Simulations of the defined problem have been completed using the Eulerian-Eulerian multi-fluid modeling capabilities of the commercial CFD code STAR-CCM+. Simulations suggest that the two phases will quickly stratify in the slightly inclined pipe of the helical coil steam generator. These results have been successfully benchmarked against both empirical correlations for pressure drop and simulations using an alternate CFD methodology, the dispersed phase mixture modeling capabilities of the open source CFD code Nek5000.
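As a taste of the empirical-correlation side of that benchmark, the sketch below evaluates a homogeneous-equilibrium frictional pressure gradient for boiling flow in a tube. The correlation choice and all numbers are assumptions for illustration, not the report's actual benchmark cases.

```python
# Homogeneous-equilibrium two-phase pressure-gradient estimate; all inputs
# are assumed, representative values, not data from the report.
G = 500.0            # mass flux, kg/(m^2 s)
x_q = 0.2            # flow quality (vapor mass fraction)
D = 0.0127           # tube inner diameter, m
rho_l, rho_g = 740.0, 36.0   # saturated liquid/vapor densities, kg/m^3
mu_l = 9.0e-5        # liquid viscosity, Pa s

rho_h = 1.0 / (x_q / rho_g + (1 - x_q) / rho_l)   # homogeneous mixture density
Re = G * D / mu_l                                  # liquid-viscosity Reynolds number
f = 0.079 * Re ** -0.25                            # Blasius friction factor
dpdz = 2 * f * G**2 / (D * rho_h)                  # frictional gradient, Pa/m
print(f"Re = {Re:.3e}, dP/dz = {dpdz:.1f} Pa/m")
```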
Analysis and design of a 3rd order velocity-controlled closed-loop for MEMS vibratory gyroscopes.
Wu, Huan-ming; Yang, Hai-gang; Yin, Tao; Jiao, Ji-wei
2013-09-18
The time-average method currently available is limited to analyzing the specific performance of the automatic gain control-proportional and integral (AGC-PI) based velocity-controlled closed-loop in a micro-electro-mechanical systems (MEMS) vibratory gyroscope, since it is hard to solve nonlinear functions in the time domain when the control loop reaches 3rd order. In this paper, we propose a linearization design approach to overcome this limitation by establishing a 3rd order linear model of the control loop and transferring the analysis to the frequency domain. Order reduction is applied on the built linear model's transfer function by constructing a zero-pole doublet, and therefore a mathematical expression of each control loop performance specification is obtained. Then an optimization methodology is summarized, which reveals that a robust, stable and swift control loop can be achieved by carefully selecting the system parameters following a priority order. Closed-loop drive circuits are designed and implemented using a 0.35 μm complementary metal oxide semiconductor (CMOS) process, and experiments carried out on a gyroscope prototype verify the optimization methodology: an optimized stability of the control loop can be achieved by constructing the zero-pole doublet, and the disturbance rejection capability (D.R.C) of the control loop can be improved by increasing the integral term.
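The order-reduction idea is easy to reproduce numerically: place a zero close to the third pole so the loop behaves like its dominant 2nd-order core. The coefficients below are arbitrary illustrations, not the paper's gyroscope loop model.

```python
import numpy as np
from scipy import signal

# Dominant 2nd-order pair (wn, zeta), an extra pole at p3, and a zero at z1
# placed near that pole to form the zero-pole doublet. Values assumed.
wn, zeta, p3, z1 = 100.0, 0.7, 500.0, 505.0

num = np.polymul([1 / z1, 1.0], [wn**2])                  # zero at s = -z1
den = np.polymul([1.0, 2 * zeta * wn, wn**2], [1 / p3, 1.0])  # pole at s = -p3
sys3 = signal.TransferFunction(num, den)

print("poles:", np.sort_complex(sys3.poles))
print("zeros:", sys3.zeros)
# The pole near -500 and the zero near -505 nearly cancel, so the step
# response is governed by the dominant 2nd-order pair.
t, y = signal.step(sys3)
print(f"step-response final value: {y[-1]:.3f}")  # ~1.0 (unity DC gain)
```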
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwin A. Harvego; James E. O'Brien; Michael G. McKellar
2012-11-01
Results of a system evaluation and lifecycle cost analysis are presented for a commercial-scale high-temperature electrolysis (HTE) central hydrogen production plant. The plant design relies on grid electricity to power the electrolysis process and system components, and industrial natural gas to provide process heat. The HYSYS process analysis software was used to evaluate the reference central plant design capable of producing 50,000 kg/day of hydrogen. The HYSYS software performs mass and energy balances across all components to allow optimization of the design using a detailed process flow sheet and realistic operating conditions specified by the analyst. The lifecycle cost analysis was performed using the H2A analysis methodology developed by the Department of Energy (DOE) Hydrogen Program. This methodology utilizes Microsoft Excel spreadsheet analysis tools that require detailed plant performance information (obtained from HYSYS), along with financial and cost information to calculate lifecycle costs. The results of the lifecycle analyses indicate that for a 10% internal rate of return, a large central commercial-scale hydrogen production plant can produce 50,000 kg/day of hydrogen at an average cost of $2.68/kg. When the cost of carbon sequestration is taken into account, the average cost of hydrogen production increases by $0.40/kg to $3.08/kg.
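The core of such a lifecycle calculation is an annualized discounted cash flow. The sketch below shows the shape of that computation under loudly simplified assumptions: the capital cost, operating cost, plant life, and capacity factor are invented round numbers chosen only so the output lands near the reported range; they are not the study's H2A inputs.

```python
# Hedged sketch: levelized hydrogen cost from annualized capital plus operating
# costs, in the spirit of (but far simpler than) the H2A methodology.
def levelized_h2_cost(capital, annual_opex, kg_per_day, irr=0.10, years=20,
                      capacity_factor=0.90):
    # Capital recovery factor converts the up-front investment into an
    # equivalent annual payment at the required internal rate of return.
    crf = irr * (1 + irr) ** years / ((1 + irr) ** years - 1)
    annual_kg = kg_per_day * 365 * capacity_factor
    return (capital * crf + annual_opex) / annual_kg

# 50,000 kg/day plant with assumed $250M capital and $15M/yr operating cost.
print(f"${levelized_h2_cost(250e6, 15e6, 50000):.2f}/kg")
```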
Horizon Mission Methodology - A tool for the study of technology innovation and new paradigms
NASA Technical Reports Server (NTRS)
Anderson, John L.
1993-01-01
The Horizon Mission (HM) methodology was developed to provide a means of identifying and evaluating highly innovative, breakthrough technology concepts (BTCs) and for assessing their potential impact on advanced space missions. The methodology is based on identifying new capabilities needed by hypothetical 'horizon' space missions having performance requirements that cannot be met even by extrapolating known space technologies. Normal human evaluation of new ideas such as BTCs appears to be governed (and limited) by 'inner models of reality' defined as paradigms. Thus, new ideas are evaluated by old models. This paper describes the use of the HM Methodology to define possible future paradigms that would provide alternatives to evaluation by current paradigms. The approach is to represent a future paradigm by a set of new BTC-based capabilities - called a paradigm abstract. The paper describes methods of constructing and using the abstracts for evaluating BTCs for space applications and for exploring the concept of paradigms and paradigm shifts as a representation of technology innovation.
The effect of requirements prioritization on avionics system conceptual design
NASA Astrophysics Data System (ADS)
Lorentz, John
This dissertation will provide a detailed approach and analysis of a new collaborative requirements prioritization methodology that has been used successfully on four Coast Guard avionics acquisition and development programs valued at $400M+. A statistical representation of participant study results will be discussed and analyzed in detail. Many technically compliant projects fail to deliver levels of performance and capability that the customer desires. Some of these systems completely meet "threshold" levels of performance; however, the distribution of resources devoted to the development and management of the requirements does not always represent the voice of the customer. This is especially true for technically complex projects such as modern avionics systems. A simplified, facilitated process for prioritization of system requirements will be described. The collaborative prioritization process, and the resulting artifacts, aid the systems engineer during early conceptual design. All requirements are not the same in terms of customer priority. While there is a tendency to have many thresholds within a system design, there is usually a subset of requirements and system performance that is of the utmost importance to the design. These critical capabilities and critical levels of performance typically represent the reason the system is being built. The systems engineer needs processes to identify these critical capabilities, the associated desired levels of performance, and the risks associated with the specific requirements that define the critical capability. The facilitated prioritization exercise is designed to collaboratively draw out these critical capabilities and levels of performance so they can be emphasized in system design. Developing the purpose, schedule, and process for prioritization events is a key element of systems engineering and modern project management. The benefits of early collaborative prioritization flow throughout the project schedule, resulting in greater success during system deployment and operational testing. This dissertation will discuss the data and findings from participant studies, present a literature review of systems engineering and design processes, and test the hypothesis that the prioritization process had no effect on stakeholder sentiment related to the conceptual design. In addition, the "Requirements Rationalization" process will be discussed in detail. Avionics, like many other systems, has transitioned from a discrete electronics, hard-engineering discipline to one that incorporates software engineering as a core process of the technology development cycle. As with other software-based systems, avionics now has significant soft-system attributes that must be considered in the design process. The boundless opportunities that exist in software design demand prioritization to focus effort onto the critical functions that the software must provide. This has been a well-documented and understood phenomenon in the software development community for many years. This dissertation will attempt to link the effect of software-integrated avionics to the benefits of requirements prioritization in the problem space and demonstrate the sociological and technical benefits of early prioritization practices.
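One common mechanic for such a facilitated event is cumulative voting, where each stakeholder distributes a fixed budget of points across candidate requirements. The sketch below is a minimal illustration of that aggregation step only; the requirement names and ballots are hypothetical, and the dissertation's actual methodology is not reduced to this single technique.

```python
# Hedged sketch: cumulative voting ("$100 test") aggregation across stakeholders.
from collections import Counter

def aggregate_votes(ballots):
    """Each ballot distributes 100 points across requirements; totals rank them."""
    totals = Counter()
    for ballot in ballots:
        assert sum(ballot.values()) == 100, "each stakeholder allocates 100 points"
        totals.update(ballot)
    return totals.most_common()

ballots = [  # hypothetical stakeholder allocations
    {"detection range": 50, "availability": 30, "weight": 20},
    {"detection range": 40, "availability": 40, "weight": 20},
    {"detection range": 60, "availability": 25, "weight": 15},
]
for req, score in aggregate_votes(ballots):
    print(f"{req:16s} {score}")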
Szczurek, Aleksander; Birk, Udo; Knecht, Hans; Dobrucki, Jurek; Mai, Sabine; Cremer, Christoph
2018-01-01
Methods of super-resolving light microscopy (SRM) have found an exponentially growing range of applications in cell biology, including nuclear structure analyses. Recent developments have proven that Single Molecule Localization Microscopy (SMLM), a type of SRM, is particularly useful for enhanced spatial analysis of the cell nucleus due to its highest resolving capability combined with very specific fluorescent labeling. In this commentary we offer a brief review of the latest methodological development in the field of SMLM of chromatin designated DNA Structure Fluctuation Assisted Binding Activated Localization Microscopy (abbreviated as fBALM) as well as its potential future applications in biology and medicine.
Advanced flight control system study
NASA Technical Reports Server (NTRS)
Hartmann, G. L.; Wall, J. E., Jr.; Rang, E. R.; Lee, H. P.; Schulte, R. W.; Ng, W. K.
1982-01-01
A fly by wire flight control system architecture designed for high reliability includes spare sensor and computer elements to permit safe dispatch with failed elements, thereby reducing unscheduled maintenance. A methodology capable of demonstrating that the architecture does achieve the predicted performance characteristics consists of a hierarchy of activities ranging from analytical calculations of system reliability and formal methods of software verification to iron bird testing followed by flight evaluation. Interfacing this architecture to the Lockheed S-3A aircraft for flight test is discussed. This testbed vehicle can be expanded to support flight experiments in advanced aerodynamics, electromechanical actuators, secondary power systems, flight management, new displays, and air traffic control concepts.
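The dispatch-with-spares argument rests on simple redundancy arithmetic: the system remains dispatchable as long as at least k of its n redundant elements stay healthy, so adding a spare raises the probability of completing the flight. The sketch below shows a k-out-of-n binomial model with an assumed per-element reliability; actual fly-by-wire reliability analyses are far more elaborate (e.g., Markov models with repair and coverage), and none of these figures come from the study.

```python
# Hedged sketch: k-out-of-n reliability with independent, identical elements.
from math import comb

def k_of_n_reliability(n, k, p):
    """Probability that at least k of n independent elements survive."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p = 0.999  # assumed per-flight element reliability (illustrative)
print("triplex, 2 required:   ", k_of_n_reliability(3, 2, p))
print("quadruplex, 2 required:", k_of_n_reliability(4, 2, p))  # with a spare
```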
Hydrocode predictions of collisional outcomes: Effects of target size
NASA Technical Reports Server (NTRS)
Ryan, Eileen V.; Asphaug, Erik; Melosh, H. J.
1991-01-01
Traditionally, laboratory impact experiments designed to simulate asteroid collisions have attempted to establish a predictive capability for collisional outcomes given a particular set of initial conditions. Unfortunately, laboratory experiments are restricted to using targets considerably smaller than the modelled objects. It is therefore necessary to develop some methodology for extrapolating the extensive experimental results to the size regime of interest. Results are reported that were obtained using a two-dimensional hydrocode based on 2-D SALE and modified to include strength effects and the fragmentation equations. The hydrocode was tested by comparing its predictions for post-impact fragment size distributions to those observed in laboratory impact experiments.
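The extrapolation step is often implemented as a power-law fit in log-log space, calibrated on laboratory-scale outcomes and then evaluated at asteroid scales. The sketch below shows that pattern with synthetic stand-in data; the data points, the threshold quantity, and the fitted slope are assumptions for illustration, not results from this hydrocode study.

```python
# Hedged sketch: log-log power-law fit and size extrapolation on synthetic data.
import numpy as np

radius_cm = np.array([1.0, 2.0, 4.0, 8.0])        # lab-scale target radii
q_star = np.array([1.2e7, 9.0e6, 7.1e6, 5.5e6])   # assumed shattering threshold, erg/g

slope, intercept = np.polyfit(np.log10(radius_cm), np.log10(q_star), 1)

def extrapolate(radius_cm_big):
    """Evaluate the fitted power law at a much larger target radius."""
    return 10 ** (intercept + slope * np.log10(radius_cm_big))

print(f"fitted slope: {slope:.2f}")
print(f"Q* at 1 km:   {extrapolate(1.0e5):.2e} erg/g")
```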
The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic
NASA Technical Reports Server (NTRS)
Armstrong, Curtis D.; Humphreys, William M.
2003-01-01
We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of AQUINAS (A Quantum Interconnected Network Array Simulator) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis and report generation.
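The Monte Carlo front end amounts to sampling manufacturing variations around nominal device parameters and emitting one input file per realization. The sketch below shows that pattern; the parameter names, tolerances, and JSON file format are hypothetical stand-ins, not the actual TOMAS/AQUINAS interfaces.

```python
# Hedged sketch: Monte Carlo generation of simulation input files.
import json
import random

random.seed(0)
NOMINAL = {"cell_spacing_nm": 20.0, "dot_diameter_nm": 5.0}  # assumed nominals
SIGMA = {"cell_spacing_nm": 0.5, "dot_diameter_nm": 0.2}     # assumed tolerances

def make_trial(index, defect_rate=0.01):
    # Gaussian manufacturing variation around each nominal parameter.
    params = {k: random.gauss(NOMINAL[k], SIGMA[k]) for k in NOMINAL}
    params["defect_rate"] = defect_rate
    with open(f"trial_{index:04d}.json", "w") as f:
        json.dump(params, f, indent=2)

for i in range(1000):  # 1000 Monte Carlo realizations of the gate layout
    make_trial(i)
```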
High-Performance Modeling and Simulation of Anchoring in Granular Media for NEO Applications
NASA Technical Reports Server (NTRS)
Quadrelli, Marco B.; Jain, Abhinandan; Negrut, Dan; Mazhar, Hammad
2012-01-01
NASA is interested in designing a spacecraft capable of visiting a near-Earth object (NEO), performing experiments, and then returning safely. Certain periods of this mission would require the spacecraft to remain stationary relative to the NEO, in an environment characterized by very low gravity levels; such situations require an anchoring mechanism that is compact, easy to deploy, and upon mission completion, easy to remove. The design philosophy used in this task relies on the simulation capability of a high-performance multibody dynamics physics engine. On Earth, it is difficult to create low-gravity conditions, and testing in low-gravity environments, whether artificial or in space, can be costly and very difficult to achieve. Through simulation, the effect of gravity can be controlled with great accuracy, making it ideally suited to analyze the problem at hand. Using Chrono::Engine, a simulation package capable of utilizing massively parallel Graphics Processing Unit (GPU) hardware, several validation experiments were performed. Modeling of the regolith interaction was carried out, after which anchor penetration tests were performed and analyzed. The regolith was modeled by a granular medium composed of very large numbers of convex three-dimensional rigid bodies, subject to microgravity levels and interacting with each other with contact, friction, and cohesive forces. The multibody dynamics simulation approach used for simulating anchors penetrating a soil uses a differential variational inequality (DVI) methodology to solve the contact problem posed as a linear complementarity problem (LCP). Implemented within a GPU processing environment, collision detection is greatly accelerated compared to traditional CPU (central processing unit)-based collision detection. Hence, systems of millions of particles interacting with complex dynamic systems can be efficiently analyzed, and design recommendations can be made in a much shorter time. The figure shows an example of this capability where the Brazil Nut problem is simulated: as the container full of granular material is vibrated, the large ball slowly moves upwards. This capability was expanded to account for anchors of different shapes and penetration velocities, interacting with granular soils.
Transient Reliability Analysis Capability Developed for CARES/Life
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.
2001-01-01
The CARES/Life software developed at the NASA Glenn Research Center provides a general-purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. This award-winning software has been widely used by U.S. industry to establish the reliability and life of brittle material (e.g., ceramic, intermetallic, and graphite) structures in a wide variety of 21st century applications. Present capabilities of the NASA CARES/Life code include probabilistic life prediction of ceramic components subjected to fast fracture, slow crack growth (stress corrosion), and cyclic fatigue failure modes. Currently, this code can compute the time-dependent reliability of ceramic structures subjected to simple time-dependent loading. For example, in slow crack growth failure conditions CARES/Life can handle sustained and linearly increasing time-dependent loads, whereas in cyclic fatigue applications various types of repetitive constant-amplitude loads can be accounted for. However, in real applications applied loads are rarely that simple but vary with time in more complex ways, such as engine startup, shutdown, and dynamic and vibrational loads. In addition, when a given component is subjected to transient environmental and/or thermal conditions, the material properties also vary with time. A methodology has now been developed to allow the CARES/Life computer code to perform reliability analysis of ceramic components undergoing transient thermal and mechanical loading. This means that CARES/Life will be able to analyze finite element models of ceramic components that simulate dynamic engine operating conditions. The methodology developed is generalized to account for material property variation (in strength distribution and fatigue) as a function of temperature. This allows CARES/Life to analyze components undergoing rapid temperature change, in other words, components undergoing thermal shock. In addition, the capability has been developed to perform reliability analysis for components that undergo proof testing involving transient loads. This methodology was developed for environmentally assisted crack growth (crack growth as a function of time and loading), but it will be extended to account for cyclic fatigue (crack growth as a function of load cycles) as well.
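To make the transient idea concrete, the sketch below reduces a discretized stress history to a power-law-averaged equivalent static stress (a common slow-crack-growth simplification) and feeds it into a two-parameter Weibull failure probability. This is a heavily simplified, assumed formulation for illustration only; CARES/Life's actual transient methodology handles multiaxial finite element stress fields and temperature-dependent parameters, and all numbers below are invented.

```python
# Hedged sketch: Weibull failure probability for a transient stress history.
import numpy as np

def failure_probability(t, sigma, n_scg, sigma0, m):
    """P_f for a discretized stress history sigma(t) on time grid t (assumed model)."""
    # Power-law-averaged equivalent static stress (slow-crack-growth exponent n_scg).
    integrand = sigma ** n_scg
    integral = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(t))
    sigma_eq = (integral / (t[-1] - t[0])) ** (1.0 / n_scg)
    # Two-parameter Weibull failure probability at the equivalent stress.
    return 1.0 - np.exp(-((sigma_eq / sigma0) ** m))

t = np.linspace(0.0, 100.0, 1001)            # mission time, hours
sigma = 200.0 + 80.0 * np.exp(-t / 5.0)      # startup transient decaying to steady state, MPa
print(failure_probability(t, sigma, n_scg=20, sigma0=400.0, m=10))
```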
Final Technical Report for "Nuclear Technologies for Near Term Fusion Devices"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Paul P.H.; Sawan, Mohamed E.; Davis, Andrew
Over approximately 18 years, this project evolved to focus on a number of related topics, all tied to the nuclear analysis of fusion energy systems. For the earliest years, the University of Wisconsin (UW)’s effort was in support of the Advanced Power Extraction (APEX) study to investigate high power density first wall and blanket systems. A variety of design concepts were studied before this study gave way to a design effort for a US Test Blanket Module (TBM) to be installed in ITER. In parallel with the TBM project, nuclear analysis supported the conceptual design of a number of fusion nuclear science facilities that might fill a role in the path to fusion energy. Beginning in approximately 2005, this project added a component focused on the development of novel radiation transport software capability in support of the above nuclear analysis needs. Specifically, a clear need was identified to support neutron and photon transport on the complex geometries associated with Computer-Aided Design (CAD). Following the initial development of the Direct Accelerated Geometry Monte Carlo (DAGMC) capability, additional features were added, including unstructured mesh tallies and multi-physics analysis such as the Rigorous 2-Step (R2S) methodology for Shutdown Dose Rate (SDR) prediction. Throughout the project, there were also smaller tasks in support of the fusion materials community and for the testing of changes to the nuclear data that is fundamental to this kind of nuclear analysis.
Metric integration architecture for product development
NASA Astrophysics Data System (ADS)
Sieger, David B.
1997-06-01
Present-day product development endeavors utilize the concurrent engineering philosophy as a logical means for incorporating a variety of viewpoints into the design of products. Since this approach provides no explicit procedural provisions, it is necessary to establish at least a mental coupling with a known design process model. The central feature of all such models is the management and transformation of information. While these models assist in structuring the design process, characterizing the basic flow of operations that are involved, they provide no guidance facilities. The significance of this feature, and the role it plays in the time required to develop products, is increasing due to the inherent process dynamics, system/component complexities, and competitive forces. The methodology presented in this paper involves the use of a hierarchical system structure, discrete event system specification (DEVS), and multidimensional state-variable-based metrics. This approach is unique in its capability to quantify designers' actions throughout product development, provide recommendations about subsequent activity selection, and coordinate distributed activities of designers and/or design teams across all design stages. Conceptual design tool implementation results are used to demonstrate the utility of this technique in improving the incremental decision making process.
High-speed aerodynamic design of space vehicle and required hypersonic wind tunnel facilities
NASA Astrophysics Data System (ADS)
Sakakibara, Seizou; Hozumi, Kouichi; Soga, Kunio; Nomura, Shigeaki
Problems associated with the aerodynamic design of space vehicles are considered, with emphasis on the role of hypersonic wind tunnel facilities in the development of the vehicle. First, to identify wind tunnel and computational fluid dynamics (CFD) requirements, operational environments are postulated for hypervelocity vehicles. Typical flight corridors are shown with the associated flow phenomena: real gas effects, low density flow, and non-equilibrium flow. Based on an evaluation of these flight regimes and consideration of the operational requirements, the wind tunnel testing requirements for the aerodynamic design are examined. Then, the aerodynamic design logic and optimization techniques used to develop and refine configurations in a traditional phased approach, based on the programmatic design of the space vehicle, are considered. Current design methodologies for determining the aerodynamic characteristics of the space vehicle, i.e., (1) ground test data, (2) numerical flow field solutions, and (3) flight test data, are also discussed. Based on these considerations, and by identifying the capabilities and limits of experimental and computational methods, the roles of a large conventional hypersonic wind tunnel and the high enthalpy tunnel, and the interrelationship of wind tunnels and CFD methods in actual aerodynamic design and analysis, are discussed.
Television Futures: A Social Action Methodology for Studying Interpretive Communities.
ERIC Educational Resources Information Center
Jensen, Klaus Bruhn
1990-01-01
Presents a qualitative methodology employing workshop sessions for studying audience assessment of the mass media's service to the public. Finds that viewers are capable of a sophisticated critique of television, and raises implications both for the politics of communication and for further reception studies. (MG)
ERIC Educational Resources Information Center
Heck, Ronald H.
1996-01-01
Identifies salient conceptual and methodological issues involved in cross-cultural research. Surveys principals and teachers from California and the Marshall Islands regarding perceptions of principals' leadership capabilities in three areas: school governance, school climate and culture, and instructional organization. There was substantial…
To evaluate the capabilities of in vitro assays to predict arsenic (As) relative bioavailability (RBA), we examined the relationship between As bioaccessibility, determined using a number of in vitro bioaccessibility (IVBA) methodologies (SBRC, IVG, PBET, DIN and UBM) and As RBA ...
Modeling, simulation, and control of an extraterrestrial oxygen production plant
NASA Technical Reports Server (NTRS)
Schooley, L.; Cellier, F.; Zeigler, B.; Doser, A.; Farrenkopf, G.
1991-01-01
The immediate objective is the development of a new methodology for simulation of process plants used to produce oxygen and/or other useful materials from local planetary resources. Computer communication, artificial intelligence, smart sensors, and distributed control algorithms are being developed and implemented so that the simulation or an actual plant can be controlled from a remote location. The ultimate result of this research will provide the capability for teleoperation of such process plants which may be located on Mars, Luna, an asteroid, or other objects in space. A very useful near-term result will be the creation of an interactive design tool, which can be used to create and optimize the process/plant design and the control strategy. This will also provide a vivid, graphic demonstration mechanism to convey the results of other researchers to the sponsor.
The NASA Lewis integrated propulsion and flight control simulator
NASA Technical Reports Server (NTRS)
Bright, Michelle M.; Simon, Donald L.
1991-01-01
A new flight simulation facility was developed at NASA-Lewis. The purpose of this flight simulator is to allow integrated propulsion control and flight control algorithm development and evaluation in real time. As a preliminary check of the simulator facility capabilities and correct integration of its components, the control design and physics models for a short take-off and vertical landing fighter aircraft model were shown, with their associated system integration and architecture, pilot vehicle interfaces, and display symbology. The initial testing and evaluation results show that this fixed based flight simulator can provide real time feedback and display of both airframe and propulsion variables for validation of integrated flight and propulsion control systems. Additionally, through the use of this flight simulator, various control design methodologies and cockpit mechanizations can be tested and evaluated in a real time environment.
Research of Steel-dielectric Transition Using Subminiature Eddy-current Transducer
NASA Astrophysics Data System (ADS)
Dmitriev, S. F.; Malikov, V. N.; Sagalakov, A. M.; Ishkov, A. V.
2018-05-01
The research aims to develop a subminiature transducer for electrical steel investigation. The authors determined the capability to study steel characteristics at different depths based on variations of the eddy-current transducer amplitude at the steel-dielectric boundary. A subminiature transformer-type transducer was designed, which enables local investigations of ferromagnetic materials using an eddy-current method based on local measurements of the steel's electrical conductivity. Using the designed transducer as a basis, a hardware-software complex was built to perform experimental studies of steel at the interface boundary. Test results are reported for a specimen with continuous and discrete measurements taken at different frequencies. The article provides the key technical information about the eddy-current transformer used and describes the measurement methodology that makes it possible to inspect the steel-to-dielectric transition.
In Situ Potassium-Argon Geochronology Using Fluxed Fusion and a Double Spike
NASA Technical Reports Server (NTRS)
Hurowitz, Joel A.; Hecht, Michael H.; Zimmerman, Wayne F.; Neidholdt, Evan L.; Sinha, Mahadeva P.; Sturhahn, Wolfgang; Coleman, Max; McCleese, Daniel J.; Farley, Kenneth A.; Eiler, John M.;
2012-01-01
A document highlights a Li-based fluxing agent that enables sample fusion and quantitative Ar release at relatively low temperatures (900-1,000 C), readily achievable with current flight resistance furnace designs. A solid double spike containing known quantities of Ar-39 and K-41 was developed that, when added in known amounts to a sample, enables the extraction of an Ar-40/K-40 ratio for age estimation without a sample mass measurement. The use of a combination of a flux and a double spike as a means of solving the mechanical hurdles to an in situ K-Ar geochronology measurement has never been proposed before. This methodology and instrument design would provide a capability for assessing the ages of rocks and minerals on the surfaces of planets and other rocky terrestrial bodies in the solar system.
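The measured Ar-40/K-40 ratio ultimately feeds the standard K-Ar age equation. The sketch below uses the conventional 40K decay constants; the input ratio is an illustrative assumption, not mission data.

```python
# Hedged sketch: the standard K-Ar age equation.
import math

LAMBDA_TOTAL = 5.543e-10   # total 40K decay constant, 1/yr
LAMBDA_EC = 0.581e-10      # electron-capture branch to 40Ar, 1/yr

def k_ar_age(ar40_k40):
    """Age in years from the radiogenic 40Ar/40K mole ratio."""
    return math.log(1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * ar40_k40) / LAMBDA_TOTAL

print(f"{k_ar_age(0.25) / 1e9:.2f} Gyr")  # assumed ratio of 0.25 gives ~2.2 Gyr
```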
A programing system for research and applications in structural optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Rogers, J. L., Jr.
1981-01-01
The paper describes a computer programming system designed to be used for methodology research as well as applications in structural optimization. The flexibility necessary for such diverse utilizations is achieved by combining, in a modular manner, a state-of-the-art optimization program, a production-level structural analysis program, and user-supplied, problem-dependent interface programs. Standard utility capabilities existing in modern computer operating systems are used to integrate these programs. This approach results in flexibility of the optimization procedure organization and versatility in the formulation of constraints and design variables. Features shown in numerical examples include: (1) variability of structural layout and overall shape geometry, (2) static strength and stiffness constraints, (3) local buckling failure, and (4) vibration constraints. The paper concludes with a review of further development trends for this programming system.
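The modular pattern the paper describes, an optimizer driving a separate analysis program through thin interface routines, can be sketched in miniature. The two-bar sizing problem, material constants, and the trivial stand-in "analysis" below are assumptions for illustration only; they are not the paper's programs or examples.

```python
# Hedged sketch: an optimizer coupled to an analysis routine via interface functions.
import numpy as np
from scipy.optimize import minimize

LENGTH, LOAD, DENSITY, STRESS_ALLOW = 1.0, 1.0e5, 2700.0, 150.0e6  # m, N, kg/m3, Pa

def mass(areas):                       # objective supplied to the optimizer
    return DENSITY * LENGTH * np.sum(areas)

def stress_margins(areas):             # stand-in "structural analysis" interface
    stresses = LOAD / (2.0 * areas)    # load assumed split equally between two bars
    return STRESS_ALLOW - stresses     # feasible when nonnegative

result = minimize(mass, x0=np.array([1e-3, 1e-3]),
                  constraints=[{"type": "ineq", "fun": stress_margins}],
                  bounds=[(1e-6, None)] * 2, method="SLSQP")
print(result.x, mass(result.x))
```

Swapping the stand-in analysis for a production code while keeping the interface functions is exactly the kind of modularity the abstract emphasizes.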
Laser Threat Analysis System (LTAS)
NASA Astrophysics Data System (ADS)
Pfaltz, John M.; Richardson, Christina E.; Ruiz, Abel; Barsalou, Norman; Thomas, Robert J.
2002-11-01
LTAS is a totally integrated modeling and simulation environment designed for the purpose of ascertaining the susceptibility of Air Force pilots and air crews to optical radiation threats. Using LTAS, mission planners can assess the operational impact of optically directed energy weapons and countermeasures. Through various scenarios, threat analysts are able to determine the capability of laser threats and their impact on operational missions, including the air crew's ability to complete their mission effectively. Additionally, LTAS allows the risk of laser use on training ranges and the requirement for laser protection to be evaluated. LTAS gives mission planners and threat analysts complete control of the threat environment, including threat parameter control and placement, terrain mapping (line-of-sight), atmospheric conditions, and laser eye protection (LEP) selection. This report summarizes the design of the final version of LTAS and the modeling methodologies implemented to accomplish the analysis.
Sharma, Anjali; Kakkar, Ashok
2015-09-17
To address current complex health problems, there has been an increasing demand for smart nanocarriers that could perform multiple complementary biological tasks with high efficacy. This has provoked the design of tailor-made nanocarriers, and the scientific community has made a tremendous effort in meeting the daunting challenges associated with synthetically articulating multiple functions into a single scaffold. Branched and hyper-branched macromolecular architectures have offered opportunities in enabling carriers with capabilities including location, delivery, imaging, etc. Development of simple and versatile synthetic methodologies for these nanomaterials has been key in diversifying macromolecule-based medical therapy and treatment. This review highlights the advancement from conventional "only one function" to multifunctional nanomedicine. It is achieved by synthetic elaboration of multivalent platforms in miktoarm polymers and dendrimers by physical encapsulation, covalent linking, and combinations thereof.
Beal, Jacob; Lu, Ting; Weiss, Ron
2011-01-01
Background: The field of synthetic biology promises to revolutionize our ability to engineer biological systems, providing important benefits for a variety of applications. Recent advances in DNA synthesis and automated DNA assembly technologies suggest that it is now possible to construct synthetic systems of significant complexity. However, while a variety of novel genetic devices and small engineered gene networks have been successfully demonstrated, the regulatory complexity of synthetic systems that have been reported recently has somewhat plateaued due to a variety of factors, including the complexity of biology itself and the lag in our ability to design and optimize sophisticated biological circuitry. Methodology/Principal Findings: To address the gap between DNA synthesis and circuit design capabilities, we present a platform that enables synthetic biologists to express desired behavior using a convenient high-level biologically-oriented programming language, Proto. The high level specification is compiled, using a regulatory motif based mechanism, to a gene network, optimized, and then converted to a computational simulation for numerical verification. Through several example programs we illustrate the automated process of biological system design with our platform, and show that our compiler optimizations can yield significant reductions in the number of genes and latency of the optimized engineered gene networks. Conclusions/Significance: Our platform provides a convenient and accessible tool for the automated design of sophisticated synthetic biological systems, bridging an important gap between DNA synthesis and circuit design capabilities. Our platform is user-friendly and features biologically relevant compiler optimizations, providing an important foundation for the development of sophisticated biological systems. PMID:21850228
The Distributed Geothermal Market Demand Model (dGeo): Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCabe, Kevin; Mooney, Meghan E; Sigrin, Benjamin O
The National Renewable Energy Laboratory (NREL) developed the Distributed Geothermal Market Demand Model (dGeo) as a tool to explore the potential role of geothermal distributed energy resources (DERs) in meeting thermal energy demands in the United States. The dGeo model simulates the potential for deployment of geothermal DERs in the residential and commercial sectors of the continental United States for two specific technologies: ground-source heat pumps (GHP) and geothermal direct use (DU) for district heating. To quantify the opportunity space for these technologies, dGeo leverages a highly resolved geospatial database and robust bottom-up, agent-based modeling framework. This design is consistent with others in the family of Distributed Generation Market Demand models (dGen; Sigrin et al. 2016), including the Distributed Solar Market Demand (dSolar) and Distributed Wind Market Demand (dWind) models. dGeo is intended to serve as a long-term scenario-modeling tool. It has the capability to simulate the technical potential, economic potential, market potential, and technology deployment of GHP and DU through the year 2050 under a variety of user-defined input scenarios. Through these capabilities, dGeo can provide substantial analytical value to various stakeholders interested in exploring the effects of various techno-economic, macroeconomic, financial, and policy factors related to the opportunity for GHP and DU in the United States. This report documents the dGeo modeling design, methodology, assumptions, and capabilities.
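At its core, the agent-based market-potential step asks, agent by agent, whether the technology's economics clear that agent's adoption hurdle. The toy sketch below illustrates that pattern with a simple payback criterion; all agent attributes, costs, and thresholds are invented for illustration and bear no relation to dGeo's actual data or economics.

```python
# Hedged sketch: agent-based adoption screen using a simple payback criterion.
import random

random.seed(1)
agents = [{"heat_demand_mmbtu": random.uniform(40, 120),    # annual demand
           "payback_threshold_yr": random.uniform(4, 12)} for _ in range(10000)]

GHP_CAPITAL, BASELINE_COST, GHP_COST = 5000.0, 12.0, 5.0    # $, $/MMBtu, $/MMBtu

adopters = 0
for a in agents:
    annual_savings = a["heat_demand_mmbtu"] * (BASELINE_COST - GHP_COST)
    payback = GHP_CAPITAL / annual_savings
    if payback <= a["payback_threshold_yr"]:
        adopters += 1
print(f"market potential: {adopters / len(agents):.1%} of agents")
```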
Development of GENOA Progressive Failure Parallel Processing Software Systems
NASA Technical Reports Server (NTRS)
Abdi, Frank; Minnetyan, Levon
1999-01-01
A capability consisting of software development and experimental techniques has been developed and is described. The capability is integrated into GENOA-PFA to model polymer matrix composite (PMC) structures. The capability considers the physics and mechanics of composite materials and structure through the integration of hierarchical multilevel macro-scale (lamina, laminate, and structure) and micro-scale (fiber, matrix, and interface) simulation analyses. The modeling involves (1) ply layering methodology utilizing FEM elements with through-the-thickness representation, (2) simulation of the effects of material defects and conditions (e.g., voids, fiber waviness, and residual stress) on global static and cyclic fatigue strengths, (3) inclusion of material nonlinearities (by updating properties periodically) and geometrical nonlinearities (by Lagrangian updating), (4) simulation of crack initiation and growth to failure under static, cyclic, creep, and impact loads, (5) progressive fracture analysis to determine durability and damage tolerance, (6) identification of the percent contribution of various possible composite failure modes involved in critical damage events, and (7) determination of the sensitivities of failure modes to design parameters (e.g., fiber volume fraction, ply thickness, fiber orientation, and adhesive-bond thickness). GENOA-PFA progressive failure analysis is now ready for use to investigate the effects on structural responses of PMC material degradation from damage induced by static, cyclic (fatigue), creep, and impact loading in 2D/3D PMC structures subjected to hygrothermal environments. Its use will significantly facilitate targeting design parameter changes that will be most effective in reducing the probability of a given failure mode occurring.
Characterizing Postural Sway during Quiet Stance Based on the Intermittent Control Hypothesis
NASA Astrophysics Data System (ADS)
Nomura, Taishin; Nakamura, Toru; Fukada, Kei; Sakoda, Saburo
2007-07-01
This article illustrates a signal processing methodology for time series of postural sway and accompanying electromyographs from the lower limb muscles during quiet stance. It was shown that the proposed methodology is capable of identifying the underlying postural control mechanisms. A preliminary application of the methodology provided evidence that supports the intermittent control hypothesis as an alternative to the conventional stiffness control hypothesis during human quiet upright stance.
Horizon Missions Methodology - Using new paradigms to overcome conceptual blocks to innovation
NASA Technical Reports Server (NTRS)
Anderson, John L.
1993-01-01
The Horizon Mission Methodology was developed to provide a systematic analytical approach for evaluating and identifying technological requirements for breakthrough technology options (BTOs) and for assessing their potential to provide revolutionary capabilities for advanced space missions. Here, attention is given to the further use of the methodology as a new tool for a broader range of studies dealing with technology innovation and new technology paradigms.
Applications of different design methodologies in navigation systems and development at JPL
NASA Technical Reports Server (NTRS)
Thurman, S. W.
1990-01-01
The NASA/JPL deep space navigation system consists of a complex array of measurement systems, data processing systems, and support facilities, with components located both on the ground and on board interplanetary spacecraft. From its beginnings nearly 30 years ago, this system has steadily evolved and grown to meet the demands for ever-increasing navigation accuracy placed on it by a succession of unmanned planetary missions. The principal characteristics of this system are its extensive capabilities and its great complexity. Three examples in the design and development of interplanetary space navigation systems are examined in order to make a brief assessment of the usefulness of three basic design theories, known as normative, rational, and heuristic. Evaluation of the examples indicates that a heuristic approach, coupled with rational-based mathematical and computational analysis methods, is used most often in problems such as orbit determination strategy development and mission navigation system design, while normative methods have seen only limited use in such applications as the development of large software systems and the design of certain operational navigation subsystems.
The Role of System Thinking Development and Experiential Learning on Enterprise Transformation
NASA Astrophysics Data System (ADS)
Lopez, Gabriel
The recent economic downturn has had global repercussions in all businesses alike. Competition is fierce and a survival of the fittest model is always present; fast delivery times and innovative designs ultimately translate into the enterprises' bottom line. In such market conditions, enterprises have to find ways to develop and train their workforce in a manner that enhances the innovative capabilities of the enterprise. Additionally, if companies are to stay competitive, they have to ensure critical skills in their workforce are transferred from generation to generation. This study builds on recent research on system-thinking development via experiential learning methodologies. First, a conceptual framework model was developed. This conceptual model captures a methodology to construct a system-thinking apprenticeship program suitable for system engineers. Secondly, a survey of system engineering professionals was conducted in order to assess and refine the proposed conceptual model. This dissertation captures the findings of the conceptual model and the implications of the study for enterprises and for system engineering organizations.
Framework for the Parametric System Modeling of Space Exploration Architectures
NASA Technical Reports Server (NTRS)
Komar, David R.; Hoffman, Jim; Olds, Aaron D.; Seal, Mike D., II
2008-01-01
This paper presents a methodology for performing architecture definition and assessment prior to, or during, program formulation that utilizes a centralized, integrated architecture modeling framework operated by a small, core team of general space architects. This framework, known as the Exploration Architecture Model for IN-space and Earth-to-orbit (EXAMINE), enables: 1) a significantly larger fraction of an architecture trade space to be assessed in a given study timeframe; and 2) the complex element-to-element and element-to-system relationships to be quantitatively explored earlier in the design process. Discussion of the methodology's advantages and disadvantages with respect to the distributed study team approach typically used within NASA to perform architecture studies is presented, along with an overview of EXAMINE's functional components and tools. An example Mars transportation system architecture model is used to demonstrate EXAMINE's capabilities in this paper. However, the framework is generally applicable for exploration architecture modeling with destinations to any celestial body in the solar system.
Performance Optimization Control of ECH using Fuzzy Inference Application
NASA Astrophysics Data System (ADS)
Dubey, Abhay Kumar
Electro-chemical honing (ECH) is a hybrid electrolytic precision micro-finishing technology that, by combining the physico-chemical actions of electro-chemical machining and conventional honing, provides controlled functional-surface generation and fast material removal in a single operation. Process multi-performance optimization has become vital for utilizing the full potential of manufacturing processes to meet the challenging requirements being placed on the surface quality, size, tolerances and production rate of engineering components in this globally competitive scenario. This paper presents a strategy that integrates Taguchi matrix experimental design, analysis of variance and a fuzzy inference system (FIS) to formulate a robust, practical multi-performance optimization methodology for complex manufacturing processes like ECH, which involve several control variables. Two methodologies, one using genetic algorithm tuning of the FIS (GA-tuned FIS) and another using an adaptive network-based fuzzy inference system (ANFIS), have been evaluated for a multi-performance optimization case study of ECH. The actual experimental results confirm their potential for the wide range of machining conditions employed in ECH.
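In Taguchi-style matrix experiments like this, replicated responses for each trial are commonly collapsed into a signal-to-noise ratio before further analysis or fuzzy aggregation. A minimal sketch of the larger-the-better form, with invented replicate values rather than the paper's data:

```python
# Hedged sketch: larger-the-better Taguchi signal-to-noise ratio per trial.
import math

def sn_larger_is_better(replicates):
    """S/N = -10*log10(mean(1/y^2)); higher is better."""
    return -10.0 * math.log10(sum(1.0 / y**2 for y in replicates) / len(replicates))

trial_runs = [[2.1, 2.3, 2.2], [2.8, 2.6, 2.9], [1.9, 2.0, 1.8]]  # hypothetical replicates
for i, reps in enumerate(trial_runs, 1):
    print(f"trial {i}: S/N = {sn_larger_is_better(reps):.2f} dB")
```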
Wave energy focusing to subsurface poroelastic formations to promote oil mobilization
NASA Astrophysics Data System (ADS)
Karve, Pranav M.; Kallivokas, Loukas F.
2015-07-01
We discuss an inverse source formulation aimed at focusing wave energy produced by ground surface sources to target subsurface poroelastic formations. The intent of the focusing is to facilitate or enhance the mobility of oil entrapped within the target formation. The underlying forward wave propagation problem is cast in two spatial dimensions for a heterogeneous poroelastic target embedded within a heterogeneous elastic semi-infinite host. The semi-infiniteness of the elastic host is simulated by augmenting the (finite) computational domain with a buffer of perfectly matched layers. The inverse source algorithm is based on a systematic framework of partial-differential-equation-constrained optimization. It is demonstrated, via numerical experiments, that the algorithm is capable of converging to the spatial and temporal characteristics of surface loads that maximize energy delivery to the target formation. Consequently, the methodology is well-suited for designing field implementations that could meet a desired oil mobility threshold. Even though the methodology, and the results presented herein are in two dimensions, extensions to three dimensions are straightforward.
FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES
Donnelly, Amanda
2017-06-01
This thesis develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. The analysis dissects and compares three potential design methodologies, including net assessment, scenarios, and...
A new pre-loaded beam geometric stiffness matrix with full rigid body capabilities
NASA Astrophysics Data System (ADS)
Bosela, P. A.; Fertis, D. G.; Shaker, F. J.
1992-09-01
Space structures, such as the Space Station solar arrays, must be extremely lightweight, flexible structures. Accurate prediction of the natural frequencies and mode shapes is essential for determining the structural adequacy of components and for designing a control system. The tension pre-load in the 'blanket' of photovoltaic solar collectors, and the free/free boundary conditions of a structure in space, raise serious reservations about the use of standard finite element techniques of solution. In particular, a phenomenon known as 'grounding', or false stiffening, of the stiffness matrix occurs during rigid body rotation. The authors have previously shown that the grounding phenomenon is caused by a lack of rigid body rotational capability, and is typical of beam geometric stiffness matrices formulated by others, including those which contain higher order effects. The cause of the problem was identified as the force imbalance inherent in the formulations. In this paper, the authors develop a beam geometric stiffness matrix for a directed force problem, and show that the resultant global stiffness matrix contains complete rigid body mode capabilities and performs very well in the diagonalization methodology customarily used in dynamic analysis.
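The "grounding" test the authors describe can be reproduced numerically: apply a small rigid-body rotation to an element and check whether spurious nodal forces appear. The sketch below does this for the textbook Euler-Bernoulli bending stiffness (which passes) and the classical consistent geometric stiffness of an axially preloaded beam (which exhibits the force imbalance noted above). Properties and preload are illustrative assumptions.

```python
# Hedged sketch: rigid-body rotation check on element stiffness matrices.
# DOFs are (v1, th1, v2, th2) for a 2-node planar bending element.
import numpy as np

E, I, L, P = 70e9, 1e-8, 2.0, 1000.0   # illustrative properties and axial preload

Ke = (E * I / L**3) * np.array([[12, 6*L, -12, 6*L],
                                [6*L, 4*L**2, -6*L, 2*L**2],
                                [-12, -6*L, 12, -6*L],
                                [6*L, 2*L**2, -6*L, 4*L**2]])

Kg = (P / (30*L)) * np.array([[36, 3*L, -36, 3*L],
                              [3*L, 4*L**2, -3*L, -L**2],
                              [-36, -3*L, 36, -3*L],
                              [3*L, -L**2, -3*L, 4*L**2]])

theta = 1e-3
d_rot = np.array([0.0, theta, L*theta, theta])   # small rigid-body rotation
print("elastic forces:  ", Ke @ d_rot)            # ~0: passes the rigid-body test
print("geometric forces:", Kg @ d_rot)            # nonzero: the force imbalance
```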
Imaging characteristics of photogrammetric camera systems
Welch, R.; Halliday, J.
1973-01-01
In view of the current interest in high-altitude and space photographic systems for photogrammetric mapping, the United States Geological Survey (U.S.G.S.) undertook a comprehensive research project designed to explore the practical aspects of applying the latest image quality evaluation techniques to the analysis of such systems. The project had two direct objectives: (1) to evaluate the imaging characteristics of current U.S.G.S. photogrammetric camera systems; and (2) to develop methodologies for predicting the imaging capabilities of photogrammetric camera systems, comparing conventional systems with new or different types of systems, and analyzing the image quality of photographs. Image quality was judged in terms of a number of evaluation factors including response functions, resolving power, and the detectability and measurability of small detail. The limiting capabilities of the U.S.G.S. 6-inch and 12-inch focal length camera systems were established by analyzing laboratory and aerial photographs in terms of these evaluation factors. In the process, the contributing effects of relevant parameters such as lens aberrations, lens aperture, shutter function, image motion, film type, and target contrast were assessed, yielding procedures for analyzing image quality and for predicting and comparing performance capabilities. © 1973.
NASA Astrophysics Data System (ADS)
Udell, C.; Selker, J. S.
2017-12-01
The increasing availability and functionality of Open-Source software and hardware along with 3D printing, low-cost electronics, and proliferation of open-access resources for learning rapid prototyping are contributing to fundamental transformations and new technologies in environmental sensing. These tools invite reevaluation of time-tested methodologies and devices toward more efficient, reusable, and inexpensive alternatives. Building upon Open-Source design facilitates community engagement and invites a Do-It-Together (DIT) collaborative framework for research where solutions to complex problems may be crowd-sourced. However, barriers persist that prevent researchers from taking advantage of the capabilities afforded by open-source software, hardware, and rapid prototyping. Some of these include: requisite technical skillsets, knowledge of equipment capabilities, identifying inexpensive sources for materials, money, space, and time. A university MAKER space staffed by engineering students to assist researchers is one proposed solution to overcome many of these obstacles. This presentation investigates the unique capabilities the USDA-funded Openly Published Environmental Sensing (OPEnS) Lab affords researchers, within Oregon State and internationally, and the unique functions these types of initiatives support at the intersection of MAKER spaces, Open-Source academic research, and open-access dissemination.
Planned Environmental Microbiology Aspects of Future Lunar and Mars Missions
NASA Technical Reports Server (NTRS)
Ott, C. Mark; Castro, Victoria A.; Pierson, Duane L.
2006-01-01
With the establishment of the Constellation Program, NASA has initiated efforts designed similar to the Apollo Program to return to the moon and subsequently travel to Mars. Early lunar sorties will take 4 crewmembers to the moon for 4 to 7 days. Later missions will increase in duration up to 6 months as a lunar habitat is constructed. These missions and vehicle designs are the forerunners of further missions destined for human exploration of Mars. Throughout the planning and design process, lessons learned from the International Space Station (ISS) and past programs will be implemented toward future exploration goals. The standards and requirements for these missions will vary depending on life support systems, mission duration, crew activities, and payloads. From a microbiological perspective, preventative measures will remain the primary techniques to mitigate microbial risk. Thus, most of the effort will focus on stringent preflight monitoring requirements and engineering controls designed into the vehicle, such as HEPA air filters. Due to volume constraints in the CEV, in-flight monitoring will be limited for short-duration missions to the measurement of biocide concentration for water potability. Once long-duration habitation begins on the lunar surface, a more extensive environmental monitoring plan will be initiated. However, limited in-flight volume constraints and the inability to return samples to Earth will increase the need for crew capabilities in determining the nature of contamination problems and method of remediation. In addition, limited shelf life of current monitoring hardware consumables and limited capabilities to dispose of biohazardous trash will drive flight hardware toward non-culture based methodologies, such as hardware that rapidly distinguishes biotic versus abiotic surface contamination. As missions progress to Mars, environmental systems will depend heavily on regeneration of air and water and biological waste remediation and regeneration systems, increasing the need for environmental monitoring. Almost complete crew autonomy will be needed for assessment and remediation of contamination problems. Cabin capacity will be limited; thus, current methods of microbial monitoring will be inadequate. Future methodology must limit consumables, and these consumables must have a shelf life of over three years. In summary, missions to the moon and Mars will require a practical design that prudently uses available resources to mitigate microbial risk to the crew.
The Interpretative Phenomenological Analysis (IPA): A Guide to a Good Qualitative Research Approach
ERIC Educational Resources Information Center
Alase, Abayomi
2017-01-01
As a research methodology, qualitative research method infuses an added advantage to the exploratory capability that researchers need to explore and investigate their research studies. Qualitative methodology allows researchers to advance and apply their interpersonal and subjectivity skills to their research exploratory processes. However, in a…
Using Design Capability Indices to Satisfy Ranged Sets of Design Requirements
NASA Technical Reports Server (NTRS)
Chen, Wei; Allen, Janet K.; Simpson, Timothy W.; Mistree, Farrokh
1996-01-01
For robust design it is desirable to allow the design requirements to vary within a certain range rather than setting point targets. This is particularly important during the early stages of design when little is known about the system and its requirements. Toward this end, design capability indices are developed in this paper to assess the capability of a family of designs, represented by a range of top-level design specifications, to satisfy a ranged set of design requirements. Design capability indices are based on process capability indices from statistical process control and provide a single objective, alternate approach to the use of Taguchi's signal-to-noise ratio which is often used for robust design. Successful implementation of design capability indices ensures that a family of designs conforms to a given ranged set of design requirements. To demonstrate an application and the usefulness of design capability indices, the design of a solar powered irrigation system is presented. Our focus in this paper is on the development and implementation of design capability indices as an alternate approach to the use of the signal-to-noise ratio and not on the results of the example problem, per se.
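The analogy with statistical process control can be made concrete: a process capability index such as Cpk scores how well a distribution sits inside specification limits, and a design capability index plays the same role for a family of designs against a ranged requirement. The sketch below shows that one-dimensional analogy only; the numbers are invented, and the paper's formulation is richer than this.

```python
# Hedged sketch: a Cpk-style capability index for a ranged design requirement.
def design_capability_index(mu, sigma, lrl, url):
    """Smaller margin to either requirement limit, in units of 3 sigma."""
    return min(url - mu, mu - lrl) / (3.0 * sigma)

# Ranged requirement: delivered power between 4.0 and 6.0 kW; the design family
# achieves 5.2 +/- 0.25 kW across its top-level specification range (assumed).
cdk = design_capability_index(mu=5.2, sigma=0.25, lrl=4.0, url=6.0)
print(f"Cdk = {cdk:.2f}  ({'capable' if cdk >= 1.0 else 'not capable'})")
```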
Analysis and design of continuous class-E power amplifier at sub-nominal condition
NASA Astrophysics Data System (ADS)
Chen, Peng; Yang, Kai; Zhang, Tianliang
2017-12-01
A continuous-mode class-E power amplifier operating at the sub-nominal condition is proposed in this paper. Continuous mode means the class-E power amplifier can remain highly efficient with a series matching network, while the sub-nominal condition means only the zero-voltage-switching condition is required. Compared with the classical class-E power amplifier, the proposed design method releases two additional design freedoms, which increase the class-E power amplifier's design flexibility. The proposed continuous class-E power amplifier at the sub-nominal condition can also maintain high efficiency over a broad bandwidth. The performance of the continuous class-E power amplifier at the sub-nominal condition is derived and the design procedure is summarised. The normalised switch voltage and current waveforms are investigated. Furthermore, the influences of different sub-nominal conditions on the power losses of the switch on-resistance and on the output power capability are also discussed. A broadband continuous class-E power amplifier based on a Gallium Nitride (GaN) transistor was designed and tested to verify the proposed design methodology. The measurement results show it can deliver 10-15 W output power with 64-73% power-added efficiency over 1.4-2.8 GHz.
Wu, Frances M; Rundall, Thomas G; Shortell, Stephen M; Bloom, Joan R
2016-06-20
Purpose - The purpose of this paper is to describe the current landscape of health information technology (HIT) in early accountable care organizations (ACOs), the different strategies ACOs are using to develop HIT-based capabilities, and how ACOs are using these capabilities within their care management processes to advance health outcomes for their patient population. Design/methodology/approach - Mixed methods study pairing data from a cross-sectional National Survey of ACOs with in-depth, semi-structured interviews with leaders from 11 ACOs (both completed in 2013). Findings - Early ACOs vary widely in their electronic health record, data integration, and analytic capabilities. The most common HIT capability was drug-drug and drug-allergy interaction checks, with 53.2 percent of respondents reporting that the ACO possessed the capability to a high degree. Outpatient and inpatient data integration was the least common HIT capability (8.1 percent). In the interviews, ACO leaders commented on different HIT development strategies to gain a more comprehensive picture of patient needs and service utilization. ACOs realize the necessity for robust data analytics, and are exploring a variety of approaches to achieve it. Research limitations/implications - Data are self-reported. The qualitative portion was based on interviews with 11 ACOs, limiting generalizability to the universe of ACOs but allowing for a range of responses. Practical implications - ACOs are challenged with the development of sophisticated HIT infrastructure. They may benefit from targeted assistance and incentives to implement health information exchanges with other providers to promote more coordinated care management for their patient population. Originality/value - Using new empirical data, this study increases understanding of the extent of ACOs' current and developing HIT capabilities to support ongoing care management.
Experimental Design of a UCAV-Based High-Energy Laser Weapon
2016-12-01
The Design of Experiments (DOE) methodology is then applied to determine the significance of the UCAV-HEL design parameters and their effect on the...
Management of health care expenditure by soft computing methodology
NASA Astrophysics Data System (ADS)
Maksimović, Goran; Jović, Srđan; Jovanović, Radomir; Aničić, Obrad
2017-01-01
In this study, health care expenditure was analyzed using soft computing methodology. The main goal was to predict gross domestic product (GDP) from several factors of health care expenditure. Soft computing methodologies were applied since GDP prediction is a very complex task. The performance of the proposed predictors was confirmed by simulation results. According to the results, support vector regression (SVR) has better prediction accuracy than the other soft computing methodologies. The methods benefit from soft computing's global optimization capabilities, which help to avoid local-minimum issues.
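As an illustration of the SVR approach, the sketch below regresses a synthetic "GDP" response on three stand-in expenditure factors using scikit-learn. The data, feature meanings, and hyperparameters are invented; the study's actual inputs and preprocessing are not reproduced here.

```python
# Hedged sketch: support vector regression on synthetic expenditure factors.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 3))   # e.g., public, private, out-of-pocket spend
y = 2.0*X[:, 0] + 0.5*X[:, 1]**2 - X[:, 2] + rng.normal(0, 0.05, 200)  # synthetic "GDP"

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(X[:150], y[:150])            # train on the first 150 samples
pred = model.predict(X[150:])          # evaluate on the held-out 50
print("RMSE:", np.sqrt(np.mean((pred - y[150:])**2)))
```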
Grounding language in action and perception: From cognitive agents to humanoid robots
NASA Astrophysics Data System (ADS)
Cangelosi, Angelo
2010-06-01
In this review we concentrate on a grounded approach to the modeling of cognition through the methodologies of cognitive agents and developmental robotics. This work will focus on the modeling of the evolutionary and developmental acquisition of linguistic capabilities based on the principles of symbol grounding. We review cognitive agent and developmental robotics models of the grounding of language to demonstrate their consistency with the empirical and theoretical evidence on language grounding and embodiment, and to reveal the benefits of such an approach in the design of linguistic capabilities in cognitive robotic agents. In particular, three different models will be discussed, where the complexity of the agent's sensorimotor and cognitive system gradually increases: from a multi-agent simulation of language evolution, to a simulated robotic agent model for symbol grounding transfer, to a model of language comprehension in the humanoid robot iCub. The review also discusses the benefits of using humanoid robotic platforms, and specifically the open-source iCub platform, for the study of embodied cognition.
Shared services centers and work sustainability: which contributions from ergonomics?
Arnoud, Justine; Falzon, Pierre
2012-01-01
This study examines the way in which Shared Services Centers (SSCs) were implemented in a French multinational company. It aims to characterize the change according to the capabilities model developed by Amartya Sen: what are the effects of SSCs in terms of capabilities development and the developmental quality of work, i.e. the enabling potential of work? A 3-step methodology was used: first, an investigation was conducted in a pay service of a local entity moving into an SSC in 2013; second, two investigations were conducted in another pay service of an SSC, the first a few months after the change and the second one year after the change (the same operators were interviewed). Results show a tendency toward a decrease in the enabling potential of work. Additionally, it was noted that administrators are kept away from the design process and have to struggle with inappropriate rules. The efficiency and sustainability of the SSC are questioned; in this context, the human factors specialist has an important role to play.
NASA Technical Reports Server (NTRS)
Glass, B. J.; Hack, E. C.
1990-01-01
A knowledge-based control system for real-time control and fault detection, isolation and recovery (FDIR) of a prototype two-phase Space Station Freedom external thermal control system (TCS) is discussed in this paper. The Thermal Expert System (TEXSYS) has been demonstrated in recent tests to be capable of both fault anticipation and detection and real-time control of the thermal bus. Performance requirements were achieved by using a symbolic control approach, layering model-based expert system software on a conventional numerical data acquisition and control system. The model-based capabilities of TEXSYS were shown to be advantageous during software development and testing. One representative example is given from on-line TCS tests of TEXSYS. The integration and testing of TEXSYS with a live TCS testbed provides some insight on the use of formal software design, development and documentation methodologies to qualify knowledge-based systems for on-line or flight applications.
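The layering idea described above, symbolic rules evaluated over conventional numeric telemetry, can be sketched as follows. The rule names, signals, and thresholds are invented for illustration and are not from TEXSYS.

```python
# Hypothetical illustration of layering symbolic FDIR rules over numeric
# telemetry, in the spirit of the architecture described above.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]   # fires when True
    action: str

RULES = [
    Rule("evaporator-dryout", lambda t: t["evap_exit_superheat_K"] > 5.0,
         "reduce heat load; open bypass valve"),
    Rule("pump-cavitation", lambda t: t["pump_inlet_subcool_K"] < 1.0,
         "raise accumulator pressure setpoint"),
    Rule("bus-overtemp", lambda t: t["bus_temp_K"] > 280.0,
         "shed lowest-priority load"),
]

def fdir_step(telemetry: dict) -> list:
    """One symbolic-control pass over a numeric telemetry snapshot."""
    return [f"{r.name}: {r.action}" for r in RULES if r.condition(telemetry)]

sample = {"evap_exit_superheat_K": 6.2, "pump_inlet_subcool_K": 3.0,
          "bus_temp_K": 283.0}
for recommendation in fdir_step(sample):
    print(recommendation)
```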
Controlling protected designation of origin of wine by Raman spectroscopy.
Mandrile, Luisa; Zeppa, Giuseppe; Giovannozzi, Andrea Mario; Rossi, Andrea Mario
2016-11-15
In this paper, a Fourier transform Raman spectroscopy method to authenticate the provenance of wine for food traceability applications was developed. In particular, owing to the specific chemical fingerprint of the Raman spectrum, it was possible to discriminate different wines produced in the Piedmont area (North West Italy) in accordance with i) grape variety, ii) production area, and iii) ageing time. In order to create a consistent training set, more than 300 samples from tens of different producers were analyzed, and a chemometric treatment of the raw spectra was applied. A discriminant analysis method was employed in the classification procedures, providing a classification capability (percentage of correct answers) of 90% in validation for grape variety and geographical provenance, and a classification capability of 84% for ageing time. The present methodology was applied successfully to raw materials without any preliminary treatment of the sample, providing a response in a very short time. Copyright © 2016 Elsevier Ltd. All rights reserved.
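A sketch of such a chemometric pipeline on synthetic spectra (not the paper's data; the preprocessing choices and component counts are assumptions):

```python
# Illustrative discriminant analysis on preprocessed synthetic "Raman" spectra.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_per_class, n_points = 100, 600          # 300 samples, 600 wavenumber bins
spectra, labels = [], []
for cls in range(3):                      # e.g. three grape varieties
    peak = 150 + 120 * cls                # class-specific band (made up)
    band = np.exp(-0.5 * ((np.arange(n_points) - peak) / 15.0) ** 2)
    spectra.append(band + 0.3 * rng.normal(size=(n_per_class, n_points)))
    labels += [cls] * n_per_class
X, y = np.vstack(spectra), np.array(labels)

# SNV-style row normalization, then PCA to decorrelate, then LDA.
X = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
clf = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"cross-validated classification capability: {100 * acc:.1f}%")
```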
Handling and safety enhancement of race cars using active aerodynamic systems
NASA Astrophysics Data System (ADS)
Diba, Fereydoon; Barari, Ahmad; Esmailzadeh, Ebrahim
2014-09-01
A methodology is presented in this work that employs active inverted wings to enhance road holding by increasing the downward force on the tyres. In the proposed active system, the angles of attack of the vehicle's wings are adjusted by a real-time controller to increase road holding and hence improve vehicle handling. The handling of the race car and the safety of the driver are two important concerns in the design of race cars. The handling of a vehicle depends on the dynamic capabilities of the vehicle and also on the pneumatic tyres' limitations. The vehicle side-slip angle, as a measure of dynamic safety, should be kept within an acceptable range. This paper demonstrates that active inverted wings can provide noteworthy dynamic capabilities and enhance the safety features of race cars. A detailed analytical study and formulation of the race car's nonlinear model with the airfoils are presented. Computer simulations are carried out to evaluate the performance of the proposed active aerodynamic system.
2018-06-07
Gaming Space: A Game-Theoretic Methodology for Assessing the Deterrent Value of Space Control Options. ...in space. Adversaries have already employed non-kinetic OSC capabilities, such as Global Positioning System jammers, in recent conflicts, and they... The work was performed as part of the project "Assessing the Deterrent Value of Defensive Space Control Options." The purpose of the project was to develop a methodology...
NASA Technical Reports Server (NTRS)
Polotzky, Anthony S.; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek
1990-01-01
The development of a controller performance evaluation (CPE) methodology for multi-input/multi-output (MIMO) digital control systems is described. The equations used to obtain the open-loop plant, controller transfer matrices, and return-difference matrices are given. Results of applying the CPE methodology to evaluate MIMO digital flutter suppression systems being tested on an active flexible wing wind-tunnel model are presented to demonstrate the CPE capability.
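A hedged sketch of the central quantity named above, the return-difference matrix I + G(jw)K(jw), evaluated over frequency for a toy 2x2 plant and controller (both invented for illustration, not the wind-tunnel model):

```python
import numpy as np

def plant(s):        # toy 2x2 open-loop plant G(s) (illustrative)
    return np.array([[1.0 / (s + 1.0), 0.2 / (s + 2.0)],
                     [0.1 / (s + 1.5), 1.0 / (s + 0.5)]])

def controller(s):   # toy diagonal PI controller K(s) (illustrative)
    g = 2.0 + 1.0 / s
    return np.diag([g, g])

omega = np.logspace(-2, 2, 400)
min_sv = []
for w in omega:
    s = 1j * w
    rd = np.eye(2) + plant(s) @ controller(s)   # return-difference matrix
    min_sv.append(np.linalg.svd(rd, compute_uv=False).min())

# The minimum singular value of I + GK over frequency is a classical
# multivariable stability-margin indicator (small values flag poor margins).
print(f"min over frequency of sigma_min(I + GK): {min(min_sv):.3f}")
```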
Reliability-Based Design Optimization of a Composite Airframe Component
NASA Technical Reports Server (NTRS)
Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.
2009-01-01
A stochastic design optimization (SDO) methodology has been developed to design components of an airframe structure that can be made of metallic and composite materials. The design is obtained as a function of the risk level, or reliability, p. The design method treats uncertainties in load, strength, and material properties as distribution functions, which are defined with mean values and standard deviations. A design constraint or a failure mode is specified as a function of reliability p. The solution to the stochastic optimization yields the weight of a structure as a function of reliability p. Optimum weight versus reliability p traces out an inverted-S-shaped graph. The center of the inverted-S graph corresponds to a 50 percent (p = 0.5) probability of success. A heavy design with weight approaching infinity would be required for a near-zero rate of failure, corresponding to a reliability of unity (p = 1). Weight can be reduced to a small value for the most failure-prone design, with a reliability that approaches zero (p = 0). Reliability can be varied for different components of an airframe structure. For example, the landing gear can be designed for very high reliability, whereas the reliability can be relaxed somewhat for a raked wingtip. The SDO capability is obtained by combining three codes: (1) the MSC/Nastran code is the deterministic analysis tool, (2) the fast probabilistic integrator (the FPI module of the NESSUS software) is the probabilistic calculator, and (3) NASA Glenn Research Center's optimization testbed CometBoards is the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated with an academic example and a real-life raked wingtip structure of the Boeing 767-400 extended range airliner made of metallic and composite materials.
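A minimal sketch of how the inverted-S behavior arises, using an illustrative normal-strength model (an assumption for exposition, not the NESSUS/CometBoards computation):

```python
# Weight versus reliability p when the required capacity scales with the
# normal inverse CDF; purely illustrative proportionality.
import numpy as np
from scipy.stats import norm

p = np.array([0.01, 0.10, 0.50, 0.90, 0.99, 0.999, 0.999999])
mu_strength, cov = 1.0, 0.15            # mean strength, coefficient of variation
# Capacity needed so that P(strength > demand) = p under a normal model:
required = mu_strength * (1.0 + cov * norm.ppf(p))
weight = np.maximum(required, 0.05)     # toy proportionality: weight ~ capacity

for pi, wi in zip(p, weight):
    print(f"p = {pi:>9.6f}   relative weight = {wi:6.3f}")
# Weight is small as p -> 0, crosses the mean design at p = 0.5, and grows
# without bound as p -> 1 (norm.ppf diverges): the inverted-S shape.
```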
Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid
2017-01-01
Background Design processes such as human-centered design (HCD), which involve the end user throughout the product development and testing process, can be crucial in ensuring that a product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of HCD can often conflict with the rapid product development life-cycles of the competitive connected health industry. Objective The aim of this study was to apply a structured HCD methodology to the development of a smartphone app to be used within a connected health fall risk detection system. Our methodology utilizes so-called discount usability engineering techniques to minimize the burden on resources during development and maintain a rapid pace of development. This study provides prospective designers with a detailed description of the application of an HCD methodology. Methods A 3-phase methodology was applied. In the first phase, a descriptive "use case" was developed by the system designers and analyzed by both expert stakeholders and end users. The use case described the use of the app and how various actors would interact with it and in what context. A working app prototype and a user manual were then developed based on this feedback and were subjected to a rigorous usability inspection. Further changes were made both to the interface and the support documentation. The now advanced prototype was exposed to user testing by end users, where further design recommendations were made. Results Combined expert and end-user analysis of a comprehensive use case originally identified 21 problems with the system interface; only 3 of these problems were observed in user testing, implying that 18 were eliminated between phases 1 and 3. Satisfactory ratings were obtained during validation testing by both experts and end users, and final testing by users showed that the system requires low mental, physical, and temporal demands according to the NASA Task Load Index (NASA-TLX). Conclusions From our observation of older adults' interactions with smartphone interfaces, some recurring themes emerged. Clear and relevant feedback as the user attempts to complete a task is critical. Feedback should include pop-ups, sound tones, color or texture changes, or icon changes to indicate that a function has been completed successfully, such as for the connection sequence. For text feedback, clear and unambiguous language should be used so as not to create anxiety, particularly when it comes to saving data. Warning tones or symbols, such as caution symbols or shrill tones, should only be used if absolutely necessary. Our HCD methodology, designed and implemented based on the principles of the International Organization for Standardization (ISO) 9241-210 standard, produced a functional app interface within a short production cycle, which is now suitable for use by older adults in long-term clinical trials. PMID:28559227
ERIC Educational Resources Information Center
Khalil, Deena; Kier, Meredith
2017-01-01
This article is about introducing Critical Race Design (CRD), a research methodology that centers race and equity at the nucleus of educational opportunities by design. First, the authors define design-based implementation research (DBIR; Penuel, Fishman, Cheng, & Sabelli, 2011) as an equity-oriented education research methodology where…
Decision making in prioritization of required operational capabilities
NASA Astrophysics Data System (ADS)
Andreeva, P.; Karev, M.; Kovacheva, Ts.
2015-10-01
The paper describes an expert heuristic approach to the prioritization of required operational capabilities in the field of defense. Based on expert assessment and application of the Analytic Hierarchy Process (AHP), a methodology for their prioritization has been developed. It has been applied in practical simulated decision-making games.
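A minimal AHP sketch (the capability names and pairwise judgments are hypothetical, not the paper's data): derive priority weights from the principal eigenvector of a pairwise comparison matrix and check consistency.

```python
import numpy as np

capabilities = ["C2", "ISR", "Mobility", "Sustainment"]   # hypothetical names
# Saaty-scale pairwise comparisons: A[i, j] = importance of i relative to j.
A = np.array([[1.0, 3.0, 5.0, 7.0],
              [1/3, 1.0, 3.0, 5.0],
              [1/5, 1/3, 1.0, 3.0],
              [1/7, 1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # priority vector

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)           # consistency index
cr = ci / 0.90                                 # Saaty random index for n = 4
for name, wi in sorted(zip(capabilities, w), key=lambda t: -t[1]):
    print(f"{name:12s} weight = {wi:.3f}")
print(f"consistency ratio CR = {cr:.3f} (CR < 0.1 is conventionally acceptable)")
```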
Utility of Army Design Methodology in U.S. Coast Guard Counter Narcotic Interdiction Strategy
2017-06-09
Utility of Army Design Methodology in U.S. Coast Guard Counter Narcotic Interdiction Strategy: a thesis presented to the... (Report-form excerpt; dates covered AUG 2016 - JUN 2017; distribution is unlimited.) This study investigates the utility of using Army Design Methodology (ADM) to...
2017-11-01
ARL-TR-8225, November 2017, US Army Research Laboratory: Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques.
NASA Astrophysics Data System (ADS)
Ahmed, Ammar; Arthur, Craig; Edwards, Mark
2010-06-01
Bulk electricity transmission lines are linear assets that can be very exposed to wind effects, particularly where they traverse steep topography or open coastal terrain in cyclonic regions. The interconnected nature of the lattice-type towers and conductors also presents complex vulnerabilities. These relate to the direction of wind attack relative to the conductors and to cascading failure mechanisms, in which the failure of a single tower propagates to neighbouring towers. Such behaviour is exacerbated by the finely tuned nature of tower design, which serves to minimize cost and reserve strength at design wind speeds. There is a clear need to better quantify the interdependent vulnerabilities of these critical infrastructure assets in the context of the severe wind hazard. This paper presents a novel methodology developed for the Critical Infrastructure Protection Modelling and Analysis (CIPMA) capability for assessing local wind speeds and the likelihood of tower failure for a range of transmission tower and conductor types. CIPMA is a program managed by the Federal Attorney-General's Department, and Geoscience Australia is leading the technical development. The methodology involves the development of heuristically derived vulnerability models that are consistent with Australian industry experience and full-scale static tower testing results, considering isolated tower loss along with three interdependent failure mechanisms to give overall likelihoods of failure.
Design and performance of energy efficient propellers for Mach 0.8 cruise
NASA Technical Reports Server (NTRS)
Mikkelson, D. C.; Blaha, B. J.; Mitchell, G. A.; Wikete, J. E.
1977-01-01
The increased emphasis on fuel conservation in the world has stimulated a series of studies of both conventional and unconventional propulsion systems for commercial aircraft. Preliminary results from these studies indicate that a fuel saving of 14 to 40 percent may be realized by the use of an advanced high-speed turboprop. This turboprop must be capable of high efficiency at Mach 0.8 cruise above 9.144 km altitude if it is to compete with turbofan powered commercial aircraft. Several advanced aerodynamic concepts were investigated in recent wind tunnel tests under NASA sponsorship on two propeller models. These concepts included aerodynamically integrated propeller/nacelles, area ruling, blade sweep, reduced blade thickness and power (disk) loadings several times higher than conventional designs. The aerodynamic design methodology for these models is discussed. In addition, some of the preliminary test results are presented which indicate that propeller net efficiencies near 80 percent were obtained for high disk loading propellers operating at Mach 0.8.
Driver face tracking using semantics-based feature of eyes on single FPGA
NASA Astrophysics Data System (ADS)
Yu, Ying-Hao; Chen, Ji-An; Ting, Yi-Siang; Kwok, Ngaiming
2017-06-01
Tracking the driver's face is essential for driving safety control. Such systems are usually designed with complicated face-recognition algorithms running on powerful computers. The design problem concerns not only the detection rate but also component damage under rigorous environments involving vibration, heat, and humidity. A feasible strategy to counteract these damages is to integrate the entire system into a single chip in order to achieve minimum installation dimensions, weight, power consumption, and exposure to air. Meanwhile, an extraordinary methodology is also indispensable to overcome the dilemma of low computing capability versus real-time performance on a low-end chip. In this paper, a novel driver face tracking system is proposed that employs semantics-based vague image representation (SVIR) for minimum hardware resource usage on an FPGA, while real-time performance is guaranteed at the same time. Our experimental results indicate that the proposed face tracking system is viable and promising for future smart car designs.
NASA Astrophysics Data System (ADS)
Gorelick, Steven M.; Voss, Clifford I.; Gill, Philip E.; Murray, Walter; Saunders, Michael A.; Wright, Margaret H.
1984-04-01
A simulation-management methodology is demonstrated for the rehabilitation of aquifers that have been subjected to chemical contamination. Finite element groundwater flow and contaminant transport simulation are combined with nonlinear optimization. The model is capable of determining well locations plus pumping and injection rates for groundwater quality control. Examples demonstrate linear or nonlinear objective functions subject to linear and nonlinear simulation and water management constraints. Restrictions can be placed on hydraulic heads, stresses, and gradients, in addition to contaminant concentrations and fluxes. These restrictions can be distributed over space and time. Three design strategies are demonstrated for an aquifer that is polluted by a constant contaminant source: they are pumping for contaminant removal, water injection for in-ground dilution, and a pumping, treatment, and injection cycle. A transient model designs either contaminant plume interception or in-ground dilution so that water quality standards are met. The method is not limited to these cases. It is generally applicable to the optimization of many types of distributed parameter systems.
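A hedged sketch of the simulation-management idea on a toy linear-response aquifer (the response coefficients, limits, and well count are invented for illustration, not the paper's finite element model):

```python
# Choose pumping rates that minimize total pumping while keeping the
# concentration at a compliance point below a standard and drawdown bounded.
import numpy as np
from scipy.optimize import minimize

# Hypothetical response coefficients per unit pumping at two candidate wells:
dC = np.array([0.8, 0.5])      # mg/L concentration reduction per (m^3/day)
dH = np.array([0.02, 0.03])    # m of drawdown per (m^3/day)
C0, C_std = 12.0, 5.0          # ambient and allowed concentration, mg/L
H_max = 1.5                    # maximum allowed drawdown, m

cost = lambda q: q.sum()                                   # total pumping
constraints = [
    {"type": "ineq", "fun": lambda q: dC @ q - (C0 - C_std)},  # C <= C_std
    {"type": "ineq", "fun": lambda q: H_max - dH @ q},         # drawdown cap
]
res = minimize(cost, x0=np.array([5.0, 5.0]), constraints=constraints,
               bounds=[(0.0, None)] * 2, method="SLSQP")
print("optimal pumping rates (m^3/day):", np.round(res.x, 2),
      "  resulting concentration:", round(C0 - dC @ res.x, 2), "mg/L")
```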
Project WISH: The Emerald City, phase 2
NASA Technical Reports Server (NTRS)
1991-01-01
The purpose of the Permanently Manned Autonomous Space Oasis, designated Project WISH: The Emerald City, is to serve as permanent living quarters for space colonists. In addition, it will serve as a stopover for space missions and will be capable of restationing itself practically anywhere within the solar system to provide support for these missions. The station should be self-sufficient, with no specific dependence on any resources from Earth. The 1990 to 1991 design team continued work started by last year's class. Further studies were conducted in the areas of orbital mechanics, propulsion, attitude control, and human factors. Critical elements were identified in each of these areas, and guidelines were established for the design of the Emerald City. Using the knowledge gained from these studies, two particular missions of interest, a Saturn Envelope mission and an Earth to Mars mission, were examined. The size and mass estimates, along with the methodologies used in their determination, are considered to be the main accomplishments of phase 2.
NASA Astrophysics Data System (ADS)
Shobeiri, Vahid; Ahmadi-Nedushan, Behrouz
2017-12-01
This article presents a method for the automatic generation of optimal strut-and-tie models in reinforced concrete structures using a bi-directional evolutionary structural optimization method. The methodology presented is developed for compliance minimization relying on the Abaqus finite element software package. The proposed approach deals with the generation of truss-like designs in a three-dimensional environment, addressing the design of corbels and joints as well as bridge piers and pile caps. Several three-dimensional examples are provided to show the capabilities of the proposed framework in finding optimal strut-and-tie models in reinforced concrete structures and verifying its efficiency to cope with torsional actions. Several issues relating to the use of the topology optimization for strut-and-tie modelling of structural concrete, such as chequerboard patterns, mesh-dependency and multiple load cases, are studied. In the last example, a design procedure for detailing and dimensioning of the strut-and-tie models is given according to the American Concrete Institute (ACI) 318-08 provisions.
NASA Technical Reports Server (NTRS)
Koltai, Kolina Sun; Ho, Nhut; Masequesmay, Gina; Niedober, David; Skoog, Mark; Johnson, Walter; Cacanindin, Artemio
2014-01-01
This paper discusses a case study that examined the influence of culture, organization, and automation capability upon human trust in, and reliance on, automation. In particular, this paper focuses on the design and application of an extended case study methodology, and on the foundational lessons revealed by it. Experimental test pilots involved in the research and development of the US Air Force's newly developed Automatic Ground Collision Avoidance System served as the context for this examination. An eclectic, multi-pronged approach was designed to conduct this case study, and proved effective in addressing the challenges associated with the case's politically sensitive and military environment. Key results indicate that the system design was in alignment with pilot culture and organizational mission, indicating the potential for appropriate trust development in operational pilots. Key factors include the low-vulnerability/high-risk nature of the pilot profession, automation transparency and suspicion, system reputation, and the setup of and communications among the organizations involved in the system development.
Reibling, Nadine
2013-09-01
This paper outlines the capabilities of pooled cross-sectional time series methodology for the international comparison of health system performance in population health. It shows how common model specifications can be improved so that they not only better address the specific nature of time series data on population health but are also more closely aligned with our theoretical expectations of the effect of healthcare systems. Three methodological innovations for this field of applied research are discussed: (1) how dynamic models help us understand the timing of effects, (2) how parameter heterogeneity can be used to compare performance across countries, and (3) how multiple imputation can be used to deal with incomplete data. We illustrate these methodological strategies with an analysis of infant mortality rates in 21 OECD countries between 1960 and 2008 using OECD Health Data. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
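A minimal sketch of the first strategy, a dynamic specification with a lagged dependent variable and country fixed effects, on a synthetic panel (not OECD Health Data; coefficients and noise levels are invented):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
countries, years = [f"c{i}" for i in range(21)], range(1960, 2009)
rows = []
for c in countries:
    alpha = rng.normal(0, 0.3)           # country-specific level
    y = 40.0 + rng.normal(0, 2)          # starting infant mortality rate
    for t in years:
        spend = 2.0 + 0.1 * (t - 1960) + rng.normal(0, 0.2)  # health spending
        y = 0.9 * y - 0.5 * spend + alpha + rng.normal(0, 0.5)
        rows.append({"country": c, "year": t, "imr": y, "spend": spend})
df = pd.DataFrame(rows)
df["imr_lag"] = df.groupby("country")["imr"].shift(1)
df = df.dropna()

# Dynamic specification: the lagged term separates short- and long-run
# effects; long-run effect of spending = beta_spend / (1 - beta_lag).
fit = smf.ols("imr ~ imr_lag + spend + C(country)", data=df).fit()
b_lag, b_sp = fit.params["imr_lag"], fit.params["spend"]
print(f"short-run effect {b_sp:.3f}, long-run effect {b_sp / (1 - b_lag):.3f}")
```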
MRI-guided robotics at the U of Houston: evolving methodologies for interventions and surgeries.
Tsekos, Nikolaos V
2009-01-01
Currently, we witness the rapid evolution of minimally invasive surgeries (MIS) and image-guided interventions (IGI), offering improved patient management and cost effectiveness. It is well recognized that sustaining and expanding this paradigm shift will require new computational methodology that integrates sensing with multimodal imaging, actively controlled robotic manipulators, the patient, and the operator. Such an approach would include (1) assessing in real time the tissue deformation secondary to the procedure and to physiologic motion, (2) monitoring the tool(s) in 3D, and (3) updating on the fly information about the pathophysiology of the targeted tissue. With those capabilities, real-time image guidance may facilitate a paradigm shift and methodological leap from "keyhole" visualization (i.e., endoscopy or laparoscopy) to one that uses a volumetric and informationally rich perception of the Area of Operation (AoO). This capability may eventually enable IGI and MIS of a wider range and level of complexity.
Satellite vulnerability to space debris - an improved 3D risk assessment methodology
NASA Astrophysics Data System (ADS)
Grassi, Lilith; Tiboldo, Francesca; Destefanis, Roberto; Donath, Thérèse; Winterboer, Arne; Evans, Leanne; Janovsky, Rolf; Kempf, Scott; Rudolph, Martin; Schäfer, Frank; Gelhaus, Johannes
2014-06-01
The work described in the present paper, performed as part of the P2 project, presents an enhanced method to evaluate satellite vulnerability to micrometeoroids and orbital debris (MMOD) using the ESABASE2/Debris tool (developed under ESA contract). Starting from the estimation of induced failures on spacecraft (S/C) components and from the computation of lethal impacts (those with an energy leading to the loss of the satellite), and considering equipment redundancies and interactions between components, the debris-induced S/C functional impairment is assessed. The developed methodology, illustrated through its application to a case study satellite, includes the capability to estimate the number of failures on internal components, overcoming the limitations of current tools, which do not allow propagating the debris cloud inside the S/C. The ballistic limit of internal equipment behind a sandwich panel structure is evaluated through the implementation of the Schäfer Ryan Lambert (SRL) Ballistic Limit Equation (BLE). The analysis conducted on the case study satellite shows the S/C vulnerability index to be about 4% over the complete mission, a significant reduction with respect to the results typically obtained with the traditional analysis, which treats structural penetration of the satellite's structural panels as a failure. The methodology has then been applied to select design strategies (additional local shielding, relocation of components) to improve S/C protection with respect to MMOD. The results of the analyses conducted on the improved design show a reduction of the vulnerability index of about 18%.
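Such vulnerability indices ultimately rest on Poisson impact statistics; a minimal sketch with invented flux and area numbers (not the paper's ESABASE2 results) shows the standard relation, probability of no penetration PNP = exp(-N):

```python
import numpy as np

area_m2 = 8.0            # exposed equipment area (assumed)
mission_years = 7.0      # mission duration (assumed)
flux_pen = 7.5e-4        # penetrating-particle flux, impacts/m^2/year (assumed)

N = flux_pen * area_m2 * mission_years   # expected number of lethal impacts
pnp = np.exp(-N)                         # probability of no penetration
print(f"expected penetrations N = {N:.4f}")
print(f"vulnerability index 1 - PNP = {1.0 - pnp:.4%}")
```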
How to improve healthcare? Identify, nurture and embed individuals and teams with "deep smarts".
Eljiz, Kathy; Greenfield, David; Molineux, John; Sloan, Terry
2018-03-19
Purpose Unlocking and transferring skills and capabilities from individuals to the teams they work within, and across, is the key to positive organisational development and improved patient care. Using the "deep smarts" model, the purpose of this paper is to examine these issues. Design/methodology/approach The "deep smarts" model is described, reviewed and proposed as a way of transferring knowledge and capabilities within healthcare organisations. Findings Effective healthcare delivery is achieved through, and continues to require, integrative care involving numerous, dispersed service providers. In the space of overlapping organisational boundaries, there is a need for "deep smarts" people who act as "boundary spanners". These are critical integrative, networking roles employing clinical, organisational and people skills across multiple settings. Research limitations/implications Studies evaluating the barriers and enablers to the application of the deep smarts model and the 13 proposed knowledge development strategies are required. Such future research will empirically ground our understanding of organisational development in modern, complex healthcare settings. Practical implications An organisation with "deep smarts" people - in managerial, auxiliary and clinical positions - has a greater capacity for integration and for achieving improved patient-centred care. Originality/value In total, 13 developmental strategies, to transfer individual capabilities into organisational capability, are proposed. These strategies are applicable to different contexts and challenges faced by individuals and teams in complex healthcare organisations.
NASA Technical Reports Server (NTRS)
Fink, Pamela K.; Palmer, Karol K.
1988-01-01
The development of a probabilistic structural analysis methodology (PSAM) is described. In the near-term, the methodology will be applied to designing critical components of the next generation space shuttle main engine. In the long-term, PSAM will be applied very broadly, providing designers with a new technology for more effective design of structures whose character and performance are significantly affected by random variables. The software under development to implement the ideas developed in PSAM resembles, in many ways, conventional deterministic structural analysis code. However, several additional capabilities regarding the probabilistic analysis makes the input data requirements and the resulting output even more complex. As a result, an intelligent front- and back-end to the code is being developed to assist the design engineer in providing the input data in a correct and appropriate manner. The type of knowledge that this entails is, in general, heuristically-based, allowing the fairly well-understood technology of production rules to apply with little difficulty. However, the PSAM code, called NESSUS, is written in FORTRAN-77 and runs on a DEC VAX. Thus, the associated expert system, called NESSUS/EXPERT, must run on a DEC VAX as well, and integrate effectively and efficiently with the existing FORTRAN code. This paper discusses the process undergone to select a suitable tool, identify an appropriate division between the functions that should be performed in FORTRAN and those that should be performed by production rules, and how integration of the conventional and AI technologies was achieved.
SDDL- SOFTWARE DESIGN AND DOCUMENTATION LANGUAGE
NASA Technical Reports Server (NTRS)
Kleine, H.
1994-01-01
Effective, efficient communication is an essential element of the software development process. The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication between all the members of a software design team and provides for the production of informative documentation on the design effort. Even when an entire development task is performed by a single individual, it is important to explicitly express and document communication between the various aspects of the design effort, including concept development, program specification, program development, and program maintenance. SDDL ensures that accurate documentation will be available throughout the entire software life cycle. SDDL offers an extremely valuable capability for the design and documentation of complex programming efforts ranging from scientific and engineering applications to data management and business systems.

Throughout the development of a software design, the SDDL-generated Software Design Document always represents the definitive word on the current status of the ongoing, dynamic design development process. The document is easily updated and readily accessible in a familiar, informative form to all members of the development team. This makes the Software Design Document an effective instrument for reconciling misunderstandings and disagreements in the development of design specifications, engineering support concepts, and the software design itself. Using the SDDL-generated document to analyze the design makes it possible to eliminate many errors that might not be detected until coding and testing is attempted. As a project management aid, the Software Design Document is useful for monitoring progress and for recording task responsibilities.

SDDL is a combination of language, processor, and methodology. The SDDL syntax consists of keywords to invoke design structures and a collection of directives which control processor actions. The designer has complete control over the choice of keywords, commanding the capabilities of the processor in the way best suited to communicating the intent of the design. The SDDL processor translates the designer's creative thinking into an effective document for communication. The processor performs as many automatic functions as possible, thereby freeing the designer's energy for the creative effort. Document formatting includes graphical highlighting of structure logic, accentuation of structure escapes and module invocations, logic error detection, and special handling of title pages and text segments. The SDDL-generated document contains software design summary information including the module invocation hierarchy, module cross-references, and cross-reference tables of user-selected words or phrases appearing in the document.

The basic forms of the methodology are module and block structures and the module invocation statement. A design is stated in terms of modules that represent problem abstractions which are complete and independent enough to be treated as separate problem entities. Blocks are lower-level structures used to build the modules. Both kinds of structures may have an initiator part, a terminator part, an escape segment, or a substructure. The SDDL processor is written in PASCAL for batch execution on a DEC VAX series computer under VMS. SDDL was developed in 1981 and last updated in 1984.
Media Literacy, Education & (Civic) Capability: A Transferable Methodology
ERIC Educational Resources Information Center
McDougall, Julian; Berger, Richard; Fraser, Pete; Zezulkova, Marketa
2015-01-01
This article explores the relationship between a formal media educational encounter in the UK and the broad objectives for media and information literacy education circulating in mainland Europe and the US. A pilot study, developed with a special interest group of the United Kingdom Literacy Association, applied a three-part methodology for…
Resisting Coherence: Trans Men's Experiences and the Use of Grounded Theory Methods
ERIC Educational Resources Information Center
Catalano, D. Chase J.
2017-01-01
In this methodological reflective manuscript, I explore my decision to use a grounded theoretical approach to my dissertation study on trans* men in higher education. Specifically, I question whether grounded theory as a methodology is capable of capturing the complexity and capaciousness of trans*-masculine experiences. Through the lenses of…
ERIC Educational Resources Information Center
Jakovljevic, Maria; Ankiewicz, Piet; De swardt, Estelle; Gross, Elna
2004-01-01
Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…
Handheld ultrasound array imaging device
NASA Astrophysics Data System (ADS)
Hwang, Juin-Jet; Quistgaard, Jens
1999-06-01
A handheld ultrasound imaging device, one that weighs less than five pounds, has been developed for diagnosing trauma on the combat battlefield as well as for a variety of commercial mobile diagnostic applications. This handheld device consists of four component ASICs, each of which is designed using state-of-the-art microelectronics technologies. These ASICs are integrated with a convex array transducer to allow high-quality imaging of soft tissues and blood flow in real time. The device is designed to be battery driven or AC powered, with built-in image storage and cine-loop playback capability. Design methodologies for a handheld device are fundamentally different from those of a cart-based system. As the system architecture, signal- and image-processing algorithms, and image control circuitry and software in this device are designed for large-scale integration, the imaging performance of the device is designed to be adequate for the intended applications. To extend battery life, low-power design rules and power management circuits are incorporated in the design of each component ASIC. The performance of the prototype device is currently being evaluated for various applications, such as a primary image screening tool, fetal imaging in obstetrics, and foreign object detection and wound assessment for emergency care.
Double patterning from design enablement to verification
NASA Astrophysics Data System (ADS)
Abercrombie, David; Lacour, Pat; El-Sewefy, Omar; Volkov, Alex; Levine, Evgueni; Arb, Kellen; Reid, Chris; Li, Qiao; Ghosh, Pradiptya
2011-11-01
Litho-etch-litho-etch (LELE) is the double patterning (DP) technology of choice for 20 nm contact, via, and lower metal layers. We discuss the unique design and process characteristics of LELE DP, the challenges they present, and various solutions.
∘ We examine DP design methodologies and current DP conflict feedback mechanisms, and how they can help designers identify and resolve conflicts.
∘ In place and route (P&R), the placement engine must now be aware of the assumptions made during IP cell design, and use placement directives provided by the library designer. We examine the new effects DP introduces in detail routing, discuss how multiple choices of LELE and the cut allowances can lead to different solutions, and describe new capabilities required by detail routers and P&R engines.
∘ We discuss why LELE DP cuts and overlaps are critical to optical proximity correction (OPC), and how a hybrid mechanism of rule- and model-based overlap generation can provide a fast and effective solution.
∘ With two litho-etch steps, mask misalignment and image rounding are now verification considerations. We present enhancements to the OPCVerify engine that check for pinching and bridging in the presence of DP overlay errors and acute angles.
Advances in computer-aided well-test interpretation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horne, R.N.
1994-07-01
Despite the feeling expressed several times over the past 40 years that well-test analysis had reached its peak development, an examination of recent advances shows continuous expansion in capability, with future improvement likely. The expansion in interpretation capability over the past decade arose mainly from the development of computer-aided techniques, which, although introduced 20 years ago, have come into use only recently. The broad application of computer-aided interpretation originated with the improvement of the methodologies and continued with the expansion in computer access and capability that accompanied the explosive development of the microcomputer industry. This paper focuses on the different pieces of the methodology that combine to constitute a computer-aided interpretation and attempts to compare some of the approaches currently used. Future directions of the approach are also discussed. The separate areas discussed are deconvolution, pressure derivatives, model recognition, nonlinear regression, and confidence intervals.
Propulsion integration of hypersonic air-breathing vehicles utilizing a top-down design methodology
NASA Astrophysics Data System (ADS)
Kirkpatrick, Brad Kenneth
In recent years, a focus of aerospace engineering design has been the development of advanced design methodologies and frameworks to account for increasingly complex and integrated vehicles. Techniques such as parametric modeling, global vehicle analyses, and interdisciplinary data sharing have been employed in an attempt to improve the design process. The purpose of this study is to introduce a new approach to integrated vehicle design known as the top-down design methodology. In the top-down design methodology, the main idea is to relate design changes on the vehicle system and sub-system level to a set of over-arching performance and customer requirements. Rather than focusing on the performance of an individual system, the system is analyzed in terms of the net effect it has on the overall vehicle and other vehicle systems. This detailed level of analysis can only be accomplished through the use of high fidelity computational tools such as Computational Fluid Dynamics (CFD) or Finite Element Analysis (FEA). The utility of the top-down design methodology is investigated through its application to the conceptual and preliminary design of a long-range hypersonic air-breathing vehicle for a hypothetical next generation hypersonic vehicle (NHRV) program. System-level design is demonstrated through the development of the nozzle section of the propulsion system. From this demonstration of the methodology, conclusions are made about the benefits, drawbacks, and cost of using the methodology.
Adaptive multiresolution modeling of groundwater flow in heterogeneous porous media
NASA Astrophysics Data System (ADS)
Malenica, Luka; Gotovac, Hrvoje; Srzic, Veljko; Andric, Ivo
2016-04-01
The proposed methodology was originally developed by our scientific team in Split, who designed a multiresolution approach for analyzing flow and transport processes in highly heterogeneous porous media. The main properties of the adaptive Fup multiresolution approach are: (1) the computational capabilities of Fup basis functions with compact support, capable of resolving all spatial and temporal scales; (2) multiresolution representation of heterogeneity as well as of all other input and output variables; (3) an accurate, adaptive and efficient strategy; and (4) semi-analytical properties which increase our understanding of the usually complex flow and transport processes in porous media. The main computational idea behind this approach is to separately find the minimum number of basis functions and resolution levels necessary to describe each flow and transport variable with the desired accuracy on a particular adaptive grid. Therefore, each variable is analyzed separately, and the adaptive and multiscale nature of the methodology enables not only computational efficiency and accuracy, but also a description of subsurface processes closely tied to their physical interpretation. The methodology inherently supports a mesh-free procedure, avoiding classical numerical integration, and yields continuous velocity and flux fields, which is vitally important for flow and transport simulations. In this paper, we show recent improvements within the proposed methodology. Since state-of-the-art multiresolution approaches usually use the method of lines and only a spatially adaptive procedure, the temporal approximation has rarely been treated as multiscale. Therefore, a novel adaptive implicit Fup integration scheme is developed, resolving all time scales within each global time step. This means that the algorithm uses smaller time steps only along lines where the solution changes rapidly. The application of Fup basis functions enables continuous approximation in time, simple interpolation across different temporal lines, and local time-stepping control. A critical aspect of time-integration accuracy is the construction of the spatial stencil used for accurate calculation of spatial derivatives. Whereas the common approach for wavelets and splines uses a finite difference operator, we developed here a collocation operator that includes both solution values and the differential operator. In this way, the new improved algorithm is adaptive in space and time, enabling accurate solutions of groundwater flow problems, especially in highly heterogeneous porous media with large lnK variances and different correlation length scales. In addition, differences between the collocation and finite volume approaches are discussed. Finally, results show the application of the methodology to groundwater flow problems in highly heterogeneous confined and unconfined aquifers.
NASA Astrophysics Data System (ADS)
Casselman, Steve; Schewel, John
2002-07-01
Success in the marketplace may well depend upon the ability to upgrade and test hardware designs instantly around the world. An upgrade management strategy requires more than just the bitstream file, email, or a JTAG cable. A well-managed methodology, capable of transmitting bitstreams directly into targeted FPGAs over the network or internet, is an essential element of a successful FPGA-based product strategy. Virtual Computer Corporation's HOTMan Bitstream Management Environment combines a feature-rich cross-platform API with an object-oriented bitstream technique for remote upgrading of hardware over the Internet.
FPGA-Based Stochastic Echo State Networks for Time-Series Forecasting.
Alomar, Miquel L; Canals, Vincent; Perez-Mora, Nicolas; Martínez-Moll, Víctor; Rosselló, Josep L
2016-01-01
Hardware implementation of artificial neural networks (ANNs) allows exploiting the inherent parallelism of these systems. Nevertheless, they require a large amount of resources in terms of area and power dissipation. Recently, Reservoir Computing (RC) has arisen as a strategic technique to design recurrent neural networks (RNNs) with simple learning capabilities. In this work, we show a new approach to implement RC systems with digital gates. The proposed method is based on the use of probabilistic computing concepts to reduce the hardware required to implement different arithmetic operations. The result is the development of a highly functional system with low hardware resources. The presented methodology is applied to chaotic time-series forecasting.
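A minimal software echo state network sketch (a conventional floating-point implementation for exposition, not the paper's stochastic FPGA design): a fixed random reservoir plus a ridge-regression readout, applied to one-step-ahead forecasting of a chaotic series.

```python
import numpy as np

rng = np.random.default_rng(3)
# Toy chaotic data: the logistic map (stands in for a chaotic benchmark).
u = np.empty(2000); u[0] = 0.4
for t in range(1999):
    u[t + 1] = 3.9 * u[t] * (1.0 - u[t])

n_res, leak, rho = 200, 0.5, 0.9
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.normal(size=(n_res, n_res))
W *= rho / np.abs(np.linalg.eigvals(W)).max()   # set spectral radius to rho

x = np.zeros(n_res); states = []
for t in range(len(u) - 1):
    x = (1 - leak) * x + leak * np.tanh(W_in * u[t] + W @ x)
    states.append(x.copy())
X = np.array(states[100:])                      # discard washout
y = u[101:]                                     # one-step-ahead targets

# Ridge readout solved in normal-equation form.
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ y)
pred = X @ W_out
print(f"train NRMSE: {np.sqrt(np.mean((pred - y) ** 2)) / y.std():.4f}")
```

The reservoir weights are never trained; only the linear readout is fit, which is what makes the approach attractive for fixed hardware.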
Simulation and Analyses of Multi-Body Separation in Launch Vehicle Staging Environment
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Hotchko, Nathaniel J.; Samareh, Jamshid; Covell, Peter F.; Tartabini, Paul V.
2006-01-01
The development of methodologies, techniques, and tools for analysis and simulation of multi-body separation is critically needed for successful design and operation of next-generation launch vehicles. As a part of this activity, the ConSep simulation tool is being developed. ConSep is a generic MATLAB-based front- and back-end to the commercially available ADAMS solver, an industry-standard package for solving multi-body dynamics problems. This paper discusses the 3-body separation capability in ConSep and its application to the separation of the Shuttle Solid Rocket Boosters (SRBs) from the External Tank (ET) and the Orbiter. The results are compared with STS-1 flight data.
Adly, Amr A; Abd-El-Hafiz, Salwa K
2015-05-01
Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimation of transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper.
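A hedged sketch of the multi-objective evolutionary idea on a toy transformer surrogate (the objective functions, variables, and parameters are invented, not the paper's model): evolve design vectors and keep the Pareto front trading off load loss against material cost.

```python
import numpy as np

rng = np.random.default_rng(4)

def objectives(x):
    """Toy surrogate: x = (core area, turns) in normalized units.
    Returns (loss, cost), both to be minimized. Purely illustrative."""
    area, turns = x
    loss = 1.0 / (area * turns) + 0.05 * turns   # copper + stray loss proxy
    cost = 2.0 * area + 0.1 * turns              # core + winding material
    return np.array([loss, cost])

def pareto_mask(F):
    """Boolean mask of non-dominated rows of objective matrix F."""
    keep = np.ones(len(F), dtype=bool)
    for i in range(len(F)):
        dominates_i = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        keep[i] = not dominates_i.any()
    return keep

lo, hi = np.array([0.2, 1.0]), np.array([2.0, 20.0])
pop = rng.uniform(lo, hi, size=(60, 2))
for gen in range(50):
    children = np.clip(pop + rng.normal(0, 0.1, pop.shape), lo, hi)
    both = np.vstack([pop, children])
    F = np.array([objectives(x) for x in both])
    survivors = both[pareto_mask(F)]
    pop = survivors[rng.integers(0, len(survivors), size=60)]  # simple refill

F = np.array([objectives(x) for x in pop])
front = F[pareto_mask(F)]
print(f"{len(front)} non-dominated designs; loss range "
      f"[{front[:, 0].min():.3f}, {front[:, 0].max():.3f}]")
```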
Human-Rated Space Vehicle Backup Flight Systems
NASA Technical Reports Server (NTRS)
Davis, Jeffrey A.; Busa, Joseph L.
2004-01-01
Human-rated space vehicles have historically employed a Backup Flight System (BFS) for the main purpose of mitigating the loss of the primary avionics control system. Across these projects, however, the underlying philosophy and technical implementation vary greatly. This paper attempts to coalesce each past space vehicle program's BFS design and implementation methodologies with the accompanying underlying philosophical arguments that drove each program to such decisions. The focus is on Mercury, Gemini, Apollo, and the Space Shuttle; however, the ideologies and implementations of several commercial and military aircraft are incorporated as well to complete a full-breadth view of BFS development across the industries. Particular to the non-space-based vehicles is the notion of deciding not to utilize a BFS. A diverse analysis of BFS-to-primary-system benefits in terms of reliability against all aspects of project development is reviewed and traded. The risk of engaging the BFS during critical stages of flight (e.g., ascent and entry), the level of capability of the BFS (a subset of the main system vs. an equivalent system), and the notion of dissimilar hardware and software design are all discussed. Finally, considerations for employing a BFS on future human-rated space missions are reviewed in light of modern avionics architectures and the mission scenarios implicit in exploration beyond low Earth orbit.
Integrated Simulation Design Challenges to Support TPS Repair Operations
NASA Technical Reports Server (NTRS)
Quiocho, Leslie J.; Crues, Edwin Z.; Huynh, An; Nguyen, Hung T.; MacLean, John
2005-01-01
During the Orbiter Repair Maneuver (ORM) operations planned for Return to Flight (RTF), the Shuttle Remote Manipulator System (SRMS) must grapple the International Space Station (ISS), undock the Orbiter, maneuver it through a long-duration trajectory, and orient it to an EVA crewman poised at the end of the Space Station Remote Manipulator System (SSRMS) to facilitate the repair of the Thermal Protection System (TPS). Once repair has been completed and confirmed, the SRMS proceeds back through the trajectory to dock the Orbiter to the Orbiter Docking System. In order to support analysis of the complex dynamic interactions of the integrated system formed by the Orbiter, ISS, SRMS, and SSRMS during the ORM, simulation tools used for previous 'nominal' mission support required substantial enhancements. These upgrades were necessary to provide analysts with the capabilities needed to study integrated system performance. This paper discusses the simulation design challenges encountered while developing simulation capabilities to mirror the ORM operations. The paper also describes the incremental build approach that was utilized, starting with the subsystem simulation elements and integrating them into increasingly complex simulations until the resulting ORM worksite dynamics simulation had been assembled. Furthermore, the paper presents an overall integrated simulation V&V methodology based upon subsystem-level testing, integrated comparisons, and phased checkout.
Rinaldi, Fabio; Ellendorff, Tilia Renate; Madan, Sumit; Clematide, Simon; van der Lek, Adrian; Mevissen, Theo; Fluck, Juliane
2016-01-01
Automatic extraction of biological network information is one of the most desired and most complex tasks in biological and medical text mining. Track 4 at BioCreative V attempts to approach this complexity using fragments of large-scale manually curated biological networks, represented in Biological Expression Language (BEL), as training and test data. BEL is an advanced knowledge representation format which has been designed to be both human readable and machine processable. The specific goal of track 4 was to evaluate text mining systems capable of automatically constructing BEL statements from given evidence text, and of retrieving evidence text for given BEL statements. Given the complexity of the task, we designed an evaluation methodology which gives credit to partially correct statements. We identified various levels of information expressed by BEL statements, such as entities, functions, and relations, and introduced an evaluation framework which rewards systems capable of delivering useful BEL fragments at each of these levels. The aim of this evaluation method is to help identify the characteristics of the systems which, if combined, would be most useful for achieving the overall goal of automatically constructing causal biological networks from text. © The Author(s) 2016. Published by Oxford University Press.
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
2002-06-01
Recent technological trends based on miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, shape and changes in states of deformation of MEMS structures. With the described opto-electronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial and high-digital resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of opto-electronic techniques are illustrated with representative applications demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.
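A minimal sketch of the quantitative "data mode" step, converting wrapped interferometric phase to deformation, on synthetic fringes (the wavelength and reflection geometry factor are assumptions, not the paper's system):

```python
import numpy as np

lam = 632.8e-9                  # He-Ne wavelength in meters (assumed source)
x = np.linspace(0, 1e-3, 1000)  # 1 mm scan across a MEMS structure

true_deflection = 400e-9 * np.sin(np.pi * x / 1e-3) ** 2   # synthetic shape
phase = 4 * np.pi * true_deflection / lam   # reflection doubles the path
wrapped = np.angle(np.exp(1j * phase))      # what the detector effectively sees

unwrapped = np.unwrap(wrapped)              # remove 2*pi discontinuities
recovered = unwrapped * lam / (4 * np.pi)   # back to meters
err = np.max(np.abs(recovered - true_deflection))
print(f"max reconstruction error: {err * 1e9:.3f} nm")
```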
Computational Science: A Research Methodology for the 21st Century
NASA Astrophysics Data System (ADS)
Orbach, Raymond L.
2004-03-01
Computational simulation - a means of scientific discovery that employs computer systems to simulate a physical system according to laws derived from theory and experiment - has attained peer status with theory and experiment. Important advances in basic science are accomplished by a new "sociology" for ultrascale scientific computing capability (USSCC), a fusion of sustained advances in scientific models, mathematical algorithms, computer architecture, and scientific software engineering. Expansion of current capabilities by factors of 100-1000 opens up new vistas for scientific discovery: long-term climatic variability and change, macroscopic material design from correlated behavior at the nanoscale, design and optimization of magnetic confinement fusion reactors, strong interactions on a computational lattice through quantum chromodynamics, and stellar explosions and element production. The "virtual prototype," made possible by this expansion, can markedly reduce time-to-market for industrial applications such as jet engines and safer, cleaner, more fuel-efficient cars. In order to develop USSCC, the National Energy Research Scientific Computing Center (NERSC) announced the competition "Innovative and Novel Computational Impact on Theory and Experiment" (INCITE), with no requirement for current DOE sponsorship. Fifty-nine proposals for grand challenge scientific problems were submitted for a small number of awards. The successful grants, and their preliminary progress, will be described.
Three-Dimensional Modeling of Aircraft High-Lift Components with Vehicle Sketch Pad
NASA Technical Reports Server (NTRS)
Olson, Erik D.
2016-01-01
Vehicle Sketch Pad (OpenVSP) is a parametric geometry modeler that has been used extensively for conceptual design studies of aircraft, including studies using higher-order analysis. OpenVSP can model flap and slat surfaces using simple shearing of the airfoil coordinates, which is an appropriate level of complexity for lower-order aerodynamic analysis methods. For three-dimensional analysis, however, there is not a built-in method for defining the high-lift components in OpenVSP in a realistic manner, or for controlling their complex motions in a parametric manner that is intuitive to the designer. This paper seeks instead to utilize OpenVSP's existing capabilities, and establish a set of best practices for modeling high-lift components at a level of complexity suitable for higher-order analysis methods. Techniques are described for modeling the flap and slat components as separate three-dimensional surfaces, and for controlling their motion using simple parameters defined in the local hinge-axis frame of reference. To demonstrate the methodology, an OpenVSP model for the Energy-Efficient Transport (EET) AR12 wind-tunnel model has been created, taking advantage of OpenVSP's Advanced Parameter Linking capability to translate the motions of the high-lift components from the hinge-axis coordinate system to a set of transformations in OpenVSP's frame of reference.
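The hinge-axis idea can be illustrated with a short sketch: flap surface points are rotated about an arbitrary hinge line (a point plus a unit direction) using Rodrigues' rotation formula, which is one way to realize a deflection parameter defined in the local hinge-axis frame. The function and parameter names below are hypothetical and do not represent OpenVSP's Advanced Parameter Linking syntax.

```python
import numpy as np

# Hedged sketch: deflect a flap by rotating its surface points about a
# hinge line through p0 along unit direction u (Rodrigues' formula).

def rotate_about_hinge(points, p0, u, delta_deg):
    """Rotate Nx3 points by delta_deg about the line through p0 along u."""
    u = np.asarray(u, float) / np.linalg.norm(u)
    d = np.radians(delta_deg)
    K = np.array([[0, -u[2], u[1]],
                  [u[2], 0, -u[0]],
                  [-u[1], u[0], 0]])            # cross-product matrix
    R = np.eye(3) + np.sin(d) * K + (1 - np.cos(d)) * (K @ K)
    return (points - p0) @ R.T + p0

# Example: deflect two flap points 30 deg down about a swept hinge line.
flap_pts = np.array([[5.0, 2.0, 0.0], [5.5, 2.5, 0.0]])
deflected = rotate_about_hinge(flap_pts, p0=[5.0, 0.0, 0.0],
                               u=[0.17, 0.98, 0.0], delta_deg=-30.0)
```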
DOE Office of Scientific and Technical Information (OSTI.GOV)
Almansouri, Hani; Foster, Benjamin; Kisner, Roger A
2016-01-01
This paper documents our progress developing an ultrasound phased array system in combination with a model-based iterative reconstruction (MBIR) algorithm to inspect the health of and characterize the composition of the near-wellbore region for geothermal reservoirs. The main goal for this system is to provide a near-wellbore in-situ characterization capability that will significantly improve wellbore integrity evaluation and near-wellbore fracture network mapping. A more detailed image of the fracture network near the wellbore in particular will enable the selection of optimal locations for stimulation along the wellbore, provide critical data that can be used to improve stimulation design, and provide a means for measuring evolution of the fracture network to support long-term management of reservoir operations. Development of such a measurement capability supports current hydrothermal operations as well as the successful demonstration of Engineered Geothermal Systems (EGS). The paper will include the design of the phased array system, the performance specifications, and the characterization methodology. In addition, we will describe the MBIR forward model derived for the phased array system and the propagation of compressional waves through a pseudo-homogeneous medium.
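To indicate the general shape of an MBIR computation, the sketch below minimizes a data-fidelity term against a linear forward model plus a simple quadratic prior by gradient descent. The authors' actual forward model (compressional-wave propagation from a phased array through a pseudo-homogeneous medium) is far more elaborate; the matrix A here is an arbitrary linear operator standing in for it, and the prior is an illustrative choice.

```python
import numpy as np

# Hedged sketch of the MBIR pattern: estimate an image x by minimizing
# (1/2)||y - A x||^2 + (beta/2)||x||^2, where A is a stand-in linear
# forward model mapping the image to measured ultrasound data y.

def mbir(A, y, beta=0.1, iters=200, step=None):
    """Gradient descent on the regularized data-fidelity objective."""
    x = np.zeros(A.shape[1])
    if step is None:
        step = 1.0 / (np.linalg.norm(A, 2) ** 2 + beta)  # stable step size
    for _ in range(iters):
        grad = A.T @ (A @ x - y) + beta * x
        x -= step * grad
    return x
```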
Multidisciplinary Design Optimization of a Full Vehicle with High Performance Computing
NASA Technical Reports Server (NTRS)
Yang, R. J.; Gu, L.; Tho, C. H.; Sobieszczanski-Sobieski, Jaroslaw
2001-01-01
Multidisciplinary design optimization (MDO) of a full vehicle under the constraints of crashworthiness, NVH (Noise, Vibration and Harshness), durability, and other performance attributes is one of the imperative goals for the automotive industry. However, it is often infeasible due to the lack of computational resources, robust simulation capabilities, and efficient optimization methodologies. This paper intends to move closer towards that goal by using parallel computers for the intensive computation and combining different approximations for dissimilar analyses in the MDO process. The MDO process presented in this paper is an extension of the previous work reported by Sobieski et al. In addition to the roof crush, two full vehicle crash modes are added: full frontal impact and 50% frontal offset crash. Instead of using an adaptive polynomial response surface method, this paper employs a DOE/RSM method for exploring the design space and constructing highly nonlinear crash functions. Two MDO strategies are used and the results are compared. This paper demonstrates that, with high performance computing, a conventionally intractable real-world full vehicle multidisciplinary optimization problem considering all performance attributes with a large number of design variables becomes feasible.
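The DOE/RSM pattern mentioned above can be sketched generically: sample the design space, evaluate an expensive analysis at the sample points, fit a quadratic response surface by least squares, and optimize the cheap surrogate instead of the analysis itself. The stand-in objective and sampling plan below are illustrative only; the paper's crash functions and DOE are far more complex.

```python
import numpy as np

# Hedged sketch of DOE/RSM: fit a full quadratic surrogate to samples of
# an expensive analysis, then optimize the surrogate cheaply.

rng = np.random.default_rng(0)

def quad_features(X):
    """[1, x_i, x_i*x_j] feature matrix for a full quadratic surface."""
    n, d = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

def expensive_analysis(x):            # stand-in for a crash/NVH simulation
    return (x[0] - 0.3) ** 2 + 2 * (x[1] - 0.7) ** 2 + 0.1 * x[0] * x[1]

X = rng.uniform(0, 1, size=(30, 2))   # DOE: 30 space-filling samples
y = np.apply_along_axis(expensive_analysis, 1, X)

coef, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

# Optimize the surrogate on a dense grid (cheap, unlike the real analysis).
grid = np.stack(np.meshgrid(np.linspace(0, 1, 101),
                            np.linspace(0, 1, 101)), -1).reshape(-1, 2)
best = grid[np.argmin(quad_features(grid) @ coef)]
print("surrogate optimum near", best)
```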
Lunar base Controlled Ecological Life Support System (LCELSS): Preliminary conceptual design study
NASA Technical Reports Server (NTRS)
Schwartzkopf, Steven H.
1991-01-01
The objective of this study was to develop a conceptual design for a self-sufficient LCELSS. The mission need is for a CELSS with a capacity to supply the life support needs for a nominal crew of 30, and a capability for accommodating a range of crew sizes from 4 to 100 people. The work performed in this study was nominally divided into two parts. In the first part, relevant literature was assembled and reviewed. This review identified LCELSS performance requirements and the constraints and advantages confronting the design. It also collected information on the environment of the lunar surface and identified candidate technologies for the life support subsystems and the systems with which the LCELSS interfaced. Information on the operation and performance of these technologies was collected, along with concepts of how they might be incorporated into the LCELSS conceptual design. The data collected on these technologies were stored for incorporation into the study database. Also during the first part, the study database structure was formulated and implemented, and an overall systems engineering methodology was developed for carrying out the study.
NASA Technical Reports Server (NTRS)
Ngan, Angelen; Biezad, Daniel
1996-01-01
A study has been conducted to develop and analyze a FORTRAN computer code for performing agility analysis on fighter aircraft configurations. This program is one of the modules of the NASA Ames ACSYNT (AirCraft SYNThesis) design code. The background of agility research in the aircraft industry and a survey of several agility metrics are discussed. The methodology, techniques, and models developed for the code are presented. The validity of the existing code was evaluated by comparison with existing flight test data. A FORTRAN program was developed for a specific metric, PM (Pointing Margin), as part of the agility module. Example trade studies using the agility module along with ACSYNT were conducted using a McDonnell Douglas F/A-18 Hornet aircraft model. The sensitivity of agility criteria to thrust loading, wing loading, and thrust vectoring was investigated. The module can compare the agility potential of different configurations and has the capability to optimize agility performance in the preliminary design process. This research provides a new and useful design tool for analyzing fighter performance during air combat engagements in the preliminary design phase.
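As a hedged illustration of the kind of trade study described, the sketch below sweeps thrust loading and wing loading and computes sustained turn rate from a standard point-performance relation. This is a textbook agility proxy, not the Pointing Margin metric implemented in the module, and all aerodynamic constants are assumed values.

```python
import numpy as np

# Hedged sketch of a thrust-loading / wing-loading agility trade using
# sustained turn rate, with an assumed drag polar and flight condition.

g, rho, V = 9.81, 1.225, 150.0        # sea level, 150 m/s
CD0, K, CLmax = 0.02, 0.12, 1.6       # assumed drag polar and max lift
q = 0.5 * rho * V ** 2                # dynamic pressure

def sustained_turn_rate(TW, WS):
    """Turn rate (deg/s) at the load factor where thrust equals drag."""
    n_sq = (q / (K * WS)) * (TW - q * CD0 / WS)   # thrust-limited n^2
    n_lift = q * CLmax / WS                       # lift-limited load factor
    n = min(np.sqrt(max(n_sq, 1.0)), n_lift)
    return np.degrees(g * np.sqrt(n ** 2 - 1) / V)

for TW in (0.8, 1.0, 1.2):
    for WS in (3000.0, 4000.0):       # wing loading in N/m^2
        print(TW, WS, round(sustained_turn_rate(TW, WS), 1))
```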