Sample records for structural analysis methodology

  1. Probabilistic structural analysis methods for space propulsion system components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  2. Advances in Structural Integrity Analysis Methods for Aging Metallic Airframe Structures with Local Damage

    NASA Technical Reports Server (NTRS)

    Starnes, James H., Jr.; Newman, James C., Jr.; Harris, Charles E.; Piascik, Robert S.; Young, Richard D.; Rose, Cheryl A.

    2003-01-01

    Analysis methodologies for predicting fatigue-crack growth from rivet holes in panels subjected to cyclic loads and for predicting the residual strength of aluminum fuselage structures with cracks and subjected to combined internal pressure and mechanical loads are described. The fatigue-crack growth analysis methodology is based on small-crack theory and a plasticity induced crack-closure model, and the effect of a corrosive environment on crack-growth rate is included. The residual strength analysis methodology is based on the critical crack-tip-opening-angle fracture criterion that characterizes the fracture behavior of a material of interest, and a geometric and material nonlinear finite element shell analysis code that performs the structural analysis of the fuselage structure of interest. The methodologies have been verified experimentally for structures ranging from laboratory coupons to full-scale structural components. Analytical and experimental results based on these methodologies are described and compared for laboratory coupons and flat panels, small-scale pressurized shells, and full-scale curved stiffened panels. The residual strength analysis methodology is sufficiently general to include the effects of multiple-site damage on structural behavior.
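
    The crack-growth element described above pairs small-crack theory with a plasticity-induced crack-closure model; in generic terms (a sketch of the standard closure-based formulation, not necessarily the specific rate equation used in the program), the growth rate is correlated with an effective stress-intensity-factor range:

    ```latex
    \[
      \Delta K_{\mathrm{eff}} = K_{\max} - K_{\mathrm{open}}, \qquad
      \frac{da}{dN} = C\,\bigl(\Delta K_{\mathrm{eff}}\bigr)^{n}
    \]
    ```

    Here \(K_{\mathrm{open}}\) is the stress intensity at which the crack faces fully separate, supplied by the closure model, and \(C\) and \(n\) are material constants fit to test data; environmental effects such as a corrosive environment can enter through these constants or through the measured rate curve.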

  3. Global/local methods research using a common structural analysis framework

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.

    1991-01-01

    Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  4. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of a structural analysis methodology for predicting the residual strength of fuselage shell-type structures, and development of accurate, efficient analysis, design, and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

  5. Verification of nonlinear dynamic structural test results by combined image processing and acoustic analysis

    NASA Astrophysics Data System (ADS)

    Tene, Yair; Tene, Noam; Tene, G.

    1993-08-01

    An interactive data fusion methodology of video, audio, and nonlinear structural dynamic analysis for potential application in forensic engineering is presented. The methodology was developed and successfully demonstrated in the analysis of the collapse of a heavy transportable bridge during preparation for testing. Multiple bridge element failures were identified after the collapse, including fracture, cracks, and rupture of high-performance structural materials. A videotape recording from a hand-held camcorder was the only source of information about the collapse sequence. The interactive data fusion methodology extracted relevant information from the videotape and from dynamic nonlinear structural analysis, leading to a full account of the sequence of events during the bridge collapse.

  6. Structural Optimization Methodology for Rotating Disks of Aircraft Engines

    NASA Technical Reports Server (NTRS)

    Armand, Sasan C.

    1995-01-01

    In support of the preliminary evaluation of various engine technologies, a methodology has been developed for structurally designing the rotating disks of an aircraft engine. The structural design methodology, along with a previously derived methodology for predicting low-cycle fatigue life, was implemented in a computer program. An interface computer program was also developed that gathers the required data from a flowpath analysis program (WATE) being used at NASA Lewis. The computer program developed for this study requires minimum interaction with the user, thus allowing engineers with varying backgrounds in aeropropulsion to successfully execute it. The stress analysis portion of the methodology and the computer program were verified by employing the finite element analysis method. The 10th-stage, high-pressure-compressor disk of the Energy Efficient Engine Program (E3) engine was used to verify the stress analysis; the differences between the stresses and displacements obtained from the computer program developed for this study and from the finite element analysis were all below 3 percent for the problem solved. The computer program developed for this study was employed to structurally optimize the rotating disks of the E3 high-pressure compressor. The rotating disks designed by the computer program in this study were approximately 26 percent lighter than those calculated from the E3 drawings. The methodology is presented herein.

  7. Application of the probabilistic approximate analysis method to a turbopump blade analysis. [for Space Shuttle Main Engine]

    NASA Technical Reports Server (NTRS)

    Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.

    1990-01-01

    An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method will be used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of a probabilistic approximate analysis in determining efficient solution strategies.

  8. Application of numerical methods to heat transfer and thermal stress analysis of aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Wieting, A. R.

    1979-01-01

    The paper describes a thermal-structural design analysis study of a fuel-injection strut for a hydrogen-cooled scramjet engine for a supersonic transport, utilizing finite-element methodology. Applications of finite-element and finite-difference codes to the thermal-structural design-analysis of space transports and structures are discussed. The interaction between the thermal and structural analyses has led to development of finite-element thermal methodology to improve the integration between these two disciplines. The integrated thermal-structural analysis capability developed within the framework of a computer code is outlined.

  9. Developing Army Leaders through Increased Rigor in Professional Military Training and Education

    DTIC Science & Technology

    2017-06-09

    An applied, exploratory, qualitative research methodology via a structured and focused case study comparison; development models serve as the base data for the case study comparison and the subsequent qualitative analysis.

  10. Structural damage continuous monitoring by using a data driven approach based on principal component analysis and cross-correlation analysis

    NASA Astrophysics Data System (ADS)

    Camacho-Navarro, Jhonatan; Ruiz, Magda; Villamizar, Rodolfo; Mujica, Luis; Moreno-Beltrán, Gustavo; Quiroga, Jabid

    2017-05-01

    Continuous monitoring for damage detection in structural assessment comprises implementation of low cost equipment and efficient algorithms. This work describes the stages involved in the design of a methodology with high feasibility to be used in continuous damage assessment. Specifically, an algorithm based on a data-driven approach by using principal component analysis and pre-processing acquired signals by means of cross-correlation functions, is discussed. A carbon steel pipe section and a laboratory tower were used as test structures in order to demonstrate the feasibility of the methodology to detect abrupt changes in the structural response when damages occur. Two types of damage cases are studied: crack and leak for each structure, respectively. Experimental results show that the methodology is promising in the continuous monitoring of real structures.
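
    As a rough illustration of the data-driven pipeline described above (this is not the authors' code; the channel layout, window lengths, and the Q-statistic damage index are illustrative assumptions), a PCA baseline can be fit to cross-correlation features from the healthy structure and new measurements scored against it:

    ```python
    import numpy as np

    def cross_corr_features(signals, ref):
        """Cross-correlate each channel with a reference channel and stack the results."""
        return np.array([np.correlate(s, ref, mode="full") for s in signals]).ravel()

    def fit_pca_baseline(X, n_components=3):
        """Fit a PCA baseline (mean, scale, principal directions) from healthy-state feature vectors X."""
        mu, sigma = X.mean(axis=0), X.std(axis=0) + 1e-12
        Z = (X - mu) / sigma
        _, _, Vt = np.linalg.svd(Z, full_matrices=False)
        return mu, sigma, Vt[:n_components]

    def q_statistic(x, mu, sigma, P):
        """Q (squared prediction error) damage index: residual energy outside the baseline subspace."""
        z = (x - mu) / sigma
        residual = z - P.T @ (P @ z)
        return float(residual @ residual)

    # --- illustrative usage with synthetic data ---
    rng = np.random.default_rng(0)
    healthy = [cross_corr_features(rng.normal(size=(4, 256)), rng.normal(size=256)) for _ in range(20)]
    mu, sigma, P = fit_pca_baseline(np.array(healthy))
    new = cross_corr_features(rng.normal(size=(4, 256)), rng.normal(size=256))
    print("Q index:", q_statistic(new, mu, sigma, P))
    ```

    In practice the alarm threshold on the damage index would be set from the scatter of the healthy-state data, and exceedances flagged as abrupt changes in the structural response.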

  11. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.
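
    A minimal sketch of element (2) as described above: propagating assumed uncertainties in primitive variables through a response model to obtain a cumulative distribution function and a failure probability (the response function, distributions, and allowable below are invented placeholders, not the SSME/HPFT models):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000

    # Primitive structural variables with assumed (illustrative) scatter
    load = rng.lognormal(np.log(50e3), 0.10, N)   # applied load, N
    area = rng.normal(4.0e-4, 0.1e-4, N)          # cross-sectional area, m^2

    # Placeholder response model: axial stress
    stress = load / area

    # Empirical cumulative distribution function of the response
    s_sorted = np.sort(stress)
    cdf = np.arange(1, N + 1) / N

    # Failure probability against an assumed allowable
    allowable = 160e6  # Pa, illustrative
    p_fail = np.mean(stress > allowable)
    print(f"P(stress > allowable) = {p_fail:.4f}")
    ```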

  12. Improving Junior Infantry Officer Leader Development and Performance

    DTIC Science & Technology

    2017-06-09

    The researcher used a qualitative literature review and semi-structured interview methodology to analyze Army leadership theories and leader development.

  13. The methodology of semantic analysis for extracting physical effects

    NASA Astrophysics Data System (ADS)

    Fomenkova, M. A.; Kamaev, V. A.; Korobkin, D. M.; Fomenkov, S. A.

    2017-01-01

    The paper presents a new methodology of semantic analysis for extracting physical effects. The methodology is based on the Tuzov ontology, which formally describes the Russian language. Semantic patterns are described for extracting structural physical information in the form of physical effects, and a new text analysis algorithm is presented.

  14. Global-local methodologies and their application to nonlinear analysis. [for structural postbuckling study]

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1986-01-01

    An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.

  15. Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.

    1997-01-01

    A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C1 plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented into a general purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.
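
    For illustration, ply-level failure checks of the kind listed above can be coded directly from the stress and strain output of a laminate analysis; the sketch below uses the maximum-strain criterion and the in-plane Hashin tension criteria with made-up allowables (the exact criterion forms and constants in the report may differ):

    ```python
    # Illustrative ply allowables (invented for demonstration; stresses in MPa)
    XT, YT, S12 = 1500.0, 40.0, 70.0               # strengths
    eps_xt, eps_yt, gamma_s = 0.012, 0.006, 0.02   # strain allowables

    def max_strain_failure(eps11, eps22, gam12):
        """Maximum-strain criterion: any ply strain exceeding its allowable flags failure."""
        return max(eps11 / eps_xt, eps22 / eps_yt, abs(gam12) / gamma_s) >= 1.0

    def hashin_fiber_tension(s11, t12):
        """2-D Hashin fiber-tension index; failure when the index reaches 1."""
        return (s11 / XT) ** 2 + (t12 / S12) ** 2

    def hashin_matrix_tension(s22, t12):
        """2-D Hashin matrix-tension index; failure when the index reaches 1."""
        return (s22 / YT) ** 2 + (t12 / S12) ** 2

    # Example ply state from a laminate analysis (placeholder values)
    print(max_strain_failure(0.004, 0.001, 0.003))
    print(hashin_fiber_tension(800.0, 30.0), hashin_matrix_tension(25.0, 30.0))
    ```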

  16. Applications of decision analysis and related techniques to industrial engineering problems at KSC

    NASA Technical Reports Server (NTRS)

    Evans, Gerald W.

    1995-01-01

    This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).

  17. Probabilistic sizing of laminates with uncertainties

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Liaw, D. G.; Chamis, C. C.

    1993-01-01

    A reliability-based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. The uncertainties of the constituent materials (fiber and matrix) are simulated using probabilistic theory to predict macroscopic behavior. Uncertainties in the degradation of composite material properties are included in this design methodology. A multi-factor interaction equation is used to evaluate load- and environment-dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). The versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for the design structural reliability of random-type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1000 to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.

  18. An Efficient Analysis Methodology for Fluted-Core Composite Structures

    NASA Technical Reports Server (NTRS)

    Oremont, Leonard; Schultz, Marc R.

    2012-01-01

    The primary loading condition in launch-vehicle barrel sections is axial compression, and it is therefore important to understand the compression behavior of any structures, structural concepts, and materials considered in launch-vehicle designs. This understanding will necessarily come from a combination of test and analysis. However, certain potentially beneficial structures and structural concepts do not lend themselves to commonly used simplified analysis methods, and therefore innovative analysis methodologies must be developed if these structures and structural concepts are to be considered. This paper discusses such an analysis technique for the fluted-core sandwich composite structural concept. The presented technique is based on commercially available finite-element codes and uses shell elements to capture behavior that would normally require solid elements to represent in detail. In this technique, the shell thicknesses and offsets are parameterized, and the parameters are adjusted through a heuristic procedure until the shell-only model matches the mechanical behavior of a more detailed shell-and-solid model. Additionally, the detailed shell-and-solid model can be strategically placed in a larger, global shell-only model to capture important local behavior. Comparisons between shell-only models, experiments, and more detailed shell-and-solid models show excellent agreement. The discussed analysis methodology, though only discussed in the context of fluted-core composites, is widely applicable to other concepts.

  19. Designing for fiber composite structural durability in hygrothermomechanical environment

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1985-01-01

    A methodology is described which can be used to design/analyze fiber composite structures subjected to complex hygrothermomechanical environments. This methodology includes composite mechanics and advanced structural analysis methods (finite element). Select examples are described to illustrate the application of the available methodology. The examples include: (1) composite progressive fracture; (2) composite design for high cycle fatigue combined with hot-wet conditions; and (3) general laminate design.

  20. Residual Strength Analysis Methodology: Laboratory Coupons to Structural Components

    NASA Technical Reports Server (NTRS)

    Dawicke, D. S.; Newman, J. C., Jr.; Starnes, J. H., Jr.; Rose, C. A.; Young, R. D.; Seshadri, B. R.

    2000-01-01

    The NASA Aircraft Structural Integrity (NASIP) and Airframe Airworthiness Assurance/Aging Aircraft (AAA/AA) Programs have developed a residual strength prediction methodology for aircraft fuselage structures. This methodology has been experimentally verified for structures ranging from laboratory coupons up to full-scale structural components. The methodology uses the critical crack tip opening angle (CTOA) fracture criterion to characterize the fracture behavior and a geometric and material nonlinear finite element shell analysis code to perform the structural analyses. The present paper presents the results of a study to evaluate the fracture behavior of 2024-T3 aluminum alloys with thicknesses of 0.04 inches to 0.09 inches. The critical CTOA and the corresponding plane-strain core height, necessary to simulate through-the-thickness effects at the crack tip in an otherwise plane-stress analysis, were determined from small laboratory specimens. Using these parameters, the CTOA fracture criterion was used to predict the behavior of middle-crack tension specimens that were up to 40 inches wide, flat panels with riveted stiffeners and multiple-site damage cracks, 18-inch-diameter pressurized cylinders, and full-scale curved stiffened panels subjected to internal pressure and mechanical loads.
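
    Stated generically (the evaluation distance actually used in the program is not given in this record), the CTOA criterion advances the crack whenever the angle computed from the crack-opening displacement a fixed distance behind the tip reaches its critical value:

    ```latex
    \[
      \psi \;=\; 2\,\arctan\!\left(\frac{\delta}{2d}\right) \;\ge\; \psi_c
      \;\Rightarrow\; \text{crack extension,}
    \]
    ```

    where \(\delta\) is the crack-opening displacement measured at a fixed distance \(d\) behind the current crack tip and \(\psi_c\) is the critical angle determined from the small laboratory specimens.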

  1. HPCC Methodologies for Structural Design and Analysis on Parallel and Distributed Computing Platforms

    NASA Technical Reports Server (NTRS)

    Farhat, Charbel

    1998-01-01

    In this grant, we have proposed a three-year research effort focused on developing High Performance Computation and Communication (HPCC) methodologies for structural analysis on parallel processors and clusters of workstations, with emphasis on reducing the structural design cycle time. Besides consolidating and further improving the FETI solver technology to address plate and shell structures, we have proposed to tackle the following design related issues: (a) parallel coupling and assembly of independently designed and analyzed three-dimensional substructures with non-matching interfaces, (b) fast and smart parallel re-analysis of a given structure after it has undergone design modifications, (c) parallel evaluation of sensitivity operators (derivatives) for design optimization, and (d) fast parallel analysis of mildly nonlinear structures. While our proposal was accepted, support was provided only for one year.

  2. Systematic methodological review: developing a framework for a qualitative semi-structured interview guide.

    PubMed

    Kallio, Hanna; Pietilä, Anna-Maija; Johnson, Martin; Kangasniemi, Mari

    2016-12-01

    To produce a framework for the development of a qualitative semi-structured interview guide. Rigorous data collection procedures fundamentally influence the results of studies. The semi-structured interview is a common data collection method, but methodological research on the development of a semi-structured interview guide is sparse. Systematic methodological review. We searched PubMed, CINAHL, Scopus and Web of Science for methodological papers on semi-structured interview guides from October 2004-September 2014. Having examined 2,703 titles and abstracts and 21 full texts, we finally selected 10 papers. We analysed the data using the qualitative content analysis method. Our analysis resulted in new synthesized knowledge on the development of a semi-structured interview guide, including five phases: (1) identifying the prerequisites for using semi-structured interviews; (2) retrieving and using previous knowledge; (3) formulating the preliminary semi-structured interview guide; (4) pilot testing the guide; and (5) presenting the complete semi-structured interview guide. Rigorous development of a qualitative semi-structured interview guide contributes to the objectivity and trustworthiness of studies and makes the results more plausible. Researchers should consider using this five-step process to develop a semi-structured interview guide and justify the decisions made during it. © 2016 John Wiley & Sons Ltd.

  3. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
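
    In generic terms (a schematic restatement, not the report's notation), the statistical structure described above evaluates a failure-mode probability by integrating an engineering limit-state model over the uncertainties assigned to its parameters and modeling accuracy, after which the estimate is updated with test or flight experience:

    ```latex
    \[
      P_f \;=\; \int \mathbb{1}\bigl[g(\boldsymbol{\theta}) < 0\bigr]\,
      \pi(\boldsymbol{\theta})\,d\boldsymbol{\theta},
    \]
    ```

    where \(g\) is the engineering failure model (with \(g<0\) denoting failure), \(\boldsymbol{\theta}\) collects the uncertain analysis parameters and modeling-accuracy factors with joint distribution \(\pi\), and the integral is typically evaluated by Monte Carlo sampling.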

  4. Reliability analysis of composite structures

    NASA Technical Reports Server (NTRS)

    Kan, Han-Pin

    1992-01-01

    A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
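
    The numerical-integration step mentioned above typically reduces to the classical load-strength interference integral, written here in generic form (variable names are not taken from the report):

    ```latex
    \[
      P_f \;=\; \Pr(R < S) \;=\; \int_{0}^{\infty} F_R(s)\,f_S(s)\,ds,
      \qquad \text{reliability} \;=\; 1 - P_f,
    \]
    ```

    where \(F_R\) is the cumulative distribution of structural resistance (reflecting scatter in material strength, fabrication, and assembly) and \(f_S\) is the probability density of the applied load effect.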

  5. Probabilistic analysis of structures involving random stress-strain behavior

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Thacker, B. H.; Harren, S. V.

    1991-01-01

    The present methodology for analysis of structures with random stress strain behavior characterizes the uniaxial stress-strain curve in terms of (1) elastic modulus, (2) engineering stress at initial yield, (3) initial plastic-hardening slope, (4) engineering stress at point of ultimate load, and (5) engineering strain at point of ultimate load. The methodology is incorporated into the Numerical Evaluation of Stochastic Structures Under Stress code for probabilistic structural analysis. The illustrative problem of a thick cylinder under internal pressure, where both the internal pressure and the stress-strain curve are random, is addressed by means of the code. The response value is the cumulative distribution function of the equivalent plastic strain at the inner radius.

  6. Progression of 3D Protein Structure and Dynamics Measurements

    NASA Astrophysics Data System (ADS)

    Sato-Tomita, Ayana; Sekiguchi, Hiroshi; Sasaki, Yuji C.

    2018-06-01

    New measurement methodologies have begun to be proposed with the recent progress in the life sciences. Here, we introduce two new methodologies, X-ray fluorescence holography for protein structural analysis and diffracted X-ray tracking (DXT), to observe the dynamic behaviors of individual single molecules.

  7. Finite element analysis of container ship's cargo hold using ANSYS and POSEIDON software

    NASA Astrophysics Data System (ADS)

    Tanny, Tania Tamiz; Akter, Naznin; Amin, Osman Md.

    2017-12-01

    Ship structural analysis has become an integral part of preliminary ship design, providing further support for the development and detail design of ship structures. Structural analyses of container ships' cargo holds are carried out to balance their safety and capacity, as those ships are exposed to a high risk of structural damage during a voyage. Two different design methodologies have been considered for the structural analysis of a container ship's cargo hold: one is a rule-based methodology and the other a more conventional software-based analysis. The rule-based analysis is done with DNV-GL's software POSEIDON and the package-based analysis with the ANSYS structural module. Both methods have been applied to analyze mechanical responses of the model such as total deformation, stress-strain distribution, von Mises stress, and fatigue, following different design bases and approaches, to provide guidance for further improvements in ship structural design.

  8. Coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Huang, H.; Hartle, M.

    1992-01-01

    Accomplishments are described for the third year's effort of a 5-year program to develop a methodology for coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures. These accomplishments include: (1) structural analysis capability specialized for graded composite structures, including large deformation and deformation position eigenanalysis technologies; (2) a thermal analyzer specialized for graded composite structures; (3) absorption of electromagnetic waves by graded composite structures; and (4) coupled structural/thermal/electromagnetic analysis of graded composite structures.

  9. Coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Chen, P. C.; Dame, L. T.; Huang, H.

    1992-01-01

    Accomplishments are described for the first year's effort of a 5-year program to develop a methodology for coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures. These accomplishments include: (1) the results of the selective literature survey; (2) 8-, 16-, and 20-noded isoparametric plate and shell elements; (3) large deformation structural analysis; (4) eigenanalysis; (5) anisotropic heat transfer analysis; and (6) anisotropic electromagnetic analysis.

  10. Development of Methodologies for the Estimation of Thermal Properties Associated with Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1996-01-01

    A thermal stress analysis is an important aspect in the design of aerospace structures and vehicles such as the High Speed Civil Transport (HSCT) at the National Aeronautics and Space Administration Langley Research Center (NASA-LaRC). These structures are complex and are often composed of numerous components fabricated from a variety of different materials. The thermal loads on these structures induce temperature variations within the structure, which in turn result in the development of thermal stresses. Therefore, a thermal stress analysis requires knowledge of the temperature distributions within the structures, which consequently necessitates accurate knowledge of the thermal properties, boundary conditions, and thermal interface conditions associated with the structural materials. The goal of this proposed multi-year research effort was to develop estimation methodologies for the determination of the thermal properties and interface conditions associated with aerospace vehicles. Specific objectives focused on the development and implementation of optimal experimental design strategies and methodologies for the estimation of thermal properties associated with simple composite and honeycomb structures. The strategy used in this multi-year research effort was to first develop methodologies for relatively simple systems and then systematically modify these methodologies to analyze complex structures. This can be thought of as a building block approach. This strategy was intended to promote maximum usability of the resulting estimation procedure by NASA-LaRC researchers through the design of in-house experimentation procedures and through the use of existing general-purpose finite element software.
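
    As a toy illustration of the estimation idea (a one-parameter lumped cooling model with synthetic data; not the composite or honeycomb models, software, or experiment designs used in the actual effort), a thermal time constant can be recovered from noisy temperature measurements by least squares:

    ```python
    import numpy as np

    # Synthetic "measured" cooling data for a lumped thermal mass (all values invented)
    t = np.linspace(0.0, 600.0, 31)                   # s
    tau_true, T0, T_inf = 180.0, 80.0, 20.0           # s, degC, degC
    rng = np.random.default_rng(3)
    T_meas = T_inf + (T0 - T_inf) * np.exp(-t / tau_true) + rng.normal(0.0, 0.3, t.size)

    def model(tau):
        """Lumped-capacitance cooling curve for a candidate time constant."""
        return T_inf + (T0 - T_inf) * np.exp(-t / tau)

    # Ordinary least-squares estimate of the time constant by a simple 1-D search
    taus = np.linspace(50.0, 500.0, 4501)
    sse = [np.sum((T_meas - model(tau)) ** 2) for tau in taus]
    tau_hat = taus[int(np.argmin(sse))]
    print(f"estimated time constant: {tau_hat:.1f} s (true value {tau_true} s)")
    ```

    Optimal experimental design then amounts to choosing measurement times and boundary conditions that make such estimates as insensitive to measurement noise as possible.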

  11. State of the Art Methodology for the Design and Analysis of Future Large Scale Evaluations: A Selective Examination.

    ERIC Educational Resources Information Center

    Burstein, Leigh

    Two specific methods of analysis in large-scale evaluations are considered: structural equation modeling and selection modeling/analysis of non-equivalent control group designs. Their utility in large-scale educational program evaluation is discussed. The examination of these methodological developments indicates how people (evaluators,…

  12. Social Network Analysis: A New Methodology for Counseling Research.

    ERIC Educational Resources Information Center

    Koehly, Laura M.; Shivy, Victoria A.

    1998-01-01

    Social network analysis (SNA) uses indices of relatedness among individuals to produce representations of social structures and positions inherent in dyads or groups. SNA methods provide quantitative representations of ongoing transactional patterns in a given social environment. Methodological issues, applications and resources are discussed…

  13. Accounting for Uncertainties in Strengths of SiC MEMS Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Evans, Laura; Beheim, Glen; Trapp, Mark; Jadaan, Osama; Sharpe, William N., Jr.

    2007-01-01

    A methodology has been devised for accounting for uncertainties in the strengths of silicon carbide structural components of microelectromechanical systems (MEMS). The methodology enables prediction of the probabilistic strengths of complexly shaped MEMS parts using data from tests of simple specimens. This methodology is intended to serve as a part of a rational basis for designing SiC MEMS, supplementing methodologies that have been borrowed from the art of designing macroscopic brittle material structures. The need for this or a similar methodology arises as a consequence of the fundamental nature of MEMS and the brittle silicon-based materials of which they are typically fabricated. When tested to fracture, MEMS and structural components thereof show wide part-to-part scatter in strength. The methodology involves the use of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) software in conjunction with the ANSYS Probabilistic Design System (PDS) software to simulate or predict the strength responses of brittle material components while simultaneously accounting for the effects of variability of geometrical features on the strength responses. As such, the methodology involves the use of an extended version of the ANSYS/CARES/PDS software system described in Probabilistic Prediction of Lifetimes of Ceramic Parts (LEW-17682-1/4-1), Software Tech Briefs supplement to NASA Tech Briefs, Vol. 30, No. 9 (September 2006), page 10. The ANSYS PDS software enables the ANSYS finite-element-analysis program to account for uncertainty in the design-and-analysis process. The ANSYS PDS software accounts for uncertainty in material properties, dimensions, and loading by assigning probabilistic distributions to user-specified model parameters and performing simulations using various sampling techniques.
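
    The probabilistic strength treatment in CARES/Life-type analyses rests on Weibull statistics for brittle fracture; in its simplest two-parameter, area-scaled form (a generic statement, not the specific model options exercised in this work) the failure probability of a part under uniform stress \(\sigma\) is:

    ```latex
    \[
      P_f \;=\; 1 - \exp\!\left[-\,\frac{A}{A_0}\left(\frac{\sigma}{\sigma_0}\right)^{m}\right],
    \]
    ```

    where \(m\) is the Weibull modulus, \(\sigma_0\) the characteristic strength referred to the reference area \(A_0\), and \(A\) the stressed surface area; for non-uniform stress fields the bracketed term becomes an integral of \((\sigma(\mathbf{x})/\sigma_0)^m\) over the surface or volume.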

  14. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Astrophysics Data System (ADS)

    Wray, Richard B.

    1991-12-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.

  15. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.

    1991-01-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.

  16. Global/local methods research using the CSM testbed

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. Hayden, Jr.; Thompson, Danniella M.

    1990-01-01

    Research activities in global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  17. Analysis and methodology for aeronautical systems technology program planning

    NASA Technical Reports Server (NTRS)

    White, M. J.; Gershkoff, I.; Lamkin, S.

    1983-01-01

    A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
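
    A toy sketch of the rank-ordering computation described above (the concept names and numbers are invented; the actual program and fleet data are not reproduced in this record):

    ```python
    # Rank system concepts by benefit-to-cost ratio and report the cumulative ratio
    concepts = [
        # (name, benefit, cost) -- illustrative values only
        ("Concept A", 120.0, 40.0),
        ("Concept B",  90.0, 45.0),
        ("Concept C",  60.0, 10.0),
    ]

    ranked = sorted(concepts, key=lambda c: c[1] / c[2], reverse=True)

    cum_benefit = cum_cost = 0.0
    for name, benefit, cost in ranked:
        cum_benefit += benefit
        cum_cost += cost
        print(f"{name}: B/C = {benefit / cost:.2f}, cumulative B/C = {cum_benefit / cum_cost:.2f}")
    ```

    The preferred order of implementation is simply the sorted order, and the cumulative benefit-to-cost ratio shows how the payoff of the overall program evolves as concepts are added.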

  18. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  19. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    NASA Astrophysics Data System (ADS)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
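
    The quantitative estimate that such before/after comparisons rest on is typically an expected annual damage of the generic form below (a schematic statement; the SUFRI methodology adds societal risk components not shown here):

    ```latex
    \[
      R \;=\; \sum_{k} \Delta p_k\,C_k \;\approx\; \int_{0}^{1} C(p)\,dp,
    \]
    ```

    where \(p\) is the annual exceedance probability of a flood scenario, \(C\) its consequences, and the sum runs over the scenarios considered; the risk reduction attributed to a non-structural measure is the difference in \(R\) evaluated before and after its implementation.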

  20. Measuring Structural Gender Equality in Mexico: A State Level Analysis

    ERIC Educational Resources Information Center

    Frias, Sonia M.

    2008-01-01

    The main goal of this article is to assess the level of gender equality across the 32 Mexican states. After reviewing conceptual and methodological issues related to previous measures of structural inequality I detail the logic and methodology involved in the construction of a composite and multidimensional measure of gender equality, at the…

  1. Probabilistic design of fibre concrete structures

    NASA Astrophysics Data System (ADS)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation is now a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing, and shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, with or without prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as the shape of the tensile softening branch, high toughness, and ductility are described in the paper. Since the variability of the fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and for evaluation of structural performance, reliability, and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by failure probability or by reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty or randomness of the material properties obtained from material tests is accounted for in the random distributions. Furthermore, degradation of reinforced concrete materials, such as carbonation of concrete and corrosion of reinforcement, can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis. The presented methodology is illustrated with results from two probabilistic studies of different types of concrete structures related to practical applications and made from various materials (with the parameters obtained from real material tests).
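
    The two safety measures named above are related in the usual way (a standard relation independent of the particular software used):

    ```latex
    \[
      P_f \;=\; \Phi(-\beta) \qquad\Longleftrightarrow\qquad \beta \;=\; -\,\Phi^{-1}(P_f),
    \]
    ```

    where \(\Phi\) is the standard normal cumulative distribution function, \(P_f\) is the failure probability estimated from the randomized nonlinear analyses, and \(\beta\) is the corresponding reliability index.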

  2. Command and Control for Distributed Lethality

    DTIC Science & Technology

    2017-06-01

    A model-based systems engineering (MBSE) approach to C2 within the distributed lethality environment requires development of methodologies to provide definition and structure for existing operational concepts.

  3. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  4. Structural features of dilute acid, steam exploded, and alkali pretreated mustard stalk and their impact on enzymatic hydrolysis.

    PubMed

    Kapoor, Manali; Raj, Tirath; Vijayaraj, M; Chopra, Anju; Gupta, Ravi P; Tuli, Deepak K; Kumar, Ravindra

    2015-06-25

    To overcome the recalcitrant nature of biomass, several pretreatment methodologies have been explored to make it amenable to enzymatic hydrolysis. These methodologies alter cell wall structure primarily by removing/altering hemicelluloses and lignin. In this work, alkali, dilute acid, and steam explosion pretreatments are systematically studied for mustard stalk. To assess the structural variability after pretreatment, chemical analysis, surface area, crystallinity index, accessibility of cellulose, FT-IR, and thermal analysis are conducted. Although the extent of enzymatic hydrolysis varies with the methodology used, cellulose conversion nevertheless increases from <10% to 81% after pretreatment. Glucose yields at 2 and 72 h are well correlated with surface area and maximum adsorption capacity. However, no such relationship is observed for xylose yield. The mass balance of the process is also studied. Dilute acid pretreatment is the best methodology in terms of maximum sugar yield at lower enzyme loading. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Nonlinear Structural Analysis Methodology and Dynamics Scaling of Inflatable Parabolic Reflector Antenna Concepts

    NASA Technical Reports Server (NTRS)

    Sreekantamurthy, Tham; Gaspar, James L.; Mann, Troy; Behun, Vaughn; Pearson, James C., Jr.; Scarborough, Stephen

    2007-01-01

    Ultra-lightweight and ultra-thin membrane inflatable antenna concepts are fast evolving to become the state-of-the-art antenna concepts for deep-space applications. NASA Langley Research Center has been involved in structural dynamics research on antenna structures. One of the goals of the research is to develop structural analysis methodology for prediction of the static and dynamic response characteristics of inflatable antenna concepts. This research is focused on computational studies using nonlinear large-deformation finite element analysis to characterize the ultra-thin membrane responses of the antennas. Recently, structural analyses have been performed on a few parabolic reflector antennas of varying size and shape, referred to in the paper as the 0.3-meter subscale, 2-meter half-scale, and 4-meter full-scale antennas. The aspects studied included nonlinear analysis methodology and solution techniques, ways to speed convergence in iterative methods, the sensitivities of responses with respect to structural loads such as inflation pressure, gravity, and pretension loads in ground and in-space conditions, and the ultra-thin membrane wrinkling characteristics. These intrinsic aspects have provided valuable insight into the evaluation of the structural characteristics of such antennas. While analyzing these structural characteristics, a quick study was also made to assess the applicability of dynamics scaling of the half-scale antenna. This paper presents the details of the nonlinear structural analysis results and discusses the insight gained from the studies on the various intrinsic aspects of the analysis methodology. The predicted reflector surface characteristics of the three inflatable ultra-thin membrane parabolic reflector antenna concepts are presented as easily observable displacement fringe patterns with associated maximum values, and as normal mode shapes with associated frequencies. Wrinkling patterns are presented to show how surface wrinkles progress with increasing tension loads. Antenna reflector surface accuracies were found to be strongly dependent on the type and size of the antenna, the reflector surface curvature, the reflector membrane supports in terms of catenary spacing, and the amount of applied load.

  6. Global-local methodologies and their application to nonlinear analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1989-01-01

    An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.

  7. What Does Global Migration Network Say about Recent Changes in the World System Structure?

    ERIC Educational Resources Information Center

    Zinkina, Julia; Korotayev, Andrey

    2014-01-01

    Purpose: The aim of this paper is to investigate whether the structure of the international migration system has remained stable through the recent turbulent changes in the world system. Design/methodology/approach: The methodology draws on the social network analysis framework--but with some noteworthy limitations stipulated by the specifics of…

  8. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
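
    The mechanical equivalence exploited here can be summarized, in generic discretized form, by the formally identical linear systems solved in each discipline (a schematic statement, not a description of NESSUS internals):

    ```latex
    \[
      \underbrace{\mathbf{K}\,\mathbf{u} = \mathbf{f}}_{\text{structures}} \qquad
      \underbrace{\mathbf{C}\,\mathbf{T} = \mathbf{q}}_{\text{heat transfer}} \qquad
      \underbrace{\mathbf{G}\,\mathbf{v} = \mathbf{i}}_{\text{electrical circuits}}
    \]
    ```

    where the stiffness, conductance, and admittance matrices map displacements, temperatures, and nodal voltages to forces, heat flows, and currents, respectively, so probabilistic routines written around one system can operate on the analogous quantities of another.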

  9. Multi-Disciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song

    1997-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  10. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  11. Participatory Development and Analysis of a Fuzzy Cognitive Map of the Establishment of a Bio-Based Economy in the Humber Region

    PubMed Central

    Penn, Alexandra S.; Knight, Christopher J. K.; Lloyd, David J. B.; Avitabile, Daniele; Kok, Kasper; Schiller, Frank; Woodward, Amy; Druckman, Angela; Basson, Lauren

    2013-01-01

    Fuzzy Cognitive Mapping (FCM) is a widely used participatory modelling methodology in which stakeholders collaboratively develop a ‘cognitive map’ (a weighted, directed graph), representing the perceived causal structure of their system. This can be directly transformed by a workshop facilitator into simple mathematical models to be interrogated by participants by the end of the session. Such simple models provide thinking tools which can be used for discussion and exploration of complex issues, as well as sense checking the implications of suggested causal links. They increase stakeholder motivation and understanding of whole systems approaches, but cannot be separated from an intersubjective participatory context. Standard FCM methodologies make simplifying assumptions, which may strongly influence results, presenting particular challenges and opportunities. We report on a participatory process, involving local companies and organisations, focussing on the development of a bio-based economy in the Humber region. The initial cognitive map generated consisted of factors considered key for the development of the regional bio-based economy and their directional, weighted, causal interconnections. A verification and scenario generation procedure, to check the structure of the map and suggest modifications, was carried out with a second session. Participants agreed on updates to the original map and described two alternate potential causal structures. In a novel analysis all map structures were tested using two standard methodologies usually used independently: linear and sigmoidal FCMs, demonstrating some significantly different results alongside some broad similarities. We suggest a development of FCM methodology involving a sensitivity analysis with different mappings and discuss the use of this technique in the context of our case study. Using the results and analysis of our process, we discuss the limitations and benefits of the FCM methodology in this case and in general. We conclude by proposing an extended FCM methodology, including multiple functional mappings within one participant-constructed graph. PMID:24244303
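
    The sketch below illustrates the two standard FCM update rules the analysis compares, a linear (clipped) and a sigmoidal squashing of the weighted state update; the three-concept map, weights, and squashing parameters are hypothetical and do not reproduce the Humber case-study map.

```python
import numpy as np

def run_fcm(W, x0, squash, steps=30):
    """Iterate an FCM state vector x under weight matrix W (W[i, j]: concept i -> j)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = squash(x + x @ W)   # each concept keeps its value plus its weighted inputs
    return x

linear  = lambda x: np.clip(x, 0.0, 1.0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-5.0 * (x - 0.5)))

# Hypothetical 3-concept map: feedstock supply -> biorefinery activity -> regional jobs.
W = np.array([[0.0, 0.7, 0.0],
              [0.0, 0.0, 0.6],
              [0.0, 0.2, 0.0]])
x0 = [0.8, 0.3, 0.1]

print("linear mapping:   ", run_fcm(W, x0, linear))
print("sigmoidal mapping:", run_fcm(W, x0, sigmoid))
```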

  12. Pushover Analysis Methodologies: A Tool For Limited Damage Based Design Of Structure For Seismic Vibration

    NASA Astrophysics Data System (ADS)

    Dutta, Sekhar Chandra; Chakroborty, Suvonkar; Raychaudhuri, Anusrita

    Vibration transmitted to a structure during an earthquake may vary in magnitude over a wide range. The design methodology should therefore enumerate steps so that structures are able to survive even severe ground motion. For economic reasons, however, strength can be provided to the structure in such a way that it remains in the elastic range under low to moderate earthquakes and is allowed to undergo inelastic deformation in a severe earthquake without collapse. To implement this design philosophy, a rigorous nonlinear dynamic analysis needs to be performed to estimate the inelastic demands; such an analysis is time consuming and requires expertise to judge its results. In this context, the present paper discusses and demonstrates an alternative simple method, known as the pushover method, which can be easily used by practicing engineers, bypasses the intricate nonlinear dynamic analysis, and can be thought of as a substitute for it. This method is still under development and is becoming increasingly popular for its simplicity. The objective of this paper is to emphasize and demonstrate the basic concept, strength, and ease of use of this state-of-the-art methodology for regular use in design offices in performance-based seismic design of structures.
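
    A toy pushover (capacity) curve for an idealized elastic-perfectly-plastic single-degree-of-freedom model is sketched below to show the basic base-shear versus roof-displacement bookkeeping; the stiffness and yield values are illustrative only.

```python
import numpy as np

def pushover_curve(k_elastic, yield_force, displacements):
    """Base shear vs. roof displacement for an elastic-perfectly-plastic idealization."""
    return np.minimum(k_elastic * np.asarray(displacements), yield_force)

# Hypothetical single-degree-of-freedom idealization of a frame (kN/m, kN).
disp = np.linspace(0.0, 0.30, 7)                 # m
shear = pushover_curve(k_elastic=2.0e4, yield_force=3.0e3, displacements=disp)
for d, v in zip(disp, shear):
    print(f"roof displacement {d:5.2f} m -> base shear {v:8.1f} kN")
```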

  13. Development of a probabilistic analysis methodology for structural reliability estimation

    NASA Technical Reports Server (NTRS)

    Torng, T. Y.; Wu, Y.-T.

    1991-01-01

    The probabilistic analysis method presented for assessing structural reliability combines fast convolution with an efficient structural reliability analysis. After identifying the most important point of a limit state, the method establishes a quadratic performance function, transforms the quadratic function into a linear one, and applies fast convolution. The method is applicable to problems requiring computer-intensive structural analysis. Five illustrative examples of the method's application are given.
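
    Once the quadratic performance function has been replaced by a linear one in independent normal variables, the failure probability follows from the reliability index; the sketch below shows that final step for an assumed strength-minus-load limit state (the fast-convolution machinery itself is not reproduced).

```python
import numpy as np
from scipy.stats import norm

def linearized_pf(grad, sigma, g_at_mu):
    """Failure probability of a linearized limit state g(x) ~ g(mu) + grad.(x - mu)
    with independent normal variables x_i ~ N(mu_i, sigma_i)."""
    std_g = np.sqrt(np.sum((np.asarray(grad) * np.asarray(sigma)) ** 2))
    beta = g_at_mu / std_g          # reliability index
    return norm.cdf(-beta), beta

# Illustrative limit state: g = R - S (strength minus load), already linear.
pf, beta = linearized_pf(grad=[1.0, -1.0], sigma=[40.0, 45.0], g_at_mu=450.0 - 300.0)
print(f"beta = {beta:.2f}, Pf = {pf:.2e}")
```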

  14. Documentation of indigenous Pacific agroforestry systems: a review of methodologies

    Treesearch

    Bill Raynor

    1993-01-01

    Recent interest in indigenous agroforestry has led to a need for documentation of these systems. However, previous work is very limited, and few methodologies are well-known or widely accepted. This paper outlines various methodologies (including sampling methods, data to be collected, and considerations in analysis) for documenting structure and productivity of...

  15. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama

    2003-01-01

    This effort is to investigate probabilistic life prediction methodologies for ceramic matrix composites (CMCs) and MicroElectroMechanical Systems (MEMS) and to analyze designs that determine stochastic properties of MEMS. For CMCs, this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of the stochastic strength response predicted by the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and, if feasible, to run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
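
    The Weibull size effect mentioned above can be stated compactly: for a two-parameter Weibull strength distribution the failure probability scales with the stressed volume (or area), so the characteristic strength of a larger specimen is lower. The sketch below uses illustrative modulus and scale values, not MEMS or CMC data.

```python
import numpy as np

def weibull_pf(sigma, sigma0, m, volume, v0=1.0):
    """Probability of failure of a uniformly stressed volume (2-parameter Weibull)."""
    return 1.0 - np.exp(-(volume / v0) * (sigma / sigma0) ** m)

def scaled_characteristic_strength(sigma0, m, v1, v2):
    """Characteristic strength of volume v2 given that of reference volume v1."""
    return sigma0 * (v1 / v2) ** (1.0 / m)

m, sigma0 = 10.0, 1200.0          # illustrative Weibull modulus and scale (MPa)
print(weibull_pf(sigma=900.0, sigma0=sigma0, m=m, volume=1.0))
# A specimen with 10x the stressed volume is statistically weaker:
print(scaled_characteristic_strength(sigma0, m, v1=1.0, v2=10.0))
```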

  16. Formal Transformations from Graphically-Based Object-Oriented Representations to Theory-Based Specifications

    DTIC Science & Technology

    1996-06-01

    for Software Synthesis." KBSE '93. IEEE, 1993. 51. Kang, Kyo C., et al. Feature-Oriented Domain Analysis (FODA) Feasibility Study. Technical Report... and usefulness in domain analysis and modeling. Rumbaugh uses three distinct views to describe a domain: (1) the object model describes structural... Gibbons describe a methodology where Structured Analysis is used to build a hierarchical system structure chart. This structure chart is then translated

  17. A methodology for creating greenways through multidisciplinary sustainable landscape planning.

    PubMed

    Pena, Selma Beatriz; Abreu, Maria Manuela; Teles, Rui; Espírito-Santo, Maria Dalila

    2010-01-01

    This research proposes a methodology for defining greenways via sustainable planning. This approach includes the analysis and discussion of cultural and natural processes that occur in the landscape. The proposed methodology is structured in three phases: eco-cultural analysis; synthesis and diagnosis; and proposal. An interdisciplinary approach provides an assessment of the relationships between landscape structure and landscape dynamics, which are essential to any landscape management or land use. The landscape eco-cultural analysis provides a biophysical, dynamic (geomorphologic rate), vegetation (habitats from Directive 92/43/EEC) and cultural characterisation. The knowledge obtained by this analysis then supports the definition of priority actions to stabilise the landscape and the management measures for the habitats. After the analysis and diagnosis phases, a proposal for the development of sustainable greenways can be achieved. This methodology was applied to a study area of the Azambuja Municipality in the Lisbon Metropolitan Area (Portugal). The application of the proposed methodology to the study area shows that landscape stability is crucial for greenway users to appreciate the landscape and its natural and cultural elements in a sustainable and healthy way, whether by bicycle or on foot. A balanced landscape will increase the value of greenways and, in return, greenways can support socio-economic activities that benefit rural communities.

  18. Development of a structured approach for decomposition of complex systems on a functional basis

    NASA Astrophysics Data System (ADS)

    Yildirim, Unal; Felician Campean, I.

    2014-07-01

    The purpose of this paper is to present the System State Flow Diagram (SSFD) as a structured and coherent methodology for decomposing a complex system on a solution-independent functional basis. The paper starts by reviewing common function modelling frameworks in the literature and discusses the practical requirements of the SSFD in the context of current approaches in industry. The proposed methodology is illustrated through the analysis of a case study: design analysis of a generic Bread Toasting System (BTS).

  19. Systematic analysis of EOS data system for operations

    NASA Technical Reports Server (NTRS)

    Moe, K. L.; Dasgupta, R.

    1985-01-01

    A data management analysis methodology is being proposed. The objective of the methodology is to assist mission managers by identifying a series of ordered activities to be systematically followed in order to arrive at an effective ground system design. Existing system engineering tools and concepts have been assembled into a structured framework to facilitate the work of a mission planner. It is intended that this methodology can be gainfully applied (with probable modifications and/or changes) to the EOS payloads and their associated data systems.

  20. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly and seamlessly from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation but the original specification, and perhaps the high-level design, is not object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the object-oriented real-time systems analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
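
    A minimal sketch of the second approach, an 'analysis object' whose time behavior is a set of states and state-transition rules, is given below; the valve object, states, and events are invented for illustration and are not from the paper.

```python
# Minimal sketch of an "analysis object" defined by states and transition rules.
# The object, states, and events are hypothetical illustrations only.
class AnalysisObject:
    def __init__(self, name, initial_state, transitions):
        self.name = name
        self.state = initial_state
        # transitions: {(state, event): next_state}
        self.transitions = transitions

    def handle(self, event):
        """Apply a state-transition rule; ignore events with no rule defined."""
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state

valve = AnalysisObject(
    "coolant_valve",
    initial_state="closed",
    transitions={("closed", "open_cmd"): "opening",
                 ("opening", "limit_switch"): "open",
                 ("open", "close_cmd"): "closed"},
)

for ev in ["open_cmd", "limit_switch", "close_cmd"]:
    print(ev, "->", valve.handle(ev))
```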

  1. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  2. Finite element analysis of steady and transiently moving/rolling nonlinear viscoelastic structure. III - Impact/contact simulations

    NASA Technical Reports Server (NTRS)

    Nakajima, Yukio; Padovan, Joe

    1987-01-01

    In a three-part series of papers, a generalized finite element methodology is formulated to handle traveling load problems involving large deformation fields in structures composed of viscoelastic media. The main thrust of this paper is to develop an overall finite element methodology and associated solution algorithms to handle the transient aspects of moving problems involving contact-impact type loading fields. Based on the methodology and algorithms formulated, several numerical experiments are considered. These include the rolling/sliding impact of tires with road obstructions.

  3. Polyphony: superposition independent methods for ensemble-based drug discovery.

    PubMed

    Pitt, William R; Montalvão, Rinaldo W; Blundell, Tom L

    2014-09-30

    Structure-based drug design is an iterative process, following cycles of structural biology, computer-aided design, synthetic chemistry and bioassay. In favorable circumstances, this process can lead to hundreds of protein-ligand crystal structures. In addition, molecular dynamics simulations are increasingly being used to further explore the conformational landscape of these complexes. Currently, methods capable of the analysis of ensembles of crystal structures and MD trajectories are limited and usually rely upon least-squares superposition of coordinates. Novel methodologies are described for the analysis of multiple structures of a protein. Statistical approaches that rely upon residue equivalence, but not superposition, are developed. Tasks that can be performed include the identification of hinge regions, allosteric conformational changes and transient binding sites. The approaches are tested on crystal structures of CDK2 and other CMGC protein kinases and a simulation of p38α. Known relationships between interactions and conformational changes are highlighted, and new ones are revealed. A transient but druggable allosteric pocket in CDK2 is predicted to occur under the CMGC insert. Furthermore, an evolutionarily conserved conformational link from the location of this pocket, via the αEF-αF loop, to phosphorylation sites on the activation loop is discovered. New methodologies are described and validated for the superposition-independent conformational analysis of large collections of structures or simulation snapshots of the same protein. The methodologies are encoded in a Python package called Polyphony, which is released as open source to accompany this paper [http://wrpitt.bitbucket.org/polyphony/].
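
    One superposition-free way to expose flexible or hinge-like regions, in the spirit of the residue-equivalence statistics described above, is to examine the variability of internal Cα-Cα distances across the ensemble; the sketch below does this on random stand-in coordinates and is not the Polyphony implementation.

```python
import numpy as np

def residue_distance_variability(ensemble):
    """ensemble: array (n_models, n_residues, 3) of equivalent CA coordinates.
    Returns, per residue, the mean standard deviation of its distances to all
    other residues across the ensemble (no superposition required)."""
    diffs = ensemble[:, :, None, :] - ensemble[:, None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)   # (n_models, n_res, n_res)
    per_pair_std = dists.std(axis=0)         # variability of each residue pair
    return per_pair_std.mean(axis=1)         # average over partners

rng = np.random.default_rng(1)
fake_ensemble = rng.normal(size=(20, 50, 3))   # stand-in for 20 structures, 50 residues
variability = residue_distance_variability(fake_ensemble)
print("most variable residues:", np.argsort(variability)[-5:])
```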

  4. Damage detection methodology under variable load conditions based on strain field pattern recognition using FBGs, nonlinear principal component analysis, and clustering techniques

    NASA Astrophysics Data System (ADS)

    Sierra-Pérez, Julián; Torres-Arredondo, M.-A.; Alvarez-Montoya, Joham

    2018-01-01

    Structural health monitoring consists of using sensors integrated within structures together with algorithms to perform load monitoring, damage detection, damage location, damage size and severity, and prognosis. One possibility is to use strain sensors to infer structural integrity by comparing patterns in the strain field between the pristine and damaged conditions. In previous works, the authors have demonstrated that it is possible to detect small defects based on strain field pattern recognition by using robust machine learning techniques. They have focused on methodologies based on principal component analysis (PCA) and on the development of several unfolding and standardization techniques, which allow dealing with multiple load conditions. However, before a real implementation of this approach in engineering structures, changes in the strain field due to conditions different from damage occurrence need to be isolated. Since load conditions may vary in most engineering structures and promote significant changes in the strain field, it is necessary to implement novel techniques for uncoupling such changes from those produced by damage occurrence. A damage detection methodology based on optimal baseline selection (OBS) by means of clustering techniques is presented. The methodology includes the use of hierarchical nonlinear PCA as a nonlinear modeling technique in conjunction with Q and nonlinear T2 damage indices. The methodology is experimentally validated using strain measurements obtained by 32 fiber Bragg grating sensors bonded to an aluminum beam under dynamic bending loads and simultaneously submitted to variations in its pitch angle. The results demonstrated the capability of the methodology for clustering data according to 13 different load conditions (pitch angles), performing the OBS, and detecting six different damage states induced cumulatively. The proposed methodology showed a true positive rate of 100% and a false positive rate of 1.28% at a 99% confidence level.
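
    For reference, the linear-PCA versions of the Q (squared prediction error) and T2 indices mentioned above can be written in a few lines; the sketch below uses random stand-in strain snapshots and does not reproduce the hierarchical nonlinear PCA or the optimal baseline selection step.

```python
import numpy as np

def fit_pca(X, n_comp):
    """PCA via SVD on mean-centered data; returns mean, loadings, variances."""
    mean = X.mean(axis=0)
    Xc = X - mean
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_comp].T                            # loadings (features x components)
    lam = (s[:n_comp] ** 2) / (X.shape[0] - 1)   # component variances
    return mean, P, lam

def q_and_t2(x, mean, P, lam):
    """Q (residual) and T2 (in-model) damage indices for one strain snapshot."""
    xc = x - mean
    t = P.T @ xc                                 # scores
    residual = xc - P @ t
    return residual @ residual, np.sum(t ** 2 / lam)

rng = np.random.default_rng(2)
baseline = rng.normal(size=(200, 32))            # 200 pristine snapshots, 32 sensors
mean, P, lam = fit_pca(baseline, n_comp=5)

healthy = rng.normal(size=32)
damaged = healthy + 0.8                          # crude stand-in for a strain-field change
print("healthy Q, T2:", q_and_t2(healthy, mean, P, lam))
print("damaged Q, T2:", q_and_t2(damaged, mean, P, lam))
```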

  5. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  6. Analytical Methodology for Predicting the Onset of Widespread Fatigue Damage in Fuselage Structure

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Newman, James C., Jr.; Piascik, Robert S.; Starnes, James H., Jr.

    1996-01-01

    NASA has developed a comprehensive analytical methodology for predicting the onset of widespread fatigue damage in fuselage structure. The determination of the number of flights and operational hours of aircraft service life that are related to the onset of widespread fatigue damage includes analyses for crack initiation, fatigue crack growth, and residual strength. Therefore, the computational capability required to predict analytically the onset of widespread fatigue damage must be able to represent a wide range of crack sizes from the material (microscale) level to the global structural-scale level. NASA studies indicate that the fatigue crack behavior in aircraft structure can be represented conveniently by the following three analysis scales: small three-dimensional cracks at the microscale level, through-the-thickness two-dimensional cracks at the local structural level, and long cracks at the global structural level. The computational requirements for each of these three analysis scales are described in this paper.
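
    As a pointer to the middle (local structural) analysis scale, the sketch below integrates a simple Paris-law crack-growth relation for a center through-crack; the constants and geometry factor are illustrative and do not represent the small-crack and plasticity-induced closure models used in the NASA methodology.

```python
import numpy as np

def cycles_to_grow(a0, a_crit, delta_stress, C, m, da=1e-5):
    """Integrate a simple Paris law da/dN = C*(dK)^m for a center through-crack,
    with dK = delta_stress*sqrt(pi*a). Returns cycles to grow from a0 to a_crit."""
    a = np.arange(a0, a_crit, da)
    dK = delta_stress * np.sqrt(np.pi * a)             # MPa*sqrt(m)
    dN_da = 1.0 / (C * dK**m)                          # cycles per meter of crack growth
    return 0.5 * np.sum(dN_da[:-1] + dN_da[1:]) * da   # trapezoidal integration

# Illustrative constants (roughly aluminum-alloy-like) and a 1 mm initial crack.
C, m = 1.0e-11, 3.0
N = cycles_to_grow(a0=0.001, a_crit=0.025, delta_stress=90.0, C=C, m=m)
print(f"predicted crack-growth life: {N:,.0f} cycles")
```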

  7. Coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures

    NASA Technical Reports Server (NTRS)

    Hartle, M. S.; Mcknight, R. L.; Huang, H.; Holt, R.

    1992-01-01

    Described here are the accomplishments of a 5-year program to develop a methodology for coupled structural, thermal, electromagnetic analysis tailoring of graded component structures. The capabilities developed over the course of the program are the analyzer module and the tailoring module for the modeling of graded materials. Highlighted accomplishments for the past year include the addition of a buckling analysis capability, the addition of mode shape slope calculation for flutter analysis, verification of the analysis modules using simulated components, and verification of the tailoring module.

  8. Crossing trend analysis methodology and application for Turkish rainfall records

    NASA Astrophysics Data System (ADS)

    Şen, Zekâi

    2018-01-01

    Trend analyses are necessary tools for detecting a possible general increase or decrease in a given time series. There are many versions of trend identification methodologies, such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, regression line, and Şen's innovative trend analysis. The literature has many papers about the use, pros and cons, and comparisons of these methodologies. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend through the centroid of the given time series should have the maximum number of crossings (total number of up-crossings or down-crossings). This approach is applicable whether the time series has a dependent or independent structure, and it does not depend on the type of the probability distribution function. The validity of this method is demonstrated through an extensive Monte Carlo simulation study and comparison with other existing trend identification methodologies. The methodology is applied to a set of annual daily extreme rainfall time series from different parts of Turkey, which have a physically independent structure.
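
    A compact sketch of the crossing idea is given below: candidate trend lines are rotated about the series centroid and the slope whose de-trended residuals cross zero most often is selected; the synthetic data and slope grid are illustrative only.

```python
import numpy as np

def count_crossings(residuals):
    """Number of sign changes (up- plus down-crossings) in a residual series."""
    signs = np.sign(residuals)
    signs[signs == 0] = 1
    return int(np.sum(signs[1:] != signs[:-1]))

def crossing_trend_slope(t, x, slopes):
    """Pick the trend through the centroid that maximizes crossings of the data."""
    t_bar, x_bar = t.mean(), x.mean()
    return max(slopes, key=lambda s: count_crossings(x - (x_bar + s * (t - t_bar))))

rng = np.random.default_rng(3)
t = np.arange(60, dtype=float)                        # e.g. years of record
x = 0.4 * t + rng.normal(scale=3.0, size=t.size)      # synthetic annual extreme rainfall
candidates = np.linspace(-1.0, 1.0, 201)
print("slope with maximum crossings:", crossing_trend_slope(t, x, candidates))
```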

  9. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  10. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  11. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by the Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.

  12. Coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Huang, H.; Hartle, M.

    1992-01-01

    Accomplishments are described for the fourth year's effort of a 5-year program to develop a methodology for coupled structural/thermal/electromagnetic analysis/tailoring of graded component structures. These accomplishments include: (1) demonstration of coupled solution capability; (2) alternate CSTEM electromagnetic technology; (3) CSTEM acoustic capability; (4) CSTEM tailoring; (5) CSTEM composite micromechanics using ICAN; and (6) multiple layer elements in CSTEM.

  13. Methodological support for the further abstraction of and philosophical examination of empirical findings in the context of caring science.

    PubMed

    Lindberg, Elisabeth; Österberg, Sofia A; Hörberg, Ulrica

    2016-01-01

    Phenomena in caring science are often complex and laden with meanings. Empirical research with the aim of capturing lived experiences is one way of revealing the complexity. Sometimes, however, results from empirical research need to be further discussed. One way is to further abstract the result and/or philosophically examine it. This has previously been performed and presented in scientific journals and doctoral theses, contributing to a greater understanding of phenomena in caring science. Although the intentions in many of these publications are laudable, the lack of methodological descriptions as well as a theoretical and systematic foundation can contribute to an ambiguity concerning how the results have emerged during the analysis. The aim of this paper is to describe the methodological support for the further abstraction of and/or philosophical examination of empirical findings. When trying to systematize the support procedures, we have used a reflective lifeworld research (RLR) approach. Based on the assumptions in RLR, this article will present methodological support for a theoretical examination that can include two stages. In the first stage, data from several (two or more) empirical results on an essential level are synthesized into a general structure. Sometimes the analysis ends with the general structure, but sometimes there is a need to proceed further. The second stage can then be a philosophical examination, in which the general structure is discussed in relation to a philosophical text, theory, or concept. It is important that the theories are brought in as the final stage after the completion of the analysis. Core dimensions of the described methodological support are, in accordance with RLR, openness, bridling, and reflection. The methodological support cannot be understood as fixed stages, but rather as a guiding light in the search for further meanings.

  14. Methodological Issues in the Study of Air Force Organizational Structures,

    DTIC Science & Technology

    MOTIVATION, MORALE, PERFORMANCE (HUMAN), LEADERSHIP, SKILLS, MANAGEMENT PLANNING AND CONTROL, MODEL THEORY, SYMPOSIA, ... RESOURCE MANAGEMENT, *HUMAN RESOURCES, *MANPOWER UTILIZATION, *JOB ANALYSIS, *ORGANIZATIONS, STRUCTURES, PERSONNEL MANAGEMENT, DECISION MAKING

  15. New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks

    NASA Astrophysics Data System (ADS)

    Kurtz, Nolan Scot

    The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazard, e.g. seismicity, storm surge, hurricane winds, etc., but also how they participate on a network level using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is most important to the people that depend on such infrastructure. Several network analyses, e.g. S-RDA, LP-bounds, and crude-MCS, and performance metrics, e.g. disconnection bounds and component importance, are available for such purposes. Since these networks already exist, their state over time is also important. If networks are close to chloride sources, deterioration may be a major issue. Information from field inspections may also have large impacts on quantitative models. To address such issues, hazard risk analysis methodologies for deteriorating networks subjected to seismicity, i.e. earthquakes, have been created from analytics. A bridge component model has been constructed for these methodologies. The bridge fragilities, which were constructed from data, required a deeper level of analysis as these were relevant for specific structures. Furthermore, chloride-induced deterioration network effects were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems. Additionally, such a method handles many kinds of system and component problems with single or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to the network size. Special network topologies may be more or less computationally difficult, while the resolution of the network also has large effects. To take advantage of some types of topologies, network hierarchical structures with super-link representation have been used in the literature to increase the computational efficiency by analyzing smaller, densely connected networks; however, such structures were based on user input and were at times subjective. To address this, algorithms must be automated and reliable. These hierarchical structures may indicate the structure of the network itself. This risk analysis methodology has been expanded to larger networks using such automated hierarchical structures. Component importance is the most important objective from such network analysis; however, this may only provide the information of which bridges to inspect/repair earliest and little else. High correlations influence such component importance measures in a negative manner. Additionally, a regional approach is not appropriately modelled. To investigate a more regional view, group importance measures based on hierarchical structures have been created. Such structures may also be used to create regional inspection/repair approaches. Using these analytical, quantitative risk approaches, the next generation of decision makers may make both component-based and regional-based optimal decisions using information from both network function and further effects of infrastructure deterioration.
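
    As one small piece of the above, crude Monte Carlo simulation of two-terminal connectivity for a network with independent component failure probabilities can be sketched as below; the topology and fragility values are invented, and none of the dissertation's S-RDA, LP-bounds, or hierarchical machinery is reproduced.

```python
import numpy as np
from collections import deque

def connected(nodes, edges, up, source, sink):
    """Breadth-first search over surviving edges only."""
    adj = {n: [] for n in nodes}
    for (a, b), alive in zip(edges, up):
        if alive:
            adj[a].append(b)
            adj[b].append(a)
    seen, queue = {source}, deque([source])
    while queue:
        n = queue.popleft()
        if n == sink:
            return True
        for m in adj[n]:
            if m not in seen:
                seen.add(m)
                queue.append(m)
    return False

# Hypothetical 5-node lifeline network; p_fail per edge reflects seismic fragility.
nodes = range(5)
edges  = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (3, 4), (2, 4)]
p_fail = np.array([0.05, 0.10, 0.02, 0.20, 0.08, 0.15, 0.10])

rng = np.random.default_rng(4)
n_samples = 50_000
fails = 0
for _ in range(n_samples):
    up = rng.random(p_fail.size) > p_fail        # sample which edges survive
    fails += not connected(nodes, edges, up, source=0, sink=4)
print(f"estimated disconnection probability: {fails / n_samples:.4f}")
```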

  16. Optical analysis of thermal induced structural distortions

    NASA Technical Reports Server (NTRS)

    Weinswig, Shepard; Hookman, Robert A.

    1991-01-01

    The techniques used for the analysis of thermally induced structural distortions of optical components such as scanning mirrors and telescope optics are outlined. Particular attention is given to the methodology used in the thermal and structural analysis of the GOES scan mirror, the optical analysis using Zernike coefficients, and the optical system performance evaluation. It is pointed out that the use of Zernike coefficients allows an accurate, effective, and simple linkage between thermal/mechanical effects and the optical design.
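
    A hedged sketch of the Zernike step is shown below: a synthetic distorted surface is fit with a few low-order Zernike terms by least squares; the basis ordering, normalization, and coefficient values are illustrative choices rather than those used for the GOES scan mirror.

```python
import numpy as np

def zernike_basis(rho, theta):
    """A few low-order Zernike terms (piston, tip, tilt, defocus, astigmatism)."""
    return np.column_stack([
        np.ones_like(rho),                       # piston
        rho * np.cos(theta),                     # tip
        rho * np.sin(theta),                     # tilt
        2.0 * rho**2 - 1.0,                      # defocus
        rho**2 * np.cos(2 * theta),              # astigmatism 0/90
        rho**2 * np.sin(2 * theta),              # astigmatism 45
    ])

# Sample points on the unit-radius mirror aperture.
rng = np.random.default_rng(6)
rho = np.sqrt(rng.uniform(0, 1, 2000))           # uniform over the disk
theta = rng.uniform(0, 2 * np.pi, 2000)

# Synthetic "thermal distortion": mostly defocus plus a little astigmatism and noise.
Z = zernike_basis(rho, theta)
true_coeffs = np.array([0.0, 0.0, 0.0, 0.15, 0.04, 0.0])   # illustrative amplitudes
surface = Z @ true_coeffs + rng.normal(scale=0.005, size=rho.size)

fit, *_ = np.linalg.lstsq(Z, surface, rcond=None)
print("recovered Zernike coefficients:", np.round(fit, 3))
```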

  17. Structural health monitoring apparatus and methodology

    NASA Technical Reports Server (NTRS)

    Giurgiutiu, Victor (Inventor); Yu, Lingyu (Inventor); Bottai, Giola Santoni (Inventor)

    2011-01-01

    Disclosed is an apparatus and methodology for structural health monitoring (SHM) in which smart devices interrogate structural components to predict failure, expedite needed repairs, and thus increase the useful life of those components. Piezoelectric wafer active sensors (PWAS) are applied to or integrated with structural components and various data collected there from provide the ability to detect and locate cracking, corrosion, and disbanding through use of pitch-catch, pulse-echo, electro/mechanical impedance, and phased array technology. Stand alone hardware and an associated software program are provided that allow selection of multiple types of SHM investigations as well as multiple types of data analysis to perform a wholesome investigation of a structure.

  18. Discrete crack growth analysis methodology for through cracks in pressurized fuselage structures

    NASA Technical Reports Server (NTRS)

    Potyondy, David O.; Wawrzynek, Paul A.; Ingraffea, Anthony R.

    1994-01-01

    A methodology for simulating the growth of long through cracks in the skin of pressurized aircraft fuselage structures is described. Crack trajectories are allowed to be arbitrary and are computed as part of the simulation. The interaction between the mechanical loads acting on the superstructure and the local structural response near the crack tips is accounted for by employing a hierarchical modeling strategy. The structural response for each cracked configuration is obtained using a geometrically nonlinear shell finite element analysis procedure. Four stress intensity factors, two for membrane behavior and two for bending using Kirchhoff plate theory, are computed using an extension of the modified crack closure integral method. Crack trajectories are determined by applying the maximum tangential stress criterion. Crack growth results in localized mesh deletion, and the deletion regions are remeshed automatically using a newly developed all-quadrilateral meshing algorithm. The effectiveness of the methodology and its applicability to performing practical analyses of realistic structures is demonstrated by simulating curvilinear crack growth in a fuselage panel that is representative of a typical narrow-body aircraft. The predicted crack trajectory and fatigue life compare well with measurements of these same quantities from a full-scale pressurized panel test.
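
    The maximum tangential stress criterion used for the trajectory prediction has a closed-form kink angle in terms of the mixed-mode stress intensity factors; a short sketch is given below, with illustrative KII/KI ratios.

```python
import numpy as np

def mts_kink_angle(k1, k2):
    """Crack kink angle (radians) from the maximum tangential stress criterion.

    Solves K_I*sin(theta) + K_II*(3*cos(theta) - 1) = 0 for the branch that
    reduces to theta = 0 under pure mode I.
    """
    if k2 == 0.0:
        return 0.0
    r = k2 / k1
    return 2.0 * np.arctan((1.0 - np.sqrt(1.0 + 8.0 * r * r)) / (4.0 * r))

# Pure mode I, then increasing mode mixity (units of K cancel in the ratio).
for k2 in [0.0, 0.2, 0.5, 1.0]:
    angle = np.degrees(mts_kink_angle(1.0, k2))
    print(f"KII/KI = {k2:.1f} -> kink angle = {angle:6.1f} deg")
```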

  19. Quantitative Analysis Of Three-dimensional Branching Systems From X-ray Computed Microtomography Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKinney, Adriana L.; Varga, Tamas

    Branching structures such as lungs, blood vessels and plant roots play a critical role in life. Growth, structure, and function of these branching structures have an immense effect on our lives. Therefore, quantitative size information on such structures in their native environment is invaluable for studying their growth and the effect of the environment on them. X-ray computed tomography (XCT) has been an effective tool for in situ imaging and analysis of branching structures. We developed a costless tool that approximates the surface and volume of branching structures. Our methodology of noninvasive imaging, segmentation and extraction of quantitative information is demonstrated through the analysis of a plant root in its soil medium from 3D tomography data. XCT data collected on a grass specimen was used to visualize its root structure. A suite of open-source software was employed to segment the root from the soil and determine its isosurface, which was used to calculate its volume and surface. This methodology of processing 3D data is applicable to other branching structures, even when the structure of interest is of similar x-ray attenuation to its environment and difficulties arise with sample segmentation.
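
    A simplified stand-in for the segmentation and measurement step is sketched below: threshold segmentation with voxel counting for volume and exposed-face counting as a rough surface estimate. The synthetic "root in soil" volume is invented, and an isosurface-based approach such as the one used in the paper would give a smoother surface estimate.

```python
import numpy as np

def volume_and_surface(ct, threshold, voxel=1.0):
    """Approximate volume and surface area of a segmented branching structure.

    ct: 3D array of attenuation values; voxels above `threshold` are treated as
    the structure. Volume is voxel counting; surface is the count of voxel faces
    exposed to background (a rough stand-in for an isosurface measurement).
    """
    mask = ct > threshold
    volume = mask.sum() * voxel**3
    exposed = 0
    padded = np.pad(mask, 1, constant_values=False)
    for axis in range(3):
        shifted = np.roll(padded, 1, axis=axis)
        exposed += np.count_nonzero(padded & ~shifted) + np.count_nonzero(~padded & shifted)
    return volume, exposed * voxel**2

# Synthetic stand-in for a tomogram: a bright cylindrical "root" in a noisy "soil" matrix.
rng = np.random.default_rng(7)
z, y, x = np.mgrid[0:64, 0:64, 0:64]
root = ((y - 32)**2 + (x - 32)**2) < 8**2
ct = 0.3 * rng.random((64, 64, 64)) + 0.7 * root
print(volume_and_surface(ct, threshold=0.5, voxel=0.05))   # voxel size in mm
```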

  20. Observed Family Interactions among Subtypes of Eating Disorders Using Structural Analysis of Social Behavior.

    ERIC Educational Resources Information Center

    Humphrey, Laura Lynn

    1989-01-01

    Compared observations of family interactions among anorexic, bulimic-anorexic, bulimic, and normal families (N=74 families) consisting of father, mother, and teenage daughter. Benjamin's structural analysis of social behavior methodology differentiated clinical from normal families. Found unique patterns among subtypes of eating disorders which…

  1. Structural mapping from MSS-LANDSAT imagery: A proposed methodology for international geological correlation studies

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Crepani, E.; Martini, P. R.

    1980-01-01

    A methodology is proposed for international geological correlation studies based on LANDSAT-MSS imagery, Bullard's model of continental fit, and compatible structural trends between Northeast Brazil and the West African counterpart. Six extensive lineaments in the Brazilian study area are mapped and discussed according to their regional behavior and in relation to the adjacent continental margin. Among the first conclusions, correlations were found between the Sobral Pedro II Lineament and the megafaults that surround the West African craton, and between the Pernambuco Lineament and the Ngaurandere Lineament in Cameroon. Ongoing research to complete the methodological stages includes the mapping of the West African structural framework, reconstruction of the pre-drift puzzle, and an analysis of the counterpart correlations.

  2. Mixed time integration methods for transient thermal analysis of structures, appendix 5

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1982-01-01

    Mixed time integration methods for transient thermal analysis of structures are studied. An efficient solution procedure for predicting the thermal behavior of aerospace vehicle structures was developed. A 2D finite element computer program incorporating these methodologies is being implemented. The performance of these mixed time finite element algorithms can then be evaluated employing the proposed example problem.

  3. Comparative Bioinformatic Analysis of Active Site Structures in Evolutionarily Remote Homologues of α,β-Hydrolase Superfamily Enzymes.

    PubMed

    Suplatov, D A; Arzhanik, V K; Svedas, V K

    2011-01-01

    Comparative bioinformatic analysis is the cornerstone of the study of enzyme structure-function relationships. However, numerous enzymes that derive from a common ancestor and have undergone substantial functional alterations during natural selection appear not to have a sequence similarity acceptable for a statistically reliable comparative analysis. At the same time, their active site structures, in general, can be conserved, while other parts may largely differ. Therefore, it sounds both plausible and appealing to implement a comparative analysis of the most functionally important structural elements: the active site structures; that is, the amino acid residues involved in substrate binding and the catalytic mechanism. A computer algorithm has been developed to create a library of enzyme active site structures based on the use of the PDB database, together with programs of structural analysis and identification of functionally important amino acid residues and cavities in the enzyme structure. The proposed methodology has been used to compare some α,β-hydrolase superfamily enzymes. The analysis revealed a high structural similarity of the catalytic site areas, including the conserved organization of the catalytic triad and oxyanion hole residues, despite the wide functional diversity among the remote homologues compared. The methodology can be used to compare the structural organization of the catalytic and substrate binding sites of various classes of enzymes, as well as to study enzyme evolution and to create a databank of enzyme active site structures.

  4. Structural Loads Analysis for Wave Energy Converters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Rij, Jennifer A; Yu, Yi-Hsiang; Guo, Yi

    2017-06-03

    This study explores and verifies the generalized body-modes method for evaluating the structural loads on a wave energy converter (WEC). Historically, WEC design methodologies have focused primarily on accurately evaluating hydrodynamic loads, while methodologies for evaluating structural loads have yet to be fully considered and incorporated into the WEC design process. As wave energy technologies continue to advance, however, it has become increasingly evident that an accurate evaluation of the structural loads will enable an optimized structural design, as well as the potential utilization of composites and flexible materials, and hence reduce WEC costs. Although there are many computational fluid dynamics, structural analysis, and fluid-structure-interaction (FSI) codes available, the application of these codes is typically too computationally intensive to be practical in the early stages of the WEC design process. The generalized body-modes method, however, is a reduced order, linearized, frequency-domain FSI approach, performed in conjunction with the linear hydrodynamic analysis, with computation times that could realistically be incorporated into the WEC design process.

  5. Development of an Aeroelastic Modeling Capability for Transient Nozzle Side Load Analysis

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen

    2013-01-01

    Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development during test. While three-dimensional, transient, turbulent, chemically reacting computational fluid dynamics methodology has been demonstrated to capture major side load physics with rigid nozzles, hot-fire tests often show nozzle structure deformation during major side load events, leading to structural damage if strengthening measures are not taken. The modeling picture is incomplete without the capability to address the two-way responses between the structure and the fluid. The objective of this study is to develop a coupled aeroelastic modeling capability by implementing the necessary structural dynamics component into an anchored computational fluid dynamics methodology. The computational fluid dynamics component is based on an unstructured-grid, pressure-based formulation, while the computational structural dynamics component is developed in the framework of modal analysis. Transient aeroelastic nozzle startup analyses of the Block I Space Shuttle Main Engine at sea level were performed. The computed results from the aeroelastic nozzle modeling are presented.
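
    On the structural side, a modal-analysis framework typically advances a handful of modal coordinates in time under the instantaneous aerodynamic load projected onto the mode shapes; the sketch below integrates such uncoupled modal equations with a Newmark scheme, using invented frequencies, damping, and a decaying "side load" rather than SSME data.

```python
import numpy as np

def newmark_modal(omega, zeta, f_of_t, q0, qd0, dt, n_steps, beta=0.25, gamma=0.5):
    """Integrate uncoupled modal equations q'' + 2*zeta*omega*q' + omega^2*q = f(t)
    with the Newmark average-acceleration scheme. omega, zeta, q0, qd0: one entry
    per retained mode; f_of_t(t) returns the modal force vector at time t."""
    q, qd = np.array(q0, float), np.array(qd0, float)
    a = f_of_t(0.0) - 2 * zeta * omega * qd - omega**2 * q
    history = [q.copy()]
    for n in range(1, n_steps + 1):
        t = n * dt
        q_pred  = q + dt * qd + (0.5 - beta) * dt**2 * a
        qd_pred = qd + (1 - gamma) * dt * a
        denom = 1.0 + gamma * dt * 2 * zeta * omega + beta * dt**2 * omega**2
        a = (f_of_t(t) - 2 * zeta * omega * qd_pred - omega**2 * q_pred) / denom
        q  = q_pred + beta * dt**2 * a
        qd = qd_pred + gamma * dt * a
        history.append(q.copy())
    return np.array(history)

# Two hypothetical nozzle bending modes driven by a decaying lateral "side load".
omega = np.array([2 * np.pi * 15.0, 2 * np.pi * 42.0])   # rad/s
zeta  = np.array([0.01, 0.01])
side_load = lambda t: np.array([1.0, 0.3]) * np.exp(-3.0 * t) * np.sin(2 * np.pi * 12.0 * t)

resp = newmark_modal(omega, zeta, side_load, q0=[0.0, 0.0], qd0=[0.0, 0.0],
                     dt=1e-3, n_steps=2000)
print("peak modal amplitudes:", np.abs(resp).max(axis=0))
```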

  6. STAGS Developments for Residual Strength Analysis Methods for Metallic Fuselage Structures

    NASA Technical Reports Server (NTRS)

    Young, Richard D.; Rose, Cheryl A.

    2014-01-01

    A summary is presented of advances in the Structural Analysis of General Shells (STAGS) finite element code for the residual strength analysis of metallic fuselage structures that were realized through collaboration between the structures group at NASA Langley and Dr. Charles Rankin. The majority of the advancements described were made in the 1990s under the NASA Airframe Structural Integrity Program (NASIP). Example results are presented from studies that were conducted using the STAGS code to develop improved understanding of the nonlinear response of cracked fuselage structures subjected to combined loads. An integrated residual strength analysis methodology for metallic structure that models crack growth to predict the effect of cracks on structural integrity is demonstrated.

  7. Three-Dimensional Extension of a Digital Library Service System

    ERIC Educational Resources Information Center

    Xiao, Long

    2010-01-01

    Purpose: The paper aims to provide an overall methodology and case study for the innovation and extension of a digital library, especially the service system. Design/methodology/approach: Based on the three-dimensional structure theory of the information service industry, this paper combines a comprehensive analysis with the practical experiences…

  8. A methodology for producing reliable software, volume 1

    NASA Technical Reports Server (NTRS)

    Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.

    1976-01-01

    An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.

  9. Nonlinear Time Domain Seismic Soil-Structure Interaction (SSI) Deep Soil Site Methodology Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spears, Robert Edward; Coleman, Justin Leigh

    Currently the Department of Energy (DOE) and the nuclear industry perform seismic soil-structure interaction (SSI) analysis using equivalent linear numerical analysis tools. For lower levels of ground motion, these tools should produce reasonable in-structure response values for evaluation of existing and new facilities. For larger levels of ground motion these tools likely overestimate the in-structure response (and therefore structural demand) since they do not consider geometric nonlinearities (such as gapping and sliding between the soil and structure) and are limited in the ability to model nonlinear soil behavior. The current equivalent linear SSI (SASSI) analysis approach either joins the soil and structure together in both tension and compression or releases the soil from the structure for both tension and compression. It also makes linear approximations for material nonlinearities and generalizes energy absorption with viscous damping. This produces the potential for inaccurately establishing where the structural concerns exist and/or inaccurately establishing the amplitude of the in-structure responses. Seismic hazard curves at nuclear facilities have continued to increase over the years as more information has been developed on seismic sources (i.e. faults), additional information has been gathered on seismic events, and additional research has been performed to determine local site effects. Seismic hazard curves are used to develop design basis earthquakes (DBEs) that are used to evaluate nuclear facility response. As the seismic hazard curves increase, the input ground motions (DBEs) used to numerically evaluate nuclear facility response increase, causing larger in-structure responses. As ground motions increase, so does the importance of including nonlinear effects in numerical SSI models. To include material nonlinearity in the soil and geometric nonlinearity using contact (gapping and sliding), it is necessary to develop a nonlinear time domain methodology. This methodology will be known as NonLinear Soil-Structure Interaction (NLSSI). In general, NLSSI analysis should provide a more accurate representation of the seismic demands on nuclear facilities, their systems, and components. INL, in collaboration with a Nuclear Power Plant Vendor (NPP-V), will develop a generic Nuclear Power Plant (NPP) structural design to be used in development of the methodology and for comparison with SASSI. This generic NPP design has been evaluated for the INL soil site because of the ease of access and quality of the site-specific data. It is now being evaluated for a second site at Vogtle, which is located approximately 15 miles east-northeast of Waynesboro, Georgia, adjacent to the Savannah River. The Vogtle site consists of many soil layers spanning down to a depth of 1058 feet. The reason that two soil sites are chosen is to demonstrate the methodology across multiple soil sites. The project will drive the models (soil and structure) using acceleration time histories with successively increasing amplitudes. The models will be run in time domain codes such as ABAQUS, LS-DYNA, and/or ESSI and compared with the same models run in SASSI. The project is focused on developing and documenting a method for performing time domain, nonlinear seismic soil-structure interaction (SSI) analysis. Development of this method will provide the Department of Energy (DOE) and industry with another tool to perform seismic SSI analysis.

  10. Symposium Proceedings--Occupational Research and the Navy--Prospectus 1980. Technical Report No. 74-14.

    ERIC Educational Resources Information Center

    Jones, Earl I., Ed.

    This five-section symposium report includes 22 papers assessing the state-of-the-art in occupational research. Section 1, Occupational Analysis, Structure, and Methods, contains four papers that discuss: the Air Force Occupational Research project, methodologies in job analysis, evaluation, structures and requirements, career development,…

  11. Orbiter lessons learned: A guide to future vehicle development

    NASA Technical Reports Server (NTRS)

    Greenberg, Harry Stan

    1993-01-01

    Topics addressed are: (1) wind persistence loads methodology; (2) emphasize supportability in design of reusable vehicles; (3) design for robustness; (4) improved aerodynamic environment prediction methods for complex vehicles; (5) automated integration of aerothermal, manufacturing, and structures analysis; (6) continued electronic documentation of structural design and analysis; and (7) landing gear rollout load simulations.

  12. Epistemological Beliefs across Cultures: Critique and Analysis of Beliefs Structure Studies

    ERIC Educational Resources Information Center

    Chan, Kwok-wai; Elliott, Robert G.

    2004-01-01

    The findings of epistemological beliefs studies in North America, Hong Kong and Taiwan were compared and interpreted in terms of the different cultural contexts and methodologies used in the research studies. Based on cross culture analysis a hypothesis for the structure of epistemological beliefs was proposed. Implications were also drawn for…

  13. Methodologies for launcher-payload coupled dynamic analysis

    NASA Astrophysics Data System (ADS)

    Fransen, S. H. J. A.

    2012-06-01

    An important step in the design and verification process of spacecraft structures is the coupled dynamic analysis with the launch vehicle in the low-frequency domain, also referred to as coupled loads analysis (CLA). The objective of such analyses is the computation of the dynamic environment of the spacecraft (payload) in terms of interface accelerations, interface forces, center of gravity (CoG) accelerations as well as the internal state of stress. In order to perform an efficient, fast and accurate launcher-payload coupled dynamic analysis, various methodologies have been applied and developed. The methods are related to substructuring techniques, data recovery techniques, the effects of prestress and fluids and time integration problems. The aim of this paper was to give an overview of these methodologies and to show why, how and where these techniques can be used in the process of launcher-payload coupled dynamic analysis. In addition, it will be shown how these methodologies fit together in a library of procedures which can be used with the MSC.Nastran™ solution sequences.
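
    Craig-Bampton fixed-interface reduction is a commonly used substructuring technique in launcher-payload CLA practice, though the abstract does not name the specific methods used; the sketch below reduces a small stand-in spring-mass chain to its interface DOFs plus a few fixed-interface modes.

```python
import numpy as np
from scipy.linalg import eigh

def craig_bampton(K, M, boundary, n_modes):
    """Fixed-interface (Craig-Bampton) reduction of (K, M).

    boundary: indices of interface DOFs kept physically; all other DOFs are
    condensed to `n_modes` fixed-interface normal modes. Returns (Kr, Mr, T).
    """
    n = K.shape[0]
    b = np.asarray(boundary)
    i = np.setdiff1d(np.arange(n), b)

    Kib, Kii = K[np.ix_(i, b)], K[np.ix_(i, i)]
    Mii = M[np.ix_(i, i)]

    # Constraint modes: static response of interior DOFs to unit boundary motion.
    Psi = -np.linalg.solve(Kii, Kib)
    # Fixed-interface normal modes of the interior partition.
    w2, Phi = eigh(Kii, Mii)
    Phi = Phi[:, :n_modes]

    # Assemble the transformation u = T @ [u_b; q].
    T = np.zeros((n, len(b) + n_modes))
    T[b, :len(b)] = np.eye(len(b))
    T[np.ix_(i, np.arange(len(b)))] = Psi
    T[np.ix_(i, np.arange(len(b), len(b) + n_modes))] = Phi

    return T.T @ K @ T, T.T @ M @ T, T

# Small stand-in model: a 6-DOF spring-mass chain, keeping its two end DOFs.
n = 6
K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
M = np.eye(n)
Kr, Mr, T = craig_bampton(K, M, boundary=[0, n - 1], n_modes=2)
print("reduced model size:", Kr.shape)
```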

  14. Potential of SNP markers for the characterization of Brazilian cassava germplasm.

    PubMed

    de Oliveira, Eder Jorge; Ferreira, Cláudia Fortes; da Silva Santos, Vanderlei; de Jesus, Onildo Nunes; Oliveira, Gilmara Alvarenga Fachardo; da Silva, Maiane Suzarte

    2014-06-01

    High-throughput markers, such as SNPs, along with different methodologies were used to evaluate the applicability of the Bayesian approach and the multivariate analysis in structuring the genetic diversity in cassavas. The objective of the present work was to evaluate the diversity and genetic structure of the largest cassava germplasm bank in Brazil. Complementary methodological approaches such as discriminant analysis of principal components (DAPC), Bayesian analysis and molecular analysis of variance (AMOVA) were used to understand the structure and diversity of 1,280 accessions genotyped using 402 single nucleotide polymorphism markers. The genetic diversity (0.327) and the average observed heterozygosity (0.322) were high considering the bi-allelic markers. In terms of population, the presence of a complex genetic structure was observed, indicating the formation of 30 clusters by DAPC and 34 clusters by Bayesian analysis. Both methodologies presented difficulties and controversies in terms of the allocation of some accessions to specific clusters. However, the clusters suggested by the DAPC analysis seemed to be more consistent, presenting a higher probability of allocation of the accessions within the clusters. Prior information related to breeding patterns and geographic origins of the accessions was not sufficient to provide clear differentiation between the clusters according to the AMOVA analysis. In contrast, the FST was maximized when considering the clusters suggested by the Bayesian and DAPC analyses. The high frequency of germplasm exchange between producers and the subsequent alteration of the name of the same material may be one of the causes of the low association between genetic diversity and geographic origin. The results of this study may benefit cassava germplasm conservation programs, and contribute to the maximization of genetic gains in breeding programs.
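
    For reference, the two summary statistics quoted (gene diversity, i.e. expected heterozygosity, and observed heterozygosity) can be computed from bi-allelic SNP genotypes coded 0/1/2 as sketched below; the genotype matrix is randomly generated, and only the marker and accession counts echo the study.

```python
import numpy as np

def diversity_stats(genotypes):
    """genotypes: array (n_accessions, n_snps) coded as 0, 1, 2 minor-allele counts.
    Returns mean expected heterozygosity (gene diversity) and mean observed
    heterozygosity across loci."""
    p = genotypes.mean(axis=0) / 2.0            # minor-allele frequency per SNP
    he = 2.0 * p * (1.0 - p)                    # expected heterozygosity, bi-allelic
    ho = (genotypes == 1).mean(axis=0)          # fraction of heterozygous accessions
    return he.mean(), ho.mean()

rng = np.random.default_rng(5)
freqs = rng.uniform(0.05, 0.5, size=402)               # 402 SNPs, as in the study
geno = rng.binomial(2, freqs, size=(1280, 402))        # 1,280 accessions
print("expected He, observed Ho:", diversity_stats(geno))
```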

  15. Algebra for Enterprise Ontology: towards analysis and synthesis of enterprise models

    NASA Astrophysics Data System (ADS)

    Suga, Tetsuya; Iijima, Junichi

    2018-03-01

    Enterprise modeling methodologies have made enterprises more likely to be the object of systems engineering rather than craftsmanship. However, the current state of research in enterprise modeling methodologies lacks investigation of the mathematical background embedded in these methodologies. Abstract algebra, the broad subfield of mathematics concerned with the study of algebraic structures, may provide interesting implications in both theory and practice. Therefore, this research takes up the challenge of establishing an algebraic structure for one aspect model proposed in Design & Engineering Methodology for Organizations (DEMO), a major enterprise modeling methodology in the spotlight as a modeling principle for capturing the skeleton of enterprises when developing enterprise information systems. The results show that the aspect model behaves well under algebraic operations and indeed forms a Boolean algebra. This article also discusses comparisons with other modeling languages and suggests future work.

  16. From intuition to statistics in building subsurface structural models

    USGS Publications Warehouse

    Brandenburg, J.P.; Alpak, F.O.; Naruk, S.; Solum, J.

    2011-01-01

    Experts associated with the oil and gas exploration industry suggest that combining forward trishear models with stochastic global optimization algorithms allows a quantitative assessment of the uncertainty associated with a given structural model. The methodology is applied to incompletely imaged structures related to deepwater hydrocarbon reservoirs, and the results are compared to prior manual palinspastic restorations and borehole data. The methodology is also useful for extending structural interpretations into areas of limited resolution, such as subsalt regions, and for extrapolating existing data into seismic data gaps. The technique can be used for rapid reservoir appraisal and potentially has other applications in seismic processing, well planning, and borehole stability analysis.
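
    The quantitative workflow pairs a forward trishear kinematic model with a stochastic global optimizer that searches fault parameters minimizing the misfit to picked horizons, and the spread of acceptable models measures interpretational uncertainty. The sketch below shows only the optimization wrapper, using SciPy's differential evolution; trishear_forward, the picked horizon and the parameter bounds are hypothetical stand-ins, since a real forward model tracks particles through the trishear velocity field.

```python
import numpy as np
from scipy.optimize import differential_evolution

x_obs = np.linspace(0.0, 5000.0, 50)                        # horizontal pick positions (m)
z_obs = 1000.0 + 150.0 * np.tanh((x_obs - 2500.0) / 800.0)  # hypothetical picked horizon (m)

def trishear_forward(params, x):
    """Hypothetical placeholder for a trishear forward model.

    params = (slip, ramp_dip_deg, pv_ratio); returns deformed horizon elevations.
    """
    slip, dip, pv = params
    return 1000.0 + slip * np.sin(np.radians(dip)) * 0.5 * (
        1.0 + np.tanh((x - 2500.0) / (400.0 * pv)))

def misfit(params):
    # Sum-of-squares mismatch between modeled and picked horizon elevations
    return np.sum((trishear_forward(params, x_obs) - z_obs) ** 2)

bounds = [(0.0, 1000.0),   # slip (m)
          (10.0, 60.0),    # ramp dip (deg)
          (1.0, 3.0)]      # propagation-to-slip ratio
result = differential_evolution(misfit, bounds, seed=1)
print("best-fit parameters:", result.x, "misfit:", result.fun)
```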

  17. Predicting the Reliability of Brittle Material Structures Subjected to Transient Proof Test and Service Loading

    NASA Astrophysics Data System (ADS)

    Nemeth, Noel N.; Jadaan, Osama M.; Palfi, Tamas; Baker, Eric H.

    Brittle materials today are being used, or considered, for a wide variety of high-tech applications that operate in harsh environments, including static and rotating turbine parts, thermal protection systems, dental prosthetics, fuel cells, oxygen transport membranes, radomes, and MEMS. Designing brittle material components to sustain repeated load without fracturing while using the minimum amount of material requires the use of a probabilistic design methodology. The NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code provides a general-purpose analysis tool that predicts the probability of failure of a ceramic component as a function of its time in service. This capability includes predicting the time-dependent failure probability of ceramic components against catastrophic rupture when subjected to transient thermomechanical loads (including cyclic loads). The developed methodology allows for changes in material response that can occur with temperature or time (i.e., changing fatigue and Weibull parameters with temperature or time). This article gives an overview of the transient reliability methodology and describes how it is extended to account for proof testing. The CARES/Life code has been modified to interface with commercially available finite element analysis (FEA) codes executed for transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
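
    The time-independent core of such a prediction combines a Weibull strength distribution with the stress state of each element over the transient. The fragment below is a deliberately simplified, uniaxial illustration of that risk-of-rupture bookkeeping with invented element stresses, volumes and Weibull parameters; it is not the multiaxial, slow-crack-growth and proof-test formulation implemented in CARES/Life.

```python
import numpy as np

# Hypothetical FEA output: stress history (MPa) per element over the transient,
# element volumes (mm^3), and Weibull parameters held constant with temperature here.
sigma = np.array([[120.0, 180.0, 150.0],
                  [ 90.0, 140.0, 110.0]])      # shape: (n_elements, n_time_steps)
volume = np.array([2.0, 3.5])                  # mm^3
m, sigma_0 = 10.0, 400.0                       # Weibull modulus and scale parameter

# Without subcritical crack growth, the governing stress of the transient is the
# peak tensile stress each element sees over the load history.
sigma_peak = np.clip(sigma, 0.0, None).max(axis=1)

# Weibull risk of rupture summed over elements, then the failure probability.
risk = np.sum(volume * (sigma_peak / sigma_0) ** m)
p_failure = 1.0 - np.exp(-risk)
print(f"probability of failure over the transient: {p_failure:.3e}")
```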

  18. Modeling and Design Analysis Methodology for Tailoring of Aircraft Structures with Composites

    NASA Technical Reports Server (NTRS)

    Rehfield, Lawrence W.

    2004-01-01

    Composite materials provide design flexibility in that fiber placement and orientation can be specified and a variety of material forms and manufacturing processes are available. It is possible, therefore, to 'tailor' the structure to a high degree in order to meet specific design requirements in an optimum manner. Common industrial practices, however, have limited the choices designers make. One of the reasons for this is that there is a dearth of conceptual/preliminary design analysis tools specifically devoted to identifying structural concepts for composite airframe structures. Large scale finite element simulations are not suitable for such purposes. The present project has been devoted to creating modeling and design analysis methodology for use in the tailoring process of aircraft structures. Emphasis has been given to creating bend-twist elastic coupling in high aspect ratio wings or other lifting surfaces. The direction of our work was in concert with the overall NASA effort Twenty-First Century Aircraft Technology (TCAT). A multi-disciplinary team was assembled by Dr. Damodar Ambur to work on wing technology, which included our project.

  19. Methods for evaluating the predictive accuracy of structural dynamic models

    NASA Technical Reports Server (NTRS)

    Hasselman, Timothy K.; Chrostowski, Jon D.

    1991-01-01

    Modeling uncertainty is defined in terms of the difference between predicted and measured eigenvalues and eigenvectors. Data compiled from 22 sets of analysis/test results were used to create statistical databases for large truss-type space structures and both pretest and posttest models of conventional satellite-type space structures. Modeling uncertainty is propagated through the model to produce intervals of uncertainty on frequency response functions, both amplitude and phase. This methodology was used successfully to evaluate the predictive accuracy of several structures, including the NASA CSI Evolutionary Structure tested at Langley Research Center. Test measurements for this structure were within ± one-sigma intervals of predicted accuracy for the most part, demonstrating the validity of the methodology and computer code.
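
    Propagating modal-parameter uncertainty to frequency response functions can be done with a simple Monte Carlo loop: perturb the natural frequencies (and, if desired, damping and modal constants) according to the statistical database and collect amplitude percentiles at each frequency line. The sketch below illustrates the idea for a single FRF built by modal superposition; the mode set, damping values and one-sigma level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
freqs = np.linspace(1.0, 50.0, 500)            # frequency axis (Hz)
omega = 2 * np.pi * freqs

f_n = np.array([8.0, 21.0, 37.0])              # nominal natural frequencies (Hz)
zeta = np.array([0.01, 0.015, 0.02])           # damping ratios
phi = np.array([0.8, -0.5, 0.3])               # modal constants at the response DOF
sigma_f = 0.05                                 # assumed 5% one-sigma frequency uncertainty

def frf(fn):
    """Receptance-type FRF by modal superposition for natural frequencies fn."""
    wn = 2 * np.pi * fn[:, None]
    return np.sum(phi[:, None] / (wn**2 - omega**2 + 2j * zeta[:, None] * wn * omega), axis=0)

samples = np.array([np.abs(frf(f_n * (1 + sigma_f * rng.standard_normal(f_n.size))))
                    for _ in range(2000)])

lo, hi = np.percentile(samples, [16, 84], axis=0)   # roughly a one-sigma amplitude band
nominal = np.abs(frf(f_n))
```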

  20. Modal Parameters Evaluation in a Full-Scale Aircraft Demonstrator under Different Environmental Conditions Using HS 3D-DIC.

    PubMed

    Molina-Viedma, Ángel Jesús; López-Alba, Elías; Felipe-Sesé, Luis; Díaz, Francisco A; Rodríguez-Ahlquist, Javier; Iglesias-Vallejo, Manuel

    2018-02-02

    In real aircraft structures the comfort and the occupational performance of crewmembers and passengers are affected by the presence of noise. In this sense, special attention is focused on mechanical and material design for isolation and vibration control. Experimental characterization and, in particular, experimental modal analysis, provides information for adequate cabin noise control. Traditional sensors employed in the aircraft industry for this purpose are invasive and provide a low spatial resolution. This paper presents a methodology for experimental modal characterization of a front fuselage full-scale demonstrator using high-speed 3D digital image correlation, which is non-invasive, ensuring that the structural response is unperturbed by the instrumentation mass. Specifically, full-field measurements on the passenger window area were conducted when the structure was excited using an electrodynamic shaker. The spectral analysis of the measured time-domain displacements made it possible to identify natural frequencies and full-field operational deflection shapes. Changes in the modal parameters due to cabin pressurization and the behavior of different local structural modifications were assessed using this methodology. The proposed full-field methodology allowed the characterization of relevant dynamic response patterns, complementing the capabilities provided by accelerometers.
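
    Once the full-field displacement histories are available, the spectral step reduces to windowed FFTs of the point-wise displacements, peak picking for natural frequencies, and reading the full-field FFT at each peak for the operational deflection shape. A minimal sketch of the peak-picking step on a synthetic displacement record (assumed frame rate, excitation content and thresholds) is shown below.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 2000.0                                   # assumed high-speed camera frame rate (Hz)
t = np.arange(0, 4.0, 1 / fs)
# Synthetic out-of-plane displacement of one DIC point: two modes plus noise
u = (0.5 * np.sin(2 * np.pi * 63.0 * t) +
     0.2 * np.sin(2 * np.pi * 148.0 * t) +
     0.05 * np.random.default_rng(0).standard_normal(t.size))

spectrum = np.abs(np.fft.rfft(u * np.hanning(t.size)))
freq = np.fft.rfftfreq(t.size, 1 / fs)

peaks, _ = find_peaks(spectrum, height=spectrum.max() * 0.1, distance=20)
print("identified natural frequencies (Hz):", freq[peaks])
```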

  1. Modal Parameters Evaluation in a Full-Scale Aircraft Demonstrator under Different Environmental Conditions Using HS 3D-DIC

    PubMed Central

    López-Alba, Elías; Felipe-Sesé, Luis; Díaz, Francisco A.; Rodríguez-Ahlquist, Javier; Iglesias-Vallejo, Manuel

    2018-01-01

    In real aircraft structures the comfort and the occupational performance of crewmembers and passengers are affected by the presence of noise. In this sense, special attention is focused on mechanical and material design for isolation and vibration control. Experimental characterization and, in particular, experimental modal analysis, provides information for adequate cabin noise control. Traditional sensors employed in the aircraft industry for this purpose are invasive and provide a low spatial resolution. This paper presents a methodology for experimental modal characterization of a front fuselage full-scale demonstrator using high-speed 3D digital image correlation, which is non-invasive, ensuring that the structural response is unperturbed by the instrumentation mass. Specifically, full-field measurements on the passenger window area were conducted when the structure was excited using an electrodynamic shaker. The spectral analysis of the measured time-domain displacements made it possible to identify natural frequencies and full-field operational deflection shapes. Changes in the modal parameters due to cabin pressurization and the behavior of different local structural modifications were assessed using this methodology. The proposed full-field methodology allowed the characterization of relevant dynamic response patterns, complementing the capabilities provided by accelerometers. PMID:29393897

  2. Integrating protein structural dynamics and evolutionary analysis with Bio3D.

    PubMed

    Skjærven, Lars; Yao, Xin-Qiu; Scarabelli, Guido; Grant, Barry J

    2014-12-10

    Popular bioinformatics approaches for studying protein functional dynamics include comparisons of crystallographic structures, molecular dynamics simulations and normal mode analysis. However, determining how observed displacements and predicted motions from these traditionally separate analyses relate to each other, as well as to the evolution of sequence, structure and function within large protein families, remains a considerable challenge. This is in part due to the general lack of tools that integrate information of molecular structure, dynamics and evolution. Here, we describe the integration of new methodologies for evolutionary sequence, structure and simulation analysis into the Bio3D package. This major update includes unique high-throughput normal mode analysis for examining and contrasting the dynamics of related proteins with non-identical sequences and structures, as well as new methods for quantifying dynamical couplings and their residue-wise dissection from correlation network analysis. These new methodologies are integrated with major biomolecular databases as well as established methods for evolutionary sequence and comparative structural analysis. New functionality for directly comparing results derived from normal modes, molecular dynamics and principal component analysis of heterogeneous experimental structure distributions is also included. We demonstrate these integrated capabilities with example applications to dihydrofolate reductase and heterotrimeric G-protein families along with a discussion of the mechanistic insight provided in each case. The integration of structural dynamics and evolutionary analysis in Bio3D enables researchers to go beyond a prediction of single protein dynamics to investigate dynamical features across large protein families. The Bio3D package is distributed with full source code and extensive documentation as a platform independent R package under a GPL2 license from http://thegrantlab.org/bio3d/ .

  3. Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants

    NASA Astrophysics Data System (ADS)

    Kulbjakina, A. V.; Dolotovskij, I. V.

    2018-01-01

    The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for the synthesis of the most efficient alternative of the fuel system using mathematical models, and the set of performance criteria have been developed at the main stages of the study. Results from the introduction of specific engineering solutions to develop their own energy supply sources for RH processing facilities are provided.

  4. Examination of Short- and Long-Range Atomic Order Nanocrystalline SiC and Diamond by Powder Diffraction Methods

    NASA Technical Reports Server (NTRS)

    Palosz, B.; Grzanka, E.; Stelmakh, S.; Gierlotka, S.; Weber, H.-P.; Proffen, T.; Palosz, W.

    2002-01-01

    The real atomic structure of nanocrystals determines unique, key properties of the materials. Determination of the structure presents a challenge due to inherent limitations of standard powder diffraction techniques when applied to nanocrystals. An alternate methodology for the structural analysis of nanocrystals (several nanometers in size), based on Bragg-like scattering and called the "apparent lattice parameter" (alp) approach, is proposed. Application of the alp methodology to the examination of the core-shell model of nanocrystals is presented. The results of applying the alp method to the structural analysis of several nanopowders were complemented by those obtained by determination of the atomic Pair Distribution Function, PDF. Based on synchrotron and neutron diffraction data measured over a large diffraction-vector range of up to Q = 25 Å⁻¹, the surface stresses in nanocrystalline diamond and SiC were evaluated.
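
    The atomic Pair Distribution Function used to complement the alp analysis is obtained from the corrected total-scattering structure function by a sine Fourier transform, G(r) = (2/π) ∫ Q [S(Q) − 1] sin(Qr) dQ, truncated at the maximum measured Q (here 25 Å⁻¹). The snippet below performs that transform on a synthetic S(Q) standing in for reduced synchrotron or neutron data.

```python
import numpy as np

# Placeholder total-scattering data: Q grid up to Qmax = 25 1/Angstrom and a synthetic S(Q)
Q = np.linspace(0.5, 25.0, 2000)                     # 1/Angstrom
S = 1.0 + np.exp(-0.01 * Q**2) * np.sin(2.35 * Q)    # stand-in for corrected, normalized data

r = np.linspace(0.5, 20.0, 400)                      # Angstrom
FQ = Q * (S - 1.0)                                   # reduced structure function F(Q)
dQ = Q[1] - Q[0]

# G(r) = (2/pi) * integral of Q [S(Q) - 1] sin(Q r) dQ, by simple quadrature on the uniform grid
G = (2.0 / np.pi) * dQ * (np.sin(np.outer(r, Q)) @ FQ)
```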

  5. Efficient Analysis of Complex Structures

    NASA Technical Reports Server (NTRS)

    Kapania, Rakesh K.

    2000-01-01

    The various accomplishments achieved during this project are: (1) a survey of Neural Network (NN) applications using the MATLAB NN Toolbox in structural engineering, especially on equivalent continuum models (Appendix A); (2) application of NNs and GAs to simulate and synthesize substructures: 1-D and 2-D beam problems (Appendix B); (3) development of an equivalent plate-model analysis method (EPA) for static and vibration analysis of general trapezoidal built-up wing structures composed of skins, spars and ribs, with calculation of numerous test cases and comparison with measurements or FEA results (Appendix C); (4) basic work on using second-order sensitivities to simulate wing modal response, a discussion of sensitivity evaluation approaches, and some results (Appendix D); (5) establishment of a general methodology for simulating modal responses by direct application of NNs and by sensitivity techniques, in a design space composed of a number of design points, with comparison made through examples using these two methods (Appendix E); (6) establishment of a general methodology for efficient analysis of complex wing structures by indirect application of NNs, the NN-aided Equivalent Plate Analysis, with training of the neural networks for this purpose in several cases of design spaces, applicable to the actual design of complex wings (Appendix F).
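
    Items (5) and (6) above amount to training a neural network on modal responses computed at sampled design points and then using the trained network as an inexpensive surrogate during design. The sketch below shows such a surrogate with scikit-learn's MLP regressor on synthetic data; the design variables, sample sizes and the "truth" function are placeholders rather than the project's equivalent-plate model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.uniform(0.5, 2.0, size=(200, 3))     # toy design variables: skin, spar, rib sizing
# Stand-in for the first two natural frequencies from an equivalent-plate analysis
Y = np.column_stack([10 + 4 * X[:, 0] + 2 * np.sqrt(X[:, 1]),
                     25 + 6 * X[:, 1] + 3 * X[:, 2]]) + 0.1 * rng.standard_normal((200, 2))

surrogate = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                                       random_state=0))
surrogate.fit(X[:150], Y[:150])              # train on 150 design points
print("validation RMS error:",
      np.sqrt(np.mean((surrogate.predict(X[150:]) - Y[150:]) ** 2)))
```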

  6. Fatigue criterion to system design, life and reliability

    NASA Technical Reports Server (NTRS)

    Zaretsky, E. V.

    1985-01-01

    A generalized methodology for structural life prediction, design, and reliability based upon a fatigue criterion is advanced. The life prediction methodology is based in part on the work of W. Weibull and of G. Lundberg and A. Palmgren. The approach incorporates the computed lives of the elemental stress volumes of a complex machine element to predict system life. The results of coupon fatigue testing can be incorporated into the analysis, allowing for life prediction and component or structural renewal rates with reasonable statistical certainty.
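
    In the Lundberg-Palmgren/Weibull framework the system survival probability is the product of the elemental survival probabilities, so if each elemental stress volume has a computed life L_i and a common Weibull slope e, the system life at the same survival probability follows as L_sys = (Σ L_i^(−e))^(−1/e). A short illustration of that combination rule with made-up elemental lives:

```python
import numpy as np

def system_life(elemental_lives, weibull_slope):
    """System life from elemental lives sharing a common Weibull slope e.

    Follows from multiplying elemental survival probabilities:
    S_sys(L) = prod_i S_i(L)  =>  L_sys = (sum_i L_i**-e) ** (-1/e).
    """
    L = np.asarray(elemental_lives, dtype=float)
    e = float(weibull_slope)
    return np.sum(L ** -e) ** (-1.0 / e)

# Hypothetical computed lives (hours) of the elemental stress volumes of a machine element
print(system_life([12000.0, 8000.0, 15000.0, 20000.0], weibull_slope=1.5))
```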

  7. Digital Learning Characteristics and Principles of Information Resources Knowledge Structuring

    ERIC Educational Resources Information Center

    Belichenko, Margarita; Davidovitch, Nitza; Kravchenko, Yuri

    2017-01-01

    Analysis of the principles of knowledge representation in information systems revealed the need to improve knowledge structuring, driven by the development of software components and new possibilities of information technologies. The article combines methodological aspects of knowledge structuring and effective usage of information…

  8. Toward a Methodology for Conducting Social Impact Assessments Using Quality of Social Life Indicators.

    ERIC Educational Resources Information Center

    Olsen, Marvin E.; Merwin, Donna J.

    Broadly conceived, social impacts refer to all changes in the structure and functioning of patterned social ordering that occur in conjunction with an environmental, technological, or social innovation or alteration. Departing from the usual cost-benefit analysis approach, a new methodology proposes conducting social impact assessment grounded in…

  9. Predicting bird habitat quality from a geospatial analysis of FIA data

    Treesearch

    John M. Tirpak; D. Todd Jones-Farrand; Frank R., III Thompson; Daniel J. Twedt; Mark D. Nelson; William B., III Uihlein

    2009-01-01

    The ability to assess the influence of site-scale forest structure on avian habitat suitability at an ecoregional scale remains a major methodological constraint to effective biological planning for forest land birds in North America. We evaluated the feasibility of using forest inventory and analysis (FIA) data to define vegetation structure within forest patches,...

  10. EEMD-MUSIC-Based Analysis for Natural Frequencies Identification of Structures Using Artificial and Natural Excitations

    PubMed Central

    Amezquita-Sanchez, Juan P.; Romero-Troncoso, Rene J.; Osornio-Rios, Roque A.; Garcia-Perez, Arturo

    2014-01-01

    This paper presents a new EEMD-MUSIC- (ensemble empirical mode decomposition-multiple signal classification-) based methodology to identify modal frequencies in structures from free and ambient vibration signals produced by artificial and natural excitations, while also considering several factors such as nonstationary effects, close modal frequencies, and noisy environments, which are common situations in which several techniques reported in the literature fail. The EEMD and MUSIC methods are used to decompose the vibration signal into a set of IMFs (intrinsic mode functions) and to identify the natural frequencies of a structure, respectively. The effectiveness of the proposed methodology has been validated and tested with synthetic signals and under real operating conditions. The experiments focus on extracting the natural frequencies of a truss-type scaled structure and of a bridge used for both highway traffic and pedestrians. Results show the proposed methodology to be a suitable solution for natural frequency identification of structures from free and ambient vibration signals. PMID:24683346
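
    After EEMD separates the response into nearly mono-component IMFs, MUSIC estimates each frequency by splitting the autocovariance matrix of the (noisy) IMF into signal and noise subspaces and locating peaks of a pseudospectrum. The fragment below is a basic MUSIC estimator applied to one synthetic IMF; the embedding dimension, model order and signal content are illustrative choices rather than the paper's settings.

```python
import numpy as np
from scipy.signal import find_peaks

def music_frequencies(x, fs, n_sinusoids, m=64):
    """Basic MUSIC frequency estimates for real sinusoids buried in noise."""
    # Data (sliding-window) matrix and its sample covariance
    N = len(x) - m + 1
    X = np.array([x[i:i + m] for i in range(N)])
    R = X.T @ X / N

    # Noise subspace: eigenvectors beyond the 2*n_sinusoids largest eigenvalues
    _, V = np.linalg.eigh(R)                    # eigenvalues in ascending order
    En = V[:, : m - 2 * n_sinusoids]

    freqs = np.linspace(0, fs / 2, 2000)
    n = np.arange(m)
    spectrum = np.array([1.0 / np.real(np.exp(2j * np.pi * f * n / fs).conj()
                                       @ En @ En.conj().T
                                       @ np.exp(2j * np.pi * f * n / fs))
                         for f in freqs])

    peaks, _ = find_peaks(spectrum)
    best = peaks[np.argsort(spectrum[peaks])[-n_sinusoids:]]
    return np.sort(freqs[best])

fs = 256.0
t = np.arange(0, 8.0, 1 / fs)
imf = np.sin(2 * np.pi * 3.2 * t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)
print("estimated frequency (Hz):", music_frequencies(imf, fs, n_sinusoids=1))
```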

  11. EEMD-MUSIC-based analysis for natural frequencies identification of structures using artificial and natural excitations.

    PubMed

    Camarena-Martinez, David; Amezquita-Sanchez, Juan P; Valtierra-Rodriguez, Martin; Romero-Troncoso, Rene J; Osornio-Rios, Roque A; Garcia-Perez, Arturo

    2014-01-01

    This paper presents a new EEMD-MUSIC- (ensemble empirical mode decomposition-multiple signal classification-) based methodology to identify modal frequencies in structures from free and ambient vibration signals produced by artificial and natural excitations, while also considering several factors such as nonstationary effects, close modal frequencies, and noisy environments, which are common situations in which several techniques reported in the literature fail. The EEMD and MUSIC methods are used to decompose the vibration signal into a set of IMFs (intrinsic mode functions) and to identify the natural frequencies of a structure, respectively. The effectiveness of the proposed methodology has been validated and tested with synthetic signals and under real operating conditions. The experiments focus on extracting the natural frequencies of a truss-type scaled structure and of a bridge used for both highway traffic and pedestrians. Results show the proposed methodology to be a suitable solution for natural frequency identification of structures from free and ambient vibration signals.

  12. Cost-benefit analysis of space technology

    NASA Technical Reports Server (NTRS)

    Hein, G. F.; Stevenson, S. M.; Sivo, J. N.

    1976-01-01

    A discussion of the implications and problems associated with the use of cost-benefit techniques is presented. Knowledge of these problems is useful in structuring a decision-making process. A methodology of cost-benefit analysis is presented for the evaluation of space technology. The use of the methodology is demonstrated with an evaluation of ion thrusters for north-south stationkeeping aboard geosynchronous communication satellites. A critique of the concept of consumer surplus for measuring benefits is also presented.

  13. Methodological support for the further abstraction of and philosophical examination of empirical findings in the context of caring science

    PubMed Central

    Lindberg, Elisabeth; Österberg, Sofia A.; Hörberg, Ulrica

    2016-01-01

    Phenomena in caring science are often complex and laden with meanings. Empirical research with the aim of capturing lived experiences is one way of revealing the complexity. Sometimes, however, results from empirical research need to be further discussed. One way is to further abstract the result and/or philosophically examine it. This has previously been performed and presented in scientific journals and doctoral theses, contributing to a greater understanding of phenomena in caring science. Although the intentions in many of these publications are laudable, the lack of methodological descriptions as well as a theoretical and systematic foundation can contribute to an ambiguity concerning how the results have emerged during the analysis. The aim of this paper is to describe the methodological support for the further abstraction of and/or philosophical examination of empirical findings. When trying to systematize the support procedures, we have used a reflective lifeworld research (RLR) approach. Based on the assumptions in RLR, this article will present methodological support for a theoretical examination that can include two stages. In the first stage, data from several (two or more) empirical results on an essential level are synthesized into a general structure. Sometimes the analysis ends with the general structure, but sometimes there is a need to proceed further. The second stage can then be a philosophical examination, in which the general structure is discussed in relation to a philosophical text, theory, or concept. It is important that the theories are brought in as the final stage after the completion of the analysis. Core dimensions of the described methodological support are, in accordance with RLR, openness, bridling, and reflection. The methodological support cannot be understood as fixed stages, but rather as a guiding light in the search for further meanings. PMID:26925926

  14. Analysis of Challenges for Management Education in India Using Total Interpretive Structural Modelling

    ERIC Educational Resources Information Center

    Mahajan, Ritika; Agrawal, Rajat; Sharma, Vinay; Nangia, Vinay

    2016-01-01

    Purpose: The purpose of this paper is to identify challenges for management education in India and explain their nature, significance and interrelations using total interpretive structural modelling (TISM), an innovative version of Warfield's interpretive structural modelling (ISM). Design/methodology/approach: The challenges have been drawn from…

  15. Does Gender-Specific Differential Item Functioning Affect the Structure in Vocational Interest Inventories?

    ERIC Educational Resources Information Center

    Beinicke, Andrea; Pässler, Katja; Hell, Benedikt

    2014-01-01

    The study investigates consequences of eliminating items showing gender-specific differential item functioning (DIF) on the psychometric structure of a standard RIASEC interest inventory. Holland's hexagonal model was tested for structural invariance using a confirmatory methodological approach (confirmatory factor analysis and randomization…

  16. The prospect of modern thermomechanics in structural integrity calculations of large-scale pressure vessels

    NASA Astrophysics Data System (ADS)

    Fekete, Tamás

    2018-05-01

    Structural integrity calculations play a crucial role in designing large-scale pressure vessels. Used in the electric power generation industry, these kinds of vessels undergo extensive safety analyses and certification procedures before being deemed feasible for future long-term operation. The calculations are nowadays directed and supported by international standards and guides based on state-of-the-art results of applied research and technical development. However, their ability to predict a vessel's behavior under accidental circumstances after long-term operation is largely limited by the strong dependence of the analysis methodology on empirical models that are correlated to the behavior of structural materials and their changes during material aging. Recently, a new scientific-engineering paradigm, structural integrity, has been developing that is essentially a synergistic collaboration between a number of scientific and engineering disciplines, modeling, experiments and numerics. Although the application of the structural integrity paradigm has contributed greatly to improving the accuracy of safety evaluations of large-scale pressure vessels, the predictive power of the analysis methodology has not yet improved significantly. This is due to the fact that existing structural integrity calculation methodologies are based on the widespread and commonly accepted 'traditional' engineering thermal stress approach, which is essentially based on the weakly coupled model of thermomechanics and fracture mechanics. Recently, research has been initiated at MTA EK with the aim of reviewing and evaluating current methodologies and models applied in structural integrity calculations, including their scope of validity. The research intends to come to a better understanding of the physical problems that are inherently present in the pool of structural integrity problems of reactor pressure vessels, and to ultimately find a theoretical framework that could serve as a well-grounded foundation for a new modeling framework of structural integrity. This paper presents the first findings of the research project.

  17. The decade 1989-1998 in Spanish psychology: an analysis of research in statistics, methodology, and psychometric theory.

    PubMed

    García-Pérez, M A

    2001-11-01

    This paper presents an analysis of research published in the decade 1989-1998 by Spanish faculty members in the areas of statistical methods, research methodology, and psychometric theory. Database search and direct correspondence with faculty members in Departments of Methodology across Spain rendered a list of 193 papers published in these broad areas by 82 faculty members. These and other faculty members had actually published 931 papers over the decade of analysis, but 738 of them addressed topics not appropriate for description in this report. Classification and analysis of these 193 papers revealed topics that have attracted the most interest (psychophysics, item response theory, analysis of variance, sequential analysis, and meta-analysis) as well as other topics that have received less attention (scaling, factor analysis, time series, and structural models). A significant number of papers also dealt with various methodological issues (software, algorithms, instrumentation, and techniques). A substantial part of this report is devoted to describing the issues addressed across these 193 papers--most of which are written in the Spanish language and published in Spanish journals--and some representative references are given.

  18. Residual Strength Characterization of Unitized Structures Fabricated Using Different Manufacturing Technologies

    NASA Technical Reports Server (NTRS)

    Seshadri, B. R.; Smith, S. W.; Johnston, W. M.

    2008-01-01

    This viewgraph presentation describes residual strength analysis of integral structures fabricated using different manufacturing procedures. The topics include: 1) Built-up and Integral Structures; 2) Development of Prediction Methodology for Integral Structures Fabricated using different Manufacturing Procedures; 3) Testing Facility; 4) Fracture Parameters Definition; 5) Crack Branching in Integral Structures; 6) Results and Discussion; and 7) Concluding Remarks.

  19. Application of enhanced modern structured analysis techniques to Space Station Freedom electric power system requirements

    NASA Technical Reports Server (NTRS)

    Biernacki, John; Juhasz, John; Sadler, Gerald

    1991-01-01

    A team of Space Station Freedom (SSF) system engineers is in the process of extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.

  20. Methods for heat transfer and temperature field analysis of the insulated diesel phase 2 progress report

    NASA Technical Reports Server (NTRS)

    Morel, T.; Kerlbar, R.; Fort, E. F.; Blumberg, P. N.

    1985-01-01

    This report describes work done during Phase 2 of a 3-year program aimed at developing a comprehensive heat transfer and thermal analysis methodology for design analysis of insulated diesel engines. The overall program addresses all the key heat transfer issues: (1) spatially and time-resolved convective and radiative in-cylinder heat transfer, (2) steady-state conduction in the overall structure, and (3) cyclical and load/speed temperature transients in the engine structure. During Phase 2, a radiation heat transfer model was developed which accounts for soot formation and burnup. A methodology was developed for carrying out the multi-dimensional finite-element heat conduction calculations within the framework of thermodynamic cycle codes. Studies were carried out using the integrated methodology to address key issues in low-heat-rejection engines. A wide-ranging design analysis matrix was covered, including a variety of insulation strategies, recovery devices and base engine configurations. A single-cylinder Cummins engine was installed at Purdue University and brought to full operational status. The development of instrumentation continued, concentrating on a radiation heat flux detector, a total heat flux probe, and accurate pressure-crank angle data acquisition.
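
    The cyclical temperature-transient part of the problem can be pictured with a one-dimensional explicit finite-difference model of an insulated wall, driven by a periodically swinging gas temperature on one face and coolant convection on the other. The sketch below uses invented properties and a crude gas-temperature profile; it is a textbook-style stand-in, not the multi-dimensional finite-element methodology coupled to the thermodynamic cycle code in the program.

```python
import numpy as np

# Hypothetical insulated-wall properties (ceramic-like coating, SI units)
k, rho, cp = 2.0, 5500.0, 500.0                  # conductivity, density, specific heat
alpha = k / (rho * cp)
L, n = 0.004, 41                                 # wall thickness (m), node count
dx = L / (n - 1)
period = 0.05                                    # assumed engine cycle period (s)
dt = min(0.25 * dx**2 / alpha, period / 200)     # explicit stability + cycle resolution

h_gas, h_cool = 800.0, 5000.0                    # convective coefficients (W/m^2 K)
T_cool = 380.0                                   # coolant temperature (K)
T = np.full(n, 500.0)                            # initial wall temperature (K)

t = 0.0
for _ in range(200000):
    # Crude periodic gas temperature standing in for the cycle-resolved boundary condition
    T_gas = 900.0 + 600.0 * max(0.0, np.sin(2 * np.pi * t / period))
    Tn = T.copy()
    # Interior nodes: explicit FTCS update of the 1-D heat equation
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    # Convective (Robin) boundaries on the gas-side and coolant-side faces
    Tn[0] = T[0] + 2 * alpha * dt / dx**2 * (T[1] - T[0] + h_gas * dx / k * (T_gas - T[0]))
    Tn[-1] = T[-1] + 2 * alpha * dt / dx**2 * (T[-2] - T[-1] + h_cool * dx / k * (T_cool - T[-1]))
    T, t = Tn, t + dt

print("gas-side / coolant-side wall temperatures (K):", round(T[0], 1), round(T[-1], 1))
```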

  1. Using Social Networking to Understand Social Networks: Analysis of a Mobile Phone Closed User Group Used by a Ghanaian Health Team

    PubMed Central

    Akosah, Eric; Ohemeng-Dapaah, Seth; Sakyi Baah, Joseph; Kanter, Andrew S

    2013-01-01

    Background: The network structure of an organization influences how well or poorly an organization communicates and manages its resources. In the Millennium Villages Project site in Bonsaaso, Ghana, a mobile phone closed user group has been introduced for use by the Bonsaaso Millennium Villages Project Health Team and other key individuals. No assessment on the benefits or barriers of the use of the closed user group had been carried out. Objective: The purpose of this research was to make the case for the use of social network analysis methods to be applied in health systems research—specifically related to mobile health. Methods: This study used mobile phone voice records of, conducted interviews with, and reviewed call journals kept by a mobile phone closed user group consisting of the Bonsaaso Millennium Villages Project Health Team. Social network analysis methodology complemented by a qualitative component was used. Monthly voice data of the closed user group from Airtel Bharti Ghana were analyzed using UCINET and visual depictions of the network were created using NetDraw. Interviews and call journals kept by informants were analyzed using NVivo. Results: The methodology was successful in helping identify effective organizational structure. Members of the Health Management Team were the more central players in the network, rather than the Community Health Nurses (who might have been expected to be central). Conclusions: Social network analysis methodology can be used to determine the most productive structure for an organization or team, identify gaps in communication, identify key actors with greatest influence, and more. In conclusion, this methodology can be a useful analytical tool, especially in the context of mobile health, health services, and operational and managerial research. PMID:23552721
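
    The analytical core of the study — building a directed call network from the voice records and ranking actors by centrality — can be reproduced with any standard SNA toolkit. The snippet below uses Python's networkx as a stand-in for UCINET/NetDraw on a tiny, made-up call log; the roles and call counts are hypothetical.

```python
import networkx as nx

# Hypothetical monthly call log within the closed user group: (caller, receiver, calls)
calls = [("coordinator", "nurse_A", 14), ("coordinator", "nurse_B", 9),
         ("data_manager", "coordinator", 11), ("nurse_A", "nurse_B", 2),
         ("driver", "coordinator", 6), ("nurse_B", "data_manager", 4)]

G = nx.DiGraph()
G.add_weighted_edges_from(calls)

# Centrality measures commonly used to identify the most central players
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)
in_strength = dict(G.in_degree(weight="weight"))    # total calls received

for node in G:
    print(node, round(degree[node], 2), round(betweenness[node], 2), in_strength[node])
```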

  2. Using social networking to understand social networks: analysis of a mobile phone closed user group used by a Ghanaian health team.

    PubMed

    Kaonga, Nadi Nina; Labrique, Alain; Mechael, Patricia; Akosah, Eric; Ohemeng-Dapaah, Seth; Sakyi Baah, Joseph; Kodie, Richmond; Kanter, Andrew S; Levine, Orin

    2013-04-03

    The network structure of an organization influences how well or poorly an organization communicates and manages its resources. In the Millennium Villages Project site in Bonsaaso, Ghana, a mobile phone closed user group has been introduced for use by the Bonsaaso Millennium Villages Project Health Team and other key individuals. No assessment on the benefits or barriers of the use of the closed user group had been carried out. The purpose of this research was to make the case for the use of social network analysis methods to be applied in health systems research--specifically related to mobile health. This study used mobile phone voice records of, conducted interviews with, and reviewed call journals kept by a mobile phone closed user group consisting of the Bonsaaso Millennium Villages Project Health Team. Social network analysis methodology complemented by a qualitative component was used. Monthly voice data of the closed user group from Airtel Bharti Ghana were analyzed using UCINET and visual depictions of the network were created using NetDraw. Interviews and call journals kept by informants were analyzed using NVivo. The methodology was successful in helping identify effective organizational structure. Members of the Health Management Team were the more central players in the network, rather than the Community Health Nurses (who might have been expected to be central). Social network analysis methodology can be used to determine the most productive structure for an organization or team, identify gaps in communication, identify key actors with greatest influence, and more. In conclusion, this methodology can be a useful analytical tool, especially in the context of mobile health, health services, and operational and managerial research.

  3. Strategies of TV News Dramatization: An Attempt of Discourse Analysis.

    ERIC Educational Resources Information Center

    Mancini, Paolo

    This paper defines indicators related to the dramatization of television and formulates a methodology for analyzing the discourse of the television news based on empirical studies. This methodology is used to isolate some indicators of dramatization as it relates to the structure and form of the message. The changes that have affected the text of…

  4. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1986-01-01

    Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.

  5. Multi-resolution analysis for region of interest extraction in thermographic nondestructive evaluation

    NASA Astrophysics Data System (ADS)

    Ortiz-Jaramillo, B.; Fandiño Toro, H. A.; Benitez-Restrepo, H. D.; Orjuela-Vargas, S. A.; Castellanos-Domínguez, G.; Philips, W.

    2012-03-01

    Infrared Non-Destructive Testing (INDT) is known as an effective and rapid method for nondestructive inspection. It can detect a broad range of near-surface structural flaws in metallic and composite components. Those flaws are modeled as smooth contours centered at peaks of stored thermal energy, termed Regions of Interest (ROI). Dedicated methodologies must detect the presence of those ROIs. In this paper, we present a methodology for ROI extraction in INDT tasks that deals with the difficulties caused by non-uniform heating, which affects low spatial frequencies and hinders the detection of relevant points in the image. The proposed methodology, based on multi-resolution analysis, is robust to low ROI contrast and non-uniform heating, and comprises local correlation, Gaussian scale analysis and local edge detection. Local correlation between the image and a Gaussian window provides interest points related to ROIs; a Gaussian window is used because the thermal behavior is well modeled by smooth Gaussian contours. The Gaussian scale is used to analyze details in the image through multi-resolution analysis, avoiding the problems of low contrast, non-uniform heating and selection of the Gaussian window size. Finally, local edge detection provides a good estimate of the ROI boundaries. Thus, we provide a methodology for ROI extraction based on multi-resolution analysis that performs as well as or better than the other dedicated algorithms reported in the state of the art.
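
    The multi-resolution idea — responding to compact Gaussian-like hot spots at several scales while suppressing the slowly varying heating trend, then keeping strong local maxima as ROI candidates — can be sketched with SciPy's image tools. The example below uses a difference-of-Gaussians band-pass as a simple proxy for the local correlation with a Gaussian window described above, on a synthetic thermogram; scales and thresholds are illustrative, and the published methodology additionally applies local edge detection to delineate ROI boundaries.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

rng = np.random.default_rng(0)
img = rng.normal(0.0, 0.05, (128, 128))
img += np.linspace(0.0, 0.4, 128)[None, :]           # slow non-uniform heating trend
yy, xx = np.mgrid[0:128, 0:128]
img += 0.6 * np.exp(-((yy - 40)**2 + (xx - 90)**2) / (2 * 4.0**2))   # synthetic flaw

candidates = []
for sigma in (2.0, 4.0, 8.0):
    # Difference-of-Gaussians response: band-pass suppresses the slow heating trend
    response = gaussian_filter(img, sigma) - gaussian_filter(img, 3 * sigma)
    local_max = response == maximum_filter(response, size=int(6 * sigma) + 1)
    strong = response > response.mean() + 4 * response.std()
    candidates.extend(zip(*np.nonzero(local_max & strong)))

print("candidate ROI centres (row, col):", sorted(set(candidates)))
```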

  6. Formulation of the nonlinear analysis of shell-like structures, subjected to time-dependent mechanical and thermal loading

    NASA Technical Reports Server (NTRS)

    Simitses, George J.; Carlson, Robert L.; Riff, Richard

    1991-01-01

    The object of the research reported herein was to develop a general mathematical model and solution methodologies for analyzing the structural response of thin, metallic shell structures under large transient, cyclic, or static thermomechanical loads. Among the system responses associated with these loads and conditions are thermal buckling, creep buckling, and ratcheting. Thus geometric and material nonlinearities (of high order) can be anticipated and must be considered in developing the mathematical model. The methodology is demonstrated through different problems of extension, shear, and planar curved beams. Moreover, the importance of including large strains is clearly demonstrated through the chosen applications.

  7. Validating a Finite Element Model of a Structure Subjected to Mine Blast with Experimental Modal Analysis

    DTIC Science & Technology

    2017-11-01

    The Under-body Blast Methodology (UBM) for the Test and Evaluation (T&E) program was established to provide a capability for the US Army Test and Evaluation Command to assess the vulnerability of vehicles to under-body blast. Finite element (FE) models are part of the current UBM for T&E methodology.

  8. ’Coxiella Burnetii’ Vaccine Development: Lipopolysaccharide Structural Analysis

    DTIC Science & Technology

    1991-02-20

    Analytical instrumentation and methodology are presented for the determination of endotoxin-related structures at much improved sensitivity and specificity, including endotoxin characterization by SFC and Coxiella burnetii LPS characterization. Reports and their applications are listed in…

  9. The Influence of Accreditation on the Sustainability of Organizations with the Brazilian Accreditation Methodology

    PubMed Central

    de Paiva, Anderson Paulo

    2018-01-01

    This research evaluates the influence of the Brazilian accreditation methodology on the sustainability of organizations. Critical factors for implementing accreditation were also examined, including measurement of the relationships established between these factors and organizational sustainability. The study was developed based on a survey methodology applied in organizations accredited by ONA (National Accreditation Organization); 288 responses were received from top-level managers. The quantitative data for the measurement models were analyzed with principal component factor analysis. The final model was evaluated using confirmatory factor analysis and structural equation modeling techniques. The results of the research are vital for defining the factors that interfere in accreditation processes, providing a better understanding for accredited organizations and for Brazilian accreditation. PMID:29599939

  10. Predicting the Reliability of Ceramics Under Transient Loads and Temperatures With CARES/Life

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Jadaan, Osama M.; Palfi, Tamas; Baker, Eric H.

    2003-01-01

    A methodology is shown for predicting the time-dependent reliability of ceramic components against catastrophic rupture when subjected to transient thermomechanical loads (including cyclic loads). The methodology takes into account the changes in material response that can occur with temperature or time (i.e., changing fatigue and Weibull parameters with temperature or time). This capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. The code has been modified to have the ability to interface with commercially available finite element analysis (FEA) codes executed for transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.

  11. Nonlinear damage detection in composite structures using bispectral analysis

    NASA Astrophysics Data System (ADS)

    Ciampa, Francesco; Pickering, Simon; Scarselli, Gennaro; Meo, Michele

    2014-03-01

    The literature offers a considerable number of diagnostic methods that can continuously provide detailed information on material defects and damage in aerospace and civil engineering applications. Indeed, low-velocity impact damage can considerably degrade the integrity of structural components and, if not detected, can result in catastrophic failure conditions. This paper presents a nonlinear Structural Health Monitoring (SHM) method, based on ultrasonic guided waves (GW), for the detection of the nonlinear signature in a damaged composite structure. The proposed technique, based on a bispectral analysis of ultrasonic input waveforms, allows for the evaluation of the nonlinear response due to the presence of cracks and delaminations. Indeed, such a methodology was used to characterize the nonlinear behaviour of the structure by exploiting the frequency mixing of the original waveform acquired from a sparse array of sensors. The robustness of bispectral analysis was experimentally demonstrated on a damaged carbon fibre reinforced plastic (CFRP) composite panel, and the nonlinear source was retrieved with a high level of accuracy. Unlike other linear and nonlinear ultrasonic methods for damage detection, this methodology does not require any baseline with the undamaged structure for the evaluation of the nonlinear source, nor a priori knowledge of the mechanical properties of the specimen. Moreover, bispectral analysis can be considered a nonlinear elastic wave spectroscopy (NEWS) technique for materials showing either classical or non-classical nonlinear behaviour.
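
    The bispectrum underlying the method is the third-order spectrum B(f1, f2) = E[X(f1) X(f2) X*(f1 + f2)], which is large only when spectral components are phase-coupled — the signature of frequency mixing at a contact-type defect. A basic segment-averaged estimator applied to a synthetic, quadratically coupled signal is sketched below; the waveform, segmentation and frequency grid are illustrative.

```python
import numpy as np

def bispectrum(x, nfft=256):
    """Segment-averaged direct estimate of the bispectrum B(f1, f2) of a 1-D signal."""
    segments = x[: len(x) // nfft * nfft].reshape(-1, nfft)
    X = np.fft.rfft(segments * np.hanning(nfft), axis=1)
    nf = X.shape[1] // 2                          # keep f1 + f2 inside the computed band
    B = np.zeros((nf, nf), dtype=complex)
    for f1 in range(nf):
        for f2 in range(nf):
            B[f1, f2] = np.mean(X[:, f1] * X[:, f2] * np.conj(X[:, f1 + f2]))
    return B

fs, nfft = 1000.0, 256
df = fs / nfft
t = np.arange(0, 20.0, 1 / fs)
rng = np.random.default_rng(0)
s1, s2 = np.cos(2 * np.pi * 15 * df * t), np.cos(2 * np.pi * 23 * df * t)
x = s1 + s2 + 0.4 * s1 * s2 + 0.3 * rng.standard_normal(t.size)   # quadratic coupling

B = np.abs(bispectrum(x, nfft))
i, j = np.unravel_index(B.argmax(), B.shape)
print(f"strongest bispectral peak near ({i * df:.1f} Hz, {j * df:.1f} Hz)")
```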

  12. Sociocultural Meanings of Nanotechnology: Research Methodologies

    NASA Astrophysics Data System (ADS)

    Bainbridge, William Sims

    2004-06-01

    This article identifies six social-science research methodologies that will be useful for charting the sociocultural meaning of nanotechnology: web-based questionnaires, vignette experiments, analysis of web linkages, recommender systems, quantitative content analysis, and qualitative textual analysis. Data from a range of sources are used to illustrate how the methods can delineate the intellectual content and institutional structure of the emerging nanotechnology culture. Such methods will make it possible in future to test hypotheses such as that there are two competing definitions of nanotechnology - the technical-scientific and the science-fiction - that are influencing public perceptions by different routes and in different directions.

  13. Non-isothermal elastoviscoplastic analysis of planar curved beams

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.; Carlson, R. L.; Riff, R.

    1988-01-01

    The development of a general mathematical model and solution methodologies to examine the behavior of thin structural elements such as beams, rings, and arches, subjected to large nonisothermal elastoviscoplastic deformations, is presented. Thus, geometric as well as material nonlinearities of higher order are present in the analysis. For this purpose, a complete, true ab initio rate theory of kinematics and kinetics for thin bodies, without any restriction on the magnitude of the transformation, is presented. A previously formulated elasto-thermo-viscoplastic material constitutive law is employed in the analysis. The methodology is demonstrated through three different straight and curved beam problems.

  14. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Harris, Charles E. (Editor)

    1994-01-01

    International technical experts in durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The symposium focused on the dissemination of new knowledge and the peer-review of progress on the development of advanced methodologies. Papers were presented on: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and advanced approaches to resist corrosion and environmentally assisted fatigue.

  15. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance, part 2

    NASA Technical Reports Server (NTRS)

    Harris, Charles E. (Editor)

    1994-01-01

    The international technical experts in the areas of durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The principal focus of the symposium was on the dissemination of new knowledge and the peer-review of progress on the development of advanced methodologies. Papers were presented on the following topics: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and corrosion resistance.

  16. An Innovative Structural Mode Selection Methodology: Application for the X-33 Launch Vehicle Finite Element Model

    NASA Technical Reports Server (NTRS)

    Hidalgo, Homero, Jr.

    2000-01-01

    An innovative methodology for determining structural target mode selection, and mode selection based on a specific criterion, is presented. An effective approach to single out modes which interact with specific locations on a structure has been developed for the X-33 Launch Vehicle Finite Element Model (FEM). The presented Root-Sum-Square (RSS) displacement method computes the resultant modal displacement for each mode at selected degrees of freedom (DOF) and sorts the results to locate the modes with the highest values. This method was used to determine the modes which most influence specific locations/points on the X-33 flight vehicle, such as avionics control components, aero-surface control actuators, propellant valves and engine points, for use in flight control stability analysis and in flight POGO stability analysis. Additionally, the modal RSS method allows primary or global target vehicle modes to be identified in an accurate and efficient manner.
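
    The RSS displacement criterion reduces, for each mode, to the root-sum-square of its eigenvector entries over a chosen set of DOFs, followed by a sort. A compact sketch of that bookkeeping on a made-up mode-shape matrix:

```python
import numpy as np

def rss_mode_ranking(phi, dofs):
    """Rank modes by the RSS of their modal displacements at the selected DOFs.

    phi  : (n_dof, n_modes) mode-shape matrix from the FEM eigensolution
    dofs : indices of the DOFs of interest (e.g., actuator or avionics locations)
    """
    rss = np.sqrt(np.sum(phi[dofs, :] ** 2, axis=0))     # one value per mode
    order = np.argsort(rss)[::-1]                        # highest influence first
    return order, rss[order]

# Toy example: 6 DOFs, 4 modes, with DOFs 2 and 5 standing in for control-component points
rng = np.random.default_rng(1)
phi = rng.standard_normal((6, 4))
modes, scores = rss_mode_ranking(phi, dofs=[2, 5])
print("target-mode ranking:", modes, "RSS values:", np.round(scores, 3))
```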

  17. FAME, a microprocessor based front-end analysis and modeling environment

    NASA Technical Reports Server (NTRS)

    Rosenbaum, J. D.; Kutin, E. B.

    1980-01-01

    Higher order software (HOS) is a methodology for the specification and verification of large scale, complex, real time systems. The HOS methodology was implemented as FAME (front end analysis and modeling environment), a microprocessor based system for interactively developing, analyzing, and displaying system models in a low cost user-friendly environment. The nature of the model is such that when completed it can be the basis for projection to a variety of forms such as structured design diagrams, Petri-nets, data flow diagrams, and PSL/PSA source code. The user's interface with the analyzer is easily recognized by any current user of a structured modeling approach; therefore extensive training is unnecessary. Furthermore, when all the system capabilities are used one can check on proper usage of data types, functions, and control structures thereby adding a new dimension to the design process that will lead to better and more easily verified software designs.

  18. Progress in multirate digital control system design

    NASA Technical Reports Server (NTRS)

    Berg, Martin C.; Mason, Gregory S.

    1991-01-01

    A new methodology for multirate sampled-data control design based on a new generalized control law structure, two new parameter-optimization-based control law synthesis methods, and a new singular-value-based robustness analysis method are described. The control law structure can represent multirate sampled-data control laws of arbitrary structure and dynamic order, with arbitrarily prescribed sampling rates for all sensors and update rates for all processor states and actuators. The two control law synthesis methods employ numerical optimization to determine values for the control law parameters. The robustness analysis method is based on the multivariable Nyquist criterion applied to the loop transfer function for the sampling period equal to the period of repetition of the system's complete sampling/update schedule. The complete methodology is demonstrated by application to the design of a combination yaw damper and modal suppression system for a commercial aircraft.

  19. Comparison of Damage Path Predictions for Composite Laminates by Explicit and Standard Finite Element Analysis Tools

    NASA Technical Reports Server (NTRS)

    Bogert, Philip B.; Satyanarayana, Arunkumar; Chunchu, Prasad B.

    2006-01-01

    Splitting, ultimate failure load and the damage path in center notched composite specimens subjected to in-plane tension loading are predicted using progressive failure analysis methodology. A 2-D Hashin-Rotem failure criterion is used in determining intra-laminar fiber and matrix failures. This progressive failure methodology has been implemented in the Abaqus/Explicit and Abaqus/Standard finite element codes through user written subroutines "VUMAT" and "USDFLD" respectively. A 2-D finite element model is used for predicting the intra-laminar damages. Analysis results obtained from the Abaqus/Explicit and Abaqus/Standard code show good agreement with experimental results. The importance of modeling delamination in progressive failure analysis methodology is recognized for future studies. The use of an explicit integration dynamics code for simple specimen geometry and static loading establishes a foundation for future analyses where complex loading and nonlinear dynamic interactions of damage and structure will necessitate it.

  20. Social capital calculations in economic systems: Experimental study

    NASA Astrophysics Data System (ADS)

    Chepurov, E. G.; Berg, D. B.; Zvereva, O. M.; Nazarova, Yu. Yu.; Chekmarev, I. V.

    2017-11-01

    The paper describes a study of social capital in a system whose actors are engaged in economic activity. The focus is on the analysis of the structural parameters of the communications (transactions) between the actors. Comparison of the transaction network graph structure with the structure of a random Bernoulli graph of the same dimension and density reveals specific structural features of the economic system under study. The structural analysis is based on SNA methodology (SNA - Social Network Analysis). It is shown that the structural parameter values of the graph formed by agent relationship links can characterize different aspects of the social capital structure. The research argues that it is useful to distinguish between the social capital of each agent and the social capital of the system as a whole.
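
    The comparison step — contrasting the observed transaction network with a Bernoulli (Erdős-Rényi) random graph of the same size and density — is straightforward with standard SNA libraries. The sketch below uses networkx on a hypothetical transaction edge list; the structural indicators chosen (density, clustering, reciprocity) are illustrative, not the full indicator set of the study.

```python
import networkx as nx

# Hypothetical transaction links between economic agents
edges = [(1, 2), (2, 3), (3, 1), (3, 4), (4, 5), (5, 3), (5, 6), (6, 1), (2, 6)]
G = nx.DiGraph(edges)

n, density = G.number_of_nodes(), nx.density(G)

# Bernoulli (Erdos-Renyi) reference graph with the same dimension and density
R = nx.gnp_random_graph(n, density, seed=42, directed=True)

for name, graph in (("observed", G), ("random", R)):
    print(name,
          "density:", round(nx.density(graph), 3),
          "clustering:", round(nx.average_clustering(graph.to_undirected()), 3),
          "reciprocity:", round(nx.reciprocity(graph), 3))
```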

  1. Software Methodology Catalog. Second Edition. Revision

    DTIC Science & Technology

    1989-03-01

    structured design involve characterization of the data flow through graphical representation, identification of the various transform elements, assembling...and graphical diagrams to facilitate communication within the team. The diagrams are consistent with the design language and can be automatically...organization, box structure graphics provide a visual means of client communication. These box structures are used during analysis and design to review

  2. A Sizing Methodology for the Conceptual Design of Blended-Wing-Body Transports. Degree awarded by George Washington Univ.

    NASA Technical Reports Server (NTRS)

    Kimmel, William M. (Technical Monitor); Bradley, Kevin R.

    2004-01-01

    This paper describes the development of a methodology for sizing Blended-Wing-Body (BWB) transports and how the capabilities of the Flight Optimization System (FLOPS) have been expanded using that methodology. In this approach, BWB transports are sized based on the number of passengers in each class that must fit inside the centerbody or pressurized vessel. Weight estimation equations for this centerbody structure were developed using Finite Element Analysis (FEA). This paper shows how the sizing methodology has been incorporated into FLOPS to enable the design and analysis of BWB transports. Previous versions of FLOPS did not have the ability to accurately represent or analyze BWB configurations in any reliable, logical way. The expanded capabilities allow the design and analysis of a 200 to 450-passenger BWB transport or the analysis of a BWB transport for which the geometry is already known. The modifications to FLOPS resulted in differences of less than 4 percent for the ramp weight of a BWB transport in this range when compared to previous studies performed by NASA and Boeing.

  3. Application of atomic force microscopy as a nanotechnology tool in food science.

    PubMed

    Yang, Hongshun; Wang, Yifen; Lai, Shaojuan; An, Hongjie; Li, Yunfei; Chen, Fusheng

    2007-05-01

    Atomic force microscopy (AFM) provides a method for detecting nanoscale structural information. First, this review explains the fundamentals of AFM, including principles, manipulation, and analysis. Applications of AFM are then reported in food science and technology research, including qualitative macromolecule and polymer imaging, complicated or quantitative structure analysis, molecular interaction, molecular manipulation, surface topography, and nanofood characterization. The results suggest that AFM can provide insight into food properties, and AFM analysis can be used to illustrate some mechanisms of property changes during processing and storage. However, the current difficulty in applying AFM to food research is the lack of appropriate methodology for different food systems. A better understanding of AFM technology and the development of corresponding methodology for complicated food systems would lead to a more in-depth understanding of food properties at the macromolecular level and broaden its applications. The AFM results could greatly improve food processing and storage technologies.

  4. Development of advanced structural analysis methodologies for predicting widespread fatigue damage in aircraft structures

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Starnes, James H., Jr.; Newman, James C., Jr.

    1995-01-01

    NASA is developing a 'tool box' that includes a number of advanced structural analysis computer codes which, taken together, represent the comprehensive fracture mechanics capability required to predict the onset of widespread fatigue damage. These structural analysis tools have complementary and specialized capabilities ranging from a finite-element-based stress-analysis code for two- and three-dimensional built-up structures with cracks to a fatigue and fracture analysis code that uses stress-intensity factors and material-property data found in 'look-up' tables or from equations. NASA is conducting critical experiments necessary to verify the predictive capabilities of the codes, and these tests represent a first step in the technology-validation and industry-acceptance processes. NASA has established cooperative programs with aircraft manufacturers to facilitate the comprehensive transfer of this technology by making these advanced structural analysis codes available to industry.

  5. Contemporary Impact Analysis Methodology for Planetary Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Perino, Scott V.; Bayandor, Javid; Samareh, Jamshid A.; Armand, Sasan C.

    2015-01-01

    Development of an Earth entry vehicle and the methodology created to evaluate the vehicle's impact landing response when returning to Earth is reported. NASA's future Mars Sample Return Mission requires a robust vehicle to return Martian samples back to Earth for analysis. The Earth entry vehicle is a proposed solution to this Mars mission requirement. During Earth reentry, the vehicle slows within the atmosphere and then impacts the ground at its terminal velocity. To protect the Martian samples, a spherical energy absorber called an impact sphere is under development. The impact sphere is composed of hybrid composite and crushable foam elements that endure large plastic deformations during impact and cause a highly nonlinear vehicle response. The developed analysis methodology captures a range of complex structural interactions and much of the failure physics that occurs during impact. Numerical models were created and benchmarked against experimental tests conducted at NASA Langley Research Center. The postimpact structural damage assessment showed close correlation between simulation predictions and experimental results. Acceleration, velocity, displacement, damage modes, and failure mechanisms were all effectively captured. These investigations demonstrate that the Earth entry vehicle has great potential in facilitating future sample return missions.

  6. Automatic Inference of Cryptographic Key Length Based on Analysis of Proof Tightness

    DTIC Science & Technology

    2016-06-01

    within an attack tree structure, then expand attack tree methodology to include cryptographic reductions. We then provide the algorithms for...maintaining and automatically reasoning about these expanded attack trees . We provide a software tool that utilizes machine-readable proof and attack metadata...and the attack tree methodology to provide rapid and precise answers regarding security parameters and effective security. This eliminates the need

  7. The Use of Culture in Operational Planning

    DTIC Science & Technology

    2005-06-17

    comparativism, relativism, functionalism, and structuralism. It will conclude by describing the methodology that will be used for this paper...combination of both approaches. The first two methodologies out of the four discussed in this paper are relativism and comparativism. These theories look...the framework for analysis, there will be evidence of relativism and comparativism. The study will be from the etic viewpoint. The only means to get an

  8. Biophysical analysis of bacterial and viral systems. A shock tube study of bio-aerosols and a correlated AFM/nanosims investigation of vaccinia virus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gates, Sean Damien

    2013-05-01

    The work presented herein is concerned with the development of biophysical methodology designed to address pertinent questions regarding the behavior and structure of select pathogenic agents. Two distinct studies are documented: a shock tube analysis of endospore-laden bio-aerosols and a correlated AFM/NanoSIMS study of the structure of vaccinia virus.

  9. Methodology for CFD Design Analysis of National Launch System Nozzle Manifold

    NASA Technical Reports Server (NTRS)

    Haire, Scot L.

    1993-01-01

    The current design environment dictates that high-technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate of the flow distribution within the manifold. A complex, 3D, multiple-zone, structured grid was generated from a 3D CAD file of the geometry. An Euler solution was computed with a fully implicit compressible flow solver. Post-processing consisted of full 3D color graphics and mass-averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in, and the performance characteristics of, the film cooling manifold within an effective time frame. This design methodology also served as the foundation for a quick-turnaround CFD analysis of the next iteration of the manifold design.

  10. Sixteenth NASTRAN (R) Users' Colloquium

    NASA Technical Reports Server (NTRS)

    1988-01-01

    These are the proceedings of the Sixteenth NASTRAN Users' Colloquium held in Arlington, Virginia from 25 to 29 April, 1988. Technical papers contributed by participants review general application of finite element methodology and the specific application of the NASA Structural Analysis System (NASTRAN) to a variety of static and dynamic structural problems.

  11. An approximate methods approach to probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.
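
    The sketch below conveys the general idea of an approximate probabilistic response calculation using a simple mean-value, first-order estimate; it is not the Wu-Wirsching fast probability integration algorithm used in PSAM, and the response function and input statistics are hypothetical.

    ```python
    # Mean-value, first-order estimate of P(response > limit). Generic illustration
    # only, not the FPI algorithm; response function and statistics are hypothetical.
    import math

    def response(load, area, modulus):
        """Hypothetical structural response (a stress-like quantity)."""
        return load / area * (1.0 + 1.0e5 / modulus)

    means = {"load": 10.0e3, "area": 2.0e-3, "modulus": 70.0e9}
    stdevs = {"load": 1.0e3, "area": 1.0e-4, "modulus": 3.5e9}
    limit = 6.0e6  # allowable response

    mu = response(**means)
    var = 0.0
    for name in means:                      # first-order variance from sensitivities
        up, dn = dict(means), dict(means)
        up[name] *= 1.0 + 1.0e-6
        dn[name] *= 1.0 - 1.0e-6
        slope = (response(**up) - response(**dn)) / (2.0e-6 * means[name])
        var += (slope * stdevs[name]) ** 2

    beta = (limit - mu) / math.sqrt(var)               # reliability index
    p_exceed = 0.5 * math.erfc(beta / math.sqrt(2.0))  # normal-tail approximation
    print("mean response %.3e, beta %.2f, P(exceed) %.2e" % (mu, beta, p_exceed))
    ```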

  12. Incorporating Probability Models of Complex Test Structures to Perform Technology Independent FPGA Single Event Upset Analysis

    NASA Technical Reports Server (NTRS)

    Berg, M. D.; Kim, H. S.; Friendlich, M. A.; Perez, C. E.; Seidlick, C. M.; LaBel, K. A.

    2011-01-01

    We present SEU test and analysis of the Microsemi ProASIC3 FPGA. SEU Probability models are incorporated for device evaluation. Included is a comparison to the RTAXS FPGA illustrating the effectiveness of the overall testing methodology.

  13. Improved FTA methodology and application to subsea pipeline reliability design.

    PubMed

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, the Failure Expansion Tree (FET), is proposed in this paper; it improves on traditional Fault Tree Analysis (FTA) and reflects a different way of thinking about risk factor identification and reliability risk assessment. The more comprehensive and objective methodology significantly reduces the rather subjective nature of FTA node discovery and greatly simplifies the mathematical calculations required for quantitative analysis. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree that follows the laws of physics and geometry. The resulting improvements are summarized in a comparison table.
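
    To make the quantitative step concrete, here is a minimal sketch of how independent basic-event probabilities combine through AND/OR gates into a top-event probability; the gates and the numbers are hypothetical, not the subsea pipeline model of the paper.

    ```python
    # Top-event probability from independent basic events through AND/OR gates.
    # The tree and the probabilities are hypothetical.

    def p_and(*probs):
        """AND gate: all independent events must occur."""
        out = 1.0
        for p in probs:
            out *= p
        return out

    def p_or(*probs):
        """OR gate: at least one independent event occurs."""
        out = 1.0
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out

    corrosion_leak = p_and(0.02, 0.10)      # coating breach AND cathodic protection failure
    mechanical_leak = p_or(0.005, 0.003)    # anchor strike OR dropped object
    top_event = p_or(corrosion_leak, mechanical_leak)
    print("P(pipeline leak) = %.4f" % top_event)
    ```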

  14. Improved FTA Methodology and Application to Subsea Pipeline Reliability Design

    PubMed Central

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, the Failure Expansion Tree (FET), is proposed in this paper; it improves on traditional Fault Tree Analysis (FTA) and reflects a different way of thinking about risk factor identification and reliability risk assessment. The more comprehensive and objective methodology significantly reduces the rather subjective nature of FTA node discovery and greatly simplifies the mathematical calculations required for quantitative analysis. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree that follows the laws of physics and geometry. The resulting improvements are summarized in a comparison table. PMID:24667681

  15. Integration of topological modification within the modeling of multi-physics systems: Application to a Pogo-stick

    NASA Astrophysics Data System (ADS)

    Abdeljabbar Kharrat, Nourhene; Plateaux, Régis; Miladi Chaabane, Mariem; Choley, Jean-Yves; Karra, Chafik; Haddar, Mohamed

    2018-05-01

    The present work tackles the modeling of multi-physics systems by applying a topological approach and proposing a new methodology that introduces a topological modification to the structure of the system. A comparison with Magos' methodology is then made; their common ground is the use of connectivity within systems. The comparison and analysis of the different types of modeling show the importance of the topological methodology through the integration of the topological modification into the topological structure of a multi-physics system. In order to validate this methodology, the case of a Pogo stick is studied. The first step consists of generating a topological graph of the system. The connectivity step then takes the contact with the ground into account. During the last step of this research, the MGS (Modeling of General System) language is used to model the system through equations. Finally, the results are compared to those obtained with MODELICA. This proposed methodology may therefore be generalized to model multi-physics systems that can be considered as a set of local elements.

  16. Infrared spectroscopy as a tool to characterise starch ordered structure--a joint FTIR-ATR, NMR, XRD and DSC study.

    PubMed

    Warren, Frederick J; Gidley, Michael J; Flanagan, Bernadine M

    2016-03-30

    Starch has a heterogeneous, semi-crystalline granular structure, and the degree of ordered structure can affect its behaviour in foods and bioplastics. A range of methodologies is employed to study starch structure: differential scanning calorimetry, (13)C nuclear magnetic resonance, X-ray diffraction and Fourier transform infrared spectroscopy (FTIR). Despite the appeal of FTIR as a rapid, non-destructive methodology, there is currently no systematically defined quantitative relationship between FTIR spectral features and other starch structural measures. Here, we subject 61 starch samples to structural analysis and systematically correlate FTIR spectra with other measures of starch structure. A hydration-dependent peak position shift in the FTIR spectra of starch is observed, resulting from increased molecular order, but with complex, non-linear behaviour. We demonstrate that FTIR is a tool that can quantitatively probe short-range interactions in starch structure. However, the assumption of linear relationships between starch ordered structure and peak ratios is overly simplistic.

  17. Design considerations for fiber composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1972-01-01

    An overview of the design methodology for designing structural components from fiber composites is presented. In particular, the need for new conceptual structural designs for the future is discussed and the evolution of conceptual design is illustrated. Sources of design data, analysis and design procedures, and the basic components of structural fiber composites are cited and described. Examples of tradeoff studies and optimum designs are discussed and a simple structure is described in some detail.

  18. Structural design of composite rotor blades with consideration of manufacturability, durability, and manufacturing uncertainties

    NASA Astrophysics Data System (ADS)

    Li, Leihong

    A modular structural design methodology for composite blades is developed. This design method can be used to design composite rotor blades with sophisticated geometric cross-sections. The method hierarchically decomposes the highly coupled interdisciplinary rotor analysis into global and local levels. At the global level, aeroelastic response analysis and rotor trim are conducted based on multi-body dynamic models. At the local level, variational asymptotic beam sectional analysis methods are used to obtain the equivalent one-dimensional beam properties. Compared with traditional design methodology, the proposed method is more efficient and accurate. The proposed method is then used to study three different design problems that have not been investigated before. The first is the addition of manufacturing constraints to the design optimization. The introduction of manufacturing constraints complicates the optimization process. However, the design with manufacturing constraints benefits the manufacturing process and reduces the risk of violating major performance constraints. Next, a new design procedure for structural design against fatigue failure is proposed. This procedure combines the fatigue analysis with the optimization process. The durability or fatigue analysis employs a strength-based model. The design is subject to stiffness, frequency, and durability constraints. Finally, the impacts of manufacturing uncertainty on rotor blade aeroelastic behavior are investigated, and a probabilistic design method is proposed to control the impacts of uncertainty on blade structural performance. The uncertainty factors include dimensions, shapes, material properties, and service loads.

  19. An optimal baseline selection methodology for data-driven damage detection and temperature compensation in acousto-ultrasonics

    NASA Astrophysics Data System (ADS)

    Torres-Arredondo, M.-A.; Sierra-Pérez, Julián; Cabanes, Guénaël

    2016-05-01

    The process of measuring and analysing the data from a sensor network distributed over a structural system in order to quantify its condition is known as structural health monitoring (SHM). For the design of a trustworthy health monitoring system, a vast amount of information regarding the inherent physical characteristics of the sources and their propagation and interaction across the structure is crucial. Moreover, any SHM system which is expected to transition to field operation must take into account the influence of environmental and operational changes, which cause modifications in the stiffness and damping of the structure and consequently modify its dynamic behaviour. On that account, special attention is paid in this paper to the development of an efficient SHM methodology where robust signal processing and pattern recognition techniques are integrated for the correct interpretation of complex ultrasonic waves within the context of damage detection and identification. The methodology is based on an acousto-ultrasonics technique in which the discrete wavelet transform is evaluated for feature extraction and selection, linear principal component analysis is used for data-driven modelling, and self-organising maps provide a two-level clustering under the principle of local density. Finally, the methodology is experimentally demonstrated, and the results show that all the damage cases were detectable and identifiable.
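
    A hedged sketch of that processing chain, assuming the pywt and scikit-learn packages are available: wavelet sub-band energies as features, PCA as the data-driven model, and a simple k-means step standing in for the paper's two-level self-organising-map clustering. The signals are synthetic toy data.

    ```python
    # Wavelet sub-band energies -> PCA -> clustering; KMeans stands in for the
    # paper's two-level self-organising-map step. Signals are synthetic toy data.
    import numpy as np
    import pywt
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    def wavelet_features(signal, wavelet="db4", level=4):
        """Energy of each DWT sub-band, used as the feature vector."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        return np.array([np.sum(c ** 2) for c in coeffs])

    t = np.linspace(0.0, 1.0, 1024)
    baseline = [np.sin(2 * np.pi * 80 * t) + 0.1 * rng.standard_normal(t.size)
                for _ in range(20)]
    damaged = [0.6 * np.sin(2 * np.pi * 80 * t) + 0.4 * np.sin(2 * np.pi * 160 * t)
               + 0.1 * rng.standard_normal(t.size) for _ in range(20)]

    X = np.array([wavelet_features(s) for s in baseline + damaged])
    scores = PCA(n_components=2).fit_transform(X)            # data-driven model
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
    print(labels)   # baseline and damaged records should fall into separate clusters
    ```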

  20. Developing a Data Set and Processing Methodology for Fluid/Structure Interaction Code Validation

    DTIC Science & Technology

    2007-06-01

    50 29. 9-Probe Wake Survey Rake Configurations...structural stability and fatigue in test article components and, in general, in facility support structures and rotating machinery blading . Both T&E... blade analysis and simulations. To ensure the accuracy of the U of CO technology, validation using flight-test data and test data from a wind tunnel

  1. Analysis of high speed flow, thermal and structural interactions

    NASA Technical Reports Server (NTRS)

    Thornton, Earl A.

    1994-01-01

    Research for this grant focused on the following tasks: (1) the prediction of severe, localized aerodynamic heating for complex, high speed flows; (2) finite element adaptive refinement methodology for multi-disciplinary analyses; (3) the prediction of thermoviscoplastic structural response with rate-dependent effects and large deformations; (4) thermoviscoplastic constitutive models for metals; and (5) coolant flow/structural heat transfer analyses.

  2. Applications of a damage tolerance analysis methodology in aircraft design and production

    NASA Technical Reports Server (NTRS)

    Woodward, M. R.; Owens, S. D.; Law, G. E.; Mignery, L. A.

    1992-01-01

    The objectives of customer-mandated aircraft structural integrity initiatives in design are to guide material selection, to incorporate fracture-resistant concepts in the design, and to utilize damage-tolerance-based allowables and planned inspection procedures necessary to enhance the safety and reliability of manned flight vehicles. However, validated fracture analysis tools for composite structures are needed to accomplish these objectives in a timely and economical manner. This paper briefly describes the development, validation, and application of a damage tolerance methodology for composite airframe structures. A closed-form analysis code, entitled SUBLAM, was developed to predict the critical biaxial strain state necessary to cause sublaminate-buckling-induced delamination extension in an impact-damaged composite laminate. An embedded elliptical delamination separating a thin sublaminate from a thick parent laminate is modelled. Predicted failure strains were correlated against a variety of experimental data that included results from compression-after-impact coupon and element tests. An integrated analysis package was developed to predict the damage-tolerance-based margin of safety (MS) using NASTRAN-generated loads and element information. Damage tolerance aspects of new concepts are quickly and cost-effectively determined without the need for excessive testing.

  3. Chemoinformatic Analysis of Combinatorial Libraries, Drugs, Natural Products and Molecular Libraries Small Molecule Repository

    PubMed Central

    Singh, Narender; Guha, Rajarshi; Giulianotti, Marc; Pinilla, Clemencia; Houghten, Richard; Medina-Franco, Jose L.

    2009-01-01

    A multiple-criteria approach is presented and used to perform a comparative analysis of four recently developed combinatorial libraries against drugs, the Molecular Libraries Small Molecule Repository (MLSMR), and natural products. The compound databases were assessed in terms of physicochemical properties, scaffolds, and fingerprints. The approach enables the analysis of property space coverage, degree of overlap between collections, scaffold and structural diversity, and overall structural novelty. The degree of overlap between combinatorial libraries and drugs was assessed using the R-NN curve methodology, which measures the density of chemical space around a query molecule embedded in the chemical space of a target collection. The combinatorial libraries studied in this work exhibit scaffolds that were not observed in the drug, MLSMR, and natural products collections. The fingerprint-based comparisons indicate that these combinatorial libraries are structurally different from current drugs. The R-NN curve methodology revealed that a proportion of molecules in the combinatorial libraries are located within the property space of the drugs. However, the R-NN analysis also showed that there are a significant number of molecules in several combinatorial libraries that are located in sparse regions of the drug space. PMID:19301827
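
    A hedged sketch of a fingerprint-based comparison of the kind used for the structural-diversity assessment above, assuming RDKit is available; the SMILES strings are arbitrary examples, not molecules from the libraries studied.

    ```python
    # Tanimoto similarity between Morgan fingerprints of two molecules (RDKit).
    # SMILES strings are arbitrary examples, not compounds from the libraries.
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem

    library_member = Chem.MolFromSmiles("CCOC(=O)c1ccc(NC(=O)CN2CCCCC2)cc1")
    drug = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")  # aspirin

    fp1 = AllChem.GetMorganFingerprintAsBitVect(library_member, 2, nBits=2048)
    fp2 = AllChem.GetMorganFingerprintAsBitVect(drug, 2, nBits=2048)

    print("Tanimoto similarity: %.3f" % DataStructs.TanimotoSimilarity(fp1, fp2))
    ```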

  4. Aeroelastic Modeling of a Nozzle Startup Transient

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen

    2014-01-01

    Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development during test. While three-dimensional, transient, turbulent, chemically reacting computational fluid dynamics methodology has been demonstrated to capture major side load physics with rigid nozzles, hot-fire tests often show nozzle structure deformation during major side load events, leading to structural damage if structural strengthening measures are not taken. The modeling picture is incomplete without the capability to address the two-way responses between the structure and the fluid. The objective of this study is to develop a tightly coupled aeroelastic modeling algorithm by implementing the necessary structural dynamics component into an anchored computational fluid dynamics methodology. The computational fluid dynamics component is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, while the computational structural dynamics component is developed under the framework of modal analysis. Transient aeroelastic nozzle startup analyses at sea level were performed, and the computed transient nozzle fluid-structure interaction physics is presented.

  5. User Evaluation of the NASA Technical Report Server Recommendation Service

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bollen, Johan; Calhoun, JoAnne R.; Mackey, Calvin E.

    2004-01-01

    We present the user evaluation of two recommendation server methodologies implemented for the NASA Technical Report Server (NTRS). One methodology for generating recommendations uses log analysis to identify co-retrieval events on full-text documents. For comparison, we used the Vector Space Model (VSM) as the second methodology. We calculated cosine similarities and used the top 10 most similar documents (based on metadata) as 'recommendations'. We then ran an experiment with NASA Langley Research Center (LaRC) staff members to gather their feedback on which method produced the most 'quality' recommendations. We found that in most cases VSM outperformed log analysis of co-retrievals. However, analyzing the data revealed the evaluations may have been structurally biased in favor of the VSM generated recommendations. We explore some possible methods for combining log analysis and VSM generated recommendations and suggest areas of future work.

  6. User Evaluation of the NASA Technical Report Server Recommendation Service

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bollen, Johan; Calhoun, JoAnne R.; Mackey, Calvin E.

    2004-01-01

    We present the user evaluation of two recommendation server methodologies implemented for the NASA Technical Report Server (NTRS). One methodology for generating recommendations uses log analysis to identify co-retrieval events on full-text documents. For comparison, we used the Vector Space Model (VSM) as the second methodology. We calculated cosine similarities and used the top 10 most similar documents (based on metadata) as 'recommendations'. We then ran an experiment with NASA Langley Research Center (LaRC) staff members to gather their feedback on which method produced the most 'quality' recommendations. We found that in most cases VSM outperformed log analysis of co-retrievals. However, analyzing the data revealed the evaluations may have been structurally biased in favor of the VSM generated recommendations. We explore some possible methods for combining log analysis and VSM generated recommendations and suggest areas of future work.
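
    As a sketch of the VSM step described in these two records, the snippet below builds TF-IDF vectors over short metadata strings and ranks candidates by cosine similarity; the record texts are stand-ins for NTRS metadata, and scikit-learn is assumed to be available.

    ```python
    # TF-IDF vectors over record metadata and cosine similarity to rank candidate
    # recommendations; the strings are stand-ins for NTRS record metadata.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    records = [
        "probabilistic structural analysis of space propulsion components",
        "fatigue crack growth in riveted fuselage lap joints",
        "residual strength prediction for stiffened fuselage panels",
        "computational fluid dynamics of a nozzle cooling manifold",
    ]

    tfidf = TfidfVectorizer(stop_words="english")
    matrix = tfidf.fit_transform(records)

    query_index = 1                      # the record the user is currently viewing
    scores = cosine_similarity(matrix[query_index], matrix).ravel()
    ranked = scores.argsort()[::-1]
    recommendations = [i for i in ranked if i != query_index][:10]
    print("recommended record indices:", recommendations)
    ```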

  7. Optimizing Force Deployment and Force Structure for the Rapid Deployment Force

    DTIC Science & Technology

    1984-03-01

    Analysis ... Experimental Design ... IX. Use of a Flexible Response Surface ... Selection of a ... sets were designed using a ... programming methodology, where the required system ... is input and the model optimizes the number, type, cargo, and ... "to obtain new computer outputs" (Ref 38:23). The methodology can be used with any decision model, linear or nonlinear. Experimental Design: Since the

  8. A Progressive Damage Methodology for Residual Strength Predictions of Notched Composite Panels

    NASA Technical Reports Server (NTRS)

    Coats, Timothy W.; Harris, Charles E.

    1998-01-01

    The translaminate fracture behavior of carbon/epoxy structural laminates with through-penetration notches was investigated to develop a residual strength prediction methodology for composite structures. An experimental characterization of several composite materials systems revealed a fracture resistance behavior that was very similar to the R-curve behavior exhibited by ductile metals. Fractographic examinations led to the postulate that the damage growth resistance was primarily due to fractured fibers in the principal load-carrying plies being bridged by intact fibers of the adjacent plies. The load transfer associated with this bridging mechanism suggests that a progressive damage analysis methodology will be appropriate for predicting the residual strength of laminates with through-penetration notches. A progressive damage methodology developed by the authors was used to predict the initiation and growth of matrix cracks and fiber fracture. Most of the residual strength predictions for different panel widths, notch lengths, and material systems were within about 10% of the experimental failure loads.

  9. Guaranteeing robustness of structural condition monitoring to environmental variability

    NASA Astrophysics Data System (ADS)

    Van Buren, Kendra; Reilly, Jack; Neal, Kyle; Edwards, Harry; Hemez, François

    2017-01-01

    Advances in sensor deployment and computational modeling have allowed significant strides to be recently made in the field of Structural Health Monitoring (SHM). One widely used SHM strategy is to perform a vibration analysis where a model of the structure's pristine (undamaged) condition is compared with vibration response data collected from the physical structure. Discrepancies between model predictions and monitoring data can be interpreted as structural damage. Unfortunately, multiple sources of uncertainty must also be considered in the analysis, including environmental variability, unknown model functional forms, and unknown values of model parameters. Not accounting for these sources of uncertainty can lead to false-positives or false-negatives in the structural condition assessment. To manage the uncertainty, we propose a robust SHM methodology that combines three technologies. A time series algorithm is trained using "baseline" data to predict the vibration response, compare predictions to actual measurements collected on a potentially damaged structure, and calculate a user-defined damage indicator. The second technology handles the uncertainty present in the problem. An analysis of robustness is performed to propagate this uncertainty through the time series algorithm and obtain the corresponding bounds of variation of the damage indicator. The uncertainty description and robustness analysis are both inspired by the theory of info-gap decision-making. Lastly, an appropriate "size" of the uncertainty space is determined through physical experiments performed in laboratory conditions. Our hypothesis is that examining how the uncertainty space changes throughout time might lead to superior diagnostics of structural damage as compared to only monitoring the damage indicator. This methodology is applied to a portal frame structure to assess if the strategy holds promise for robust SHM. (Publication approved for unlimited, public release on October-28-2015, LA-UR-15-28442, unclassified.)

  10. Development of Testing Methodologies for the Mechanical Properties of MEMS

    NASA Technical Reports Server (NTRS)

    Ekwaro-Osire, Stephen

    2003-01-01

    This effort is to investigate and design testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS) as well as to investigate the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be analysis and modeling of material interfaces for strength as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. This will be a continuation of the previous year's work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.

  11. Theoretical Analysis of Photoelectron Spectra of Pure and Mixed Metal Clusters: Disentangling Size, Structure, and Composition Effects

    DOE PAGES

    Acioli, Paulo H.; Jellinek, Julius

    2017-07-14

    A theoretical/computational description and analysis of the spectra of electron binding energies of Al12-, Al13-, and Al12Ni- clusters, which differ in size and/or composition by a single atom yet possess strikingly different measured photoelectron spectra, is presented. It is shown that the measured spectra can not only be reproduced computationally with quantitative fidelity – this is achieved through a combination of state-of-the-art density functional theory with a highly accurate scheme for conversion of the Kohn-Sham eigenenergies into electron binding energies – but also explained in terms of the effects of size, structure/symmetry and composition. Furthermore, a new methodology is developed and applied that provides for disentanglement and differential assignment of the separate roles played by size, structure/symmetry and composition in defining the observed differences in the measured spectra. The methodology is general and applicable to any finite system, homogeneous or heterogeneous. Finally, we project that in combination with advances in synthesis techniques this methodology will become an indispensable computation-based aid in the design of controlled synthesis protocols for manufacture of nanosystems and nanodevices with precisely desired electronic and other characteristics.

  12. Chemometric strategy for modeling metabolic biological space along the gastrointestinal tract and assessing microbial influences.

    PubMed

    Martin, François-Pierre J; Montoliu, Ivan; Kochhar, Sunil; Rezzi, Serge

    2010-12-01

    Over the past decade, the analysis of metabolic data with advanced chemometric techniques has offered the potential to explore functional relationships among biological compartments in relation to the structure and function of the intestine. However, the employed methodologies, generally based on regression modeling techniques, have given emphasis to region-specific metabolic patterns, while providing only limited insights into the spatiotemporal metabolic features of the complex gastrointestinal system. Hence, novel approaches are needed to analyze metabolic data to reconstruct the metabolic biological space associated with the evolving structures and functions of an organ such as the gastrointestinal tract. Here, we report the application of multivariate curve resolution (MCR) methodology to model metabolic relationships along the gastrointestinal compartments in relation to its structure and function using data from our previous metabonomic analysis. The method simultaneously summarizes metabolite occurrence and contribution to continuous metabolic signatures of the different biological compartments of the gut tract. This methodology sheds new light onto the complex web of metabolic interactions with gut symbionts that modulate host cell metabolism in surrounding gut tissues. In the future, such an approach will be key to provide new insights into the dynamic onset of metabolic deregulations involved in region-specific gastrointestinal disorders, such as Crohn's disease or ulcerative colitis.

  13. Theoretical Analysis of Photoelectron Spectra of Pure and Mixed Metal Clusters: Disentangling Size, Structure, and Composition Effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acioli, Paulo H.; Jellinek, Julius

    A theoretical/computational description and analysis of the spectra of electron binding energies of Al12-, Al13-, and Al12Ni- clusters, which differ in size and/or composition by a single atom yet possess strikingly different measured photoelectron spectra, is presented. It is shown that the measured spectra can not only be reproduced computationally with quantitative fidelity – this is achieved through a combination of state-of-the-art density functional theory with a highly accurate scheme for conversion of the Kohn-Sham eigenenergies into electron binding energies – but also explained in terms of the effects of size, structure/symmetry and composition. Furthermore, a new methodology is developed and applied that provides for disentanglement and differential assignment of the separate roles played by size, structure/symmetry and composition in defining the observed differences in the measured spectra. The methodology is general and applicable to any finite system, homogeneous or heterogeneous. Finally, we project that in combination with advances in synthesis techniques this methodology will become an indispensable computation-based aid in the design of controlled synthesis protocols for manufacture of nanosystems and nanodevices with precisely desired electronic and other characteristics.

  14. Analysis of atomic force microscopy data for surface characterization using fuzzy logic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.

    2011-07-15

    In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy logic based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach we present a case study which uses our approach to quantitatively identify particle sizes of two specimens, each with a unique gold nanoparticle size distribution. Research highlights: (1) a fuzzy logic analysis technique capable of characterizing AFM images of thin films; (2) the technique is applicable to different surfaces regardless of their densities; (3) the fuzzy logic technique does not require manual adjustment of the algorithm parameters; (4) the technique can quantitatively capture differences between surfaces; (5) this technique yields more realistic structure boundaries compared to other methods.

  15. Evaluation of grid generation technologies from an applied perspective

    NASA Technical Reports Server (NTRS)

    Hufford, Gary S.; Harrand, Vincent J.; Patel, Bhavin C.; Mitchell, Curtis R.

    1995-01-01

    An analysis of the grid generation process from the point of view of an applied CFD engineer is given. Issues addressed include geometric modeling, structured grid generation, unstructured grid generation, hybrid grid generation, and the use of virtual parts libraries in large parametric analysis projects. The analysis is geared towards comparing the effective turnaround time for specific grid generation and CFD projects. The conclusion was reached that a single grid generation methodology is not universally suited for all CFD applications, due to limitations in both grid generation and flow solver technology. A new geometric modeling and grid generation tool, CFD-GEOM, is introduced to effectively integrate the geometric modeling process with the various grid generation methodologies, including structured, unstructured, and hybrid procedures. The full integration of geometric modeling and grid generation allows the implementation of extremely efficient updating procedures, a necessary requirement for large parametric analysis projects. The concept of using virtual parts libraries in conjunction with hybrid grids for large parametric analysis projects is also introduced to improve the efficiency of the applied CFD engineer.

  16. A Study in Difference: Structures and Cultures in Registered Training Organisations. Support Document 3

    ERIC Educational Resources Information Center

    Clayton, Berwyn; Fisher, Thea; Harris, Roger; Bateman, Andrea; Brown, Mike

    2008-01-01

    This document supports the report "A Study in Difference: Structures and Cultures in Registered Training Organisations." The first section outlines the methodology used to undertake the research and covers the design of the research, sample details, the data collection process and the strategy for data analysis and reporting. The…

  17. Methodological Status and Trends in Expository Text Structure Instruction Efficacy Research

    ERIC Educational Resources Information Center

    Bohaty, Janet J.; Hebert, Michael A.; Nelson, J. Ron; Brown, Jessica A.

    2015-01-01

    This systematic descriptive historical review was conducted to examine the status and trends in expository text structure instruction efficacy research for first through twelfth grade students. The analysis included sixty studies, which spanned the years 1978 to 2014. Descriptive dimensions of the research included study type, research design,…

  18. A Hierarchical Bayesian Multidimensional Scaling Methodology for Accommodating Both Structural and Preference Heterogeneity

    ERIC Educational Resources Information Center

    Park, Joonwook; Desarbo, Wayne S.; Liechty, John

    2008-01-01

    Multidimensional scaling (MDS) models for the analysis of dominance data have been developed in the psychometric and classification literature to simultaneously capture subjects' "preference heterogeneity" and the underlying dimensional structure for a set of designated stimuli in a parsimonious manner. There are two major types of latent utility…

  19. Structure Determination of Natural Products by Mass Spectrometry.

    PubMed

    Biemann, Klaus

    2015-01-01

    I review laboratory research on the development of mass spectrometric methodology for the determination of the structure of natural products of biological and medical interest, which I conducted from 1958 to the end of the twentieth century. The methodology was developed by converting small peptides to their corresponding polyamino alcohols to make them amenable to mass spectrometry, thereby making it applicable to whole proteins. The structures of alkaloids were determined by analyzing the fragmentation of a known alkaloid and then using the results to deduce the structures of related compounds. Heparin-like structures were investigated by determining their molecular weights from the mass of protonated molecular ions of complexes with highly basic, synthetic peptides. Mass spectrometry was also employed in the analysis of lunar material returned by the Apollo missions. A miniaturized gas chromatograph-mass spectrometer was sent to Mars on board the two 1976 Viking spacecraft.

  20. A spectral approach for the quantitative description of cardiac collagen network from nonlinear optical imaging.

    PubMed

    Masè, Michela; Cristoforetti, Alessandro; Avogaro, Laura; Tessarolo, Francesco; Piccoli, Federico; Caola, Iole; Pederzolli, Carlo; Graffigna, Angelo; Ravelli, Flavia

    2015-01-01

    The assessment of collagen structure in cardiac pathology, such as atrial fibrillation (AF), is essential for a complete understanding of the disease. This paper introduces a novel methodology for the quantitative description of collagen network properties, based on the combination of nonlinear optical microscopy with a spectral approach of image processing and analysis. Second-harmonic generation (SHG) microscopy was applied to atrial tissue samples from cardiac surgery patients, providing label-free, selective visualization of the collagen structure. The spectral analysis framework, based on 2D-FFT, was applied to the SHG images, yielding a multiparametric description of collagen fiber orientation (angle and anisotropy indexes) and texture scale (dominant wavelength and peak dispersion indexes). The proof-of-concept application of the methodology showed the capability of our approach to detect and quantify differences in the structural properties of the collagen network in AF versus sinus rhythm patients. These results suggest the potential of our approach in the assessment of collagen properties in cardiac pathologies related to a fibrotic structural component.
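
    A hedged sketch of the 2D-FFT step described above, assuming NumPy: it estimates a dominant spectral orientation and a simple anisotropy index from the power spectrum of a synthetic striped image. The test image and the index definitions are illustrative, not the exact indexes defined in the paper.

    ```python
    # Estimate a dominant spectral orientation and an anisotropy index from the
    # 2D FFT power spectrum of a synthetic striped image (illustrative indexes).
    import numpy as np

    n = 256
    y, x = np.mgrid[0:n, 0:n]
    angle = np.deg2rad(30.0)
    image = np.sin(2 * np.pi * (x * np.cos(angle) + y * np.sin(angle)) / 16.0)
    image += 0.2 * np.random.default_rng(1).standard_normal(image.shape)

    power = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    ky, kx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
    radius = np.hypot(kx, ky)
    band = (radius > 4) & (radius < n // 4)   # ignore DC and the highest frequencies

    theta = np.arctan2(ky[band], kx[band])    # spectral angle of each retained bin
    weights = power[band]
    mean_2t = np.angle(np.sum(weights * np.exp(2j * theta)))  # orientation is pi-periodic
    dominant = np.rad2deg(mean_2t / 2.0)
    anisotropy = np.abs(np.sum(weights * np.exp(2j * theta))) / np.sum(weights)
    print("dominant spectral angle %.1f deg, anisotropy index %.2f" % (dominant, anisotropy))
    ```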

  1. NEEDED RESEARCH ON DIFFUSION WITHIN EDUCATIONAL ORGANIZATIONS.

    ERIC Educational Resources Information Center

    JAIN, NEMI C.; ROGERS, EVERETT M.

    In spite of the volume of research attention devoted to the diffusion of innovations, relatively little emphasis has been placed upon diffusion within organizational structures. Methodologically, relational analysis in which the unit of analysis is a two-person interacting pair, a multiple-person communication chain, or cliques or subsystems is…

  2. Driven and No Regrets: A Qualitative Analysis of Students Earning Baccalaureate Degrees in Three Years

    ERIC Educational Resources Information Center

    Firmin, Michael W.; Gilson, Krista Merrick

    2007-01-01

    Using rigorous qualitative research methodology, twenty-four college students receiving their undergraduate degrees in three years were interviewed. Following analysis of the semi-structured interview transcripts and coding, themes emerged, indicating that these students possessed self-discipline, self-motivation, and drive. Overall, the results…

  3. Issues in Longitudinal Research on Motivation

    ERIC Educational Resources Information Center

    Stoel, Reinoud D.; Roeleveld, Jaap; Peetsma, Thea; van den Wittenboer, Godfried; Hox, Joop

    2006-01-01

    This paper discusses two methodological issues regarding the analysis of longitudinal data using structural equation modeling that emerged during the reconsideration of the analysis of a recent study on the relationship between academic motivation and language achievement in elementary education [Stoel R.D., Peetsma, T.T.D. and Roeleveld, J.…

  4. Three-dimensional analysis of anisotropic spatially reinforced structures

    NASA Technical Reports Server (NTRS)

    Bogdanovich, Alexander E.

    1993-01-01

    The material-adaptive three-dimensional analysis of inhomogeneous structures based on the meso-volume concept and the application of deficient spline functions for displacement approximations is proposed. The general methodology is demonstrated on the example of a brick-type mosaic parallelepiped arbitrarily composed of anisotropic meso-volumes. A partition of each meso-volume into sub-elements, the application of deficient spline functions for a local approximation of displacements and, finally, the use of the variational principle allow one to obtain displacements, strains, and stresses at any point within the structural part. All of the necessary external and internal boundary conditions (including the conditions of continuity of transverse stresses at interfaces between adjacent meso-volumes) can be satisfied with requisite accuracy by increasing the density of the sub-element mesh. The application of the methodology to textile composite materials is described. Several numerical examples for woven and braided rectangular composite plates and stiffened panels under transverse bending are considered. Some typical effects of stress concentrations due to the material inhomogeneities are demonstrated.

  5. The methodology of multi-viewpoint clustering analysis

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala; Wild, Chris

    1993-01-01

    One of the greatest challenges facing the software engineering community is the ability to produce large and complex computer systems, such as ground support systems for unmanned scientific missions, that are reliable and cost-effective. In order to build and maintain these systems, it is important that the knowledge in the system be suitably abstracted, structured, and otherwise clustered in a manner which facilitates its understanding, manipulation, testing, and utilization. Development of complex mission-critical systems will require the ability to abstract overall concepts in the system at various levels of detail and to consider the system from different points of view. The Multi-ViewPoint Clustering Analysis (MVP-CA) methodology has been developed to provide multiple views of large, complicated systems. MVP-CA provides an ability to discover significant structures by providing an automated mechanism to structure both hierarchically (from detail to abstract) and orthogonally (from different perspectives). We propose to integrate MVP-CA into an overall software engineering life cycle to support the development and evolution of complex mission-critical systems.

  6. Equivalent Viscous Damping Methodologies Applied on VEGA Launch Vehicle Numerical Model

    NASA Astrophysics Data System (ADS)

    Bartoccini, D.; Di Trapani, C.; Fransen, S.

    2014-06-01

    Part of the mission analysis of a spacecraft is the so-called launcher-satellite coupled loads analysis, which aims at computing the dynamic environment of the satellite and of the launch vehicle for the most severe load cases in flight. Evidently, the damping of the coupled system must be defined with care so as not to overestimate or underestimate the loads derived for the spacecraft. In this paper, the application of several EqVD (Equivalent Viscous Damping) methodologies to Craig-Bampton (CB) systems is investigated. Based on the structural damping defined for the various materials in the parent FE-models of the CB-components, EqVD matrices can be computed according to different methodologies. The effect of these methodologies on the numerical reconstruction of the VEGA launch vehicle dynamic environment will be presented.
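
    For orientation, here is a minimal sketch of one common way to build modal equivalent viscous damping from material structural damping, using the resonance equivalence zeta = eta/2 weighted by modal strain-energy fractions; the loss factors and energy fractions are hypothetical, not VEGA data, and this is only one of the methodologies such a study might compare.

    ```python
    # Modal equivalent viscous damping from material loss factors via the modal
    # strain-energy method and the resonance equivalence zeta = eta / 2.
    # Loss factors and strain-energy fractions are hypothetical, not VEGA data.
    import numpy as np

    loss_factor = {"aluminium": 0.004, "cfrp": 0.012, "joints": 0.06}

    # Fraction of modal strain energy stored in each material group, per mode (rows).
    strain_energy_fraction = np.array([
        [0.70, 0.25, 0.05],   # mode 1
        [0.40, 0.45, 0.15],   # mode 2
        [0.20, 0.30, 0.50],   # mode 3
    ])

    eta = np.array([loss_factor[m] for m in ("aluminium", "cfrp", "joints")])
    eta_modal = strain_energy_fraction @ eta   # energy-weighted loss factor per mode
    zeta_eq = eta_modal / 2.0                  # equivalent viscous damping ratios

    for i, z in enumerate(zeta_eq, start=1):
        print("mode %d: zeta_eq = %.4f" % (i, z))
    ```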

  7. A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Piascik, Robert S.; Newman, James C., Jr.

    1999-01-01

    An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.

  8. A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints

    NASA Technical Reports Server (NTRS)

    Harris, C. E.; Piascik, R. S.; Newman, J. C., Jr.

    2000-01-01

    An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.

  9. GoPros™ as an underwater photogrammetry tool for citizen science

    PubMed Central

    David, Peter A.; Dupont, Sally F.; Mathewson, Ciaran P.; O’Neill, Samuel J.; Powell, Nicholas N.; Williamson, Jane E.

    2016-01-01

    Citizen science can increase the scope of research in the marine environment; however, it suffers from necessitating specialized training and simplified methodologies that reduce research output. This paper presents a simplified, novel survey methodology for citizen scientists, which combines GoPro imagery and structure from motion to construct an ortho-corrected 3D model of habitats for analysis. Results using a coral reef habitat were compared to surveys conducted with traditional snorkelling methods for benthic cover, holothurian counts, and coral health. Results were comparable between the two methods, and structure from motion allows the results to be analysed off-site for any chosen visual analysis. The GoPro method outlined in this study is thus an effective tool for citizen science in the marine environment, especially for comparing changes in coral cover or volume over time. PMID:27168973

  10. GoPros™ as an underwater photogrammetry tool for citizen science.

    PubMed

    Raoult, Vincent; David, Peter A; Dupont, Sally F; Mathewson, Ciaran P; O'Neill, Samuel J; Powell, Nicholas N; Williamson, Jane E

    2016-01-01

    Citizen science can increase the scope of research in the marine environment; however, it suffers from necessitating specialized training and simplified methodologies that reduce research output. This paper presents a simplified, novel survey methodology for citizen scientists, which combines GoPro imagery and structure from motion to construct an ortho-corrected 3D model of habitats for analysis. Results using a coral reef habitat were compared to surveys conducted with traditional snorkelling methods for benthic cover, holothurian counts, and coral health. Results were comparable between the two methods, and structure from motion allows the results to be analysed off-site for any chosen visual analysis. The GoPro method outlined in this study is thus an effective tool for citizen science in the marine environment, especially for comparing changes in coral cover or volume over time.

  11. Probabilistic analysis of the torsional effects on the tall building resistance due to earthquake events

    NASA Astrophysics Data System (ADS)

    Králik, Juraj; Králik, Juraj

    2017-07-01

    The paper presents the results from the deterministic and probabilistic analysis of the accidental torsional effect of reinforced concrete tall buildings due to earthquake events. A core-column structural system was considered with various configurations in plan. The methodology of the seismic analysis of building structures in Eurocode 8 and JCSS 2000 is discussed. The possibility of utilizing the LHS method to analyze extensive and robust tasks in FEM is presented. The influence of the various input parameters (material, geometry, soil, masses and others) is considered. The deterministic and probabilistic analyses of the seismic resistance of the structure were calculated in the ANSYS program.
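
    A hedged sketch of the LHS step mentioned above, assuming SciPy is available: Latin Hypercube samples of a few uncertain inputs, each mapped to an assumed distribution, ready to drive repeated deterministic FEM runs. The parameter ranges are illustrative only.

    ```python
    # Latin Hypercube samples for three uncertain inputs, mapped to their assumed
    # distributions; each row would drive one deterministic FEM run.
    from scipy.stats import qmc, norm, uniform

    sampler = qmc.LatinHypercube(d=3, seed=0)
    u = sampler.random(n=100)                                    # unit-cube samples

    concrete_E = norm(loc=33e9, scale=3e9).ppf(u[:, 0])          # Young's modulus [Pa]
    soil_stiffness = uniform(loc=40e6, scale=40e6).ppf(u[:, 1])  # 40-80 MN/m
    mass_factor = uniform(loc=0.9, scale=0.2).ppf(u[:, 2])       # 0.9-1.1 mass scaling

    for i in range(3):   # first few realisations that would feed the FE model
        print("run %d: E=%.2e Pa, k_soil=%.2e N/m, mass factor=%.3f"
              % (i, concrete_E[i], soil_stiffness[i], mass_factor[i]))
    ```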

  12. Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.

    2005-01-01

    An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
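
    To illustrate just the fast-fracture Weibull step that underlies this kind of analysis, here is a minimal sketch that combines element-level risks of rupture, using the principle of independent action over tensile principal stresses, into a component probability of survival; the element volumes, stresses, and Weibull parameters are hypothetical, with units only required to be consistent.

    ```python
    # Element-level Weibull risks of rupture combined into a component probability
    # of survival, using the principle of independent action (PIA) over tensile
    # principal stresses. Volumes, stresses and Weibull parameters are hypothetical.
    import numpy as np

    m = 10.0         # Weibull modulus
    sigma_0 = 400.0  # Weibull scale parameter (units consistent with the volumes below)

    volume = np.array([120.0, 80.0, 150.0])     # element volumes from a hypothetical FEA
    principal = np.array([                      # element principal stresses [MPa]
        [220.0, 80.0, 10.0],
        [180.0, 60.0,  5.0],
        [250.0, 40.0,  0.0],
    ])

    tensile = np.clip(principal, 0.0, None)     # only tensile stresses contribute
    risk = np.sum(volume[:, None] * (tensile / sigma_0) ** m)   # PIA risk of rupture
    reliability = np.exp(-risk)
    print("component probability of survival: %.3f" % reliability)
    ```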

  13. Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.

    1999-01-01

    A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C(exp 1) shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms and several options are available to degrade the material properties after failures. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data except in structural applications where interlaminar stresses are important which may cause failure mechanisms such as debonding or delaminations.

  14. Testing the limits of sensitivity in a solid-state structural investigation by combined X-ray powder diffraction, solid-state NMR, and molecular modelling.

    PubMed

    Filip, Xenia; Borodi, Gheorghe; Filip, Claudiu

    2011-10-28

    A solid state structural investigation of ethoxzolamide is performed on microcrystalline powder by using a multi-technique approach that combines X-ray powder diffraction (XRPD) data analysis based on direct space methods with information from (13)C((15)N) solid-state Nuclear Magnetic Resonance (SS-NMR) and molecular modeling. Quantum chemical computations of the crystal were employed for geometry optimization and chemical shift calculations based on the Gauge Including Projector Augmented-Wave (GIPAW) method, whereas a systematic search in the conformational space was performed on the isolated molecule using a molecular mechanics (MM) approach. The applied methodology proved useful for: (i) removing ambiguities in the XRPD crystal structure determination process and further refining the derived structure solutions, and (ii) getting important insights into the relationship between the complex network of non-covalent interactions and the induced supra-molecular architectures/crystal packing patterns. It was found that ethoxzolamide provides an ideal case study for testing the accuracy with which this methodology allows to distinguish between various structural features emerging from the analysis of the powder diffraction data. This journal is © the Owner Societies 2011

  15. Deterministic and Probabilistic Creep and Creep Rupture Enhancement to CARES/Creep: Multiaxial Creep Life Prediction of Ceramic Structures Using Continuum Damage Mechanics and the Finite Element Method

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama M.; Powers, Lynn M.; Gyekenyesi, John P.

    1998-01-01

    High temperature and long duration applications of monolithic ceramics can place their failure mode in the creep rupture regime. A previous model advanced by the authors described a methodology by which the creep rupture life of a loaded component can be predicted. That model was based on the life fraction damage accumulation rule in association with the modified Monkman-Grant creep rupture criterion. However, that model did not take into account the deteriorating state of the material due to creep damage (e.g., cavitation) as time elapsed. In addition, the material creep parameters used in that life prediction methodology were based on uniaxial creep curves displaying primary and secondary creep behavior, with no tertiary regime. The objective of this paper is to present a creep life prediction methodology based on a modified form of the Kachanov-Rabotnov continuum damage mechanics (CDM) theory. In this theory, the uniaxial creep rate is described in terms of stress, temperature, time, and the current state of material damage. This scalar damage state parameter is basically an abstract measure of the current state of material damage due to creep deformation. The damage rate is assumed to vary with stress, temperature, time, and the current state of damage itself. Multiaxial creep and creep rupture formulations of the CDM approach are presented in this paper. Parameter estimation methodologies based on nonlinear regression analysis are also described for both isothermal constant-stress states and anisothermal variable-stress conditions. This creep life prediction methodology was preliminarily added to the integrated design code CARES/Creep (Ceramics Analysis and Reliability Evaluation of Structures/Creep), which is a postprocessor program to commercially available finite element analysis (FEA) packages. Two examples, showing comparisons between experimental and predicted creep lives of ceramic specimens, are used to demonstrate the viability of this methodology and the CARES/Creep program.
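
    The scalar-damage idea at the core of the Kachanov-Rabotnov formulation can be sketched for the uniaxial, constant-stress case by integrating the coupled creep-strain and damage-rate equations until the damage variable approaches unity. The constants below are hypothetical, and the integration is a plain explicit Euler loop, not the calibrated multiaxial treatment in CARES/Creep.

```python
# Sketch of uniaxial Kachanov-Rabotnov creep damage growth under constant stress.
# Constants A, n, B, chi, phi are hypothetical illustration values.
import numpy as np

A, n = 1.0e-20, 5.0               # creep-rate constants (stress in MPa, time in h)
B, chi, phi = 5.0e-19, 6.0, 4.0   # damage-rate constants

sigma = 150.0             # applied stress [MPa]
dt = 1.0                  # time step [h]
t, strain, damage = 0.0, 0.0, 0.0

while damage < 0.99 and t < 2.0e6:
    strain += A * sigma**n / (1.0 - damage) ** n * dt      # creep strain rate
    damage += B * sigma**chi / (1.0 - damage) ** phi * dt  # damage evolution
    t += dt

t_rupture_closed_form = 1.0 / (B * (phi + 1.0) * sigma**chi)
print(f"numerical rupture time ~{t:.0f} h, closed form {t_rupture_closed_form:.0f} h")
```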

  16. State Analysis Database Tool

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Bennett, Matthew

    2006-01-01

    The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture. A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.

  17. CACDA Jiffy War Game Technical Manual. Part 1: Methodology

    DTIC Science & Technology

    1977-03-01

    CACDA Jiffy War Game Technical Manual, Part 1: Methodology, by Timothy J. Bailey and Gerald A. Martin, Combat Developments Activity (CACDA), Fort Leavenworth, Kansas 66027, for scenario development and force structure evaluation. The Jiffy Game computer...

  18. Analysis of Structures with Rotating, Flexible Substructures Applied to Rotorcraft Aeroelasticity in GRASP (General Rotorcraft Aeromechanical Stability Program),

    DTIC Science & Technology

    1987-01-01

    ... two nodes behave identically. In GRASP, these constraints are entirely invisible from the user's point of view. ... Levi-Civita symbol ... virtual rotation ... GRASP is the first program implementing a new method for dynamic analysis of structures, parts of which may ... natural coordinatization ... basis for this methodology, which incorporates body flexibility ... with the large discrete motions previously ...

  19. NASA LeRC/Akron University Graduate Cooperative Fellowship Program and Graduate Student Researchers Program

    NASA Technical Reports Server (NTRS)

    Fertis, D. G.; Simon, A. L.

    1981-01-01

    The requisite methodology to solve linear and nonlinear problems associated with the static and dynamic analysis of rotating machinery, their static and dynamic behavior, and the interaction between the rotating and nonrotating parts of an engine is developed. Linear and nonlinear structural engine problems are investigated by developing solution strategies and interactive computational methods whereby the engineer and the computer can communicate directly in making analysis decisions. Representative examples include modifying structural models, changing material parameters, selecting analysis options, and coupling with interactive graphical display for pre- and postprocessing capability.

  20. Application of Steinberg vibration fatigue model for structural verification of space instruments

    NASA Astrophysics Data System (ADS)

    García, Andrés; Sorribes-Palmer, Félix; Alonso, Gustavo

    2018-01-01

    Electronic components in spaceships are subjected to vibration loads during the ascent phase of the launcher. It is important to verify by tests and analysis that all parts can survive in the most severe load cases. The purpose of this paper is to present the methodology and results of the application of the Steinberg's fatigue model to estimate the life of electronic components of the EPT-HET instrument for the Solar Orbiter space mission. A Nastran finite element model (FEM) of the EPT-HET instrument was created and used for the structural analysis. The methodology is based on the use of the FEM of the entire instrument to calculate the relative displacement RDSD and RMS values of the PCBs from random vibration analysis. These values are used to estimate the fatigue life of the most susceptible electronic components with the Steinberg's fatigue damage equation and the Miner's cumulative fatigue index. The estimations are calculated for two different configurations of the instrument and three different inputs in order to support the redesign process. Finally, these analytical results are contrasted with the inspections and the functional tests made after the vibration tests, concluding that this methodology can adequately predict the fatigue damage or survival of the electronic components.
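
    The way Steinberg's damage equation and Miner's index are typically combined can be sketched with the familiar three-band (1-sigma/2-sigma/3-sigma) approximation for a Gaussian random response. The natural frequency, RMS displacement, allowable-cycle curve, and fatigue exponent below are assumed illustration values, not numbers from the EPT-HET analysis.

```python
# Three-band (1, 2, 3 sigma) Miner's-rule accumulation commonly used with
# Steinberg's random-vibration fatigue approach. The allowable-cycle curve
# (N_ref, z_ref) and the exponent b are assumed, not values from the paper.
f_n = 180.0            # PCB natural frequency [Hz]
duration = 120.0       # random vibration exposure [s]
z_rms = 0.15           # 1-sigma relative displacement of the board [mm]

b = 6.4                # fatigue exponent often quoted for electronic assemblies
N_ref, z_ref = 2.0e7, 0.25   # hypothetical allowable cycles at reference displacement

# Gaussian banding: fraction of time spent near the 1, 2 and 3 sigma levels
bands = [(1.0, 0.683), (2.0, 0.271), (3.0, 0.0433)]

damage = 0.0
for k, fraction in bands:
    n_applied = f_n * duration * fraction
    n_allowed = N_ref * (z_ref / (k * z_rms)) ** b
    damage += n_applied / n_allowed       # Miner's cumulative damage index

# The accumulated index is compared against an acceptance limit (often well below 1)
print(f"Miner cumulative damage index R = {damage:.3f}")
```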

  1. Methods for heat transfer and temperature field analysis of the insulated diesel, phase 3

    NASA Technical Reports Server (NTRS)

    Morel, Thomas; Wahiduzzaman, Syed; Fort, Edward F.; Keribar, Rifat; Blumberg, Paul N.

    1988-01-01

    Work during Phase 3 of a program aimed at developing a comprehensive heat transfer and thermal analysis methodology for design analysis of insulated diesel engines is described. The overall program addresses all the key heat transfer issues: (1) spatially and time-resolved convective and radiative in-cylinder heat transfer, (2) steady-state conduction in the overall structure, and (3) cyclical and load/speed temperature transients in the engine structure. These are all accounted for in a coupled way together with cycle thermodynamics. This methodology was developed during Phases 1 and 2. During Phase 3, an experimental program was carried out to obtain data on heat transfer under cooled and insulated engine conditions and also to generate a database to validate the developed methodology. A single-cylinder Cummins diesel engine was instrumented for instantaneous total heat flux and heat radiation measurements. Data were acquired over a wide range of operating conditions in two engine configurations. One was a cooled baseline. The other included ceramic-coated components (0.050 in. of plasma-sprayed zirconia): piston, head, and valves. The experiments showed that the insulated engine has a smaller heat flux than the cooled one. The model predictions were found to be in very good agreement with the data.

  2. Recent Advances in the Analysis of Spiral Bevel Gears

    NASA Technical Reports Server (NTRS)

    Handschuh, Robert F.

    1997-01-01

    A review of recent progress for the analysis of spiral bevel gears will be described. The foundation of this work relies on the description of the gear geometry of face-milled spiral bevel gears via the approach developed by Litvin. This methodology was extended by combining the basic gear design data with the manufactured surfaces using a differential geometry approach, and provides the data necessary for assembling three-dimensional finite element models. The finite element models have been utilized to conduct thermal and structural analysis of the gear system. Examples of the methods developed for thermal and structural/contact analysis are presented.

  3. Analysis of the time structure of synchronization in multidimensional chaotic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarenko, A. V., E-mail: avm.science@mail.ru

    2015-05-15

    A new approach is proposed to the integrated analysis of the time structure of synchronization of multidimensional chaotic systems. The method allows one to diagnose and quantitatively evaluate the intermittency characteristics during synchronization of chaotic oscillations in the T-synchronization mode. A system of two identical logistic mappings with unidirectional coupling that operate in the developed chaos regime is analyzed. It is shown that the widely used approach, in which only synchronization patterns are subjected to analysis while desynchronization areas are considered as a background signal and removed from analysis, should be regarded as methodologically incomplete.
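
    The basic configuration studied here, two identical logistic maps with unidirectional coupling in the developed-chaos regime, is easy to reproduce. The sketch below iterates the drive-response pair and reports a crude synchronization-error statistic; the coupling strength is illustrative, and the paper's T-synchronization diagnostics of the intermittent time structure are not reproduced.

```python
# Two identical logistic maps (r = 4, developed chaos) with unidirectional
# drive -> response coupling; only the raw synchronization error is computed.
import numpy as np

r, eps, n_steps = 4.0, 0.6, 5000   # coupling strength eps is illustrative
x = np.empty(n_steps); y = np.empty(n_steps)
x[0], y[0] = 0.3, 0.7

for k in range(n_steps - 1):
    fx, fy = r * x[k] * (1.0 - x[k]), r * y[k] * (1.0 - y[k])
    x[k + 1] = fx                           # drive (master) map
    y[k + 1] = (1.0 - eps) * fy + eps * fx  # response (slave) map

err = np.abs(x - y)[1000:]                  # discard the transient
print(f"mean |x-y| = {err.mean():.2e}, "
      f"fraction with |x-y| < 1e-3: {np.mean(err < 1e-3):.2f}")
# Lowering eps toward roughly 0.5 for this coupling form produces the
# intermittent mix of synchronized and desynchronized epochs the paper analyzes.
```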

  4. Ninth NASTRAN (R) Users' Colloquium

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The general application of finite element methodology and the specific application of NASTRAN to a wide variety of static and dynamic structural problems are addressed. Comparisons with other approaches and new methods of analysis with NASTRAN are included.

  5. Risk analysis within environmental impact assessment of proposed construction activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeleňáková, Martina; Zvijáková, Lenka

    Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of the environmental impacts of selected constructions – flood protection structures – using risk analysis methods. The application of a methodology designed for the process of environmental impact assessment will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, the set objective has been achieved. Highlights: This paper is informed by an effort to develop research with the aim of: • improving existing qualitative and quantitative methods for assessing the impacts; • a better understanding of relations between probabilities and consequences; • a methodology for the EIA of flood protection constructions based on risk analysis; • creative approaches in the search for environmentally friendly proposed activities.

  6. A case study using the PrOACT-URL and BRAT frameworks for structured benefit risk assessment.

    PubMed

    Nixon, Richard; Dierig, Christoph; Mt-Isa, Shahrul; Stöckert, Isabelle; Tong, Thaison; Kuhls, Silvia; Hodgson, Gemma; Pears, John; Waddingham, Ed; Hockley, Kimberley; Thomson, Andrew

    2016-01-01

    While benefit-risk assessment is a key component of the drug development and maintenance process, it is often described in a narrative. In contrast, structured benefit-risk assessment builds on established ideas from decision analysis and comprises a qualitative framework and quantitative methodology. We compare two such frameworks, applying multi-criteria decision-analysis (MCDA) within the PrOACT-URL framework and weighted net clinical benefit (wNCB), within the BRAT framework. These are applied to a case study of natalizumab for the treatment of relapsing remitting multiple sclerosis. We focus on the practical considerations of applying these methods and give recommendations for visual presentation of results. In the case study, we found structured benefit-risk analysis to be a useful tool for structuring, quantifying, and communicating the relative benefit and safety profiles of drugs in a transparent, rational and consistent way. The two frameworks were similar. MCDA is a generic and flexible methodology that can be used to perform a structured benefit-risk in any common context. wNCB is a special case of MCDA and is shown to be equivalent to an extension of the number needed to treat (NNT) principle. It is simpler to apply and understand than MCDA and can be applied when all outcomes are measured on a binary scale. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
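
    As a minimal illustration of the MCDA arithmetic described above, the sketch below computes a weighted sum of normalized criterion scores for two options. The criteria, weights, and 0-100 preference scores are invented for illustration and are not those of the natalizumab case study.

```python
# Minimal MCDA sketch: weighted sum of normalized criterion scores for two
# hypothetical treatment options.
criteria = {
    # criterion: (weight, {option: score on a common 0-100 preference scale})
    "relapse rate reduction": (0.45, {"drug": 80, "placebo": 10}),
    "disability progression": (0.25, {"drug": 70, "placebo": 20}),
    "serious infection risk": (0.20, {"drug": 30, "placebo": 90}),
    "infusion reactions":     (0.10, {"drug": 50, "placebo": 95}),
}

def mcda_score(option):
    """Overall benefit-risk score: sum of weight * criterion score."""
    return sum(w * scores[option] for w, scores in criteria.values())

for option in ("drug", "placebo"):
    print(f"{option}: overall benefit-risk score = {mcda_score(option):.1f}")
```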

  7. A network-base analysis of CMIP5 "historical" experiments

    NASA Astrophysics Data System (ADS)

    Bracco, A.; Foudalis, I.; Dovrolis, C.

    2012-12-01

    In computer science, "complex network analysis" refers to a set of metrics, modeling tools and algorithms commonly used in the study of complex nonlinear dynamical systems. Its main premise is that the underlying topology or network structure of a system has a strong impact on its dynamics and evolution. By allowing to investigate local and non-local statistical interaction, network analysis provides a powerful, but only marginally explored, framework to validate climate models and investigate teleconnections, assessing their strength, range, and impacts on the climate system. In this work we propose a new, fast, robust and scalable methodology to examine, quantify, and visualize climate sensitivity, while constraining general circulation models (GCMs) outputs with observations. The goal of our novel approach is to uncover relations in the climate system that are not (or not fully) captured by more traditional methodologies used in climate science and often adopted from nonlinear dynamical systems analysis, and to explain known climate phenomena in terms of the network structure or its metrics. Our methodology is based on a solid theoretical framework and employs mathematical and statistical tools, exploited only tentatively in climate research so far. Suitably adapted to the climate problem, these tools can assist in visualizing the trade-offs in representing global links and teleconnections among different data sets. Here we present the methodology, and compare network properties for different reanalysis data sets and a suite of CMIP5 coupled GCM outputs. With an extensive model intercomparison in terms of the climate network that each model leads to, we quantify how each model reproduces major teleconnections, rank model performances, and identify common or specific errors in comparing model outputs and observations.
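
    A minimal version of the network construction step can be sketched by thresholding a correlation matrix of gridded anomaly series and inspecting node degree. Synthetic data stands in for reanalysis or CMIP5 output here, and the 0.5 correlation threshold is purely illustrative; it is not the metric set used in the study.

```python
# Correlation-threshold "climate network" sketch on synthetic anomaly series.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
n_nodes, n_times = 50, 240                # 50 grid points, 20 years of monthly data
data = rng.standard_normal((n_nodes, n_times))
data[1] += 0.8 * data[0]                  # inject one artificial teleconnection

corr = np.corrcoef(data)
graph = nx.Graph()
graph.add_nodes_from(range(n_nodes))
for i in range(n_nodes):
    for j in range(i + 1, n_nodes):
        if abs(corr[i, j]) > 0.5:         # keep only strong statistical links
            graph.add_edge(i, j, weight=corr[i, j])

degrees = dict(graph.degree())
hub = max(degrees, key=degrees.get)
print(f"edges: {graph.number_of_edges()}, "
      f"highest-degree node: {hub} ({degrees[hub]} links)")
```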

  8. Evaluation of RCAS Inflow Models for Wind Turbine Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tangler, J.; Bir, G.

    The finite element structural modeling in the Rotorcraft Comprehensive Analysis System (RCAS) provides a state-of-the-art approach to aeroelastic analysis. This, coupled with its ability to model all turbine components, results in a methodology that can simulate complex system interactions characteristic of large wind turbines. In addition, RCAS is uniquely capable of modeling advanced control algorithms and the resulting dynamic responses.

  9. Lyapunov exponents, covariant vectors and shadowing sensitivity analysis of 3D wakes: from laminar to chaotic regimes

    NASA Astrophysics Data System (ADS)

    Wang, Qiqi; Rigas, Georgios; Esclapez, Lucas; Magri, Luca; Blonigan, Patrick

    2016-11-01

    Bluff body flows are of fundamental importance to many engineering applications involving massive flow separation and in particular the transport industry. Coherent flow structures emanating in the wake of three-dimensional bluff bodies, such as cars, trucks and lorries, are directly linked to increased aerodynamic drag, noise and structural fatigue. For low Reynolds laminar and transitional regimes, hydrodynamic stability theory has aided the understanding and prediction of the unstable dynamics. In the same framework, sensitivity analysis provides the means for efficient and optimal control, provided the unstable modes can be accurately predicted. However, these methodologies are limited to laminar regimes where only a few unstable modes manifest. Here we extend the stability analysis to low-dimensional chaotic regimes by computing the Lyapunov covariant vectors and their associated Lyapunov exponents. We compare them to eigenvectors and eigenvalues computed in traditional hydrodynamic stability analysis. Computing Lyapunov covariant vectors and Lyapunov exponents also enables the extension of sensitivity analysis to chaotic flows via the shadowing method. We compare the computed shadowing sensitivities to traditional sensitivity analysis. These Lyapunov based methodologies do not rely on mean flow assumptions, and are mathematically rigorous for calculating sensitivities of fully unsteady flow simulations.

  10. The Effects of Cognitive Style on Edmodo Users' Behaviour: A Structural Equation Modeling-Based Multi-Group Analysis

    ERIC Educational Resources Information Center

    Ursavas, Omer Faruk; Reisoglu, Ilknur

    2017-01-01

    Purpose: The purpose of this paper is to explore the validity of extended technology acceptance model (TAM) in explaining pre-service teachers' Edmodo acceptance and the variation of variables related to TAM among pre-service teachers having different cognitive styles. Design/methodology/approach: Structural equation modeling approach was used to…

  11. Fundamental Studies of Strength Physics--Methodology of Longevity Prediction of Materials under Arbitrary Thermally and Forced Effects

    ERIC Educational Resources Information Center

    Petrov, Mark G.

    2016-01-01

    Thermally activated analysis of experimental data allows considering about the structure features of each material. By modelling the structural heterogeneity of materials by means of rheological models, general and local plastic flows in metals and alloys can be described over. Based on physical fundamentals of failure and deformation of materials…

  12. How to structure and prioritize information needs in support of monitoring design for Integrated Coastal Management

    NASA Astrophysics Data System (ADS)

    Vugteveen, Pim; van Katwijk, Marieke M.; Rouwette, Etiënne; Hanssen, Lucien

    2014-02-01

    Integrated Coastal Management cannot operate effectively without reliable information and knowledge on changes in the environment and on the causes of those changes. Monitoring is essential to provide data needed for a real understanding of socio-economic and ecological functioning in multi-user nature areas. We present a web-based and comprehensive assessment methodology to articulate, structure and prioritize information needs and ensuing monitoring needs. We applied this methodology in the Dutch Wadden Sea Region, which includes a designated UNESCO World Heritage nature reserve. The methodology consists of the following steps: i) exploring social-ecological issues of concern and defining the monitoring scope; ii) articulating information needs expressed as tractable questions; iii) elaborating monitoring needs; iv) grounding in scientific models and current monitoring; v) synthesizing assessment findings into target entities, i.e. analysis variables for monitoring. In this paper we focus on the first three steps. As part of our methodology we performed two online surveys amongst a broad range of stakeholders and amongst monitoring professionals. In the case of the Dutch Wadden Sea Region, main monitoring questions were related to biodiversity and food web relations; effects of fisheries and its pressures on the ecosystem; channel and port dredging; spatial planning and multifunctional use; sustainable energy production; and effects of changing storm regimes due to climate change. Subsequently we elaborated these general issues into analysis variables within five themes. The presented methodology enables large scale and unbiased involvement of stakeholders in articulating information needs in a multi-user nature reserve like the Wadden Sea. In addition the methodology facilitates the input and feedback of monitoring professionals by providing a detailed elaboration of monitoring needs.

  13. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.

    1996-08-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.

  14. Three-dimensional structural analysis of eukaryotic flagella/cilia by electron cryo-tomography

    PubMed Central

    Bui, Khanh Huy; Pigino, Gaia; Ishikawa, Takashi

    2011-01-01

    Electron cryo-tomography is a potential approach to analyzing the three-dimensional conformation of frozen hydrated biological macromolecules using electron microscopy. Since projections of each individual object illuminated from different orientations are merged, electron tomography is capable of structural analysis of such heterogeneous environments as in vivo or with polymorphism, although radiation damage and the missing wedge are severe problems. Here, recent results on the structure of eukaryotic flagella, which is an ATP-driven bending organelle, from green algae Chlamydomonas are presented. Tomographic analysis reveals asymmetric molecular arrangements, especially that of the dynein motor proteins, in flagella, giving insight into the mechanism of planar asymmetric bending motion. Methodological challenges to obtaining higher-resolution structures from this technique are also discussed. PMID:21169680

  15. Improved finite element methodology for integrated thermal structural analysis

    NASA Technical Reports Server (NTRS)

    Dechaumphai, P.; Thornton, E. A.

    1982-01-01

    An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analysis is presented. New thermal finite elements which yield exact nodal and element temperatures for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.
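
    The class of problems for which one-dimensional linear steady-state thermal elements give exact nodal temperatures can be illustrated with a few lines of assembly and solution code. The rod length, conductivity, and boundary temperatures below are arbitrary illustration values; the nodeless-variable elements of the paper are not implemented here, only the standard linear element.

```python
# Minimal 1D steady-state conduction finite element solve for a uniform rod with
# prescribed end temperatures. For this linear problem the nodal temperatures
# coincide with the exact solution (a straight line between the end values).
import numpy as np

n_el, length, k_cond = 4, 1.0, 20.0     # elements, rod length [m], conductivity [W/mK]
n_nodes = n_el + 1
h = length / n_el

K = np.zeros((n_nodes, n_nodes))
for e in range(n_el):
    ke = (k_cond / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element conductance
    K[e:e + 2, e:e + 2] += ke

T_left, T_right = 300.0, 400.0          # prescribed end temperatures [K]
free = np.arange(1, n_nodes - 1)
rhs = -K[np.ix_(free, [0, n_nodes - 1])] @ np.array([T_left, T_right])
T = np.empty(n_nodes)
T[0], T[-1] = T_left, T_right
T[free] = np.linalg.solve(K[np.ix_(free, free)], rhs)
print("nodal temperatures:", np.round(T, 2))   # linear profile from 300 to 400 K
```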

  16. Social Cognitive Predictors of College Students' Academic Performance and Persistence: A Meta-Analytic Path Analysis

    ERIC Educational Resources Information Center

    Brown, Steven D.; Tramayne, Selena; Hoxha, Denada; Telander, Kyle; Fan, Xiaoyan; Lent, Robert W.

    2008-01-01

    This study tested Social Cognitive Career Theory's (SCCT) academic performance model using a two-stage approach that combined meta-analytic and structural equation modeling methodologies. Unbiased correlations obtained from a previously published meta-analysis [Robbins, S. B., Lauver, K., Le, H., Davis, D., & Langley, R. (2004). Do psychosocial…

  17. Rapid forest change in the interior west presents analysis opportunities and challenges

    Treesearch

    John D. Shaw

    2007-01-01

    A recent drought has caused compositional and structural changes in Interior West forests. Recent periodic and annual inventory data provide an opportunity to analyze forest changes on a grand scale. This "natural experiment" also provides opportunities to test the effectiveness of Forest Inventory and Analysis (FIA) methodologies. It also presents some...

  18. A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test

    NASA Astrophysics Data System (ADS)

    Tabibzadeh, Maryam

    According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. Reviewing the literature of the offshore drilling industry indicates that most of the developed risk analysis methodologies do not fully and more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) in accident causation. This is while results of a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, as an issue related to HOFs, have been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology in this research focuses on a specific procedure called Negative Pressure Test (NPT), as the primary method to ascertain well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and their conducted NPT is discussed. The risk analysis methodology in this dissertation consists of three different approaches and their integration constitutes the big picture of my whole methodology. The first approach is the comparative analysis of a "standard" NPT, which is proposed by the author, with the test conducted by the DWH crew. This analysis contributes to identifying the involved discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches in the previous step, as the main contributors of negative pressure test misinterpretation. Finally, a rational decision making model is introduced to quantify a section of the developed conceptual framework in the previous step and analyze the impact of different decision making biases on negative pressure test results. Along with the corroborating findings of previous studies, the analysis of the developed conceptual framework in this paper indicates that organizational factors are root causes of accumulated errors and questionable decisions made by personnel or management. Further analysis of this framework identifies procedural issues, economic pressure, and personnel management issues as the organizational factors with the highest influence on misinterpreting a negative pressure test. It is noteworthy that the captured organizational factors in the introduced conceptual framework are not only specific to the scope of the NPT. Most of these organizational factors have been identified as not only the common contributing causes of other offshore drilling accidents but also accidents in other oil and gas related operations as well as high-risk operations in other industries. In addition, the proposed rational decision making model in this research introduces a quantitative structure for analysis of the results of a conducted NPT. This model provides a structure and some parametric derived formulas to determine a cut-off point value, which assists personnel in accepting or rejecting an implemented negative pressure test. 
Moreover, it enables analysts to assess different decision making biases involved in the process of interpreting a conducted negative pressure test as well as the root organizational factors of those biases. In general, although the proposed integrated research methodology in this dissertation is developed for the risk assessment of human and organizational factors contributions in negative pressure test misinterpretation, it can be generalized and be potentially useful for other well control situations, both offshore and onshore; e.g. fracking. In addition, this methodology can be applied for the analysis of any high-risk operations, in not only the oil and gas industry but also in other industries such as nuclear power plants, aviation industry, and transportation sector.

  19. Application of a substructuring technique to the problem of crack extension and closure

    NASA Technical Reports Server (NTRS)

    Armen, H., Jr.

    1974-01-01

    A substructuring technique, originally developed for the efficient reanalysis of structures, is incorporated into the methodology associated with the plastic analysis of structures. An existing finite-element computer program that accounts for elastic-plastic material behavior under cyclic loading was modified to account for changing kinematic constraint conditions - crack growth and intermittent contact of crack surfaces in two-dimensional regions. Application of the analysis is presented for a center-crack panel problem to demonstrate the efficiency and accuracy of the technique.

  20. Evaluation of Structural Robustness against Column Loss: Methodology and Application to RC Frame Buildings.

    PubMed

    Bao, Yihai; Main, Joseph A; Noh, Sam-Young

    2017-08-01

    A computational methodology is presented for evaluating structural robustness against column loss. The methodology is illustrated through application to reinforced concrete (RC) frame buildings, using a reduced-order modeling approach for three-dimensional RC framing systems that includes the floor slabs. Comparisons with high-fidelity finite-element model results are presented to verify the approach. Pushdown analyses of prototype buildings under column loss scenarios are performed using the reduced-order modeling approach, and an energy-based procedure is employed to account for the dynamic effects associated with sudden column loss. Results obtained using the energy-based approach are found to be in good agreement with results from direct dynamic analysis of sudden column loss. A metric for structural robustness is proposed, calculated by normalizing the ultimate capacities of the structural system under sudden column loss by the applicable service-level gravity loading and by evaluating the minimum value of this normalized ultimate capacity over all column removal scenarios. The procedure is applied to two prototype 10-story RC buildings, one employing intermediate moment frames (IMFs) and the other employing special moment frames (SMFs). The SMF building, with its more stringent seismic design and detailing, is found to have greater robustness.
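
    The energy-based step described above has a simple numerical form: the pseudo-dynamic capacity at a given displacement is the running average of the static pushdown resistance up to that displacement. The sketch below applies that balance to a synthetic pushdown curve; the curve itself is invented for illustration and is not one of the prototype-building results.

```python
# Energy-based estimate of capacity under sudden column loss: the dynamic
# (energy-balanced) load at displacement u equals the average of the static
# pushdown resistance over [0, u]. The static curve below is synthetic.
import numpy as np

u = np.linspace(0.0, 0.5, 200)                       # vertical displacement [m]
p_static = 800.0 * np.tanh(8.0 * u) + 400.0 * u      # synthetic pushdown curve [kN]

# Trapezoidal running integral of the static resistance (work absorbed)
work = np.concatenate(
    ([0.0], np.cumsum(np.diff(u) * 0.5 * (p_static[1:] + p_static[:-1]))))
p_dynamic = np.empty_like(u)
p_dynamic[0] = p_static[0]
p_dynamic[1:] = work[1:] / u[1:]                     # energy-balanced capacity curve

print(f"peak static capacity  : {p_static.max():.0f} kN")
print(f"peak dynamic capacity : {p_dynamic.max():.0f} kN")
```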

  1. Commercialization of NESSUS: Status

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Millwater, Harry R.

    1991-01-01

    A plan was initiated in 1988 to commercialize the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) probabilistic structural analysis software. The goal of the on-going commercialization effort is to begin the transfer of Probabilistic Structural Analysis Method (PSAM) developed technology into industry and to develop additional funding resources in the general area of structural reliability. The commercialization effort is summarized. The SwRI NESSUS Software System is a general purpose probabilistic finite element computer program using state of the art methods for predicting stochastic structural response due to random loads, material properties, part geometry, and boundary conditions. NESSUS can be used to assess structural reliability, to compute probability of failure, to rank the input random variables by importance, and to provide a more cost effective design than traditional methods. The goal is to develop a general probabilistic structural analysis methodology to assist in the certification of critical components in the next generation Space Shuttle Main Engine.
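
    The kind of output described above, a probability of failure plus an importance ranking of the random inputs, can be illustrated with a crude Monte Carlo sketch on a simple strength-minus-load limit state. The random variables and distributions are hypothetical, and NESSUS itself relies on far more efficient probabilistic methods than plain Monte Carlo.

```python
# Crude Monte Carlo stand-in for the kind of result a probabilistic structural
# code produces: probability of failure of a limit state g = R - S and a
# correlation-based ranking of the input random variables.
import numpy as np

rng = np.random.default_rng(7)
n = 200_000
yield_strength = rng.normal(950.0, 60.0, n)    # MPa
pressure_load = rng.normal(700.0, 80.0, n)     # MPa equivalent stress
thickness_factor = rng.normal(1.0, 0.05, n)    # geometric variability

g = yield_strength * thickness_factor - pressure_load   # failure when g < 0
pof = np.mean(g < 0.0)
print(f"probability of failure ~ {pof:.4f}")

for name, sample in [("yield_strength", yield_strength),
                     ("pressure_load", pressure_load),
                     ("thickness_factor", thickness_factor)]:
    rho = np.corrcoef(sample, g)[0, 1]
    print(f"{name:17s} sensitivity (corr with g) = {rho:+.2f}")
```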

  2. Application of hybrid methodology to rotors in steady and maneuvering flight

    NASA Astrophysics Data System (ADS)

    Rajmohan, Nischint

    Helicopters are versatile flying machines that have capabilities that are unparalleled by fixed wing aircraft, such as operating in hover, performing vertical takeoff and landing on unprepared sites. This makes their use especially desirable in military and search-and-rescue operations. However, modern helicopters still suffer from high levels of noise and vibration caused by the physical phenomena occurring in the vicinity of the rotor blades. Therefore, improvement in rotorcraft design to reduce the noise and vibration levels requires understanding of the underlying physical phenomena, and accurate prediction capabilities of the resulting rotorcraft aeromechanics. The goal of this research is to study the aeromechanics of rotors in steady and maneuvering flight using hybrid Computational Fluid Dynamics (CFD) methodology. The hybrid CFD methodology uses the Navier-Stokes equations to solve the flow near the blade surface but the effect of the far wake is computed through the wake model. The hybrid CFD methodology is computationally efficient and its wake modeling approach is nondissipative making it an attractive tool to study rotorcraft aeromechanics. Several enhancements were made to the CFD methodology and it was coupled to a Computational Structural Dynamics (CSD) methodology to perform a trimmed aeroelastic analysis of a rotor in forward flight. The coupling analyses, both loose and tight were used to identify the key physical phenomena that affect rotors in different steady flight regimes. The modeling enhancements improved the airloads predictions for a variety of flight conditions. It was found that the tightly coupled method did not impact the loads significantly for steady flight conditions compared to the loosely coupled method. The coupling methodology was extended to maneuvering flight analysis by enhancing the computational and structural models to handle non-periodic flight conditions and vehicle motions in time accurate mode. The flight test control angles were employed to enable the maneuvering flight analysis. The fully coupled model provided the presence of three dynamic stall cycles on the rotor in maneuver. It is important to mention that analysis of maneuvering flight requires knowledge of the pilot input control pitch settings, and the vehicle states. As the result, these computational tools cannot be used for analysis of loads in a maneuver that has not been duplicated in a real flight. This is a significant limitation if these tools are to be selected during the design phase of a helicopter where its handling qualities are evaluated in different trajectories. Therefore, a methodology was developed to couple the CFD/CSD simulation with an inverse flight mechanics simulation to perform the maneuver analysis without using the flight test control input. The methodology showed reasonable convergence in steady flight regime and control angles predictions compared fairly well with test data. In the maneuvering flight regions, the convergence was slower due to relaxation techniques used for the numerical stability. The subsequent computed control angles for the maneuvering flight regions compared well with test data. Further, the enhancement of the rotor inflow computations in the inverse simulation through implementation of a Lagrangian wake model improved the convergence of the coupling methodology.

  3. High-Dimensional Sparse Factor Modeling: Applications in Gene Expression Genomics

    PubMed Central

    Carvalho, Carlos M.; Chang, Jeffrey; Lucas, Joseph E.; Nevins, Joseph R.; Wang, Quanli; West, Mike

    2010-01-01

    We describe studies in molecular profiling and biological pathway analysis that use sparse latent factor and regression models for microarray gene expression data. We discuss breast cancer applications and key aspects of the modeling and computational methodology. Our case studies aim to investigate and characterize heterogeneity of structure related to specific oncogenic pathways, as well as links between aggregate patterns in gene expression profiles and clinical biomarkers. Based on the metaphor of statistically derived “factors” as representing biological “subpathway” structure, we explore the decomposition of fitted sparse factor models into pathway subcomponents and investigate how these components overlay multiple aspects of known biological activity. Our methodology is based on sparsity modeling of multivariate regression, ANOVA, and latent factor models, as well as a class of models that combines all components. Hierarchical sparsity priors address questions of dimension reduction and multiple comparisons, as well as scalability of the methodology. The models include practically relevant non-Gaussian/nonparametric components for latent structure, underlying often quite complex non-Gaussianity in multivariate expression patterns. Model search and fitting are addressed through stochastic simulation and evolutionary stochastic search methods that are exemplified in the oncogenic pathway studies. Supplementary supporting material provides more details of the applications, as well as examples of the use of freely available software tools for implementing the methodology. PMID:21218139

  4. System architectures for telerobotic research

    NASA Technical Reports Server (NTRS)

    Harrison, F. Wallace

    1989-01-01

    Several activities are performed related to the definition and creation of telerobotic systems. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of the process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design from requirements analysis to code generation and hardware layout have begun to appear. Activities related to the system architecture of telerobots are described, including current activities designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.

  5. Damage detection methodology on beam-like structures based on combined modal Wavelet Transform strategy

    NASA Astrophysics Data System (ADS)

    Serra, Roger; Lopez, Lautaro

    2018-05-01

    Different approaches to damage detection based on dynamic measurements of structures have appeared in recent decades. They were based, amongst others, on changes in natural frequencies, modal curvatures, strain energy, or flexibility. Wavelet analysis has also been used to detect abnormalities in mode shapes induced by damage. However, the majority of previous work used signals not corrupted by noise. Moreover, the damage influence on each mode shape was studied separately. This paper proposes a new methodology based on a combined modal wavelet transform strategy to cope with noisy signals while at the same time extracting the relevant information from each mode shape. The proposed methodology is then compared with the most frequently used and widely studied methods from the bibliography. To evaluate the performance of each method, its capacity to detect and localize damage is analyzed in different cases. The comparison is made by simulating the oscillations of a cantilever steel beam with and without a defect as a numerical case. The proposed methodology proved to outperform classical methods when the signals are noisy.
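
    The single-mode wavelet step underlying such strategies can be sketched directly with NumPy: a Mexican-hat wavelet convolved with the deviation of a damaged cantilever mode shape from a healthy reference peaks at the defect location. The kink amplitude, defect position, and wavelet scale are invented for illustration, and the paper's combined multi-mode, noise-robust strategy is not reproduced.

```python
# Wavelet damage-localization sketch: a Mexican-hat CWT applied to the deviation
# of a damaged cantilever mode shape from a healthy reference. The hypothetical
# defect introduces a small slope discontinuity at x/L = 0.6.
import numpy as np

x = np.linspace(0.0, 1.0, 500)
beta = 1.8751                                   # first cantilever eigenvalue parameter
sigma = (np.sinh(beta) - np.sin(beta)) / (np.cosh(beta) + np.cos(beta))
healthy = (np.cosh(beta * x) - np.cos(beta * x)
           - sigma * (np.sinh(beta * x) - np.sin(beta * x)))
damaged = healthy + 0.002 * np.where(x > 0.6, x - 0.6, 0.0)   # mild local kink

def mexican_hat(t, s):
    u = t / s
    return (1.0 - u**2) * np.exp(-0.5 * u**2)

scale = 0.02
dx = x[1] - x[0]
kernel = mexican_hat(np.arange(-4 * scale, 4 * scale, dx), scale)
coeffs = np.convolve(damaged - healthy, kernel, mode="same")

margin = len(kernel) // 2 + 10                  # discard convolution edge effects
peak = x[margin + np.argmax(np.abs(coeffs[margin:-margin]))]
print(f"largest wavelet coefficient at x/L = {peak:.2f} (defect at 0.60)")
```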

  6. Exploratory High-Fidelity Aerostructural Optimization Using an Efficient Monolithic Solution Method

    NASA Astrophysics Data System (ADS)

    Zhang, Jenmy Zimi

    This thesis is motivated by the desire to discover fuel efficient aircraft concepts through exploratory design. An optimization methodology based on tightly integrated high-fidelity aerostructural analysis is proposed, which has the flexibility, robustness, and efficiency to contribute to this goal. The present aerostructural optimization methodology uses an integrated geometry parameterization and mesh movement strategy, which was initially proposed for aerodynamic shape optimization. This integrated approach provides the optimizer with a large amount of geometric freedom for conducting exploratory design, while allowing for efficient and robust mesh movement in the presence of substantial shape changes. In extending this approach to aerostructural optimization, this thesis has addressed a number of important challenges. A structural mesh deformation strategy has been introduced to translate consistently the shape changes described by the geometry parameterization to the structural model. A three-field formulation of the discrete steady aerostructural residual couples the mesh movement equations with the three-dimensional Euler equations and a linear structural analysis. Gradients needed for optimization are computed with a three-field coupled adjoint approach. A number of investigations have been conducted to demonstrate the suitability and accuracy of the present methodology for use in aerostructural optimization involving substantial shape changes. Robustness and efficiency in the coupled solution algorithms is crucial to the success of an exploratory optimization. This thesis therefore also focuses on the design of an effective monolithic solution algorithm for the proposed methodology. This involves using a Newton-Krylov method for the aerostructural analysis and a preconditioned Krylov subspace method for the coupled adjoint solution. Several aspects of the monolithic solution method have been investigated. These include appropriate strategies for scaling and matrix-vector product evaluation, as well as block preconditioning techniques that preserve the modularity between subproblems. The monolithic solution method is applied to problems with varying degrees of fluid-structural coupling, as well as a wing span optimization study. The monolithic solution algorithm typically requires 20%-70% less computing time than its partitioned counterpart. This advantage increases with increasing wing flexibility. The performance of the monolithic solution method is also much less sensitive to the choice of the solution parameter.

  7. Application of damage tolerance methodology in certification of the Piaggio P-180 Avanti

    NASA Technical Reports Server (NTRS)

    Johnson, Jerry

    1992-01-01

    The Piaggio P-180 Avanti, a twin pusher-prop engine nine-passenger business aircraft was certified in 1990, to the requirements of FAR Part 23 and Associated Special Conditions for Composite Structure. Certification included the application of a damage tolerant methodology to the design of the composite forward wing and empennage (vertical fin, horizontal stabilizer, tailcone, and rudder) structure. This methodology included an extensive analytical evaluation coupled with sub-component and full-scale testing of the structure. The work from the Damage Tolerance Analysis Assessment was incorporated into the full-scale testing. Damage representing hazards such as dropped tools, ground equipment, handling, and runway debris, was applied to the test articles. Additional substantiation included allowing manufacturing discrepancies to exist unrepaired on the full-scale articles and simulated bondline failures in critical elements. The importance of full-scale testing in the critical environmental conditions and the application of critical damage are addressed. The implication of damage tolerance on static and fatigue testing is discussed. Good correlation between finite element solutions and experimental test data was observed.

  8. Identifying Items to Assess Methodological Quality in Physical Therapy Trials: A Factor Analysis

    PubMed Central

    Cummings, Greta G.; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd

    2014-01-01

    Background Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. Objective The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). Design A methodological research design was used, and an EFA was performed. Methods Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Results Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Limitation Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. Conclusions To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association among these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items. PMID:24786942

  9. Identifying items to assess methodological quality in physical therapy trials: a factor analysis.

    PubMed

    Armijo-Olivo, Susan; Cummings, Greta G; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd

    2014-09-01

    Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). A methodological research design was used, and an EFA was performed. Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association among these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items. © 2014 American Physical Therapy Association.
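
    A minimal exploratory factor analysis of the kind described above can be sketched on synthetic item scores. scikit-learn's FactorAnalysis (a maximum-likelihood based extraction) with varimax rotation stands in here for the principal axis factoring used in the study, and the two latent factors and nine items are invented for illustration.

```python
# Exploratory factor analysis sketch on synthetic "trial quality item" scores.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(42)
n_trials = 300
latent = rng.standard_normal((n_trials, 2))          # two hypothetical bias factors
loadings = np.array([
    [0.8, 0.0], [0.7, 0.1], [0.9, 0.0], [0.6, 0.2],              # factor 1 items
    [0.1, 0.8], [0.0, 0.7], [0.2, 0.9], [0.0, 0.6], [0.1, 0.5],  # factor 2 items
])
items = latent @ loadings.T + 0.4 * rng.standard_normal((n_trials, 9))

fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(items)
print("rotated loadings (items x factors):")
print(np.round(fa.components_.T, 2))
```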

  10. Analysis of shell type structures subjected to time dependent mechanical and thermal loading

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.; Carlson, R. L.; Riff, R.

    1985-01-01

    A general mathematical model and solution methodologies for analyzing structural response of thin, metallic shell-type structures under large transient, cyclic or static thermomechanical loads is considered. Among the system responses, which are associated with these load conditions, are thermal buckling, creep buckling and ratchetting. Thus, geometric as well as material-type nonlinearities (of high order) can be anticipated and must be considered in the development of the mathematical model.

  11. Next Steps in Bayesian Structural Equation Models: Comments on, Variations of, and Extensions to Muthen and Asparouhov (2012)

    ERIC Educational Resources Information Center

    Rindskopf, David

    2012-01-01

    Muthen and Asparouhov (2012) made a strong case for the advantages of Bayesian methodology in factor analysis and structural equation models. I show additional extensions and adaptations of their methods and show how non-Bayesians can take advantage of many (though not all) of these advantages by using interval restrictions on parameters. By…

  12. Information Technology Governance, Funding and Structure: A Case Analysis of a Public University in Malaysia

    ERIC Educational Resources Information Center

    Ismail, Noor Azizi

    2008-01-01

    Purpose: The paper's purpose is to investigate the issues of IT governance, funding and structure of a public university in Malaysia. Design/methodology/approach: The study uses a case study approach, i.e. a series of interviews with users and information services provider of campus information system. Findings: The university lacks a common…

  13. Principal Cluster Axes: A Projection Pursuit Index for the Preservation of Cluster Structures in the Presence of Data Reduction

    ERIC Educational Resources Information Center

    Steinley, Douglas; Brusco, Michael J.; Henson, Robert

    2012-01-01

    A measure of "clusterability" serves as the basis of a new methodology designed to preserve cluster structure in a reduced dimensional space. Similar to principal component analysis, which finds the direction of maximal variance in multivariate space, principal cluster axes find the direction of maximum clusterability in multivariate space.…

  14. Hashin Failure Theory Based Damage Assessment Methodology of Composite Tidal Turbine Blades and Implications for the Blade Design

    NASA Astrophysics Data System (ADS)

    Yu, Guo-qing; Ren, Yi-ru; Zhang, Tian-tian; Xiao, Wan-shen; Jiang, Hong-yong

    2018-04-01

    A damage assessment methodology based on the Hashin failure theory for glass fiber reinforced polymer (GFRP) composite blades is proposed. The typical failure mechanisms, including fiber tension/compression and matrix tension/compression, are considered to describe the damage behaviors. To obtain the flapwise and edgewise loading along the blade span, Blade Element Momentum Theory (BEMT) is adopted. In conjunction with the hydrodynamic analysis, the structural analysis of the composite blade is performed with the Hashin damage model. The damage characteristics of the composite blade, under normal and extreme operational conditions, are comparatively analyzed. Numerical results demonstrate that matrix tension damage is the most significant failure mode and occurs in the mid-span of the blade. The blade internal configurations, including the box-beam, I-beam, left-C beam and right-C beam, are compared and analyzed. GFRP and carbon fiber reinforced polymer (CFRP) materials are considered and combined. Numerical results show that the I-beam is the best structural type. The structural performance of composite tidal turbine blades could be improved by combining GFRP and CFRP structures, considering damage resistance and cost-effectiveness together.
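
    The fiber and matrix failure modes listed above are commonly evaluated with the plane-stress form of the Hashin criteria. The sketch below uses illustrative GFRP-like strength values and an arbitrary ply stress state, not the blade data from the paper, and reports the mode-dependent failure indices (failure is indicated when an index reaches 1).

```python
import numpy as np

# illustrative unidirectional-ply strengths [Pa]; not values from the paper
Xt, Xc = 1100e6, 650e6      # fiber-direction tensile / compressive strength
Yt, Yc = 35e6, 120e6        # transverse tensile / compressive strength
S12, S23 = 70e6, 40e6       # in-plane / transverse shear strength

def hashin_indices(s11, s22, t12):
    """Hashin fiber and matrix failure indices for a plane-stress ply state;
    the tension or compression form is selected by the sign of the normal stress."""
    if s11 >= 0.0:
        fiber = (s11 / Xt) ** 2 + (t12 / S12) ** 2          # fiber tension
    else:
        fiber = (s11 / Xc) ** 2                             # fiber compression
    if s22 >= 0.0:
        matrix = (s22 / Yt) ** 2 + (t12 / S12) ** 2         # matrix tension
    else:                                                   # matrix compression
        matrix = ((s22 / (2 * S23)) ** 2
                  + ((Yc / (2 * S23)) ** 2 - 1.0) * (s22 / Yc)
                  + (t12 / S12) ** 2)
    return fiber, matrix

fiber_idx, matrix_idx = hashin_indices(s11=300e6, s22=20e6, t12=25e6)
print(f"fiber index:  {fiber_idx:.2f}")
print(f"matrix index: {matrix_idx:.2f}")
```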

  15. A new approach to flood vulnerability assessment for historic buildings in England

    NASA Astrophysics Data System (ADS)

    Stephenson, V.; D'Ayala, D.

    2014-05-01

    The recent increase in frequency and severity of flooding in the UK has led to a shift in the perception of risk associated with flood hazards. This has extended to the conservation community, and the risks posed to historic structures that suffer from flooding are particularly concerning for those charged with preserving and maintaining such buildings. In order to fully appraise the risks in a manner appropriate to the complex issue of preservation, a new methodology is presented here that studies the nature of the vulnerability of such structures, and places it in the context of risk assessment, accounting for the vulnerable object and the subsequent exposure of that object to flood hazards. The testing of the methodology is carried out using three urban case studies and the results of the survey analysis provide guidance on the development of fragility curves for historic structures exposed to flooding. This occurs through appraisal of vulnerability indicators related to building form, structural and fabric integrity, and preservation of architectural and archaeological values. Key findings of the work include determining the applicability of these indicators to fragility analysis, and the determination of the relative vulnerability of the three case study sites.

  16. Parametric representation of weld fillets using shell finite elements—a proposal based on minimum stiffness and inertia errors

    NASA Astrophysics Data System (ADS)

    Echer, L.; Marczak, R. J.

    2018-02-01

    The objective of the present work is to introduce a methodology capable of modelling welded components for structural stress analysis. The modelling technique was based on the recommendations of the International Institute of Welding; however, some geometrical features of the weld fillet were used as design parameters in an optimization problem. Namely, the weld leg length and the thickness of the shell elements representing the weld fillet were optimized in such a way that the first natural frequencies were not changed significantly when compared to a reference result. Sequential linear programming was performed for T-joint structures corresponding to two different structural details: with and without full-penetration weld fillets. Both structural details were tested in scenarios of various plate thicknesses and depths. Once the optimal parameters were found, a modelling procedure was proposed for T-shaped components. Furthermore, the proposed modelling technique was extended to overlapped welded joints. The results obtained were compared to well-established methodologies presented in standards and in the literature. The comparisons included results for natural frequencies, total mass and structural stress. These comparisons showed that some established practices produce significant errors in the overall stiffness and inertia. The methodology proposed herein does not share this issue and can be easily extended to other types of structures.

  17. Multidisciplinary analysis and design of printed wiring boards

    NASA Astrophysics Data System (ADS)

    Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin

    1991-04-01

    Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment denoted Thermal Structural Electromagnetic Testability (TSET) being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic structured methodology.

  18. Aeroelastic optimization methodology for viscous and turbulent flows

    NASA Astrophysics Data System (ADS)

    Barcelos Junior, Manuel Nascimento Dias

    2007-12-01

    In recent years, the development of faster computers and parallel processing has allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to the final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows by using high-fidelity analysis and sensitivity analysis techniques. Most of the research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases with strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the linearized systems of equations for the aeroelastic and sensitivity analyses. The methodologies developed in this work are tested and verified on realistic aeroelastic systems.
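
    The fixed-point block Gauss-Seidel idea for the quasi-static coupled problem can be illustrated with a toy two-degree-of-freedom model. The stiffness matrix, the deflection-dependent load model, and the relaxation factor below are placeholders, not the Navier-Stokes-based formulation of the thesis.

```python
import numpy as np

# Toy quasi-static aeroelastic coupling solved by nonlinear block Gauss-Seidel
# (staggered) iteration with under-relaxation.

K = np.array([[4.0e4, -1.0e4],
              [-1.0e4, 2.0e4]])           # structural stiffness [N/m] (illustrative)

def aero_load(u, q=5.0e3):
    """Hypothetical aerodynamic load that depends on the deflected shape."""
    return q * np.array([1.0 + 0.8 * u[0], 0.5 + 0.3 * u[1]])

u = np.zeros(2)
omega = 0.7                               # under-relaxation factor
for k in range(100):
    f = aero_load(u)                      # "fluid" step: loads for the current shape
    u_new = np.linalg.solve(K, f)         # "structure" step: deflection under the loads
    if np.linalg.norm(u_new - u) < 1e-10 * (1.0 + np.linalg.norm(u)):
        u = u_new
        break
    u = (1.0 - omega) * u + omega * u_new # Gauss-Seidel update with relaxation
print(f"converged in {k + 1} iterations, u = {u}")
```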

  19. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models.

    PubMed

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J

    2015-03-15

    This paper aims to contribute to developing better ways of incorporating essential human elements in decision making processes for the modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent-based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an agent-based model. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. The factor structure of the Values in Action Inventory of Strengths (VIA-IS): An item-level exploratory structural equation modeling (ESEM) bifactor analysis.

    PubMed

    Ng, Vincent; Cao, Mengyang; Marsh, Herbert W; Tay, Louis; Seligman, Martin E P

    2017-08-01

    The factor structure of the Values in Action Inventory of Strengths (VIA-IS; Peterson & Seligman, 2004) has not been well established as a result of methodological challenges primarily attributable to a global positivity factor, item cross-loading across character strengths, and questions concerning the unidimensionality of the scales assessing character strengths. We sought to overcome these methodological challenges by applying exploratory structural equation modeling (ESEM) at the item level using a bifactor analytic approach to a large sample of 447,573 participants who completed the VIA-IS with all 240 character strengths items and a reduced set of 107 unidimensional character strength items. It was found that a 6-factor bifactor structure generally held for the reduced set of unidimensional character strength items; these dimensions were justice, temperance, courage, wisdom, transcendence, humanity, and an overarching general factor that is best described as dispositional positivity. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  1. The Tacitness of Tacitus. A Methodological Approach to European Thought. No. 46.

    ERIC Educational Resources Information Center

    Bierschenk, Bernhard

    This study measured the analysis of verbal flows by means of volume-elasticity measures and the analysis of information flow structures and their representations in the form of a metaphysical cube. A special purpose system of computer programs (PERTEX) was used to establish the language space in which the textual flow patterns occurred containing…

  2. qHNMR Analysis of Purity of Common Organic Solvents--An Undergraduate Quantitative Analysis Laboratory Experiment

    ERIC Educational Resources Information Center

    Bell, Peter T.; Whaley, W. Lance; Tochterman, Alyssa D.; Mueller, Karl S.; Schultz, Linda D.

    2017-01-01

    NMR spectroscopy is currently a premier technique for structural elucidation of organic molecules. Quantitative NMR (qNMR) methodology has developed more slowly but is now widely accepted, especially in the areas of natural product and medicinal chemistry. However, many undergraduate students are not routinely exposed to this important concept.…

  3. Exploring the Micro-Social Geography of Children's Interactions in Preschool: A Long-Term Observational Study and Analysis Using Geographic Information Technologies

    ERIC Educational Resources Information Center

    Torrens, Paul M.; Griffin, William A.

    2013-01-01

    The authors describe an observational and analytic methodology for recording and interpreting dynamic microprocesses that occur during social interaction, making use of space--time data collection techniques, spatial-statistical analysis, and visualization. The scheme has three investigative foci: Structure, Activity Composition, and Clustering.…

  4. A novel hybrid joining methodology for composite to steel joints

    NASA Astrophysics Data System (ADS)

    Sarh, Bastian

    This research has established a novel approach for designing, analyzing, and fabricating load-bearing structural connections between resin-infused composite materials and components made of steel or other metals or alloys. A design philosophy is proposed wherein overlapping joint sections comprised of fiber reinforced plastics (FRPs) and steel members are connected via a combination of adhesive bonding and integrally placed composite pins. A film adhesive is placed into the dry stack prior to resin infusion and is cured after infusion, either through local heating elements or by placing the structure in an oven. The composite pins are introduced by perforating the steel member with holes and placing pre-formed composite pins through them, also prior to resin infusion of the composite section. In this manner, the joints are co-molded structures and secondary processing is eliminated. It is shown that such joints blend the structural benefits of adhesively bonded and mechanically fastened joints, and that the fabrication process is feasible for low-cost, large-scale production as applicable to the shipbuilding industry. Analysis procedures used for designing such joints are presented, consisting of an adhesive joint design theory and a pin placement theory. These analysis tools are used in the design of specimens; specific designs are fabricated and evaluated through structural tests, including quasi-static loading and low-cycle fatigue. This research has thereby established a design philosophy for such joints, created the manufacturing technique for fabricating them, developed simple-to-apply analysis procedures for their design (consisting of both an adhesive and a pin placement analysis), and validated the methodology through specimen fabrication and testing.

  5. Methodological issues in volumetric magnetic resonance imaging of the brain in the Edinburgh High Risk Project.

    PubMed

    Whalley, H C; Kestelman, J N; Rimmington, J E; Kelso, A; Abukmeil, S S; Best, J J; Johnstone, E C; Lawrie, S M

    1999-07-30

    The Edinburgh High Risk Project is a longitudinal study of brain structure (and function) in subjects at high risk of developing schizophrenia in the next 5-10 years for genetic reasons. In this article we describe the methods of volumetric analysis of structural magnetic resonance images used in the study. We also consider potential sources of error in these methods: the validity of our image analysis techniques; inter- and intra-rater reliability; possible positional variation; and thresholding criteria used in separating brain from cerebro-spinal fluid (CSF). Investigation with a phantom test object (of similar imaging characteristics to the brain) provided evidence for the validity of our image acquisition and analysis techniques. Both inter- and intra-rater reliability were found to be good in whole brain measures but less so for smaller regions. There were no statistically significant differences in positioning across the three study groups (patients with schizophrenia, high risk subjects and normal volunteers). A new technique for thresholding MRI scans longitudinally is described (the 'rescale' method) and compared with our established method (thresholding by eye). Few differences between the two techniques were seen at 3- and 6-month follow-up. These findings demonstrate the validity and reliability of the structural MRI analysis techniques used in the Edinburgh High Risk Project, and highlight methodological issues of general concern in cross-sectional and longitudinal studies of brain structure in healthy control subjects and neuropsychiatric populations.

  6. A methodology for formulating a minimal uncertainty model for robust control system design and analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1989-01-01

    In the design and analysis of robust control systems for uncertain plants, the technique of formulating what is termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents the transfer function matrix M(s) of the nominal system, and delta represents an uncertainty matrix acting on M(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or multiple unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, and for real parameter variations the diagonal elements are real. As stated in the literature, this structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the literature addresses methods for obtaining this structure, and none of it addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty. Since having a delta matrix of minimum order would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta model would be useful. A generalized method of obtaining a minimal M-delta structure for systems with real parameter variations is given.
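
    The basic idea of pulling a real parameter variation out of the plant into a feedback delta block can be shown on a small numerical example. The two-state plant and rank-one parameter entry below are hypothetical; the sketch only verifies that the upper linear fractional transformation of the assembled M data reproduces A(delta), and does not implement the minimal-order construction proposed in the report.

```python
import numpy as np

A0 = np.array([[0.0, 1.0],
               [-2.0, -0.5]])
# The parameter variation enters as A(delta) = A0 + delta * B1 @ C1 (rank one,
# so a single real scalar delta block suffices for this example).
B1 = np.array([[0.0], [1.0]])
C1 = np.array([[1.0, 0.0]])

def A_of_delta(delta):
    """State matrix with the real parameter variation included directly."""
    return A0 + delta * (B1 @ C1)

def closed_loop(delta, D11=np.zeros((1, 1))):
    """Upper LFT of the (A0, B1, C1, D11) data closed with the scalar delta block."""
    Delta = np.array([[delta]])
    inv = np.linalg.inv(np.eye(1) - D11 @ Delta)
    return A0 + B1 @ Delta @ inv @ C1

for d in (-0.3, 0.0, 0.7):
    assert np.allclose(A_of_delta(d), closed_loop(d))
print("LFT reconstruction matches A(delta) for the sampled parameter values")
```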

  7. Development of hazard-compatible building fragility and vulnerability models

    USGS Publications Warehouse

    Karaca, E.; Luco, N.

    2008-01-01

    We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.
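
    A minimal sketch of the response-statistics-to-fragility step described above is given below. The drift demands, damage-state threshold, and lognormal assumptions are invented for illustration; they are not HAZUS parameters or values from the study.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
sa_levels = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])       # Sa(T1) [g]
# peak drift ratios from response-history runs: rows = Sa level, cols = records
median_demand = 0.004 * sa_levels / 0.1                     # assumed demand scaling
drifts = rng.lognormal(mean=np.log(median_demand)[:, None], sigma=0.4, size=(6, 20))

drift_capacity = 0.010          # hypothetical "moderate" damage-state threshold

ln_median = np.mean(np.log(drifts), axis=1)                 # lognormal demand statistics
beta = np.std(np.log(drifts), axis=1)

# P(damage state reached or exceeded | Sa)
p_exceed = norm.cdf((ln_median - np.log(drift_capacity)) / beta)
for sa, p in zip(sa_levels, p_exceed):
    print(f"Sa = {sa:.1f} g -> P(DS >= moderate) = {p:.3f}")
```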

  8. Explosion/Blast Dynamics for Constellation Launch Vehicles Assessment

    NASA Technical Reports Server (NTRS)

    Baer, Mel; Crawford, Dave; Hickox, Charles; Kipp, Marlin; Hertel, Gene; Morgan, Hal; Ratzel, Arthur; Cragg, Clinton H.

    2009-01-01

    An assessment methodology is developed to guide quantitative predictions of adverse physical environments and the subsequent effects on the Ares-1 crew launch vehicle associated with the loss of containment of cryogenic liquid propellants from the upper stage during ascent. Development of the methodology is led by a team at Sandia National Laboratories (SNL) with guidance and support from a number of National Aeronautics and Space Administration (NASA) personnel. The methodology is based on the current Ares-1 design and feasible accident scenarios. These scenarios address containment failure from debris impact or structural response to pressure or blast loading from an external source. Once containment is breached, the envisioned assessment methodology includes predictions for the sequence of physical processes stemming from cryogenic tank failure. The investigative techniques, analysis paths, and numerical simulations that comprise the proposed methodology are summarized and appropriate simulation software is identified in this report.

  9. A Damage-Dependent Finite Element Analysis for Fiber-Reinforced Composite Laminates

    NASA Technical Reports Server (NTRS)

    Coats, Timothy W.; Harris, Charles E.

    1998-01-01

    A progressive damage methodology has been developed to predict damage growth and residual strength of fiber-reinforced composite structure with through penetrations such as a slit. The methodology consists of a damage-dependent constitutive relationship based on continuum damage mechanics. Damage is modeled using volume averaged strain-like quantities known as internal state variables and is represented in the equilibrium equations as damage induced force vectors instead of the usual degradation and modification of the global stiffness matrix.

  10. Multidisciplinary Concurrent Design Optimization via the Internet

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Kelkar, Atul G.; Koganti, Gopichand

    2001-01-01

    A methodology is presented which uses commercial design and analysis software and the Internet to perform concurrent multidisciplinary optimization. The methodology provides a means to develop multidisciplinary designs without requiring that all software be accessible from the same local network. The procedures are amenable to design and development teams whose members, expertise, and software are not located in the same place. This methodology facilitates multidisciplinary teams working concurrently on a design problem of common interest. Partitioning the design software across different machines allows each constituent software package to be used on the machine that provides the most economy and efficiency. The methodology is demonstrated on the concurrent design of a spacecraft structure and attitude control system. Results are compared to those derived from performing the design with an autonomous FORTRAN program.

  11. Working with Data: Discovering Knowledge through Mining and Analysis; Systematic Knowledge Management and Knowledge Discovery; Text Mining; Methodological Approach in Discovering User Search Patterns through Web Log Analysis; Knowledge Discovery in Databases Using Formal Concept Analysis; Knowledge Discovery with a Little Perspective.

    ERIC Educational Resources Information Center

    Qin, Jian; Jurisica, Igor; Liddy, Elizabeth D.; Jansen, Bernard J; Spink, Amanda; Priss, Uta; Norton, Melanie J.

    2000-01-01

    These six articles discuss knowledge discovery in databases (KDD). Topics include data mining; knowledge management systems; applications of knowledge discovery; text and Web mining; text mining and information retrieval; user search patterns through Web log analysis; concept analysis; data collection; and data structure inconsistency. (LRW)

  12. Comparison of Requirements for Composite Structures for Aircraft and Space Applications

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Elliott, Kenny B.; Hampton, Roy W.; Knight, Norman F., Jr.; Aggarwal, Pravin; Engelstad, Stephen P.; Chang, James B.

    2010-01-01

    In this paper, the aircraft and space vehicle requirements for composite structures are compared. It is a valuable exercise to study composite structural design approaches used in the airframe industry, and to adopt methodology that is applicable for space vehicles. The missions, environments, analysis methods, analysis validation approaches, testing programs, build quantities, inspection, and maintenance procedures used by the airframe industry, in general, are not transferable to spaceflight hardware. Therefore, while the application of composite design approaches from other industries is appealing, many aspects cannot be directly utilized. Nevertheless, experiences and research for composite aircraft structures may be of use in unexpected arenas as space exploration technology develops, and so continued technology exchanges are encouraged.

  13. Using data from monitoring combined sewer overflows to assess, improve, and maintain combined sewer systems.

    PubMed

    Montserrat, A; Bosch, Ll; Kiser, M A; Poch, M; Corominas, Ll

    2015-02-01

    Using low-cost sensors, data can be collected on the occurrence and duration of overflows in each combined sewer overflow (CSO) structure in a combined sewer system (CSS). The collection and analysis of real data can be used to assess, improve, and maintain CSSs in order to reduce the number and impact of overflows. The objective of this study was to develop a methodology to evaluate the performance of CSSs using low-cost monitoring. This methodology includes (1) assessing the capacity of a CSS using overflow duration and rain volume data, (2) characterizing the performance of CSO structures with statistics, (3) evaluating the compliance of a CSS with government guidelines, and (4) generating decision tree models to provide support to managers for making decisions about system maintenance. The methodology is demonstrated with a case study of a CSS in La Garriga, Spain. The rain volume breaking point from which CSO structures started to overflow ranged from 0.6 mm to 2.8 mm. The structures with the best and worst performance in terms of overflow (overflow probability, order, duration and CSO ranking) were characterized. Most of the obtained decision trees to predict overflows from rain data had accuracies ranging from 70% to 83%. The results obtained from the proposed methodology can greatly support managers and engineers dealing with real-world problems, improvements, and maintenance of CSSs. Copyright © 2014 Elsevier B.V. All rights reserved.
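
    Step (4) of the methodology, decision-tree models that predict overflows from rain data, can be sketched with scikit-learn. The synthetic rain and overflow records below stand in for the low-cost monitoring data; the accuracies reported in the abstract come from the real La Garriga data, not from this toy.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
rain_volume = rng.gamma(shape=2.0, scale=2.0, size=500)        # mm (synthetic)
rain_intensity = rng.gamma(shape=1.5, scale=3.0, size=500)     # mm/h (synthetic)
# assume overflow when volume and intensity are jointly high, plus some noise
overflow = ((rain_volume > 2.0) & (rain_intensity > 3.0)) | (rng.random(500) < 0.05)

X = np.column_stack([rain_volume, rain_intensity])
X_tr, X_te, y_tr, y_te = train_test_split(X, overflow, test_size=0.3, random_state=0)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print(f"hold-out accuracy: {tree.score(X_te, y_te):.2f}")
print(export_text(tree, feature_names=["rain_volume_mm", "rain_intensity_mm_h"]))
```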

  14. Fourteenth NASTRAN (R) Users' Colloquium

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The proceedings of a colloquium are presented along with technical papers contributed during the conference. Reviewed are general applications of finite element methodology and the specific application of the NASA Structural Analysis System, NASTRAN, to a variety of static and dynamic structural problems.

  15. Power, Revisited

    ERIC Educational Resources Information Center

    Roscigno, Vincent J.

    2011-01-01

    Power is a core theoretical construct in the field with amazing utility across substantive areas, levels of analysis and methodologies. Yet, its use along with associated assumptions--assumptions surrounding constraint vs. action and specifically organizational structure and rationality--remain problematic. In this article, and following an…

  16. A Hierarchical Clustering Methodology for the Estimation of Toxicity

    EPA Science Inventory

    A Quantitative Structure Activity Relationship (QSAR) methodology based on hierarchical clustering was developed to predict toxicological endpoints. This methodology utilizes Ward's method to divide a training set into a series of structurally similar clusters. The structural sim...

  17. Design and analysis of composite structures with stress concentrations

    NASA Technical Reports Server (NTRS)

    Garbo, S. P.

    1983-01-01

    An overview of an analytic procedure which can be used to provide comprehensive stress and strength analysis of composite structures with stress concentrations is given. The methodology provides designer/analysts with a user-oriented procedure which, within acceptable engineering accuracy, accounts for the effects of a wide range of application design variables. The procedure permits the strength of arbitrary laminate constructions under general bearing/bypass load conditions to be predicted with only unnotched unidirectional strength and stiffness input data required. Included is a brief discussion of the relevancy of this analysis to the design of primary aircraft structure; an overview of the analytic procedure with theory/test correlations; and an example of the use and interaction of this strength analysis relative to the design of high-load transfer bolted composite joints.

  18. Improving stability and strength characteristics of framed structures with nonlinear behavior

    NASA Technical Reports Server (NTRS)

    Pezeshk, Shahram

    1990-01-01

    In this paper an optimal design procedure is introduced to improve the overall performance of nonlinear framed structures. The design methodology presented here is a multiple-objective optimization procedure whose objective functions involve the buckling eigenvalues and eigenvectors of the structure. A constant volume with bounds on the design variables is used in conjunction with an optimality criterion approach. The method provides a general tool for solving complex design problems and generally leads to structures with better limit strength and stability. Many algorithms have been developed to improve the limit strength of structures. In most applications geometrically linear analysis is employed, with the consequence that the overall strength of the design is overestimated. Directly optimizing the limit load of the structure would require a full nonlinear analysis at each iteration, which would be prohibitively expensive. The objective of this paper is to develop an algorithm that can improve the limit load of geometrically nonlinear framed structures while avoiding the nonlinear analysis. One of the novelties of the new design methodology is its ability to efficiently model and design structures under multiple loading conditions. These loading conditions can be different factored loads or any kind of loads that can be applied to the structure simultaneously or independently. Attention is focused on the optimal design of space framed structures. Three-dimensional design problems are more complicated to carry out, but they yield insight into the real behavior of the structure and can help avoid some of the problems that might appear in a planar design procedure, such as the need for an out-of-plane buckling constraint. Although researchers in the field of structural engineering generally agree that optimum design of three-dimensional building frames, especially in seismic regions, would be beneficial, methods have been slow to emerge. Most of the research in this area has dealt with the optimization of truss and plane frame structures.
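
    The buckling eigenvalues and eigenvectors that enter the objective functions come from the linearized buckling eigenproblem. The sketch below solves it for a hypothetical two-degree-of-freedom model; the elastic and geometric stiffness matrices are illustrative, not a real frame discretization.

```python
import numpy as np
from scipy.linalg import eigh

# linearized buckling eigenproblem: (K - lambda * Kg) * phi = 0
K = np.array([[12.0, -6.0],
              [-6.0,  8.0]])      # elastic stiffness (illustrative)
Kg = np.array([[0.6, -0.1],
               [-0.1, 0.4]])      # geometric stiffness under the reference load

eigvals, eigvecs = eigh(K, Kg)    # generalized symmetric eigenproblem
lam_cr = eigvals[0]               # smallest eigenvalue = critical load factor
phi_cr = eigvecs[:, 0]
print(f"critical load factor: {lam_cr:.3f}")
print(f"buckling mode shape:  {phi_cr / np.abs(phi_cr).max()}")
```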

  19. Ab initio modeling of complex amorphous transition-metal-based ceramics.

    PubMed

    Houska, J; Kos, S

    2011-01-19

    Binary and ternary amorphous transition metal (TM) nitrides and oxides are of great interest because of their suitability for diverse applications ranging from high-temperature machining to the production of optical filters or electrochromic devices. However, understanding of bonding in, and electronic structure of, these materials represents a challenge mainly due to the d electrons in their valence band. In the present work, we report ab initio calculations of the structure and electronic structure of ZrSiN materials. We focus on the methodology needed for the interpretation and automatic analysis of the bonding structure, on the effect of the length of the calculation on the convergence of individual quantities of interest and on the electronic structure of materials. We show that the traditional form of the Wannier function center-based algorithm fails due to the presence of d electrons in the valence band. We propose a modified algorithm, which allows one to analyze bonding structure in TM-based systems. We observe an appearance of valence p states of TM atoms in the electronic spectra of such systems (not only ZrSiN but also NbO(x) and WAuO), and examine the importance of the p states for the character of the bonding as well as for facilitating the bonding analysis. The results show both the physical phenomena and the computational methodology valid for a wide range of TM-based ceramics.

  20. Final Report for Dynamic Models for Causal Analysis of Panel Data. Methodological Overview. Part II, Chapter 1.

    ERIC Educational Resources Information Center

    Hannan, Michael T.

    This technical document, part of a series of chapters described in SO 011 759, describes a basic model of panel analysis used in a study of the causes of institutional and structural change in nations. Panel analysis is defined as a record of state occupancy of a sample of units at two or more points in time; for example, voters disclose voting…

  1. Ozone data and mission sampling analysis

    NASA Technical Reports Server (NTRS)

    Robbins, J. L.

    1980-01-01

    A methodology was developed to analyze discrete data obtained from the global distribution of ozone. Statistical analysis techniques were applied to describe the distribution of data variance in terms of empirical orthogonal functions and components of spherical harmonic models. The effects of uneven data distribution and missing data were considered. Data fill based on the autocorrelation structure of the data is described. Computer coding of the analysis techniques is included.
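
    The empirical orthogonal function part of the variance description can be sketched as an SVD of the anomaly data matrix. The random field below is a placeholder for the gridded ozone samples; the spherical harmonic modeling and data-fill steps are not shown.

```python
import numpy as np

rng = np.random.default_rng(2)
n_time, n_grid = 120, 400
field = rng.normal(size=(n_time, n_grid))            # placeholder time x grid ozone data

anomalies = field - field.mean(axis=0)               # remove the time mean
U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)

eofs = Vt                                            # spatial patterns (EOFs)
pcs = U * s                                          # principal component time series
explained = s ** 2 / np.sum(s ** 2)                  # fraction of variance per EOF
print("variance explained by first 3 EOFs:", np.round(explained[:3], 3))
```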

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faulds, James

    We conducted a comprehensive analysis of the structural controls of geothermal systems within the Great Basin and adjacent regions. Our main objectives were to: 1) Produce a catalogue of favorable structural environments and models for geothermal systems. 2) Improve site-specific targeting of geothermal resources through detailed studies of representative sites, which included innovative techniques of slip tendency analysis of faults and 3D modeling. 3) Compare and contrast the structural controls and models in different tectonic settings. 4) Synthesize data and develop methodologies for enhancement of exploration strategies for conventional and EGS systems, reduction in the risk of drilling non-productive wells, and selecting the best EGS sites.

  3. Network representation of protein interactions: Theory of graph description and analysis.

    PubMed

    Kurzbach, Dennis

    2016-09-01

    A methodological framework is presented for the graph theoretical interpretation of NMR data of protein interactions. The proposed analysis generalizes the idea of network representations of protein structures by expanding it to protein interactions. This approach is based on regularization of residue-resolved NMR relaxation times and chemical shift data and subsequent construction of an adjacency matrix that represents the underlying protein interaction as a graph or network. The network nodes represent protein residues. Two nodes are connected if two residues are functionally correlated during the protein interaction event. The analysis of the resulting network enables the quantification of the importance of each amino acid of a protein for its interactions. Furthermore, the determination of the pattern of correlations between residues yields insights into the functional architecture of an interaction. This is of special interest for intrinsically disordered proteins, since the structural (three-dimensional) architecture of these proteins and their complexes is difficult to determine. The power of the proposed methodology is demonstrated using the example of the interaction between the intrinsically disordered protein osteopontin and its natural ligand heparin. © 2016 The Protein Society.
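
    One plausible reading of the construction described above is sketched below: residue-resolved observables are correlated, the correlation matrix is thresholded into an adjacency matrix, and graph centralities rank residue importance. The random data, the 0.7 threshold, and the choice of betweenness centrality are assumptions for illustration, not the regularization scheme of the paper.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
n_residues, n_conditions = 60, 12
# placeholder for relaxation / chemical-shift profiles of a hypothetical protein
profiles = rng.normal(size=(n_residues, n_conditions))

corr = np.corrcoef(profiles)                              # residue-residue correlations
adjacency = (np.abs(corr) > 0.7).astype(int)              # threshold is an assumption
np.fill_diagonal(adjacency, 0)

G = nx.from_numpy_array(adjacency)                        # nodes = residues
betweenness = nx.betweenness_centrality(G)
top = sorted(betweenness, key=betweenness.get, reverse=True)[:5]
print("residues with highest betweenness centrality:", top)
```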

  4. Features of Cross-Correlation Analysis in a Data-Driven Approach for Structural Damage Assessment

    PubMed Central

    Camacho Navarro, Jhonatan; Ruiz, Magda; Villamizar, Rodolfo; Mujica, Luis

    2018-01-01

    This work discusses the advantage of using cross-correlation analysis in a data-driven approach based on principal component analysis (PCA) and piezodiagnostics to obtain successful diagnosis of events in structural health monitoring (SHM). In this sense, the identification of noisy data and outliers, as well as the management of data cleansing stages can be facilitated through the implementation of a preprocessing stage based on cross-correlation functions. Additionally, this work evidences an improvement in damage detection when the cross-correlation is included as part of the whole damage assessment approach. The proposed methodology is validated by processing data measurements from piezoelectric devices (PZT), which are used in a piezodiagnostics approach based on PCA and baseline modeling. Thus, the influence of cross-correlation analysis used in the preprocessing stage is evaluated for damage detection by means of statistical plots and self-organizing maps. Three laboratory specimens were used as test structures in order to demonstrate the validity of the methodology: (i) a carbon steel pipe section with leak and mass damage types, (ii) an aircraft wing specimen, and (iii) a blade of a commercial aircraft turbine, where damages are specified as mass-added. As the main concluding remark, the suitability of cross-correlation features combined with a PCA-based piezodiagnostic approach in order to achieve a more robust damage assessment algorithm is verified for SHM tasks. PMID:29762505
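
    A minimal sketch of the processing chain described above is given below: piezoelectric records are summarized by their cross-correlation with the excitation, a PCA baseline is fitted to healthy-state features, and the Q statistic (squared prediction error) flags departures from the baseline. The signals, decimation factor, and number of components are placeholders, not the experimental settings of the study.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
n_samples, n_records = 1024, 40
excitation = np.sin(2 * np.pi * 5 * np.linspace(0, 1, n_samples))

def features(record):
    """Cross-correlation of a PZT record with the excitation (preprocessing stage)."""
    xcorr = np.correlate(record, excitation, mode="full")
    return xcorr[::8]                        # decimate to a manageable feature vector

healthy = np.array([features(excitation + 0.05 * rng.normal(size=n_samples))
                    for _ in range(n_records)])

pca = PCA(n_components=5).fit(healthy)       # baseline (healthy-state) model

def q_statistic(x):
    """Squared prediction error of a feature vector against the PCA baseline."""
    residual = x - pca.inverse_transform(pca.transform(x[None, :]))[0]
    return float(residual @ residual)

baseline_q = np.array([q_statistic(h) for h in healthy])
damaged = features(0.7 * excitation + 0.05 * rng.normal(size=n_samples))
print(f"healthy Q (95th pct): {np.percentile(baseline_q, 95):.3f}")
print(f"'damaged' record Q:   {q_statistic(damaged):.3f}")
```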

  5. Collision-induced dissociative chemical cross-linking reagents and methodology: Applications to protein structural characterization using tandem mass spectrometry analysis.

    PubMed

    Soderblom, Erik J; Goshe, Michael B

    2006-12-01

    Chemical cross-linking combined with mass spectrometry is a viable approach to study the low-resolution structure of protein and protein complexes. However, unambiguous identification of the residues involved in a cross-link remains analytically challenging. To enable a more effective analysis across various MS platforms, we have developed a novel set of collision-induced dissociative cross-linking reagents and methodology for chemical cross-linking experiments using tandem mass spectrometry (CID-CXL-MS/MS). These reagents incorporate a single gas-phase cleavable bond within their linker region that can be selectively fragmented within the in-source region of the mass spectrometer, enabling independent MS/MS analysis for each peptide. Initial design concepts were characterized using a synthesized cross-linked peptide complex. Following verification and subsequent optimization of cross-linked peptide complex dissociation, our reagents were applied to homodimeric glutathione S-transferase and monomeric bovine serum albumin. Cross-linked residues identified by our CID-CXL-MS/MS method were in agreement with published crystal structures and previous cross-linking studies using conventional approaches. Common LC/MS/MS acquisition approaches such as data-dependent acquisition experiments using ion trap mass spectrometers and product ion spectral analysis using SEQUEST were shown to be compatible with our CID-CXL-MS/MS reagents, obviating the requirement for high resolution and high mass accuracy measurements to identify both intra- and interpeptide cross-links.

  6. Applications of artificial neural network in AIDS research and therapy.

    PubMed

    Sardari, S; Sardari, D

    2002-01-01

    In recent years considerable effort has been devoted to applying pattern recognition techniques to the complex task of data analysis in drug research. Artificial neural networks (ANN) methodology is a modeling method with great ability to adapt to a new situation, or control an unknown system, using data acquired in previous experiments. In this paper, a brief history of ANN and the basic concepts behind the computing, the mathematical and algorithmic formulation of each of the techniques, and their developmental background is presented. Based on the abilities of ANNs in pattern recognition and estimation of system outputs from the known inputs, the neural network can be considered as a tool for molecular data analysis and interpretation. Analysis by neural networks improves the classification accuracy, data quantification and reduces the number of analogues necessary for correct classification of biologically active compounds. Conformational analysis and quantifying the components in mixtures using NMR spectra, aqueous solubility prediction and structure-activity correlation are among the reported applications of ANN as a new modeling method. Ranging from drug design and discovery to structure and dosage form design, the potential pharmaceutical applications of the ANN methodology are significant. In the areas of clinical monitoring, utilization of molecular simulation and design of bioactive structures, ANN would make the study of the status of the health and disease possible and brings their predicted chemotherapeutic response closer to reality.

  7. Features of Cross-Correlation Analysis in a Data-Driven Approach for Structural Damage Assessment.

    PubMed

    Camacho Navarro, Jhonatan; Ruiz, Magda; Villamizar, Rodolfo; Mujica, Luis; Quiroga, Jabid

    2018-05-15

    This work discusses the advantage of using cross-correlation analysis in a data-driven approach based on principal component analysis (PCA) and piezodiagnostics to obtain successful diagnosis of events in structural health monitoring (SHM). In this sense, the identification of noisy data and outliers, as well as the management of data cleansing stages can be facilitated through the implementation of a preprocessing stage based on cross-correlation functions. Additionally, this work evidences an improvement in damage detection when the cross-correlation is included as part of the whole damage assessment approach. The proposed methodology is validated by processing data measurements from piezoelectric devices (PZT), which are used in a piezodiagnostics approach based on PCA and baseline modeling. Thus, the influence of cross-correlation analysis used in the preprocessing stage is evaluated for damage detection by means of statistical plots and self-organizing maps. Three laboratory specimens were used as test structures in order to demonstrate the validity of the methodology: (i) a carbon steel pipe section with leak and mass damage types, (ii) an aircraft wing specimen, and (iii) a blade of a commercial aircraft turbine, where damages are specified as mass-added. As the main concluding remark, the suitability of cross-correlation features combined with a PCA-based piezodiagnostic approach in order to achieve a more robust damage assessment algorithm is verified for SHM tasks.

  8. A mechanics framework for a progressive failure methodology for laminated composites

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Allen, David H.; Lo, David C.

    1989-01-01

    A laminate strength and life prediction methodology has been postulated for laminated composites which accounts for the progressive development of microstructural damage to structural failure. A damage dependent constitutive model predicts the stress redistribution in an average sense that accompanies damage development in laminates. Each mode of microstructural damage is represented by a second-order tensor valued internal state variable which is a strain like quantity. The mechanics framework together with the global-local strategy for predicting laminate strength and life is presented in the paper. The kinematic effects of damage are represented by effective engineering moduli in the global analysis and the results of the global analysis provide the boundary conditions for the local ply level stress analysis. Damage evolution laws are based on experimental results.

  9. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards is done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: There exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  10. Systems identification technology development for large space systems

    NASA Technical Reports Server (NTRS)

    Armstrong, E. S.

    1982-01-01

    A methodology for synthesizing systems identification (both parameter and state estimation) and related control schemes for flexible aerospace structures is developed, with emphasis on the Maypole hoop-column antenna as a real-world application. Modeling studies of the Maypole cable-hoop, membrane-type antenna are conducted using a transfer matrix numerical analysis approach. This methodology was chosen as particularly well suited for handling a large number of antenna configurations of a generic type. A dedicated transfer matrix analysis, both by virtue of its specialization and the inherently easy compartmentalization of the formulation and numerical procedures, is significantly more efficient not only in the computer time required but, more importantly, in the time needed to review and interpret the results.

  11. Assessing directed evolution methods for the generation of biosynthetic enzymes with potential in drug biosynthesis

    PubMed Central

    Nannemann, David P; Birmingham, William R; Scism, Robert A; Bachmann, Brian O

    2011-01-01

    To address the synthesis of increasingly structurally diverse small-molecule drugs, methods for the generation of efficient and selective biological catalysts are becoming increasingly important. ‘Directed evolution’ is an umbrella term referring to a variety of methods for improving or altering the function of enzymes using a nature-inspired twofold strategy of mutagenesis followed by selection. This article provides an objective assessment of the effectiveness of directed evolution campaigns in generating enzymes with improved catalytic parameters for new substrates from the last decade, excluding studies that aimed to select for only improved physical properties and those that lack kinetic characterization. An analysis of the trends of methodologies and their success rates from 81 qualifying examples in the literature reveals the average fold improvement for kcat (or Vmax), Km and kcat/Km to be 366-, 12- and 2548-fold, respectively, whereas the median fold improvements are 5.4, 3 and 15.6. Further analysis by enzyme class, library-generation methodology and screening methodology explores relationships between successful campaigns and the methodologies employed. PMID:21644826

  12. NEW SAMPLING THEORY FOR MEASURING ECOSYSTEM STRUCTURE

    EPA Science Inventory

    This research considered the application of systems analysis to the study of laboratory ecosystems. The work concerned the development of a methodology which was shown to be useful in the design of laboratory experiments, the processing and interpretation of the results of these ...

  13. Development of design and analysis methodology for composite bolted joints

    NASA Astrophysics Data System (ADS)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/+/-45/90) family of laminates using filled hole and unnotched test data.

  14. Morphological Variation in the Adult Hard Palate and Posterior Pharyngeal Wall

    PubMed Central

    Lammert, Adam; Proctor, Michael; Narayanan, Shrikanth

    2013-01-01

    Purpose Adult human vocal tracts display considerable morphological variation across individuals, but the nature and extent of this variation has not been extensively studied for many vocal tract structures. There exists a need to analyze morphological variation and, even more basically, to develop a methodology for morphological analysis of the vocal tract. Such analysis will facilitate fundamental characterization of the speech production system, with broad implications from modeling to explaining inter-speaker variability. Method A data-driven methodology to automatically analyze the extent and variety of morphological variation is proposed and applied to a diverse subject pool of 36 adults. Analysis is focused on two key aspects of vocal tract structure: the midsagittal shape of the hard palate and the posterior pharyngeal wall. Result Palatal morphology varies widely in its degree of concavity, but also in anteriority and sharpness. Pharyngeal wall morphology, by contrast, varies mostly in terms of concavity alone. The distribution of morphological characteristics is complex, and analysis suggests that certain variations may be categorical in nature. Conclusion Major modes of morphological variation are identified, including their relative magnitude, distribution and categorical nature. Implications of these findings for speech articulation strategies and speech acoustics are discussed. PMID:23690566

  15. Supercritical fluid extraction and ultra performance liquid chromatography of respiratory quinones for microbial community analysis in environmental and biological samples.

    PubMed

    Hanif, Muhammad; Atsuta, Yoichi; Fujie, Koichi; Daimon, Hiroyuki

    2012-03-05

    Microbial community structure plays a significant role in environmental assessment and animal health management. The development of a superior analytical strategy for the characterization of microbial community structure is an ongoing challenge. In this study, we developed an effective supercritical fluid extraction (SFE) and ultra performance liquid chromatography (UPLC) method for the analysis of bacterial respiratory quinones (RQ) in environmental and biological samples. RQ profile analysis is one of the most widely used culture-independent tools for characterizing microbial community structure. A UPLC equipped with a photo diode array (PDA) detector was successfully applied to the simultaneous determination of ubiquinones (UQ) and menaquinones (MK) without tedious pretreatment. Supercritical carbon dioxide (scCO(2)) extraction with the solid-phase cartridge trap proved to be a more effective and rapid method for extracting respiratory quinones, compared to a conventional organic solvent extraction method. This methodology leads to a successful analytical procedure that involves a significant reduction in the complexity and sample preparation time. Application of the optimized methodology to characterize microbial communities based on the RQ profile was demonstrated for a variety of environmental samples (activated sludge, digested sludge, and compost) and biological samples (swine and Japanese quail feces).

  16. Super-Resolution Imaging Strategies for Cell Biologists Using a Spinning Disk Microscope

    PubMed Central

    Hosny, Neveen A.; Song, Mingying; Connelly, John T.; Ameer-Beg, Simon; Knight, Martin M.; Wheeler, Ann P.

    2013-01-01

    In this study we use a spinning disk confocal microscope (SD) to generate super-resolution images of multiple cellular features from any plane in the cell. We obtain super-resolution images by using stochastic intensity fluctuations of biological probes, combining Photoactivated Localization Microscopy (PALM) and Stochastic Optical Reconstruction Microscopy (STORM) methodologies. We compared different image analysis algorithms for processing super-resolution data to identify the most suitable for analysis of particular cell structures. SOFI was chosen for X and Y and was able to achieve a resolution of ca. 80 nm; however, higher resolution (>30 nm) was possible, dependent on the super-resolution image analysis algorithm used. Our method uses low laser power and fluorescent probes which are available either commercially or through the scientific community, and therefore it is gentle enough for biological imaging. Through comparative studies with structured illumination microscopy (SIM) and widefield epifluorescence imaging we identified that our methodology was advantageous for imaging cellular structures which are not immediately at the cell-substrate interface, including the nuclear architecture and mitochondria. We have shown that it is possible to obtain two-colour images, which highlights the potential this technique has for high-content screening, imaging of multiple epitopes and live cell imaging. PMID:24130668
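
    The fluctuation-based part of the approach can be illustrated with a second-order SOFI-style computation, in which each pixel of the output image is the temporal variance (second-order auto-cumulant) of its intensity trace. The synthetic image stack below is a placeholder for a blinking-probe movie; cross-cumulants, higher orders, and deconvolution are omitted.

```python
import numpy as np

rng = np.random.default_rng(5)
n_frames, h, w = 500, 64, 64
# placeholder fluctuating image stack (frames x height x width)
stack = rng.poisson(lam=5.0, size=(n_frames, h, w)).astype(float)

mean_image = stack.mean(axis=0)                # conventional (widefield-like) average
fluctuations = stack - mean_image
sofi2 = np.mean(fluctuations ** 2, axis=0)     # second-order auto-cumulant per pixel

print("mean image dynamic range:  ", float(np.ptp(mean_image)))
print("SOFI-2 image dynamic range:", float(np.ptp(sofi2)))
```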

  17. Integrated dynamic analysis simulation of space stations with controllable solar array

    NASA Technical Reports Server (NTRS)

    Heinrichs, J. A.; Fee, J. J.

    1972-01-01

    A methodology is formulated and presented for the integrated structural dynamic analysis of space stations with controllable solar arrays and non-controllable appendages. The structural system flexibility characteristics are considered in the dynamic analysis by a synthesis technique whereby free-free space station modal coordinates and cantilever appendage coordinates are inertially coupled. A digital simulation of this analysis method is described and verified by comparison of interaction load solutions with other methods of solution. Motion equations are simulated for both the zero gravity and artificial gravity (spinning) orbital conditions. Closed loop controlling dynamics for both orientation control of the arrays and attitude control of the space station are provided in the simulation by various generic types of controlling systems. The capability of the simulation as a design tool is demonstrated by utilizing typical space station and solar array structural representations and a specific structural perturbing force. Response and interaction load solutions are presented for this structural configuration and indicate the importance of using an integrated type analysis for the predictions of structural interactions.

  18. Spatial genetic analyses reveal cryptic population structure and migration patterns in a continuously harvested grey wolf (Canis lupus) population in north-eastern Europe.

    PubMed

    Hindrikson, Maris; Remm, Jaanus; Männil, Peep; Ozolins, Janis; Tammeleht, Egle; Saarma, Urmas

    2013-01-01

    Spatial genetics is a relatively new field in wildlife and conservation biology that is becoming an essential tool for unravelling the complexities of animal population processes, and for designing effective strategies for conservation and management. Conceptual and methodological developments in this field are therefore critical. Here we present two novel methodological approaches that further the analytical possibilities of STRUCTURE and DResD. Using these approaches we analyse structure and migrations in a grey wolf (Canis lupus) population in north-eastern Europe. We genotyped 16 microsatellite loci in 166 individuals sampled from the wolf population in Estonia and Latvia that has been under strong and continuous hunting pressure for decades. Our analysis demonstrated that this relatively small wolf population is represented by four genetic groups. We also used a novel methodological approach that uses linear interpolation to statistically test the spatial separation of genetic groups. The new method, which is capable of using program STRUCTURE output, can be applied widely in population genetics to reveal both core areas and areas of low significance for genetic groups. We also used a recently developed spatially explicit individual-based method DResD, and applied it for the first time to microsatellite data, revealing a migration corridor and barriers, and several contact zones.

  19. Evaluation of Structural Robustness against Column Loss: Methodology and Application to RC Frame Buildings

    PubMed Central

    Bao, Yihai; Main, Joseph A.; Noh, Sam-Young

    2017-01-01

    A computational methodology is presented for evaluating structural robustness against column loss. The methodology is illustrated through application to reinforced concrete (RC) frame buildings, using a reduced-order modeling approach for three-dimensional RC framing systems that includes the floor slabs. Comparisons with high-fidelity finite-element model results are presented to verify the approach. Pushdown analyses of prototype buildings under column loss scenarios are performed using the reduced-order modeling approach, and an energy-based procedure is employed to account for the dynamic effects associated with sudden column loss. Results obtained using the energy-based approach are found to be in good agreement with results from direct dynamic analysis of sudden column loss. A metric for structural robustness is proposed, calculated by normalizing the ultimate capacities of the structural system under sudden column loss by the applicable service-level gravity loading and by evaluating the minimum value of this normalized ultimate capacity over all column removal scenarios. The procedure is applied to two prototype 10-story RC buildings, one employing intermediate moment frames (IMFs) and the other employing special moment frames (SMFs). The SMF building, with its more stringent seismic design and detailing, is found to have greater robustness. PMID:28890599
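
    As a purely illustrative aid, the sketch below works through the normalization described above: each column-removal scenario contributes the ratio of the ultimate capacity under sudden column loss to the applicable service-level gravity load, and the robustness metric is the minimum ratio over all scenarios. The capacities and loads are hypothetical placeholders, not values from the cited study.

```python
# Hedged sketch of the robustness metric described above: the ultimate
# capacity under each sudden-column-loss scenario is normalized by the
# applicable service-level gravity load, and the robustness index is the
# minimum of these ratios over all scenarios. Values are placeholders.

scenarios = {
    # scenario name: (ultimate capacity under sudden column loss [kN],
    #                 applicable service-level gravity load [kN])
    "corner column":   (5200.0, 3100.0),
    "edge column":     (6400.0, 3400.0),
    "interior column": (7900.0, 3600.0),
}

def robustness_index(scenarios):
    """Minimum normalized ultimate capacity over all column-removal scenarios."""
    ratios = {name: p_ult / p_service
              for name, (p_ult, p_service) in scenarios.items()}
    critical = min(ratios, key=ratios.get)
    return ratios[critical], critical, ratios

index, critical, ratios = robustness_index(scenarios)
print(f"governing scenario: {critical}, robustness index = {index:.2f}")
```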

  20. Probability-based methodology for buckling investigation of sandwich composite shells with and without cut-outs

    NASA Astrophysics Data System (ADS)

    Alfano, M.; Bisagni, C.

    2017-01-01

    The objective of the running EU project DESICOS (New Robust DESign Guideline for Imperfection Sensitive COmposite Launcher Structures) is to formulate an improved shell design methodology in order to meet the demand of the aerospace industry for lighter structures. Within the project, this article discusses the development of a probability-based methodology developed at Politecnico di Milano. It is based on the combination of the Stress-Strength Interference Method and the Latin Hypercube Method with the aim of predicting the buckling response of three sandwich composite cylindrical shells, assuming a loading condition of pure compression. The three shells are made of the same material, but have different stacking sequence and geometric dimensions. One of them presents three circular cut-outs. Different types of input imperfections, treated as random variables, are taken into account independently and in combination: variability in longitudinal Young's modulus, ply misalignment, geometric imperfections, and boundary imperfections. The methodology enables a first assessment of the structural reliability of the shells through the calculation of a probabilistic buckling factor for a specified level of probability. The factor depends highly on the reliability level, on the number of adopted samples, and on the assumptions made in modeling the input imperfections. The main advantage of the developed procedure is its versatility, as it can be applied to the buckling analysis of laminated composite shells and sandwich composite shells including different types of imperfections.
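
    A minimal sketch of the sampling ingredient is given below, assuming SciPy's qmc.LatinHypercube generator is available: imperfection variables are sampled by Latin Hypercube sampling, propagated through a surrogate buckling model, and a probabilistic buckling factor is read off as a low percentile of the resulting loads. The surrogate model and all numbers are placeholders standing in for the finite element buckling analyses of the study.

```python
# Hedged sketch: imperfection parameters treated as random variables are
# sampled with Latin Hypercube sampling and propagated through a placeholder
# buckling surrogate; the probabilistic buckling factor is a low percentile
# of the normalized buckling loads. All numbers are illustrative.
import numpy as np
from scipy.stats import qmc

n_samples = 200

# Imperfection variables: [Young's modulus scatter, ply misalignment (deg),
# geometric imperfection amplitude / thickness]
l_bounds = [0.95, -2.0, 0.0]
u_bounds = [1.05,  2.0, 0.5]

sampler = qmc.LatinHypercube(d=3, seed=0)
samples = qmc.scale(sampler.random(n=n_samples), l_bounds, u_bounds)

def buckling_load(e_ratio, misalign_deg, imperfection):
    """Placeholder surrogate: perfect-shell load degraded by imperfections."""
    p_perfect = 1.0  # normalized buckling load of the perfect shell
    return p_perfect * e_ratio * (1.0 - 0.02 * abs(misalign_deg)) \
                               * (1.0 - 0.4 * imperfection)

loads = np.array([buckling_load(*x) for x in samples])

# Probabilistic buckling factor for a 99% reliability level (1st percentile).
factor = np.percentile(loads, 1.0)
print(f"probabilistic buckling factor (99% level): {factor:.3f}")
```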

  1. Sustainability of transport structures - some aspects of the nonlinear reliability assessment

    NASA Astrophysics Data System (ADS)

    Pukl, Radomír; Sajdlová, Tereza; Strauss, Alfred; Lehký, David; Novák, Drahomír

    2017-09-01

    Efficient techniques for both nonlinear numerical analysis of concrete structures and advanced stochastic simulation methods have been combined in order to offer an advanced tool for the assessment of the realistic behaviour, failure and safety of transport structures. The utilized approach is based on randomization of the nonlinear finite element analysis of the structural models. Degradation aspects such as carbonation of concrete can be accounted for in order to predict the durability of the investigated structure and its sustainability. Results can serve as a rational basis for performance and sustainability assessment based on advanced nonlinear computer analysis of structures of transport infrastructure such as bridges or tunnels. In the stochastic simulation, the input material parameters obtained from material tests, including their randomness and uncertainty, are represented as random variables or fields. Appropriate identification of material parameters is crucial for the virtual failure modelling of structures and structural elements. Inverse analysis using artificial neural networks and virtual stochastic simulations is applied to determine the fracture-mechanical parameters of the structural material and its numerical model. Structural response, reliability and sustainability have been investigated on different types of transport structures made from various materials using the above-mentioned methodology and tools.

  2. A new methodology based on sensitivity analysis to simplify the recalibration of functional-structural plant models in new conditions.

    PubMed

    Mathieu, Amélie; Vidal, Tiphaine; Jullien, Alexandra; Wu, QiongLi; Chambon, Camille; Bayol, Benoit; Cournède, Paul-Henry

    2018-06-19

    Functional-structural plant models (FSPMs) describe explicitly the interactions between plants and their environment at organ to plant scale. However, the high level of description of the structure or model mechanisms makes this type of model very complex and hard to calibrate. A two-step methodology to facilitate the calibration process is proposed here. First, a global sensitivity analysis method was applied to the calibration loss function. It provided first-order and total-order sensitivity indexes that allow parameters to be ranked by importance in order to select the most influential ones. Second, the Akaike information criterion (AIC) was used to quantify the model's quality of fit after calibration with different combinations of selected parameters. The model with the lowest AIC gives the best combination of parameters to select. This methodology was validated by calibrating the model on an independent data set (same cultivar, another year) with the parameters selected in the second step. All the parameters were set to their nominal value; only the most influential ones were re-estimated. Sensitivity analysis applied to the calibration loss function is a relevant method to underline the most significant parameters in the estimation process. For the studied winter oilseed rape model, 11 out of 26 estimated parameters were selected. Then, the model could be recalibrated for a different data set by re-estimating only three parameters selected with the model selection method. Fitting only a small number of parameters dramatically increases the efficiency of recalibration, increases the robustness of the model and helps identify the principal sources of variation in varying environmental conditions. This innovative method still needs to be more widely validated but already gives interesting avenues to improve the calibration of FSPMs.
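
    The sketch below illustrates, under stated assumptions, the second step of the methodology: candidate subsets of the sensitivity-ranked parameters are compared through the Akaike information criterion computed from a least-squares fit, and the subset with the lowest AIC is retained. The residual sums of squares and parameter names are hypothetical placeholders, not results from the oilseed rape model.

```python
# Hedged sketch of AIC-based selection among candidate parameter subsets.
# The residual sums of squares below stand in for the recalibration results
# of the FSPM; only the selection logic is shown.
import math

def aic_least_squares(rss, n_obs, k_params):
    """AIC for a least-squares fit with k estimated parameters."""
    return n_obs * math.log(rss / n_obs) + 2 * k_params

n_obs = 120  # number of calibration observations (placeholder)

# candidate subsets of sensitivity-ranked parameters -> RSS after refitting
candidates = {
    ("p1",):                  95.0,
    ("p1", "p2"):             62.0,
    ("p1", "p2", "p3"):       45.0,
    ("p1", "p2", "p3", "p4"): 44.0,  # barely better: extra parameter is penalized
}

scores = {subset: aic_least_squares(rss, n_obs, len(subset))
          for subset, rss in candidates.items()}
best = min(scores, key=scores.get)
print("best subset by AIC:", best)
```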

  3. Towards the unification of inference structures in medical diagnostic tasks.

    PubMed

    Mira, J; Rives, J; Delgado, A E; Martínez, R

    1998-01-01

    The central purpose of artificial intelligence applied to medicine is to develop models for diagnosis and therapy planning at the knowledge level, in the Newell sense, and software environments to facilitate the reduction of these models to the symbol level. The usual methodology (KADS, Common-KADS, GAMES, HELIOS, Protégé, etc) has been to develop libraries of generic tasks and reusable problem-solving methods with explicit ontologies. The principal problem which clinicians have with these methodological developments concerns the diversity and complexity of new terms whose meaning is not sufficiently clear, precise, unambiguous and consensual for them to be accessible in the daily clinical environment. As a contribution to the solution of this problem, we develop in this article the conjecture that one inference structure is enough to describe the set of analysis tasks associated with medical diagnoses. To this end, we first propose a modification of the systematic diagnostic inference scheme to obtain an analysis generic task and then compare it with the monitoring and the heuristic classification task inference schemes using as comparison criteria the compatibility of domain roles (data structures), the similarity in the inferences, and the commonality in the set of assumptions which underlie the functionally equivalent models. The equivalences proposed are illustrated with several examples. Note that though our ongoing work aims to simplify the methodology and to increase the precision of the terms used, the proposal presented here should be viewed more in the nature of a conjecture.

  4. The temporal structure of pollution levels in developed cities.

    PubMed

    Barrigón Morillas, Juan Miguel; Ortiz-Caraballo, Carmen; Prieto Gajardo, Carlos

    2015-06-01

    Currently, the need for mobility can cause significant pollution levels in cities, with important effects on health and quality of life. Any approach to the study of urban pollution and its effects requires an analysis of spatial distribution and temporal variability. A key challenge is to obtain proven methodologies that improve the quality of predictions while saving resources in spatial and temporal sampling. This work proposes a new analytical methodology for the study of temporal structure. As a result, a model for estimating annual levels of urban traffic noise is proposed. The average errors are less than one decibel for all acoustic indicators. This opens a new working methodology for urban noise studies. Additionally, the approach has general application to the study of the impacts of pollution associated with traffic, with implications for urban design and possibly for economic and sociological aspects. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Life-cycle cost as basis to optimize waste collection in space and time: A methodology for obtaining a detailed cost breakdown structure.

    PubMed

    Sousa, Vitor; Dias-Ferreira, Celia; Vaz, João M; Meireles, Inês

    2018-05-01

    Extensive research has been carried out on waste collection costs, mainly to differentiate the costs of distinct waste streams and to optimize waste collection services spatially (e.g. routes, number, and location of waste facilities). However, waste collection managers also face the challenge of optimizing assets in time, for instance deciding when to replace and how to maintain, or which technological solution to adopt. These issues require more detailed knowledge of the waste collection services' cost breakdown structure. The present research adjusts the methodology for buildings' life-cycle cost (LCC) analysis, detailed in ISO 15686-5:2008, to waste collection assets. The proposed methodology is then applied to the waste collection assets owned and operated by a real municipality in Portugal (Cascais Ambiente - EMAC). The goal is to highlight the potential of the LCC tool in providing a baseline for time optimization of the waste collection service and assets, namely assisting decisions regarding equipment operation and replacement.

  6. Introduction to a special issue on concept mapping.

    PubMed

    Trochim, William M; McLinden, Daniel

    2017-02-01

    Concept mapping was developed in the 1980s as a unique integration of qualitative (group process, brainstorming, unstructured sorting, interpretation) and quantitative (multidimensional scaling, hierarchical cluster analysis) methods designed to enable a group of people to articulate and depict graphically a coherent conceptual framework or model of any topic or issue of interest. This introduction provides the basic definition and description of the methodology for the newcomer and describes the steps typically followed in its most standard canonical form (preparation, generation, structuring, representation, interpretation and utilization). It also introduces this special issue which reviews the history of the methodology, describes its use in a variety of contexts, shows the latest ways it can be integrated with other methodologies, considers methodological advances and developments, and sketches a vision of the future of the method's evolution. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. An example of requirements for Advanced Subsonic Civil Transport (ASCT) flight control system using structured techniques

    NASA Technical Reports Server (NTRS)

    Mclees, Robert E.; Cohen, Gerald C.

    1991-01-01

    The requirements are presented for an Advanced Subsonic Civil Transport (ASCT) flight control system generated using structured techniques. The requirements definition starts with a mission analysis to identify the high-level control system requirements and functions necessary to accomplish the mission. The result of the study is an example set of control system requirements partially represented using a derivative of Yourdon's structured techniques. Also provided is a research focus for studying structured design methodologies and in particular design-for-validation philosophies.

  8. Topology synthesis and size optimization of morphing wing structures

    NASA Astrophysics Data System (ADS)

    Inoyama, Daisaku

    This research demonstrates a novel topology and size optimization methodology for synthesis of distributed actuation systems with specific applications to morphing air vehicle structures. The main emphasis is placed on the topology and size optimization problem formulations and the development of computational modeling concepts. The analysis model is developed to meet several important criteria: It must allow a rigid-body displacement, as well as a variation in planform area, with minimum strain on structural members while retaining acceptable numerical stability for finite element analysis. Topology optimization is performed on a semi-ground structure with design variables that control the system configuration. In effect, the optimization process assigns morphing members as "soft" elements, non-morphing load-bearing members as "stiff" elements, and non-existent members as "voids." The optimization process also determines the optimum actuator placement, where each actuator is represented computationally by equal and opposite nodal forces with soft axial stiffness. In addition, the configuration of attachments that connect the morphing structure to a non-morphing structure is determined simultaneously. Several different optimization problem formulations are investigated to understand their potential benefits in solution quality, as well as meaningfulness of the formulations. Extensions and enhancements to the initial concept and problem formulations are made to accommodate multiple-configuration definitions. In addition, the principal issues on the external-load dependency and the reversibility of a design, as well as the appropriate selection of a reference configuration, are addressed in the research. The methodology to control actuator distributions and concentrations is also discussed. Finally, the strategy to transfer the topology solution to the sizing optimization is developed and cross-sectional areas of existent structural members are optimized under applied aerodynamic loads. That is, the optimization process is implemented in sequential order: The actuation system layout is first determined through a multi-disciplinary topology optimization process, and then the thickness or cross-sectional area of each existent member is optimized under given constraints and boundary conditions. Sample problems are solved to demonstrate the potential capabilities of the presented methodology. The research demonstrates an innovative structural design procedure from a computational perspective and offers new insights into the potential design requirements and characteristics of morphing structures.

  9. Assessment of Material Solutions of Multi-level Garage Structure Within Integrated Life Cycle Design Process

    NASA Astrophysics Data System (ADS)

    Wałach, Daniel; Sagan, Joanna; Gicala, Magdalena

    2017-10-01

    The paper presents an environmental and economic analysis of the material solutions for a multi-level garage. The construction project considered a reinforced concrete structure built with either ordinary concrete or high-performance concrete (HPC). The use of HPC allowed a significant reduction of reinforcing steel, mainly in the compression elements (columns) of the structure. The analysis includes elements of the methodology of integrated life cycle design (ILCD). Through a multi-criteria analysis based on established weights for the economic and environmental parameters, three solutions were evaluated and compared within the material production phase (information modules A1-A3).

  10. Geostatistics for spatial genetic structures: study of wild populations of perennial ryegrass.

    PubMed

    Monestiez, P; Goulard, M; Charmet, G

    1994-04-01

    Methods based on geostatistics were applied to quantitative traits of agricultural interest measured on a collection of 547 wild populations of perennial ryegrass in France. The mathematical background of these methods, which resembles spatial autocorrelation analysis, is briefly described. When a single variable is studied, the spatial structure analysis is similar to spatial autocorrelation analysis, and a spatial prediction method, called "kriging", gives a filtered map of the spatial pattern over all the sampled area. When complex interactions of agronomic traits with different evaluation sites define a multivariate structure for the spatial analysis, geostatistical methods allow the spatial variations to be broken down into two main spatial structures with ranges of 120 km and 300 km, respectively. The predicted maps that corresponded to each range were interpreted as a result of the isolation-by-distance model and as a consequence of selection by environmental factors. Practical collecting methodology for breeders may be derived from such spatial structures.
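
    As an illustrative sketch of the geostatistical building block behind kriging, the code below estimates an empirical semivariogram by binning pairwise half squared differences of a trait by separation distance. The coordinates and trait values are synthetic placeholders, not the ryegrass collection data.

```python
# Minimal sketch of an empirical semivariogram, the quantity usually fitted
# before kriging: gamma(h) = mean of 0.5*(z_i - z_j)^2 over pairs whose
# separation falls in each distance bin. Data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
coords = rng.uniform(0, 300, size=(200, 2))       # site coordinates (km)
values = np.sin(coords[:, 0] / 60.0) + 0.2 * rng.standard_normal(200)

def empirical_semivariogram(coords, values, bin_edges):
    """Empirical semivariogram over distance bins defined by bin_edges."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)        # each pair counted once
    d, sq = d[iu], sq[iu]
    gamma = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (d >= lo) & (d < hi)
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.asarray(gamma)

bins = np.linspace(0, 300, 13)
print(np.round(empirical_semivariogram(coords, values, bins), 3))
```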

  11. Analysis of shell-type structures subjected to time-dependent mechanical and thermal loading

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.; Riff, R.

    1987-01-01

    A general mathematical model and solution methodologies for analyzing structural response of thin, metallic shell-type structures under large transient, cyclic, or static thermomechanical loads are developed. Among the system responses, which are associated with these load conditions, are thermal buckling, creep buckling and ratcheting. Thus, geometric as well as material type nonlinearities (of high order) can be anticipated and must be considered in the development of the mathematical model. Furthermore, this must also be accommodated in the solution procedures.

  12. Analysis of shell-type structures subjected to time-dependent mechanical and thermal loading

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.; Carlson, R. L.; Riff, R.

    1987-01-01

    A general mathematical model and solution methodologies are being developed for analyzing structural response of thin, metallic shell-type structures under large transient, cyclic, or static thermomechanical loads. Among the system responses, which were associated with these load conditions, were thermal buckling, creep buckling, and ratcheting. Thus, geometric as well as material-type nonlinearities (of high order) can be anticipated and must be considered in the development of the mathematical model. Furthermore, this must also be accommodated in the solution process.

  13. Functional approximation using artificial neural networks in structural mechanics

    NASA Technical Reports Server (NTRS)

    Alam, Javed; Berke, Laszlo

    1993-01-01

    The artificial neural networks (ANN) methodology is an outgrowth of research in artificial intelligence. In this study, the feed-forward network model that was proposed by Rumelhart, Hinton, and Williams was applied to the mapping of functions that are encountered in structural mechanics problems. Several different network configurations were chosen to train the available data for problems in materials characterization and structural analysis of plates and shells. By using the recall process, the accuracy of these trained networks was assessed.
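
    A minimal sketch of the mapping idea is given below, assuming scikit-learn is available: a feed-forward network is trained on samples of a simple analytical structural response and then recalled on held-out inputs to assess accuracy, in place of the materials characterization and plate/shell problems of the study.

```python
# Hedged sketch: a feed-forward network approximates a simple analytical
# "structural response" and is recalled on held-out data. The response
# function and input ranges are illustrative placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# inputs: span (m) and thickness (mm) of a simple bending member
X = rng.uniform([0.5, 5.0], [2.0, 20.0], size=(500, 2))
y = X[:, 0] ** 3 / X[:, 1]        # deflection-like analytical response

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(32, 32),
                                 max_iter=5000, random_state=0))
net.fit(X_tr, y_tr)
print("recall R^2 on held-out data:", round(net.score(X_te, y_te), 3))
```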

  14. Improving image segmentation performance and quantitative analysis via a computer-aided grading methodology for optical coherence tomography retinal image analysis.

    PubMed

    Debuc, Delia Cabrera; Salinas, Harry M; Ranganathan, Sudarshan; Tátrai, Erika; Gao, Wei; Shen, Meixiao; Wang, Jianhua; Somfai, Gábor M; Puliafito, Carmen A

    2010-01-01

    We demonstrate quantitative analysis and error correction of optical coherence tomography (OCT) retinal images by using a custom-built, computer-aided grading methodology. A total of 60 Stratus OCT (Carl Zeiss Meditec, Dublin, California) B-scans collected from ten normal healthy eyes are analyzed by two independent graders. The average retinal thickness per macular region is compared with the automated Stratus OCT results. Intergrader and intragrader reproducibility is calculated by Bland-Altman plots of the mean difference between both gradings and by Pearson correlation coefficients. In addition, the correlation between Stratus OCT and our methodology-derived thickness is also presented. The mean thickness difference between Stratus OCT and our methodology is 6.53 μm and 26.71 μm when using the inner segment/outer segment (IS/OS) junction and outer segment/retinal pigment epithelium (OS/RPE) junction as the outer retinal border, respectively. Overall, the median of the thickness differences as a percentage of the mean thickness is less than 1% and 2% for the intragrader and intergrader reproducibility test, respectively. The measurement accuracy range of the OCT retinal image analysis (OCTRIMA) algorithm is between 0.27 and 1.47 μm and 0.6 and 1.76 μm for the intragrader and intergrader reproducibility tests, respectively. Pearson correlation coefficients demonstrate R² > 0.98 for all Early Treatment Diabetic Retinopathy Study (ETDRS) regions. Our methodology facilitates a more robust and localized quantification of the retinal structure in normal healthy controls and patients with clinically significant intraretinal features.
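
    The short sketch below illustrates the reproducibility statistics mentioned above, namely the Bland-Altman mean difference with 95% limits of agreement and a Pearson correlation, computed on synthetic placeholder gradings rather than the study's measurements.

```python
# Hedged sketch of Bland-Altman agreement statistics and Pearson correlation
# for two sets of thickness gradings. Arrays are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
grader_a = rng.normal(250.0, 20.0, size=60)              # thickness (um)
grader_b = grader_a + rng.normal(0.5, 2.0, size=60)      # second grading

diff = grader_b - grader_a
mean_diff = diff.mean()
loa = 1.96 * diff.std(ddof=1)                            # limits of agreement
r = np.corrcoef(grader_a, grader_b)[0, 1]

print(f"mean difference: {mean_diff:.2f} um")
print(f"95% limits of agreement: [{mean_diff - loa:.2f}, {mean_diff + loa:.2f}] um")
print(f"Pearson R^2: {r**2:.3f}")
```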

  15. Improving image segmentation performance and quantitative analysis via a computer-aided grading methodology for optical coherence tomography retinal image analysis

    NASA Astrophysics Data System (ADS)

    Cabrera Debuc, Delia; Salinas, Harry M.; Ranganathan, Sudarshan; Tátrai, Erika; Gao, Wei; Shen, Meixiao; Wang, Jianhua; Somfai, Gábor M.; Puliafito, Carmen A.

    2010-07-01

    We demonstrate quantitative analysis and error correction of optical coherence tomography (OCT) retinal images by using a custom-built, computer-aided grading methodology. A total of 60 Stratus OCT (Carl Zeiss Meditec, Dublin, California) B-scans collected from ten normal healthy eyes are analyzed by two independent graders. The average retinal thickness per macular region is compared with the automated Stratus OCT results. Intergrader and intragrader reproducibility is calculated by Bland-Altman plots of the mean difference between both gradings and by Pearson correlation coefficients. In addition, the correlation between Stratus OCT and our methodology-derived thickness is also presented. The mean thickness difference between Stratus OCT and our methodology is 6.53 μm and 26.71 μm when using the inner segment/outer segment (IS/OS) junction and outer segment/retinal pigment epithelium (OS/RPE) junction as the outer retinal border, respectively. Overall, the median of the thickness differences as a percentage of the mean thickness is less than 1% and 2% for the intragrader and intergrader reproducibility test, respectively. The measurement accuracy range of the OCT retinal image analysis (OCTRIMA) algorithm is between 0.27 and 1.47 μm and 0.6 and 1.76 μm for the intragrader and intergrader reproducibility tests, respectively. Pearson correlation coefficients demonstrate R2>0.98 for all Early Treatment Diabetic Retinopathy Study (ETDRS) regions. Our methodology facilitates a more robust and localized quantification of the retinal structure in normal healthy controls and patients with clinically significant intraretinal features.

  16. The Development of Sport Expertise: Mapping the Tactical Domain.

    ERIC Educational Resources Information Center

    McPherson, Sue L.

    1994-01-01

    Explores issues and research relevant to sport tactical knowledge development and expertise. The paper discusses controversies concerning methodological tools, possible levels of analysis in sport research, sport tactical knowledge and expertise, a protocol structure model for sport, and expert-novice sport research. (SM)

  17. Teatime Threats. Choking Incidents at the Evening Meal

    ERIC Educational Resources Information Center

    Guthrie, Susan; Stansfield, Jois

    2017-01-01

    Purpose: To explore caregiver perceptions of the socio-environmental issues around evening meal ("teatime") which influence choking. Methodology: A qualitative study of caregivers witnessing a choking incident was undertaken. Semi-structured interviews explored perceptions of the causes. Data were analysed using thematic analysis.…

  18. Understanding Design Tradeoffs for Health Technologies: A Mixed-Methods Approach

    PubMed Central

    O’Leary, Katie; Eschler, Jordan; Kendall, Logan; Vizer, Lisa M.; Ralston, James D.; Pratt, Wanda

    2017-01-01

    We introduce a mixed-methods approach for determining how people weigh tradeoffs in values related to health and technologies for health self-management. Our approach combines interviews with Q-methodology, a method from psychology uniquely suited to quantifying opinions. We derive the framework for structured data collection and analysis for the Q-methodology from theories of self-management of chronic illness and technology adoption. To illustrate the power of this new approach, we used it in a field study of nine older adults with type 2 diabetes, and nine mothers of children with asthma. Our mixed-methods approach provides three key advantages for health design science in HCI: (1) it provides a structured health sciences theoretical framework to guide data collection and analysis; (2) it enhances the coding of unstructured data with statistical patterns of polarizing and consensus views; and (3) it empowers participants to actively weigh competing values that are most personally significant to them. PMID:28804794

  19. Optimization of extraction process by response surface methodology and preliminary structural analysis of polysaccharides from defatted peanut (Arachis hypogaea) cakes.

    PubMed

    Song, Yi; Du, Bingjian; Zhou, Ting; Han, Bing; Yu, Fei; Yang, Rui; Hu, Xiaosong; Ni, Yuanying; Li, Quanhong

    2011-02-01

    In this work, response surface methodology was used to determine optimum conditions for extraction of polysaccharides from defatted peanut cake. A central composite design with extraction temperature (x1), extraction time (x2), and ethanol concentration (x3) as independent variables was used. The response selected to evaluate the extraction process was polysaccharide yield, and the second-order model obtained for polysaccharide yield had a coefficient of determination of 97.81%. The independent variable with the largest effect on the response was ethanol concentration (x3). The optimum extraction conditions were found to be an extraction temperature of 48.7°C, an extraction time of 1.52 h, and an ethanol concentration of 61.9% (v/v). Under these conditions, the polysaccharide extraction yield increased to 25.89%. The results of structural analysis showed that the main component of the defatted peanut cake polysaccharide was α-galactose. Copyright © 2010 Elsevier Ltd. All rights reserved.
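
    As an illustrative sketch of the response surface step, the code below fits a full second-order polynomial to yield data over temperature, time and ethanol concentration and searches the fitted surface for its maximum within the experimental ranges; the design points and the underlying surface are synthetic placeholders, not the published central composite design.

```python
# Hedged sketch of a second-order response surface fit and optimum search.
# The "true_yield" surface and the design points are illustrative only.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# x1: temperature (C), x2: time (h), x3: ethanol concentration (% v/v)
X = rng.uniform([40.0, 1.0, 50.0], [60.0, 2.0, 70.0], size=(20, 3))

def true_yield(x):  # illustrative surface with an interior optimum
    return (26.0 - 0.01 * (x[:, 0] - 49.0) ** 2
                 - 4.0 * (x[:, 1] - 1.5) ** 2
                 - 0.01 * (x[:, 2] - 62.0) ** 2)

y = true_yield(X) + rng.normal(0.0, 0.2, size=len(X))

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)
print("R^2 of second-order model:", round(model.score(X, y), 3))

res = minimize(lambda x: -model.predict(x.reshape(1, -1))[0],
               x0=[50.0, 1.5, 60.0],
               bounds=[(40, 60), (1, 2), (50, 70)])
print("predicted optimum (T, t, EtOH%):", np.round(res.x, 2))
print("predicted yield at optimum:", round(-res.fun, 2))
```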

  20. Critical Protection Item classification for a waste processing facility at Savannah River Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ades, M.J.; Garrett, R.J.

    1993-10-01

    This paper describes the methodology for Critical Protection Item (CPI) classification and its application to the Structures, Systems and Components (SSC) of a waste processing facility at the Savannah River Site (SRS). The WSRC methodology for CPI classification includes the evaluation of the radiological and non-radiological consequences resulting from postulated accidents at the waste processing facility and comparison of these consequences with allowable limits. The types of accidents considered include explosions and fire in the facility and postulated accidents due to natural phenomena, including earthquakes, tornadoes, and high velocity straight winds. The radiological analysis results indicate that CPIs are not required at the waste processing facility to mitigate the consequences of radiological release. The non-radiological analysis, however, shows that the Waste Storage Tank (WST) and the dike spill containment structures around the formic acid tanks in the cold chemical feed area and waste treatment area of the facility should be identified as CPIs. Accident mitigation options are provided and discussed.

  1. [Problem-based learning, a strategy to employ it].

    PubMed

    Guillamet Lloveras, Ana; Celma Vicente, Matilde; González Carrión, Pilar; Cano-Caballero Gálvez, Ma Dolores; Pérez Ramírez, Francisca

    2009-02-01

    The Virgen de las Nieves University School of Nursing has adopted the methodology of Problem-Based Learning (ABP in its Spanish acronym) as a supplementary method to gain specific transversal competencies. In so doing, all basic required subjects necessary for the degree have been partially affected. With the objective of identifying and managing the structural and cultural barriers which could impede the success or effectiveness of its adoption, a strategic analysis of the School was carried out. This analysis was based on: a) knowing the strong and weak points the School has for adopting the Problem-Based Learning methodology; b) describing the structural problems and needs involved in carrying out this teaching innovation; c) discovering the needs professors have regarding knowledge and skills related to Problem-Based Learning; d) preparing students by informing them about the characteristics of Problem-Based Learning; e) evaluating the results obtained by means of professor and student opinions; and f) adopting the improvements identified. The stages followed were: strategic analysis, preparation, pilot program, adoption and evaluation.

  2. Comparison of Requirements for Composite Structures for Aircraft and Space Applications

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Elliot, Kenny B.; Hampton, Roy W.; Knight, Norman F., Jr.; Aggarwal, Pravin; Engelstad, Stephen P.; Chang, James B.

    2010-01-01

    In this report, the aircraft and space vehicle requirements for composite structures are compared. It is a valuable exercise to study composite structural design approaches used in the airframe industry and to adopt methodology that is applicable for space vehicles. The missions, environments, analysis methods, analysis validation approaches, testing programs, build quantities, inspection, and maintenance procedures used by the airframe industry, in general, are not transferable to spaceflight hardware. Therefore, while the application of composite design approaches from aircraft and other industries is appealing, many aspects cannot be directly utilized. Nevertheless, experiences and research for composite aircraft structures may be of use in unexpected arenas as space exploration technology develops, and so continued technology exchanges are encouraged.

  3. A novel integrated framework and improved methodology of computer-aided drug design.

    PubMed

    Chen, Calvin Yu-Chian

    2013-01-01

    Computer-aided drug design (CADD) is a critical initiating step of drug development, but a single model capable of covering all design aspects remains to be elucidated. Hence, we developed a drug design modeling framework that integrates multiple approaches, including machine-learning-based quantitative structure-activity relationship (QSAR) analysis, 3D-QSAR, Bayesian networks, pharmacophore modeling, and structure-based docking algorithms. Restrictions for each model were defined for improved individual and overall accuracy. An integration method was applied to join the results from each model to minimize bias and errors. In addition, the integrated model adopts both static and dynamic analysis to validate the intermolecular stabilities of the receptor-ligand conformation. The proposed protocol was applied to identifying HER2 inhibitors from traditional Chinese medicine (TCM) as an example for validating our new protocol. Eight potent leads were identified from six TCM sources. A joint validation system comprising comparative molecular field analysis, comparative molecular similarity indices analysis, and molecular dynamics simulation further characterized the candidates into three potential binding conformations and validated the binding stability of each protein-ligand complex. Ligand pathway analysis was also performed to predict ligand entry into and exit from the binding site. In summary, we propose a novel systematic CADD methodology for the identification, analysis, and characterization of drug-like candidates.

  4. Proceedings of the Workshop on Identification and Control of Flexible Space Structures, Volume 2

    NASA Technical Reports Server (NTRS)

    Rodriguez, G. (Editor)

    1985-01-01

    The results of a workshop on identification and control of flexible space structures held in San Diego, CA, July 4 to 6, 1984 are discussed. The main objectives of the workshop were to provide a forum to exchange ideas in exploring the most advanced modeling, estimation, identification and control methodologies to flexible space structures. The workshop responded to the rapidly growing interest within NASA in large space systems (space station, platforms, antennas, flight experiments) currently under design. Dynamic structural analysis, control theory, structural vibration and stability, and distributed parameter systems are discussed.

  5. Exploring the Factor Structure of Neurocognitive Measures in Older Individuals

    PubMed Central

    Santos, Nadine Correia; Costa, Patrício Soares; Amorim, Liliana; Moreira, Pedro Silva; Cunha, Pedro; Cotter, Jorge; Sousa, Nuno

    2015-01-01

    Here we focus on factor analysis from a best practices point of view, by investigating the factor structure of neuropsychological tests and using the results obtained to illustrate on choosing a reasonable solution. The sample (n=1051 individuals) was randomly divided into two groups: one for exploratory factor analysis (EFA) and principal component analysis (PCA), to investigate the number of factors underlying the neurocognitive variables; the second to test the “best fit” model via confirmatory factor analysis (CFA). For the exploratory step, three extraction (maximum likelihood, principal axis factoring and principal components) and two rotation (orthogonal and oblique) methods were used. The analysis methodology allowed exploring how different cognitive/psychological tests correlated/discriminated between dimensions, indicating that to capture latent structures in similar sample sizes and measures, with approximately normal data distribution, reflective models with oblimin rotation might prove the most adequate. PMID:25880732
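
    A minimal sketch of the exploratory step is given below: eigenvalues from a principal component analysis are inspected (Kaiser criterion) to suggest the number of latent dimensions, and a factor model with that many factors is then fitted. The data are random placeholders for the neurocognitive scores, and a varimax rotation is used because the oblimin rotation favoured in the study is not available in scikit-learn (it would require a dedicated package such as factor_analyzer).

```python
# Hedged sketch of an exploratory factor-analysis workflow on placeholder
# data: PCA eigenvalues suggest the number of factors (Kaiser criterion),
# then a rotated factor model is fitted. Requires scikit-learn >= 0.24 for
# the FactorAnalysis "rotation" option.
import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
latent = rng.standard_normal((1000, 2))            # two latent traits
loadings = np.zeros((2, 8))
loadings[0, :4] = rng.uniform(0.6, 0.9, 4)          # tests 1-4 load on trait 1
loadings[1, 4:] = rng.uniform(0.6, 0.9, 4)          # tests 5-8 load on trait 2
scores = latent @ loadings + 0.5 * rng.standard_normal((1000, 8))

Z = StandardScaler().fit_transform(scores)

eigenvalues = PCA().fit(Z).explained_variance_
n_factors = int((eigenvalues > 1.0).sum())          # Kaiser criterion
print("eigenvalues:", np.round(eigenvalues, 2), "-> retain", n_factors, "factors")

fa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(Z)
print("rotated loadings:\n", np.round(fa.components_.T, 2))
```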

  6. Numerical simulation of actuation behavior of active fiber composites in helicopter rotor blade application

    NASA Astrophysics Data System (ADS)

    Paik, Seung Hoon; Kim, Ji Yeon; Shin, Sang Joon; Kim, Seung Jo

    2004-07-01

    Smart structures incorporating active materials have been designed and analyzed to improve aerospace vehicle performance and its vibration/noise characteristics. Helicopter integral blade actuation is one example of those efforts using embedded anisotropic piezoelectric actuators. To design and analyze such integrally-actuated blades, a beam approach based on homogenization methodology has traditionally been used. Using this approach, the global behavior of the structures is predicted in an averaged sense. However, this approach has intrinsic limitations in describing the local behaviors at the level of the constituents. For example, the failure analysis of the individual active fibers requires knowledge of the local behaviors. A microscopic approach for the analysis of integrally-actuated structures is established in this paper. Piezoelectric fibers and matrices are modeled individually and a finite element method using three-dimensional solid elements is adopted. Due to the huge size of the resulting finite element meshes, high performance computing technology is required in the solution process. The present methodology is referred to as Direct Numerical Simulation (DNS) of the smart structure. As an initial validation effort, the present analytical results are correlated with experiments from a small-scale integrally-actuated blade, the Active Twist Rotor (ATR). Through DNS, the local stress distribution around the interface of fiber and matrix can be analyzed.

  7. A New Finite-Time Observer for Nonlinear Systems: Applications to Synchronization of Lorenz-Like Systems.

    PubMed

    Aguilar-López, Ricardo; Mata-Machuca, Juan L

    2016-01-01

    This paper proposes a synchronization methodology for two chaotic oscillators under the framework of identical synchronization and master-slave configuration. The proposed methodology is based on state observer design within the framework of control theory; the observer structure provides finite-time synchronization convergence by cancelling the upper bounds of the main nonlinearities of the chaotic oscillator. This is shown via an analysis of the dynamics of the so-called synchronization error. Numerical experiments corroborate the satisfactory results of the proposed scheme.

  8. A New Finite-Time Observer for Nonlinear Systems: Applications to Synchronization of Lorenz-Like Systems

    PubMed Central

    Aguilar-López, Ricardo

    2016-01-01

    This paper proposes a synchronization methodology for two chaotic oscillators under the framework of identical synchronization and master-slave configuration. The proposed methodology is based on state observer design within the framework of control theory; the observer structure provides finite-time synchronization convergence by cancelling the upper bounds of the main nonlinearities of the chaotic oscillator. This is shown via an analysis of the dynamics of the so-called synchronization error. Numerical experiments corroborate the satisfactory results of the proposed scheme. PMID:27738651
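
    For illustration only, the sketch below synchronizes a slave Lorenz system to a master by driving it with the measured output x1, a classical drive-response construction whose estimation error provably decays; it is not the finite-time observer proposed in the paper, and the structure is chosen purely for simplicity.

```python
# Hedged sketch of master-slave synchronization of Lorenz-like systems:
# the slave copies the Lorenz dynamics but evaluates the nonlinear terms at
# the measured master output y = x1 (classical drive-response scheme, shown
# here instead of the paper's finite-time observer).
import numpy as np
from scipy.integrate import solve_ivp

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

def coupled(t, s):
    x1, x2, x3, h1, h2, h3 = s
    y = x1                                   # transmitted/measured output
    master = [sigma * (x2 - x1),
              x1 * (rho - x3) - x2,
              x1 * x2 - beta * x3]
    slave = [sigma * (h2 - h1),              # estimates driven by y
             y * (rho - h3) - h2,
             y * h2 - beta * h3]
    return master + slave

s0 = [1.0, 1.0, 1.0, -8.0, 7.0, 25.0]        # master and slave start far apart
sol = solve_ivp(coupled, (0.0, 20.0), s0, dense_output=True,
                rtol=1e-9, atol=1e-9)

for t in (0.0, 1.0, 5.0, 20.0):
    x, xh = sol.sol(t)[:3], sol.sol(t)[3:]
    print(f"t={t:5.1f}  synchronization error = {np.linalg.norm(x - xh):.2e}")
```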

  9. Dynamic identification of axial force and boundary restraints in tie rods and cables with uncertainty quantification using Set Inversion Via Interval Analysis

    NASA Astrophysics Data System (ADS)

    Kernicky, Timothy; Whelan, Matthew; Al-Shaer, Ehab

    2018-06-01

    A methodology is developed for the estimation of internal axial force and boundary restraints within in-service, prismatic axial force members of structural systems using interval arithmetic and contractor programming. The determination of the internal axial force and end restraints in tie rods and cables using vibration-based methods has been a long-standing problem in the area of structural health monitoring and performance assessment. However, for structural members with low slenderness where the dynamics are significantly affected by the boundary conditions, few existing approaches allow for simultaneous identification of internal axial force and end restraints and none permit quantification of the uncertainties in the parameter estimates due to measurement uncertainties. This paper proposes a new technique for approaching this challenging inverse problem that leverages the Set Inversion Via Interval Analysis algorithm to solve for the unknown axial forces and end restraints using natural frequency measurements. The framework developed offers the ability to completely enclose the feasible solutions to the parameter identification problem, given specified measurement uncertainties for the natural frequencies. This ability to propagate measurement uncertainty into the parameter space is critical for quantifying the confidence in the individual parameter estimates to inform decision-making within structural health diagnosis and prognostication applications. The methodology is first verified with simulated data for a case with unknown rotational end restraints and then extended to a case with unknown translational and rotational end restraints. A laboratory experiment is then presented to demonstrate the application of the methodology to an axially loaded rod with progressively increased end restraint at one end.
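
    To illustrate the uncertainty-propagation idea on a deliberately simplified case, the sketch below treats a taut cable with negligible bending stiffness and pinned ends, where the fundamental frequency satisfies f1 = (1/(2L))·sqrt(T/μ); a measured frequency interval then maps monotonically to an interval enclosing the axial force. The SIVIA and contractor machinery of the paper addresses the much harder case with bending stiffness and unknown end restraints, and all numbers here are hypothetical.

```python
# Hedged sketch: map a measured natural-frequency interval to an enclosure of
# the axial force for a taut cable (negligible bending stiffness, pinned ends),
# where f1 = (1/(2L)) * sqrt(T / mu). Numbers are illustrative only.
L_len = 4.0             # free length (m)
mu = 12.5               # mass per unit length (kg/m)

f_meas = (14.8, 15.2)   # measured fundamental frequency interval (Hz)

def axial_force_interval(f_lo, f_hi, length, mass_per_len):
    """Enclosure of T inverted from f1 = (1/(2L)) sqrt(T/mu), monotone in f1."""
    t_lo = 4.0 * mass_per_len * length ** 2 * f_lo ** 2
    t_hi = 4.0 * mass_per_len * length ** 2 * f_hi ** 2
    return t_lo, t_hi

t_lo, t_hi = axial_force_interval(*f_meas, L_len, mu)
print(f"axial force enclosure: [{t_lo / 1000:.1f}, {t_hi / 1000:.1f}] kN")
```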

  10. Validations of Coupled CSD/CFD and Particle Vortex Transport Method for Rotorcraft Applications: Hover, Transition, and High Speed Flights

    NASA Technical Reports Server (NTRS)

    Anusonti-Inthra, Phuriwat

    2010-01-01

    This paper presents validations of a novel rotorcraft analysis that couples Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and Particle Vortex Transport Method (PVTM) methodologies. The CSD with associated vehicle trim analysis is used to calculate blade deformations and trim parameters. The near body CFD analysis is employed to provide detailed near body flow field information which is used to obtain high-fidelity blade aerodynamic loadings. The far field wake dominated region is simulated using the PVTM analysis which provides accurate prediction of the evolution of the rotor wake released from the near body CFD domains. A loose coupling methodology between the CSD and CFD/PVTM modules is used, with appropriate information exchange amongst the CSD/CFD/PVTM modules. The coupled CSD/CFD/PVTM methodology is used to simulate various rotorcraft flight conditions (i.e. hover, transition, and high speed flights), and the results are compared with several sets of experimental data. For the hover condition, the results are compared with hover data for the HART II rotor tested at DLR Institute of Flight Systems, Germany. For the forward flight conditions, the results are validated with the UH-60A flight test data.

  11. Structure, composition and thermal state of the crust in Brazil. [geomagnetic survey

    NASA Technical Reports Server (NTRS)

    Pacca, I. I. G. (Principal Investigator); Shukowsky, W.

    1981-01-01

    Efforts in support of a geomagnetic survey of the Brazilian area are described. Software to convert MAGSAT data tapes to the Burroughs/B-6700 binary format was developed and tested. A preliminary analysis of the first total intensity anomaly map was performed and methodologies for more intensive analysis were defined. The sources for correlative geological, aeromagnetic, and gravimetric data are described.

  12. Protein secondary structure and stability determined by combining exoproteolysis and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    PubMed

    Villanueva, Josep; Villegas, Virtudes; Querol, Enrique; Avilés, Francesc X; Serrano, Luis

    2002-09-01

    In the post-genomic era, several projects focused on the massive experimental resolution of the three-dimensional structures of all the proteins of different organisms have been initiated. Simultaneously, significant progress has been made in the ab initio prediction of protein three-dimensional structure. One of the keys to the success of such a prediction is the use of local information (i.e. secondary structure). Here we describe a new limited proteolysis methodology, based on the use of unspecific exoproteases coupled with matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS), to map quickly secondary structure elements of a protein from both ends, the N- and C-termini. We show that the proteolytic patterns (mass spectra series) obtained can be interpreted in the light of the conformation and local stability of the analyzed proteins, a direct correlation being observed between the predicted and the experimentally derived protein secondary structure. Further, this methodology can be easily applied to check rapidly the folding state of a protein and characterize mutational effects on protein conformation and stability. Moreover, given global stability information, this methodology allows one to locate the protein regions of increased or decreased conformational stability. All of this can be done with a small fraction of the amount of protein required by most of the other methods for conformational analysis. Thus limited exoproteolysis, together with MALDI-TOF MS, can be a useful tool to achieve quickly the elucidation of protein structure and stability. Copyright 2002 John Wiley & Sons, Ltd.

  13. Experimental Validation of an Integrated Controls-Structures Design Methodology

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Walz, Joseph E.

    1996-01-01

    The first experimental validation of an integrated controls-structures design methodology for a class of large order, flexible space structures is described. Integrated redesign of the controls-structures-interaction evolutionary model, a laboratory testbed at NASA Langley, was described earlier. The redesigned structure was fabricated, assembled in the laboratory, and experimentally tested against the original structure. Experimental results indicate that the structure redesigned using the integrated design methodology requires significantly less average control power than the nominal structure with control-optimized designs, while maintaining the required line-of-sight pointing performance. Thus, the superiority of the integrated design methodology over the conventional design approach is experimentally demonstrated. Furthermore, amenability of the integrated design structure to other control strategies is evaluated, both analytically and experimentally. Using Linear-Quadratic-Gaussian optimal dissipative controllers, it is observed that the redesigned structure leads to significantly improved performance with alternate controllers as well.
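
    As a generic illustration of the control-design ingredient mentioned above, the sketch below computes an LQR state-feedback gain for a simple two-mass flexible structure model from the continuous-time algebraic Riccati equation. This textbook design merely stands in for the integrated controls-structures redesign and the LQG dissipative controllers of the study; the masses, stiffness and weights are illustrative.

```python
# Hedged sketch of an LQR design for a two-mass flexible structure model,
# solved via the continuous-time algebraic Riccati equation. All parameter
# values are illustrative placeholders.
import numpy as np
from scipy.linalg import solve_continuous_are

m1, m2, k = 1.0, 1.0, 4.0            # two masses joined by a spring
A = np.array([[0, 0, 1, 0],
              [0, 0, 0, 1],
              [-k / m1,  k / m1, 0, 0],
              [ k / m2, -k / m2, 0, 0]], dtype=float)
B = np.array([[0.0], [0.0], [1.0 / m1], [0.0]])   # force actuator on mass 1

Q = np.diag([10.0, 10.0, 1.0, 1.0])  # penalize displacements more than rates
R = np.array([[0.1]])                # control-effort weight

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)      # optimal state-feedback gain

closed_loop = np.linalg.eigvals(A - B @ K)
print("LQR gain K:", np.round(K, 3))
print("closed-loop eigenvalue real parts:", np.round(closed_loop.real, 3))
```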

  14. 77 FR 68773 - FIFRA Scientific Advisory Panel; Notice of Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-16

    ... for physical chemical properties that cannot be easily tested in in vitro systems or stable enough for.... Quantitative structural-activity relationship (QSAR) models and estrogen receptor (ER) expert systems development. High-throughput data generation and analysis (expertise focused on how this methodology can be...

  15. AAC Best Practice Using Automated Language Activity Monitoring.

    ERIC Educational Resources Information Center

    Hill, Katya; Romich, Barry

    This brief paper describes automated language activity monitoring (LAM), an augmentative and alternative communication (AAC) methodology for the collection, editing, and analysis of language data in structured or natural situations with people who have severe communication disorders. The LAM function records each language event (letters, words,…

  16. Proceedings of the Workshop on Transportation/Urban Form Interactions held at Cambridge, MA. on August 14-15, 1978

    DOT National Transportation Integrated Search

    1979-06-01

    Contents: A form of utility function for the UMOT model; An analysis of transportation/land use interactions; Toward a methodology to shape urban structure; Approaches for improving urban travel forecasts; Quasi-dynamic urban location models with end...

  17. Literary Aesthetics in the Narration of Dagara Folktales

    ERIC Educational Resources Information Center

    Kyiileyang, Martin

    2016-01-01

    Dagara folktales, like other African folktales, are embedded with various literary aesthetic features related to structure, language and performance. This paper examines major literary aesthetics found in Dagara folktales. The methodology used is based on the collection, analysis and interpretation of selected Dagara folktales gathered through…

  18. Acculturative Stress and Adjustment Experiences of Greek International Students

    ERIC Educational Resources Information Center

    Poulakis, Mixalis; Dike, Craig A.; Massa, Amber C.

    2017-01-01

    This study investigated eight Greek international college students' experiences of acculturation and acculturative stress at a mid-western university in the United States. Semi-structured interviews were conducted with participants and Consensual Qualitative Research methodology was utilized for data analysis to identify contextual themes and…

  19. Promoting energy efficiency through improved electricity pricing: A mid-project report

    NASA Astrophysics Data System (ADS)

    Action, J. P.; Kohler, D. F.; Mitchell, B. M.; Park, R. E.

    1982-03-01

    Five related areas of electricity demand analysis under alternative rate forms were studied. Adjustments by large commercial and industrial customers are examined. Residential demand under time of day (TOD) pricing is examined. A methodology for evaluating alternative rate structures is developed and applied.

  20. Ethical Guidelines for Structural Interventions to Small-Scale Historic Stone Masonry Buildings.

    PubMed

    Hurol, Yonca; Yüceer, Hülya; Başarır, Hacer

    2015-12-01

    Structural interventions to historic stone masonry buildings require that both structural and heritage values be considered simultaneously. The absence of one of these value systems in implementation can be regarded as an unethical professional action. The research objective of this article is to prepare a guideline for ensuring ethical structural interventions to small-scale stone historic masonry buildings in the conservation areas of Northern Cyprus. The methodology covers an analysis of internationally accepted conservation documents and national laws related to the conservation of historic buildings, an analysis of building codes, especially Turkish building codes, which have been used in Northern Cyprus, and an analysis of the structural interventions introduced to a significant historic building in a semi-intact state in the walled city of Famagusta. This guideline covers issues related to whether buildings are intact or ruined, the presence of earthquake risk, the types of structural decisions in an architectural conservation project, and the values to consider during the decision making phase.

  1. Conservative Allowables Determined by a Tsai-Hill Equivalent Criterion for Design of Satellite Composite Parts

    NASA Astrophysics Data System (ADS)

    Pommatau, Gilles

    2014-06-01

    The present paper deals with the industrial application, via software developed by Thales Alenia Space, of a new failure criterion named the "Tsai-Hill equivalent criterion" for composite structural parts of satellites. The first part of the paper briefly describes the main hypotheses and the failure analysis capabilities of the software. The second part recalls the quadratic and conservative nature of the new failure criterion, already presented at an ESA conference in a previous paper. The third part presents the statistical calculation possibilities of the software, and the associated sensitivity analysis, via results obtained on different composites. Then a methodology, proposed to customers and agencies, is presented with its limitations and advantages. It is concluded that this methodology is an efficient industrial way to perform mechanical analysis on quasi-isotropic composite parts.
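
    For orientation, the sketch below evaluates the plain Tsai-Hill ply failure index, the quadratic form that the "Tsai-Hill equivalent criterion" discussed above builds on; the strengths and stresses are illustrative, and the paper's equivalent criterion, allowables and statistical treatment are not reproduced here.

```python
# Hedged sketch of the classical Tsai-Hill ply check (not the paper's
# "equivalent criterion"); strengths and stresses are illustrative only.
def tsai_hill_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Tsai-Hill failure index for an in-plane ply stress state (<1: no failure)."""
    X = Xt if s1 >= 0 else Xc        # pick tensile or compressive strengths
    Y = Yt if s2 >= 0 else Yc
    return (s1 / X) ** 2 - (s1 * s2) / X ** 2 + (s2 / Y) ** 2 + (t12 / S) ** 2

# carbon/epoxy-like strengths (MPa) and a sample ply stress state (MPa)
index = tsai_hill_index(s1=600.0, s2=-25.0, t12=40.0,
                        Xt=1500.0, Xc=1200.0, Yt=50.0, Yc=200.0, S=70.0)
print(f"Tsai-Hill failure index: {index:.2f}")
```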

  2. Lithography hotspot discovery at 70nm DRAM 300mm fab: process window qualification using design base binning

    NASA Astrophysics Data System (ADS)

    Chen, Daniel; Chen, Damian; Yen, Ray; Cheng, Mingjen; Lan, Andy; Ghaskadvi, Rajesh

    2008-11-01

    Identifying hotspots--structures that limit the lithography process window--becomes increasingly important as the industry relies heavily on RET to print sub-wavelength designs. KLA-Tencor's patented Process Window Qualification (PWQ) methodology has been used for this purpose in various fabs. The PWQ methodology has three key advantages: (a) the PWQ layout, to obtain the best sensitivity; (b) design-based binning, for pattern repeater analysis; and (c) intelligent sampling, for the best DOI sampling rate. This paper evaluates two different analysis strategies for SEM review sampling successfully deployed at Inotera Memories, Inc. We propose a new approach combining location repeaters and pattern repeaters. Based on a recent case study, the new sampling flow reduces the data analysis and sampling time from 6 hours to 1.5 hours while maintaining the maximum DOI sample rate.

  3. Collagen morphology and texture analysis: from statistics to classification

    PubMed Central

    Mostaço-Guidolin, Leila B.; Ko, Alex C.-T.; Wang, Fei; Xiang, Bo; Hewko, Mark; Tian, Ganghong; Major, Arkady; Shiomi, Masashi; Sowa, Michael G.

    2013-01-01

    In this study we present an image analysis methodology capable of quantifying morphological changes in tissue collagen fibril organization caused by pathological conditions. Texture analysis based on first-order statistics (FOS) and second-order statistics such as gray level co-occurrence matrix (GLCM) was explored to extract second-harmonic generation (SHG) image features that are associated with the structural and biochemical changes of tissue collagen networks. Based on these extracted quantitative parameters, multi-group classification of SHG images was performed. With combined FOS and GLCM texture values, we achieved reliable classification of SHG collagen images acquired from atherosclerosis arteries with >90% accuracy, sensitivity and specificity. The proposed methodology can be applied to a wide range of conditions involving collagen re-modeling, such as in skin disorders, different types of fibrosis and muscular-skeletal diseases affecting ligaments and cartilage. PMID:23846580
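
    A minimal sketch of the second-order texture step is shown below, assuming scikit-image (version 0.19 or later for the graycomatrix/graycoprops names) is available: a grey level co-occurrence matrix is computed from a synthetic image and summarized by standard GLCM properties that could feed a classifier, in place of the SHG collagen images of the study.

```python
# Hedged sketch of GLCM texture-feature extraction on a synthetic image,
# standing in for the SHG collagen images analyzed in the study.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
# synthetic 8-bit image with a weak horizontal striped structure
image = (rng.integers(0, 8, size=(128, 128))
         + 4 * (np.arange(128)[:, None] // 8 % 2)).astype(np.uint8)

glcm = graycomatrix(image,
                    distances=[1, 2],
                    angles=[0, np.pi / 2],
                    levels=16,
                    symmetric=True,
                    normed=True)

features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "correlation", "energy", "homogeneity")}
print({k: round(float(v), 4) for k, v in features.items()})
```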

  4. Post-Buckling and Ultimate Strength Analysis of Stiffened Composite Panel Base on Progressive Damage

    NASA Astrophysics Data System (ADS)

    Zhang, Guofan; Sun, Xiasheng; Sun, Zhonglei

    Stiffened composite panels are typical thin-wall structures in the aerospace industry, and their main failure mode is buckling under compressive loading. In this paper, the development of a finite element analysis approach to the post-buckling behavior of stiffened composite structures under compression is presented. Numerical results for the stiffened panel are then obtained from FE simulations. A thorough comparison was performed between the predicted load-carrying capacity and key-position strains of the specimen and the test data. The comparison indicates that FEM results obtained with the developed methodology can meet the demands of engineering application in predicting the post-buckling behavior of intact stiffened structures at the aircraft design stage.

  5. Novel Composites for Wing and Fuselage Applications

    NASA Technical Reports Server (NTRS)

    Suarez, J. A.; Buttitta, C.

    1996-01-01

    Design development was successfully completed for textile preforms with continuous cross-stiffened epoxy panels with cut-outs. The preforms developed included 3-D angle interlock weaving of graphite structural fibers impregnated by resin film infiltration (RFI) and shown to be structurally suitable under conditions requiring minimum acquisition costs. Design guidelines/analysis methodology for such textile structures are given. The development was expanded to a fuselage side-panel component of a subsonic commercial airframe and found to be readily scalable. The successfully manufactured panel was delivered to NASA Langley for biaxial testing. This report covers the work performed under Task 3 -- Cross-Stiffened Subcomponent; Task 4 -- Design Guidelines/Analysis of Textile-Reinforced Composites; and Task 5 -- Integrally Woven Fuselage Panel.

  6. Dynamic loading and stress life analysis of permanent space station modules

    NASA Astrophysics Data System (ADS)

    Anisimov, A. V.; Krokhin, I. A.; Likhoded, A. I.; Malinin, A. A.; Panichkin, N. G.; Sidorov, V. V.; Titov, V. A.

    2016-11-01

    Some methodological approaches to solving several key problems of dynamic loading and structural strength analysis of Permanent Space Station (PSS) modules, developed on the basis of the working experience of Soviet and Russian PSS and the International Space Station (ISS), are presented. The solutions of the direct and semi-inverse problems of PSS structure dynamics are mathematically stated. Special attention is paid to the use of the results of ground structural strength tests of space station modules and the data on the actual flight actions on the station and its dynamic responses in the orbital operation regime. The procedure of determining the dynamics and operation life parameters of elements of the PSS modules is described.

  7. An Evaluation of Research Ethics in Undergraduate Health Science Research Methodology Programs at a South African University.

    PubMed

    Coetzee, Tanya; Hoffmann, Willem A; de Roubaix, Malcolm

    2015-10-01

    The amended research ethics policy at a South African University required the ethics review of undergraduate research projects, prompting the need to explore the content and teaching approach of research ethics education in health science undergraduate programs. Two qualitative data collection strategies were used: document analysis (syllabi and study guides) and semi-structured interviews with research methodology coordinators. Five main themes emerged: (a) timing of research ethics courses, (b) research ethics course content, (c) sub-optimal use of creative classroom activities to facilitate research ethics lectures, (d) understanding the need for undergraduate project research ethics review, and (e) research ethics capacity training for research methodology lecturers and undergraduate project supervisors. © The Author(s) 2015.

  8. Virtual-pulse time integral methodology: A new explicit approach for computational dynamics - Theoretical developments for general nonlinear structural dynamics

    NASA Technical Reports Server (NTRS)

    Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong

    1993-01-01

    The present paper describes a new explicit virtual-pulse time integral methodology for nonlinear structural dynamics problems. The purpose of the paper is to provide the theoretical basis of the methodology and to demonstrate the applicability of the proposed formulations to nonlinear dynamic structures. Different from existing numerical methods such as direct time integration or mode superposition techniques, the proposed methodology offers new perspectives and approaches to development, and possesses several unique and attractive computational characteristics. The methodology is tested and compared with the implicit Newmark method (trapezoidal rule) through nonlinear softening and hardening spring dynamic models. The numerical results indicate that the proposed explicit virtual-pulse time integral methodology is an excellent alternative for solving general nonlinear dynamic problems.

  9. Multidimensional Approach for Tsunami Vulnerability Assessment: Framing the Territorial Impacts in Two Municipalities in Portugal.

    PubMed

    Tavares, Alexandre Oliveira; Barros, José Leandro; Santos, Angela

    2017-04-01

    This study presents a new multidimensional methodology for tsunami vulnerability assessment that combines the morphological, structural, social, and tax component of vulnerability. This new approach can be distinguished from previous methodologies that focused primarily on the evaluation of potentially affected buildings and did not use tsunami numerical modeling. The methodology was applied to the Figueira da Foz and Vila do Bispo municipalities in Portugal. For each area, the potential tsunami-inundated areas were calculated considering the 1755 Lisbon tsunami, which is the greatest disaster caused by natural hazards that ever occurred in Portugal. Furthermore, the four components of the vulnerability were calculated to obtain a composite vulnerability index. This methodology enables us to differentiate the two areas in their vulnerability, highlighting the characteristics of the territory components. This methodology can be a starting point for the creation of a local assessment framework at the municipal scale related to tsunami risk. In addition, the methodology is an important support for the different local stakeholders. © 2016 Society for Risk Analysis.
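
    A hedged sketch of the aggregation step only: min-max normalize each vulnerability component and combine them with weights into a composite index. The equal weights and the normalization scheme are assumptions for illustration; the paper's component definitions and tsunami numerical modeling are not reproduced here.

    ```python
    # Sketch: composite vulnerability index from component scores (weights assumed equal).
    import numpy as np

    def composite_vulnerability(components, weights=None):
        """components: (n_units, n_components) array, e.g. morphological, structural,
        social and tax scores per territorial unit."""
        c = np.asarray(components, dtype=float)
        c_norm = (c - c.min(axis=0)) / (c.max(axis=0) - c.min(axis=0))   # min-max to [0, 1]
        w = np.ones(c.shape[1]) if weights is None else np.asarray(weights, dtype=float)
        return c_norm @ (w / w.sum())                                    # one index per unit

    # e.g. idx = composite_vulnerability(scores)  # rank units by idx to compare municipalities
    ```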

  10. Fracture Mechanics for Composites: State of the Art and Challenges

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald

    2006-01-01

    Interlaminar fracture mechanics has proven useful for characterizing the onset of delaminations in composites and has been used with limited success primarily to investigate onset in fracture toughness specimens and laboratory size coupon type specimens. Future acceptance of the methodology by industry and certification authorities however, requires the successful demonstration of the methodology on the structural level. In this paper, the state-of-the-art in fracture toughness characterization, and interlaminar fracture mechanics analysis tools are described. To demonstrate the application on the structural level, a panel was selected which is reinforced with stringers. Full implementation of interlaminar fracture mechanics in design however remains a challenge and requires a continuing development effort of codes to calculate energy release rates and advancements in delamination onset and growth criteria under mixed mode conditions.

  11. Optimization of an Advanced Hybrid Wing Body Concept Using HCDstruct Version 1.2

    NASA Technical Reports Server (NTRS)

    Quinlan, Jesse R.; Gern, Frank H.

    2016-01-01

    Hybrid Wing Body (HWB) aircraft concepts continue to be promising candidates for achieving the simultaneous fuel consumption and noise reduction goals set forth by NASA's Environmentally Responsible Aviation (ERA) project. In order to evaluate the projected benefits, improvements in structural analysis at the conceptual design level were necessary; thus, NASA researchers developed the Hybrid wing body Conceptual Design and structural optimization (HCDstruct) tool to perform aeroservoelastic structural optimizations of advanced HWB concepts. In this paper, the authors present substantial updates to the HCDstruct tool and related analysis, including: the addition of four inboard and eight outboard control surfaces and two all-movable tail/rudder assemblies, providing a full aeroservoelastic analysis capability; the implementation of asymmetric load cases for structural sizing applications; and a methodology for minimizing control surface actuation power using NASTRAN SOL 200 and HCDstruct's aeroservoelastic finite-element model (FEM).

  12. Methods for simulation-based analysis of fluid-structure interaction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barone, Matthew Franklin; Payne, Jeffrey L.

    2005-10-01

    Methods for analysis of fluid-structure interaction using high-fidelity simulations are critically reviewed. First, a literature review of modern numerical techniques for simulation of aeroelastic phenomena is presented. The review focuses on methods contained within the arbitrary Lagrangian-Eulerian (ALE) framework for coupling computational fluid dynamics codes to computational structural mechanics codes. The review treats mesh movement algorithms, the role of the geometric conservation law, time advancement schemes, wetted surface interface strategies, and some representative applications. The complexity and computational expense of coupled Navier-Stokes/structural dynamics simulations point to the need for reduced order modeling to facilitate parametric analysis. The proper orthogonal decomposition (POD)/Galerkin projection approach for building a reduced order model (ROM) is presented, along with ideas for extension of the methodology to allow construction of ROMs based on data generated from ALE simulations.
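
    To make the POD/Galerkin idea mentioned at the end concrete, the sketch below builds a reduced basis from solution snapshots via the SVD and projects a linear full-order operator onto it. The linear operator is a placeholder assumption; the ALE/aeroelastic setting treated in the report is considerably richer.

    ```python
    # POD/Galerkin ROM sketch: reduced basis from snapshots, then Galerkin projection.
    import numpy as np

    def pod_basis(snapshots, r):
        """snapshots: (n_dof, n_snapshots) matrix of full-order states; r: modes kept."""
        U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
        return U[:, :r]                          # POD modes = leading left singular vectors

    def galerkin_reduce(A, Phi):
        """Reduced operator for dx/dt = A x under x ~ Phi q:  dq/dt = (Phi^T A Phi) q."""
        return Phi.T @ A @ Phi

    # Usage: Phi = pod_basis(X, r=10); Ar = galerkin_reduce(A, Phi); x_approx = Phi @ q
    ```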

  13. Development of a rotorcraft. Propulsion dynamics interface analysis, volume 2

    NASA Technical Reports Server (NTRS)

    Hull, R.

    1982-01-01

    A study was conducted to establish a coupled rotor/propulsion analysis that would be applicable to a wide range of rotorcraft systems. The effort included the following tasks: (1) development of a model structure suitable for simulating a wide range of rotorcraft configurations; (2) definition of a methodology for parameterizing the model structure to represent a particular rotorcraft; (3) construction of a nonlinear coupled rotor/propulsion model as a test case for analyzing coupled system dynamics; and (4) an attempt to develop a mostly linear coupled model derived from the complete nonlinear simulations. Documentation of the computer models developed is presented.

  14. Programming Probabilistic Structural Analysis for Parallel Processing Computer

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Chamis, Christos C.; Murthy, Pappu L. N.

    1991-01-01

    The ultimate goal of this research program is to make Probabilistic Structural Analysis (PSA) computationally efficient and hence practical for the design environment by achieving large scale parallelism. The paper identifies the multiple levels of parallelism in PSA, identifies methodologies for exploiting this parallelism, describes the development of a parallel stochastic finite element code, and presents results of two example applications. It is demonstrated that speeds within five percent of those theoretically possible can be achieved. A special-purpose numerical technique, the stochastic preconditioned conjugate gradient method, is also presented and demonstrated to be extremely efficient for certain classes of PSA problems.
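
    The sketch below illustrates only the sampling-level parallelism such programs exploit, using a toy Monte Carlo estimate of a cantilever failure probability split across worker processes. It is not the NESSUS stochastic finite element formulation; the load, section properties, deflection limit, and lognormal modulus are assumed values.

    ```python
    # Toy parallel Monte Carlo probabilistic structural analysis (not the NESSUS formulation).
    import numpy as np
    from multiprocessing import Pool

    L_BEAM, I_SEC, P_LOAD, LIMIT = 2.0, 8.0e-6, 10.0e3, 0.05   # m, m^4, N, m (assumed values)

    def tip_deflection(E):
        return P_LOAD * L_BEAM**3 / (3.0 * E * I_SEC)           # classic cantilever tip deflection

    def batch_failures(seed, n=100_000):
        rng = np.random.default_rng(seed)
        E = rng.lognormal(mean=np.log(70e9), sigma=0.1, size=n)  # random elastic modulus
        return int(np.count_nonzero(tip_deflection(E) > LIMIT))

    if __name__ == "__main__":
        with Pool(4) as pool:                                    # independent batches in parallel
            fails = pool.map(batch_failures, range(4))
        print("estimated P_f:", sum(fails) / (4 * 100_000))
    ```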

  15. Aeroelastic analysis for propellers - mathematical formulations and program user's manual

    NASA Technical Reports Server (NTRS)

    Bielawa, R. L.; Johnson, S. A.; Chi, R. M.; Gangwani, S. T.

    1983-01-01

    Mathematical development is presented for a specialized, propeller-dedicated version of the G400 rotor aeroelastic analysis. The G400PROP analysis simulates aeroelastic characteristics particular to propellers, such as structural sweep, aerodynamic sweep, and high subsonic unsteady airloads (both stalled and unstalled). Formulations are presented for these expanded propeller-related methodologies. Results of limited application of the analysis to realistic blade configurations and operating conditions, which include stable and unstable stall flutter test conditions, are given. To enhance program user efficiency and expand utilization, sections are included that describe: (1) the structuring of the G400PROP FORTRAN coding; (2) the required input data; and (3) the output results. General information to facilitate operation and improve efficiency is also provided.

  16. Structural analysis of low-speed composite propfan blades for the LRCSW wind tunnel model

    NASA Technical Reports Server (NTRS)

    Ernst, Michael A.

    1992-01-01

    The Naval Weapons Center at China Lake, CA, is currently in the process of evaluating propulsion systems for the Long Range Conventional Standoff Weapons (LRCSW). At present, the Advanced Counter-Rotating Propfan system is being considered. The methodologies are documented which were used to structurally analyze the 0.55 scale CM1 composite propfan blades for the LRCSW with COBSTRAN and MSC/NASTRAN. Significant results are also reported.

  17. Comparison of composite rotor blade models: A coupled-beam analysis and an MSC/NASTRAN finite-element model

    NASA Technical Reports Server (NTRS)

    Hodges, Robert V.; Nixon, Mark W.; Rehfield, Lawrence W.

    1987-01-01

    A methodology was developed for the structural analysis of composite rotor blades. This coupled-beam analysis is relatively simple to use compared with alternative analysis techniques. The beam analysis was developed for thin-wall single-cell rotor structures and includes the effects of elastic coupling. This paper demonstrates the effectiveness of the new composite-beam analysis method through comparison of its results with those of an established baseline analysis technique. The baseline analysis is an MSC/NASTRAN finite-element model built up from anisotropic shell elements. Deformations are compared for three linear static load cases of centrifugal force at design rotor speed, applied torque, and lift for an ideal rotor in hover. A D-spar designed to twist under axial loading is the subject of the analysis. Results indicate the coupled-beam analysis is well within engineering accuracy.

  18. Methods of Forming Professional Competence of Students as Future Teachers

    ERIC Educational Resources Information Center

    Omarov, Yessen B.; Toktarbayev, Darkhan Gabdyl-Samatovich; Rybin, Igor Vyacheslavovich; Saliyevaa, Aigul Zhanayevna; Zhumabekova, Fatima Niyazbekovna; Hamzina, Sholpan; Baitlessova, Nursulu; Sakenov, Janat

    2016-01-01

    The article presents an analysis of the problem of professional competence; a methodological basis of forming professional competence of college students as future teachers is established. The essence of professional competence is defined. The structure has been experimentally proved and developed; the contents, criteria and levels of professional…

  19. A Composite Model for Employees' Performance Appraisal and Improvement

    ERIC Educational Resources Information Center

    Manoharan, T. R.; Muralidharan, C.; Deshmukh, S. G.

    2012-01-01

    Purpose: The purpose of this paper is to develop an innovative method of performance appraisal that will be useful for designing a structured training programme. Design/methodology/approach: Employees' performance appraisals are conducted using new approaches, namely data envelopment analysis and an integrated fuzzy model. Interpretive structural…

  20. Implementing Service Excellence in Higher Education

    ERIC Educational Resources Information Center

    Khan, Hina; Matlay, Harry

    2009-01-01

    Purpose: The purpose of this paper is to provide a critical analysis of the importance of service excellence in higher education. Design/methodology/approach: The research upon which this paper is based employed a phenomenological approach. This method was selected for its focus on respondent perceptions and experiences. Both structured and…

  1. The Development of Entrepreneurship at School: The Spanish Experience

    ERIC Educational Resources Information Center

    Barba-Sánchez, Virginia; Atienza-Sahuquillo, Carlos

    2016-01-01

    Purpose: The purpose of this paper is to encourage entrepreneurship and creativity among primary school pupils so that they acquire entrepreneurial skills through running a business. Design/methodology/approach: A pilot experience has been structured into three large phases: analysis of the starting situation; production of the materials and their…

  2. Studies for development of novel quinazolinones: New biomarker for EGFR

    NASA Astrophysics Data System (ADS)

    Aggarwal, Swati; Sinha, Deepa; Tiwari, Anjani Kumar; Pooja, Pooja; Kaul, Ankur; Singh, Gurmeet; Mishra, Anil Kumar

    2015-05-01

    The binding capabilities of a series of novel quinazolinone molecules were established through a comprehensive computational methodology as well as by in vitro analysis. The main focus of this work was to gain more insight into the interactions with the crystal structure of PDB ID:

  3. Notes on a Political Theory of Educational Organizations.

    ERIC Educational Resources Information Center

    Bacharach, Samuel B.

    This essay reviews major trends in methodological and theoretical approaches to the study of organizations since the mid-sixties and espouses the political analysis of organizations, a position representing a middle ground between comparative structuralism and the loosely coupled systems approach. This position emphasizes micropolitics as well as…

  4. Affective Responses of Students Who Witness Classroom Cheating

    ERIC Educational Resources Information Center

    Firmin, Michael W.; Burger, Amanda; Blosser, Matthew

    2009-01-01

    For this study, 82 general psychology students (51 females, 31 males) witnessed a peer cheating while completing a test. Following the incident, we tape recorded semi-structured interviews with each student who saw the cheating event for later analysis. Using qualitative coding and methodology, themes emerged regarding students' emotional…

  5. Internal genetic structure and outcrossing rate in a natural population of Araucaria angustifolia (Bert.) O. Kuntze.

    PubMed

    Mantovani, Adelar; Morellato, L Patrícia C; Dos Reis, Maurício S

    2006-01-01

    The internal genetic structure and outcrossing rate of a population of Araucaria angustifolia (Bert.) O. Kuntze were investigated using 16 allozyme loci. Estimates of the mean number of alleles per loci (1.6), percentage of polymorphic loci (43.8%), and expected genetic diversity (0.170) were similar to those obtained for other gymnosperms. The analysis of spatial autocorrelation demonstrated the presence of internal structure in the first distance classes (up to 70 m), suggesting the presence of family structure. The outcrossing rate was high (0.956), as expected for a dioecious species. However, it was different from unity, indicating outcrossings between related individuals and corroborating the presence of internal genetic structure. The results of this study have implications for the methodologies used in conservation collections and for the use or analysis of this forest species.
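
    For readers unfamiliar with the diversity statistic quoted above, the sketch below computes expected heterozygosity (gene diversity), He = 1 - sum(p_i^2), averaged over loci from allele frequencies. The example frequencies are placeholders, and the multilocus outcrossing-rate estimation used in the paper is more involved and not reproduced here.

    ```python
    # Expected genetic diversity (expected heterozygosity) from allele frequencies.
    import numpy as np

    def expected_heterozygosity(allele_freqs_per_locus):
        """allele_freqs_per_locus: list of 1-D arrays of allele frequencies (each sums to 1)."""
        he = [1.0 - float(np.sum(np.asarray(p, dtype=float) ** 2))
              for p in allele_freqs_per_locus]
        return float(np.mean(he))                 # average over loci

    # e.g. expected_heterozygosity([[0.9, 0.1], [0.5, 0.3, 0.2], [1.0]])  # monomorphic locus -> 0
    ```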

  6. Finite Element Simulation of a Space Shuttle Solid Rocket Booster Aft Skirt Splashdown Using an Arbitrary Lagrangian-Eulerian Approach

    NASA Astrophysics Data System (ADS)

    Melis, Matthew E.

    2003-01-01

    Explicit finite element techniques employing an Arbitrary Lagrangian-Eulerian (ALE) methodology, within the transient dynamic code LS-DYNA, are used to predict splashdown loads on a proposed replacement/upgrade of the hydrazine tanks on the thrust vector control system housed within the aft skirt of a Space Shuttle Solid Rocket Booster. Two preliminary studies are performed prior to the full aft skirt analysis: An analysis of the proposed tank impacting water without supporting aft skirt structure, and an analysis of space capsule water drop tests conducted at NASA's Langley Research Center. Results from the preliminary studies provide confidence that useful predictions can be made by applying the ALE methodology to a detailed analysis of a 26-degree section of the skirt with proposed tank attached. Results for all three studies are presented and compared to limited experimental data. The challenges of using the LS-DYNA ALE capability for this type of analysis are discussed.

  8. TensorCalculator: exploring the evolution of mechanical stress in the CCMV capsid

    NASA Astrophysics Data System (ADS)

    Kononova, Olga; Maksudov, Farkhad; Marx, Kenneth A.; Barsegov, Valeri

    2018-01-01

    A new computational methodology for the accurate numerical calculation of the Cauchy stress tensor, stress invariants, principal stress components, von Mises and Tresca tensors is developed. The methodology is based on the atomic stress approach which permits the calculation of stress tensors, widely used in continuum mechanics modeling of materials properties, using the output from the MD simulations of discrete atomic and C_α -based coarse-grained structural models of biological particles. The methodology mapped into the software package TensorCalculator was successfully applied to the empty cowpea chlorotic mottle virus (CCMV) shell to explore the evolution of mechanical stress in this mechanically-tested specific example of a soft virus capsid. We found an inhomogeneous stress distribution in various portions of the CCMV structure and stress transfer from one portion of the virus structure to another, which also points to the importance of entropic effects, often ignored in finite element analysis and elastic network modeling. We formulate a criterion for elastic deformation using the first principal stress components. Furthermore, we show that von Mises and Tresca stress tensors can be used to predict the onset of a viral capsid’s mechanical failure, which leads to total structural collapse. TensorCalculator can be used to study stress evolution and dynamics of defects in viral capsids and other large-size protein assemblies.
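
    The stress measures named above follow directly from the Cauchy stress tensor; a minimal sketch of that arithmetic is given below (principal components, von Mises and Tresca equivalent stresses). It is the standard continuum-mechanics calculation, not the TensorCalculator atomic-stress implementation itself.

    ```python
    # Principal stresses, von Mises and Tresca equivalent stresses from a 3x3 Cauchy stress tensor.
    import numpy as np

    def stress_measures(sigma):
        """sigma: symmetric 3x3 Cauchy stress tensor (any consistent units)."""
        s = np.asarray(sigma, dtype=float)
        s1, s2, s3 = np.sort(np.linalg.eigvalsh(s))[::-1]    # principal stresses, s1 >= s2 >= s3
        von_mises = np.sqrt(0.5 * ((s1 - s2)**2 + (s2 - s3)**2 + (s3 - s1)**2))
        tresca = s1 - s3                                     # Tresca (stress intensity) = 2 * max shear
        return (s1, s2, s3), von_mises, tresca

    # e.g. stress_measures([[100, 30, 0], [30, 50, 0], [0, 0, -20]])
    ```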

  9. Impact of influent data frequency and model structure on the quality of WWTP model calibration and uncertainty.

    PubMed

    Cierkens, Katrijn; Plano, Salvatore; Benedetti, Lorenzo; Weijers, Stefan; de Jonge, Jarno; Nopens, Ingmar

    2012-01-01

    Application of activated sludge models (ASMs) to full-scale wastewater treatment plants (WWTPs) is still hampered by the problem of model calibration of these over-parameterised models. This either requires expert knowledge or global methods that explore a large parameter space. However, a better balance in structure between the submodels (ASM, hydraulic, aeration, etc.) and improved quality of influent data result in much smaller calibration efforts. In this contribution, a methodology is proposed that links data frequency and model structure to calibration quality and output uncertainty. It is composed of defining the model structure, the input data, an automated calibration, confidence interval computation and uncertainty propagation to the model output. Apart from the last step, the methodology is applied to an existing WWTP using three models differing only in the aeration submodel. A sensitivity analysis was performed on all models, allowing the ranking of the most important parameters to select in the subsequent calibration step. The aeration submodel proved very important to get good NH(4) predictions. Finally, the impact of data frequency was explored. Lowering the frequency resulted in larger deviations of parameter estimates from their default values and larger confidence intervals. Autocorrelation due to high frequency calibration data has an opposite effect on the confidence intervals. The proposed methodology opens doors to facilitate and improve calibration efforts and to design measurement campaigns.
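
    As a generic illustration of the automated-calibration and confidence-interval steps, the sketch below fits a placeholder model to synthetic measurements by least squares and derives approximate 95% parameter confidence intervals from the covariance matrix. The placeholder model and data are assumptions; the activated sludge model calibration in the paper is far more elaborate.

    ```python
    # Generic calibration + approximate confidence intervals (placeholder model, synthetic data).
    import numpy as np
    from scipy.optimize import curve_fit

    def model(t, k, y0):                          # placeholder first-order decay dynamics
        return y0 * np.exp(-k * t)

    t = np.linspace(0.0, 10.0, 50)
    rng = np.random.default_rng(0)
    y_obs = model(t, 0.4, 20.0) + rng.normal(0.0, 0.5, t.size)   # synthetic "measurements"

    popt, pcov = curve_fit(model, t, y_obs, p0=[0.1, 10.0])      # automated calibration
    stderr = np.sqrt(np.diag(pcov))
    ci95 = np.column_stack([popt - 1.96 * stderr, popt + 1.96 * stderr])
    print("estimates:", popt)
    print("approx. 95% confidence intervals:\n", ci95)
    ```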

  10. A stochastic multiple imputation algorithm for missing covariate data in tree-structured survival analysis.

    PubMed

    Wallace, Meredith L; Anderson, Stewart J; Mazumdar, Sati

    2010-12-20

    Missing covariate data present a challenge to tree-structured methodology due to the fact that a single tree model, as opposed to an estimated parameter value, may be desired for use in a clinical setting. To address this problem, we suggest a multiple imputation algorithm that adds draws of stochastic error to a tree-based single imputation method presented by Conversano and Siciliano (Technical Report, University of Naples, 2003). Unlike previously proposed techniques for accommodating missing covariate data in tree-structured analyses, our methodology allows the modeling of complex and nonlinear covariate structures while still resulting in a single tree model. We perform a simulation study to evaluate our stochastic multiple imputation algorithm when covariate data are missing at random and compare it to other currently used methods. Our algorithm is advantageous for identifying the true underlying covariate structure when complex data and larger percentages of missing covariate observations are present. It is competitive with other current methods with respect to prediction accuracy. To illustrate our algorithm, we create a tree-structured survival model for predicting time to treatment response in older, depressed adults. Copyright © 2010 John Wiley & Sons, Ltd.
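
    A hedged sketch of the core idea only: impute a missing covariate with a regression tree, then add stochastic error draws to obtain m imputed datasets. scikit-learn's DecisionTreeRegressor stands in for the Conversano-Siciliano imputation tree, and the Gaussian residual model is an assumption.

    ```python
    # Stochastic multiple imputation sketch: tree-based prediction + random error draws.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def stochastic_multiple_impute(X, x_missing, miss_mask, m=5, seed=0):
        """X: fully observed covariates; x_missing: covariate with NaN where miss_mask is True.
        Returns a list of m imputed copies of x_missing."""
        rng = np.random.default_rng(seed)
        tree = DecisionTreeRegressor(max_depth=4, random_state=0)
        tree.fit(X[~miss_mask], x_missing[~miss_mask])
        resid_sd = np.std(x_missing[~miss_mask] - tree.predict(X[~miss_mask]))
        base = tree.predict(X[miss_mask])
        imputed = []
        for _ in range(m):
            x_imp = x_missing.copy()
            x_imp[miss_mask] = base + rng.normal(0.0, resid_sd, base.size)  # stochastic draw
            imputed.append(x_imp)
        return imputed   # fit the tree-structured survival model once per imputed dataset
    ```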

  11. A system methodology for optimization design of the structural crashworthiness of a vehicle subjected to a high-speed frontal crash

    NASA Astrophysics Data System (ADS)

    Xia, Liang; Liu, Weiguo; Lv, Xiaojiang; Gu, Xianguang

    2018-04-01

    The structural crashworthiness design of vehicles has become an important research direction to ensure the safety of the occupants. To effectively improve the structural safety of a vehicle in a frontal crash, a system methodology is presented in this study. The surrogate model of Online support vector regression (Online-SVR) is adopted to approximate crashworthiness criteria and different kernel functions are selected to enhance the accuracy of the model. The Online-SVR model is demonstrated to have the advantages of solving highly nonlinear problems and saving training costs, and can effectively be applied for vehicle structural crashworthiness design. By combining the non-dominated sorting genetic algorithm II and Monte Carlo simulation, both deterministic optimization and reliability-based design optimization (RBDO) are conducted. The optimization solutions are further validated by finite element analysis, which shows the effectiveness of the RBDO solution in the structural crashworthiness design process. The results demonstrate the advantages of using RBDO, resulting in not only increased energy absorption and decreased structural weight from a baseline design, but also a significant improvement in the reliability of the design.
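
    The sketch below illustrates the surrogate-plus-Monte-Carlo pattern only: a support vector regression surrogate is trained on a small design of experiments and then sampled cheaply to estimate a failure probability under manufacturing scatter. scikit-learn's batch SVR stands in for Online-SVR, and the synthetic "crash response", limit and tolerances are assumptions.

    ```python
    # Surrogate-based reliability sketch: SVR response surface + Monte Carlo failure probability.
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(1)
    X_doe = rng.uniform(1.0, 3.0, size=(60, 2))        # design variables (e.g. thicknesses, mm)
    y_doe = 40.0 - 6.0 * X_doe[:, 0] - 4.0 * X_doe[:, 1] \
            + rng.normal(0.0, 0.5, 60)                 # placeholder "intrusion" response

    surrogate = SVR(kernel="rbf", C=100.0, epsilon=0.1).fit(X_doe, y_doe)

    design = np.array([2.0, 2.5])                               # candidate design
    samples = design + rng.normal(0.0, 0.05, size=(100_000, 2)) # assumed manufacturing scatter
    p_fail = float(np.mean(surrogate.predict(samples) > 18.5))  # failure: intrusion above limit
    print("estimated failure probability:", p_fail)
    ```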

  12. Satellite vulnerability to space debris - an improved 3D risk assessment methodology

    NASA Astrophysics Data System (ADS)

    Grassi, Lilith; Tiboldo, Francesca; Destefanis, Roberto; Donath, Thérèse; Winterboer, Arne; Evans, Leanne; Janovsky, Rolf; Kempf, Scott; Rudolph, Martin; Schäfer, Frank; Gelhaus, Johannes

    2014-06-01

    The work described in the present paper, performed as a part of the P2 project, presents an enhanced method to evaluate satellite vulnerability to micrometeoroids and orbital debris (MMOD), using the ESABASE2/Debris tool (developed under ESA contract). Starting from the estimation of induced failures on spacecraft (S/C) components and from the computation of lethal impacts (with an energy leading to the loss of the satellite), and considering the equipment redundancies and interactions between components, the debris-induced S/C functional impairment is assessed. The developed methodology, illustrated through its application to a case study satellite, includes the capability to estimate the number of failures on internal components, overcoming the limitations of current tools which do not allow propagating the debris cloud inside the S/C. The ballistic limit of internal equipment behind a sandwich panel structure is evaluated through the implementation of the Schäfer Ryan Lambert (SRL) Ballistic Limit Equation (BLE). The analysis conducted on the case study satellite shows the S/C vulnerability index to be in the range of about 4% over the complete mission, with a significant reduction with respect to the results typically obtained with the traditional analysis, which considers as a failure the structural penetration of the satellite structural panels. The methodology has then been applied to select design strategies (additional local shielding, relocation of components) to improve S/C protection with respect to MMOD. The results of the analyses conducted on the improved design show a reduction of the vulnerability index of about 18%.

  13. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.

  14. European Healthy Cities evaluation: conceptual framework and methodology.

    PubMed

    de Leeuw, Evelyne; Green, Geoff; Dyakova, Mariana; Spanswick, Lucy; Palmer, Nicola

    2015-06-01

    This paper presents the methodology, programme logic and conceptual framework that drove the evaluation of the Fifth Phase of the WHO European Healthy Cities Network, in which 99 cities were designated progressively through the life of the phase (2009-14). The paper establishes the values, systems and aspirations that these cities sign up for, as foundations for the selection of methodology. We assert that a realist synthesis methodology, driven by a wide range of qualitative and quantitative methods, is the most appropriate perspective to address the wide geopolitical, demographic, population and health diversities of these cities. The paper outlines the rationale for a structured multiple case study approach, the deployment of a comprehensive questionnaire, data mining through existing databases including Eurostat, and analysis of management information generation tools used throughout the period. Response rates were considered extremely high for this type of research. Non-response analyses are described, which show that the data are representative of cities across the spectrum of diversity. This paper provides a foundation for further analysis of specific areas of interest presented in this supplement. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. Decision support for redesigning wastewater treatment technologies.

    PubMed

    McConville, Jennifer R; Künzle, Rahel; Messmer, Ulrike; Udert, Kai M; Larsen, Tove A

    2014-10-21

    This paper offers a methodology for structuring the design space for innovative process engineering technology development. The methodology is exemplified in the evaluation of a wide variety of treatment technologies for source-separated domestic wastewater within the scope of the Reinvent the Toilet Challenge. It offers a methodology for narrowing down the decision-making field based on a strict interpretation of treatment objectives for undiluted urine and dry feces and macroenvironmental factors (STEEPLED analysis) which influence decision criteria. Such an evaluation identifies promising paths for technology development such as focusing on space-saving processes or the need for more innovation in low-cost, energy-efficient urine treatment methods. Critical macroenvironmental factors, such as housing density, transportation infrastructure, and climate conditions were found to affect technology decisions regarding reactor volume, weight of outputs, energy consumption, atmospheric emissions, investment cost, and net revenue. The analysis also identified a number of qualitative factors that should be carefully weighed when pursuing technology development; such as availability of O&M resources, health and safety goals, and other ethical issues. Use of this methodology allows for coevolution of innovative technology within context constraints; however, for full-scale technology choices in the field, only very mature technologies can be evaluated.

  16. Numerical Estimation of Sound Transmission Loss in Launch Vehicle Payload Fairing

    NASA Astrophysics Data System (ADS)

    Chandana, Pawan Kumar; Tiwari, Shashi Bhushan; Vukkadala, Kishore Nath

    2017-08-01

    Coupled acoustic-structural analysis of a typical launch vehicle composite payload fairing is carried out, and the results are validated with experimental data. Depending on the frequency range of interest, prediction of the vibro-acoustic behavior of a structure is usually done using the finite element method, the boundary element method or statistical energy analysis. The present study focuses on the low-frequency dynamic behavior of a composite payload fairing structure using both coupled and uncoupled vibro-acoustic finite element models up to 710 Hz. A vibro-acoustic model, characterizing the interaction between the fairing structure, air cavity, and satellite, is developed. The external sound pressure levels specified for the payload fairing's acoustic test are considered as external loads for the analysis. The analysis methodology is validated by comparing the interior noise levels with those obtained from full-scale acoustic tests conducted in a reverberation chamber. The present approach has application in the design and optimization of acoustic control mechanisms at lower frequencies.

  17. Identification and Modelling of the In-Plane Reinforcement Orientation Variations in a CFRP Laminate Produced by Manual Lay-Up

    NASA Astrophysics Data System (ADS)

    Davila, Yves; Crouzeix, Laurent; Douchin, Bernard; Collombet, Francis; Grunevald, Yves-Henri

    2017-08-01

    Reinforcement angle orientation has a significant effect on the mechanical properties of composite materials. This work presents a methodology to introduce variable reinforcement angles into finite element (FE) models of composite structures. The study of reinforcement orientation variations uses meta-models to identify and control a continuous variation across the composite ply. First, the reinforcement angle is measured through image analysis techniques applied to the composite plies during the lay-up phase. Image analysis results show that variations in the mean ply orientations are between -0.5 and 0.5°, with standard deviations ranging between 0.34 and 0.41°. An automatic post-treatment of the images determines the global and local angle variations, yielding good agreement, both visually and numerically, between the analysed images and the identified parameters. A composite plate analysed at the end of the cooling phase is presented as a case study. Here, the variations in residual strains induced by the variability in the reinforcement orientation are up to 28% of the strain field of the homogeneous FE model. The proposed methodology has shown its capability to introduce material and geometrical variability into FE analysis of layered composite structures.
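
    As a rough sketch of the angle-measurement step (in spirit only, not the authors' processing chain), the code below estimates the dominant in-plane fiber orientation of a ply image from the gradient structure tensor; the smoothing scale is an assumption.

    ```python
    # Structure-tensor estimate of the dominant in-plane fiber angle of a ply image.
    import numpy as np
    from scipy.ndimage import gaussian_filter, sobel

    def dominant_fiber_angle(img, sigma=3.0):
        """img: 2-D grayscale ply image. Returns the mean fiber angle in degrees."""
        f = img.astype(float)
        gx, gy = sobel(f, axis=1), sobel(f, axis=0)
        Jxx = gaussian_filter(gx * gx, sigma).mean()
        Jyy = gaussian_filter(gy * gy, sigma).mean()
        Jxy = gaussian_filter(gx * gy, sigma).mean()
        theta_grad = 0.5 * np.arctan2(2.0 * Jxy, Jxx - Jyy)   # direction of strongest gradient
        return np.degrees(theta_grad) + 90.0                  # fibers run perpendicular to it
    ```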

  19. Damage tolerance modeling and validation of a wireless sensory composite panel for a structural health monitoring system

    NASA Astrophysics Data System (ADS)

    Talagani, Mohamad R.; Abdi, Frank; Saravanos, Dimitris; Chrysohoidis, Nikos; Nikbin, Kamran; Ragalini, Rose; Rodov, Irena

    2013-05-01

    The paper proposes the diagnostic and prognostic modeling and test validation of a Wireless Integrated Strain Monitoring and Simulation System (WISMOS). The effort verifies a hardware and web-based software tool that is able to evaluate and optimize sensorized aerospace composite structures for the purpose of Structural Health Monitoring (SHM). The tool is an extension of an existing SHM suite based on a diagnostic-prognostic system (DPS) methodology. The goal of the extended SHM-DPS is to apply multi-scale, nonlinear, physics-based progressive failure analyses to the "as-is" structural configuration to determine residual strength, remaining service life, and future inspection intervals and maintenance procedures. The DPS solution meets the JTI Green Regional Aircraft (GRA) goals of low-weight, durable and reliable commercial aircraft. It takes advantage of the methodologies developed within the European Clean Sky JTI project WISMOS, with the capability to transmit, store and process strain data from a network of wireless sensors (e.g. strain gages, FBGA) and utilizes a DPS-based methodology, based on multi-scale progressive failure analysis (MS-PFA), to determine structural health and to advise with respect to condition-based inspection and maintenance. As part of the validation of the diagnostic and prognostic system, carbon/epoxy ASTM coupons were fabricated and tested to extract the mechanical properties. Subsequently, two composite stiffened panels were manufactured, instrumented and tested under compressive loading: (1) an undamaged stiffened buckling panel; and (2) a damaged stiffened buckling panel including an initial diamond cut. Numerical finite element models of the two panels were then developed and analyzed under test conditions using multi-scale progressive failure analysis (an extension of FEM) to evaluate the damage/fracture evolution process and to identify the contributing failure modes. The comparisons between predictions and test results were within 10% accuracy.

  20. Indirect calculation of monoclonal antibodies in nanoparticles using the radiolabeling process with technetium 99 metastable as primary factor: Alternative methodology for the entrapment efficiency.

    PubMed

    Helal-Neto, Edward; Cabezas, Santiago Sánchez; Sancenón, Félix; Martínez-Máñez, Ramón; Santos-Oliveira, Ralph

    2018-05-10

    The use of monoclonal antibodies (Mab) in current medicine is increasing. Antibody-drug conjugates (ADCs) represent an increasingly important modality for treating several types of cancer. In this area, the use of Mab associated with nanoparticles is a valuable strategy. However, the methodology used to calculate Mab entrapment, efficiency and content is extremely expensive. In this study we developed and tested a novel, very simple one-step methodology to calculate monoclonal antibody entrapment in mesoporous silica (magnetic core) nanoparticles using the radiolabeling process as the primary methodology. The magnetic-core mesoporous silica nanoparticles were successfully developed and characterised. The PXRD analysis at high angles confirmed the presence of magnetic cores in the structures, and transmission electron microscopy allowed the structure size to be determined (58.9 ± 8.1 nm). From the isotherm curve, a specific surface area of 872 m²/g was estimated, along with a pore volume of 0.85 cm³/g and an average pore diameter of 3.15 nm. The radiolabeling process used for the indirect determination worked well: trastuzumab was successfully labeled (>97%) with Tc-99m, generating a clear suspension. Moreover, almost all of the Tc-99m (labeling the trastuzumab) remained trapped on the surface of the mesoporous silica for a period as long as 8 h. The indirect methodology demonstrated a high entrapment of Tc-99m-trastuzumab on the magnetic-core mesoporous silica surface. The results confirmed the potential of the indirect entrapment-efficiency methodology, using the radiolabeling process, as a one-step, easy and cheap methodology. Copyright © 2018 Elsevier B.V. All rights reserved.
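
    The arithmetic behind the indirect determination reduces to an activity balance: the entrapment efficiency is the fraction of Tc-99m-labeled antibody activity retained with the nanoparticles after separation from the supernatant. The sketch below, with placeholder counts and an optional decay correction to a common reference time, illustrates that balance; it is not the paper's laboratory protocol.

    ```python
    # Indirect entrapment efficiency from radiolabel activity (illustrative counts).
    import math

    TC99M_HALF_LIFE_H = 6.01   # physical half-life of Tc-99m, hours

    def decay_correct(counts, elapsed_h):
        """Correct a measurement back to the common reference time."""
        return counts * math.exp(math.log(2.0) * elapsed_h / TC99M_HALF_LIFE_H)

    def entrapment_efficiency(particle_counts, supernatant_counts,
                              particle_dt_h=0.0, supernatant_dt_h=0.0):
        bound = decay_correct(particle_counts, particle_dt_h)        # activity on the nanoparticles
        free = decay_correct(supernatant_counts, supernatant_dt_h)   # activity left in solution
        return 100.0 * bound / (bound + free)                        # percent of labeled Mab entrapped

    # e.g. entrapment_efficiency(9.7e5, 0.3e5)  ->  97.0
    ```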

  1. Network Data: Statistical Theory and New Models

    DTIC Science & Technology

    2016-02-17

    During this period of review, Bin Yu worked on many thrusts of high-dimensional statistical theory and methodologies. Her research covered a wide range of topics in statistics, including analysis and methods for spectral clustering for sparse and structured networks [2,7,8,21], sparse modeling (e.g., Lasso) [4,10,11,17,18,19], statistical guarantees for the EM algorithm [3], and statistical analysis of algorithm leveraging.

  2. CORSS: Cylinder Optimization of Rings, Skin, and Stringers

    NASA Technical Reports Server (NTRS)

    Finckenor, J.; Rogers, P.; Otte, N.

    1994-01-01

    Launch vehicle designs typically make extensive use of cylindrical skin-stringer construction. Structural analysis methods are well developed for preliminary design of this type of construction. This report describes an automated, iterative method to obtain a minimum-weight preliminary design. Structural optimization has been researched extensively, and various programs have been written for this purpose. Their complexity and ease of use depend on their generality, the failure modes considered, the methodology used, and the rigor of the analysis performed. This computer program employs closed-form solutions from a variety of well-known structural analysis references and joins them with a commercially available numerical optimizer called the 'Design Optimization Tool' (DOT). Any ring- and stringer-stiffened shell structure of isotropic materials that has beam-type loading can be analyzed. Plasticity effects are not included. The program performs a more limited analysis than programs such as PANDA, but it provides an easy and useful preliminary design tool for a large class of structures. This report briefly describes the optimization theory, outlines the development and use of the program, and describes the analysis techniques that are used. Examples of program input and output, as well as the listing of the analysis routines, are included.
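
    A much-simplified sketch of the closed-form-analysis-plus-optimizer pattern is shown below: a single skin thickness of an unstiffened aluminum cylinder is sized for minimum weight against yield and classical axial buckling (with an assumed knockdown factor) using scipy's optimizer. CORSS couples far richer ring/stringer analyses to the commercial DOT optimizer; the material values, load, and knockdown factor here are assumptions.

    ```python
    # Minimum-weight sizing sketch: closed-form checks + a numerical optimizer (not CORSS/DOT).
    import numpy as np
    from scipy.optimize import minimize

    E, NU, SY, RHO = 71e9, 0.33, 280e6, 2800.0   # aluminum: modulus, Poisson, yield, density (assumed)
    R, LEN, N_AXIAL = 1.5, 4.0, 1.0e6            # cylinder radius m, length m, axial load N (assumed)
    KNOCKDOWN = 0.5                              # empirical buckling knockdown factor (assumed)

    def weight(x):
        t, = x
        return RHO * 2.0 * np.pi * R * t * LEN   # shell weight, kg

    def margins(x):
        t, = x
        sigma = N_AXIAL / (2.0 * np.pi * R * t)                            # applied axial stress
        sigma_cr = KNOCKDOWN * E * t / (R * np.sqrt(3.0 * (1.0 - NU**2)))  # classical axial buckling
        return [SY - sigma, sigma_cr - sigma]                              # both must stay >= 0

    res = minimize(weight, x0=[0.004], bounds=[(0.0005, 0.02)],
                   constraints={"type": "ineq", "fun": margins})
    print("optimal skin thickness [mm]:", 1e3 * res.x[0])
    ```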

  3. Computational simulation of composite structures with and without damage. M.S. Thesis Final Report

    NASA Technical Reports Server (NTRS)

    Wilt, Thomas F.

    1994-01-01

    A methodology is described which uses finite element analysis of various laminates to computationally simulate the effects of delamination damage initiation and growth on the structural behavior of laminated composite structures. The delamination area is expanded according to a set pattern. As the delamination area increases, how the structural response of the laminate changes with respect to buckling and strain energy release rate are investigated. Rules are presented for laminates of different configurations, materials and thickness. These results demonstrate that computational simulation methods can provide alternate methods to investigate the complex delamination damage mechanisms found in composite structures.

  4. Analysis of shell-type structures subjected to time-dependent mechanical and thermal loading

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.; Riff, R.

    1988-01-01

    This research is performed to develop a general mathematical model and solution methodologies for analyzing structural response of thin, metallic shell-type structures under large transient, cyclic or static thermomechanical loads. Among the system responses, which are associated with these load conditions, are thermal buckling, creep buckling, and ratcheting. Thus, geometric as well as material-type nonlinearities (of high order) can be anticipated and must be considered in the development of the mathematical model. Furthermore, this must also be accommodated in the solution procedures.

  5. Analysis of shell-type structures subjected to time-dependent mechanical and thermal loading

    NASA Technical Reports Server (NTRS)

    Simitses, George J.

    1990-01-01

    The development of a general mathematical model and solution methodologies for analyzing structural response of thin, metallic shell-like structures under dynamic and/or static thermomechanical loads is examined. In the mathematical model, geometric as well as material-type of nonlinearities are considered. Traditional as well as novel approaches are reported and detailed applications are presented in the appendices. The emphasis for the mathematical model, the related solution schemes, and the applications, is on thermal viscoelastic and viscoplastic phenomena, which can predict creep and ratchetting.

  6. Analysis of shell-type structures subjected to time-dependent mechanical and thermal loading

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.

    1991-01-01

    This report deals with the development of a general mathematical model and solution methodology for analyzing the structural response of thin, metallic shell-like structures under dynamic and/or static thermomechanical loads. In the mathematical model, geometric as well as the material-type of nonlinearities are considered. Traditional as well as novel approaches are reported and detailed applications are presented in the appendices. The emphasis for the mathematical model, the related solution schemes, and the applications, is on thermal viscoelastic and viscoplastic phenomena, which can predict creep and ratchetting.

  7. Structural design methodologies for ceramic-based material systems

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Chulya, Abhisak; Gyekenyesi, John P.

    1991-01-01

    One of the primary pacing items for realizing the full potential of ceramic-based structural components is the development of new design methods and protocols. The focus here is on low temperature, fast-fracture analysis of monolithic, whisker-toughened, laminated, and woven ceramic composites. A number of design models and criteria are highlighted. Public domain computer algorithms, which aid engineers in predicting the fast-fracture reliability of structural components, are mentioned. Emphasis is not placed on evaluating the models, but instead is focused on the issues relevant to the current state of the art.

  8. An AI-based approach to structural damage identification by modal analysis

    NASA Technical Reports Server (NTRS)

    Glass, B. J.; Hanagud, S.

    1990-01-01

    Flexible-structure damage is presently addressed by a combined model- and parameter-identification approach which employs the AI methodologies of classification, heuristic search, and object-oriented model knowledge representation. The conditions for model-space search convergence to the best model are discussed in terms of search-tree organization and initial model parameter error. In the illustrative example of a truss structure presented, the use of both model and parameter identification is shown to lead to smaller parameter corrections than would be required by parameter identification alone.

  9. Analysis of shell-type structures subjected to time-dependent mechanical and thermal loading

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.

    1989-01-01

    The objective is to develop a general mathematical model and solution methodologies for analyzing structural response of thin, metallic shell-type structures under large transient, cyclic, or static thermomechanical loads. Among the system responses associated with these load conditions are thermal buckling, creep buckling, and ratcheting. Thus, geometric as well as material-type nonlinearities (of high order) can be anticipated and must be considered in the development of the mathematical model. Furthermore, this must also be accommodated in the solution procedures.

  10. Analysis of shell-type structures subjected to time-dependent mechanical and thermal loading

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.; Riff, R.

    1988-01-01

    The objective of this research is to develop a general mathematical model and solution methodologies for analyzing structural response of thin, metallic shell-type structures under large transient, cyclic or static thermomechanical loads. Among the system responses associated with these load conditions are thermal buckling, creep buckling and ratcheting. Thus, geometric as well as material-type nonlinearities (of high order) can be anticipated and must be considered in the development of the mathematical model. Furthermore, this must also be accommodated in the solution procedures.

  11. Recent Advances in Characterization of Lignin Polymer by Solution-State Nuclear Magnetic Resonance (NMR) Methodology

    PubMed Central

    Wen, Jia-Long; Sun, Shao-Long; Xue, Bai-Liang; Sun, Run-Cang

    2013-01-01

    The demand for efficient utilization of biomass induces a detailed analysis of the fundamental chemical structures of biomass, especially the complex structures of lignin polymers, which have long been recognized for their negative impact on biorefinery. Traditionally, it has been attempted to reveal the complicated and heterogeneous structure of lignin by a series of chemical analyses, such as thioacidolysis (TA), nitrobenzene oxidation (NBO), and derivatization followed by reductive cleavage (DFRC). Recent advances in nuclear magnetic resonance (NMR) technology undoubtedly have made solution-state NMR become the most widely used technique in structural characterization of lignin due to its versatility in illustrating structural features and structural transformations of lignin polymers. As one of the most promising diagnostic tools, NMR provides unambiguous evidence for specific structures as well as quantitative structural information. The recent advances in two-dimensional solution-state NMR techniques for structural analysis of lignin in isolated and whole cell wall states (in situ), as well as their applications are reviewed. PMID:28809313

  12. Micromechanics Fatigue Damage Analysis Modeling for Fabric Reinforced Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Xue, D.; Shi, Y.

    2013-01-01

    A micromechanics analysis modeling method was developed to analyze the damage progression and fatigue failure of fabric-reinforced composite structures, especially brittle ceramic matrix composites. A repeating unit cell concept of fabric-reinforced composites was used to represent the global composite structure. The thermal and mechanical properties of the repeating unit cell were considered to be the same as those of the global composite structure. The three-phase micromechanics, shear-lag, and continuum fracture mechanics models were integrated with a statistical model in the repeating unit cell to predict the progressive damage and fatigue life of the composite structures. Global structural failure was defined as the loss of loading capability of the repeating unit cell, which depends on the stiffness reduction due to material slice failures and nonlinear material properties in the repeating unit cell. The present methodology is demonstrated with analysis results evaluated against experimental tests performed on carbon fiber reinforced silicon carbide matrix plain-weave composite specimens.

  13. Recommendations for benefit-risk assessment methodologies and visual representations.

    PubMed

    Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul; Goginsky, Alesia; Chan, Edmond; Downey, Gerald F; Hallgreen, Christine E; Hockley, Kimberley S; Juhaeri, Juhaeri; Lieftucht, Alfons; Metcalf, Marilyn A; Noel, Rebecca A; Phillips, Lawrence D; Ashby, Deborah; Micaleff, Alain

    2016-03-01

    The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. Eight case studies based on the benefit-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. A general pathway through the case studies was evident, with various classes of methodologies having roles to play at different stages. Descriptive and quantitative frameworks were widely used throughout to structure problems, with other methods such as metrics, estimation techniques and elicitation techniques providing ways to incorporate technical or numerical data from various sources. Similarly, tree diagrams and effects tables were universally adopted, with other visualisations available to suit specific methodologies or tasks as required. Every assessment was found to follow five broad stages: (i) Planning, (ii) Evidence gathering and data preparation, (iii) Analysis, (iv) Exploration and (v) Conclusion and dissemination. Adopting formal, structured approaches to benefit-risk assessment was feasible in real-world problems and facilitated clear, transparent decision-making. Prior to this work, no extensive practical application and appraisal of methodologies had been conducted using real-world case examples, leaving users with limited knowledge of their usefulness in the real world. The practical guidance provided here takes us one step closer to a harmonised approach to benefit-risk assessment from multiple perspectives. Copyright © 2016 John Wiley & Sons, Ltd.

  14. Towards adaptive and integrated management paradigms to meet the challenges of water governance.

    PubMed

    Halbe, J; Pahl-Wostl, C; Sendzimir, J; Adamowski, J

    2013-01-01

    Integrated Water Resource Management (IWRM) aims at finding practical and sustainable solutions to water resource issues. Research and practice have shown that innovative methods and tools are not sufficient to implement IWRM - the concept needs to also be integrated in prevailing management paradigms and institutions. Water governance science addresses this human dimension by focusing on the analysis of regulatory processes that influence the behavior of actors in water management systems. This paper proposes a new methodology for the integrated analysis of water resources management and governance systems in order to elicit and analyze case-specific management paradigms. It builds on the Management and Transition Framework (MTF) that allows for the examination of structures and processes underlying water management and governance. The new methodology presented in this paper combines participatory modeling and analysis of the governance system by using the MTF to investigate case-specific management paradigms. The linking of participatory modeling and research on complex management and governance systems allows for the transfer of knowledge between scientific, policy, engineering and local communities. In this way, the proposed methodology facilitates assessment and implementation of transformation processes towards IWRM that require also the adoption of adaptive management principles. A case study on flood management in the Tisza River Basin in Hungary is provided to illustrate the application of the proposed methodology.

  15. Conflicts in developing countries: a case study from Rio de Janeiro

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bredariol, Celso Simoes; Magrini, Alessandra

    In developing countries, environmental conflicts are resolved mainly in the political arena. In the developed nations, approaches favoring structured negotiation support techniques are more common, with methodologies and studies designed especially for this purpose, deriving from Group Communications and Decision Theory. This paper analyzes an environmental dispute in the City of Rio de Janeiro, applying conflict analysis methods and simulating its settlement. It concludes that the use of these methodologies in the developing countries may be undertaken with adaptations, designed to train community groups in negotiating while fostering the democratization of the settlement of these disputes.

  16. Predictive Inference Using Latent Variables with Covariates*

    PubMed Central

    Schofield, Lynne Steuerle; Junker, Brian; Taylor, Lowell J.; Black, Dan A.

    2014-01-01

    Plausible Values (PVs) are a standard multiple imputation tool for analysis of large education survey data that measures latent proficiency variables. When latent proficiency is the dependent variable, we reconsider the standard institutionally-generated PV methodology and find it applies with greater generality than shown previously. When latent proficiency is an independent variable, we show that the standard institutional PV methodology produces biased inference because the institutional conditioning model places restrictions on the form of the secondary analysts’ model. We offer an alternative approach that avoids these biases based on the mixed effects structural equations (MESE) model of Schofield (2008). PMID:25231627
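
    A minimal sketch of the multiple-imputation arithmetic behind plausible values (Rubin's rules for combining per-PV analyses). This illustrates the standard PV workflow only, not the MESE model proposed in the paper, and all numbers are placeholders.

```python
import numpy as np

def combine_plausible_values(estimates, variances):
    """Combine per-PV point estimates and sampling variances with Rubin's rules."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    m = len(estimates)
    q_bar = estimates.mean()             # combined point estimate
    u_bar = variances.mean()             # within-imputation variance
    b = estimates.var(ddof=1)            # between-imputation variance
    total_var = u_bar + (1 + 1 / m) * b  # total variance
    return q_bar, total_var

# Example: a regression slope re-estimated once for each of 5 plausible values.
slopes = [0.42, 0.45, 0.40, 0.44, 0.43]
slope_vars = [0.010, 0.011, 0.009, 0.010, 0.012]
est, var = combine_plausible_values(slopes, slope_vars)
print(f"combined slope = {est:.3f}, std. error = {var**0.5:.3f}")
```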

  17. Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.

    1989-01-01

    The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory to provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized. The first program is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization and analytical sensitivities. The second program provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application the current activities associated with the Active Flexible Wing project are reviewed.

  18. Tsunami and shelf resonance on the northern Chile coast

    NASA Astrophysics Data System (ADS)

    Cortés, Pablo; Catalán, Patricio A.; Aránguiz, Rafael; Bellotti, Giorgio

    2017-09-01

    This work presents an analysis of long-wave resonance in two of the main cities along the northern coast of Chile, Arica and Iquique, where a large tsunamigenic potential remains despite recent earthquakes. By combining a modal analysis, solving the equation of free surface oscillations, with the analysis of background spectra derived from in situ measurements, the spatial and temporal structures of the modes are recovered. Comparison with spectra from three tsunamis of different characteristics shows that the modes found have been excited by past events. Moreover, the two locations show different response patterns: Arica is more sensitive to the characteristics of the tsunami source, whereas Iquique shows a smaller dependency and a similar response for different tsunami events. Results compare well with those of other methodologies. These findings are relevant for characterizing the tsunami hazard in the area, and the methodology can be extended to other regions along the Chilean coast.
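
    A minimal sketch of the background-spectrum half of such an analysis, assuming a de-tided sea-level record stored in a hypothetical file and sampled once per minute; the modal eigen-analysis of the free-surface equations is not reproduced here.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical de-tided sea-level residual record, sampled every 60 s (assumed file).
fs = 1.0 / 60.0                      # sampling frequency in Hz
eta = np.load("arica_residual.npy")  # residual water level (m), name is an assumption

# Welch estimate of the background spectrum; spectral peaks suggest resonant shelf/bay modes.
freqs, psd = welch(eta, fs=fs, nperseg=4096)
periods_min = 1.0 / freqs[1:] / 60.0
print("dominant oscillation period (min):", periods_min[np.argmax(psd[1:])])
```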

  19. Structure and dynamics of European sports science textual contents: Analysis of ECSS abstracts (1996-2014).

    PubMed

    Hristovski, Robert; Aceski, Aleksandar; Balague, Natalia; Seifert, Ludovic; Tufekcievski, Aleksandar; Cecilia, Aguirre

    2017-02-01

    The article discusses the general structure and dynamics of sports science research content as obtained from the analysis of 21,998 European College of Sport Science abstracts belonging to 12 science topics. The structural analysis showed intertwined multidisciplinary and unifying tendencies structured along horizontal (scope) and vertical (level) axes. Methodological (instrumental and mode-of-inquiry) integrative tendencies are dominant. Theoretical integrative tendencies are much less detectable along both horizontal and vertical axes. The dynamic analysis of the abstract text content over the 19 years reveals the contextualizing and guiding role of the thematic skeletons of each sports science topic in forming more detailed contingent research ideas, and the role of the latter in stabilizing and procreating the former. This circular causality between the two hierarchical levels, each functioning on its own characteristic time scale, is crucial for understanding how stable research traditions self-maintain and self-procreate through innovative contingencies. The structure of sports science continuously rebuilds itself through the use and re-use of contingent research ideas. The thematic skeleton ensures its identity, and the contingent conceptual sets ensure its flexibility and adaptability to different research or applicative problems.

  20. Multidisciplinary design and optimization (MDO) methodology for the aircraft conceptual design

    NASA Astrophysics Data System (ADS)

    Iqbal, Liaquat Ullah

    An integrated design and optimization methodology has been developed for the conceptual design of an aircraft. The methodology brings higher fidelity Computer Aided Design, Engineering and Manufacturing (CAD, CAE and CAM) Tools such as CATIA, FLUENT, ANSYS and SURFCAM into the conceptual design by utilizing Excel as the integrator and controller. The approach is demonstrated to integrate with many of the existing low to medium fidelity codes such as the aerodynamic panel code called CMARC and sizing and constraint analysis codes, thus providing the multi-fidelity capabilities to the aircraft designer. The higher fidelity design information from the CAD and CAE tools for the geometry, aerodynamics, structural and environmental performance is provided for the application of the structured design methods such as the Quality Function Deployment (QFD) and the Pugh's Method. The higher fidelity tools bring the quantitative aspects of a design such as precise measurements of weight, volume, surface areas, center of gravity (CG) location, lift over drag ratio, and structural weight, as well as the qualitative aspects such as external geometry definition, internal layout, and coloring scheme early in the design process. The performance and safety risks involved with the new technologies can be reduced by modeling and assessing their impact more accurately on the performance of the aircraft. The methodology also enables the design and evaluation of the novel concepts such as the blended (BWB) and the hybrid wing body (HWB) concepts. Higher fidelity computational fluid dynamics (CFD) and finite element analysis (FEA) allow verification of the claims for the performance gains in aerodynamics and ascertain risks of structural failure due to different pressure distribution in the fuselage as compared with the tube and wing design. The higher fidelity aerodynamics and structural models can lead to better cost estimates that help reduce the financial risks as well. This helps in achieving better designs with reduced risk in lesser time and cost. The approach is shown to eliminate the traditional boundary between the conceptual and the preliminary design stages, combining the two into one consolidated preliminary design phase. Several examples for the validation and utilization of the Multidisciplinary Design and Optimization (MDO) Tool are presented using missions for the Medium and High Altitude Long Range/Endurance Unmanned Aerial Vehicles (UAVs).

  1. Detailed methodology for high resolution scanning electron microscopy (SEM) of murine malaria parasitized-erythrocytes.

    PubMed

    Hayakawa, Eri H; Matsuoka, Hiroyuki

    2016-10-01

    Scanning electron microscopy (SEM) is a powerful tool used to investigate object surfaces and has been widely applied in both materials science and biology. With respect to the study of malaria, SEM revealed that erythrocytes infected with Plasmodium falciparum, a human parasite, display 'knob-like' structures on their surface comprising parasitized proteins. However, detailed methodology for SEM studies of malaria parasites is lacking in the literature, making such studies challenging. Here, we provide a step-by-step guide to preparing Plasmodium-infected erythrocytes from two mouse strains for SEM analysis with minimal structural deterioration. We tested three species of murine malaria parasites, P. berghei, P. yoelii, and P. chabaudi, as well as non-parasitized human erythrocytes and P. falciparum-infected erythrocytes for comparison. Our data demonstrated that the surface structures of parasitized erythrocytes were indistinguishable between the three species of murine parasites in the two different strains of mice, and no surface alterations were observed in P. falciparum-infected erythrocytes. Our SEM observations contribute towards an understanding of the molecular mechanisms of parasite maturation in the erythrocyte cytoplasm and, along with future studies using our detailed methodology, may help to gain insight into the clinical phenomena of human malaria. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  2. Development of methodology for horizontal axis wind turbine dynamic analysis

    NASA Technical Reports Server (NTRS)

    Dugundji, J.

    1982-01-01

    Horizontal axis wind turbine dynamics were studied. The following findings are summarized: (1) review of the MOSTAS computer programs for dynamic analysis of horizontal axis wind turbines; (2) review of various analysis methods for rotating systems with periodic coefficients; (3) review of structural dynamics analysis tools for large wind turbines; (4) experiments on the yaw characteristics of a rotating rotor; (5) development of a finite element model for rotors; (6) development of simple models for aeroelasticity; and (7) development of simple models for the stability and response of wind turbines on flexible towers.

  3. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    PubMed

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

    Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV) infection. Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment has proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images have been employed, obtaining a 1.31% mean absolute CPA error with a 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
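
    A condensed sketch of the three-stage idea (clustering for background/tissue separation, a classifier hook for excluding non-liver regions, clustering again for collagen) using scikit-learn. The features, models, and thresholds of the published methodology are not reproduced; the colour heuristics below are invented for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

def cpa_from_image(rgb, tissue_classifier=None):
    """Rough CPA estimate: collagen pixels / tissue pixels (illustrative pipeline only)."""
    pixels = rgb.reshape(-1, 3).astype(float)

    # Stage 1: separate background from tissue by clustering pixel colours into 2 groups;
    # the brighter cluster is taken as background (invented heuristic).
    km_bg = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
    background_label = np.argmax(km_bg.cluster_centers_.mean(axis=1))
    tissue_mask = km_bg.labels_ != background_label

    # Stage 2 (hook only): a trained classifier would remove non-liver regions such as
    # blood clots, muscle tissue, or structural collagen.
    if tissue_classifier is not None:
        keep = tissue_classifier.predict(pixels[tissue_mask]).astype(bool)
        idx = np.flatnonzero(tissue_mask)
        tissue_mask[idx[~keep]] = False

    # Stage 3: within tissue, cluster again and call the redder cluster "collagen"
    # (a stand-in for the fibrosis-detection step).
    tissue_pixels = pixels[tissue_mask]
    km_fib = KMeans(n_clusters=2, n_init=10, random_state=0).fit(tissue_pixels)
    collagen_label = np.argmax(km_fib.cluster_centers_[:, 0])  # highest red channel
    collagen_fraction = np.mean(km_fib.labels_ == collagen_label)
    return 100.0 * collagen_fraction
```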

  4. Social interaction in management group meetings: a case study of Finnish hospital.

    PubMed

    Laapotti, Tomi; Mikkola, Leena

    2016-06-20

    Purpose - The purpose of this paper is to understand the role of management group meetings (MGMs) in hospital organization by examining the social interaction in these meetings. Design/methodology/approach - This case study approaches social interaction from a structuration point of view. Social network analysis and qualitative content analysis are applied. Findings - The findings show that MGMs are mainly forums for information sharing. Meetings are not held for problem solving or decision making, and operational coordinating is limited. Meeting interaction is very much focused on the chair, and most of the discussion takes place between the chair and one other member, not between members. The organizational structures are maintained and reproduced in the meeting interaction, and they appear to limit discussion. Meetings appear to fulfil their goals as a part of the organization's information structure and to some extent as an instrument for management. The significance of the relational side of MGMs was recognized. Research limitations/implications - The results of this study provide a basis for future research on hospital MGMs with wider datasets and other methodologies. Especially the relational role of MGMs needs more attention. Practical implications - The goals of MGMs should be reviewed and MG members should be made aware of meeting interaction structures. Originality/value - The paper provides new knowledge about interaction networks in hospital MGMs, and describes the complexity of the importance of MGMs for hospitals.

  5. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
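
    For orientation, the Classical Guyan Reduction that MGR and HR modify condenses the system onto a retained ("analysis") DOF set by static condensation; a standard statement of the transformation (notation ours, not necessarily the paper's) is:

```latex
% Partition DOFs into retained (a) and omitted (o) sets:
% [K_aa  K_ao][u_a]   [f_a]
% [K_oa  K_oo][u_o] = [ 0 ]
T = \begin{bmatrix} I \\ -K_{oo}^{-1} K_{oa} \end{bmatrix}, \qquad
K_r = T^{\mathsf T} K T, \qquad M_r = T^{\mathsf T} M T .
```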

  6. Testing and injury potential analysis of rollovers with narrow object impacts.

    PubMed

    Meyer, Steven E; Forrest, Stephen; Herbst, Brian; Hayden, Joshua; Orton, Tia; Sances, Anthony; Kumaresan, Srirangam

    2004-01-01

    Recent statistics highlight the significant risk of serious and fatal injuries to occupants involved in rollover collisions due to excessive roof crush. The government has reported that in 2002, sport utility vehicle (SUV) rollover-related fatalities increased by 14% to more than 2400 annually, and 61% of all SUV fatalities involved rollovers [1]. Rollover crashes rely primarily upon the roof structures to maintain occupant survival space. Frequently these crashes occur off the travel lanes of the roadway and, therefore, can include impacts with various types of narrow objects such as light poles, utility poles and/or trees. A test device and methodology are presented that facilitate dynamic, repeatable rollover impact evaluation of complete vehicle roof structures with such narrow objects. These tests allow for the incorporation of Anthropomorphic Test Dummies (ATDs), which can be instrumented to measure accelerations, forces and moments to evaluate injury potential. High-speed video permits detailed analysis of occupant kinematics and evaluation of injury causation. Criteria such as restraint performance, injury potential, survival space and the effect of roof crush associated with various types of design alternatives, countermeasures and impact circumstances can also be evaluated. In addition to presentation of the methodology, two representative vehicle crash tests are also reported. Results indicated that the reinforced roof structure significantly reduced the roof deformation compared to the production roof structure.

  7. Hidden Markov model analysis of force/torque information in telemanipulation

    NASA Technical Reports Server (NTRS)

    Hannaford, Blake; Lee, Paul

    1991-01-01

    A model for the prediction and analysis of sensor information recorded during robotic performance of telemanipulation tasks is presented. The model uses the hidden Markov model to describe the task structure, the operator's or intelligent controller's goal structure, and the sensor signals. A methodology for constructing the model parameters based on engineering knowledge of the task is described. It is concluded that the model and its optimal state estimation algorithm, the Viterbi algorithm, are very successful at segmenting the data record into phases corresponding to subgoals of the task. The model provides a rich modeling structure within a statistical framework, which enables it to represent complex systems and be robust to real-world sensory signals.
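
    A compact sketch of the Viterbi decoding step referred to above, for a discrete-observation HMM; in this setting the force/torque signals would first be quantized into such symbols, and all model parameters below are placeholders.

```python
import numpy as np

def viterbi(obs, log_pi, log_A, log_B):
    """Most likely state sequence for a discrete HMM (log-domain Viterbi)."""
    n_states = log_pi.shape[0]
    T = len(obs)
    delta = np.empty((T, n_states))           # best log-probability ending in each state
    psi = np.zeros((T, n_states), dtype=int)  # back-pointers
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A       # (from_state, to_state)
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    # Backtrack to recover the segmentation into task phases/subgoals.
    states = np.empty(T, dtype=int)
    states[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        states[t] = psi[t + 1, states[t + 1]]
    return states

# Placeholder 3-phase model with 4 quantized force/torque symbols.
pi = np.array([0.6, 0.2, 0.2])
A = np.array([[0.90, 0.05, 0.05],
              [0.05, 0.90, 0.05],
              [0.05, 0.05, 0.90]])
B = np.array([[0.7, 0.1, 0.1, 0.1],
              [0.1, 0.7, 0.1, 0.1],
              [0.1, 0.1, 0.1, 0.7]])
obs = [0, 0, 1, 1, 1, 3, 3]
print(viterbi(obs, np.log(pi), np.log(A), np.log(B)))
```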

  8. Lattice Truss Structural Response Using Energy Methods

    NASA Technical Reports Server (NTRS)

    Kenner, Winfred Scottson

    1996-01-01

    A deterministic methodology is presented for developing closed-form deflection equations for two-dimensional and three-dimensional lattice structures. Four types of lattice structures are studied: beams, plates, shells and soft lattices. Castigliano's second theorem, which entails the total strain energy of a structure, is utilized to generate highly accurate results. Derived deflection equations provide new insight into the bending and shear behavior of the four types of lattices, in contrast to classic solutions of similar structures. Lattice derivations utilizing kinetic energy are also presented, and used to examine the free vibration response of simple lattice structures. Derivations utilizing finite element theory for unique lattice behavior are also presented and validated using the finite element analysis code EAL.
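
    As a pointer to the underlying relation (standard notation, not the report's specific derivations), Castigliano's second theorem for a pin-jointed lattice gives each deflection as a partial derivative of the total strain energy:

```latex
U = \sum_{j} \frac{N_j^{2} L_j}{2 A_j E_j}, \qquad
\delta_i = \frac{\partial U}{\partial P_i}
         = \sum_{j} \frac{N_j L_j}{A_j E_j}\,\frac{\partial N_j}{\partial P_i},
```

    where N_j, L_j, A_j, and E_j are the member force, length, cross-sectional area, and modulus, and P_i is the load collocated with the deflection of interest.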

  9. Structural zeros in high-dimensional data with applications to microbiome studies.

    PubMed

    Kaul, Abhishek; Davidov, Ori; Peddada, Shyamal D

    2017-07-01

    This paper is motivated by the recent interest in the analysis of high-dimensional microbiome data. A key feature of these data is the presence of "structural zeros" which are microbes missing from an observation vector due to an underlying biological process and not due to error in measurement. Typical notions of missingness are unable to model these structural zeros. We define a general framework which allows for structural zeros in the model and propose methods of estimating sparse high-dimensional covariance and precision matrices under this setup. We establish error bounds in the spectral and Frobenius norms for the proposed estimators and empirically verify them with a simulation study. The proposed methodology is illustrated by applying it to the global gut microbiome data of Yatsunenko and others (2012. Human gut microbiome viewed across age and geography. Nature 486, 222-227). Using our methodology we classify subjects according to the geographical location on the basis of their gut microbiome. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  10. The leverage of demographic dynamics on carbon dioxide emissions: does age structure matter?

    PubMed

    Zagheni, Emilio

    2011-02-01

    This article provides a methodological contribution to the study of the effect of changes in population age structure on carbon dioxide (CO2) emissions. First, I propose a generalization of the IPAT equation to a multisector economy with an age-structured population and discuss the insights that can be obtained in the context of stable population theory. Second, I suggest a statistical model of household consumption as a function of household size and age structure to quantitatively evaluate the extent of economies of scale in consumption of energy-intensive goods, and to estimate age-specific profiles of consumption of energy-intensive goods and of CO2 emissions. Third, I offer an illustration of the methodologies using data for the United States. The analysis shows that per-capita CO2 emissions increase with age until the individual is in his or her 60s, and then emissions tend to decrease. Holding everything else constant, the expected change in U.S. population age distribution during the next four decades is likely to have a small, but noticeable, positive impact on CO2 emissions.
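
    For context, the classical IPAT identity that the article generalizes, together with the age-structured reading suggested by the abstract, can be written as (standard symbols, not necessarily the article's notation):

```latex
I = P \cdot A \cdot T
\quad\longrightarrow\quad
I(t) = \sum_{a} P_a(t)\, e_a ,
```

    where P_a(t) is the population in age group a and e_a is an estimated age-specific per-capita emission profile.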

  11. Analysis of the tenderisation of jumbo squid (Dosidicus gigas) meat by ultrasonic treatment using response surface methodology.

    PubMed

    Hu, Yaqin; Yu, Hiaxia; Dong, Kaicheng; Yang, Shuibing; Ye, Xingqian; Chen, Shiguo

    2014-10-01

    Due to its unique structure, jumbo squid (Dosidicus gigas) meat is sensitive to heat treatment, which makes the traditional squid products taste tough and hard. This study aimed to tenderise jumbo squid meat through ultrasonic treatment. Response surface methodology (RSM) was used to predict the tenderising effect of various treatment conditions. According to the results of RSM, the optimal conditions appeared to be a power of 186.9 W, a frequency of 25.6 kHz, and a time of 30.8 min, and the predicted values of flexibility and firmness under these optimal conditions were 2.40 mm and 435.1 g, respectively. Protein degradation and a broken muscle fibre structure were observed through histological assay and SDS-PAGE, which suggests a satisfactory tenderisation effect. Copyright © 2014. Published by Elsevier Ltd.
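
    A minimal sketch of the response-surface step: fit a full second-order model of a response (e.g., firmness) to the three coded factors (power, frequency, time) by least squares. The design points and coefficients below are synthetic, not the paper's data.

```python
import numpy as np
from itertools import combinations_with_replacement

def quadratic_design_matrix(X):
    """Columns: 1, x_i, and x_i*x_j (i<=j) -- the usual second-order RSM model."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i, j in combinations_with_replacement(range(k), 2)]
    return np.column_stack(cols)

# Placeholder design in coded units for (power, frequency, time) and a synthetic response.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(20, 3))
firmness = (430 + 15 * X[:, 0] - 10 * X[:, 2] + 8 * X[:, 0] * X[:, 2]
            - 12 * X[:, 1] ** 2 + rng.normal(0, 3, 20))

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), firmness, rcond=None)
print("fitted second-order coefficients:", np.round(beta, 2))
```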

  12. Rocketdyne PSAM: In-house enhancement/application

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ohara, K.

    1991-01-01

    The development was initiated of the Probabilistic Design Analysis (PDA) Process for rocket engines. This will enable engineers a quantitative assessment of calculated reliability during the design process. The PDA will help choose better designs, make them more robust, and help decide on critical tests to help demonstrate key reliability issues to aid in improving the confidence of the engine capabilities. Rockedyne's involvement with the Composite Loads Spectra (CLS) and Probabilistic Structural Analysis Methodology (PSAM) contracts started this effort and are key elements in the on-going developments. Internal development efforts and hardware applications complement and extend the CLS and PSAM efforts. The completion of the CLS option work and the follow-on PSAM developments will also be integral parts of this methodology. A brief summary of these efforts is presented.

  13. Time-Varying, Multi-Scale Adaptive System Reliability Analysis of Lifeline Infrastructure Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Kurtz, Nolan Scot

    2014-09-01

    The majority of current societal and economic needs world-wide are met by the existing networked, civil infrastructure. Because the cost of managing such infrastructure is high and increases with time, risk-informed decision making is essential for those with management responsibilities for these systems. To address such concerns, a methodology that accounts for new information, deterioration, component models, component importance, group importance, network reliability, hierarchical structure organization, and efficiency concerns has been developed. This methodology analyzes the use of new information through the lens of adaptive Importance Sampling for structural reliability problems. Deterioration, multi-scale bridge models, and time-variant component importance are investigated for a specific network. Furthermore, both bridge and pipeline networks are studied for group and component importance, as well as for hierarchical structures in the context of specific networks. Efficiency is the primary driver throughout this study. With this risk-informed approach, those responsible for management can address deteriorating infrastructure networks in an organized manner.
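
    A toy sketch of the importance-sampling idea behind such reliability analyses: sample from a density shifted toward the failure region and re-weight by the likelihood ratio. The limit state and distributions are invented, and the adaptive updating of the sampling density used in the study is omitted.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

def g(r, s):
    """Limit-state function: failure when capacity minus demand is negative."""
    return r - s

# Nominal (true) distributions of resistance and load (illustrative).
mu = np.array([5.0, 3.0])
sigma = np.array([0.5, 0.8])

# Importance density: shift the means toward an approximate design point on g = 0.
mu_is = np.array([4.0, 4.0])

n = 20000
x = rng.normal(mu_is, sigma, size=(n, 2))
fail = g(x[:, 0], x[:, 1]) < 0.0

# Likelihood ratio of nominal density to importance density, evaluated per sample.
w = np.prod(norm.pdf(x, mu, sigma) / norm.pdf(x, mu_is, sigma), axis=1)
pf = np.mean(fail * w)
print(f"estimated probability of failure: {pf:.2e}")
```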

  14. Evaluating multiple determinants of the structure of plant-animal mutualistic networks.

    PubMed

    Vázquez, Diego P; Chacoff, Natacha P; Cagnolo, Luciano

    2009-08-01

    The structure of mutualistic networks is likely to result from the simultaneous influence of neutrality and the constraints imposed by complementarity in species phenotypes, phenologies, spatial distributions, phylogenetic relationships, and sampling artifacts. We develop a conceptual and methodological framework to evaluate the relative contributions of these potential determinants. Applying this approach to the analysis of a plant-pollinator network, we show that information on relative abundance and phenology suffices to predict several aggregate network properties (connectance, nestedness, interaction evenness, and interaction asymmetry). However, such information falls short of predicting the detailed network structure (the frequency of pairwise interactions), leaving a large amount of variation unexplained. Taken together, our results suggest that both relative species abundance and complementarity in spatiotemporal distribution contribute substantially to generating the observed network patterns, but that this information is by no means sufficient to predict the occurrence and frequency of pairwise interactions. Future studies could use our methodological framework to evaluate the generality of our findings in a representative sample of study systems with contrasting ecological conditions.

  15. The case for multimodal analysis of atypical interaction: questions, answers and gaze in play involving a child with autism.

    PubMed

    Muskett, Tom; Body, Richard

    2013-01-01

    Conversation analysis (CA) continues to accrue interest within clinical linguistics as a methodology that can enable elucidation of structural and sequential orderliness in interactions involving participants who produce ostensibly disordered communication behaviours. However, it can be challenging to apply CA to re-examine clinical phenomena that have initially been defined in terms of linguistics, as a logical starting point for analysis may be to focus primarily on the organisation of language ("talk") in such interactions. In this article, we argue that CA's methodological power can only be fully exploited in this research context when a multimodal analytic orientation is adopted, where due consideration is given to participants' co-ordinated use of multiple semiotic resources including, but not limited to, talk (e.g., gaze, embodied action, object use and so forth). To evidence this argument, a two-layered analysis of unusual question-answer sequences in a play episode involving a child with autism is presented. It is thereby demonstrated that only when the scope of enquiry is broadened to include gaze and other embodied action can an account be generated of orderliness within these sequences. This finding has important implications for CA's application as a research methodology within clinical linguistics.

  16. NESSUS (Numerical Evaluation of Stochastic Structures Under Stress)/EXPERT: Bridging the gap between artificial intelligence and FORTRAN

    NASA Technical Reports Server (NTRS)

    Fink, Pamela K.; Palmer, Karol K.

    1988-01-01

    The development of a probabilistic structural analysis methodology (PSAM) is described. In the near-term, the methodology will be applied to designing critical components of the next generation space shuttle main engine. In the long-term, PSAM will be applied very broadly, providing designers with a new technology for more effective design of structures whose character and performance are significantly affected by random variables. The software under development to implement the ideas developed in PSAM resembles, in many ways, conventional deterministic structural analysis code. However, several additional capabilities regarding the probabilistic analysis makes the input data requirements and the resulting output even more complex. As a result, an intelligent front- and back-end to the code is being developed to assist the design engineer in providing the input data in a correct and appropriate manner. The type of knowledge that this entails is, in general, heuristically-based, allowing the fairly well-understood technology of production rules to apply with little difficulty. However, the PSAM code, called NESSUS, is written in FORTRAN-77 and runs on a DEC VAX. Thus, the associated expert system, called NESSUS/EXPERT, must run on a DEC VAX as well, and integrate effectively and efficiently with the existing FORTRAN code. This paper discusses the process undergone to select a suitable tool, identify an appropriate division between the functions that should be performed in FORTRAN and those that should be performed by production rules, and how integration of the conventional and AI technologies was achieved.

  17. Evaluation of three electronic noses for detecting incipient wood decay

    Treesearch

    Manuela Baietto; Alphus D. Wilson; Daniele Bassi; Francesco Ferrini

    2010-01-01

    Tree assessment methodologies, currently used to evaluate the structural stability of individual urban trees, usually involve a visual analysis followed by measurements of the internal soundness of wood using various instruments that are often invasive, expensive, or inadequate for use within the urban environment. Moreover, most conventional instruments do not provide...

  18. A Critical Comparison of Psychometric Models for Measuring Achievement. Methodology Project.

    ERIC Educational Resources Information Center

    Choppin, Bruce; And Others

    A detailed description of five latent structure models of achievement measurement is presented. The first project paper, by David L. McArthur, analyzes the history of mental testing to show how conventional item analysis procedures were developed, and how dissatisfaction with them has led to fragmentation. The range of distinct conceptual and…

  19. Ten Adaptive Strategies for Family and Work Balance: Advice from Successful Families.

    ERIC Educational Resources Information Center

    Haddock, Shelley A.; Zimmerman, Toni Schindler; Ziemba, Scott J.; Current, Lisa R.

    2001-01-01

    Investigated adaptive strategies of middle class, dual earner couples (N=47) with children that are successfully managing family and work. Guided by grounded-theory methodology, analysis of interview data revealed these successful couples structured their lives around 10 major strategies. Each strategy is defined and illustrated through the…

  20. Towards the Measurement of EFL Listening Beliefs with Item Response Theory Methods

    ERIC Educational Resources Information Center

    Nix, John-Michael L.; Tseng, Wen-Ta

    2014-01-01

    The present research aims to identify the underlying English listening belief structure of English-as-a-foreign-language (EFL) learners, thereby informing methodologies for subsequent analysis of beliefs with respect to listening achievement. Development of a measurement model of English listening learning beliefs entailed the creation of an…

  1. Analysis of VET in Ukraine Since the Soviet Era

    ERIC Educational Resources Information Center

    Zinser, Richard

    2015-01-01

    Purpose: The purpose of this paper is to explore how vocational education and training (VET) in Ukraine has changed since the Soviet era; and to determine its structure, successes, and challenges. Design/methodology/approach: The author conducted interviews and tours at 15 vocational schools in seven cities in Ukraine. Findings: Ukraine is…

  2. Assessment of Effectiveness of Use of Intellectual Potential of a University: A Methodological Approach

    ERIC Educational Resources Information Center

    Stukalova, Irina B.; Stukalova, Anastasia A.; Selyanskaya, Galina N.

    2016-01-01

    This article presents the results of theoretical analysis of existing approaches to the categories of the "intellectual capital" and "intellectual potential" of an organization. The authors identified the specific peculiarities of developing the intellectual potential of a university and propose their own view of its structure.…

  3. Analysis of Communication and Dissemination Channels Influencing the Adoption of Integrated Soil Fertility Management in Western Kenya

    ERIC Educational Resources Information Center

    Adolwa, Ivan S.; Okoth, Peter F.; Mulwa, Richard M.; Esilaba, Anthony O.; Mairura, Franklin S.; Nambiro, Elizabeth

    2012-01-01

    Purpose: The following study was carried out to evaluate the socio-economic factors influencing access to Integrated Soil Fertility Management (ISFM) information and knowledge among farmers in western Kenya, and subsequent ISFM uptake with a view to assessing communication gaps. Design/Methodology/Approach: Structured questionnaires were…

  4. A Practical Methodology for the Systematic Development of Multiple Choice Tests.

    ERIC Educational Resources Information Center

    Blumberg, Phyllis; Felner, Joel

    Using Guttman's facet design analysis, four parallel forms of a multiple-choice test were developed. A mapping sentence, logically representing the universe of content of a basic cardiology course, specified the facets of the course and the semantic structural units linking them. The facets were: cognitive processes, disease priority, specific…

  5. Mathematics Lectures as Narratives: Insights from Network Graph Methodology

    ERIC Educational Resources Information Center

    Weinberg, Aaron; Wiesner, Emilie; Fukawa-Connelly, Tim

    2016-01-01

    Although lecture is the traditional method of university mathematics instruction, there has been little empirical research that describes the general structure of lectures. In this paper, we adapt ideas from narrative analysis and apply them to an upper-level mathematics lecture. We develop a framework that enables us to conceptualize the lecture…

  6. A Career Success Model for Academics at Malaysian Research Universities

    ERIC Educational Resources Information Center

    Abu Said, Al-Mansor; Mohd Rasdi, Roziah; Abu Samah, Bahaman; Silong, Abu Daud; Sulaiman, Suzaimah

    2015-01-01

    Purpose: The purpose of this paper is to develop a career success model for academics at the Malaysian research universities. Design/methodology/approach: Self-administered and online surveys were used for data collection among 325 academics from Malaysian research universities. Findings: Based on the analysis of structural equation modeling, the…

  7. Design and construction principles in nature and architecture.

    PubMed

    Knippers, Jan; Speck, Thomas

    2012-03-01

    This paper will focus on how the emerging scientific discipline of biomimetics can bring new insights into the field of architecture. An analysis of both architectural and biological methodologies will show important aspects connecting these two. The foundation of this paper is a case study of convertible structures based on elastic plant movements.

  8. Fatigue methodology III; Proceedings of the AHS National Technical Specialists' Meeting on Advanced Rotorcraft Structures, Scottsdale, AZ, Oct. 3-5, 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1989-01-01

    Papers on rotorcraft and fatigue methodology are presented, covering topics such as reliability design for rotorcraft, a comparison between theory and fatigue test data on stress concentration factors, the retirement lives of rolling element bearings, hydrogen embrittlement risk analysis for high hardness steel parts, and rotating system load monitoring with minimum fixed system instrumentation. Additional topics include usage data collection to improve the structural integrity of operational helicopters, usage monitoring of military helicopters, improvements to the fatigue substantiation of the H-60 composite tail rotor blade, helicopter surveillance programs, and potential applications of automotive fatigue technology in rotorcraft design. Also, consideration is given to fatigue evaluation of C/MH-53E main rotor damper threaded joints, the SH-2F airframe fatigue test program, a ply termination concept for improving the fracture and fatigue strength of composite laminates, the analysis and testing of composite panels subject to muzzle blast effects, the certification plan for an all-composite main rotor flexbeam, and the effects of stacking sequence on the flexural strength of composite beams.

  9. Multi-methodological investigation of the variability of the microstructure of HPMC hard capsules.

    PubMed

    Faulhammer, E; Kovalcik, A; Wahl, V; Markl, D; Stelzer, F; Lawrence, S; Khinast, J G; Paudel, A

    2016-09-25

    The objective of this study was to analyze differences in the subtle microstructure of three different grades of HPMC hard capsule shells using mechanical, spectroscopic, microscopic and tomographic approaches. Dynamic mechanical analysis (DMA), thermogravimetric analysis (TGA), vibrational spectroscopic and X-ray scattering techniques, as well as environmental scanning electron microscopy (ESEM) and optical coherence tomography (OCT), were used. Two HPMC capsules manufactured via chemical gelling, one capsule shell manufactured via thermal gelling and one thermally gelled transparent capsule were included. Characteristic microstructural alterations (associated with the manufacturing processes), as well as mechanical and physical properties relevant to capsule performance and processability, were thoroughly elucidated by integrating the data obtained from the multi-methodological investigations. The physico-chemical and physico-mechanical data obtained from this gamut of techniques implied that thermally gelled HPMC hard capsule shells could offer an advantage in terms of machinability during capsule filling, owing to their superior micro- and macroscopic structure and, specifically, their mechanical stability under dry or humid conditions. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Optical holographic structural analysis of Kevlar rocket motor cases

    NASA Astrophysics Data System (ADS)

    Harris, W. J.

    1981-05-01

    The methodology of applying optical holography to evaluation of subscale Kevlar 49 composite pressure vessels is explored. The results and advantages of the holographic technique are discussed. The cases utilized were of similar design, but each had specific design features, the effects of which are reviewed. Burst testing results are presented in conjunction with the holographic fringe patterns obtained during progressive pressurization. Examples of quantitative data extracted by analysis of fringe fields are included.

  11. A Graphics Editor for Structured Analysis with a Data Dictionary.

    DTIC Science & Technology

    1987-12-01

    ... central computer system. This project is a direct follow-on to the 1986 thesis by James W. Urscheler, who created an initial version of a tool (nicknamed ...) ... graphics information. Background on SADT: SADT is the name of SofTech's methodology for doing requirements analysis and system design. It was first published

  12. A user exposure based approach for non-structural road network vulnerability analysis

    PubMed Central

    Jin, Lei; Wang, Haizhong; Yu, Le; Liu, Lin

    2017-01-01

    Aiming at dense urban road network vulnerability without structural negative consequences, this paper proposes a novel non-structural road network vulnerability analysis framework. Three aspects of the framework are described: (i) the rationale for non-structural road network vulnerability, (ii) the metrics for negative consequences accounting for variant road conditions, and (iii) the introduction of a new vulnerability index based on user exposure. Based on the proposed methodology, a case study of the Sioux Falls network, which is regularly threatened by heavy snow during wintertime, is discussed in detail. The vulnerability ranking of the links of the Sioux Falls network with respect to a heavy-snow scenario is identified. As a result of non-structural consequences accompanied by conceivable degeneration of the network, there are significant increases in generalized travel time costs, which serve as measures of the "emotional hurt" experienced by users of the topological road network. PMID:29176832
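
    One way to read the user-exposure index is as a demand-weighted increase in generalized travel-time cost when a link degrades (for example under heavy snow). The networkx sketch below is an interpretation of that idea on a toy network, not the authors' exact formulation.

```python
import networkx as nx

def total_cost(G, demands, weight="time"):
    """Sum of demand-weighted shortest-path travel times over all OD pairs."""
    cost = 0.0
    for (o, d), q in demands.items():
        cost += q * nx.shortest_path_length(G, o, d, weight=weight)
    return cost

# Toy network: edge attribute "time" is the generalized travel-time cost.
G = nx.Graph()
G.add_weighted_edges_from([(1, 2, 4), (2, 3, 3), (1, 3, 9), (3, 4, 2)], weight="time")
demands = {(1, 4): 100, (2, 4): 50}          # users (exposure) per OD pair

base = total_cost(G, demands)
for u, v, data in list(G.edges(data=True)):
    degraded = G.copy()
    degraded[u][v]["time"] *= 3.0            # e.g. heavy-snow degradation of this link
    score = total_cost(degraded, demands) - base
    print(f"link ({u},{v}): exposure-weighted cost increase = {score:.0f}")
```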

  13. Spatial pattern recognition of seismic events in South West Colombia

    NASA Astrophysics Data System (ADS)

    Benítez, Hernán D.; Flórez, Juan F.; Duque, Diana P.; Benavides, Alberto; Lucía Baquero, Olga; Quintero, Jiber

    2013-09-01

    Recognition of seismogenic zones in geographical regions supports seismic hazard studies. This recognition is usually based on visual, qualitative and subjective analysis of data. Spatial pattern recognition provides a well-founded means to obtain relevant information from large amounts of data. The purpose of this work is to identify and classify spatial patterns in instrumental data from the South West Colombian seismic database. In this research, a clustering tendency analysis validates whether the seismic database possesses a clustering structure. A non-supervised fuzzy clustering algorithm then creates groups of seismic events. Given the sensitivity of fuzzy clustering algorithms to the initial positions of the centroids, we propose a centroid initialization methodology that generates partitions that are stable with respect to centroid initialization. As a result of this work, a public software tool provides the user with the routines developed for the clustering methodology. The analysis of the seismogenic zones obtained reveals meaningful spatial patterns in South-West Colombia. The clustering analysis provides a quantitative location and dispersion of seismogenic zones that facilitates seismological interpretation of seismic activity in South West Colombia.
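
    A plain-NumPy sketch of the fuzzy c-means step applied to epicentre coordinates. The stable centroid-initialization procedure proposed in the paper is not reproduced, so a fixed random seed stands in for it, and the epicentres below are synthetic.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, tol=1e-5, max_iter=300, seed=0):
    """Plain fuzzy c-means; returns centroids and the fuzzy membership matrix U."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.dirichlet(np.ones(c), size=n)          # initial memberships, rows sum to 1
    for _ in range(max_iter):
        Um = U ** m
        centroids = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
        if np.max(np.abs(U_new - U)) < tol:
            U = U_new
            break
        U = U_new
    return centroids, U

# Synthetic epicentres (longitude, latitude); real input would be the instrumental catalogue.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc, 0.2, size=(100, 2))
               for loc in [(-76.5, 3.4), (-77.3, 1.2), (-75.9, 4.8)]])
centroids, U = fuzzy_c_means(X, c=3)
print("seismogenic zone centroids:\n", np.round(centroids, 2))
```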

  14. Integrated Controls-Structures Design Methodology: Redesign of an Evolutionary Test Structure

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Joshi, Suresh M.

    1997-01-01

    An optimization-based integrated controls-structures design methodology for a class of flexible space structures is described, and the phase-0 Controls-Structures-Integration evolutionary model, a laboratory testbed at NASA Langley, is redesigned using this integrated design methodology. The integrated controls-structures design is posed as a nonlinear programming problem to minimize the control effort required to maintain a specified line-of-sight pointing performance, under persistent white noise disturbance. Static and dynamic dissipative control strategies are employed for feedback control, and parameters of these controllers are considered as the control design variables. Sizes of strut elements in various sections of the CEM are used as the structural design variables. Design guides for the struts are developed and employed in the integrated design process, to ensure that the redesigned structure can be effectively fabricated. The superiority of the integrated design methodology over the conventional design approach is demonstrated analytically by observing a significant reduction in the average control power needed to maintain specified pointing performance with the integrated design approach.
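
    A schematic of the nonlinear-programming formulation described above using scipy.optimize: minimize a control-power objective over controller and strut design variables, subject to a line-of-sight performance constraint. The objective, constraint, and bounds are stand-ins, since the CEM dynamics and dissipative controller parameterization are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

# Design vector x = [controller gains ..., strut cross-section areas ...] (illustrative).
x0 = np.array([1.0, 1.0, 2.0, 2.0])

def control_power(x):
    """Stand-in for the average control power under white-noise disturbance."""
    gains, struts = x[:2], x[2:]
    return np.sum(gains**2) / np.sqrt(np.sum(struts))

def los_error(x):
    """Stand-in for the RMS line-of-sight pointing error of the closed-loop model."""
    gains, struts = x[:2], x[2:]
    return 5.0 / (1.0 + np.sum(gains) + 0.3 * np.sum(struts))

# Pointing requirement (illustrative units) plus fabrication bounds on strut sizes,
# mirroring the "design guides" mentioned above.
constraint = NonlinearConstraint(los_error, -np.inf, 1.0)
bounds = [(0.1, 10.0)] * 2 + [(0.5, 5.0)] * 2
result = minimize(control_power, x0, bounds=bounds, constraints=[constraint])
print(result.x, control_power(result.x), los_error(result.x))
```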

  15. Implementation of structural response sensitivity calculations in a large-scale finite-element analysis system

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Rogers, J. L., Jr.

    1982-01-01

    The methodology used to implement structural sensitivity calculations into a major, general-purpose finite-element analysis system (SPAR) is described. This implementation includes a generalized method for specifying element cross-sectional dimensions as design variables that can be used in analytically calculating derivatives of output quantities from static stress, vibration, and buckling analyses for both membrane and bending elements. Limited sample results for static displacements and stresses are presented to indicate the advantages of analytically calculating response derivatives compared to finite difference methods. Continuing developments to implement these procedures into an enhanced version of SPAR are also discussed.
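
    A small illustration of the analytic-versus-finite-difference comparison for the simplest case: an axial bar whose tip displacement is u = PL/(EA) and whose cross-sectional area A is the design variable (all values arbitrary).

```python
# Axial bar: u = P*L/(E*A). Sensitivity of tip displacement to the area design variable.
P, L, E, A = 1.0e4, 2.0, 70.0e9, 4.0e-4   # N, m, Pa, m^2 (illustrative values)

u = lambda a: P * L / (E * a)

analytic = -P * L / (E * A**2)            # exact derivative du/dA
h = 1.0e-6 * A                            # forward-difference step
finite_diff = (u(A + h) - u(A)) / h

print(f"analytic     : {analytic:.6e}")
print(f"finite diff. : {finite_diff:.6e}")   # close, but step-size dependent
```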

  16. Development of an engineering analysis of progressive damage in composites during low velocity impact

    NASA Technical Reports Server (NTRS)

    Humphreys, E. A.

    1981-01-01

    A computerized, analytical methodology was developed to study damage accumulation during low velocity lateral impact of layered composite plates. The impact event was modeled as perfectly plastic with complete momentum transfer to the plate structure. A transient dynamic finite element approach was selected to predict the displacement time response of the plate structure. Composite ply and interlaminar stresses were computed at selected time intervals and subsequently evaluated to predict layer and interlaminar damage. The effects of damage on elemental stiffness were then incorporated back into the analysis for subsequent time steps. Damage predicted included fiber failure, matrix ply failure and interlaminar delamination.

  17. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
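
    A toy sketch of a first-order, mean-value reliability estimate of the kind such programs automate: linearize the limit state at the mean of the random variables and take P_f ≈ Φ(−β). The limit state and statistics below are invented, and the advanced mean value iterations and finite-element response evaluations are omitted.

```python
import numpy as np
from scipy.stats import norm

# Illustrative limit state: g = yield_strength - applied_stress(load, area).
mu = np.array([250.0e6, 1.0e4, 5.0e-5])      # means of [strength (Pa), load (N), area (m^2)]
sigma = np.array([20.0e6, 1.5e3, 2.0e-6])    # standard deviations

def g(x):
    strength, load, area = x
    return strength - load / area

# Mean-value first-order approximation: gradient by finite differences at the mean.
g0 = g(mu)
grad = np.array([(g(mu + h * np.eye(3)[i]) - g0) / h
                 for i, h in enumerate(1e-6 * np.abs(mu))])
beta = g0 / np.sqrt(np.sum((grad * sigma) ** 2))   # first-order reliability index
print(f"beta = {beta:.2f}, P_f approx. {norm.cdf(-beta):.2e}")
```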

  18. Narratives about illness and medication: a neglected theme/new methodology within pharmacy practice research. Part II: medication narratives in practice.

    PubMed

    Ryan, Kath; Bissell, Paul; Morecroft, Charles

    2007-08-01

    Part 2 of this paper aims to provide a methodological framework for the study of medication narratives, including a semi-structured interview guide and suggested method of analysis, in an attempt to aid the development of narrative scholarship within pharmacy practice research. Examples of medication narratives are provided to illustrate their diversity and usefulness. The framework is derived from the work of other researchers and adapted for our specific purpose. It comes from social psychology, narrative psychology, narrative anthropology, sociology and critical theory and fits within the social constructionist paradigm. The suggested methods of analysis could broadly be described as narrative analysis and discourse analysis. Examples of medication narratives are chosen from a variety of sources and brief interpretations are presented by way of illustration. Narrative analysis, a neglected area of research in pharmacy practice, has the potential to provide new understanding about how people relate to their medicines, how pharmacists are engaged in producing narratives and the importance of narrative in the education of students. IMPACT OF THE ARTICLE: This article aims to have the following impact on pharmacy practice research: Innovative approach to researching and conceptualising the use of medicines. Introduction of a new theoretical perspective and methodology. Incorporation of social science research methods into pharmacy practice research. Development of narrative scholarship within pharmacy.

  19. Turbulence study in the vicinity of piano key weir: relevance, instrumentation, parameters and methods

    NASA Astrophysics Data System (ADS)

    Tiwari, Harinarayan; Sharma, Nayan

    2017-05-01

    This research paper focuses on the relevance of turbulence studies, the instruments suitable for capturing turbulence, the turbulence parameters of interest, and advanced methodologies that can decompose turbulence structures at different levels near hydraulic structures. Small-scale turbulence research has valid prospects in open channel flow. The relevance of the study is amplified when a hydraulic structure is introduced into the channel, disturbing the natural flow and creating a discontinuity. To recover from this discontinuity, the piano key weir (PKW) might be used with sloped keys. The limitations of empirical results in the vicinity of PKWs necessitate extensive laboratory experiments with reliable instrumentation techniques. Using principal component analysis, the acoustic Doppler velocimeter was established to be the best-suited instrument, within the range of some limitations. Wavelet analysis is proposed to better decompose the underlying turbulence structure.
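
    A minimal sketch of the proposed wavelet step, decomposing a velocity-fluctuation record into detail bands with PyWavelets so that turbulence structures at different scales can be examined separately; the file name and sampling rate are assumptions.

```python
import numpy as np
import pywt

fs = 50.0                                   # ADV sampling rate in Hz (assumed)
u = np.loadtxt("adv_velocity_u.txt")        # assumed despiked streamwise velocity record
u_fluct = u - u.mean()                      # velocity fluctuations about the mean

# Multilevel discrete wavelet decomposition; entries after the first are detail bands,
# ordered from the coarsest to the finest scale.
level = 6
coeffs = pywt.wavedec(u_fluct, "db4", level=level)
for j, d in enumerate(coeffs[1:], start=1):
    print(f"detail band {j}: energy = {np.sum(d**2):.3f}")
```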

  20. Impact and Penetration Simulations for Composite Wing-like Structures

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project was to develop methodologies for the analysis of wing-like structures subjected to impact loadings. Low-speed impact causing either no damage or only minimal damage, and high-speed impact causing severe laminate damage and possible penetration of the structure, were to be considered during this research effort. To address this goal, an assessment of current analytical tools for impact analysis was performed. The analytical tools for impact and penetration simulations were assessed with regard to accuracy, modeling capability, and damage modeling, as well as robustness, efficiency, and usability in a wing design environment. Following this qualitative assessment, selected quantitative evaluations were to be performed using the leading simulation tools. Based on this assessment, future research thrusts for impact and penetration simulation of composite wing-like structures were identified.

  1. Statistical Frequency-Dependent Analysis of Trial-to-Trial Variability in Single Time Series by Recurrence Plots.

    PubMed

    Tošić, Tamara; Sellers, Kristin K; Fröhlich, Flavio; Fedotenkova, Mariia; Beim Graben, Peter; Hutt, Axel

    2015-01-01

    For decades, research in neuroscience has supported the hypothesis that brain dynamics exhibits recurrent metastable states connected by transients, which together encode fundamental neural information processing. To understand the system's dynamics it is important to detect such recurrence domains, but it is challenging to extract them from experimental neuroscience datasets due to the large trial-to-trial variability. The proposed methodology extracts recurrent metastable states in univariate time series by transforming datasets into their time-frequency representations and computing recurrence plots based on instantaneous spectral power values in various frequency bands. Additionally, a new statistical inference analysis compares different trial recurrence plots with corresponding surrogates to obtain statistically significant recurrent structures. This combination of methods is validated by applying it to two artificial datasets. In a final study of visually-evoked Local Field Potentials in partially anesthetized ferrets, the methodology is able to reveal recurrence structures of neural responses with trial-to-trial variability. Focusing on different frequency bands, the δ-band activity is much less recurrent than α-band activity. Moreover, α-activity is susceptible to pre-stimuli, while δ-activity is much less sensitive to pre-stimuli. This difference in recurrence structures in different frequency bands indicates diverse underlying information processing steps in the brain.

  2. Statistical Frequency-Dependent Analysis of Trial-to-Trial Variability in Single Time Series by Recurrence Plots

    PubMed Central

    Tošić, Tamara; Sellers, Kristin K.; Fröhlich, Flavio; Fedotenkova, Mariia; beim Graben, Peter; Hutt, Axel

    2016-01-01

    For decades, research in neuroscience has supported the hypothesis that brain dynamics exhibits recurrent metastable states connected by transients, which together encode fundamental neural information processing. To understand the system's dynamics it is important to detect such recurrence domains, but it is challenging to extract them from experimental neuroscience datasets due to the large trial-to-trial variability. The proposed methodology extracts recurrent metastable states in univariate time series by transforming datasets into their time-frequency representations and computing recurrence plots based on instantaneous spectral power values in various frequency bands. Additionally, a new statistical inference analysis compares different trial recurrence plots with corresponding surrogates to obtain statistically significant recurrent structures. This combination of methods is validated by applying it to two artificial datasets. In a final study of visually-evoked Local Field Potentials in partially anesthetized ferrets, the methodology is able to reveal recurrence structures of neural responses with trial-to-trial variability. Focusing on different frequency bands, the δ-band activity is much less recurrent than α-band activity. Moreover, α-activity is susceptible to pre-stimuli, while δ-activity is much less sensitive to pre-stimuli. This difference in recurrence structures in different frequency bands indicates diverse underlying information processing steps in the brain. PMID:26834580

  3. Experimental validation of an integrated controls-structures design methodology for a class of flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.

    1994-01-01

    This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.

  4. A neural network based methodology to predict site-specific spectral acceleration values

    NASA Astrophysics Data System (ADS)

    Kamatchi, P.; Rajasankar, J.; Ramana, G. V.; Nagpal, A. K.

    2010-12-01

    A general neural network based methodology that has the potential to replace the computationally-intensive site-specific seismic analysis of structures is proposed in this paper. The basic framework of the methodology consists of a feed forward back propagation neural network algorithm with one hidden layer to represent the seismic potential of a region and soil amplification effects. The methodology is implemented and verified with parameters corresponding to Delhi city in India. For this purpose, strong ground motions are generated at bedrock level for a chosen site in Delhi due to earthquakes considered to originate from the central seismic gap of the Himalayan belt using necessary geological as well as geotechnical data. Surface level ground motions and corresponding site-specific response spectra are obtained by using a one-dimensional equivalent linear wave propagation model. Spectral acceleration values are considered as a target parameter to verify the performance of the methodology. Numerical studies carried out to validate the proposed methodology show that the errors in predicted spectral acceleration values are within acceptable limits for design purposes. The methodology is general in the sense that it can be applied to other seismically vulnerable regions and also can be updated by including more parameters depending on the state-of-the-art in the subject.
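
    As a rough illustration of the kind of network described above (a feed-forward net with one hidden layer trained by back-propagation), the sketch below fits a small numpy implementation to synthetic data; the input variables, target relation and network size are invented for demonstration and are not the parameters used for the Delhi study.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical inputs (e.g. magnitude, distance, site stiffness) and a synthetic target
    X = rng.uniform([5.0, 10.0, 100.0], [8.0, 300.0, 800.0], size=(500, 3))
    y = (0.4 * X[:, 0] - 0.002 * X[:, 1] - 0.0004 * X[:, 2])[:, None]

    # Normalise inputs and target
    Xn = (X - X.mean(0)) / X.std(0)
    yn = (y - y.mean()) / y.std()

    n_in, n_hid, n_out, lr = 3, 8, 1, 0.05
    W1 = rng.standard_normal((n_in, n_hid)) * 0.1
    b1 = np.zeros(n_hid)
    W2 = rng.standard_normal((n_hid, n_out)) * 0.1
    b2 = np.zeros(n_out)

    for epoch in range(2000):
        h = np.tanh(Xn @ W1 + b1)          # hidden layer
        out = h @ W2 + b2                  # linear output
        err = out - yn
        # back-propagate the squared-error gradient
        gW2 = h.T @ err / len(Xn)
        gb2 = err.mean(0)
        dh = (err @ W2.T) * (1 - h ** 2)
        gW1 = Xn.T @ dh / len(Xn)
        gb1 = dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

    print("final MSE:", float(np.mean((out - yn) ** 2)))
    ```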

  5. A PDE-based methodology for modeling, parameter estimation and feedback control in structural and structural acoustic systems

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Brown, D. E.; Metcalf, Vern L.; Silcox, R. J.; Smith, Ralph C.; Wang, Yun

    1994-01-01

    A problem of continued interest concerns the control of vibrations in a flexible structure and the related problem of reducing structure-borne noise in structural acoustic systems. In both cases, piezoceramic patches bonded to the structures have been successfully used as control actuators. Through the application of a controlling voltage, the patches can be used to reduce structural vibrations which in turn lead to methods for reducing structure-borne noise. A PDE-based methodology for modeling, estimating physical parameters, and implementing a feedback control scheme for problems of this type is discussed. While the illustrating example is a circular plate, the methodology is sufficiently general so as to be applicable in a variety of structural and structural acoustic systems.

  6. ARCHITECT: The architecture-based technology evaluation and capability tradeoff method

    NASA Astrophysics Data System (ADS)

    Griendling, Kelly A.

    The use of architectures for the design, development, and documentation of system-of-systems engineering has become a common practice in recent years. This practice became mandatory in the defense industry in 2004 when the Department of Defense Architecture Framework (DoDAF) Promulgation Memo mandated that all Department of Defense (DoD) architectures must be DoDAF compliant. Despite this mandate, there has been significant confusion and a lack of consistency in the creation and the use of the architecture products. Products are typically created as static documents used for communication and documentation purposes that are difficult to change and do not support engineering design activities and acquisition decision making. At the same time, acquisition guidance has been recently reformed to move from the bottom-up approach of the Requirements Generation System (RGS) to the top-down approach mandated by the Joint Capabilities Integration and Development System (JCIDS), which requires the use of DoDAF to support acquisition. Defense agencies have had difficulty adjusting to this new policy, and are struggling to determine how to meet new acquisition requirements. This research has developed the Architecture-based Technology Evaluation and Capability Tradeoff (ARCHITECT) Methodology to respond to these challenges and address concerns raised about the defense acquisition process, particularly the time required to implement parts of the process, the need to evaluate solutions across capability and mission areas, and the need to use a rigorous, traceable, repeatable method that utilizes modeling and simulation to better substantiate early-phase acquisition decisions. The objective is to create a capability-based systems engineering methodology for the early phases of design and acquisition (specifically Pre-Milestone A activities) which improves agility in defense acquisition by (1) streamlining the development of key elements of JCIDS and DoDAF, (2) moving the creation of DoDAF products forward in the defense acquisition process, and (3) using DoDAF products for more than documentation by integrating them into the problem definition and analysis of alternatives phases and applying executable architecting. This research proposes and demonstrates the plausibility of a prescriptive methodology for developing executable DoDAF products which will explicitly support decision-making in the early phases of JCIDS. A set of criteria by which CBAs should be judged is proposed, and the methodology is developed with these criteria in mind. The methodology integrates existing tools and techniques for systems engineering and system of systems engineering with several new modeling and simulation tools and techniques developed as part of this research to fill gaps noted in prior CBAs. A suppression of enemy air defenses (SEAD) mission is used to demonstrate the application of ARCHITECT and to show the plausibility of the approach. For the SEAD study, metrics are derived and a gap analysis is performed. The study then identifies and quantitatively compares system and operational architecture alternatives for performing SEAD. A series of down-selections is performed to identify promising architectures, and these promising solutions are subject to further analysis where the impacts of force structure and network structure are examined.
    While the numerical results of the SEAD study are notional and could not be applied to an actual SEAD CBA, the example served to highlight many of the salient features of the methodology. The SEAD study presented enabled pre-Milestone A tradeoffs to be performed quantitatively across a large number of architectural alternatives in a traceable and repeatable manner. The alternatives considered included variations on operations, systems, organizational responsibilities (through the assignment of systems to tasks), network (or collaboration) structure, interoperability level, and force structure. All of the information used in the study is preserved in the environment, which is dynamic and allows for on-the-fly analysis. The assumptions used were consistent, which was assured through the use of a single file documenting all inputs, which was shared across all models. Furthermore, a model was made of the ARCHITECT methodology itself, and was used to demonstrate that even if the steps took twice as long to perform as they did in the case of the SEAD example, the methodology still provides the ability to conduct CBA analyses in less time than prior CBAs to date. Overall, it is shown that the ARCHITECT methodology results in an improvement over current CBAs in the criteria developed here.

  7. A multi-disciplinary approach for the structural monitoring of Cultural Heritages in a seismic area

    NASA Astrophysics Data System (ADS)

    Fabrizia Buongiorno, Maria; Musacchio, Massimo; Guerra, Ignazio; Porco, Giacinto; Stramondo, Salvatore; Casula, Giuseppe; Caserta, Arrigo; Speranza, Fabio; Doumaz, Fawzi; Giovanna Bianchi, Maria; Luzi, Guido; Ilaria Pannaccione Apa, Maria; Montuori, Antonio; Gaudiosi, Iolanda; Vecchio, Antonio; Gervasi, Anna; Bonali, Elena; Romano, Dolores; Falcone, Sergio; La Piana, Carmelo

    2014-05-01

    In recent years, the concepts of seismic risk vulnerability and structural health monitoring have become very important topics in the field of both structural and civil engineering for the identification of appropriate risk indicators and risk assessment methodologies in Cultural Heritage monitoring. The latter, which includes objects, buildings and sites with historical, architectural and/or engineering relevance, concerns the management, the preservation and the maintenance of the heritages within their surrounding environmental context, in response to climate changes and natural hazards (e.g. seismic, volcanic, landslide and flooding hazards). Within such a framework, the complexity and the great number of variables to be considered require a multi-disciplinary approach including strategies, methodologies and tools able to provide an effective monitoring of Cultural Heritages from both scientific and operational viewpoints. Based on this rationale, in this study, an advanced, technological and operationally-oriented approach is presented and tested, which enables measuring and monitoring Cultural Heritage conservation state and the geophysical/geological setting of the area, in order to mitigate the seismic risk of the historical public goods at different spatial scales*. The integration of classical geophysical methods with new emerging sensing techniques enables a multi-depth, multi-resolution, and multi-scale monitoring in both space and time. An integrated system of methodologies, instrumentation and data-processing approaches for non-destructive Cultural Heritage investigations is proposed, which concerns, in detail, the analysis of seismogenetic sources, the geological-geotechnical setting of the area and site seismic effects evaluation, proximal remote sensing techniques (e.g. terrestrial laser scanner, ground-based radar systems, thermal cameras), high-resolution aerial and satellite-based remote sensing methodologies (e.g. aeromagnetic surveys, synthetic aperture radar, optical, multispectral and panchromatic measurements), and static and dynamic structural health monitoring analysis (e.g. screening tests with georadar, sonic instruments, sclerometers and optic fibers). The final purpose of the proposed approach is the development of an investigation methodology for short- and long-term Cultural Heritage preservation in response to seismic stress, which has specific features of scalability, modularity and exportability for every possible monitoring configuration. Moreover, it allows gathering useful information to furnish guidelines for Institutions and local Administrations to plan consolidation actions and therefore prevention activities. Some preliminary results will be presented for the test site of Calabria Region, where some architectural heritages have been properly selected as case studies for monitoring purposes. *The present work is supported and funded by Ministero dell'Università, dell'Istruzione e della Ricerca (MIUR) under the research project PON01-02710 "MASSIMO" - "Monitoraggio in Area Sismica di Sistemi Monumentali".

  8. Genetic Transformation of the Biocontrol Fungus Gliocladium virens to Benomyl Resistance

    PubMed Central

    Ossanna, Nina; Mischke, Sue

    1990-01-01

    Methodology was developed to isolate and regenerate protoplasts from the biocontrol fungus Gliocladium virens and to transform them to benomyl resistance with a Neurospora crassa β-tubulin gene. Southern blots demonstrated that multiple copies of the vector integrated into the chromosomal DNA of stable biotypes but not of abortive transformants. Analysis of nuclear condition in vegetative and asexual structures demonstrated that no structure of G. virens is dependably uninucleate and thus preferentially suitable for transformation. Images PMID:16348312

  9. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1988-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center over the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach was developed to quantify the effects of the random uncertainties. The results indicate that only the variations in geometry have significant effects.
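
    A probabilistic analysis of this kind is often illustrated with simple Monte Carlo propagation of input scatter. The sketch below does this for a crude cantilever idealization of a blade (all property and geometry statistics are hypothetical and unrelated to the actual turbopump data), showing how scatter in geometry and material properties maps into scatter in a structural response quantity.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    N = 100_000

    # Hypothetical nominal blade-like cantilever; the real turbopump geometry is far more complex
    E   = rng.normal(200e9, 10e9, N)        # Young's modulus [Pa], ~5 % scatter
    rho = rng.normal(8000.0, 160.0, N)      # density [kg/m^3]
    L   = rng.normal(0.05, 0.0005, N)       # length [m]
    b   = rng.normal(0.02, 0.0002, N)       # width [m]
    h   = rng.normal(0.003, 0.00006, N)     # thickness [m]

    I = b * h**3 / 12.0                     # second moment of area
    A = b * h                               # cross-section area
    beta1L = 1.8751                         # first cantilever eigenvalue
    f1 = (beta1L**2 / (2 * np.pi)) * np.sqrt(E * I / (rho * A * L**4))

    print(f"mean f1 = {f1.mean():.1f} Hz, c.o.v. = {f1.std() / f1.mean():.3%}")
    print("approx. 99% interval:", np.percentile(f1, [0.5, 99.5]))
    ```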

  10. Development of methodologies for the estimation of thermal properties associated with aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1994-01-01

    Thermal stress analyses are an important aspect in the development of aerospace vehicles at NASA-LaRC. These analyses require knowledge of the temperature distributions within the vehicle structures, which consequently necessitates accurate thermal property data. The overall goal of this ongoing research effort is to develop methodologies for the estimation of the thermal property data needed to describe the temperature responses of these complex structures. The research strategy undertaken utilizes a building block approach. The idea here is to first focus on the development of property estimation methodologies for relatively simple conditions, such as isotropic materials at constant temperatures, and then systematically modify the technique for the analysis of more and more complex systems, such as anisotropic multi-component systems. The estimation methodology utilized is a statistically based method which incorporates experimental data and a mathematical model of the system. Several aspects of this overall research effort were investigated during the time of the ASEE summer program. One important aspect involved the calibration of the estimation procedure for estimating the thermal properties through the thickness of a standard material. Transient experiments were conducted using a Pyrex standard at various temperatures, and then the thermal properties (thermal conductivity and volumetric heat capacity) were estimated at each temperature. Confidence regions for the estimated values were also determined. These results were then compared to documented values. Another set of experimental tests was conducted on carbon composite samples at different temperatures. Again, the thermal properties were estimated for each temperature, and the results were compared with values obtained using another technique. In both sets of experiments, a 10-15 percent offset between the estimated values and the previously determined values was found. Another effort was related to the development of the experimental techniques. Initial experiments required a resistance heater placed between two samples. The design was modified such that the heater was placed on the surface of only one sample, as would be necessary in the analysis of built-up structures. Experiments using the modified technique were conducted on the composite sample used previously at different temperatures. The results were within 5 percent of those found using two samples. Finally, an initial heat transfer analysis, including conduction, convection and radiation components, was completed on a titanium sandwich structural sample. Experiments utilizing this sample are currently being designed and will be used to first estimate the material's effective thermal conductivity and later to determine the properties associated with each individual heat transfer component.
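
    The statistically based estimation idea summarized above can be illustrated with a much simpler stand-in: fitting one thermal property to simulated transient temperature data by nonlinear least squares. The sketch below estimates thermal diffusivity from an interior-temperature record using the classical semi-infinite-solid step-temperature solution; the model, sensor depth and noise level are assumed and this is not the NASA-LaRC procedure, which estimates conductivity and volumetric heat capacity with more elaborate models.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import erfc

    # Surface of a semi-infinite body suddenly raised to Ts; a sensor at depth x records T(t).
    def interior_temperature(t, alpha, T0=20.0, Ts=100.0, x=0.005):
        return T0 + (Ts - T0) * erfc(x / (2.0 * np.sqrt(alpha * t)))

    # Synthetic "experiment" (hypothetical numbers, Pyrex-like diffusivity ~6.5e-7 m^2/s)
    rng = np.random.default_rng(3)
    t = np.linspace(1.0, 600.0, 120)                     # s
    T_meas = interior_temperature(t, 6.5e-7) + rng.normal(0.0, 0.3, t.size)

    popt, pcov = curve_fit(interior_temperature, t, T_meas, p0=[1e-6])
    alpha_hat = popt[0]
    ci = 1.96 * np.sqrt(pcov[0, 0])                      # ~95 % confidence half-width
    print(f"estimated diffusivity = {alpha_hat:.2e} +/- {ci:.1e} m^2/s")
    ```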

  11. D-isoascorbyl palmitate: lipase-catalyzed synthesis, structural characterization and process optimization using response surface methodology.

    PubMed

    Sun, Wen-Jing; Zhao, Hong-Xia; Cui, Feng-Jie; Li, Yun-Hong; Yu, Si-Lian; Zhou, Qiang; Qian, Jing-Ya; Dong, Ying

    2013-07-08

    Isoascorbic acid is a stereoisomer of L-ascorbic acid and is widely used as a food antioxidant. However, its highly hydrophilic behavior prevents its application in cosmetics or fats and oils-based foods. To overcome this problem, D-isoascorbyl palmitate was synthesized in the present study to improve isoascorbic acid's oil solubility using an immobilized lipase in organic media. The structural information of the synthesized product was clarified using LC-ESI-MS, FT-IR, 1H and 13C NMR analysis, and process parameters for a high yield of D-isoascorbyl palmitate were optimized using one-factor-at-a-time experiments and response surface methodology (RSM). The synthesized product had a purity of 95% and its structural characteristics were confirmed as isoascorbyl palmitate by LC-ESI-MS, FT-IR, 1H, and 13C NMR analysis. Results from "one-factor-at-a-time" experiments indicated that the enzyme load, reaction temperature and D-isoascorbic-to-palmitic acid molar ratio had a significant effect on the D-isoascorbyl palmitate conversion rate. A conversion rate of 95.32% was obtained using response surface methodology (RSM) under the optimized conditions: enzyme load of 20% (w/w), reaction temperature of 53°C and D-isoascorbic-to-palmitic acid molar ratio of 1:4, when the reaction parameters were set as: acetone 20 mL, 40 g/L molecular sieve content, and 200 rpm for a 24-h reaction time. The findings of this study can serve as a reference for developing industrial processes for the preparation of isoascorbic acid esters, which might be used in food additives, cosmetic formulations and for the synthesis of other isoascorbic acid derivatives.
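
    The response surface methodology step reported above amounts to fitting a second-order polynomial to designed-experiment data and locating its optimum. The sketch below shows that generic step on synthetic data in coded factor units; the design, coefficients and noise are invented and do not reproduce the published optimization.

    ```python
    import numpy as np
    from itertools import product

    # Coded factors: x1 = enzyme load, x2 = temperature, x3 = acid molar ratio (values are hypothetical)
    rng = np.random.default_rng(5)
    X = rng.uniform(-1, 1, size=(20, 3))                       # a small, assumed design
    true = lambda x: 90 + 3*x[:, 0] + 2*x[:, 1] + 1.5*x[:, 2] \
                     - 4*x[:, 0]**2 - 3*x[:, 1]**2 - 2*x[:, 2]**2 + 1.0*x[:, 0]*x[:, 1]
    y = true(X) + rng.normal(0, 0.5, len(X))                   # "measured" conversion (%)

    def quad_terms(X):
        x1, x2, x3 = X.T
        return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                x1**2, x2**2, x3**2,
                                x1*x2, x1*x3, x2*x3])

    beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)   # second-order response surface

    # Locate the predicted optimum on a coarse grid of coded settings
    grid = np.array(list(product(np.linspace(-1, 1, 21), repeat=3)))
    pred = quad_terms(grid) @ beta
    best = grid[np.argmax(pred)]
    print("predicted optimum (coded units):", best, "-> conversion ~", pred.max().round(2), "%")
    ```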

  12. D-isoascorbyl palmitate: lipase-catalyzed synthesis, structural characterization and process optimization using response surface methodology

    PubMed Central

    2013-01-01

    Background Isoascorbic acid is a stereoisomer of L-ascorbic acid and is widely used as a food antioxidant. However, its highly hydrophilic behavior prevents its application in cosmetics or fats and oils-based foods. To overcome this problem, D-isoascorbyl palmitate was synthesized in the present study to improve isoascorbic acid’s oil solubility using an immobilized lipase in organic media. The structural information of the synthesized product was clarified using LC-ESI-MS, FT-IR, 1H and 13C NMR analysis, and process parameters for a high yield of D-isoascorbyl palmitate were optimized using one-factor-at-a-time experiments and response surface methodology (RSM). Results The synthesized product had a purity of 95% and its structural characteristics were confirmed as isoascorbyl palmitate by LC-ESI-MS, FT-IR, 1H, and 13C NMR analysis. Results from “one-factor-at-a-time” experiments indicated that the enzyme load, reaction temperature and D-isoascorbic-to-palmitic acid molar ratio had a significant effect on the D-isoascorbyl palmitate conversion rate. A conversion rate of 95.32% was obtained using response surface methodology (RSM) under the optimized conditions: enzyme load of 20% (w/w), reaction temperature of 53°C and D-isoascorbic-to-palmitic acid molar ratio of 1:4, when the reaction parameters were set as: acetone 20 mL, 40 g/L molecular sieve content, and 200 rpm for a 24-h reaction time. Conclusion The findings of this study can serve as a reference for developing industrial processes for the preparation of isoascorbic acid esters, which might be used in food additives, cosmetic formulations and for the synthesis of other isoascorbic acid derivatives. PMID:23835418

  13. Reliability in content analysis: The case of semantic feature norms classification.

    PubMed

    Bolognesi, Marianna; Pilgram, Roosmaryn; van den Heerik, Romy

    2017-12-01

    Semantic feature norms (e.g., STIMULUS: car → RESPONSE: ) are commonly used in cognitive psychology to look into salient aspects of given concepts. Semantic features are typically collected in experimental settings and then manually annotated by the researchers into feature types (e.g., perceptual features, taxonomic features, etc.) by means of content analyses-that is, by using taxonomies of feature types and having independent coders perform the annotation task. However, the ways in which such content analyses are typically performed and reported are not consistent across the literature. This constitutes a serious methodological problem that might undermine the theoretical claims based on such annotations. In this study, we first offer a review of some of the released datasets of annotated semantic feature norms and the related taxonomies used for content analysis. We then provide theoretical and methodological insights in relation to the content analysis methodology. Finally, we apply content analysis to a new dataset of semantic features and show how the method should be applied in order to deliver reliable annotations and replicable coding schemes. We tackle the following issues: (1) taxonomy structure, (2) the description of categories, (3) coder training, and (4) sustainability of the coding scheme-that is, comparison of the annotations provided by trained versus novice coders. The outcomes of the project are threefold: We provide methodological guidelines for semantic feature classification; we provide a revised and adapted taxonomy that can (arguably) be applied to both concrete and abstract concepts; and we provide a dataset of annotated semantic feature norms.

  14. Application of design sensitivity analysis for greater improvement on machine structural dynamics

    NASA Technical Reports Server (NTRS)

    Yoshimura, Masataka

    1987-01-01

    Methodologies are presented for greatly improving machine structural dynamics by using design sensitivity analyses and evaluative parameters. First, design sensitivity coefficients and evaluative parameters of structural dynamics are described. Next, the relations between the design sensitivity coefficients and the evaluative parameters are clarified. Then, design improvement procedures of structural dynamics are proposed for the following three cases: (1) addition of elastic structural members, (2) addition of mass elements, and (3) substantial changes of joint design variables. Cases (1) and (2) correspond to the changes of the initial framework or configuration, and (3) corresponds to the alteration of poor initial design variables. Finally, numerical examples are given for demonstrating the availability of the methods proposed.

  15. Pressure-induced structural phase transformation and superconducting properties of titanium mononitride

    NASA Astrophysics Data System (ADS)

    Li, Qian; Guo, Yanan; Zhang, Miao; Ge, Xinlei

    2018-03-01

    In this work, we have systematically performed the first-principles structure search on titanium mononitride (TiN) within Crystal Structure AnaLYsis by Particle Swarm Optimization (CALYPSO) methodology at high pressures. Here, we have confirmed a phase transition from cubic rock-salt (fcc) phase to CsCl (bcc) phase of TiN at ∼348 GPa. Further simulations reveal that the bcc phase is dynamically stable, and could be synthesized experimentally in principle. The calculated elastic anisotropy decreases with the phase transformation from fcc to bcc structure under high pressures, and the material changes from ductile to brittle simultaneously. Moreover, we found that both structures are superconductive with the superconducting critical temperature of 2-12 K.

  16. SiteBinder: an improved approach for comparing multiple protein structural motifs.

    PubMed

    Sehnal, David; Vařeková, Radka Svobodová; Huber, Heinrich J; Geidl, Stanislav; Ionescu, Crina-Maria; Wimmerová, Michaela; Koča, Jaroslav

    2012-02-27

    There is a paramount need to develop new techniques and tools that will extract as much information as possible from the ever growing repository of protein 3D structures. We report here on the development of a software tool for the multiple superimposition of large sets of protein structural motifs. Our superimposition methodology performs a systematic search for the atom pairing that provides the best fit. During this search, the RMSD values for all chemically relevant pairings are calculated by quaternion algebra. The number of evaluated pairings is markedly decreased by using PDB annotations for atoms. This approach guarantees that the best fit will be found and can be applied even when sequence similarity is low or does not exist at all. We have implemented this methodology in the Web application SiteBinder, which is able to process up to thousands of protein structural motifs in a very short time, and which provides an intuitive and user-friendly interface. Our benchmarking analysis has shown the robustness, efficiency, and versatility of our methodology and its implementation by the successful superimposition of 1000 experimentally determined structures for each of 32 eukaryotic linear motifs. We also demonstrate the applicability of SiteBinder using three case studies. We first compared the structures of 61 PA-IIL sugar binding sites containing nine different sugars, and we found that the sugar binding sites of PA-IIL and its mutants have a conserved structure despite their binding different sugars. We then superimposed over 300 zinc finger central motifs and revealed that the molecular structure in the vicinity of the Zn atom is highly conserved. Finally, we superimposed 12 BH3 domains from pro-apoptotic proteins. Our findings come to support the hypothesis that there is a structural basis for the functional segregation of BH3-only proteins into activators and enablers.
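
    The superimposition idea behind SiteBinder (finding the atom pairing and rotation that minimize the RMSD) can be illustrated with the standard Kabsch SVD algorithm for a fixed atom pairing. Note that the published tool uses quaternion algebra and searches over chemically relevant pairings, which the short sketch below does not attempt.

    ```python
    import numpy as np

    def superimpose_rmsd(P, Q):
        """Optimally superimpose paired atom coordinates Q onto P and return the RMSD.

        P, Q: (N, 3) arrays of matched atom positions (same ordering)."""
        P = P - P.mean(axis=0)
        Q = Q - Q.mean(axis=0)
        H = Q.T @ P                              # covariance matrix
        U, S, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # avoid improper rotation (reflection)
        D = np.diag([1.0, 1.0, d])
        R = Vt.T @ D @ U.T                       # optimal rotation
        diff = P - Q @ R.T
        return np.sqrt((diff ** 2).sum() / len(P))

    # Toy check: a rotated and translated copy of a motif should give ~zero RMSD
    rng = np.random.default_rng(2)
    P = rng.standard_normal((12, 3))
    theta = 0.7
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
    Q = P @ Rz.T + np.array([1.0, -2.0, 0.5])
    print(f"RMSD = {superimpose_rmsd(P, Q):.2e}")
    ```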

  17. Coastal erosion hazard and vulnerability using GIS tools: comparison between La Barra town, Buenaventura (Pacific Ocean of Colombia) and Providence - Santa Catalina islands (Colombian Caribbean Sea)

    NASA Astrophysics Data System (ADS)

    Coca-Domínguez, Oswaldo; Ricaurte-Villota, Constanza; Morales-Giraldo, David; Rangel-Buitrago, Nelson

    2014-05-01

    Analysis of the hazards and vulnerability associated with coastal erosion along coastlines is a first step in establishing plans for adaptation to climate change in coastal areas. La Barra Town, Buenaventura (Pacific Ocean of Colombia) and Providence - Santa Catalina Islands (Colombian Caribbean) were selected to develop a detailed analysis of coastal erosion hazard and vulnerability from different perspectives: i) physical (hazard), ii) social, iii) conservation approach and iv) cultural heritage (Raizal). The analysis was made with a semi-quantitative approximation method, applying variables associated with the intrinsic coastal zone properties (i.e. type of beach, exposure of the coast to waves, etc.). Coastal erosion data and associated variables, as well as land use, conservation and heritage data, were used to carry out a further detailed analysis of the human and structural vulnerability and exposure to hazards. The data show erosion rates close to -17 m yr-1 in La Barra Town (highlighting its critical condition and the urgency of relocation), while in some sectors of Providence Island, such as Old Town, the erosion rate was -5 m yr-1. The observed erosion process directly affects land use and the local and regional economy. The differences between the indexes and the structural and physical vulnerability, as well as the use of methodological variables, are presented in the context of each region. In this work, all the information was processed in a GIS environment, since this allows the information to be edited and updated continuously. The application of this methodology generates useful information to promote risk management as well as prevention, mitigation and reduction plans. In both areas adaptation must be considered a priority strategy, including relocation alternatives and sustainable protection supported by studies of coastal uses and future outlooks. The methodology is framed in the use of GIS tools and highlights their benefits in the analysis of information.

  18. Using Plate Finite Elements for Modeling Fillets in Design, Optimization, and Dynamic Analysis

    NASA Technical Reports Server (NTRS)

    Brown, A. M.; Seugling, R. M.

    2003-01-01

    A methodology has been developed that allows the use of plate elements instead of numerically inefficient solid elements for modeling structures with 90 degree fillets. The technique uses plate bridges with pseudo Young's modulus (Eb) and thickness (tb) values placed between the tangent points of the fillets. These parameters are obtained by solving two nonlinear simultaneous equations in terms of the independent variables r/t and twall/t. These equations are generated by equating the rotation at the tangent point of a bridge system with that of a fillet, where both rotations are derived using beam theory. Accurate surface fits of the solutions are also presented to provide the user with closed-form equations for the parameters. The methodology was verified on the subcomponent level and with a representative filleted structure, where the technique yielded a plate model exhibiting a level of accuracy better than or equal to a high-fidelity solid model and with a 90-percent reduction in the number of DOFs. The application of this method for parametric design studies, optimization, and dynamic analysis should prove extremely beneficial for the finite element practitioner. Although the method does not attempt to produce accurate stresses in the filleted region, it can also be used to obtain stresses elsewhere in the structure for preliminary analysis. A future avenue of study is to extend the theory developed here to other fillet geometries, including fillet angles other than 90 degrees and multifaceted intersections.

  19. Vulnerability curves vs. vulnerability indicators: application of an indicator-based methodology for debris-flow hazards

    NASA Astrophysics Data System (ADS)

    Papathoma-Köhle, Maria

    2016-08-01

    The assessment of the physical vulnerability of elements at risk as part of the risk analysis is an essential aspect for the development of strategies and structural measures for risk reduction. Understanding, analysing and, if possible, quantifying physical vulnerability is a prerequisite for designing strategies and adopting tools for its reduction. The most common methods for assessing physical vulnerability are vulnerability matrices, vulnerability curves and vulnerability indicators; however, in most of the cases, these methods are used in a conflicting way rather than in combination. The article focuses on two of these methods: vulnerability curves and vulnerability indicators. Vulnerability curves express physical vulnerability as a function of the intensity of the process and the degree of loss, considering, in individual cases only, some structural characteristics of the affected buildings. However, a considerable amount of studies argue that vulnerability assessment should focus on the identification of these variables that influence the vulnerability of an element at risk (vulnerability indicators). In this study, an indicator-based methodology (IBM) for mountain hazards including debris flow (Kappes et al., 2012) is applied to a case study for debris flows in South Tyrol, where in the past a vulnerability curve has been developed. The relatively "new" indicator-based method is being scrutinised and recommendations for its improvement are outlined. The comparison of the two methodological approaches and their results is challenging since both methodological approaches deal with vulnerability in a different way. However, it is still possible to highlight their weaknesses and strengths, show clearly that both methodologies are necessary for the assessment of physical vulnerability and provide a preliminary "holistic methodological framework" for physical vulnerability assessment showing how the two approaches may be used in combination in the future.

  20. Inferring hidden causal relations between pathway members using reduced Google matrix of directed biological networks

    PubMed Central

    2018-01-01

    Signaling pathways represent parts of the global biological molecular network which connects them into a seamless whole through complex direct and indirect (hidden) crosstalk whose structure can change during development or in pathological conditions. We suggest a novel methodology, called Googlomics, for the structural analysis of directed biological networks using spectral analysis of their Google matrices, using parallels with quantum scattering theory, developed for nuclear and mesoscopic physics and quantum chaos. We introduce analytical “reduced Google matrix” method for the analysis of biological network structure. The method allows inferring hidden causal relations between the members of a signaling pathway or a functionally related group of genes. We investigate how the structure of hidden causal relations can be reprogrammed as a result of changes in the transcriptional network layer during cancerogenesis. The suggested Googlomics approach rigorously characterizes complex systemic changes in the wiring of large causal biological networks in a computationally efficient way. PMID:29370181
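
    The Google matrix construction underlying the reduced-matrix method above follows the standard PageRank recipe: column-normalize the directed adjacency matrix, patch dangling nodes, and add a damping term. The sketch below builds that full Google matrix and its leading eigenvector for a toy directed network; it does not implement the reduced Google matrix derivation itself, and the wiring and damping factor are assumed.

    ```python
    import numpy as np

    def google_matrix(A, alpha=0.85):
        """Google matrix of a directed network given its adjacency matrix A (A[i, j] = 1 for edge j -> i)."""
        n = A.shape[0]
        col_sums = A.sum(axis=0)
        # column-stochastic matrix: normalize columns, replace dangling columns by 1/n
        S = np.where(col_sums > 0, A / np.where(col_sums == 0, 1, col_sums), 1.0 / n)
        return alpha * S + (1.0 - alpha) / n * np.ones((n, n))

    def pagerank(G, tol=1e-12, max_iter=1000):
        """Leading eigenvector of the column-stochastic Google matrix by power iteration."""
        n = G.shape[0]
        p = np.full(n, 1.0 / n)
        for _ in range(max_iter):
            p_new = G @ p
            if np.abs(p_new - p).sum() < tol:
                break
            p = p_new
        return p_new / p_new.sum()

    # Toy 5-node directed "pathway": edges j -> i recorded as A[i, j] = 1 (hypothetical wiring)
    A = np.zeros((5, 5))
    for src, dst in [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 2)]:
        A[dst, src] = 1.0
    G = google_matrix(A)
    print("PageRank-like node weights:", pagerank(G).round(3))
    ```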

  1. Techniques of remote sensing applied to the environmental analysis of part of an aquifer located in the São José dos Campos Region sp, Brazil.

    PubMed

    Bressan, Mariana Affonseca; Dos Anjos, Célio Eustáquio

    2003-05-01

    Anthropogenic activity on the surface can modify and introduce new mechanisms of recharge to the groundwater system, modifying the rate, frequency and quality of groundwater recharge. The understanding of these mechanisms and the correct evaluation of such modifications are fundamental in determining the vulnerability to groundwater contamination. The groundwater flow of the South Paraíba Compartment, in the region of São José dos Campos, São Paulo, is directly related to structural features of the Taubaté Basin and, therefore, the analysis of its behaviour enhances the understanding of the tectonic structure. The methodology adopted for this work consists of pre-processing and processing of the satellite images, visual interpretation of HSI products, field work and data integration. The derivation of the main structural features was based on visual analysis of the texture elements of drainage and relief in sedimentary and crystalline rocks. Statistical analyses of the feature densities and of the metric-geometric relations between the analysed elements have been conducted. The crystalline rocks, on which the sediments were laid, condition and control the structural arrangement of the sedimentary formations. The formation of the South Paraíba Graben is associated with Cenozoic distensive movement which reactivated old features of crust weakness and generated previous cycles with normal characteristics. The environmental analysis is based on the integration of the existing methodology to characterise vulnerability to a universal pollutant with fracture density zones. The digital integration was processed using a GIS (Geographic Information System) to delineate five defined vulnerability classes. The hydrogeological settings were analysed in each thematic map and, using fuzzy logic, an index for each different vulnerability class was compiled. Evidence maps could be combined in a series of steps using map algebra.

  2. Protein classification using sequential pattern mining.

    PubMed

    Exarchos, Themis P; Papaloukas, Costas; Lampros, Christos; Fotiadis, Dimitrios I

    2006-01-01

    Protein classification in terms of fold recognition can be employed to determine the structural and functional properties of a newly discovered protein. In this work sequential pattern mining (SPM) is utilized for sequence-based fold recognition. One of the most efficient SPM algorithms, cSPADE, is employed for protein primary structure analysis. Then a classifier uses the extracted sequential patterns for classifying proteins of unknown structure in the appropriate fold category. The proposed methodology exhibited an overall accuracy of 36% in a multi-class problem of 17 candidate categories. The classification performance reaches up to 65% when the three most probable protein folds are considered.
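
    As a loose stand-in for the sequential-pattern step described above (cSPADE itself is not reproduced here), the sketch below represents each protein sequence by its contiguous k-gram frequencies and assigns a query to the nearest fold centroid; the sequences, labels and k are invented for illustration.

    ```python
    import numpy as np
    from collections import Counter
    from itertools import product

    AMINO = "ACDEFGHIKLMNPQRSTVWY"

    def kgram_vector(seq, k=2):
        """Frequency vector of contiguous k-grams (a crude stand-in for mined sequential patterns)."""
        grams = ["".join(p) for p in product(AMINO, repeat=k)]
        index = {g: i for i, g in enumerate(grams)}
        counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
        v = np.zeros(len(grams))
        for g, c in counts.items():
            if g in index:
                v[index[g]] = c
        return v / max(v.sum(), 1.0)

    def nearest_centroid(train, labels, query, k=2):
        """Assign the query sequence to the fold whose mean k-gram profile is closest."""
        X = np.array([kgram_vector(s, k) for s in train])
        centroids = {lab: X[np.array(labels) == lab].mean(axis=0) for lab in set(labels)}
        q = kgram_vector(query, k)
        return min(centroids, key=lambda lab: np.linalg.norm(q - centroids[lab]))

    # Toy example with made-up sequences and fold labels
    train = ["ACDEFGHIKAC", "ACDEFGAAACD", "WWYYWWYYWWY", "WYWYWYWYWYW"]
    labels = ["foldA", "foldA", "foldB", "foldB"]
    print(nearest_centroid(train, labels, "ACDEFGHACDE"))   # expected: foldA
    ```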

  3. Criteria for the Research Institute for Fragrance Materials, Inc. (RIFM) safety evaluation process for fragrance ingredients.

    PubMed

    Api, A M; Belsito, D; Bruze, M; Cadby, P; Calow, P; Dagli, M L; Dekant, W; Ellis, G; Fryer, A D; Fukayama, M; Griem, P; Hickey, C; Kromidas, L; Lalko, J F; Liebler, D C; Miyachi, Y; Politano, V T; Renskers, K; Ritacco, G; Salvito, D; Schultz, T W; Sipes, I G; Smith, B; Vitale, D; Wilcox, D K

    2015-08-01

    The Research Institute for Fragrance Materials, Inc. (RIFM) has been engaged in the generation and evaluation of safety data for fragrance materials since its inception over 45 years ago. Over time, RIFM's approach to gathering data, estimating exposure and assessing safety has evolved as the tools for risk assessment evolved. This publication is designed to update the RIFM safety assessment process, which follows a series of decision trees, reflecting advances in approaches in risk assessment and new and classical toxicological methodologies employed by RIFM over the past ten years. These changes include incorporating 1) new scientific information including a framework for choosing structural analogs, 2) consideration of the Threshold of Toxicological Concern (TTC), 3) the Quantitative Risk Assessment (QRA) for dermal sensitization, 4) the respiratory route of exposure, 5) aggregate exposure assessment methodology, 6) the latest methodology and approaches to risk assessments, 7) the latest alternatives to animal testing methodology and 8) environmental risk assessment. The assessment begins with a thorough analysis of existing data followed by in silico analysis, identification of 'read across' analogs, generation of additional data through in vitro testing as well as consideration of the TTC approach. If necessary, risk management may be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Canadian Air Force Establishment Analysis: Creating a meta-methodology to address integrated questions of force structure, workforce planning and organizational design

    DTIC Science & Technology

    2010-01-28

    Extracted briefing fragments: Considerations - position categories (hard; generic or "soft"; advanced training), language requirements, need for military, combat and/or field... Analysis (DGMPRA). Presentation to MORS WG: Personnel and National Security: A Quantitative Approach, 25-28 January 2010. Defence Research and... SUPPLEMENTARY NOTES: Personnel and National Security: A Quantitative Approach (Unclass), 25-28 January 2010, Johns Hopkins University Applied Physics

  5. Development and application of optimum sensitivity analysis of structures

    NASA Technical Reports Server (NTRS)

    Barthelemy, J. F. M.; Hallauer, W. L., Jr.

    1984-01-01

    The research focused on developing an algorithm applying optimum sensitivity analysis for multilevel optimization. The research efforts have been devoted to assisting NASA Langley's Interdisciplinary Research Office (IRO) in the development of a mature methodology for a multilevel approach to the design of complex (large and multidisciplinary) engineering systems. An effort was undertaken to identify promising multilevel optimization algorithms. In the current reporting period, the computer program generating baseline single level solutions was completed and tested out.

  6. Frequency analysis of a two-stage planetary gearbox using two different methodologies

    NASA Astrophysics Data System (ADS)

    Feki, Nabih; Karray, Maha; Khabou, Mohamed Tawfik; Chaari, Fakher; Haddar, Mohamed

    2017-12-01

    This paper is focused on the characterization of the frequency content of vibration signals issued from a two-stage planetary gearbox. To achieve this goal, two different methodologies are adopted: the lumped-parameter modeling approach and the phenomenological modeling approach. The two methodologies aim to describe the complex vibrations generated by a two-stage planetary gearbox. The phenomenological model describes directly the vibrations as measured by a sensor fixed outside the fixed ring gear with respect to an inertial reference frame, while results from a lumped-parameter model are referenced with respect to a rotating frame and then transferred into an inertial reference frame. Two different case studies of the two-stage planetary gear are adopted to describe the vibration and the corresponding spectra using both models. Each case presents a specific geometry and a specific spectral structure.

  7. Engine dynamic analysis with general nonlinear finite element codes. II - Bearing element implementation, overall numerical characteristics and benchmarking

    NASA Technical Reports Server (NTRS)

    Padovan, J.; Adams, M.; Lam, P.; Fertis, D.; Zeid, I.

    1982-01-01

    Second-year efforts within a three-year study to develop and extend finite element (FE) methodology to efficiently handle the transient/steady-state response of the rotor-bearing-stator structure associated with gas turbine engines are outlined. The two main areas aim at (1) implementing the squeeze film damper element into a general purpose FE code for testing and evaluation; and (2) determining the numerical characteristics of the FE-generated rotor-bearing-stator simulation scheme. The governing FE field equations are set out and the solution methodology is presented. The choice of ADINA as the general-purpose FE code is explained, and the numerical operational characteristics of the direct integration approach of FE-generated rotor-bearing-stator simulations are determined, including benchmarking, comparison of explicit vs. implicit methodologies of direct integration, and demonstration problems.
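
    The direct time integration discussed above can be illustrated with a standard implicit Newmark (average-acceleration) scheme applied to a toy two-degree-of-freedom rotor-like system; the matrices, damping and unbalance forcing below are assumed for demonstration and do not represent the ADINA squeeze-film-damper implementation.

    ```python
    import numpy as np

    def newmark(M, C, K, f, x0, v0, dt, nsteps, beta=0.25, gamma=0.5):
        """Implicit Newmark (average-acceleration) direct time integration of M x'' + C x' + K x = f(t)."""
        n = len(x0)
        x = np.zeros((nsteps + 1, n)); v = np.zeros_like(x); a = np.zeros_like(x)
        x[0], v[0] = x0, v0
        a[0] = np.linalg.solve(M, f(0.0) - C @ v0 - K @ x0)
        Keff = K + gamma / (beta * dt) * C + M / (beta * dt**2)
        for i in range(nsteps):
            t1 = (i + 1) * dt
            rhs = (f(t1)
                   + M @ (x[i] / (beta * dt**2) + v[i] / (beta * dt) + (1/(2*beta) - 1) * a[i])
                   + C @ (gamma / (beta * dt) * x[i] + (gamma/beta - 1) * v[i]
                          + dt * (gamma/(2*beta) - 1) * a[i]))
            x[i+1] = np.linalg.solve(Keff, rhs)
            a[i+1] = (x[i+1] - x[i]) / (beta * dt**2) - v[i] / (beta * dt) - (1/(2*beta) - 1) * a[i]
            v[i+1] = v[i] + dt * ((1 - gamma) * a[i] + gamma * a[i+1])
        return x, v, a

    # Toy 2-DOF rotor-like model (hypothetical values), driven by a rotating unbalance force
    M = np.diag([10.0, 10.0])
    K = np.array([[4.0e6, -1.0e6], [-1.0e6, 4.0e6]])
    C = 1e-4 * K + 0.5 * M                              # simple proportional damping
    omega = 300.0                                       # rad/s
    force = lambda t: np.array([50.0 * np.cos(omega * t), 50.0 * np.sin(omega * t)])
    x, v, a = newmark(M, C, K, force, np.zeros(2), np.zeros(2), dt=1e-4, nsteps=5000)
    print("steady-state amplitude estimate:", np.abs(x[2500:]).max(axis=0))
    ```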

  8. Accurate interlaminar stress recovery from finite element analysis

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander; Riggs, H. Ronald

    1994-01-01

    The accuracy and robustness of a two-dimensional smoothing methodology is examined for the problem of recovering accurate interlaminar shear stress distributions in laminated composite and sandwich plates. The smoothing methodology is based on a variational formulation which combines discrete least-squares and penalty-constraint functionals in a single variational form. The smoothing analysis utilizes optimal strains computed at discrete locations in a finite element analysis. These discrete strain data are smoothed with a smoothing element discretization, producing superior accuracy strains and their first gradients. The approach enables the resulting smooth strain field to be practically C1-continuous throughout the domain of smoothing, exhibiting superconvergent properties of the smoothed quantity. The continuous strain gradients are also obtained directly from the solution. The recovered strain gradients are subsequently employed in the integration of equilibrium equations to obtain accurate interlaminar shear stresses. The example problem is a simply supported rectangular plate under a doubly sinusoidal load. The problem has an exact analytic solution which serves as a measure of goodness of the recovered interlaminar shear stresses. The method has the versatility of being applicable to the analysis of rather general and complex structures built of distinct components and materials, such as found in aircraft design. For these types of structures, the smoothing is achieved with 'patches', each patch covering the domain in which the smoothed quantity is physically continuous.
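
    The combined least-squares plus penalty-constraint functional described above can be illustrated in one dimension: minimizing a data-misfit term plus a curvature penalty over nodal values yields a linear system whose solution is a smooth field with usable first derivatives. The sketch below applies that generic idea to noisy samples of a sine field; it is not the authors' smoothing-element discretization, and the grid, noise and penalty weight are assumed.

    ```python
    import numpy as np

    def smooth_field(y, lam=50.0):
        """Penalty-constrained least-squares smoothing of noisy nodal data on a uniform 1-D grid."""
        n = len(y)
        D2 = np.zeros((n - 2, n))
        for i in range(n - 2):                 # second-difference (curvature) operator
            D2[i, i:i + 3] = [1.0, -2.0, 1.0]
        A = np.eye(n) + lam * D2.T @ D2        # normal equations of the combined functional
        return np.linalg.solve(A, y)

    # Hypothetical "finite element" strain samples along a line through the thickness
    rng = np.random.default_rng(4)
    z = np.linspace(0.0, 1.0, 81)
    true_strain = np.sin(np.pi * z)                         # smooth underlying field
    noisy = true_strain + 0.05 * rng.standard_normal(z.size)

    smoothed = smooth_field(noisy, lam=200.0)
    dz = z[1] - z[0]
    grad = np.gradient(smoothed, dz)                        # smooth first derivative of the field
    print("max error of smoothed field :", np.abs(smoothed - true_strain).max().round(4))
    print("max error of its gradient   :", np.abs(grad - np.pi * np.cos(np.pi * z)).max().round(3))
    ```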

  9. An Overview of Prognosis Health Management Research at Glenn Research Center for Gas Turbine Engine Structures With Special Emphasis on Deformation and Damage Modeling

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.

    2009-01-01

    Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include (1) diagnostic/detection methodology, (2) prognosis/lifing methodology, (3) diagnostic/prognosis linkage, (4) experimental validation, and (5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.

  10. An Overview of Prognosis Health Management Research at GRC for Gas Turbine Engine Structures With Special Emphasis on Deformation and Damage Modeling

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.

    2009-01-01

    Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include 1) diagnostic/detection methodology, 2) prognosis/lifing methodology, 3) diagnostic/prognosis linkage, 4) experimental validation and 5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multi-mechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.

  11. Network Ethnography and the "Cyberflâneur": Evolving Policy Sociology in Education

    ERIC Educational Resources Information Center

    Hogan, Anna

    2016-01-01

    This paper makes the argument that new global spatialities and new governance structures in education have important implications for how we think about education policy and do education policy analysis. This context necessitates that researchers engage in new methodologies to ensure that there is a suitable link between their research problem and…

  12. Teaching Health Education. A Thematic Analysis of Early Career Teachers' Experiences Following Pre-Service Health Training

    ERIC Educational Resources Information Center

    Pickett, Karen; Rietdijk, Willeke; Byrne, Jenny; Shepherd, Jonathan; Roderick, Paul; Grace, Marcus

    2017-01-01

    Purpose: The purpose of this paper is to understand early career teachers' perceptions of the impact of a pre-service health education programme on their health promotion practice in schools and the contextual factors that influence this. Design/methodology/approach: Semi-structured interviews were conducted with 14 primary and secondary trainee…

  13. A Confirmatory Factor Analysis for SERVPERF Instrument Based on a Sample of Students from Syrian Universities

    ERIC Educational Resources Information Center

    Mahmoud, Ali Bassam; Khalifa, Bayan

    2015-01-01

    Purpose: The purpose of this paper is to confirm the factorial structure of SERVPERF based on an exploration of its dimensionality among Syrian universities' students. It also aimed at assessing the perceived service quality offered at these universities. Design/methodology/approach: A cross-sectional survey was conducted targeting students at…

  14. The Adoption Process of Ricefield-Based Fish Seed Production in Northwest Bangladesh: An Understanding through Quantitative and Qualitative Investigation

    ERIC Educational Resources Information Center

    Haque, Mohammad Mahfujul; Little, David C.; Barman, Benoy K.; Wahab, Md. Abdul

    2010-01-01

    Purpose: The purpose of the study was to understand the adoption process of ricefield based fish seed production (RBFSP) that has been developed, promoted and established in Northwest Bangladesh. Design/Methodology/Approach: Quantitative investigation based on regression analysis and qualitative investigation using semi-structured interview were…

  15. Family Structure, Mother-Child Communication, Father-Child Communication, and Adolescent Life Satisfaction: A Cross-Sectional Multilevel Analysis

    ERIC Educational Resources Information Center

    Levin, Kate A.; Currie, Candace

    2010-01-01

    Purpose: The purpose of this paper is to investigate the association between mother-child and father-child communication and children's life satisfaction, and the moderating effect of communication with stepparents. Design/methodology/approach: Data from the 2006 Health Behaviour in School-aged Children: WHO-collaborative Study in Scotland…

  16. School as a Determinant for Health Outcomes--A Structural Equation Model Analysis

    ERIC Educational Resources Information Center

    Ravens-Sieberer, Ulrike; Freeman, John; Kokonyei, Gyongyi; Thomas, Christiane A.; Erhart, Michael

    2009-01-01

    Purpose: The purpose of this paper is to investigate whether students' perceptions of their school environment and their adjustment to school are associated with health outcomes across gender and age groups. Design/methodology/approach: Data from the cross-sectional international Health Behavior in School-aged Children Survey of the year 2002…

  17. Four-Year Cross-Lagged Associations between Physical and Mental Health in the Medical Outcomes Study.

    ERIC Educational Resources Information Center

    Hays, Ron D.; And Others

    1994-01-01

    Applied structural equation modeling to evaluation of cross-lagged panel models. Self-reports of physical and mental health at three time points spanning four-year interval were analyzed to illustrate cross-lagged analysis methodology. Data were analyzed from 856 patients with hypertension, diabetes, heart disease, or depression. Cross-lagged…

  18. The Relationships among Chinese Practicing Teachers' Epistemic Beliefs, Pedagogical Beliefs and Their Beliefs about the Use of ICT

    ERIC Educational Resources Information Center

    Deng, Feng; Chai, Ching Sing; Tsai, Chin-Chung; Lee, Min-Hsien

    2014-01-01

    This study aimed to investigate the relationships among practicing teachers' epistemic beliefs, pedagogical beliefs and their beliefs about the use of ICT through survey methodology. Participants were 396 high school practicing teachers from mainland China. The path analysis results analyzed via structural equation modelling technique indicated…

  19. The Reciprocal Influence of Organizational Culture and Training and Development Programs: Building the Case for a Culture Analysis within Program Planning

    ERIC Educational Resources Information Center

    Kissack, Heather C.; Callahan, Jamie L.

    2010-01-01

    Purpose: The purpose of this paper is to demonstrate that training designers can, and should, account for organizational culture during training needs assessments. Design/methodology/approach: Utilizing the approach and arguments in Giddens' structuration theory, the paper conceptually applies these tenets to training and development programs…

  20. Validation of Virtual Learning Team Competencies for Individual Students in a Distance Education Setting

    ERIC Educational Resources Information Center

    Topchyan, Ruzanna; Zhang, Jie

    2014-01-01

    The purpose of this study was twofold. First, the study aimed to validate the scale of the Virtual Team Competency Inventory in distance education, which had initially been designed for a corporate setting. Second, the methodological advantages of Exploratory Structural Equation Modeling (ESEM) framework over Confirmatory Factor Analysis (CFA)…

  1. Teaching Interpersonal Communication through an Analysis of Students' Initial Interaction: A Q-Methodological Study of Styles in Meeting People.

    ERIC Educational Resources Information Center

    Aitken, Joan E.

    A study categorized self-perceptions of subjects regarding their feelings about initial communication interaction. Using Q-Technique, a total of 138 subjects, mostly students at a midsized, midwestern, urban university enrolled in interpersonal communication courses, were studied through the use of two structured Q-sorts containing statements…

  2. Class Size and Student Performance at a Public Research University: A Cross-Classified Model

    ERIC Educational Resources Information Center

    Johnson, Iryna Y.

    2010-01-01

    This study addresses several methodological problems that have confronted prior research on the effect of class size on student achievement. Unlike previous studies, this analysis accounts for the hierarchical data structure of student achievement, where grades are nested within classes and students, and considers a wide range of class sizes…

  3. A Common Methodology: Using Cluster Analysis to Identify Organizational Culture across Two Workforce Datasets

    ERIC Educational Resources Information Center

    Munn, Sunny L.

    2016-01-01

    Organizational structures are comprised of an organizational culture created by the beliefs, values, traditions, policies and processes carried out by the organization. The work-life system in which individuals use work-life initiatives to achieve a work-life balance can be influenced by the type of organizational culture within one's workplace,…

  4. Making Learning and Web 2.0 Technologies Work for Higher Learning Institutions in Africa

    ERIC Educational Resources Information Center

    Lwoga, Edda

    2012-01-01

    Purpose: This paper seeks to assess the extent to which learning and Web 2.0 technologies are utilised to support learning and teaching in Africa's higher learning institutions, with a specific focus on Tanzania's public universities. Design/methodology/approach: A combination of content analysis and semi-structured interviews was used to collect…

  5. Multiscale Analysis of Delamination of Carbon Fiber-Epoxy Laminates with Carbon Nanotubes

    NASA Technical Reports Server (NTRS)

    Riddick, Jaret C.; Frankland, SJV; Gates, TS

    2006-01-01

    A multi-scale analysis is presented to parametrically describe the Mode I delamination of a carbon fiber/epoxy laminate. In the midplane of the laminate, carbon nanotubes are included for the purpose of selectively enhancing the fracture toughness of the laminate. To analyze the carbon fiber/epoxy/carbon nanotube laminate, the multi-scale methodology presented here links a series of parameterizations taken at various length scales ranging from the atomistic through the micromechanical to the structural level. At the atomistic scale, molecular dynamics simulations are performed in conjunction with an equivalent continuum approach to develop constitutive properties for representative volume elements of the molecular structure of components of the laminate. The molecular-level constitutive results are then used in Mori-Tanaka micromechanics to develop bulk properties for the epoxy-carbon nanotube matrix system. To demonstrate a possible application of this multi-scale methodology, a double cantilever beam (DCB) specimen is modeled. An existing analysis is employed which uses discrete springs to model the fiber-bridging effect during delamination propagation. In the absence of empirical data or a damage mechanics model describing the effect of CNTs on fracture toughness, several traction laws are postulated, linking CNT volume fraction to fiber bridging in a DCB specimen. Results from this demonstration are presented in terms of DCB specimen load-displacement responses.

  6. Design, analysis, and test verification of advanced encapsulation systems

    NASA Technical Reports Server (NTRS)

    Mardesich, N.; Minning, C.

    1982-01-01

    Design sensitivities are established for the development of photovoltaic module criteria and the definition of needed research tasks. The program consists of three phases. In Phase I, analytical models were developed to perform optical, thermal, electrical, and structural analyses on candidate encapsulation systems. From these analyses, several candidate systems will be selected for qualification testing during Phase II. Additionally, during Phase II, test specimens of various types will be constructed and tested to determine the validity of the analysis methodology developed in Phase I. In Phase III, a finalized optimum design based on the knowledge gained in Phases I and II will be developed. All verification testing was completed during this period. Preliminary results and observations are discussed. Descriptions of the thermal, thermal-structural, and structural deflection test setups are included.

  7. Inference regarding multiple structural changes in linear models with endogenous regressors

    PubMed Central

    Hall, Alastair R.; Han, Sanggohn; Boldea, Otilia

    2012-01-01

    This paper considers the linear model with endogenous regressors and multiple changes in the parameters at unknown times. It is shown that minimization of a Generalized Method of Moments criterion yields inconsistent estimators of the break fractions, but minimization of the Two Stage Least Squares (2SLS) criterion yields consistent estimators of these parameters. We develop a methodology for estimation and inference of the parameters of the model based on 2SLS. The analysis covers the cases where the reduced form is either stable or unstable. The methodology is illustrated via an application to the New Keynesian Phillips Curve for the US. PMID:23805021

  8. Stochastic HKMDHE: A multi-objective contrast enhancement algorithm

    NASA Astrophysics Data System (ADS)

    Pratiher, Sawon; Mukhopadhyay, Sabyasachi; Maity, Srideep; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.

    2018-02-01

    This contribution proposes a novel extension of the existing 'Hyper Kurtosis based Modified Duo-Histogram Equalization' (HKMDHE) algorithm for multi-objective contrast enhancement of biomedical images. A modified objective function has been formulated by joint optimization of the individual histogram equalization objectives. The adequacy of the proposed methodology with respect to image quality metrics such as brightness preservation, peak signal-to-noise ratio (PSNR), Structural Similarity Index (SSIM) and the universal image quality metric has been experimentally validated. A performance analysis of the proposed Stochastic HKMDHE against existing histogram equalization methodologies such as Global Histogram Equalization (GHE) and Contrast Limited Adaptive Histogram Equalization (CLAHE) is given for comparative evaluation.
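
    A minimal sketch of how two of the quality metrics named above (PSNR and SSIM) can be computed for an enhanced image against its original is given below; it uses scikit-image, assumes placeholder file names, and does not reproduce the Stochastic HKMDHE algorithm itself.

```python
# Illustrative sketch (not the Stochastic HKMDHE algorithm itself): computing two of the
# quality metrics named above -- PSNR and SSIM -- for an enhanced image against its original.
# Assumes scikit-image and NumPy are available; file names are placeholders.
import numpy as np
from skimage import io, img_as_float
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

original = img_as_float(io.imread("original.png", as_gray=True))
enhanced = img_as_float(io.imread("enhanced.png", as_gray=True))

psnr = peak_signal_noise_ratio(original, enhanced, data_range=1.0)
ssim = structural_similarity(original, enhanced, data_range=1.0)

# Mean brightness shift as a crude proxy for "brightness preserving ability".
brightness_shift = abs(enhanced.mean() - original.mean())

print(f"PSNR = {psnr:.2f} dB, SSIM = {ssim:.4f}, mean-brightness shift = {brightness_shift:.4f}")
```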

  9. Virtual screening of compound libraries.

    PubMed

    Cerqueira, Nuno M F S A; Sousa, Sérgio F; Fernandes, Pedro A; Ramos, Maria João

    2009-01-01

    During the last decade, Virtual Screening (VS) has definitively established itself as an important part of the drug discovery and development process. VS involves the selection of likely drug candidates from large libraries of chemical structures by using computational methodologies, but the generic definition of VS encompasses many different methodologies. This chapter provides an introduction to the field by reviewing a variety of important aspects, including the different types of virtual screening methods and the several steps required for a successful virtual screening campaign within a state-of-the-art approach, from target selection to postfilter application. This analysis is further complemented with a small collection of important VS success stories.

  10. Aerothermoelastic analysis of a NASP demonstrator model

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Zeiler, Thomas A.; Pototzky, Anthony S.; Spain, Charles V.; Engelund, Walter C.

    1993-01-01

    The proposed National AeroSpace Plane (NASP) is designed to travel at speeds up to Mach 25. Because aerodynamic heating during high-speed flight through the atmosphere could destiffen a structure, significant couplings between the elastic and rigid body modes could result in lower flutter speeds and more pronounced aeroelastic response characteristics. These flight conditions will also generate thermal loads on the structure. The purpose of this research is to develop methodologies applicable to the NASP and to apply them to a representative model to determine its aerothermoelastic characteristics when subjected to these thermal loads. This paper describes an aerothermoelastic analysis of the generic hypersonic vehicle configuration. The steps involved in this analysis were: (1) generating vehicle surface temperatures at the appropriate flight conditions; (2) applying these temperatures to the vehicle's structure to predict changes in the stiffness resulting from material property degradation; (3) predicting the vibration characteristics of the heated structure at the various temperature conditions; (4) performing aerodynamic analyses; and (5) conducting flutter analysis of the heated vehicle. Results of these analyses and conclusions representative of a NASP vehicle are provided in this paper.

  11. Aircraft Structural Mass Property Prediction Using Conceptual-Level Structural Analysis

    NASA Technical Reports Server (NTRS)

    Sexstone, Matthew G.

    1998-01-01

    This paper describes a methodology that extends the use of the Equivalent LAminated Plate Solution (ELAPS) structural analysis code from conceptual-level aircraft structural analysis to conceptual-level aircraft mass property analysis. Mass property analysis in aircraft structures has historically depended upon parametric weight equations at the conceptual design level and Finite Element Analysis (FEA) at the detailed design level. ELAPS allows for the modeling of detailed geometry, metallic and composite materials, and non-structural mass coupled with analytical structural sizing to produce high-fidelity mass property analyses representing fully configured vehicles early in the design process. This capability is especially valuable for unusual configuration and advanced concept development where existing parametric weight equations are inapplicable and FEA is too time consuming for conceptual design. This paper contrasts the use of ELAPS relative to empirical weight equations and FEA. ELAPS modeling techniques are described and the ELAPS-based mass property analysis process is detailed. Examples of mass property stochastic calculations produced during a recent systems study are provided. This study involved the analysis of three remotely piloted aircraft required to carry scientific payloads to very high altitudes at subsonic speeds. Due to the extreme nature of this high-altitude flight regime, few existing vehicle designs are available for use in performance and weight prediction. ELAPS was employed within a concurrent engineering analysis process that simultaneously produces aerodynamic, structural, and static aeroelastic results for input to aircraft performance analyses. The ELAPS models produced for each concept were also used to provide stochastic analyses of wing structural mass properties. The results of this effort indicate that ELAPS is an efficient means to conduct multidisciplinary trade studies at the conceptual design level.

  13. Research and development activities in unified control-structure modeling and design

    NASA Technical Reports Server (NTRS)

    Nayak, A. P.

    1985-01-01

    Results of work to develop a unified control/structures modeling and design capability for large space structures modeling are presented. Recent analytical results are presented to demonstrate the significant interdependence between structural and control properties. A new design methodology is suggested in which the structure, material properties, dynamic model and control design are all optimized simultaneously. Parallel research done by other researchers is reviewed. The development of a methodology for global design optimization is recommended as a long-term goal. It is suggested that this methodology should be incorporated into computer aided engineering programs, which eventually will be supplemented by an expert system to aid design optimization.

  14. The Structure of the Mouse Serotonin 5-HT3 Receptor in Lipid Vesicles.

    PubMed

    Kudryashev, Mikhail; Castaño-Díez, Daniel; Deluz, Cédric; Hassaine, Gherici; Grasso, Luigino; Graf-Meyer, Alexandra; Vogel, Horst; Stahlberg, Henning

    2016-01-05

    The function of membrane proteins is best understood if their structure in the lipid membrane is known. Here, we determined the structure of the mouse serotonin 5-HT3 receptor inserted in lipid bilayers to a resolution of 12 Å without stabilizing antibodies by cryo electron tomography and subtomogram averaging. The reconstruction reveals protein secondary structure elements in the transmembrane region, the extracellular pore, and the transmembrane channel pathway, showing an overall similarity to the available X-ray model of the truncated 5-HT3 receptor determined in the presence of a stabilizing nanobody. Structural analysis of the 5-HT3 receptor embedded in a lipid bilayer allowed the position of the membrane to be determined. Interactions between the densely packed receptors in lipids were visualized, revealing that the interactions were maintained by the short horizontal helices. In combination with methodological improvements, our approach enables the structural analysis of membrane proteins in response to voltage and ligand gating. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Modal density of rectangular structures in a wide frequency range

    NASA Astrophysics Data System (ADS)

    Parrinello, A.; Ghiringhelli, G. L.

    2018-04-01

    A novel approach to investigate the modal density of a rectangular structure in a wide frequency range is presented. First, the modal density is derived, in the whole frequency range of interest, on the basis of sound transmission through the infinite counterpart of the structure; then, it is corrected by means of the low-frequency modal behavior of the structure, taking into account actual size and boundary conditions. A statistical analysis reveals the connection between the modal density of the structure and the transmission of sound through its thickness. A transfer matrix approach is used to compute the required acoustic parameters, making it possible to deal with structures having arbitrary stratifications of different layers. A finite element method is applied on coarse grids to derive the first few eigenfrequencies required to correct the modal density. Both the transfer matrix approach and the coarse grids involved in the finite element analysis grant high efficiency. Comparison with alternative formulations demonstrates the effectiveness of the proposed methodology.
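
    For context, the classical asymptotic modal density of a thin, isotropic rectangular plate in bending is reproduced below; it is a textbook baseline, not the transfer-matrix formulation of the paper, which additionally corrects the low-frequency behaviour using finite element eigenfrequencies.

```latex
% Classical asymptotic modal density (modes per rad/s) of a thin, isotropic rectangular
% plate in bending: A = plate area, h = thickness, rho = density, D = bending stiffness.
% Note that this asymptote is independent of frequency.
\[
  n(\omega) \;\approx\; \frac{A}{4\pi}\sqrt{\frac{\rho h}{D}},
  \qquad
  D = \frac{E h^{3}}{12\,(1-\nu^{2})}
\]
```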

  16. Combining analysis with optimization at Langley Research Center. An evolutionary process

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.

    1982-01-01

    The evolutionary process of combining analysis and optimization codes was traced with a view toward providing insight into the long-term goal of developing the methodology for an integrated, multidisciplinary software system for the concurrent analysis and optimization of aerospace structures. It was traced along the lines of strength sizing, concurrent strength and flutter sizing, and general optimization to define a near-term goal for combining analysis and optimization codes. Meeting this goal requires the development of a modular software system combining general-purpose, state-of-the-art, production-level analysis computer programs for structures, aerodynamics, and aeroelasticity with a state-of-the-art optimization program. Incorporation of a modular and flexible structural optimization software system into a state-of-the-art finite element analysis computer program facilitates this effort. The effort resulted in a software system that is controlled with a special-purpose language, communicates with a data management system, and is easily modified to add new programs and capabilities. A 337 degree-of-freedom finite element model is used to verify the accuracy of this system.

  17. Rating of Dynamic Coefficient for Simple Beam Bridge Design on High-Speed Railways

    NASA Astrophysics Data System (ADS)

    Diachenko, Leonid; Benin, Andrey; Smirnov, Vladimir; Diachenko, Anastasia

    2018-06-01

    The aim of this work is to improve the methodology for the dynamic computation of simple beam spans under the action of high-speed trains. Mathematical simulation using numerical and analytical methods of structural mechanics is employed in the research. The article analyses the parameters governing the effect of high-speed trains on simple beam bridge spans and suggests a technique for determining the dynamic coefficient applied to the live load. The reliability of the proposed methodology is confirmed by results of numerical simulation of high-speed train passage over spans at different speeds. The proposed algorithm of dynamic computation is based on a connection between the maximum acceleration of the span in the resonance mode of vibration and the main factors of the stress-strain state. The methodology allows determining both maximum and minimum values of the main internal forces in the structure, which makes it possible to perform endurance checks. It is noted that the dynamic additions for the components of the stress-strain state (bending moments, transverse force, and vertical deflections) are different. This condition necessitates a differentiated approach to the evaluation of dynamic coefficients when performing design verification for limit states of groups I and II. Practical importance: the methodology for determining the dynamic coefficients allows performing the dynamic calculation and determining the main internal forces in simple beam spans without numerical simulation and direct dynamic analysis, which significantly reduces the labour costs of design.

  18. An exact arithmetic toolbox for a consistent and reproducible structural analysis of metabolic network models

    PubMed Central

    Chindelevitch, Leonid; Trigg, Jason; Regev, Aviv; Berger, Bonnie

    2014-01-01

    Constraint-based models are currently the only methodology that allows the study of metabolism at the whole-genome scale. Flux balance analysis is commonly used to analyse constraint-based models. Curiously, the results of this analysis vary with the software being run, a situation that we show can be remedied by using exact rather than floating-point arithmetic. Here we introduce MONGOOSE, a toolbox for analysing the structure of constraint-based metabolic models in exact arithmetic. We apply MONGOOSE to the analysis of 98 existing metabolic network models and find that the biomass reaction is surprisingly blocked (unable to sustain non-zero flux) in nearly half of them. We propose a principled approach for unblocking these reactions and extend it to the problems of identifying essential and synthetic lethal reactions and minimal media. Our structural insights enable a systematic study of constraint-based metabolic models, yielding a deeper understanding of their possibilities and limitations. PMID:25291352
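
    The notion of a blocked reaction used above can be illustrated with a small floating-point sketch (MONGOOSE itself works in exact arithmetic, which this example deliberately does not reproduce): a reaction is blocked when its flux can be made neither positive nor negative under the steady-state constraint S v = 0 and the flux bounds.

```python
# Floating-point sketch of the "blocked reaction" notion used above (MONGOOSE itself works in
# exact arithmetic, which this example does not reproduce). A reaction j is blocked if its flux
# cannot be made non-zero while satisfying S v = 0 and the flux bounds.
import numpy as np
from scipy.optimize import linprog

def is_blocked(S, bounds, j, tol=1e-9):
    """Return True if reaction j can carry no flux in any steady state."""
    n = S.shape[1]
    c = np.zeros(n)
    c[j] = 1.0
    extremes = []
    for sign in (+1.0, -1.0):           # minimise v_j, then minimise -v_j (i.e. maximise v_j)
        res = linprog(sign * c, A_eq=S, b_eq=np.zeros(S.shape[0]),
                      bounds=bounds, method="highs")
        extremes.append(res.fun if res.success else 0.0)
    return all(abs(v) < tol for v in extremes)

# Toy network: 3 metabolites, 4 reactions (columns of S), reversible bounds on all fluxes.
S = np.array([[ 1, -1,  0,  0],
              [ 0,  1, -1,  0],
              [ 0,  0,  1, -1]], dtype=float)
bounds = [(-10.0, 10.0)] * 4
print([is_blocked(S, bounds, j) for j in range(4)])
```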

  19. Modified Distribution-Free Goodness-of-Fit Test Statistic.

    PubMed

    Chun, So Yeon; Browne, Michael W; Shapiro, Alexander

    2018-03-01

    Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.

  20. Development of methodology for identification the nature of the polyphenolic extracts by FTIR associated with multivariate analysis

    NASA Astrophysics Data System (ADS)

    Grasel, Fábio dos Santos; Ferrão, Marco Flôres; Wolf, Carlos Rodolfo

    2016-01-01

    Tannins are polyphenolic compounds of complex structure formed by secondary metabolism in several plants. These polyphenolic compounds have different applications, such as drugs, anti-corrosion agents, flocculants, and tanning agents. This study analyses six different types of polyphenolic extracts by Fourier transform infrared spectroscopy (FTIR) combined with multivariate analysis. Through both principal component analysis (PCA) and hierarchical cluster analysis (HCA), we observed well-defined separation between condensed (quebracho and black wattle) and hydrolysable (valonea, chestnut, myrobalan, and tara) tannins. For hydrolysable tannins, it was also possible to observe the formation of two subgroups, one comprising the chestnut and valonea samples and the other the tara and myrobalan samples. Among all samples analysed, chestnut and valonea showed the greatest similarity, indicating that these extracts have equivalent chemical compositions and structures and, therefore, similar properties.
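
    A sketch of the chemometric pipeline described above (PCA followed by hierarchical cluster analysis) is shown below; the spectra matrix and extract labels are placeholders standing in for the tannin data set, which is not reproduced here.

```python
# Sketch of the chemometric pipeline described above (PCA + hierarchical cluster analysis)
# applied to a matrix of FTIR spectra. `spectra` (samples x wavenumbers) is dummy data
# standing in for the tannin spectra, which are not reproduced here.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
spectra = rng.normal(size=(6, 400))          # 6 extracts, 400 wavenumber points (dummy data)
labels = ["quebracho", "black wattle", "valonea", "chestnut", "myrobalan", "tara"]

X = StandardScaler().fit_transform(spectra)  # mean-centre / scale each spectral variable
scores = PCA(n_components=2).fit_transform(X)

Z = linkage(X, method="ward")                # HCA on the same preprocessed spectra
groups = fcluster(Z, t=2, criterion="maxclust")

for name, pc, g in zip(labels, scores, groups):
    print(f"{name:12s} PC1={pc[0]:7.2f} PC2={pc[1]:7.2f} cluster={g}")
```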

  1. Fatigue Life Methodology for Bonded Composite Skin/Stringer Configurations

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald; Paris, Isabelle L.; OBrien, T. Kevin; Minguet, Pierre J.

    2001-01-01

    A methodology is presented for determining the fatigue life of composite structures based on fatigue characterization data and geometric nonlinear finite element (FE) analyses. To demonstrate the approach, predicted results were compared to fatigue tests performed on specimens which represented a tapered composite flange bonded onto a composite skin. In a first step, tension tests were performed to evaluate the debonding mechanisms between the flange and the skin. In a second step, a 2D FE model was developed to analyze the tests. To predict matrix cracking onset, the relationship between the tension load and the maximum principal stresses transverse to the fiber direction was determined through FE analysis. Transverse tension fatigue life data were used to generate an onset fatigue life P-N curve for matrix cracking. The resulting prediction was in good agreement with data from the fatigue tests. In a third step, a fracture mechanics approach based on FE analysis was used to determine the relationship between the tension load and the critical energy release rate. Mixed-mode energy release rate fatigue life data were used to create a fatigue life onset G-N curve for delamination. The resulting prediction was in good agreement with data from the fatigue tests. Further, the prediction curve for cumulative life to failure was generated from the previous onset fatigue life curves. The results showed that the methodology offers significant potential to predict the cumulative fatigue life of composite structures.

  2. Structural/aerodynamic Blade Analyzer (SAB) User's Guide, Version 1.0

    NASA Technical Reports Server (NTRS)

    Morel, M. R.

    1994-01-01

    The structural/aerodynamic blade (SAB) analyzer provides an automated tool for the static-deflection analysis of turbomachinery blades with aerodynamic and rotational loads. A structural code calculates a deflected blade shape using aerodynamic loads input. An aerodynamic solver computes aerodynamic loads using deflected blade shape input. The two programs are iterated automatically until deflections converge. Currently, SAB version 1.0 is interfaced with MSC/NASTRAN to perform the structural analysis and PROP3D to perform the aerodynamic analysis. This document serves as a guide for the operation of the SAB system with specific emphasis on its use at NASA Lewis Research Center (LeRC). This guide consists of six chapters: an introduction which gives a summary of SAB; SAB's methodology, component files, links, and interfaces; input/output file structure; setup and execution of the SAB files on the Cray computers; hints and tips to advise the user; and an example problem demonstrating the SAB process. In addition, four appendices are presented to define the different computer programs used within the SAB analyzer and describe the required input decks.
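
    The aero/structural iteration that SAB automates can be sketched as a simple fixed-point loop; the solver functions below are hypothetical stand-ins for the MSC/NASTRAN and PROP3D steps and are not part of SAB's actual interface.

```python
# Conceptual sketch of the aero/structural iteration that SAB automates. The callables
# `run_structural_solver` and `run_aero_solver` are hypothetical stand-ins for the
# MSC/NASTRAN and PROP3D steps described above; they are not part of SAB's actual interface.
import numpy as np

def iterate_blade_shape(initial_shape, run_structural_solver, run_aero_solver,
                        tol=1e-4, max_iter=50):
    """Alternate aero-load and deflection calculations until the blade shape converges."""
    shape = np.asarray(initial_shape, dtype=float)
    for it in range(max_iter):
        loads = run_aero_solver(shape)              # aerodynamic loads for current shape
        new_shape = run_structural_solver(loads)    # deflected shape under those loads
        change = np.max(np.abs(new_shape - shape))
        shape = new_shape
        if change < tol:                            # converged: deflections stop changing
            return shape, it + 1
    raise RuntimeError("blade shape did not converge")
```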

  3. Three dimensional, numerical analysis of an elasto hydrodynamic lubrication using fluid structure interaction (FSI) approach

    NASA Astrophysics Data System (ADS)

    Hanoca, P.; Ramakrishna, H. V.

    2018-03-01

    This work develops a methodology to model and simulate thermo-elastohydrodynamic (TEHD) lubrication using the sequential application of CFD and CSD. The FSI analyses are carried out using ANSYS Workbench. In this analysis, the steady-state 3D Navier-Stokes equations are solved along with the energy equation. Liquid properties are introduced in which the viscosity and density are functions of pressure and temperature. The cavitation phenomenon is included in the analysis. Numerical analyses have been carried out at different speeds and surface temperatures. It was found that as speed increases, the hydrodynamic pressures also increase. The pressure profile obtained from the Roelands equation is more sensitive to temperature than that obtained from the Barus equation. The stress distributions identify the critical locations in the bearing structure. The developed method is capable of giving new insight into the physics of elastohydrodynamic lubrication.

  4. The physical vulnerability of elements at risk: a methodology based on fluid and classical mechanics

    NASA Astrophysics Data System (ADS)

    Mazzorana, B.; Fuchs, S.; Levaggi, L.

    2012-04-01

    The impacts of the flood events that occurred in autumn 2011 in the Italian regions of Liguria and Tuscany revived the engagement of public decision makers to enhance, in synergy, flood control and land-use planning. In this context, the design of efficient flood risk mitigation strategies and their subsequent implementation critically relies on a careful vulnerability analysis of both the immobile and mobile elements at risk potentially exposed to flood hazards. Based on notions of fluid and classical mechanics, we developed computation schemes enabling a dynamic vulnerability and risk analysis for a broad typological variety of elements at risk. The methodological skeleton consists of (1) hydrodynamic computation of the time-varying flood intensities, resulting for each element at risk in a succession of loading configurations; (2) modelling the mechanical response of the impacted elements through static, elasto-static and dynamic analyses; (3) characterising the mechanical response through proper structural damage variables; and (4) economic valuation of the expected losses as a function of the quantified damage variables. From a computational perspective, we coupled the description of the hydrodynamic flow behaviour and the induced structural modifications of the exposed elements at risk. Valuation methods suitable to support a correct mapping from the value domains of the physical damage variables to the economic loss values are discussed. In this way we aim to complement, from a methodological perspective, the existing, mainly empirical, vulnerability and risk assessment approaches and to refine the conceptual framework of cost-benefit analysis. Moreover, we aim to support the design of effective flood risk mitigation strategies by diminishing the main criticalities within the systems prone to flood risk.

  5. Full-field modal analysis during base motion excitation using high-speed 3D digital image correlation

    NASA Astrophysics Data System (ADS)

    Molina-Viedma, Ángel J.; López-Alba, Elías; Felipe-Sesé, Luis; Díaz, Francisco A.

    2017-10-01

    In recent years, many efforts have been made to exploit full-field measurement optical techniques for modal identification. Three-dimensional digital image correlation using high-speed cameras has been extensively employed for this purpose. Modal identification algorithms are applied to process the frequency response functions (FRF), which relate the displacement response of the structure to the excitation force. However, one of the most common tests for modal analysis involves the base motion excitation of a structural element instead of force excitation. In this case, the relationship between response and excitation is typically based on displacements, which are known as transmissibility functions. In this study, a methodology for experimental modal analysis using high-speed 3D digital image correlation and base motion excitation tests is proposed. In particular, a cantilever beam was excited from its base with a random signal, using a clamped edge join. Full-field transmissibility functions were obtained through the beam and converted into FRF for proper identification, considering a single degree-of-freedom theoretical conversion. Subsequently, modal identification was performed using a circle-fit approach. The proposed methodology facilitates the management of the typically large amounts of data points involved in the DIC measurement during modal identification. Moreover, it was possible to determine the natural frequencies, damping ratios and full-field mode shapes without requiring any additional tests. Finally, the results were experimentally validated by comparing them with those obtained by employing traditional accelerometers, analytical models and finite element method analyses. The comparison was performed by using the quantitative indicator modal assurance criterion. The results showed a high level of correspondence, consolidating the proposed experimental methodology.
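
    The SDOF relation underlying the transmissibility-to-FRF conversion mentioned above is not reproduced in the abstract; for reference, the standard displacement transmissibility of a single-degree-of-freedom system under harmonic base motion is:

```latex
% Standard displacement transmissibility of a single-degree-of-freedom system under
% harmonic base motion, with frequency ratio r = omega/omega_n and damping ratio zeta.
\[
  |T(\omega)| \;=\; \left|\frac{X(\omega)}{Y(\omega)}\right|
  \;=\;
  \sqrt{\frac{1 + (2\zeta r)^{2}}{\bigl(1 - r^{2}\bigr)^{2} + (2\zeta r)^{2}}},
  \qquad r = \frac{\omega}{\omega_{n}}
\]
```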

  6. A review of whole cell wall NMR by the direct-dissolution of biomass

    DOE PAGES

    Foston, Marcus B.; Samuel, Reichel; He, Jian; ...

    2016-01-19

    To fully realize the potential of lignocellulosic biomass as a renewable resource for the production of fuels, chemicals, and materials, an improved understanding of the chemical and molecular structures within biomass and how those structures are formed during biosynthesis and transformed during (thermochemical and biological) conversion must be developed. This effort will require analytical techniques which are not only in-depth, rapid, and cost-effective, but also leave native cell wall features intact. Whole plant cell wall nuclear magnetic resonance (NMR) analysis facilitates unparalleled structural characterization of lignocellulosic biomass without causing (or with minimal) structural modification. The objective of this review is to summarize research pertaining to solution- or gel-state whole plant cell wall NMR analysis of biomass, demonstrating the capability of NMR to delineate the structural features and transformations of biomass. In particular, this review will focus on the application of a two-dimensional solution-state NMR technique and perdeuterated ionic liquid based organic electrolyte solvents for the direct dissolution and analysis of biomass. Furthermore, we believe this type of analysis will be critical to advancing biofuel research, improving bioprocessing methodology, and enhancing plant bioengineering efforts.

  8. A new formulation for air-blast fluid-structure interaction using an immersed approach. Part I: basic methodology and FEM-based simulations

    NASA Astrophysics Data System (ADS)

    Bazilevs, Y.; Kamran, K.; Moutsanidis, G.; Benson, D. J.; Oñate, E.

    2017-07-01

    In this two-part paper we begin the development of a new class of methods for modeling fluid-structure interaction (FSI) phenomena for air blast. We aim to develop accurate, robust, and practical computational methodology, which is capable of modeling the dynamics of air blast coupled with the structure response, where the latter involves large, inelastic deformations and disintegration into fragments. An immersed approach is adopted, which leads to an a-priori monolithic FSI formulation with intrinsic contact detection between solid objects, and without formal restrictions on the solid motions. In Part I of this paper, the core air-blast FSI methodology suitable for a variety of discretizations is presented and tested using standard finite elements. Part II of this paper focuses on a particular instantiation of the proposed framework, which couples isogeometric analysis (IGA) based on non-uniform rational B-splines and a reproducing-kernel particle method (RKPM), which is a Meshfree technique. The combination of IGA and RKPM is felt to be particularly attractive for the problem class of interest due to the higher-order accuracy and smoothness of both discretizations, and relative simplicity of RKPM in handling fragmentation scenarios. A collection of mostly 2D numerical examples is presented in each of the parts to illustrate the good performance of the proposed air-blast FSI framework.

  9. Probabilistic Sizing and Verification of Space Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit

    2012-07-01

    Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the preexisting flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: Silex telescope structure, Seviri primary mirror, Herschel telescope, Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study: “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, technical state-of-the-art from international experts, and Astrium experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
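
    For reference, a commonly used weakest-link (two-parameter Weibull) expression for the failure probability of a brittle ceramic part is given below; it is a simplified uniaxial form and not necessarily the exact formulation adopted in the ESA/Astrium methodology.

```latex
% Weakest-link (two-parameter Weibull) form commonly used for brittle ceramic parts:
% m is the Weibull modulus, sigma_0 the characteristic strength referred to volume V_0,
% and the integral runs over the stressed volume of the part (tensile regions only in
% this simplified uniaxial form).
\[
  P_{f} \;=\; 1 - \exp\!\left[-\frac{1}{V_{0}}\int_{V}
      \left(\frac{\sigma(\mathbf{x})}{\sigma_{0}}\right)^{m} \mathrm{d}V\right]
\]
```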

  10. Deterministic Multiaxial Creep and Creep Rupture Enhancements for CARES/Creep Integrated Design Code

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama M.

    1998-01-01

    High temperature and long duration applications of monolithic ceramics can place their failure mode in the creep rupture regime. A previous model advanced by the authors described a methodology by which the creep rupture life of a loaded component can be predicted. That model was based on the life fraction damage accumulation rule in association with the modified Monkman-Grant creep rupture criterion. However, that model did not take into account the deteriorating state of the material due to creep damage (e.g., cavitation) as time elapsed. In addition, the material creep parameters used in that life prediction methodology were based on uniaxial creep curves displaying primary and secondary creep behavior, with no tertiary regime. The objective of this paper is to present a creep life prediction methodology based on a modified form of the Kachanov-Rabotnov continuum damage mechanics (CDM) theory. In this theory, the uniaxial creep rate is described in terms of stress, temperature, time, and the current state of material damage. This scalar damage state parameter is basically an abstract measure of the current state of material damage due to creep deformation. The damage rate is assumed to vary with stress, temperature, time, and the current state of damage itself. Multiaxial creep and creep rupture formulations of the CDM approach are presented in this paper. Parameter estimation methodologies based on nonlinear regression analysis are also described for both isothermal constant stress states and anisothermal variable stress conditions. This creep life prediction methodology was preliminarily added to the integrated design code CARES/Creep (Ceramics Analysis and Reliability Evaluation of Structures/Creep), which is a postprocessor program to commercially available finite element analysis (FEA) packages. Two examples, showing comparisons between experimental and predicted creep lives of ceramic specimens, are used to demonstrate the viability of this methodology and the CARES/Creep program.
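
    For reference, a commonly cited uniaxial form of the Kachanov-Rabotnov equations is given below; the exact formulation implemented in CARES/Creep may differ.

```latex
% A commonly cited uniaxial Kachanov-Rabotnov form: the creep strain rate and the damage
% rate both grow as the damage variable omega (0 = undamaged, 1 = rupture) accumulates.
% A, n, B, chi and phi are temperature-dependent material parameters.
\[
  \dot{\varepsilon} \;=\; \frac{A\,\sigma^{\,n}}{(1-\omega)^{n}},
  \qquad
  \dot{\omega} \;=\; \frac{B\,\sigma^{\,\chi}}{(1-\omega)^{\phi}},
  \qquad 0 \le \omega < 1
\]
```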

  11. Methodological Reflections on the Contribution of Qualitative Research to the Evaluation of Clinical Ethics Support Services.

    PubMed

    Wäscher, Sebastian; Salloch, Sabine; Ritter, Peter; Vollmann, Jochen; Schildmann, Jan

    2017-05-01

    This article describes a process of developing, implementing and evaluating a clinical ethics support service (CESS) intervention with the goal of building up a context-sensitive structure of minimal clinical ethics support in an oncology department with no prior clinical ethics structure. Scholars from different disciplines have called for an improvement in the evaluation of clinical ethics support services for different reasons over several decades. However, while a lot has been said about the concepts and methodological challenges of evaluating CESS up to the present time, relatively few empirical studies have been carried out. The aim of this article is twofold. On the one hand, it describes a process of developing, modifying and evaluating a CESS intervention as part of the ETHICO research project, using the approach of qualitative-formative evaluation. On the other hand, it provides a methodological analysis which specifies the contribution of qualitative empirical methods to the (formative) evaluation of CESS. We conclude with a consideration of the strengths and limitations of qualitative evaluation research with regard to the evaluation and development of context-sensitive CESS. We further discuss our own approach in contrast to more traditional consult or committee models. © 2017 John Wiley & Sons Ltd.

  12. Multiple templates-based homology modeling enhances structure quality of AT1 receptor: validation by molecular dynamics and antagonist docking.

    PubMed

    Sokkar, Pandian; Mohandass, Shylajanaciyar; Ramachandran, Murugesan

    2011-07-01

    We present a comparative account on 3D-structures of human type-1 receptor (AT1) for angiotensin II (AngII), modeled using three different methodologies. AngII activates a wide spectrum of signaling responses via the AT1 receptor that mediates physiological control of blood pressure and diverse pathological actions in cardiovascular, renal, and other cell types. Availability of 3D-model of AT1 receptor would significantly enhance the development of new drugs for cardiovascular diseases. However, templates of AT1 receptor with low sequence similarity increase the complexity in straightforward homology modeling, and hence there is a need to evaluate different modeling methodologies in order to use the models for sensitive applications such as rational drug design. Three models were generated for AT1 receptor by, (1) homology modeling with bovine rhodopsin as template, (2) homology modeling with multiple templates and (3) threading using I-TASSER web server. Molecular dynamics (MD) simulation (15 ns) of models in explicit membrane-water system, Ramachandran plot analysis and molecular docking with antagonists led to the conclusion that multiple template-based homology modeling outweighs other methodologies for AT1 modeling.

  13. Guiding principles of USGS methodology for assessment of undiscovered conventional oil and gas resources

    USGS Publications Warehouse

    Charpentier, R.R.; Klett, T.R.

    2005-01-01

    During the last 30 years, the methodology for assessment of undiscovered conventional oil and gas resources used by the U.S. Geological Survey has undergone considerable change. This evolution has been based on five major principles. First, the U.S. Geological Survey has responsibility for a wide range of U.S. and world assessments and requires a robust methodology suitable for immaturely explored as well as maturely explored areas. Second, the assessments should be based on as comprehensive a set of geological and exploration history data as possible. Third, the perils of approaches that rely solely on statistical methods without geological analysis are recognized. Fourth, the methodology and course of the assessment should be documented as transparently as possible, within the limits imposed by the inevitable use of subjective judgement. Fifth, the multiple uses of the assessments require a continuing effort to provide the documentation in such ways as to increase utility to the many types of users. Undiscovered conventional oil and gas resources are those recoverable volumes in undiscovered, discrete, conventional structural or stratigraphic traps. The USGS 2000 methodology for these resources is based on a framework of assessing numbers and sizes of undiscovered oil and gas accumulations and the associated risks. The input is standardized on a form termed the Seventh Approximation Data Form for Conventional Assessment Units. Volumes of resource are then calculated using a Monte Carlo program named Emc2, but an alternative analytic (non-Monte Carlo) program named ASSESS also can be used. The resource assessment methodology continues to change. Accumulation-size distributions are being examined to determine how sensitive the results are to size-distribution assumptions. The resource assessment output is changing to provide better applicability for economic analysis. The separate methodology for assessing continuous (unconventional) resources also has been evolving. Further studies of the relationship between geologic models of conventional and continuous resources will likely impact the respective resource assessment methodologies. © 2005 International Association for Mathematical Geology.
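
    The number-and-size framework described above can be illustrated with a small Monte Carlo aggregation; this is not the USGS Emc2 program, and the count and size distributions used here are placeholder assumptions.

```python
# Illustrative Monte Carlo aggregation in the spirit of the "numbers and sizes of undiscovered
# accumulations" framework described above. This is NOT the USGS Emc2 program; the triangular
# count distribution and lognormal size distribution below are placeholder assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_trials = 100_000

# Number of undiscovered accumulations per trial (placeholder triangular distribution).
counts = rng.triangular(left=1, mode=5, right=20, size=n_trials).astype(int)

# Accumulation sizes in million barrels of oil (placeholder lognormal distribution), summed per trial.
totals = np.array([rng.lognormal(mean=2.0, sigma=1.0, size=c).sum() for c in counts])

# Report the fractiles conventionally quoted in resource assessments
# (F95 = value exceeded with 95% probability, i.e. the 5th percentile).
f95, f50, f5 = np.percentile(totals, [5, 50, 95])
print(f"F95 = {f95:.0f}, F50 = {f50:.0f}, F5 = {f5:.0f}, mean = {totals.mean():.0f} MMBO")
```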

  14. Statistical Analysis of Big Data on Pharmacogenomics

    PubMed Central

    Fan, Jianqing; Liu, Han

    2013-01-01

    This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrices for understanding correlation structure, inverse covariance matrices for network modeling, large-scale simultaneous tests for selecting significantly differentially expressed genes, proteins, and genetic markers for complex diseases, and high-dimensional variable selection for identifying important molecules and understanding molecular mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
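
    One standard route to the sparse inverse covariance (network) estimation mentioned above is the graphical lasso; the sketch below uses scikit-learn on simulated data and is an illustrative choice rather than one of the specific estimators reviewed in the paper.

```python
# One standard route to the "inverse covariance matrix for network modeling" mentioned above:
# the graphical lasso, which estimates a sparse precision matrix. This is an illustrative
# choice, not necessarily one of the estimators reviewed in the paper; the data are simulated.
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(1)
n_samples, n_genes = 200, 30
X = rng.normal(size=(n_samples, n_genes))        # placeholder expression matrix

model = GraphicalLassoCV().fit(X)
precision = model.precision_                     # sparse inverse covariance estimate

# Non-zero off-diagonal entries define edges of the estimated gene network.
edges = np.argwhere(np.triu(np.abs(precision) > 1e-6, k=1))
print(f"estimated network has {len(edges)} edges among {n_genes} genes")
```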

  15. Identification of walking human model using agent-based modelling

    NASA Astrophysics Data System (ADS)

    Shahabpoor, Erfan; Pavic, Aleksandar; Racic, Vitomir

    2018-03-01

    The interaction of walking people with large vibrating structures, such as footbridges and floors, in the vertical direction is an important yet challenging phenomenon to describe mathematically. Several different models have been proposed in the literature to simulate the interaction of stationary people with vibrating structures. However, research on moving (walking) human models, explicitly identified for vibration serviceability assessment of civil structures, is still sparse. In this study, the results of a comprehensive set of FRF-based modal tests were used, in which over a hundred test subjects walked in different group sizes and walking patterns on a test structure. An agent-based model was used to simulate discrete traffic-structure interactions. The occupied-structure modal parameters found in the tests were used to identify the parameters of the walking individual's single-degree-of-freedom (SDOF) mass-spring-damper model using a 'reverse engineering' methodology. The analysis of the results suggested that a normal distribution with a mean of μ = 2.85 Hz and a standard deviation of σ = 0.34 Hz can describe the natural frequency of the human SDOF model. Similarly, a normal distribution with μ = 0.295 and σ = 0.047 can describe the human model damping ratio. Compared to previous studies, the agent-based modelling methodology proposed in this paper offers significant flexibility in simulating multi-pedestrian walking traffic, external forces, and different mechanisms of human-structure and human-environment interaction at the same time.
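
    A short sketch of drawing pedestrian SDOF parameters from the distributions reported above is given below; the abstract does not quote a modal mass, so the mass is a user-supplied assumption.

```python
# Sketch of drawing walking-human SDOF parameters from the distributions reported above
# (natural frequency ~ N(2.85 Hz, 0.34 Hz), damping ratio ~ N(0.295, 0.047)). The abstract
# does not quote a modal mass, so `mass_kg` is a user-supplied assumption here.
import numpy as np

def sample_walking_sdof(n_people, mass_kg=70.0, seed=None):
    """Return stiffness k [N/m] and damping c [Ns/m] for n_people SDOF pedestrian models."""
    rng = np.random.default_rng(seed)
    f_n = rng.normal(2.85, 0.34, n_people)          # natural frequency [Hz]
    zeta = rng.normal(0.295, 0.047, n_people)       # damping ratio [-]
    omega_n = 2.0 * np.pi * f_n                     # [rad/s]
    k = mass_kg * omega_n**2                        # stiffness from m and omega_n
    c = 2.0 * zeta * mass_kg * omega_n              # damping from zeta, m and omega_n
    return k, c

k, c = sample_walking_sdof(5, seed=0)
print(np.round(k, 1), np.round(c, 1))
```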

  16. Direct Bio-printing with Heterogeneous Topology Design.

    PubMed

    Ahsan, Amm Nazmul; Xie, Ruinan; Khoda, Bashir

    2017-01-01

    Bio-additive manufacturing is a promising tool for fabricating porous scaffold structures to expedite the tissue regeneration process. Unlike most traditional bulk-material objects, the microstructures of tissues and organs are mostly highly anisotropic, heterogeneous, and porous in nature. However, modelling the internal heterogeneity of tissue/organ structures in a traditional CAD environment is difficult and oftentimes inaccurate. Besides, the de facto STL conversion of bio-models introduces loss of information and piles up more errors in each subsequent step (build orientation, slicing, tool-path planning) of the bio-printing process plan. We propose a topology-based scaffold design methodology to accurately represent the heterogeneous internal architecture of tissues/organs. An image analysis technique is used that digitizes the topology information contained in medical images of tissues/organs. A weighted topology reconstruction algorithm is implemented to represent the heterogeneity with parametric functions. The parametric functions are then used to map the spatial material distribution. The generated information is directly transferred to the 3D bio-printer, and the heterogeneous porous tissue scaffold structure is manufactured without an STL file. The proposed methodology is implemented to verify the effectiveness of the approach, and the designed example structure is bio-fabricated with a deposition-based bio-additive manufacturing system.

  17. A new approach to flood loss estimation and vulnerability assessment for historic buildings in England

    NASA Astrophysics Data System (ADS)

    Stephenson, V.; D'Ayala, D.

    2013-10-01

    The recent increase in frequency and severity of flooding in the UK has led to a shift in the perception of risk associated with flood hazards. This has extended to the conservation community, and the risks posed to historic structures that suffer from flooding are particularly concerning for those charged with preserving and maintaining such buildings. In order to fully appraise the risks in a manner appropriate to the complex issue of preservation, a new methodology is proposed that studies the nature of vulnerability of such structures, and places it in the context of risk assessment, accounting for the vulnerable object and the subsequent exposure of that object to flood hazards. The testing of the methodology is carried out using three urban case studies and the results of the survey analysis provide key findings and guidance on the development of fragility curves for historic structures exposed to flooding. This occurs through appraisal of key vulnerability indicators related to building form, structural and fabric integrity, and preservation of architectural and archaeological values. This in turn facilitates the production of strategies for mitigating and managing the losses threatened by such extreme climate events.

  18. Utilizing a structural meta-ontology for family-based quality assurance of the BioPortal ontologies.

    PubMed

    Ochs, Christopher; He, Zhe; Zheng, Ling; Geller, James; Perl, Yehoshua; Hripcsak, George; Musen, Mark A

    2016-06-01

    An Abstraction Network is a compact summary of an ontology's structure and content. In previous research, we showed that Abstraction Networks support quality assurance (QA) of biomedical ontologies. The development of an Abstraction Network and its associated QA methodologies, however, is a labor-intensive process that previously was applicable only to one ontology at a time. To improve the efficiency of the Abstraction-Network-based QA methodology, we introduced a QA framework that uses uniform Abstraction Network derivation techniques and QA methodologies that are applicable to whole families of structurally similar ontologies. For the family-based framework to be successful, it is necessary to develop a method for classifying ontologies into structurally similar families. We now describe a structural meta-ontology that classifies ontologies according to certain structural features that are commonly used in the modeling of ontologies (e.g., object properties) and that are important for Abstraction Network derivation. Each class of the structural meta-ontology represents a family of ontologies with identical structural features, indicating which types of Abstraction Networks and QA methodologies are potentially applicable to all of the ontologies in the family. We derive a collection of 81 families, corresponding to classes of the structural meta-ontology, that enable a flexible, streamlined family-based QA methodology, offering multiple choices for classifying an ontology. The structure of 373 ontologies from the NCBO BioPortal is analyzed and each ontology is classified into multiple families modeled by the structural meta-ontology. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Modeling and replicating statistical topology and evidence for CMB nonhomogeneity

    PubMed Central

    Agami, Sarit

    2017-01-01

    Under the banner of “big data,” the detection and classification of structure in extremely large, high-dimensional, data sets are two of the central statistical challenges of our times. Among the most intriguing new approaches to this challenge is “TDA,” or “topological data analysis,” one of the primary aims of which is providing nonmetric, but topologically informative, preanalyses of data which make later, more quantitative, analyses feasible. While TDA rests on strong mathematical foundations from topology, in applications, it has faced challenges due to difficulties in handling issues of statistical reliability and robustness, often leading to an inability to make scientific claims with verifiable levels of statistical confidence. We propose a methodology for the parametric representation, estimation, and replication of persistence diagrams, the main diagnostic tool of TDA. The power of the methodology lies in the fact that even if only one persistence diagram is available for analysis—the typical case for big data applications—the replications permit conventional statistical hypothesis testing. The methodology is conceptually simple and computationally practical, and provides a broadly effective statistical framework for persistence diagram TDA analysis. We demonstrate the basic ideas on a toy example, and the power of the parametric approach to TDA modeling in an analysis of cosmic microwave background (CMB) nonhomogeneity. PMID:29078301

  20. Finite element methodology for integrated flow-thermal-structural analysis

    NASA Technical Reports Server (NTRS)

    Thornton, Earl A.; Ramakrishnan, R.; Vemaganti, G. R.

    1988-01-01

    Papers entitled, An Adaptive Finite Element Procedure for Compressible Flows and Strong Viscous-Inviscid Interactions, and An Adaptive Remeshing Method for Finite Element Thermal Analysis, were presented at the June 27 to 29, 1988, meeting of the AIAA Thermophysics, Plasma Dynamics and Lasers Conference, San Antonio, Texas. The papers describe research work supported under NASA/Langley Research Grant NsG-1321, and are submitted in fulfillment of the progress report requirement on the grant for the period ending February 29, 1988.
