Sample records for analysis results examples

  1. Development of Benchmark Examples for Quasi-Static Delamination Propagation and Fatigue Growth Predictions

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald

    2012-01-01

    The development of benchmark examples for quasi-static delamination propagation and cyclic delamination onset and growth prediction is presented and demonstrated for Abaqus/Standard. The example is based on a finite element model of a Double-Cantilever Beam specimen. The example is independent of the analysis software used and allows the assessment of the automated delamination propagation, onset and growth prediction capabilities in commercial finite element codes based on the virtual crack closure technique (VCCT). First, a quasi-static benchmark example was created for the specimen. Second, based on the static results, benchmark examples for cyclic delamination growth were created. Third, the load-displacement relationship from a propagation analysis and the benchmark results were compared, and good agreement could be achieved by selecting the appropriate input parameters. Fourth, starting from an initially straight front, the delamination was allowed to grow under cyclic loading. The number of cycles to delamination onset and the number of cycles during delamination growth for each growth increment were obtained from the automated analysis and compared to the benchmark examples. Again, good agreement between the results obtained from the growth analysis and the benchmark results could be achieved by selecting the appropriate input parameters. The benchmarking procedure proved valuable by highlighting the issues associated with choosing the input parameters of the particular implementation. Selecting the appropriate input parameters, however, was not straightforward and often required an iterative procedure. Overall the results are encouraging, but further assessment for mixed-mode delamination is required.
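
    The benchmark construction outlined above lends itself to a compact worked sketch. The Python fragment below shows, under stated assumptions only (simple beam theory for the DCB, a fixed applied opening displacement, and a Paris-type growth law with illustrative constants; none of these values come from the report), how a cycles-versus-delamination-length benchmark curve of this kind can be generated:

```python
# Hedged sketch: constructing a cyclic delamination-growth benchmark curve
# for a DCB specimen. All material values and law constants are illustrative.
import numpy as np

E1, b, h = 139.4e9, 25.4e-3, 2.25e-3   # axial modulus (Pa), width (m), arm thickness (m)
delta = 2.0e-3                         # constant applied opening displacement (m)
G_Ic = 170.0                           # mode I fracture toughness (J/m^2), illustrative
c, m = 1.0e-4, 6.36                    # Paris-type constants: da/dN = c*(G_Imax/G_Ic)^m

def G_Imax(a):
    """Mode I energy release rate of a DCB under fixed displacement (beam theory)."""
    I = b * h**3 / 12.0                 # second moment of area of one arm
    C = 2.0 * a**3 / (3.0 * E1 * I)     # specimen compliance
    P = delta / C                       # load at the applied peak displacement
    return 12.0 * P**2 * a**2 / (E1 * b**2 * h**3)

# Integrate cycles over small growth increments to build the benchmark N(a) curve
a = np.linspace(50.8e-3, 100e-3, 301)               # delamination length (m)
dN = np.diff(a) / (c * (G_Imax(a[:-1]) / G_Ic)**m)  # cycles per growth increment
N = np.concatenate(([0.0], np.cumsum(dN)))
print(f"cycles to grow from 50.8 mm to 100 mm: {N[-1]:.3g}")
```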

  2. Development of Benchmark Examples for Static Delamination Propagation and Fatigue Growth Predictions

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald

    2011-01-01

    The development of benchmark examples for static delamination propagation and cyclic delamination onset and growth prediction is presented and demonstrated for a commercial code. The example is based on a finite element model of an End-Notched Flexure (ENF) specimen. The example is independent of the analysis software used and allows the assessment of the automated delamination propagation, onset and growth prediction capabilities in commercial finite element codes based on the virtual crack closure technique (VCCT). First, static benchmark examples were created for the specimen. Second, based on the static results, benchmark examples for cyclic delamination growth were created. Third, the load-displacement relationship from a propagation analysis and the benchmark results were compared, and good agreement could be achieved by selecting the appropriate input parameters. Fourth, starting from an initially straight front, the delamination was allowed to grow under cyclic loading. The number of cycles to delamination onset and the number of cycles during stable delamination growth for each growth increment were obtained from the automated analysis and compared to the benchmark examples. Again, good agreement between the results obtained from the growth analysis and the benchmark results could be achieved by selecting the appropriate input parameters. The benchmarking procedure proved valuable by highlighting the issues associated with the input parameters of the particular implementation. Selecting the appropriate input parameters, however, was not straightforward and often required an iterative procedure. Overall, the results are encouraging but further assessment for mixed-mode delamination is required.

  3. Development and Application of Benchmark Examples for Mode II Static Delamination Propagation and Fatigue Growth Predictions

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald

    2011-01-01

    The development of benchmark examples for static delamination propagation and cyclic delamination onset and growth prediction is presented and demonstrated for a commercial code. The example is based on a finite element model of an End-Notched Flexure (ENF) specimen. The example is independent of the analysis software used and allows the assessment of the automated delamination propagation, onset and growth prediction capabilities in commercial finite element codes based on the virtual crack closure technique (VCCT). First, static benchmark examples were created for the specimen. Second, based on the static results, benchmark examples for cyclic delamination growth were created. Third, the load-displacement relationship from a propagation analysis and the benchmark results were compared, and good agreement could be achieved by selecting the appropriate input parameters. Fourth, starting from an initially straight front, the delamination was allowed to grow under cyclic loading. The number of cycles to delamination onset and the number of cycles during delamination growth for each growth increment were obtained from the automated analysis and compared to the benchmark examples. Again, good agreement between the results obtained from the growth analysis and the benchmark results could be achieved by selecting the appropriate input parameters. The benchmarking procedure proved valuable by highlighting the issues associated with choosing the input parameters of the particular implementation. Selecting the appropriate input parameters, however, was not straightforward and often required an iterative procedure. Overall the results are encouraging, but further assessment for mixed-mode delamination is required.

  4. DataToText: A Consumer-Oriented Approach to Data Analysis

    ERIC Educational Resources Information Center

    Kenny, David A.

    2010-01-01

    DataToText is a project in which the user communicates the relevant information for an analysis and a DataToText computer routine produces text output that describes, in words, tables, and figures, the results of the analyses. Two extended examples are given, one an example of a moderator analysis and the other an example of a dyadic data…

  5. Regression Analysis by Example. 5th Edition

    ERIC Educational Resources Information Center

    Chatterjee, Samprit; Hadi, Ali S.

    2012-01-01

    Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. "Regression Analysis by Example, Fifth Edition" has been expanded and thoroughly…

  6. Task 2 Report: Algorithm Development and Performance Analysis

    DTIC Science & Technology

    1993-07-01

    [Snippet assembled from the report's list of figures] … separated peaks. 7-16: Example ILGC data for schedule 3 phosphites showing an analysis method which integrates … more closely follows the baseline. 7-18: Example R.GC data for schedule 3 phosphites showing an analysis method resulting in unwanted … much of the ambiguity that can arise in GC/MS with trace environmental samples, for example. Correlated chromatography, on the other hand, separates the …

  7. USEPA EXAMPLE EXIT LEVEL ANALYSIS RESULTS

    EPA Science Inventory

    Developed by NERL/ERD for the Office of Solid Waste, the enclosed product provides an example uncertainty analysis (UA) and initial process-based sensitivity analysis (SA) of hazardous waste "exit" concentrations for 7 chemicals and metals using the 3MRA Version 1.0 Modeling Syst...

  8. Isolating the Effects of Training Using Simple Regression Analysis: An Example of the Procedure.

    ERIC Educational Resources Information Center

    Waugh, C. Keith

    This paper provides a case example of simple regression analysis, a forecasting procedure used to isolate the effects of training from an identified extraneous variable. This case example focuses on results of a three-day sales training program to improve bank loan officers' knowledge, skill-level, and attitude regarding solicitation and sale of…

  9. To what degree does the missing-data technique influence the estimated growth in learning strategies over time? A tutorial example of sensitivity analysis for longitudinal data.

    PubMed

    Coertjens, Liesje; Donche, Vincent; De Maeyer, Sven; Vanthournout, Gert; Van Petegem, Peter

    2017-01-01

    Longitudinal data is almost always burdened with missing data. However, in educational and psychological research, there is a large discrepancy between methodological suggestions and research practice. The former suggests applying sensitivity analysis in order to assess the robustness of the results under varying assumptions regarding the mechanism generating the missing data. However, in research practice, participants with missing data are usually discarded by relying on listwise deletion. To help bridge the gap between methodological recommendations and applied research in the educational and psychological domain, this study provides a tutorial example of sensitivity analysis for latent growth analysis. The example data concern students' changes in learning strategies during higher education. One cohort of students in a Belgian university college was asked to complete the Inventory of Learning Styles-Short Version, in three measurement waves. A substantial number of students did not participate on each occasion. Change over time in student learning strategies was assessed using eight missing data techniques, which assume different mechanisms for missingness. The results indicated that, for some learning strategy subscales, growth estimates differed between the models. Guidelines in terms of reporting the results from sensitivity analysis are synthesised and applied to the results from the tutorial example.
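
    As a hedged, synthetic illustration of the tutorial's point (the numbers below are simulated, not the study's data): when dropout depends on the wave-1 score and growth is related to that score, listwise deletion biases the estimated change, which is exactly the kind of discrepancy a sensitivity analysis across missing-data techniques is meant to surface:

```python
# Hedged sketch: effect of listwise deletion under a MAR dropout mechanism.
# Synthetic three-wave data; not the Inventory of Learning Styles data.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
wave1 = rng.normal(3.0, 0.5, n)
growth = 0.10 + 0.2 * (3.0 - wave1)            # lower starters grow more
wave2 = wave1 + growth + rng.normal(0, 0.2, n)
wave3 = wave2 + growth + rng.normal(0, 0.2, n)

# Dropout is more likely for students with low wave-1 scores (MAR)
p_drop = 1 / (1 + np.exp(4.0 * (wave1 - 2.8)))
complete = rng.random(n) > p_drop

full = (wave3 - wave1).mean()                  # estimate using everyone
listwise = (wave3[complete] - wave1[complete]).mean()
print(f"full data: {full:.3f}, listwise deletion: {listwise:.3f}")
```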

  10. Development and Application of Benchmark Examples for Mixed-Mode I/II Quasi-Static Delamination Propagation Predictions

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald

    2012-01-01

    The development of benchmark examples for quasi-static delamination propagation prediction is presented and demonstrated for a commercial code. The examples are based on finite element models of the Mixed-Mode Bending (MMB) specimen. The examples are independent of the analysis software used and allow the assessment of the automated delamination propagation prediction capability in commercial finite element codes based on the virtual crack closure technique (VCCT). First, quasi-static benchmark examples were created for the specimen. Second, starting from an initially straight front, the delamination was allowed to propagate under quasi-static loading. Third, the load-displacement relationship from a propagation analysis and the benchmark results were compared, and good agreement could be achieved by selecting the appropriate input parameters. Good agreement between the results obtained from the automated propagation analysis and the benchmark results could be achieved by selecting input parameters that had previously been determined during analyses of mode I Double Cantilever Beam and mode II End Notched Flexure specimens. The benchmarking procedure proved valuable by highlighting the issues associated with choosing the input parameters of the particular implementation. Overall the results are encouraging, but further assessment for mixed-mode delamination fatigue onset and growth is required.
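
    For readers unfamiliar with how mixed-mode results are reduced to a propagation criterion, the relation below, the Benzeggagh-Kenane (B-K) criterion commonly paired with VCCT analyses, illustrates the form involved; the report's own fit and constants are not reproduced here:

```latex
% B-K mixed-mode criterion (illustrative; \eta, G_{Ic}, G_{IIc} are material fits)
G_c = G_{Ic} + \left(G_{IIc} - G_{Ic}\right)
      \left(\frac{G_{II}}{G_I + G_{II}}\right)^{\eta},
\qquad \text{propagation when } G_I + G_{II} \ge G_c .
```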

  11. Development of a Benchmark Example for Delamination Fatigue Growth Prediction

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald

    2010-01-01

    The development of a benchmark example for cyclic delamination growth prediction is presented and demonstrated for a commercial code. The example is based on a finite element model of a Double Cantilever Beam (DCB) specimen, which is independent of the analysis software used and allows the assessment of the delamination growth prediction capabilities in commercial finite element codes. First, the benchmark result was created for the specimen. Second, starting from an initially straight front, the delamination was allowed to grow under cyclic loading in a finite element model of a commercial code. The number of cycles to delamination onset and the number of cycles during stable delamination growth for each growth increment were obtained from the analysis. In general, good agreement between the results obtained from the growth analysis and the benchmark results could be achieved by selecting the appropriate input parameters. Overall, the results are encouraging but further assessment for mixed-mode delamination is required.

  12. A study of concept-based similarity approaches for recommending program examples

    NASA Astrophysics Data System (ADS)

    Hosseini, Roya; Brusilovsky, Peter

    2017-07-01

    This paper investigates a range of concept-based example recommendation approaches that we developed to provide example-based problem-solving support in the domain of programming. The goal of these approaches is to offer students a set of the most relevant remedial examples when they have trouble solving a code comprehension problem where students examine a program code to determine its output or the final value of a variable. In this paper, we use the ideas of semantic-level similarity-based linking developed in the area of intelligent hypertext to generate examples for the given problem. To determine the best-performing approach, we explored two groups of similarity approaches for selecting examples: non-structural approaches focusing on examples that are similar to the problem in terms of concept coverage and structural approaches focusing on examples that are similar to the problem by the structure of the content. We also explored the value of personalized example recommendation based on the student's knowledge levels and the learning goal of the exercise. The paper presents the concept-based similarity approaches that we developed, explains the data collection studies and reports the results of a comparative analysis. The results of our analysis showed better ranking performance for the personalized structural variant of the cosine similarity approach.
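
    A minimal sketch of the core computation, concept-vector cosine similarity with a simple knowledge-based weighting, is given below; the concept set, weighting rule, and numbers are invented for illustration and are not the paper's exact method:

```python
# Hedged sketch: ranking candidate examples by concept-vector cosine similarity,
# with a toy personalization that up-weights concepts the student knows poorly.
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# concepts = [loops, arrays, conditionals, recursion] -- hypothetical
problem = np.array([1, 1, 0, 0], dtype=float)     # concepts in the target problem
examples = {
    "ex1": np.array([1, 1, 1, 0], dtype=float),
    "ex2": np.array([0, 0, 1, 1], dtype=float),
}
knowledge = np.array([0.9, 0.2, 0.8, 0.1])        # student's mastery per concept

weights = 1.0 - knowledge                          # emphasize weak concepts
ranked = sorted(examples,
                key=lambda k: cosine(weights * problem, weights * examples[k]),
                reverse=True)
print(ranked)   # ex1 ranks first: it covers the problem's concepts
```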

  13. Twenty Years On!: Updating the IEA BESTEST Building Thermal Fabric Test Cases for ASHRAE Standard 140

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.

    2013-07-01

    ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs applies the IEA BESTEST building thermal fabric test cases and example simulation results originally published in 1995. These software accuracy test cases and their example simulation results, which comprise the first test suite adapted for the initial 2001 version of Standard 140, are approaching their 20th anniversary. In response to the evolution of the state of the art in building thermal fabric modeling since the test cases and example simulation results were developed, work is commencing to update the normative test specification and the informative example results.

  14. Twenty Years On!: Updating the IEA BESTEST Building Thermal Fabric Test Cases for ASHRAE Standard 140: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.

    2013-07-01

    ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs applies the IEA BESTEST building thermal fabric test cases and example simulation results originally published in 1995. These software accuracy test cases and their example simulation results, which comprise the first test suite adapted for the initial 2001 version of Standard 140, are approaching their 20th anniversary. In response to the evolution of the state of the art in building thermal fabric modeling since the test cases and example simulation results were developed, work is commencing to update the normative test specification and the informative example results.

  15. Forensic Analysis of Digital Image Tampering

    DTIC Science & Technology

    2004-12-01

    [Snippet assembled from the thesis's figure list] … analysis of when each method fails, which Chapter 4 discusses. Finally, a test image containing an invisible watermark using LSB steganography is … Figure 2.2: Example of invisible watermark using steganography software F5. Figure 2.3: Example of copy-move image forgery [12] … Figure 3.11: Algorithm for JPEG block technique. Figure 3.12: "Forged" image with result …
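
    The snippet mentions an invisible watermark embedded with LSB steganography; as a hedged aside (illustrative only, not the thesis's code, and real steganalysis is more involved), least-significant-bit embedding and extraction reduce to two bitwise operations:

```python
# Hedged sketch: LSB (least-significant-bit) embedding and extraction.
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)   # toy grayscale image
bits = rng.integers(0, 2, size=image.size, dtype=np.uint8)  # watermark payload

stego = (image & 0xFE) | bits.reshape(image.shape)  # overwrite each pixel's LSB
recovered = stego & 1                               # extraction just reads LSBs
assert np.array_equal(recovered.ravel(), bits)

# Embedding a random payload pushes the LSB distribution toward 50/50,
# a regularity that statistical steganalysis can exploit.
print(np.mean(stego & 1), np.mean(image & 1))
```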

  16. Additional Guidance for Evaluating and Calculating Degradation Kinetics in Environmental Media

    EPA Pesticide Factsheets

    EFED compiled examples in which results from PestDF (version 0.8.4), the tool most commonly used by USEPA to conduct kinetic analysis following the NAFTA guidance, required additional interpretation. Here are some of these examples.
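
    For context, the simplest kinetic model evaluated under the NAFTA guidance is single first-order (SFO) decay; a hedged sketch of fitting it follows (synthetic data; PestDF itself is a separate tool and is not reproduced here):

```python
# Hedged sketch: single first-order (SFO) degradation fit and DT50.
import numpy as np
from scipy.optimize import curve_fit

def sfo(t, c0, k):
    """C(t) = C0 * exp(-k t), the SFO model."""
    return c0 * np.exp(-k * t)

t = np.array([0, 7, 14, 30, 60, 100], dtype=float)       # days
conc = np.array([100, 81, 64, 42, 18, 7], dtype=float)   # % applied, synthetic

(c0, k), _ = curve_fit(sfo, t, conc, p0=(100.0, 0.02))
print(f"k = {k:.4f} / day, DT50 = {np.log(2) / k:.1f} days")
```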

  17. Computational Fluid Dynamics Analysis Success Stories of X-Plane Design to Flight Test

    NASA Technical Reports Server (NTRS)

    Cosentino, Gary B.

    2008-01-01

    Examples of the design and flight test of three true X-planes are described, particularly X-plane design techniques that relied heavily on computational fluid dynamics (CFD) analysis. Three examples are presented: the X-36 Tailless Fighter Agility Research Aircraft, the X-45A Unmanned Combat Air Vehicle, and the X-48B Blended Wing Body Demonstrator Aircraft. An overview is presented of the uses of CFD analysis, comparison and contrast with wind tunnel testing, and information derived from CFD analysis that directly related to successful flight test. Lessons learned on the proper and improper application of CFD analysis are presented. Highlights of the flight-test results of the three example X-planes are presented. This report discusses developing an aircraft shape from early concept and three-dimensional modeling through CFD analysis, wind tunnel testing, further refined CFD analysis, and, finally, flight. An overview of the areas in which CFD analysis does and does not perform well during this process is presented. How wind tunnel testing complements, calibrates, and verifies CFD analysis is discussed. Lessons learned revealing circumstances under which CFD analysis results can be misleading are given. Strengths and weaknesses of the various flow solvers, including panel methods, Euler, and Navier-Stokes techniques, are discussed.

  18. Development and Application of Benchmark Examples for Mixed-Mode I/II Quasi-Static Delamination Propagation Predictions

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald

    2012-01-01

    The development of benchmark examples for quasi-static delamination propagation prediction is presented. The example is based on a finite element model of the Mixed-Mode Bending (MMB) specimen for 50% mode II. The benchmarking is demonstrated for Abaqus/Standard; however, the example is independent of the analysis software used and allows the assessment of the automated delamination propagation prediction capability in commercial finite element codes based on the virtual crack closure technique (VCCT). First, a quasi-static benchmark example was created for the specimen. Second, starting from an initially straight front, the delamination was allowed to propagate under quasi-static loading. Third, the load-displacement as well as delamination length versus applied load/displacement relationships from a propagation analysis and the benchmark results were compared, and good agreement could be achieved by selecting the appropriate input parameters. The benchmarking procedure proved valuable by highlighting the issues associated with choosing the input parameters of the particular implementation. Overall, the results are encouraging, but further assessment for mixed-mode delamination fatigue onset and growth is required.

  19. Architectural Analysis of Dynamically Reconfigurable Systems

    NASA Technical Reports Server (NTRS)

    Lindvall, Mikael; Godfrey, Sally; Ackermann, Chris; Ray, Arnab; Yonkwa, Lyly

    2010-01-01

    Topics include: the problem (increased flexibility of architectural styles decreases analyzability; behavior emerges and varies depending on the configuration; does the resulting system run according to the intended design; and architectural decisions can impede or facilitate testing); top-down approach to architecture analysis, detection of defects and deviations, and architecture and its testability; currently targeted projects GMSEC and CFS; analyzing software architectures; analyzing runtime events; actual architecture recognition; GMPUB in Dynamic SAVE; sample output from the new approach; taking message timing delays into account; CFS examples of architecture and testability; some recommendations for improved testability; CFS examples of abstract interfaces and testability; and a CFS example of opening some internal details.

  20. The Contribution of Particle Swarm Optimization to Three-Dimensional Slope Stability Analysis

    PubMed Central

    Kalatehjari, Roohollah; Rashid, Ahmad Safuan A; Ali, Nazri; Hajihassani, Mohsen

    2014-01-01

    Over the last few years, particle swarm optimization (PSO) has been extensively applied to various geotechnical engineering problems, including slope stability analysis. However, this contribution was limited to two-dimensional (2D) slope stability analysis. This paper applied PSO to the three-dimensional (3D) slope stability problem to determine the critical slip surface (CSS) of soil slopes. A detailed description of the adopted PSO was presented to provide a good basis for further contributions of this technique to the field of 3D slope stability problems. A general rotating ellipsoid shape was introduced as the specific particle for 3D slope stability analysis. A detailed sensitivity analysis was designed and performed to find the optimum values of the parameters of PSO. Example problems were used to evaluate the applicability of PSO in determining the CSS of 3D slopes. The first example presented a comparison between the results of PSO and the PLAXIS 3D finite element software, and the second example compared the ability of PSO to determine the CSS of 3D slopes with other optimization methods from the literature. The results demonstrated the efficiency and effectiveness of PSO in determining the CSS of 3D soil slopes. PMID:24991652

  1. The contribution of particle swarm optimization to three-dimensional slope stability analysis.

    PubMed

    Kalatehjari, Roohollah; Rashid, Ahmad Safuan A; Ali, Nazri; Hajihassani, Mohsen

    2014-01-01

    Over the last few years, particle swarm optimization (PSO) has been extensively applied to various geotechnical engineering problems, including slope stability analysis. However, this contribution was limited to two-dimensional (2D) slope stability analysis. This paper applied PSO to the three-dimensional (3D) slope stability problem to determine the critical slip surface (CSS) of soil slopes. A detailed description of the adopted PSO was presented to provide a good basis for further contributions of this technique to the field of 3D slope stability problems. A general rotating ellipsoid shape was introduced as the specific particle for 3D slope stability analysis. A detailed sensitivity analysis was designed and performed to find the optimum values of the parameters of PSO. Example problems were used to evaluate the applicability of PSO in determining the CSS of 3D slopes. The first example presented a comparison between the results of PSO and the PLAXIS 3D finite element software, and the second example compared the ability of PSO to determine the CSS of 3D slopes with other optimization methods from the literature. The results demonstrated the efficiency and effectiveness of PSO in determining the CSS of 3D soil slopes.
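
    A minimal sketch of the PSO update rule the paper builds on, applied to a toy two-dimensional objective rather than the 3D factor-of-safety calculation (all coefficients illustrative):

```python
# Hedged sketch: basic particle swarm optimization on a toy objective.
import numpy as np

rng = np.random.default_rng(42)

def f(x):  # stand-in objective; the paper minimizes a slope factor of safety
    return np.sum((x - 1.5) ** 2, axis=-1)

n, dim, iters = 30, 2, 200
w, c1, c2 = 0.7, 1.5, 1.5                  # inertia and acceleration coefficients
x = rng.uniform(-5, 5, (n, dim))           # particle positions
v = np.zeros((n, dim))                     # particle velocities
pbest, pbest_val = x.copy(), f(x)          # personal bests
gbest = pbest[np.argmin(pbest_val)]        # global best

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    val = f(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print(gbest)  # converges toward the minimizer (1.5, 1.5)
```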

  2. Application of Benchmark Examples to Assess the Single and Mixed-Mode Static Delamination Propagation Capabilities in ANSYS

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald

    2012-01-01

    The application of benchmark examples for the assessment of quasi-static delamination propagation capabilities is demonstrated for ANSYS. The examples are independent of the analysis software used and allow the assessment of the automated delamination propagation in commercial finite element codes based on the virtual crack closure technique (VCCT). The examples selected are based on two-dimensional finite element models of Double Cantilever Beam (DCB), End-Notched Flexure (ENF), Mixed-Mode Bending (MMB) and Single Leg Bending (SLB) specimens. First, the quasi-static benchmark examples were recreated for each specimen using the current implementation of VCCT in ANSYS. Second, the delamination was allowed to propagate under quasi-static loading from its initial location using the automated procedure implemented in the finite element software. Third, the load-displacement relationship from a propagation analysis and the benchmark results were compared, and good agreement could be achieved by selecting the appropriate input parameters. The benchmarking procedure proved valuable by highlighting the issues associated with choosing the input parameters of the particular implementation. Overall the results are encouraging, but further assessment for three-dimensional solid models is required.

  3. Simpson's paradox visualized: The example of the Rosiglitazone meta-analysis

    PubMed Central

    Rücker, Gerta; Schumacher, Martin

    2008-01-01

    Background Simpson's paradox is sometimes referred to in the areas of epidemiology and clinical research. It can also be found in meta-analysis of randomized clinical trials. However, though readers are able to recalculate examples from hypothetical as well as real data, they may have trouble figuring out where it emerges from. Method First, two kinds of plots are proposed to illustrate the phenomenon graphically, a scatter plot and a line graph. Subsequently, these can be overlaid, resulting in an overlay plot. The plots are applied to the recent large meta-analysis of adverse effects of rosiglitazone on myocardial infarction and to an example from the literature. A large set of meta-analyses is screened for further examples. Results As noted earlier by others, occurrence of Simpson's paradox in the meta-analytic setting, if present, is associated with imbalance of treatment arm size. This is well illustrated by the proposed plots. The rosiglitazone meta-analysis shows an effect reversion if all trials are pooled. In a sample of 157 meta-analyses, nine showed an effect reversion after pooling, though non-significant in all cases. Conclusion The plots give insight into how the imbalance of trial arm size works as a confounder, thus producing Simpson's paradox. Readers can see why meta-analytic methods must be used and what is wrong with simple pooling. PMID:18513392
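
    A minimal numeric sketch of the phenomenon with hypothetical counts (not the rosiglitazone data): each trial shows a higher event rate on treatment, yet naive pooling reverses the comparison because the arm sizes are imbalanced:

```python
# Hedged sketch: Simpson's paradox from naively pooling imbalanced 2x2 tables.
events_trt, n_trt = [9, 20], [100, 1000]   # two trials, treatment arm (hypothetical)
events_ctl, n_ctl = [60, 1], [1000, 100]   # control arm (hypothetical)

for i in range(2):
    print(f"trial {i + 1}: trt {events_trt[i] / n_trt[i]:.3f} "
          f"vs ctl {events_ctl[i] / n_ctl[i]:.3f}")   # treatment higher in both

pool_t = sum(events_trt) / sum(n_trt)   # 29/1100 = 0.026
pool_c = sum(events_ctl) / sum(n_ctl)   # 61/1100 = 0.055
print(f"pooled:  trt {pool_t:.3f} vs ctl {pool_c:.3f}")  # comparison reverses
```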

  4. Relativity Concept Inventory: Development, Analysis, and Results

    ERIC Educational Resources Information Center

    Aslanides, J. S.; Savage, C. M.

    2013-01-01

    We report on a concept inventory for special relativity: the development process, data analysis methods, and results from an introductory relativity class. The Relativity Concept Inventory tests understanding of relativistic concepts. An unusual feature is confidence testing for each question. This can provide additional information; for example,…

  5. Rapid Harmonic Analysis of Piezoelectric MEMS Resonators.

    PubMed

    Puder, Jonathan M; Pulskamp, Jeffrey S; Rudy, Ryan Q; Cassella, Cristian; Rinaldi, Matteo; Chen, Guofeng; Bhave, Sunil A; Polcawich, Ronald G

    2018-06-01

    This paper reports on a novel simulation method combining the speed of analytical evaluation with the accuracy of finite-element analysis (FEA). This method is known as the rapid analytical-FEA technique (RAFT). The ability of the RAFT to accurately predict frequency response orders of magnitude faster than conventional simulation methods while providing deeper insights into device design not possible with other types of analysis is detailed. Simulation results from the RAFT across wide bandwidths are compared to measured results of resonators fabricated with various materials, frequencies, and topologies with good agreement. These include resonators targeting beam extension, disk flexure, and Lamé beam modes. An example scaling analysis is presented and other applications enabled are discussed as well. The supplemental material includes example code for implementation in ANSYS, although any commonly employed FEA package may be used.

  6. Propensity Score Analysis: An Alternative Statistical Approach for HRD Researchers

    ERIC Educational Resources Information Center

    Keiffer, Greggory L.; Lane, Forrest C.

    2016-01-01

    Purpose: This paper aims to introduce matching in propensity score analysis (PSA) as an alternative statistical approach for researchers looking to make causal inferences using intact groups. Design/methodology/approach: An illustrative example demonstrated the varying results of analysis of variance, analysis of covariance and PSA on a heuristic…

  7. The Propensity Score Analytical Framework: An Overview and Institutional Research Example

    ERIC Educational Resources Information Center

    Herzog, Serge

    2014-01-01

    Estimating the effect of campus math tutoring support, this study demonstrates the use of propensity score weighted and matched-data analysis and examines the correspondence with results from parametric regression analysis.
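
    A hedged sketch of the framework on synthetic data (not the institutional dataset): estimate propensity scores with logistic regression, then compare a naive difference in means with an inverse-probability-weighted estimate:

```python
# Hedged sketch: propensity-score (IPW) adjustment for an observational comparison.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 2000
x = rng.normal(size=(n, 2))                      # confounders, e.g. prior GPA, placement score
p_treat = 1 / (1 + np.exp(-(x[:, 0] + 0.5 * x[:, 1])))
t = rng.random(n) < p_treat                      # self-selection into tutoring
y = 0.3 * t + x[:, 0] + rng.normal(size=n)       # outcome with true effect 0.3

ps = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]
w = np.where(t, 1 / ps, 1 / (1 - ps))            # inverse-probability-of-treatment weights

naive = y[t].mean() - y[~t].mean()               # confounded estimate
ate = np.average(y[t], weights=w[t]) - np.average(y[~t], weights=w[~t])
print(f"naive: {naive:.2f}, IPW estimate: {ate:.2f} (truth 0.3)")
```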

  8. Incorporating Descriptive Assessment Results into the Design of a Functional Analysis: A Case Example Involving a Preschooler's Hand Mouthing

    ERIC Educational Resources Information Center

    Tiger, Jeffrey H.; Hanley, Gregory P.; Bessette, Kimberly K.

    2006-01-01

    Functional analysis methodology has become the hallmark of behavioral assessment, yielding a determination of behavioral function in roughly 96% of the cases published (Hanley, Iwata, & McCord, 2003). Some authors have suggested that incorporating the results of a descriptive assessment into the design of a functional analysis may be useful in…

  9. Code query by example

    NASA Astrophysics Data System (ADS)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.

  10. Linear regression analysis: part 14 of a series on evaluation of scientific publications.

    PubMed

    Schneider, Astrid; Hommel, Gerhard; Blettner, Maria

    2010-11-01

    Regression analysis is an important statistical method for the analysis of medical data. It enables the identification and characterization of relationships among multiple factors. It also enables the identification of prognostically relevant risk factors and the calculation of risk scores for individual prognostication. This article is based on selected textbooks of statistics, a selective review of the literature, and our own experience. After a brief introduction of the uni- and multivariable regression models, illustrative examples are given to explain what the important considerations are before a regression analysis is performed, and how the results should be interpreted. The reader should then be able to judge whether the method has been used correctly and interpret the results appropriately. The performance and interpretation of linear regression analysis are subject to a variety of pitfalls, which are discussed here in detail. The reader is made aware of common errors of interpretation through practical examples. Both the opportunities for applying linear regression analysis and its limitations are presented.
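
    A minimal sketch of the kind of univariable regression the article discusses, with the quantities a reader is told to interpret (synthetic data; the variable names are invented for illustration):

```python
# Hedged sketch: univariable linear regression and its interpretable outputs.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
age = rng.uniform(20, 70, 100)
sbp = 95 + 0.8 * age + rng.normal(0, 10, 100)   # systolic blood pressure, synthetic

X = sm.add_constant(age)         # intercept-plus-slope model
fit = sm.OLS(sbp, X).fit()
print(fit.params)                # intercept and slope (mmHg per year of age)
print(fit.conf_int())            # 95% confidence intervals for both coefficients
print(fit.rsquared)              # proportion of variance explained
```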

  11. Managing large-scale workflow execution from resource provisioning to provenance tracking: The CyberShake example

    USGS Publications Warehouse

    Deelman, E.; Callaghan, S.; Field, E.; Francoeur, H.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T.H.; Kesselman, C.; Maechling, P.; Mehringer, J.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.

    2006-01-01

    This paper discusses the process of building an environment where large-scale, complex, scientific analysis can be scheduled onto a heterogeneous collection of computational and storage resources. The example application is the Southern California Earthquake Center (SCEC) CyberShake project, an analysis designed to compute probabilistic seismic hazard curves for sites in the Los Angeles area. We explain which software tools were used to build the system, and describe their functionality and interactions. We show the results of running the CyberShake analysis that included over 250,000 jobs using resources available through SCEC and the TeraGrid. © 2006 IEEE.

  12. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  13. CFD to Flight: Some Recent Success Stories of X-Plane Design to Flight Test at the NASA Dryden Flight Research Center

    NASA Technical Reports Server (NTRS)

    Cosentino, Gary B.

    2007-01-01

    Several examples from the past decade of success stories involving the design and flight test of three true X-planes will be described: in particular, X-plane design techniques that relied heavily upon computational fluid dynamics (CFD). Three specific examples chosen from the author's personal experience are presented: the X-36 Tailless Fighter Agility Research Aircraft, the X-45A Unmanned Combat Air Vehicle, and, most recently, the X-48B Blended Wing Body Demonstrator Aircraft. An overview will be presented of the uses of CFD analysis, comparisons and contrasts with wind tunnel testing, and information derived from the CFD analysis that directly related to successful flight test. Some lessons learned on the proper application, and misapplication, of CFD are illustrated. Finally, some highlights of the flight-test results of the three example X-planes will be presented. This overview paper will discuss some of the author's experience with taking an aircraft shape from early concept and three-dimensional modeling through CFD analysis, wind tunnel testing, further refined CFD analysis, and, finally, flight. An overview of the key roles in which CFD plays well during this process, and some other roles in which it does not, are discussed. How wind tunnel testing complements, calibrates, and verifies CFD analysis is also covered. Lessons learned on where CFD results can be misleading are also given. Strengths and weaknesses of the various types of flow solvers, including panel methods, Euler, and Navier-Stokes techniques, are discussed. The paper concludes with the three specific examples, including some flight test video footage of the X-36, the X-45A, and the X-48B.

  14. CFD to Flight: Some Recent Success Stories of X-plane Design to Flight Test at the NASA Dryden Flight Research Center

    NASA Technical Reports Server (NTRS)

    Cosentino, Gary B.

    2007-01-01

    Several examples from the past decade of success stories involving the design and flight test of three true X-planes will be described: in particular, X-plane design techniques that relied heavily upon computational fluid dynamics (CFD). Three specific examples chosen from the author's personal experience are presented: the X-36 Tailless Fighter Agility Research Aircraft, the X-45A Unmanned Combat Air Vehicle, and, most recently, the X-48B Blended Wing Body Demonstrator Aircraft. An overview will be presented of the uses of CFD analysis, comparisons and contrasts with wind tunnel testing, and information derived from the CFD analysis that directly related to successful flight test. Some lessons learned on the proper application, and misapplication, of CFD are illustrated. Finally, some highlights of the flight-test results of the three example X-planes will be presented. This overview paper will discuss some of the author's experience with taking an aircraft shape from early concept and three-dimensional modeling through CFD analysis, wind tunnel testing, further refined CFD analysis, and, finally, flight. An overview of the key roles in which CFD plays well during this process, and some other roles in which it does not, are discussed. How wind tunnel testing complements, calibrates, and verifies CFD analysis is also covered. Lessons learned on where CFD results can be misleading are also given. Strengths and weaknesses of the various types of flow solvers, including panel methods, Euler, and Navier-Stokes techniques, are discussed. The paper concludes with the three specific examples, including some flight test video footage of the X-36, the X-45A, and the X-48B.

  15. The Simpson's paradox unraveled

    PubMed Central

    Hernán, Miguel A; Clayton, David; Keiding, Niels

    2011-01-01

    Background In a famous article, Simpson described a hypothetical data example that led to apparently paradoxical results. Methods We make the causal structure of Simpson's example explicit. Results We show how the paradox disappears when the statistical analysis is appropriately guided by subject-matter knowledge. We also review previous explanations of Simpson's paradox that attributed it to two distinct phenomena: confounding and non-collapsibility. Conclusion Analytical errors may occur when the problem is stripped of its causal context and analyzed merely in statistical terms. PMID:21454324

  16. Regional Morphology Analysis Package (RMAP): Empirical Orthogonal Function Analysis, Background and Examples

    DTIC Science & Technology

    2007-10-01

    [Snippet assembled from the note's reference list] … 1984. Complex principal component analysis: Theory and examples. Journal of Climate and Applied Meteorology 23: 1660-1673. Hotelling, H. 1933 … Sediments 99. ASCE: 2,566-2,581. Von Storch, H., and A. Navarra. 1995. Analysis of climate variability: Applications of statistical techniques. Berlin … ERDC TN-SWWRP-07-9, October 2007. Regional Morphology Analysis Package (RMAP): Empirical Orthogonal Function Analysis, Background and Examples.
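
    A minimal sketch of the EOF decomposition the note documents, computed via the SVD of a demeaned space-time matrix (synthetic field, not RMAP output):

```python
# Hedged sketch: empirical orthogonal function (EOF) analysis via the SVD.
import numpy as np

rng = np.random.default_rng(5)
time, space = 120, 40
# Synthetic field: one oscillating spatial mode plus noise
field = np.outer(np.sin(np.linspace(0, 12, time)),
                 np.sin(np.linspace(0, np.pi, space)))
field += 0.1 * rng.normal(size=(time, space))

anom = field - field.mean(axis=0)          # remove the time mean at each location
u, s, vt = np.linalg.svd(anom, full_matrices=False)
eofs = vt                                  # rows: spatial patterns (EOFs)
pcs = u * s                                # columns: temporal amplitudes (PCs)
var_frac = s**2 / np.sum(s**2)             # variance explained per mode
print(var_frac[:3])                        # the leading mode dominates
```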

  17. Summarising and validating test accuracy results across multiple studies for use in clinical practice.

    PubMed

    Riley, Richard D; Ahmed, Ikhlaaq; Debray, Thomas P A; Willis, Brian H; Noordzij, J Pieter; Higgins, Julian P T; Deeks, Jonathan J

    2015-06-15

    Following a meta-analysis of test accuracy studies, the translation of summary results into clinical practice is potentially problematic. The sensitivity, specificity and positive (PPV) and negative (NPV) predictive values of a test may differ substantially from the average meta-analysis findings, because of heterogeneity. Clinicians thus need more guidance: given the meta-analysis, is a test likely to be useful in new populations, and if so, how should test results inform the probability of existing disease (for a diagnostic test) or future adverse outcome (for a prognostic test)? We propose ways to address this. Firstly, following a meta-analysis, we suggest deriving prediction intervals and probability statements about the potential accuracy of a test in a new population. Secondly, we suggest strategies on how clinicians should derive post-test probabilities (PPV and NPV) in a new population based on existing meta-analysis results and propose a cross-validation approach for examining and comparing their calibration performance. Application is made to two clinical examples. In the first example, the joint probability that both sensitivity and specificity will be >80% in a new population is just 0.19, because of a low sensitivity. However, the summary PPV of 0.97 is high and calibrates well in new populations, with a probability of 0.78 that the true PPV will be at least 0.95. In the second example, post-test probabilities calibrate better when tailored to the prevalence in the new population, with cross-validation revealing a probability of 0.97 that the observed NPV will be within 10% of the predicted NPV. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
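
    The tailoring step rests on the standard identities linking post-test probabilities to sensitivity, specificity, and the prevalence π in the new population (shown here for orientation; the paper's contribution is the meta-analytic prediction and calibration built around them):

```latex
% Post-test probabilities from sensitivity, specificity and prevalence \pi:
\mathrm{PPV} = \frac{\mathrm{sens}\,\pi}{\mathrm{sens}\,\pi + (1-\mathrm{spec})(1-\pi)},
\qquad
\mathrm{NPV} = \frac{\mathrm{spec}\,(1-\pi)}{(1-\mathrm{sens})\,\pi + \mathrm{spec}\,(1-\pi)} .
```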

  18. Results of an integrated structure/control law design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Gilbert, Michael G.

    1989-01-01

    A design sensitivity analysis method for Linear Quadratic Cost, Gaussian (LQG) optimal control laws, which predicts change in the optimal control law due to changes in fixed problem parameters using analytical sensitivity equations is discussed. Numerical results of a design sensitivity analysis for a realistic aeroservoelastic aircraft example are presented. In this example, the sensitivity of the optimally controlled aircraft's response to various problem formulation and physical aircraft parameters is determined. These results are used to predict the aircraft's new optimally controlled response if the parameter was to have some other nominal value during the control law design process. The sensitivity results are validated by recomputing the optimal control law for discrete variations in parameters, computing the new actual aircraft response, and comparing with the predicted response. These results show an improvement in sensitivity accuracy for integrated design purposes over methods which do not include changes in the optimal control law. Use of the analytical LQG sensitivity expressions is also shown to be more efficient than finite difference methods for the computation of the equivalent sensitivity information.

  19. Example-based super-resolution for single-image analysis from the Chang'e-1 Mission

    NASA Astrophysics Data System (ADS)

    Wu, Fan-Lu; Wang, Xiang-Jun

    2016-11-01

    Due to the low spatial resolution of images taken from the Chang'e-1 (CE-1) orbiter, the details of the lunar surface are blurred and lost. Considering the limited spatial resolution of image data obtained by a CCD camera on CE-1, an example-based super-resolution (SR) algorithm is employed to obtain high-resolution (HR) images. SR reconstruction is important for the application of image data to increase the resolution of images. In this article, a novel example-based algorithm is proposed to implement SR reconstruction by single-image analysis, and the computational cost is reduced compared to other example-based SR methods. The results show that this method can enhance the resolution of images using SR and recover detailed information about the lunar surface. Thus it can be used for surveying HR terrain and geological features. Moreover, the algorithm is significant for the HR processing of remotely sensed images obtained by other imaging systems.

  20. Successes and Challenges in Linking Observations and Modeling of Marine and Terrestrial Cryospheric Processes

    NASA Astrophysics Data System (ADS)

    Herzfeld, U. C.; Hunke, E. C.; Trantow, T.; Greve, R.; McDonald, B.; Wallin, B.

    2014-12-01

    Understanding of the state of the cryosphere and its relationship to other components of the Earth system requires both models of geophysical processes and observations of geophysical properties and processes; however, linking observations and models is far from trivial. This paper looks at examples of sea-ice and land-ice model-observation linkages to examine some approaches, challenges and solutions. In a sea-ice example, ice deformation is analyzed as a key process that indicates fundamental changes in the Arctic sea ice cover. Simulation results from the Los Alamos Sea-Ice Model CICE, which is also the sea-ice component of the Community Earth System Model (CESM), are compared to parameters indicative of deformation as derived from mathematical analysis of remote sensing data. Data include altimeter, micro-ASAR and image data from manned and unmanned aircraft campaigns (NASA OIB and Characterization of Arctic Sea Ice Experiment, CASIE). The key problem in linking data and model results is the derivation of matching parameters on both the model and observation sides. For terrestrial glaciology, we include an example of a surge process in a glacier system and an example of a dynamic ice sheet model for Greenland. To investigate the surge of the Bering Bagley Glacier System, we use numerical forward modeling experiments and, on the data analysis side, a connectionist approach to analyze crevasse provinces. In the Greenland ice sheet example, we look at the influence of ice surface and bed topography, as derived from remote sensing data, on results from a dynamic ice sheet model.

  1. Regression Analysis: Legal Applications in Institutional Research

    ERIC Educational Resources Information Center

    Frizell, Julie A.; Shippen, Benjamin S., Jr.; Luna, Andrew L.

    2008-01-01

    This article reviews multiple regression analysis, describes how its results should be interpreted, and instructs institutional researchers on how to conduct such analyses using an example focused on faculty pay equity between men and women. The use of multiple regression analysis will be presented as a method with which to compare salaries of…

  2. Standardized Effect Size Measures for Mediation Analysis in Cluster-Randomized Trials

    ERIC Educational Resources Information Center

    Stapleton, Laura M.; Pituch, Keenan A.; Dion, Eric

    2015-01-01

    This article presents 3 standardized effect size measures to use when sharing results of an analysis of mediation of treatment effects for cluster-randomized trials. The authors discuss 3 examples of mediation analysis (upper-level mediation, cross-level mediation, and cross-level mediation with a contextual effect) with demonstration of the…

  3. The Effectiveness of Worked Examples Associated with Presentation Format and Prior Knowledge: A Web-Based Experiment

    ERIC Educational Resources Information Center

    Hsiao, E-Ling

    2010-01-01

    The aim of this study is to explore whether presentation format and prior knowledge affect the effectiveness of worked examples. The experiment was conducted through a specially designed online instrument. A 2 × 2 × 3 factorial before-and-after design was conducted. Three-way ANOVA was employed for data analysis. The result showed first, that prior…

  4. The Columbia Debris Loan Program; Examples of Microscopic Analysis

    NASA Technical Reports Server (NTRS)

    Russell, Rick; Thurston, Scott; Smith, Stephen; Marder, Arnold; Steckel, Gary

    2006-01-01

    Following the tragic loss of the Space Shuttle Columbia, NASA formed the Columbia Recovery Office (CRO). The CRO was initially formed at the Johnson Space Center after the conclusion of recovery operations on May 1, 2003, then transferred to the Kennedy Space Center on October 6, 2003, and renamed the Columbia Recovery Office and Preservation. An integral part of the preservation project was the development of a process to loan Columbia debris to qualified researchers and technical educators. The purposes of this program include aiding the advancement of advanced spacecraft design and flight safety development, advancing the study of hypersonic re-entry to enhance ground safety, training and instructing accident investigators, and establishing an enduring legacy for Space Shuttle Columbia and her crew. Along with a summary of the debris loan process, examples of microscopic analysis of Columbia debris items will be presented. The first example will be from the reconstruction following the STS-107 accident and how the Materials and Processes team used microscopic analysis to confirm the accident scenario. Additionally, three examples of microstructural results from the debris loan process, from NASA internal, academic, and private industry researchers, will be presented.

  5. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta‐analysis and group level studies

    PubMed Central

    Bakbergenuly, Ilyas; Morgenthaler, Stephan

    2016-01-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. PMID:27192062
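
    A hedged sketch of where such transformation bias comes from, via the standard second-order (delta-method) expansion rather than the paper's own derivation:

```latex
% Second-order expansion of a nonlinear transform f of an estimator \hat{p}:
\mathrm{E}\!\left[f(\hat{p})\right] \approx f(p) + \tfrac{1}{2}\,f''(p)\,\mathrm{Var}(\hat{p}).
% For the log-odds f(p) = \log\{p/(1-p)\} or the arcsine f(p) = \arcsin\sqrt{p},
% the curvature f''(p) does not vanish, and for overdispersed binomial data
% \mathrm{Var}(\hat{p}) = p(1-p)\{1 + (n-1)\rho\}/n, so the leading bias term
% grows linearly in the intracluster correlation \rho.
```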

  6. Survival analysis: Part I — analysis of time-to-event

    PubMed Central

    2018-01-01

    Length of time is a variable often encountered during data analysis. Survival analysis provides simple, intuitive results concerning time-to-event for events of interest, which are not confined to death. This review introduces methods of analyzing time-to-event. The Kaplan-Meier survival analysis, log-rank test, and Cox proportional hazards regression modeling method are described with examples of hypothetical data. PMID:29768911
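
    A minimal sketch of the Kaplan-Meier estimator computed directly from its definition, on hypothetical times (event = 1, censored = 0):

```python
# Hedged sketch: Kaplan-Meier survival estimate from hypothetical data.
import numpy as np

time  = np.array([2, 3, 3, 5, 8, 8, 9, 12])
event = np.array([1, 1, 0, 1, 1, 0, 1, 0])   # 1 = event observed, 0 = censored

surv = 1.0
for t in np.unique(time[event == 1]):         # step only at event times
    at_risk = np.sum(time >= t)               # subjects still under observation at t
    deaths = np.sum((time == t) & (event == 1))
    surv *= 1 - deaths / at_risk              # product-limit update
    print(f"t={t}: S(t)={surv:.3f}")
```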

  7. Topological analysis of metabolic networks based on petri net theory.

    PubMed

    Zevedei-Oancea, Ionela; Schuster, Stefan

    2011-01-01

    Petri net concepts provide additional tools for the modelling of metabolic networks. Here, the similarities between the counterparts in traditional biochemical modelling and Petri net theory are discussed. For example, the stoichiometry matrix of a metabolic network corresponds to the incidence matrix of the Petri net. The flux modes and conservation relations have, respectively, the T-invariants and P-invariants as counterparts. We reveal the biological meaning of some notions specific to the Petri net framework (traps, siphons, deadlocks, liveness). We focus on the topological analysis rather than on the analysis of the dynamic behaviour. The treatment of external metabolites is discussed. Some simple theoretical examples are presented for illustration. Also the Petri nets corresponding to some biochemical networks are built to support our results. For example, the role of triose phosphate isomerase (TPI) in Trypanosoma brucei metabolism is evaluated by detecting siphons and traps. All Petri net properties treated in this contribution are exemplified on a system extracted from nucleotide metabolism.
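
    A minimal sketch of the stoichiometry-as-incidence-matrix correspondence on a toy three-reaction cycle (not one of the paper's networks): P-invariants are integer vectors y with yᵀN = 0, computed here from the left nullspace:

```python
# Hedged sketch: P-invariants of a toy network read as a Petri net.
import sympy as sp

# Rows = places/metabolites (A, B, C), columns = transitions/reactions:
# r1: A -> B, r2: B -> C, r3: C -> A  (a conserved cycle)
N = sp.Matrix([
    [-1,  0,  1],
    [ 1, -1,  0],
    [ 0,  1, -1],
])

invariants = N.T.nullspace()   # left-nullspace of N, i.e. vectors y with y^T N = 0
print(invariants)              # [1, 1, 1]: the total amount A + B + C is conserved
```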

  8. Topological analysis of metabolic networks based on Petri net theory.

    PubMed

    Zevedei-Oancea, Ionela; Schuster, Stefan

    2003-01-01

    Petri net concepts provide additional tools for the modelling of metabolic networks. Here, the similarities between the counterparts in traditional biochemical modelling and Petri net theory are discussed. For example, the stoichiometry matrix of a metabolic network corresponds to the incidence matrix of the Petri net. The flux modes and conservation relations have, respectively, the T-invariants and P-invariants as counterparts. We reveal the biological meaning of some notions specific to the Petri net framework (traps, siphons, deadlocks, liveness). We focus on the topological analysis rather than on the analysis of the dynamic behaviour. The treatment of external metabolites is discussed. Some simple theoretical examples are presented for illustration. Also the Petri nets corresponding to some biochemical networks are built to support our results. For example, the role of triose phosphate isomerase (TPI) in Trypanosoma brucei metabolism is evaluated by detecting siphons and traps. All Petri net properties treated in this contribution are exemplified on a system extracted from nucleotide metabolism.

  9. Best (but oft-forgotten) practices: mediation analysis.

    PubMed

    Fairchild, Amanda J; McDaniel, Heather L

    2017-06-01

    This contribution in the "Best (but Oft-Forgotten) Practices" series considers mediation analysis. A mediator (sometimes referred to as an intermediate variable, surrogate endpoint, or intermediate endpoint) is a third variable that explains how or why ≥2 other variables relate in a putative causal pathway. The current article discusses mediation analysis with the ultimate intention of helping nutrition researchers to clarify the rationale for examining mediation, avoid common pitfalls when using the model, and conduct well-informed analyses that can contribute to improving causal inference in evaluations of underlying mechanisms of effects on nutrition-related behavioral and health outcomes. We give specific attention to underevaluated limitations inherent in common approaches to mediation. In addition, we discuss how to conduct a power analysis for mediation models and offer an applied example to demonstrate mediation analysis. Finally, we provide an example write-up of mediation analysis results as a model for applied researchers. © 2017 American Society for Nutrition.
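
    A hedged sketch of the basic single-mediator computation the article reviews, the indirect effect a·b with a percentile bootstrap interval, on synthetic data (the article's own examples and cautions are not reproduced):

```python
# Hedged sketch: indirect effect a*b with a percentile bootstrap CI.
import numpy as np

rng = np.random.default_rng(11)
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # path a = 0.5 (X -> M)
y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # path b = 0.4, direct effect 0.2

def indirect(xi, mi, yi):
    a = np.polyfit(xi, mi, 1)[0]                       # slope of M on X
    X = np.column_stack([mi, xi, np.ones_like(xi)])    # Y on (M, X, 1)
    b = np.linalg.lstsq(X, yi, rcond=None)[0][0]       # coefficient on M
    return a * b

est = indirect(x, m, y)
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)                        # resample cases
    boot.append(indirect(x[idx], m[idx], y[idx]))
print(est, np.percentile(boot, [2.5, 97.5]))           # truth: 0.5 * 0.4 = 0.2
```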

  10. Best (but oft-forgotten) practices: mediation analysis

    PubMed Central

    McDaniel, Heather L

    2017-01-01

    This contribution in the “Best (but Oft-Forgotten) Practices” series considers mediation analysis. A mediator (sometimes referred to as an intermediate variable, surrogate endpoint, or intermediate endpoint) is a third variable that explains how or why ≥2 other variables relate in a putative causal pathway. The current article discusses mediation analysis with the ultimate intention of helping nutrition researchers to clarify the rationale for examining mediation, avoid common pitfalls when using the model, and conduct well-informed analyses that can contribute to improving causal inference in evaluations of underlying mechanisms of effects on nutrition-related behavioral and health outcomes. We give specific attention to underevaluated limitations inherent in common approaches to mediation. In addition, we discuss how to conduct a power analysis for mediation models and offer an applied example to demonstrate mediation analysis. Finally, we provide an example write-up of mediation analysis results as a model for applied researchers. PMID:28446497

  11. Discovering Hidden Controlling Parameters using Data Analytics and Dimensional Analysis

    NASA Astrophysics Data System (ADS)

    Del Rosario, Zachary; Lee, Minyong; Iaccarino, Gianluca

    2017-11-01

    Dimensional Analysis is a powerful tool, one which takes a priori information and produces important simplifications. However, if this a priori information - the list of relevant parameters - is missing a relevant quantity, then the conclusions from Dimensional Analysis will be incorrect. In this work, we present novel conclusions in Dimensional Analysis, which provide a means to detect this failure mode of missing or hidden parameters. These results are based on a restated form of the Buckingham Pi theorem that reveals a ridge function structure underlying all dimensionless physical laws. We leverage this structure by constructing a hypothesis test based on sufficient dimension reduction, allowing for an experimental data-driven detection of hidden parameters. Both theory and examples will be presented, using classical turbulent pipe flow as the working example. Keywords: experimental techniques, dimensional analysis, lurking variables, hidden parameters, buckingham pi, data analysis. First author supported by the NSF GRFP under Grant Number DGE-114747.
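
    For orientation, the pipe-flow working example in dimensionless form (a standard result, not the authors' restatement): Buckingham Pi collapses the pressure-drop law to a relation between groups, and a hidden parameter such as an unlisted roughness then appears as unexplained scatter in the fitted function:

```latex
% Buckingham Pi applied to pipe flow: \Delta p = f(\rho, \mu, V, D, L, \varepsilon)
% collapses to a law between dimensionless groups,
\frac{\Delta p}{\tfrac{1}{2}\rho V^2}\,\frac{D}{L}
   = \phi\!\left(\mathrm{Re} = \frac{\rho V D}{\mu},\; \frac{\varepsilon}{D}\right).
% Omitting \varepsilon/D from the parameter list leaves systematic scatter
% in the fitted \phi, the signature of a hidden parameter.
```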

  12. Low-dimensional Representation of Error Covariance

    NASA Technical Reports Server (NTRS)

    Tippett, Michael K.; Cohn, Stephen E.; Todling, Ricardo; Marchesin, Dan

    2000-01-01

    Ensemble and reduced-rank approaches to prediction and assimilation rely on low-dimensional approximations of the estimation error covariances. Here stability properties of the forecast/analysis cycle for linear, time-independent systems are used to identify factors that cause the steady-state analysis error covariance to admit a low-dimensional representation. A useful measure of forecast/analysis cycle stability is the bound matrix, a function of the dynamics, observation operator and assimilation method. Upper and lower estimates for the steady-state analysis error covariance matrix eigenvalues are derived from the bound matrix. The estimates generalize to time-dependent systems. If much of the steady-state analysis error variance is due to a few dominant modes, the leading eigenvectors of the bound matrix approximate those of the steady-state analysis error covariance matrix. The analytical results are illustrated in two numerical examples where the Kalman filter is carried to steady state. The first example uses the dynamics of a generalized advection equation exhibiting nonmodal transient growth. Failure to observe growing modes leads to increased steady-state analysis error variances. Leading eigenvectors of the steady-state analysis error covariance matrix are well approximated by leading eigenvectors of the bound matrix. The second example uses the dynamics of a damped baroclinic wave model. The leading eigenvectors of a lowest-order approximation of the bound matrix are shown to approximate well the leading eigenvectors of the steady-state analysis error covariance matrix.
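
    A minimal sketch of the forecast/analysis cycle analyzed above: iterate the Kalman filter covariance recursion for a small, stable linear system until it reaches steady state, then inspect the eigenvalue spectrum of the analysis error covariance. The dynamics, observation operator, and noise levels below are invented stand-ins, not the paper's advection or baroclinic wave models.

      import numpy as np

      rng = np.random.default_rng(1)
      n, p = 20, 3                       # state dimension, number of observations

      M = 0.95 * np.linalg.qr(rng.normal(size=(n, n)))[0]   # stable dynamics
      H = np.zeros((p, n))
      H[np.arange(p), [0, 5, 10]] = 1.0  # observe three state components
      Q = 0.01 * np.eye(n)               # model-error covariance
      R = 0.10 * np.eye(p)               # observation-error covariance

      # Forecast/analysis (Riccati) cycle iterated to steady state
      Pa = np.eye(n)
      for _ in range(2000):
          Pf = M @ Pa @ M.T + Q                           # forecast step
          K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)  # Kalman gain
          Pa = (np.eye(n) - K @ H) @ Pf                   # analysis step

      # Steep eigenvalue decay means a low-dimensional representation suffices
      eig = np.linalg.eigvalsh(Pa)[::-1]
      print("leading eigenvalues:", np.round(eig[:5], 4))
      print("variance fraction in 5 modes:", round(eig[:5].sum() / eig.sum(), 3))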

  13. The analysis of delays in simulator digital computing systems. Volume 1: Formulation of an analysis approach using a central example simulator model

    NASA Technical Reports Server (NTRS)

    Heffley, R. K.; Jewell, W. F.; Whitbeck, R. F.; Schulman, T. M.

    1980-01-01

    The effects of spurious delays in real time digital computing systems are examined. Various sources of spurious delays are defined and analyzed using an extant simulator system as an example. A specific analysis procedure is set forth and four cases are viewed in terms of their time and frequency domain characteristics. Numerical solutions are obtained for three single rate one- and two-computer examples, and the analysis problem is formulated for a two-rate, two-computer example.

  14. Development of Curved-Plate Elements for the Exact Buckling Analysis of Composite Plate Assemblies Including Transverse Shear Effects

    NASA Technical Reports Server (NTRS)

    McGowan, David M.; Anderson, Melvin S.

    1998-01-01

    The analytical formulation of curved-plate non-linear equilibrium equations that include transverse-shear-deformation effects is presented. A unified set of non-linear strains that contains terms from both physical and tensorial strain measures is used. Using several simplifying assumptions, linearized stability equations are derived that describe the response of the plate just after bifurcation buckling occurs. These equations are then modified to allow the plate reference surface to be located a distance z(c) from the centroid surface, which is convenient for modeling stiffened-plate assemblies. The implementation of the new theory into the VICONOPT buckling and vibration analysis and optimum design program is described. Either classical plate theory (CPT) or first-order shear-deformation plate theory (SDPT) may be selected in VICONOPT. Comparisons of numerical results for several example problems with different loading states are made. Results from the new curved-plate analysis compare well with closed-form solution results and with results from known example problems in the literature. Finally, a design-optimization study of two different cylindrical shells subject to uniform axial compression is presented.

  15. Learning Needs Analysis of Collaborative E-Classes in Semi-Formal Settings: The REVIT Example

    ERIC Educational Resources Information Center

    Mavroudi, Anna; Hadzilacos, Thanasis

    2013-01-01

    Analysis, the first phase of the typical instructional design process, is often downplayed. This paper focuses on the analysis concerning a series of e-courses for collaborative adult education in semi-formal settings by reporting and generalizing results from the REVIT project. REVIT, an EU-funded research project, offered custom e-courses to…

  16. PresenceAbsence: An R package for presence absence analysis

    Treesearch

    Elizabeth A. Freeman; Gretchen Moisen

    2008-01-01

    The PresenceAbsence package for R provides a set of functions useful when evaluating the results of presence-absence analysis, for example, models of species distribution or the analysis of diagnostic tests. The package provides a toolkit for selecting the optimal threshold for translating a probability surface into presence-absence maps specifically tailored to their...
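
    For readers outside R, the threshold-selection step such packages automate can be illustrated in a few lines: sweep candidate cutoffs and keep the one maximizing sensitivity plus specificity (Youden's J), one of several criteria used in presence-absence work. The Python sketch below is a generic illustration and does not reproduce the PresenceAbsence API.

      import numpy as np

      def optimal_threshold(observed, predicted, n_grid=101):
          """Cutoff maximizing sensitivity + specificity (Youden's J)."""
          best_t, best_j = 0.5, -np.inf
          for t in np.linspace(0.0, 1.0, n_grid):
              pred = predicted >= t
              sens = np.sum(pred & (observed == 1)) / max(np.sum(observed == 1), 1)
              spec = np.sum(~pred & (observed == 0)) / max(np.sum(observed == 0), 1)
              if sens + spec - 1.0 > best_j:
                  best_t, best_j = t, sens + spec - 1.0
          return best_t

      # Illustrative data: presences drawn from a noisy probability surface
      rng = np.random.default_rng(2)
      p_surface = rng.uniform(size=1000)
      observed = (rng.uniform(size=1000) < p_surface).astype(int)
      print("optimal threshold:", optimal_threshold(observed, p_surface))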

  17. Learning Probabilistic Logic Models from Probabilistic Examples

    PubMed Central

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2009-01-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) - to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348

  18. Learning Probabilistic Logic Models from Probabilistic Examples.

    PubMed

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) - to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  19. Comparing Examples: WebAssign versus Textbook

    NASA Astrophysics Data System (ADS)

    Richards, Evan; Polak, Jeff; Hardin, Ashley; Risley, John

    2005-11-01

    Research shows students can learn from worked examples.^1 This pilot study compared two groups of students' performance (10 each) in solving physics problems. One group had access to interactive examples^2 released in WebAssign^3, while the other group had access to the counterpart textbook examples. Verbal data from students in problem solving sessions was collected using a think aloud protocol^4 and the data was analyzed using Chi's procedures.^5 An explanation of the methodology and results will be presented. Future phases of this pilot study based upon these results will also be discussed. ^1Atkinson, R.K., Derry, S.J., Renkl A., Wortham, D. (2000). ``Learning from Examples: Instructional Principles from the Worked Examples Research'', Review of Educational Research, vol. 70, n. 2, pp. 181-214. ^2Serway, R.A. & Faughn, J.S. (2006). College Physics (7^th ed.). Belmont, CA: Thomson Brooks/Cole. ^3 see www.webassign.net ^4 Ericsson, K.A. & Simon, H.A. (1984). Protocol Analysis: Verbal Reports as Data. Cambridge, Massachusetts: The MIT Press. ^5 Chi, Michelene T.H. (1997). ``Quantifying Qualitative Analyses of Verbal Data: A Practical Guide,'' The Journal of the Learning Sciences, vol. 6, n. 3, pp. 271-315.

  20. Measuring Nitrification: A Laboratory Approach to Nutrient Cycling.

    ERIC Educational Resources Information Center

    Hicks, David J.

    1990-01-01

    Presented is an approach to the study of nutrient cycling in the school laboratory. Discussed are obtaining, processing, and incubating samples; extraction of ions from soil; procedures for nitrate and ammonium analysis; data analysis; an example of results; and other aspects of the nitrogen cycle. (CW)

  1. How individual participant data meta-analyses have influenced trial design, conduct, and analysis

    PubMed Central

    Tierney, Jayne F.; Pignon, Jean-Pierre; Gueffyier, Francois; Clarke, Mike; Askie, Lisa; Vale, Claire L.; Burdett, Sarah; Alderson, P.; Askie, L.; Bennett, D.; Burdett, S.; Clarke, M.; Dias, S.; Emberson, J.; Gueyffier, F.; Iorio, A.; Macleod, M.; Mol, B.W.; Moons, C.; Parmar, M.; Perera, R.; Phillips, R.; Pignon, J.P.; Rees, J.; Reitsma, H.; Riley, R.; Rovers, M.; Rydzewska, L.; Schmid, C.; Shepperd, S.; Stenning, S.; Stewart, L.; Tierney, J.; Tudur Smith, C.; Vale, C.; Welge, J.; White, I.; Whiteley, W.

    2015-01-01

    Objectives To demonstrate how individual participant data (IPD) meta-analyses have impacted directly on the design and conduct of trials and highlight other advantages IPD might offer. Study Design and Setting Potential examples of the impact of IPD meta-analyses on trials were identified at an international workshop, attended by individuals with experience in the conduct of IPD meta-analyses and knowledge of trials in their respective clinical areas. Experts in the field who did not attend were asked to provide any further examples. We then examined relevant trial protocols, publications, and Web sites to verify the impacts of the IPD meta-analyses. A subgroup of workshop attendees sought further examples and identified other aspects of trial design and conduct that may inform IPD meta-analyses. Results We identified 52 examples of IPD meta-analyses thought to have had a direct impact on the design or conduct of trials. After screening relevant trial protocols and publications, we identified 28 instances where IPD meta-analyses had clearly impacted on trials. They have influenced the selection of comparators and participants, sample size calculations, analysis and interpretation of subsequent trials, and the conduct and analysis of ongoing trials, sometimes in ways that would not be possible with systematic reviews of aggregate data. We identified additional potential ways that IPD meta-analyses could be used to influence trials. Conclusions IPD meta-analysis could be better used to inform the design, conduct, analysis, and interpretation of trials. PMID:26186982

  2. Genome Data Exploration Using Correspondence Analysis

    PubMed Central

    Tekaia, Fredj

    2016-01-01

    Recent developments of sequencing technologies that allow the production of massive amounts of genomic and genotyping data have highlighted the need for synthetic data representation and pattern recognition methods that can mine and help discovering biologically meaningful knowledge included in such large data sets. Correspondence analysis (CA) is an exploratory descriptive method designed to analyze two-way data tables, including some measure of association between rows and columns. It constructs linear combinations of variables, known as factors. CA has been used for decades to study high-dimensional data, and remarkable inferences from large data tables were obtained by reducing the dimensionality to a few orthogonal factors that correspond to the largest amount of variability in the data. Herein, I review CA and highlight its use by considering examples in handling high-dimensional data that can be constructed from genomic and genetic studies. Examples in amino acid compositions of large sets of species (viruses, phages, yeast, and fungi) as well as an example related to pairwise shared orthologs in a set of yeast and fungal species, as obtained from their proteome comparisons, are considered. For the first time, results show striking segregations between yeasts and fungi as well as between viruses and phages. Distributions obtained from shared orthologs show clusters of yeast and fungal species corresponding to their phylogenetic relationships. A direct comparison with the principal component analysis method is discussed using a recently published example of genotyping data related to newly discovered traces of an ancient hominid that was compared to modern human populations in the search for ancestral similarities. CA offers more detailed results highlighting links between modern humans and the ancient hominid and their characterizations. Compared to the popular principal component analysis method, CA allows easier and more effective interpretation of results, particularly by the ability of relating individual patterns with their corresponding characteristic variables. PMID:27279736

  3. Rapid characterisation of surface modifications and treatments using a benchtop SIMS instrument

    NASA Astrophysics Data System (ADS)

    McPhail, D. S.; Sokhan, M.; Rees, E. E.; Cliff, B.; Eccles, A. J.; Chater, R. J.

    2004-06-01

    The development of a novel benchtop SIMS instrument (Millbrook MiniSIMS) [Appl. Surf. Sci. 144 (1999) 106] has brought routine SIMS analysis to many new users, for example museum conservators. This is a result of the simple operation and the relatively low capital cost of the instrument. We report here on the continued development of the system in terms of increasing performance and functionality and its use in museum conservation based applications where a mobile instrument for high throughput, rapid SIMS analysis has proven to be of great benefit to the user. The example we describe here is the application of the MiniSIMS to the analysis of silver thread woven into a silk dress before and after laser cleaning.

  4. Some historical relationships between science and technology with implications for behavior analysis

    PubMed Central

    Moxley, Roy A.

    1989-01-01

    The relationship between science and technology is examined in terms of some implications for behavior analysis. Problems result when this relationship is seen as one in which science generally begets technology in a one-way, or hierarchical, relationship. These problems are not found when the relationship between science and technology is seen as two-way, or symmetrical, within a larger context of relationships. Some historical examples are presented. Collectively, these and other examples in the references weaken the case for a prevailing one-way, hierarchical relationship and strengthen the case for a two-way, symmetrical relationship. In addition to being more accurate historically, the symmetrical relationship is also more consistent with the principles of behavior analysis. PMID:22478016

  5. Human errors and measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Kuselman, Ilya; Pennecchi, Francesca

    2015-04-01

    Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.

  6. Aircrew Discourse: Exploring Strategies of Information and Action Management

    NASA Technical Reports Server (NTRS)

    Irwin, Cheryl M.; Veinott, Elizabeth S.; Shafto, Michael G. (Technical Monitor)

    1995-01-01

    This paper explores methodology issues encountered in the analysis of flightcrew communications in aviation simulation research. Examples are provided by two recent studies which are compared on three issues: level of analysis, data definition, and interpretation of the results. The data discussed were collected in a study comparing two levels of aircraft automation. The first example is an investigation of how pilots' information transfer strategies differed as a function of automation during low and high-workload flight phases. The second study focuses on how crews managed actions in the two aircraft during a ten minute, high-workload flight segment. Results indicated that crews in the two aircraft differed in their strategies of information and action management. The differences are discussed in terms of their operational and research significance.

  7. Adding results to a meta-analysis: Theory and example

    NASA Astrophysics Data System (ADS)

    Willson, Victor L.

    Meta-analysis has been used as a research method to describe bodies of research data. It promotes hypothesis formation and the development of science education laws. A function overlooked, however, is the role it plays in updating research. Methods to integrate new research with meta-analysis results need explication. A procedure is presented using Bayesian analysis. Research on the correlation of science education attitudes with achievement has been published since a recent meta-analysis of the topic. The results show how new findings complement the previous meta-analysis and extend its conclusions. Additional methodological questions addressed are how studies are to be weighted, which variables are to be examined, and how often meta-analyses are to be updated.
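
    The Bayesian updating idea can be made concrete for a pooled correlation: transform to Fisher's z, where each estimate has variance 1/(n - 3), and combine the prior meta-analytic estimate with a new study by precision weighting. The numbers below are invented for illustration and are not Willson's data.

      import numpy as np

      def update_meta(r_prior, n_prior, r_new, n_new):
          """Precision-weighted (normal-normal) update on Fisher's z scale."""
          z_prior, z_new = np.arctanh(r_prior), np.arctanh(r_new)
          w_prior, w_new = n_prior - 3, n_new - 3    # 1/var of Fisher's z
          z_post = (w_prior * z_prior + w_new * z_new) / (w_prior + w_new)
          se_post = 1.0 / np.sqrt(w_prior + w_new)
          return np.tanh(z_post), se_post

      # Illustrative: prior pooled r = 0.30 over 2000 subjects; a new study
      # reports r = 0.45 with n = 150
      r_post, se = update_meta(0.30, 2000, 0.45, 150)
      print(f"updated correlation = {r_post:.3f} (SE on z scale = {se:.4f})")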

  8. Global Sensitivity and Data-Worth Analyses in iTOUGH2: User's Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wainwright, Haruko Murakami; Finsterle, Stefan

    2016-07-15

    This manual explains the use of local sensitivity analysis, the global Morris OAT and Sobol’ methods, and a related data-worth analysis as implemented in iTOUGH2. In addition to input specification and output formats, it includes some examples to show how to interpret results.
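
    As a rough illustration of what the Morris one-at-a-time (OAT) method computes, the sketch below builds random OAT trajectories and reports the mean absolute elementary effect mu* per parameter. It is a hand-rolled toy with an invented test function, not the iTOUGH2 implementation.

      import numpy as np

      def morris_mu_star(f, bounds, n_traj=20, delta=0.25, seed=3):
          """Minimal Morris screening: mu* = mean |elementary effect|."""
          rng = np.random.default_rng(seed)
          lo, hi = np.asarray(bounds, dtype=float).T
          k = len(bounds)
          effects = np.zeros((n_traj, k))
          for t in range(n_traj):
              x = rng.uniform(0.0, 1.0 - delta, size=k)   # base point, unit cube
              for i in rng.permutation(k):                # one factor at a time
                  x_step = x.copy()
                  x_step[i] += delta
                  y0 = f(lo + (hi - lo) * x)
                  y1 = f(lo + (hi - lo) * x_step)
                  effects[t, i] = (y1 - y0) / delta
                  x = x_step                              # walk the trajectory
          return np.abs(effects).mean(axis=0)

      # Invented model: strong effect of x0, moderate x1, weak x2
      model = lambda x: 10.0 * x[0] + x[1] ** 2 + 0.1 * x[2]
      print("mu* =", np.round(morris_mu_star(model, [(0, 1)] * 3), 3))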

  9. Monitoring of energy efficiency of technological modes of gas transport using modern gas-turbine equipment

    NASA Astrophysics Data System (ADS)

    Golik, V. V.; Zemenkova, M. Yu; Shipovalov, A. N.; Akulov, K. A.

    2018-05-01

    The paper presents calculations and an example of the energy-efficiency justification of operating regimes for the equipment used. The engineering design of the gas pipeline is considered with respect to monitoring the energy efficiency of a gas compressor unit (GCU). The results of evaluating the characteristics of the GCU and its components are described, and the evaluation results for the energy-efficiency indicators of the gas pipeline are presented. Based on the analysis, the gas compressor unit GCU-32 "Ladoga" is proposed because of its efficiency and cost effectiveness in comparison with analogues.

  10. Analyzing multiple data sets by interconnecting RSAT programs via SOAP Web services: an example with ChIP-chip data.

    PubMed

    Sand, Olivier; Thomas-Chollier, Morgane; Vervisch, Eric; van Helden, Jacques

    2008-01-01

    This protocol shows how to access the Regulatory Sequence Analysis Tools (RSAT) via a programmatic interface in order to automate the analysis of multiple data sets. We describe the steps for writing a Perl client that connects to the RSAT Web services and implements a workflow to discover putative cis-acting elements in promoters of gene clusters. In the presented example, we apply this workflow to lists of transcription factor target genes resulting from ChIP-chip experiments. For each factor, the protocol predicts the binding motifs by detecting significantly overrepresented hexanucleotides in the target promoters and generates a feature map that displays the positions of putative binding sites along the promoter sequences. This protocol is addressed to bioinformaticians and biologists with programming skills (notions of Perl). Running time is approximately 6 min on the example data set.
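
    The core analytic step of this workflow, detecting significantly overrepresented hexanucleotides in target promoters relative to a background, can be sketched independently of the SOAP interface. The Python toy below counts 6-mers and flags large z-scores; it is a stand-in for the idea only and does not reproduce RSAT's oligo-analysis statistics or its Web services API.

      from collections import Counter
      from math import sqrt

      def overrepresented_kmers(promoters, background, k=6, z_cut=2.0):
          """Flag k-mers far more frequent in targets than in background."""
          def counts(seqs):
              c = Counter()
              for s in seqs:
                  s = s.upper()
                  for i in range(len(s) - k + 1):
                      c[s[i:i + k]] += 1
              return c

          tgt, bg = counts(promoters), counts(background)
          n_tgt, n_bg = sum(tgt.values()), sum(bg.values())
          hits = []
          for kmer, obs in tgt.items():
              p = bg.get(kmer, 1) / n_bg             # pseudo-count for unseen k-mers
              exp = p * n_tgt
              z = (obs - exp) / sqrt(exp * (1 - p))  # binomial z approximation
              if z > z_cut:
                  hits.append((kmer, obs, round(exp, 2), round(z, 1)))
          return sorted(hits, key=lambda h: -h[3])

      # Toy usage: a motif planted in the "target" promoters
      targets = ["ACGTGA" * 5 + "TTTTTT", "GGACGTGATT" * 3]
      background = ["ATCGATCGGCTA" * 40]
      print(overrepresented_kmers(targets, background))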

  11. Decoding spike timing: the differential reverse correlation method

    PubMed Central

    Tkačik, Gašper; Magnasco, Marcelo O.

    2009-01-01

    It is widely acknowledged that detailed timing of action potentials is used to encode information, for example in auditory pathways; however, the computational tools required to analyze encoding through timing are still in their infancy. We present a simple example of encoding, based on a recent model of time-frequency analysis, in which units fire action potentials when a certain condition is met, but the timing of the action potential depends also on other features of the stimulus. We show that, as a result, spike-triggered averages are smoothed so much that they do not represent the true features of the encoding. Inspired by this example, we present a simple method, differential reverse correlation, that can separate an analysis of what causes a neuron to spike from what controls its timing. We analyze the leaky integrate-and-fire neuron with this method and show that it accurately reconstructs the model's kernel. PMID:18597928
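
    The spike-triggered average at the center of this argument is simple to compute: average the stimulus segments that precede each spike. The minimal simulation below uses an invented exponential feature and a toy threshold unit rather than the paper's model.

      import numpy as np

      rng = np.random.default_rng(4)

      # White-noise stimulus driving a toy threshold unit
      T, win = 100_000, 50
      stim = rng.normal(size=T)
      kernel = np.exp(-np.arange(win) / 10.0)       # "true" stimulus feature
      drive = np.convolve(stim, kernel)[:T]
      spikes = np.where(drive > 3.0)[0]
      spikes = spikes[spikes >= win]

      # Spike-triggered average: mean stimulus window preceding each spike
      sta = np.mean([stim[t - win:t] for t in spikes], axis=0)

      # Compare the recovered STA with the time-reversed true kernel
      corr = np.corrcoef(sta, kernel[::-1])[0, 1]
      print(f"{len(spikes)} spikes, STA vs true kernel correlation: {corr:.2f}")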

  12. Multivariate meta-analysis for non-linear and other multi-parameter associations

    PubMed Central

    Gasparrini, A; Armstrong, B; Kenward, M G

    2012-01-01

    In this paper, we formalize the application of multivariate meta-analysis and meta-regression to synthesize estimates of multi-parameter associations obtained from different studies. This modelling approach extends the standard two-stage analysis used to combine results across different sub-groups or populations. The most straightforward application is for the meta-analysis of non-linear relationships, described for example by regression coefficients of splines or other functions, but the methodology easily generalizes to any setting where complex associations are described by multiple correlated parameters. The modelling framework of multivariate meta-analysis is implemented in the package mvmeta within the statistical environment R. As an illustrative example, we propose a two-stage analysis for investigating the non-linear exposure–response relationship between temperature and non-accidental mortality using time-series data from multiple cities. Multivariate meta-analysis represents a useful analytical tool for studying complex associations through a two-stage procedure. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22807043

  13. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta-analysis and group level studies.

    PubMed

    Bakbergenuly, Ilyas; Kulinskaya, Elena; Morgenthaler, Stephan

    2016-07-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. © 2016 The Authors. Biometrical Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.
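
    The flavor of the reported transformation bias is easy to reproduce by simulation: draw cluster-level probabilities from a beta distribution with intracluster correlation rho, draw binomial counts, and compare transformed pooled estimates against the transformed true probability. The parameter values below are invented and do not follow the paper's simulation design.

      import numpy as np

      rng = np.random.default_rng(5)

      def transform_bias(p=0.2, rho=0.1, m=50, n_groups=40, reps=2000):
          """Mean bias of log-odds and arcsine transforms of pooled p-hat
          under beta-binomial (intracluster-correlated) sampling."""
          a, b = p * (1 - rho) / rho, (1 - p) * (1 - rho) / rho
          lb = ab = 0.0
          for _ in range(reps):
              pi = rng.beta(a, b, size=n_groups)     # cluster probabilities
              phat = rng.binomial(m, pi).sum() / (m * n_groups)
              lb += np.log(phat / (1 - phat)) - np.log(p / (1 - p))
              ab += np.arcsin(np.sqrt(phat)) - np.arcsin(np.sqrt(p))
          return lb / reps, ab / reps

      for rho in (0.05, 0.10, 0.20):                 # bias grows with rho
          lb, ab = transform_bias(rho=rho)
          print(f"rho={rho:.2f}: log-odds bias={lb:+.4f}, arcsine bias={ab:+.4f}")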

  14. Optimal guidance law development for an advanced launch system

    NASA Technical Reports Server (NTRS)

    Calise, Anthony J.; Hodges, Dewey H.

    1990-01-01

    A regular perturbation analysis is presented. Closed-loop simulations were performed with a first order correction including all of the atmospheric terms. In addition, a method was developed for independently checking the accuracy of the analysis and the rather extensive programming required to implement the complete first order correction with all of the aerodynamic effects included. This amounted to developing an equivalent Hamiltonian computed from the first order analysis. A second order correction was also completed for the neglected spherical Earth and back-pressure effects. Finally, an analysis was begun on a method for dealing with control inequality constraints. The results on including higher order corrections do show some improvement for this application; however, it is not known at this stage if significant improvement will result when the aerodynamic forces are included. The weak formulation for solving optimal problems was extended in order to account for state inequality constraints. The formulation was tested on three example problems and numerical results were compared to the exact solutions. Development of a general purpose computational environment for the solution of a large class of optimal control problems is under way. An example, along with the necessary input and the output, is given.

  15. An active learning approach for rapid characterization of endothelial cells in human tumors.

    PubMed

    Padmanabhan, Raghav K; Somasundar, Vinay H; Griffith, Sandra D; Zhu, Jianliang; Samoyedny, Drew; Tan, Kay See; Hu, Jiahao; Liao, Xuejun; Carin, Lawrence; Yoon, Sam S; Flaherty, Keith T; Dipaola, Robert S; Heitjan, Daniel F; Lal, Priti; Feldman, Michael D; Roysam, Badrinath; Lee, William M F

    2014-01-01

    Currently, no available pathological or molecular measures of tumor angiogenesis predict response to antiangiogenic therapies used in clinical practice. Recognizing that tumor endothelial cells (EC) and EC activation and survival signaling are the direct targets of these therapies, we sought to develop an automated platform for quantifying activity of critical signaling pathways and other biological events in EC of patient tumors by histopathology. Computer image analysis of EC in highly heterogeneous human tumors by a statistical classifier trained using examples selected by human experts performed poorly due to subjectivity and selection bias. We hypothesized that the analysis can be optimized by a more active process to aid experts in identifying informative training examples. To test this hypothesis, we incorporated a novel active learning (AL) algorithm into FARSIGHT image analysis software that aids the expert by seeking out informative examples for the operator to label. The resulting FARSIGHT-AL system identified EC with specificity and sensitivity consistently greater than 0.9 and outperformed traditional supervised classification algorithms. The system modeled individual operator preferences and generated reproducible results. Using the results of EC classification, we also quantified proliferation (Ki67) and activity in important signal transduction pathways (MAP kinase, STAT3) in immunostained human clear cell renal cell carcinoma and other tumors. FARSIGHT-AL enables characterization of EC in conventionally preserved human tumors in a more automated process suitable for testing and validating in clinical trials. The results of our study support a unique opportunity for quantifying angiogenesis in a manner that can now be tested for its ability to identify novel predictive and response biomarkers.
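
    The generic loop behind active learning systems of this kind can be sketched with uncertainty sampling: repeatedly fit a classifier, have the expert label the pool example the model is least certain about, and refit. The scikit-learn sketch below runs on synthetic data and illustrates the strategy only; it is not the FARSIGHT-AL algorithm.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression

      X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

      # Seed with five labeled examples per class; the rest form the pool
      labeled = list(np.where(y == 0)[0][:5]) + list(np.where(y == 1)[0][:5])
      pool = [i for i in range(len(X)) if i not in set(labeled)]

      clf = LogisticRegression(max_iter=1000)
      for _ in range(30):                            # 30 queries to the "expert"
          clf.fit(X[labeled], y[labeled])
          proba = clf.predict_proba(X[pool])[:, 1]
          query = pool[int(np.argmin(np.abs(proba - 0.5)))]  # most uncertain
          labeled.append(query)                      # oracle y supplies the label
          pool.remove(query)

      clf.fit(X[labeled], y[labeled])
      print("accuracy after 40 labels:", round(clf.score(X, y), 3))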

  16. Forward and backward uncertainty propagation: an oxidation ditch modelling example.

    PubMed

    Abusam, A; Keesman, K J; van Straten, G

    2003-01-01

    In the field of water technology, forward uncertainty propagation is frequently used, whereas backward uncertainty propagation is rarely used. In forward uncertainty analysis, one moves from a given (or assumed) parameter subspace towards the corresponding distribution of the output or objective function. However, in backward uncertainty propagation, one moves in the reverse direction, from the distribution function towards the parameter subspace. Backward uncertainty propagation, which is a generalisation of parameter estimation error analysis, gives information essential for designing experimental or monitoring programmes, and for tighter bounding of parameter uncertainty intervals. The procedure of carrying out backward uncertainty propagation is illustrated in this technical note by a working example for an oxidation ditch wastewater treatment plant. The results obtained demonstrate that essential information can be obtained by carrying out backward uncertainty propagation analysis.
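
    Forward propagation, the familiar half of the note, amounts to sampling the parameter subspace and pushing the samples through the model. The sketch below does this for an invented first-order mixed-tank model standing in for the full oxidation ditch model; the distributions and constants are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(7)

      def outlet_concentration(k, V, Q, c_in=100.0):
          """Toy steady-state model: first-order reaction in a mixed tank."""
          return c_in / (1.0 + k * V / Q)

      # Sample the parameter distributions and propagate through the model
      N = 10_000
      k = rng.normal(0.50, 0.05, size=N)       # rate constant [1/h]
      V = rng.normal(2000.0, 100.0, size=N)    # tank volume [m^3]
      Q = rng.normal(400.0, 40.0, size=N)      # flow rate [m^3/h]
      c_out = outlet_concentration(k, V, Q)

      print(f"c_out: mean={c_out.mean():.1f}, sd={c_out.std():.1f}, "
            f"95% interval=[{np.quantile(c_out, 0.025):.1f}, "
            f"{np.quantile(c_out, 0.975):.1f}]")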

  17. Manipulating measurement scales in medical statistical analysis and data mining: A review of methodologies

    PubMed Central

    Marateb, Hamid Reza; Mansourian, Marjan; Adibi, Peyman; Farina, Dario

    2014-01-01

    Background: Selecting the correct statistical test and data mining method depends highly on the measurement scale of the data, the type of variables, and the purpose of the analysis. Different measurement scales are studied in detail, and statistical comparison, modeling, and data mining methods are examined using several medical examples. We present two clustering examples with ordinal variables, a more challenging variable type in analysis, using the Wisconsin Breast Cancer Data (WBCD). Ordinal-to-interval scale conversion example: a breast cancer database of nine 10-level ordinal variables for 683 patients was analyzed by two ordinal-scale clustering methods. The performance of the clustering methods was assessed by comparison with the gold standard groups of malignant and benign cases that had been identified by clinical tests. Results: The sensitivity and accuracy of the two clustering methods were 98% and 96%, respectively. Their specificity was comparable. Conclusion: By using an appropriate clustering algorithm based on the measurement scale of the variables in the study, high performance is achieved. Moreover, descriptive and inferential statistics, in addition to the modeling approach, must be selected based on the scale of the variables. PMID:24672565

  18. Applying Longitudinal Mean and Covariance Structures (LMACS) Analysis to Assess Construct Stability Over Two Time Points: An Example Using Psychological Entitlement

    ERIC Educational Resources Information Center

    Bashkov, Bozhidar M.; Finney, Sara J.

    2013-01-01

    Traditional methods of assessing construct stability are reviewed and longitudinal mean and covariance structures (LMACS) analysis, a modern approach, is didactically illustrated using psychological entitlement data. Measurement invariance and latent variable stability results are interpreted, emphasizing substantive implications for educators and…

  19. An Economic Analysis of United States Assistance to Selected Less Developed Countries.

    DTIC Science & Technology

    1983-07-01

    and peoples, upon request, assistance of such nature and in such amounts as the United States deems advisable and as may be effectively used by free...country has the capability to absorb and utilize the arms effectively. (11) What other military interests--for example, overflight rights or access to...expenditures has created the opposite effect. Internal stability has been sacrificed as a result of a defense build up. As an example of this phenomenon

  20. CRREL (Cold Regions Research and Engineering Laboratory) Technical Publications. Supplement, October 1986-September 1988

    DTIC Science & Technology

    1988-09-01

    1000. Extensive post-test optical analysis allowed Antenna polarization and height, and signal stacking estimation of the size distribution and number of...to 10 C higher under natural activated sludge. A design example is presented for conditions than in the wind tunnel studies. Results each case. All...typically limitations of the method are presented, examples are columnar type crystal structure. The remaining 21% shown, and notes on user instructions are

  1. The study of two-dimensional oscillations using a smartphone acceleration sensor: example of Lissajous curves

    NASA Astrophysics Data System (ADS)

    Tuset-Sanchis, Luis; Castro-Palacio, Juan C.; Gómez-Tejedor, José A.; Manjón, Francisco J.; Monsoriu, Juan A.

    2015-08-01

    A smartphone acceleration sensor is used to study two-dimensional harmonic oscillations. The data recorded by the free Android application, Accelerometer Toy, is used to determine the periods of oscillation by graphical analysis. Different patterns of the Lissajous curves resulting from the superposition of harmonic motions are illustrated for three experiments. This work introduces an example of how two-dimensional oscillations can be easily studied with a smartphone acceleration sensor.

  2. Discrimination of Seismic Sources Using Israel Seismic Network.

    DTIC Science & Technology

    1996-07-01

    Report fragments (list of figures): seismograms of earthquakes, quarry blasts, and underwater explosions recorded at 13 ISN stations in the Dead Sea basin and Negev region; example recordings from the Negev quarry blast ES6 and the Dead Sea earthquake QS3; discrimination results for the southern dataset (semblance versus energy ratio); and velogram analysis for Dead Sea/Negev region discrimination.

  3. Modal Analysis of Space-rocket Equipment Components

    NASA Astrophysics Data System (ADS)

    Igolkin, A. A.; Safin, A. I.; Prokofiev, A. B.

    2018-01-01

    In order to prevent vibration damage, an analysis of natural frequencies and mode shapes of elements of rocket and space technology should be performed. This paper discusses the technique of modal analysis using the example of a carrier platform. Modal analysis was performed by means of mathematical modeling and a laser vibrometer, and the experimental data were refined using the Test.Lab software. As a result of the modal analysis, the amplitude-frequency response of the carrier platform was obtained and the elasticity parameters were refined.

  4. Making the Hubble Space Telescope servicing mission safe

    NASA Technical Reports Server (NTRS)

    Bahr, N. J.; Depalo, S. V.

    1992-01-01

    The implementation of the HST system safety program is detailed. Numerous safety analyses are conducted through various phases of design, test, and fabrication, and results are presented to NASA management for discussion during dedicated safety reviews. Attention is given to the system safety assessment and risk analysis methodologies used, i.e., hazard analysis, fault tree analysis, and failure modes and effects analysis, and to how they are coupled with engineering and test analysis for a 'synergistic picture' of the system. Some preliminary safety analysis results, showing the relationship between hazard identification, control or abatement, and finally control verification, are presented as examples of this safety process.

  5. Time Analysis of Building Dynamic Response Under Seismic Action. Part 2: Example of Calculation

    NASA Astrophysics Data System (ADS)

    Ufimtcev, E. M.

    2017-11-01

    The second part of the article illustrates the use of the time analysis method (TAM) through the example of the calculation of a 3-storey building, the design dynamic model (DDM) of which is adopted in the form of a flat vertical cantilever rod with 3 horizontal degrees of freedom associated with floor and coverage levels. The parameters of natural oscillations (frequencies and modes) are determined, together with the results of the calculation of the elastic forced oscillations of the building's DDM: oscillograms of the reaction parameters on the time interval t ∈ [0; 131.25] s. The obtained results are analyzed on the basis of the computed discrepancy of the DDM motion equation and a comparison of the results calculated with the numerical approach (FEM) against the normative method set out in SP 14.13330.2014 "Construction in Seismic Regions". The analysis testifies to the correctness of the computational model as well as the high accuracy of the results obtained. In conclusion, it is noted that the use of the TAM when designing buildings and structures subject to seismic influences will improve their strength.

  6. Parameterization of aquatic ecosystem functioning and its natural variation: Hierarchical Bayesian modelling of plankton food web dynamics

    NASA Astrophysics Data System (ADS)

    Norros, Veera; Laine, Marko; Lignell, Risto; Thingstad, Frede

    2017-10-01

    Methods for extracting empirically and theoretically sound parameter values are urgently needed in aquatic ecosystem modelling to describe key flows and their variation in the system. Here, we compare three Bayesian formulations for mechanistic model parameterization that differ in their assumptions about the variation in parameter values between various datasets: 1) global analysis - no variation, 2) separate analysis - independent variation and 3) hierarchical analysis - variation arising from a shared distribution defined by hyperparameters. We tested these methods, using computer-generated and empirical data, coupled with simplified and reasonably realistic plankton food web models, respectively. While all methods were adequate, the simulated example demonstrated that a well-designed hierarchical analysis can result in the most accurate and precise parameter estimates and predictions, due to its ability to combine information across datasets. However, our results also highlighted sensitivity to hyperparameter prior distributions as an important caveat of hierarchical analysis. In the more complex empirical example, hierarchical analysis was able to combine precise identification of parameter values with reasonably good predictive performance, although the ranking of the methods was less straightforward. We conclude that hierarchical Bayesian analysis is a promising tool for identifying key ecosystem-functioning parameters and their variation from empirical datasets.

  7. The Pollution Detectives, Part III: Roadside Lead Pollution.

    ERIC Educational Resources Information Center

    Sanderson, Phil

    1989-01-01

    Described is a simple test tube method developed for lead analysis of samples of roadside soil. The relationship between the results and the traffic flow indicates that car exhausts are the major source of lead pollution. Materials and procedures are detailed. An example of results is provided. (Author/CW)

  8. A posteriori error analysis of two stage computation methods with application to efficient discretization and the Parareal Algorithm.

    PubMed

    Chaudhry, Jehanzeb Hameed; Estep, Don; Tavener, Simon; Carey, Varis; Sandelin, Jeff

    2016-01-01

    We consider numerical methods for initial value problems that employ a two stage approach consisting of solution on a relatively coarse discretization followed by solution on a relatively fine discretization. Examples include adaptive error control, parallel-in-time solution schemes, and efficient solution of adjoint problems for computing a posteriori error estimates. We describe a general formulation of two stage computations then perform a general a posteriori error analysis based on computable residuals and solution of an adjoint problem. The analysis accommodates various variations in the two stage computation and in formulation of the adjoint problems. We apply the analysis to compute "dual-weighted" a posteriori error estimates, to develop novel algorithms for efficient solution that take into account cancellation of error, and to the Parareal Algorithm. We test the various results using several numerical examples.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadgu, Teklu; Matteo, Edward N.

    An example case is presented for testing analytical thermal models. The example case represents thermal analysis of a generic repository in bedded salt at 500 m depth. The analysis is part of the study reported in Matteo et al. (2016). An ambient average ground surface temperature of 15°C and a natural geothermal gradient of 25°C/km were assumed to calculate temperature at the near field. For the generic salt repository concept, crushed salt backfill is assumed. For the semi-analytical analysis, a crushed salt thermal conductivity of 0.57 W/m-K was used. With time, the crushed salt is expected to consolidate into intact salt. In this study a backfill thermal conductivity of 3.2 W/m-K (same as intact) is used for sensitivity analysis. Decay heat data for SRS glass is given in Table 1. The rest of the parameter values are shown below. Results of peak temperatures at the waste package surface are given in Table 2.

  10. Bayesian bivariate meta-analysis of correlated effects: Impact of the prior distributions on the between-study correlation, borrowing of strength, and joint inferences

    PubMed Central

    Bujkiewicz, Sylwia; Riley, Richard D

    2016-01-01

    Multivariate random-effects meta-analysis allows the joint synthesis of correlated results from multiple studies, for example, for multiple outcomes or multiple treatment groups. In a Bayesian univariate meta-analysis of one endpoint, the importance of specifying a sensible prior distribution for the between-study variance is well understood. However, in multivariate meta-analysis, there is little guidance about the choice of prior distributions for the variances or, crucially, the between-study correlation, ρB; for the latter, researchers often use a Uniform(−1,1) distribution assuming it is vague. In this paper, an extensive simulation study and a real illustrative example is used to examine the impact of various (realistically) vague prior distributions for ρB and the between-study variances within a Bayesian bivariate random-effects meta-analysis of two correlated treatment effects. A range of diverse scenarios are considered, including complete and missing data, to examine the impact of the prior distributions on posterior results (for treatment effect and between-study correlation), amount of borrowing of strength, and joint predictive distributions of treatment effectiveness in new studies. Two key recommendations are identified to improve the robustness of multivariate meta-analysis results. First, the routine use of a Uniform(−1,1) prior distribution for ρB should be avoided, if possible, as it is not necessarily vague. Instead, researchers should identify a sensible prior distribution, for example, by restricting values to be positive or negative as indicated by prior knowledge. Second, it remains critical to use sensible (e.g. empirically based) prior distributions for the between-study variances, as an inappropriate choice can adversely impact the posterior distribution for ρB, which may then adversely affect inferences such as joint predictive probabilities. These recommendations are especially important with a small number of studies and missing data. PMID:26988929

  11. A Mean variance analysis of arbitrage portfolios

    NASA Astrophysics Data System (ADS)

    Fang, Shuhong

    2007-03-01

    Based on a careful analysis of the definition of an arbitrage portfolio and its return, the author presents a mean-variance analysis of the return of arbitrage portfolios, which implies that Korkie and Turtle's results (B. Korkie, H.J. Turtle, A mean-variance analysis of self-financing portfolios, Manage. Sci. 48 (2002) 427-443) are misleading. A practical example is given to show the difference between the arbitrage portfolio frontier and the usual portfolio frontier.

  12. Energy-Water System Solutions | Energy Analysis | NREL

    Science.gov Websites

    Example projects include an energy, water, and renewable opportunities assessment at Bagram Air Force Base; identification of critical water and campus-level opportunities for planning integrated infrastructure; and a Net Zero Energy-Water-Waste analysis for Fort Carson.

  13. Mesh Deformation Based on Fully Stressed Design: The Method and Two-Dimensional Examples

    NASA Technical Reports Server (NTRS)

    Hsu, Su-Yuen; Chang, Chau-Lyan

    2007-01-01

    Mesh deformation in response to redefined boundary geometry is a frequently encountered task in shape optimization and analysis of fluid-structure interaction. We propose a simple and concise method for deforming meshes defined with three-node triangular or four-node tetrahedral elements. The mesh deformation method is suitable for large boundary movement. The approach requires two consecutive linear elastic finite-element analyses of an isotropic continuum using a prescribed displacement at the mesh boundaries. The first analysis is performed with homogeneous elastic property and the second with inhomogeneous elastic property. The fully stressed design is employed with a vanishing Poisson's ratio and a proposed form of equivalent strain (modified Tresca equivalent strain) to calculate, from the strain result of the first analysis, the element-specific Young's modulus for the second analysis. The theoretical aspect of the proposed method, its convenient numerical implementation using a typical linear elastic finite-element code in conjunction with very minor extra coding for data processing, and results for examples of large deformation of two-dimensional meshes are presented in this paper. KEY WORDS: Mesh deformation, shape optimization, fluid-structure interaction, fully stressed design, finite-element analysis, linear elasticity, strain failure, equivalent strain, Tresca failure criterion

  14. Vortex loops and Majoranas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chesi, Stefano; CEMS, RIKEN, Wako, Saitama 351-0198; Jaffe, Arthur

    2013-11-15

    We investigate the role that vortex loops play in characterizing eigenstates of interacting Majoranas. We give some general results and then focus on ladder Hamiltonian examples as a test of further ideas. Two methods yield exact results: (i) A mapping of certain spin Hamiltonians to quartic interactions of Majoranas shows that the spectra of these two examples coincide. (ii) In cases with reflection-symmetric Hamiltonians, we use reflection positivity for Majoranas to characterize vortices in the ground states. Two additional methods suggest wider applicability of these results: (iii) Numerical evidence suggests similar behavior for certain systems without reflection symmetry. (iv) A perturbative analysis also suggests similar behavior without the assumption of reflection symmetry.

  15. Nonexistence of global solutions of abstract wave equations with high energies.

    PubMed

    Esquivel-Avila, Jorge A

    2017-01-01

    We consider an undamped second order in time evolution equation. For any positive value of the initial energy, we give sufficient conditions to conclude nonexistence of global solutions. The analysis is based on a differential inequality. The success of our result rests on a detailed analysis which is different from the ones commonly used to prove blow-up. Several examples are given improving known results in the literature.

  16. Optical holographic structural analysis of Kevlar rocket motor cases

    NASA Astrophysics Data System (ADS)

    Harris, W. J.

    1981-05-01

    The methodology of applying optical holography to evaluation of subscale Kevlar 49 composite pressure vessels is explored. The results and advantages of the holographic technique are discussed. The cases utilized were of similar design, but each had specific design features, the effects of which are reviewed. Burst testing results are presented in conjunction with the holographic fringe patterns obtained during progressive pressurization. Examples of quantitative data extracted by analysis of fringe fields are included.

  17. Model prototype utilization in the analysis of fault tolerant control and data processing systems

    NASA Astrophysics Data System (ADS)

    Kovalev, I. V.; Tsarev, R. Yu; Gruzenkin, D. V.; Prokopenko, A. V.; Knyazkov, A. N.; Laptenok, V. D.

    2016-04-01

    The paper presents a procedure for assessing the profit of implementing a control and data processing system. The rationale for creating and analyzing a model prototype follows from implementing fault tolerance provision through the inclusion of structural and software redundancy. The developed procedure allows finding the best ratio between the cost of developing and analyzing the model prototype and the earnings from utilizing its results and the information produced. The suggested approach is illustrated by a model example of profit assessment and analysis for a control and data processing system.

  18. Relationships between Student Perception of Teacher-Student Relations and PISA Results in Mathematics and Science

    ERIC Educational Resources Information Center

    Mikk, Jaan; Krips, Heiki; Säälik, Ülle; Kalk, Karmen

    2016-01-01

    Teacher-student relations have a significant correlation with student motivation, academic performance and discipline. For example, the meta-analysis by Hattie (2009) revealed an effect size of d = 0.72 for the effect of relations on achievement, and the meta-analysis by Finn, Schrodt, Witt, Elledge, Jernberg & Larson ("Communication…

  19. 26 CFR 1.6662-6 - Transactions between persons described in section 482 and net section 482 transfer price...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... transaction price initially reflected in the taxpayer's books and records. The results of controlled..., including an analysis of the economic and legal factors that affect the pricing of its property or services... the economic analysis and projections relied upon in developing the method. For example, if a profit...

  20. Combinatorial Fusion Analysis for Meta Search Information Retrieval

    NASA Astrophysics Data System (ADS)

    Hsu, D. Frank; Taksa, Isak

    Leading commercial search engines are built as single event systems. In response to a particular search query, the search engine returns a single list of ranked search results. To find more relevant results the user must frequently try several other search engines. A meta search engine was developed to enhance the process of multi-engine querying. The meta search engine queries several engines at the same time and fuses individual engine results into a single search results list. The fusion of multiple search results has been shown (mostly experimentally) to be highly effective. However, the question of why and how the fusion should be done still remains largely unanswered. In this chapter, we utilize the combinatorial fusion analysis proposed by Hsu et al. to analyze combination and fusion of multiple sources of information. A rank/score function is used in the design and analysis of our framework. The framework provides a better understanding of the fusion phenomenon in information retrieval. For example, to improve the performance of the combined multiple scoring systems, it is necessary that each of the individual scoring systems has relatively high performance and the individual scoring systems are diverse. Additionally, we illustrate various applications of the framework using two examples from the information retrieval domain.
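
    The rank/score framework can be illustrated with the simplest fusion rule: convert each system's scores into a rank function and average the ranks. The sketch below uses invented scores and shows only the rank-combination case, not the full combinatorial fusion analysis with its diversity measures.

      import numpy as np

      def ranks(scores):
          """Rank function: 1 = best, by descending score."""
          order = np.argsort(-scores)
          r = np.empty_like(order)
          r[order] = np.arange(1, len(scores) + 1)
          return r

      def rank_fusion(score_a, score_b):
          """Average-rank combination of two scoring systems."""
          return (ranks(score_a) + ranks(score_b)) / 2.0

      # Two engines scoring the same five documents (illustrative numbers)
      engine_a = np.array([0.9, 0.4, 0.7, 0.2, 0.6])
      engine_b = np.array([0.7, 0.8, 0.6, 0.1, 0.5])
      fused = rank_fusion(engine_a, engine_b)
      print("fused ranking, best first:", np.argsort(fused))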

  1. Statistical analysis and interpolation of compositional data in materials science.

    PubMed

    Pesenson, Misha Z; Suram, Santosh K; Gregoire, John M

    2015-02-09

    Compositional data are ubiquitous in chemistry and materials science: analysis of elements in multicomponent systems, combinatorial problems, etc., lead to data that are non-negative and sum to a constant (for example, atomic concentrations). The constant sum constraint restricts the sampling space to a simplex instead of the usual Euclidean space. Since statistical measures such as mean and standard deviation are defined for the Euclidean space, traditional correlation studies, multivariate analysis, and hypothesis testing may lead to erroneous dependencies and incorrect inferences when applied to compositional data. Furthermore, composition measurements that are used for data analytics may not include all of the elements contained in the material; that is, the measurements may be subcompositions of a higher-dimensional parent composition. Physically meaningful statistical analysis must yield results that are invariant under the number of composition elements, requiring the application of specialized statistical tools. We present specifics and subtleties of compositional data processing through discussion of illustrative examples. We introduce basic concepts, terminology, and methods required for the analysis of compositional data and utilize them for the spatial interpolation of composition in a sputtered thin film. The results demonstrate the importance of this mathematical framework for compositional data analysis (CDA) in the fields of materials science and chemistry.
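
    The standard entry point to compositional data analysis is the centered log-ratio (CLR) transform, which maps compositions from the simplex to ordinary Euclidean space where means, distances, and interpolation are well defined, followed by a back-transform. The sketch below interpolates between two compositions this way; the zero-handling constant and the example values are illustrative assumptions.

      import numpy as np

      def clr(x, eps=1e-9):
          """Centered log-ratio: log of parts over their geometric mean."""
          x = np.asarray(x, dtype=float) + eps       # guard against zeros
          g = np.exp(np.mean(np.log(x), axis=1, keepdims=True))
          return np.log(x / g)

      def clr_inverse(z):
          """Back-transform CLR coordinates to compositions summing to 1."""
          e = np.exp(z)
          return e / e.sum(axis=1, keepdims=True)

      # Interpolate between two three-part compositions in CLR space;
      # the result is guaranteed to lie on the simplex
      a = np.array([[0.70, 0.20, 0.10]])
      b = np.array([[0.10, 0.30, 0.60]])
      midpoint = clr_inverse((clr(a) + clr(b)) / 2.0)
      print("compositional midpoint:", np.round(midpoint, 3))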

  2. Assessing Management Support for Worksite Health Promotion: Psychometric Analysis of the Leading by Example (LBE) Instrument

    PubMed Central

    Della, Lindsay J.; DeJoy, David M.; Goetzel, Ron Z.; Ozminkowski, Ronald J.; Wilson, Mark G.

    2009-01-01

    Objective This paper describes the development of the Leading by Example (LBE) instrument. Methods Exploratory factor analysis was used to obtain an initial factor structure. Factor validity was evaluated using confirmatory factor analysis methods. Cronbach’s alpha and item-total correlations provided information on the reliability of the factor subscales. Results Four subscales were identified: business alignment with health promotion objectives; awareness of the health-productivity link; worksite support for health promotion; leadership support for health promotion. Factor by group comparisons revealed that the initial factor structure is effective in detecting differences in organizational support for health promotion across different employee groups. Conclusions Management support for health promotion can be assessed using the LBE, a brief, self-report questionnaire. Researchers can use the LBE to diagnose, track, and evaluate worksite health promotion programs. PMID:18517097

  3. How-To-Do-It: Measuring Vegetation Biomass and Production.

    ERIC Educational Resources Information Center

    Collins, Don; Weaver, T.

    1988-01-01

    Describes a lab exercise used to demonstrate the measurement of biomass in a three layered forest. Discusses sampling, estimation methods, and the analysis of results. Presents an example of a summary sheet for this activity. (CW)

  4. Visual analysis as a method of interpretation of the results of satellite ionospheric measurements for exploratory problems

    NASA Astrophysics Data System (ADS)

    Korneva, N. N.; Mogilevskii, M. M.; Nazarov, V. N.

    2016-05-01

    Traditional methods of time series analysis of satellite ionospheric measurements have some limitations and disadvantages that are mainly associated with the complex nonstationary signal structure. In this paper, the possibility of identifying and studying the temporal characteristics of signals via visual analysis is considered. The proposed approach is illustrated by the example of the visual analysis of wave measurements on the DEMETER microsatellite during its passage over the HAARP facility.

  5. Sensitivity Equation Derivation for Transient Heat Transfer Problems

    NASA Technical Reports Server (NTRS)

    Hou, Gene; Chien, Ta-Cheng; Sheen, Jeenson

    2004-01-01

    The focus of the paper is on the derivation of sensitivity equations for transient heat transfer problems modeled by different discretization processes. Two examples will be used in this study to facilitate the discussion. The first example is a coupled, transient heat transfer problem that simulates the press molding process in fabrication of composite laminates. These state equations are discretized into standard h-version finite elements and solved by a multiple step, predictor-corrector scheme. The sensitivity analysis results based upon the direct and adjoint variable approaches will be presented. The second example is a nonlinear transient heat transfer problem solved by a p-version time-discontinuous Galerkin's Method. The resulting matrix equation of the state equation is simply in the form of Ax = b, representing a single step, time marching scheme. A direct differentiation approach will be used to compute the thermal sensitivities of a sample 2D problem.
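
    For the single-step form Ax = b mentioned above, direct differentiation gives A(dx/dp) = db/dp - (dA/dp)x. A minimal sketch with invented matrices and a hypothetical parameter p:

      # Direct differentiation for a discretized system A x = b.
      import numpy as np

      A  = np.array([[4.0, 1.0], [1.0, 3.0]])   # system matrix
      b  = np.array([1.0, 2.0])                 # right-hand side
      dA = np.array([[0.5, 0.0], [0.0, 0.0]])   # dA/dp
      db = np.array([0.0, 0.1])                 # db/dp

      x  = np.linalg.solve(A, b)                # state solution
      dx = np.linalg.solve(A, db - dA @ x)      # sensitivity dx/dp
      print(x, dx)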

  6. Teaching hands-on geophysics: examples from the Rū seismic network in New Zealand

    NASA Astrophysics Data System (ADS)

    van Wijk, Kasper; Simpson, Jonathan; Adam, Ludmila

    2017-03-01

    Education in physics and geosciences can be effectively illustrated by the analysis of earthquakes and the subsequent propagation of seismic waves in the Earth. Educational seismology has matured to a level where both the hard- and software are robust and user friendly. This has resulted in successful implementation of educational networks around the world. Seismic data recorded by students are of such quality that these can be used in classic earthquake location exercises, for example. But even ocean waves weakly coupled into the Earth’s crust can now be recorded on educational seismometers. These signals are not just noise, but form the basis of more recent developments in seismology, such as seismic interferometry, where seismic waves generated by ocean waves—instead of earthquakes—can be used to infer information about the Earth’s interior. Here, we introduce an earthquake location exercise and an analysis of ambient seismic noise, and present examples. Data are provided, and all needed software is freely available.

  7. Population Fisher information matrix and optimal design of discrete data responses in population pharmacodynamic experiments.

    PubMed

    Ogungbenro, Kayode; Aarons, Leon

    2011-08-01

    In recent years, interest in the application of experimental design theory to population pharmacokinetic (PK) and pharmacodynamic (PD) experiments has increased. The aim is to improve the efficiency and the precision with which parameters are estimated during data analysis, and sometimes to increase the power and reduce the sample size required for hypothesis testing. The population Fisher information matrix (PFIM) has been described for uniresponse and multiresponse population PK experiments for design evaluation and optimisation. Despite these developments and the availability of tools for optimal design of population PK and PD experiments, much of the effort has been focused on repeated continuous variable measurements, with less work being done on repeated discrete-type measurements. Discrete data arise mainly in PDs, e.g. ordinal, nominal, dichotomous or count measurements. This paper implements expressions for the PFIM for repeated ordinal, dichotomous and count measurements based on analysis by a mixed-effects modelling technique. Three simulation studies were used to investigate the performance of the expressions. Example 1 is based on repeated dichotomous measurements, Example 2 is based on repeated count measurements and Example 3 is based on repeated ordinal measurements. Data simulated in MATLAB were analysed using NONMEM (Laplace method) and the glmmML package in R (Laplace and adaptive Gauss-Hermite quadrature methods). The results obtained for Examples 1 and 2 showed good agreement between the relative standard errors obtained using the PFIM and simulations. The results obtained for Example 3 showed the importance of sampling at the most informative time points. Implementation of these expressions will provide the opportunity for efficient design of population PD experiments that involve discrete-type data through design evaluation and optimisation.

  8. FirebrowseR: an R client to the Broad Institute's Firehose Pipeline.

    PubMed

    Deng, Mario; Brägelmann, Johannes; Kryukov, Ivan; Saraiva-Agostinho, Nuno; Perner, Sven

    2017-01-01

    With its Firebrowse service (http://firebrowse.org/) the Broad Institute is making large-scale multi-platform omics data analysis results publicly available through a Representational State Transfer (REST) Application Programmable Interface (API). Querying this database through an API client from an arbitrary programming environment is an essential task, allowing other developers and researchers to focus on their analysis and avoid data wrangling. Hence, as a first result, we developed a workflow to automatically generate, test and deploy such clients for rapid response to API changes. Its underlying infrastructure, a combination of free and publicly available web services, facilitates the development of API clients. It decouples changes in server software from the client software by reacting to changes in the RESTful service and removing direct dependencies on a specific implementation of an API. As a second result, FirebrowseR, an R client to the Broad Institute's RESTful Firehose Pipeline, is provided as a working example built by means of the presented workflow. The package's features are demonstrated by an example analysis of cancer gene expression data. Database URL: https://github.com/mariodeng/
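
    A sketch of querying a REST service such as Firebrowse from Python; the endpoint path and query parameters below are assumptions for illustration, not the documented Firebrowse routes.

      # Querying a REST API with the requests library (hypothetical endpoint).
      import requests

      BASE = "http://firebrowse.org/api/v1"      # service root from the abstract
      resp = requests.get(
          f"{BASE}/Samples/mRNASeq",             # hypothetical endpoint name
          params={"gene": "TP53", "cohort": "BRCA", "format": "json"},
          timeout=30,
      )
      resp.raise_for_status()                    # fail loudly on HTTP errors
      data = resp.json()                         # decoded analysis results
      print(type(data))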

  10. A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research

    PubMed Central

    van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B; Neyer, Franz J; van Aken, Marcel AG

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to properly interpret the results. First, the ingredients underlying Bayesian methods are introduced using a simplified example. Thereafter, the advantages and pitfalls of the specification of prior knowledge are discussed. To illustrate the Bayesian methods explained in this study, a second example considers a series of studies that examine the theoretical framework of dynamic interactionism. In the Discussion, the advantages and disadvantages of using Bayesian statistics are reviewed, and guidelines on how to report on Bayesian statistics are provided. PMID:24116396
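
    A minimal sketch of the basic Bayesian ingredients (prior, data, posterior) using the conjugate beta-binomial model; the prior and counts are invented and this is not the simplified example used in the paper.

      # Prior, data and posterior in the conjugate beta-binomial model.
      from scipy import stats

      a_prior, b_prior = 2, 2        # weakly informative Beta(2, 2) prior
      successes, trials = 14, 20     # observed data

      # Conjugacy: the posterior is Beta(a + successes, b + failures).
      posterior = stats.beta(a_prior + successes, b_prior + trials - successes)
      print(posterior.mean())              # posterior mean of the proportion
      print(posterior.interval(0.95))      # central 95% credible interval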

  11. Meta-analyses and adaptive group sequential designs in the clinical development process.

    PubMed

    Jennison, Christopher; Turnbull, Bruce W

    2005-01-01

    The clinical development process can be viewed as a succession of trials, possibly overlapping in calendar time. The design of each trial may be influenced by results from previous studies and other currently proceeding trials, as well as by external information. Results from all of these trials must be considered together in order to assess the efficacy and safety of the proposed new treatment. Meta-analysis techniques provide a formal way of combining the information. We examine how such methods can be used in combining results from: (1) a collection of separate studies, (2) a sequence of studies in an organized development program, and (3) stages within a single study using a (possibly adaptive) group sequential design. We present two examples. The first example concerns the combining of results from a Phase IIb trial using several dose levels or treatment arms with those of the Phase III trial comparing the treatment selected in Phase IIb against a control. This enables a "seamless transition" from Phase IIb to Phase III. The second example examines the use of combination tests to analyze data from an adaptive group sequential trial.
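
    A minimal sketch of one standard way to combine results across studies or stages, fixed-effect inverse-variance pooling; the estimates are invented, and the combination tests used for adaptive designs in the paper are more elaborate.

      # Fixed-effect (inverse-variance) pooling of effect estimates.
      import numpy as np

      est = np.array([0.32, 0.45, 0.28])       # effect estimates
      se  = np.array([0.15, 0.20, 0.10])       # their standard errors

      w = 1.0 / se**2                          # inverse-variance weights
      pooled = np.sum(w * est) / np.sum(w)     # combined estimate
      pooled_se = np.sqrt(1.0 / np.sum(w))     # its standard error
      print(pooled, pooled_se, pooled / pooled_se)  # estimate, SE, z-statistic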

  12. Understanding differences in results from literature-based and individual patient meta-analyses: an example from meta-analyses of observational data.

    PubMed

    Poppe, Katrina K; Doughty, Robert N; Yu, Cheuk-Man; Quintana, Miguel; Møller, Jacob E; Klein, Allan L; Gamble, Greg D; Dini, Frank L; Whalley, Gillian A

    2011-04-14

    Meta-analyses are increasingly used to summarise observational data; however, a literature meta-analysis (LMA) may give different results from the corresponding individual patient meta-analysis (IPMA). This study compares the published results of equivalent LMAs and IPMAs, highlighting factors that can affect the results and therefore impact on clinical interpretation of meta-analyses. Univariate results from published meta-analyses of prospective observational outcome data were compared, as were the number of studies, patients and length of follow-up. The absolute difference in survival was calculated. The associations between severe diastolic dysfunction (RFP) and death post acute myocardial infarction (AMI) and in chronic heart failure (HF) were used as clinical examples. The IPMA hazard ratio was lower than the LMA odds ratio: AMI hazard ratio 2.67 (95% confidence interval 2.23 to 3.20), odds ratio 4.10 (3.38 to 4.99); HF hazard ratio 2.42 (2.06 to 2.83), odds ratio 4.36 (3.60 to 5.04). The IPMAs contained most of the studies from the LMAs as well as additional unpublished data, and a longer length of follow-up was available in the IPMAs (AMI 3.7 vs 2.6 yr, HF 4.0 vs 1.5 yr). Restricting analysis to the same studies in both the LMA and IPMA resulted in a similar difference in effect sizes between methods to that found in the published analyses. The result of a meta-analysis is affected by whether study-level or individual patient data have been used, and by the variant of analysis that is required. Awareness and consideration of these factors is important for clinical interpretation of meta-analyses.

  13. Using qualitative comparative analysis (QCA) in systematic reviews of complex interventions: a worked example

    PubMed Central

    2014-01-01

    Background: Systematic reviews that address policy and practice questions in relation to complex interventions frequently need not only to assess the efficacy of a given intervention but to identify which intervention - and which intervention components - might be most effective in particular situations. Here, intervention replication is rare, and commonly used synthesis methods are less useful when the focus of analysis is the identification of those components of an intervention that are critical to its success. Methods: Having identified initial theories of change in a previous analysis, we explore the potential of qualitative comparative analysis (QCA) to assist with complex syntheses through a worked example. Developed originally in the area of political science and historical sociology, a QCA aims to identify those configurations of participant, intervention and contextual characteristics that may be associated with a given outcome. Analysing studies in these terms facilitates the identification of necessary and sufficient conditions for the outcome to be obtained. Since QCA is predicated on the assumption that multiple pathways might lead to the same outcome and does not assume a linear additive model in terms of changes to a particular condition (that is, it can cope with ‘tipping points’ in complex interventions), it appears not to suffer from some of the limitations of the statistical methods often used in meta-analysis. Results: The worked example shows how the QCA reveals that our initial theories of change were unable to distinguish between ‘effective’ and ‘highly effective’ interventions. Through the iterative QCA process, other intervention characteristics are identified that better explain the observed results. Conclusions: QCA is a promising alternative (or adjunct), particularly to the standard fall-back of a ‘narrative synthesis’ when a quantitative synthesis is impossible, and should be considered when reviews are broad and heterogeneity is significant. There are very few examples of its use with systematic review data at present, and further methodological work is needed to establish optimal conditions for its use and to document process, practice, and reporting standards. PMID:24950727

  14. An automated real-time microscopy system for analysis of fluorescence resonance energy transfer

    NASA Astrophysics Data System (ADS)

    Bernardini, André; Wotzlaw, Christoph; Lipinski, Hans-Gerd; Fandrey, Joachim

    2010-05-01

    Molecular imaging based on Fluorescence Resonance Energy Transfer (FRET) is widely used in cellular physiology both for protein-protein interaction analysis and for detecting conformational changes of single proteins, e.g. during activation of signaling cascades. However, obtaining reliable results from FRET measurements is still hampered by methodological problems such as spectral bleed-through, chromatic aberration, focal plane shifts and false positive FRET. In particular, false positive FRET signals caused by random interaction of the fluorescent dyes can easily lead to misinterpretation of the data. This work introduces a Nipkow-disc-based FRET microscopy system that is easy to operate without expert knowledge of FRET. The system automatically accounts for all relevant sources of error and provides various presentations of two-, three- and four-dimensional FRET data. Two examples are given to demonstrate the scope of application. An interaction analysis of the two subunits of the hypoxia-inducible transcription factor 1 demonstrates the use of the system as a tool for protein-protein interaction analysis. As an example of time-lapse observation, the conformational change of the fluorophore-labeled heat shock protein 33 in the presence of oxidant stress is shown.

  15. Application of copulas to improve covariance estimation for partial least squares.

    PubMed

    D'Angelo, Gina M; Weissfeld, Lisa A

    2013-02-20

    Dimension reduction techniques, such as partial least squares, are useful for computing summary measures and examining relationships in complex settings. Partial least squares requires an estimate of the covariance matrix as a first step in the analysis, making this estimate critical to the results. In addition, the covariance matrix also forms the basis for other techniques in multivariate analysis, such as principal component analysis and independent component analysis. This paper has been motivated by an example from an imaging study in Alzheimer's disease where there is complete separation between Alzheimer's and control subjects for one of the imaging modalities. This separation occurs in one block of variables and does not occur with the second block of variables, resulting in inaccurate estimates of the covariance. We propose the use of a copula to obtain estimates of the covariance in this setting, where one set of variables comes from a mixture distribution. Simulation studies show that the proposed estimator is an improvement over the standard estimators of covariance. We illustrate the methods using the motivating example from a study in the area of Alzheimer's disease.
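
    A minimal sketch of a rank-based, Gaussian-copula correlation estimate: Kendall's tau depends only on ranks, and sin(pi*tau/2) recovers the latent normal correlation. This illustrates the general idea rather than the authors' specific estimator; the data are simulated.

      # Copula-based correlation via Kendall's tau on simulated data.
      import numpy as np
      from scipy.stats import kendalltau

      rng = np.random.default_rng(1)
      z = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=500)
      x, y = z[:, 0], np.exp(z[:, 1])      # a skewed margin distorts Pearson r

      tau, _ = kendalltau(x, y)
      print(np.sin(np.pi * tau / 2))       # copula-based estimate, near 0.7
      print(np.corrcoef(x, y)[0, 1])       # Pearson r, biased by the margin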

  16. System Safety and the Unintended Consequence

    NASA Technical Reports Server (NTRS)

    Watson, Clifford

    2012-01-01

    The analysis and identification of risks often result in design changes or modification of operational steps. This paper identifies the potential for unintended consequences as an overlooked result of these changes. Examples of societal changes such as prohibition, regulatory changes including mandating lifeboats on passenger ships, and engineering proposals or design changes to automobiles and spaceflight hardware are used to demonstrate that the System Safety Engineer must be cognizant of the potential for unintended consequences as a result of an analysis. Conclusions of the report indicate the need for additional foresight and consideration of the potential effects of analysis-driven design, processing changes, and/or operational modifications.

  17. Pre/Post Data Analysis - Simple or Is It?

    NASA Technical Reports Server (NTRS)

    Feiveson, Al; Fiedler, James; Ploutz-Snyder, Robert

    2011-01-01

    This slide presentation reviews some of the problems that arise in analyzing pre- and post-treatment data. Using ankle extensor strength (AES) experiments, conducted to measure bone density loss during bed rest, as an example, the presentation discusses several questions: (1) How should we describe change? (2) What are common analysis methods for comparing post to pre results? (3) What do we mean by "% change"? and (4) What are we testing when we compare % changes?
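
    A minimal sketch of the distinction behind questions (1) and (3): absolute change, per-subject % change, and % change of the means need not agree. The numbers are invented.

      # Absolute change vs. per-subject % change vs. % change of the means.
      import numpy as np

      pre  = np.array([100.0, 80.0, 120.0])
      post = np.array([ 90.0, 76.0, 102.0])

      abs_change = post - pre                        # same units as the data
      pct_change = 100 * (post - pre) / pre          # per-subject % change
      print(abs_change.mean())                       # mean absolute change
      print(pct_change.mean())                       # mean of % changes
      print(100 * (post.mean() - pre.mean()) / pre.mean())  # % change of means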

  18. An Analysis of the Full-Floating Journal Bearing

    NASA Technical Reports Server (NTRS)

    Shaw, M C; Nussdorfer, T J , Jr

    1947-01-01

    An analysis of the operating characteristics of a full-floating journal bearing, a bearing in which a floating sleeve is located between the journal and bearing surfaces, is presented together with charts from which the performance of such bearings may be predicted. Examples are presented to illustrate the use of these charts and a limited number of experiments conducted upon a glass full-floating bearing are reported to verify some results of the analysis.

  19. Second-Order Factor Analysis as a Validity Assessment Tool: A Case Study Example Involving Perceptions of Stereotypic Love.

    ERIC Educational Resources Information Center

    Borrello, Gloria M.; Thompson, Bruce

    The calculation of second-order results in the validity assessment of measures is described, and some useful interpretation aids are presented. First-order and second-order results give different and informative pictures of data dynamics. Several aspects of good practice in the interpretation of second-order results are presented using data from 487 subjects…

  20. Results of an integrated structure-control law design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Gilbert, Michael G.

    1988-01-01

    Next generation air and space vehicle designs are driven by increased performance requirements, demanding a high level of design integration between traditionally separate design disciplines. Interdisciplinary analysis capabilities have been developed, for aeroservoelastic aircraft and large flexible spacecraft control for instance, but the requisite integrated design methods are only beginning to be developed. One integrated design method which has received attention is based on hierarchical problem decompositions, optimization, and design sensitivity analyses. This paper highlights a design sensitivity analysis method for Linear Quadratic Cost, Gaussian (LQG) optimal control laws, which predicts change in the optimal control law due to changes in fixed problem parameters using analytical sensitivity equations. Numerical results of a design sensitivity analysis for a realistic aeroservoelastic aircraft example are presented. In this example, the sensitivity of the optimally controlled aircraft's response to various problem formulation and physical aircraft parameters is determined. These results are used to predict the aircraft's new optimally controlled response if a parameter were to have some other nominal value during the control law design process. The sensitivity results are validated by recomputing the optimal control law for discrete variations in parameters, computing the new actual aircraft response, and comparing with the predicted response. These results show an improvement in sensitivity accuracy for integrated design purposes over methods which do not include changes in the optimal control law. Use of the analytical LQG sensitivity expressions is also shown to be more efficient than finite difference methods for the computation of the equivalent sensitivity information.

  1. Past climate (the last 400 ka): from geological times to future climate change

    NASA Astrophysics Data System (ADS)

    Jouzel, Jean

    2003-06-01

    Studies of past climate have, over the last 15 years, provided a wealth of information directly relevant to its future evolution. These results include, in particular, the discovery of a link between greenhouse gases and climate in the past and the characterization of rapid climate changes. They are based, for example, on the analysis of deep ice cores such as the one drilled at the Vostok site, which allows us to describe the evolution of the Antarctic climate and of the atmospheric composition over more than 400 thousand years (kyr). This period is also now increasingly well documented through the analysis of oceanic and continental records. Through examples based on recent studies in which French teams are deeply involved, we illustrate the most important results obtained from the analysis of polar ice cores, deep-sea cores and continental archives. To cite this article: J. Jouzel, C. R. Geoscience 335 (2003).

  2. Progressing from initially ambiguous functional analyses: three case examples.

    PubMed

    Tiger, Jeffrey H; Fisher, Wayne W; Toussaint, Karen A; Kodak, Tiffany

    2009-01-01

    Most often functional analyses are initiated using a standard set of test conditions, similar to those described by Iwata, Dorsey, Slifer, Bauman, and Richman [Iwata, B. A., Dorsey, M. F., Slifer, K. J., Bauman, K. E., & Richman, G. S. (1994). Toward a functional analysis of self-injury. Journal of Applied Behavior Analysis, 27, 197-209 (Reprinted from Analysis and Intervention in Developmental Disabilities, 2, 3-20, 1982)]. These test conditions involve the careful manipulation of motivating operations, discriminative stimuli, and reinforcement contingencies to determine the events related to the occurrence and maintenance of problem behavior. Some individuals display problem behavior that is occasioned and reinforced by idiosyncratic or otherwise unique combinations of environmental antecedents and consequences of behavior, which are unlikely to be detected using these standard assessment conditions. For these individuals, modifications to the standard test conditions or the inclusion of novel test conditions may result in clearer assessment outcomes. The current study provides three case examples of individuals whose functional analyses were initially undifferentiated; however, modifications to the standard conditions resulted in the identification of behavioral functions and the implementation of effective function-based treatments.

  3. The problem of pseudoreplication in neuroscientific studies: is it affecting your analysis?

    PubMed Central

    2010-01-01

    Background: Pseudoreplication occurs when observations are not statistically independent but are treated as if they are. This can occur when there are multiple observations on the same subjects, when samples are nested or hierarchically organised, or when measurements are correlated in time or space. Analysis of such data without taking these dependencies into account can lead to meaningless results, and examples can easily be found in the neuroscience literature. Results: A single issue of Nature Neuroscience provided a number of examples and is used as a case study to highlight how pseudoreplication arises in neuroscientific studies and why the analyses in these papers are incorrect; appropriate analytical methods are provided. 12% of papers had pseudoreplication and a further 36% were suspected of having pseudoreplication, but it was not possible to determine for certain because insufficient information was provided. Conclusions: Pseudoreplication can undermine the conclusions of a statistical analysis, and it would be easier to detect if the sample size, degrees of freedom, the test statistic, and precise p-values were reported. This information should be a requirement for all publications. PMID:20074371
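
    A minimal sketch, assuming Python with pandas and statsmodels: 200 observations from 10 subjects analysed naively by OLS versus with a random-intercept mixed model that respects the dependence.

      # Pseudoreplication: naive OLS vs. a mixed model on simulated data.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(2)
      n_subj, n_obs = 10, 20
      subj = np.repeat(np.arange(n_subj), n_obs)
      group = (subj < n_subj // 2).astype(int)        # treatment varies between subjects
      subj_noise = rng.normal(0, 1.0, n_subj)[subj]   # shared subject-level noise
      y = 0.3 * group + subj_noise + rng.normal(0, 0.5, n_subj * n_obs)
      df = pd.DataFrame({"y": y, "group": group, "subject": subj})

      # Naive OLS treats all rows as independent: pseudoreplication.
      print(smf.ols("y ~ group", df).fit().pvalues["group"])
      # Mixed model with a random intercept per subject.
      print(smf.mixedlm("y ~ group", df, groups=df["subject"]).fit().pvalues["group"])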

  4. User's Guide to Handlens - A Computer Program that Calculates the Chemistry of Minerals in Mixtures

    USGS Publications Warehouse

    Eberl, D.D.

    2008-01-01

    HandLens is a computer program, written in Excel macro language, that calculates the chemistry of minerals in mineral mixtures (for example, in rocks, soils and sediments) for related samples from inputs of quantitative mineralogy and chemistry. For best results, the related samples should contain minerals having the same chemical compositions; that is, the samples should differ only in the proportions of minerals present. This manual describes how to use the program, discusses the theory behind its operation, and presents test results of the program's accuracy. Required input for HandLens includes quantitative mineralogical data, obtained, for example, by RockJock analysis of X-ray diffraction (XRD) patterns, and quantitative chemical data, obtained, for example, by X-ray fluorescence (XRF) analysis of the same samples. Other quantitative data, such as sample depth, temperature, and surface area, can also be entered. The minerals present in the samples are selected from a list, and the program is started. The results of the calculation include: (1) a table of linear coefficients of determination (r2) that relate pairs of input data (for example, Si versus quartz weight percents); (2) a utility for plotting all input data, either as pairs of variables, or as sums of up to eight variables; (3) a table that presents the calculated chemical formulae for minerals in the samples; (4) a table that lists the calculated concentrations of major, minor, and trace elements in the various minerals; and (5) a table that presents chemical formulae for the minerals that have been corrected for possible systematic errors in the mineralogical and/or chemical analyses. In addition, the program contains a method for testing the assumption of constant chemistry of the minerals within a sample set.
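
    A minimal sketch of the core calculation: given mineral weight fractions (e.g. from quantitative XRD) and bulk element concentrations (e.g. from XRF) for related samples, each mineral's element content follows by least squares. All numbers are invented.

      # Element content per mineral by least squares over related samples.
      import numpy as np

      minerals = np.array([[0.70, 0.30],     # sample 1: 70% mineral A, 30% B
                           [0.50, 0.50],
                           [0.20, 0.80]])
      chemistry = np.array([33.0, 27.0, 18.0])   # bulk Si wt% per sample

      si_per_mineral, *_ = np.linalg.lstsq(minerals, chemistry, rcond=None)
      print(si_per_mineral)                  # approx. [42, 12] wt% Si in A and B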

  5. User-perceived reliability of unrepairable shared protection systems with functionally identical units

    NASA Astrophysics Data System (ADS)

    Ozaki, Hirokazu; Kara, Atsushi; Cheng, Zixue

    2012-05-01

    In this article, we investigate the reliability of M-for-N (M:N) shared protection systems. We focus on the reliability that is perceived by an end user of one of the N units. We assume that any failed unit is instantly replaced by one of the M units (if available). We describe the effectiveness of such a protection system in a quantitative manner under the condition that the failed units are not repairable. Mathematical analysis gives closed-form solutions for the reliability and the mean time to failure (MTTF). We also analyse several numerical examples of the reliability and MTTF. This result can be applied, for example, to the analysis and design of an integrated circuit consisting of redundant backup components; in such a device, repairing a failed component is unrealistic. The analysis provides useful information for the design of general shared protection systems in which the failed units are not repaired.
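
    A Monte Carlo sketch of the user-perceived time to failure under the stated assumptions (exponential unit lifetimes, instant replacement from the spare pool, no repair); the closed-form solution in the article is not reproduced here.

      # Simulated user-perceived MTTF in an M:N shared protection system.
      import numpy as np

      def user_mttf(n_units=4, m_spares=2, rate=1.0, n_runs=20_000, seed=3):
          rng = np.random.default_rng(seed)
          total = 0.0
          for _ in range(n_runs):
              others, spares, t = n_units - 1, m_spares, 0.0
              while True:
                  active = others + 1                  # other units + the user's
                  t += rng.exponential(1.0 / (active * rate))
                  if rng.random() < 1.0 / active:      # the user's unit failed
                      if spares == 0:
                          break                        # perceived system failure
                      spares -= 1                      # instant replacement
                  elif spares > 0:
                      spares -= 1                      # spare covers another unit
                  else:
                      others -= 1                      # that unit is lost for good
              total += t
          return total / n_runs

      print(user_mttf())   # compare against the closed-form MTTF in the article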

  6. Nonlinear Analysis in Counseling Research

    ERIC Educational Resources Information Center

    Balkin, Richard S.; Richey Gosnell, Katelyn M.; Holmgren, Andrew; Osborne, Jason W.

    2017-01-01

    Nonlinear effects are both underreported and underrepresented in counseling research. We provide a rationale for evaluating nonlinear effects and steps to evaluate nonlinear relationships in counseling research. Two heuristic examples are provided along with discussion of the results and advantages to evaluating nonlinear effects.

  7. 26 CFR 1.482-6 - Profit split method.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... consistency between the controlled and uncontrolled taxpayers in accounting practices that materially affect... result. Thus, for example, if differences in inventory and other cost accounting practices would... between the controlled and uncontrolled transactions increases, the relative weight accorded the analysis...

  8. 26 CFR 1.482-6 - Profit split method.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... consistency between the controlled and uncontrolled taxpayers in accounting practices that materially affect... result. Thus, for example, if differences in inventory and other cost accounting practices would... between the controlled and uncontrolled transactions increases, the relative weight accorded the analysis...

  9. 26 CFR 1.482-6 - Profit split method.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... consistency between the controlled and uncontrolled taxpayers in accounting practices that materially affect... result. Thus, for example, if differences in inventory and other cost accounting practices would... between the controlled and uncontrolled transactions increases, the relative weight accorded the analysis...

  10. 26 CFR 1.482-6 - Profit split method.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... consistency between the controlled and uncontrolled taxpayers in accounting practices that materially affect... result. Thus, for example, if differences in inventory and other cost accounting practices would... between the controlled and uncontrolled transactions increases, the relative weight accorded the analysis...

  11. CAUSAL ANALYSIS AND PROBABILITY DATA: EXAMPLES FOR IMPAIRED AQUATIC CONDITION

    EPA Science Inventory

    Causal analysis is plausible reasoning applied to diagnosing observed effect(s), for example, diagnosing cause of biological impairment in a stream. Sir Bradford Hill basically defined the application of causal analysis when he enumerated the elements of causality f...

  12. Passivity analysis for uncertain BAM neural networks with time delays and reaction-diffusions

    NASA Astrophysics Data System (ADS)

    Zhou, Jianping; Xu, Shengyuan; Shen, Hao; Zhang, Baoyong

    2013-08-01

    This article deals with the problem of passivity analysis for delayed reaction-diffusion bidirectional associative memory (BAM) neural networks with weight uncertainties. By using a new integral inequality, we first present a passivity condition for the nominal networks, and then extend the result to the case with linear fractional weight uncertainties. The proposed conditions are expressed in terms of linear matrix inequalities, and thus can be checked easily. Examples are provided to demonstrate the effectiveness of the proposed results.

  13. Quantitative assessment of the impact of biomedical image acquisition on the results obtained from image analysis and processing.

    PubMed

    Koprowski, Robert

    2014-07-04

    Dedicated, automatic algorithms for image analysis and processing are becoming more and more common in medical diagnosis. When creating dedicated algorithms, many factors must be taken into consideration. They are associated with selecting the appropriate algorithm parameters and taking into account the impact of data acquisition on the results obtained. An important feature of algorithms is the possibility of their use in other medical units by other operators. This problem, namely the operator's (acquisition) impact on the results obtained from image analysis and processing, is shown here through a few examples. The analysed images were obtained from a variety of medical devices such as thermal imaging, tomography devices and those working in visible light. The objects of imaging were cellular elements, the anterior segment and fundus of the eye, postural defects and others. In total, almost 200,000 images coming from 8 different medical units were analysed. All image analysis algorithms were implemented in C and Matlab. For various algorithms and methods of medical imaging, the impact of image acquisition on the results obtained is different. There are different levels of algorithm sensitivity to changes in the parameters, for example: (1) for microscope settings and the brightness assessment of cellular elements there is a difference of 8%; (2) for thyroid ultrasound images there is a difference in marking the thyroid lobe area which results in a brightness assessment difference of 2%. The method of image acquisition in image analysis and processing also affects: (3) the accuracy of determining the temperature in the characteristic areas on the patient's back for the thermal method - error of 31%; (4) the accuracy of finding characteristic points in photogrammetric images when evaluating postural defects - error of 11%; (5) the accuracy of performing ablative and non-ablative treatments in cosmetology - error of 18% for the nose, 10% for the cheeks, and 7% for the forehead; (6) measurement of the anterior eye chamber - error of 20%; (7) measurement of tooth enamel thickness - error of 15%; and (8) evaluation of the mechanical properties of the cornea during pressure measurement - error of 47%. The paper presents vital, selected issues occurring when assessing the accuracy of designed automatic algorithms for image analysis and processing in bioengineering. The impact of acquisition of images on the problems arising in their analysis has been shown on selected examples. It has also been indicated to which elements of image analysis and processing special attention should be paid in their design.

  14. Qualitative case study data analysis: an example from practice.

    PubMed

    Houghton, Catherine; Murphy, Kathy; Shaw, David; Casey, Dympna

    2015-05-01

    To illustrate an approach to data analysis in qualitative case study methodology. There is often little detail in case study research about how data were analysed. However, it is important that comprehensive analysis procedures are used because there are often large sets of data from multiple sources of evidence. Furthermore, the ability to describe in detail how the analysis was conducted ensures rigour in reporting qualitative research. The research example used is a multiple case study that explored the role of the clinical skills laboratory in preparing students for the real world of practice. Data analysis was conducted using a framework guided by the four stages of analysis outlined by Morse (1994): comprehending, synthesising, theorising and recontextualising. The specific strategies for analysis in these stages centred on the work of Miles and Huberman (1994), which has been successfully used in case study research. The data were managed using NVivo software. Literature examining qualitative data analysis was reviewed and strategies illustrated by the case study example provided. Discussion: Each stage of the analysis framework is described with illustration from the research example for the purpose of highlighting the benefits of a systematic approach to handling large data sets from multiple sources. By providing an example of how each stage of the analysis was conducted, it is hoped that researchers will be able to consider the benefits of such an approach to their own case study analysis. This paper illustrates specific strategies that can be employed when conducting data analysis in case study research and other qualitative research designs.

  15. Three-Dimensional Analysis and Surgical Planning in Craniomaxillofacial Surgery.

    PubMed

    Steinbacher, Derek M

    2015-12-01

    Three-dimensional (3D) analysis and planning are powerful tools in craniofacial and reconstructive surgery. The elements include 1) analysis, 2) planning, 3) virtual surgery, 4) 3D printouts of guides or implants, and 5) verification of actual to planned results. The purpose of this article is to review different applications of 3D planning in craniomaxillofacial surgery. Case examples involving 3D analysis and planning were reviewed. Common threads pertaining to all types of reconstruction are highlighted and contrasted with unique aspects specific to new applications in craniomaxillofacial surgery. Six examples of 3D planning are described: 1) cranial reconstruction, 2) craniosynostosis, 3) midface advancement, 4) mandibular distraction, 5) mandibular reconstruction, and 6) orthognathic surgery. Planning in craniomaxillofacial surgery is useful and has applicability across different procedures and reconstructions. Three-dimensional planning and virtual surgery enhance efficiency, accuracy, creativity, and reproducibility in craniomaxillofacial surgery.

  16. An example of requirements for Advanced Subsonic Civil Transport (ASCT) flight control system using structured techniques

    NASA Technical Reports Server (NTRS)

    Mclees, Robert E.; Cohen, Gerald C.

    1991-01-01

    The requirements are presented for an Advanced Subsonic Civil Transport (ASCT) flight control system, generated using structured techniques. The requirements definition begins with a mission analysis that identifies the high-level control system requirements and functions necessary to fly the mission. The result of the study is an example set of control system requirements partially represented using a derivative of Yourdon's structured techniques. Also provided is a research focus for studying structured design methodologies, and in particular design-for-validation philosophies.

  17. Approach to recognition of flexible form for credit card expiration date recognition as example

    NASA Astrophysics Data System (ADS)

    Sheshkus, Alexander; Nikolaev, Dmitry P.; Ingacheva, Anastasia; Skoryukina, Natalya

    2015-12-01

    In this paper we consider the task of finding information fields within a document of flexible form, using the credit card expiration date field as an example. We discuss the main difficulties and suggest possible solutions. In our case this task is to be solved on mobile devices, so the computational complexity has to be as low as possible. We provide results of an analysis of the suggested algorithm. The error distribution of the recognition system shows that the suggested algorithm solves the task with the required accuracy.

  18. Modeling and design optimization of adhesion between surfaces at the microscale.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sylves, Kevin T.

    2008-08-01

    This research applies design optimization techniques to structures in adhesive contact where the dominant adhesive mechanism is the van der Waals force. Interface finite elements are developed for domains discretized by beam elements, quadrilateral elements or triangular shell elements. Example analysis problems comparing finite element results to analytical solutions are presented. These examples are then optimized, where the objective is matching a force-displacement relationship and the optimization variables are the interface element energy of adhesion or the width of beam elements in the structure. Several parameter studies are conducted and discussed.

  19. Practical example of game theory application for production route selection

    NASA Astrophysics Data System (ADS)

    Olender, M.; Krenczyk, D.

    2017-08-01

    The virtual organization concept opens opportunities for manufacturers in a dynamic market, especially for those in the small and medium-sized enterprise sector. The planning stage of such organizations can be supported in its decision-making tasks by tools and formalisms taken from game theory. In the paper, a model of a virtual manufacturing network is presented, along with a practical example of a decision-making situation modeled as a two-person game, the decision strategies, and an analysis of the calculation results.
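
    A minimal sketch of the underlying formalism: solving a two-person zero-sum game by linear programming; the 2x2 route-selection payoff matrix is invented.

      # Optimal mixed strategy of a two-person zero-sum game via linprog.
      import numpy as np
      from scipy.optimize import linprog

      A = np.array([[3.0, 1.0],     # row player's payoff per route pair
                    [0.0, 2.0]])
      m, n = A.shape

      # Variables: mixed strategy x (m entries) and game value v.
      # Maximize v subject to A.T @ x >= v, sum(x) = 1, x >= 0.
      c = np.zeros(m + 1); c[-1] = -1.0                 # linprog minimizes, so -v
      A_ub = np.hstack([-A.T, np.ones((n, 1))])         # v - A.T @ x <= 0
      b_ub = np.zeros(n)
      A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])
      b_eq = np.array([1.0])
      bounds = [(0, None)] * m + [(None, None)]

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
      print(res.x[:m], res.x[-1])    # optimal mixed strategy and game value 1.5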

  20. Eigenvalue and eigenvector sensitivity and approximate analysis for repeated eigenvalue problems

    NASA Technical Reports Server (NTRS)

    Hou, Gene J. W.; Kenny, Sean P.

    1991-01-01

    A set of computationally efficient equations for eigenvalue and eigenvector sensitivity analysis are derived, and a method for eigenvalue and eigenvector approximate analysis in the presence of repeated eigenvalues is presented. The method developed for approximate analysis involves a reparameterization of the multivariable structural eigenvalue problem in terms of a single positive-valued parameter. The resulting equations yield first-order approximations of changes in both the eigenvalues and eigenvectors associated with the repeated eigenvalue problem. Examples are given to demonstrate the application of such equations for sensitivity and approximate analysis.
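
    For context, a minimal sketch of first-order eigenvalue sensitivity in the simple case of a symmetric matrix with a distinct eigenvalue, checked against a finite difference; the repeated-eigenvalue case requires the reparameterization developed in the paper. The matrices are invented.

      # First-order eigenvalue sensitivity: d(lambda)/dp = v.T @ (dA/dp) @ v.
      import numpy as np

      A  = np.array([[2.0, 0.3], [0.3, 1.0]])
      dA = np.array([[1.0, 0.0], [0.0, 0.0]])   # dA/dp for a hypothetical p

      w, V = np.linalg.eigh(A)
      v = V[:, 0]                               # eigenvector, lowest eigenvalue
      print(v @ dA @ v)                         # analytic sensitivity

      eps = 1e-6
      w_eps, _ = np.linalg.eigh(A + eps * dA)
      print((w_eps[0] - w[0]) / eps)            # finite-difference check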

  1. Choosing estimands in clinical trials with missing data.

    PubMed

    Mallinckrodt, Craig; Molenberghs, Geert; Rathmann, Suchitrita

    2017-01-01

    Recent research has fostered new guidance on preventing and treating missing data. Consensus exists that clear objectives should be defined along with the causal estimands; trial design and conduct should maximize adherence to the protocol-specified interventions; and a sensible primary analysis should be used along with plausible sensitivity analyses. Two general categories of estimands are effects of the drug as actually taken (de facto, effectiveness) and effects of the drug if taken as directed (de jure, efficacy). Motivated by examples, we argue that no single estimand is likely to meet the needs of all stakeholders and that each estimand has strengths and limitations. Therefore, stakeholder input should be part of an iterative study development process that includes choosing estimands that are consistent with trial objectives. To this end, an example is used to illustrate the benefit of assessing multiple estimands in the same study. A second example illustrates that maximizing adherence reduces sensitivity to missing data assumptions for de jure estimands but may reduce generalizability of results for de facto estimands if efforts to maximize adherence in the trial are not feasible in clinical practice. A third example illustrates that whether or not data after initiation of rescue medication should be included in the primary analysis depends on the estimand to be tested and the clinical setting. We further discuss the implications of including post-rescue data in the primary analysis for sample size and total exposure to placebo.

  2. Exponential stability of impulsive stochastic genetic regulatory networks with time-varying delays and reaction-diffusion

    DOE PAGES

    Cao, Boqiang; Zhang, Qimin; Ye, Ming

    2016-11-29

    We present a mean-square exponential stability analysis for impulsive stochastic genetic regulatory networks (GRNs) with time-varying delays and reaction-diffusion driven by fractional Brownian motion (fBm). By constructing a Lyapunov functional and using linear matrix inequalities for stochastic analysis, we derive sufficient conditions that guarantee the exponential stability of the stochastic model of impulsive GRNs in the mean-square sense. The corresponding results are also obtained for GRNs with constant time delays and standard Brownian motion. Finally, an example is presented to illustrate the mean-square exponential stability results.

  3. CADDIS Volume 3. Examples and Applications: Analytical Examples

    EPA Pesticide Factsheets

    Examples illustrating the use of statistical analysis to support different types of evidence: stream temperature, temperature inferred from macroinvertebrates, macroinvertebrate responses, zinc concentrations, and observed trait characteristics.

  4. Realization of the FPGA-based reconfigurable computing environment by the example of morphological processing of a grayscale image

    NASA Astrophysics Data System (ADS)

    Shatravin, V.; Shashev, D. V.

    2018-05-01

    Robots are increasingly being used in every industry. One of the most high-tech areas is the creation of completely autonomous robotic devices, including vehicles. Research worldwide demonstrates the effectiveness of vision systems in autonomous robotic devices. However, the use of these systems is limited by the computational and energy resources available in the robotic device. The paper describes the results of applying an original approach to image processing on reconfigurable computing environments, using morphological operations on grayscale images as an example. This approach is promising for realizing complex image processing algorithms and real-time image analysis in autonomous robotic devices.
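
    A minimal reference sketch of the grayscale morphological operations in question, computed here with scipy rather than on a reconfigurable computing environment.

      # Grayscale erosion and dilation with a 3x3 flat structuring element.
      import numpy as np
      from scipy.ndimage import grey_dilation, grey_erosion

      img = np.array([[1, 2, 3],
                      [4, 5, 6],
                      [7, 8, 9]], dtype=float)

      # Erosion takes the neighbourhood minimum at each pixel,
      # dilation the neighbourhood maximum.
      print(grey_erosion(img, size=(3, 3)))
      print(grey_dilation(img, size=(3, 3)))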

  5. Experiences on p-Version Time-Discontinuous Galerkin's Method for Nonlinear Heat Transfer Analysis and Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Hou, Gene

    2004-01-01

    The focus of this research is on the development of analysis and sensitivity analysis equations for nonlinear, transient heat transfer problems modeled by a p-version, time-discontinuous finite element approximation. The resulting matrix equation of the state equation is simply in the form of A(x)x = c, representing a single-step, time-marching scheme. The Newton-Raphson method is used to solve the nonlinear equation. Examples are first provided to demonstrate the accuracy characteristics of the resultant finite element approximation. A direct differentiation approach is then used to compute the thermal sensitivities of a nonlinear heat transfer problem. The report shows that only minimal coding effort is required to enhance the analysis code with the sensitivity analysis capability.
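
    A minimal sketch of the Newton-Raphson iteration for a system in the form A(x)x = c, using a finite-difference Jacobian for brevity (the report derives the required derivatives analytically); the matrix A(x) is invented.

      # Newton-Raphson for A(x) x = c with residual R(x) = A(x) x - c.
      import numpy as np

      def A(x):
          # An invented state-dependent "conductivity" matrix.
          return np.array([[2.0 + x[0]**2, -1.0],
                           [-1.0, 2.0 + x[1]**2]])

      c = np.array([1.0, 1.0])
      residual = lambda x: A(x) @ x - c

      x = np.zeros(2)
      for _ in range(20):
          R = residual(x)
          if np.linalg.norm(R) < 1e-10:
              break
          eps = 1e-7
          J = np.column_stack([(residual(x + eps * e) - R) / eps
                               for e in np.eye(len(x))])   # FD Jacobian
          x = x - np.linalg.solve(J, R)                    # Newton update

      print(x, residual(x))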

  6. Contact Stress Analysis of Spiral Bevel Gears Using Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Bibel, G. D.; Kumar, A; Reddy, S.; Handschuh, R.

    1995-01-01

    A procedure is presented for performing three-dimensional stress analysis of spiral bevel gears in mesh using the finite element method. The procedure involves generating a finite element model by solving equations that identify tooth surface coordinates. Coordinate transformations are used to orientate the gear and pinion for gear meshing. Contact boundary conditions are simulated with gap elements. A solution technique for correct orientation of the gap elements is given. Example models and results are presented.

  7. 11th Annual CMMI Technology Conference and User Group

    DTIC Science & Technology

    2011-11-17

    Examples of triggers may include: – Cost performance – Schedule performance – Results of management reviews – Occurrence of the risk • as a...Analysis (PHA) – Method 3 – Through bottom-up analysis of design data (e.g., flow diagrams, Failure Mode Effects and Criticality Analysis (FMECA...of formal reviews and the setting up of delta or follow-up reviews can be used to give the organization more places to look at the products as they

  8. The Use of Radioactivation Analysis in Biology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merlini, M.

    1962-01-01

    The principles of activation analysis and the methods of detection and measurement of radioactivity from neutron-irradiated samples are described. The application of this method in different fields is mentioned. An example of the use of activation analysis in biology is given, and the results of a study on the manganese content in different parts of a lamellibranch, Unio mancus elongatus (Pfeiffer) of Lago Maggiore, are presented and discussed. (auth)

  9. Sensitivity analysis and multidisciplinary optimization for aircraft design: Recent advances and results

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1988-01-01

    Optimization by decomposition, complex system sensitivity analysis, and the rapid growth of disciplinary sensitivity analysis are some of the recent developments that hold promise of a quantum jump in the support engineers receive from computers in the quantitative aspects of design. A review of the salient points of these techniques is given and illustrated by examples from aircraft design as a process that combines the best of human intellect and computer power to manipulate data.

  10. Analysis of the Department of Defense Pre-Award Contracting Process

    DTIC Science & Technology

    2014-12-01

    Justification and Approval JBSA Joint Base San Antonio KPIs Key Performance Indicators MAJCOMs Major Command MP Mandatory Commands NAVIAR...meets desired results. Results-based performance measurement establishes key performance indicators (KPIs) that determine whether procurement...or goals, and underlying business processes (Cullen, 2009, p. 38). Within each quadrant, Cullen provided examples of KPIs that serve to measure

  11. Knowledge as an Aspect of Scientific Competence for Citizenship: Results of a Delphi Study in Spain

    ERIC Educational Resources Information Center

    España-Ramos, Enrique; González-García, Francisco José; Blanco-López, Ángel; Franco-Mariscal, Antonio Joaquín

    2016-01-01

    This article focuses on scientific knowledge as one aspect of the scientific competencies that citizens should ideally possess. The analysis is based on a Delphi study we conducted with Spanish experts from different science-related fields. The results showed that although the experts proposed several examples of scientific knowledge, the degree…

  12. 76 FR 66006 - Revised Medical Criteria for Evaluating Congenital Disorders That Affect Multiple Body Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-25

    ... definitive test that documents your disorder (for example, genetic analysis or evidence of biochemical... testing, and We will not accept the fluorescence in situ hybridization (FISH) test--a screening test--and... results even if the person did have a test. Because we do not have definitive test results, we would...

  13. Thermal stress analysis for a wood composite blade. [wind turbines

    NASA Technical Reports Server (NTRS)

    Fu, K. C.; Harb, A.

    1984-01-01

    Heat conduction throughout the blade and the distribution of thermal stresses caused by the temperature distribution were determined for a laminated wood wind turbine blade in both the horizontal and vertical positions. Results show that blade cracking is not due to thermal stresses induced by insolation. A method and a practical example of thermal stress analysis for an engineering body of orthotropic materials are presented.

  14. Unsteady-State Heat Transfer Involving a Phase Change: An Example of a 'Project-Oriented' Undergraduate Laboratory.

    ERIC Educational Resources Information Center

    Sundberg, Donald C.; Someshwar, Arun V.

    1989-01-01

    Describes the structure of an in-depth laboratory project in chemical engineering. Provides modeling work to guide experimentation and experimental work on heat transfer analysis. Discusses the experimental results and evaluation of the project. (YP)

  15. Interactive visualization of numerical simulation results: A tool for mission planning and data analysis

    NASA Technical Reports Server (NTRS)

    Berchem, J.; Raeder, J.; Walker, R. J.; Ashour-Abdalla, M.

    1995-01-01

    We report on the development of an interactive system for visualizing and analyzing numerical simulation results. This system is based on visualization modules which use the Application Visualization System (AVS) and the NCAR graphics packages. Examples from recent simulations are presented to illustrate how these modules can be used for displaying and manipulating simulation results to facilitate their comparison with phenomenological model results and observations.

  16. MAC/GMC 4.0 User's Manual: Example Problem Manual. Volume 3

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2002-01-01

    This document is the third volume in the three volume set of User's Manuals for the Micromechanics Analysis Code with Generalized Method of Cells Version 4.0 (MAC/GMC 4.0). Volume 1 is the Theory Manual, Volume 2 is the Keywords Manual, and this document is the Example Problems Manual. MAC/GMC 4.0 is a composite material and laminate analysis software program developed at the NASA Glenn Research Center. It is based on the generalized method of cells (GMC) micromechanics theory, which provides access to the local stress and strain fields in the composite material. This access grants GMC the ability to accommodate arbitrary local models for inelastic material behavior and various types of damage and failure analysis. MAC/GMC 4.0 has been built around GMC to provide the theory with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material, have been automated in MAC/GMC 4.0. Finally, classical lamination theory has been implemented within MAC/GMC 4.0 wherein GMC is used to model the composite material response of each ply. Consequently, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. This volume provides in-depth descriptions of 43 example problems, which were specially designed to highlight many of the most important capabilities of the code. The actual input files associated with each example problem are distributed with the MAC/GMC 4.0 software; thus providing the user with a convenient starting point for their own specialized problems of interest.

  17. Influences of system uncertainties on the numerical transfer path analysis of engine systems

    NASA Astrophysics Data System (ADS)

    Acri, A.; Nijman, E.; Acri, A.; Offner, G.

    2017-10-01

    Practical mechanical systems operate with some degree of uncertainty. In numerical models uncertainties can result from poorly known or variable parameters, from geometrical approximation, from discretization or numerical errors, from uncertain inputs or from rapidly changing forcing that is best described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In particular, in this paper Wishart random matrix theory is applied to a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool largely used during the design of new engines. In this paper the influence of model parameter variability on the results obtained from multi-body simulation of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and assessment of the different engine vibration sources. The effects of different levels of uncertainty are illustrated using a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and the statistical distribution of results. The derived statistical information can be used to advance the multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.
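
    A minimal sketch of Wishart-type randomization: positive-definite random perturbations of a nominal system matrix whose mean recovers the nominal value. The matrix and dispersion parameter are invented.

      # Wishart samples scatter around a nominal positive-definite matrix.
      import numpy as np
      from scipy.stats import wishart

      M_nominal = np.array([[4.0, 1.0],
                            [1.0, 3.0]])   # e.g. a nominal stiffness block
      df = 50                              # larger df = tighter scatter

      samples = wishart(df=df, scale=M_nominal / df).rvs(size=1000)
      print(samples.mean(axis=0))          # close to M_nominal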

  18. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka

    2006-07-01

    It is meaningful for tsunami assessment to evaluate phenomena beyond the design basis, as is done in seismic design, because once the design basis tsunami height is set, there remains a possibility that the actual tsunami height will exceed it due to uncertainties in tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by the 5th-, 16th-, 50th-, 84th- and 95th-percentile and mean hazard curves. The results of the PTHA will be used for quantitative assessment of tsunami risk for important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structural and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
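
    The logic-tree aggregation described above can be sketched as follows: each branch supplies a hazard curve (exceedance probability versus tsunami height), and branch weights combine them into mean and fractile curves. The branch curves and weights here are made-up placeholders, not the study's Japanese source models.

      # Sketch of logic-tree aggregation for tsunami hazard curves: each branch
      # gives an annual exceedance probability vs. tsunami height; branch weights
      # combine them into mean and fractile curves. All numbers are illustrative.
      import numpy as np

      heights = np.linspace(0.5, 15.0, 30)          # tsunami height grid [m]

      # Three hypothetical logic-tree branches (exponential decay rates differ)
      branches = np.array([np.exp(-heights / s) * 1e-2 for s in (2.0, 3.0, 4.5)])
      weights = np.array([0.3, 0.5, 0.2])           # must sum to 1

      mean_curve = weights @ branches

      def fractile(curves, w, q):
          """Weighted q-fractile across branches, evaluated height by height."""
          out = np.empty(curves.shape[1])
          for j in range(curves.shape[1]):
              order = np.argsort(curves[:, j])
              cum = np.cumsum(w[order])
              out[j] = curves[order, j][np.searchsorted(cum, q)]
          return out

      for q in (0.05, 0.16, 0.50, 0.84, 0.95):
          curve = fractile(branches, weights, q)
          print(f"{q:.2f} fractile at h=5 m: {np.interp(5.0, heights, curve):.2e}")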

  19. Multiscale recurrence analysis of spatio-temporal data

    NASA Astrophysics Data System (ADS)

    Riedl, M.; Marwan, N.; Kurths, J.

    2015-12-01

    The description and analysis of spatio-temporal dynamics is a crucial task in many scientific disciplines. In this work, we propose a method which uses the mapogram as a similarity measure between spatially distributed data instances at different time points. The resulting similarity values of the pairwise comparison are used to construct a recurrence plot in order to benefit from established tools of recurrence quantification analysis and recurrence network analysis. In contrast to other recurrence tools for this purpose, the mapogram approach allows a specific focus on different spatial scales, which can be used in a multi-scale analysis of spatio-temporal dynamics. We illustrate this approach by application to mixed dynamics, such as traveling parallel wave fronts with additive noise, as well as to more complicated examples: pseudo-random numbers and coupled map lattices with a semi-logistic mapping rule. The complicated examples in particular show the usefulness of the multi-scale consideration in taking spatial patterns of different scales and with different rhythms into account. This mapogram approach therefore promises new insights into problems in climatology, ecology, and medicine.
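
    A minimal sketch of the recurrence-plot construction follows, with an ordinary snapshot correlation standing in for the mapogram similarity measure (whose definition is not given in this record); the spatio-temporal data are synthetic.

      # Sketch of building a recurrence plot from pairwise similarity of spatial
      # snapshots. A simple correlation stands in for the mapogram similarity
      # measure, which is not specified in this record; all data are synthetic.
      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic spatio-temporal data: a traveling wave plus noise,
      # T time points, each an N x N spatial field.
      T, N = 200, 16
      x = np.arange(N)
      frames = np.array([np.sin(0.5 * x[None, :] + 0.1 * t)
                         + 0.2 * rng.normal(size=(N, N)) for t in range(T)])

      flat = frames.reshape(T, -1)
      flat = (flat - flat.mean(axis=1, keepdims=True)) / flat.std(axis=1, keepdims=True)

      # Pairwise similarity (here: Pearson correlation between snapshots)
      sim = (flat @ flat.T) / flat.shape[1]

      # Recurrence plot: two time points "recur" if similarity exceeds a threshold
      threshold = 0.9
      R = sim > threshold

      # Recurrence rate, a basic recurrence quantification analysis measure
      print("recurrence rate:", R.mean())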

  20. Multiscale recurrence analysis of spatio-temporal data.

    PubMed

    Riedl, M; Marwan, N; Kurths, J

    2015-12-01

    The description and analysis of spatio-temporal dynamics is a crucial task in many scientific disciplines. In this work, we propose a method which uses the mapogram as a similarity measure between spatially distributed data instances at different time points. The resulting similarity values of the pairwise comparison are used to construct a recurrence plot in order to benefit from established tools of recurrence quantification analysis and recurrence network analysis. In contrast to other recurrence tools for this purpose, the mapogram approach allows a specific focus on different spatial scales, which can be used in a multi-scale analysis of spatio-temporal dynamics. We illustrate this approach by application to mixed dynamics, such as traveling parallel wave fronts with additive noise, as well as to more complicated examples: pseudo-random numbers and coupled map lattices with a semi-logistic mapping rule. The complicated examples in particular show the usefulness of the multi-scale consideration in taking spatial patterns of different scales and with different rhythms into account. This mapogram approach therefore promises new insights into problems in climatology, ecology, and medicine.

  1. A concept for holistic whole body MRI data analysis, Imiomics

    PubMed Central

    Malmberg, Filip; Johansson, Lars; Lind, Lars; Sundbom, Magnus; Ahlström, Håkan; Kullberg, Joel

    2017-01-01

    Purpose: To present and evaluate a whole-body image analysis concept, Imiomics (imaging–omics), and an image registration method that enables Imiomics analyses by deforming all image data to a common coordinate system, so that the information in each voxel can be compared between persons or within a person over time and integrated with non-imaging data. Methods: The presented image registration method utilizes relative elasticity constraints of different tissues obtained from whole-body water-fat MRI. The registration method is evaluated by inverse consistency and Dice coefficients, and the Imiomics concept is evaluated by example analyses of importance for metabolic research using non-imaging parameters where we know what to expect. The example analyses include whole-body imaging atlas creation, anomaly detection, and cross-sectional and longitudinal analysis. Results: The image registration method evaluation on 128 subjects shows low inverse consistency errors and high Dice coefficients. Also, the statistical atlas with fat content intensity values shows low standard deviation values, indicating successful deformations to the common coordinate system. The example analyses show expected associations and correlations which agree with explicit measurements, thereby illustrating the usefulness of the proposed Imiomics concept. Conclusions: The registration method is well-suited for Imiomics analyses, which enable analyses of relationships to non-imaging data, e.g. clinical data, in new types of holistic targeted and untargeted big-data analysis. PMID:28241015
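
    The two evaluation measures named above can be sketched as follows; the masks and the one-dimensional deformation are synthetic stand-ins, since the paper's registration method itself is not reproduced here.

      # Sketch of the two registration-evaluation measures named above: Dice
      # overlap of binary masks, and inverse consistency of forward/backward
      # deformations. Data here are synthetic stand-ins.
      import numpy as np

      def dice(a: np.ndarray, b: np.ndarray) -> float:
          """Dice coefficient of two binary masks."""
          a, b = a.astype(bool), b.astype(bool)
          return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

      rng = np.random.default_rng(2)
      mask_fixed = rng.random((64, 64, 64)) > 0.6
      mask_warped = mask_fixed.copy()
      flip = rng.random(mask_fixed.shape) < 0.05   # perturb 5% of voxels
      mask_warped[flip] = ~mask_warped[flip]
      print("Dice:", dice(mask_fixed, mask_warped))

      # Inverse consistency: composing the forward displacement with the
      # backward displacement should return (close to) the identity. 1-D toy:
      x = np.linspace(0.0, 1.0, 101)
      fwd = 0.02 * np.sin(2 * np.pi * x)           # forward displacement field
      bwd = -np.interp(x + fwd, x, fwd)            # approximate inverse
      residual = fwd + np.interp(x + fwd, x, bwd)  # should be ~0 everywhere
      print("mean inverse-consistency error:", np.abs(residual).mean())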

  2. Off-the-shelf Control of Data Analysis Software

    NASA Astrophysics Data System (ADS)

    Wampler, S.

    The Gemini Project must provide convenient access to data analysis facilities to a wide user community. The international nature of this community makes the selection of data analysis software particularly interesting, with staunch advocates of systems such as ADAM and IRAF among the users. Additionally, the continuing trends towards increased use of networked systems and distributed processing impose additional complexity. To meet these needs, the Gemini Project is proposing the novel approach of using low-cost, off-the-shelf software to abstract out both the control and distribution of data analysis from the functionality of the data analysis software. For example, the orthogonal nature of control versus function means that users might select analysis routines from both ADAM and IRAF as appropriate, distributing these routines across a network of machines. It is the belief of the Gemini Project that this approach results in a system that is highly flexible, maintainable, and inexpensive to develop. The Khoros visualization system is presented as an example of control software that is currently available for providing the control and distribution within a data analysis system. The visual programming environment provided with Khoros is also discussed as a means to providing convenient access to this control.

  3. User's Manual and Final Report for Hot-SMAC GUI Development

    NASA Technical Reports Server (NTRS)

    Yarrington, Phil

    2001-01-01

    A new software package called Higher Order Theory-Structural/Micro Analysis Code (HOT-SMAC) has been developed as an effective alternative to the finite element approach for Functionally Graded Material (FGM) modeling. HOT-SMAC is a self-contained package including pre- and post-processing through an intuitive graphical user interface, along with the well-established Higher Order Theory for Functionally Graded Materials (HOTFGM) thermomechanical analysis engine. This document serves as a Getting Started/User's Manual for HOT-SMAC and a final report for its development. First, the features of the software are presented in a simple step-by-step example in which a HOT-SMAC model representing a functionally graded material is created, mechanical and thermal boundary conditions are applied, the model is analyzed, and results are reviewed. In a second step-by-step example, a HOT-SMAC model of an actively cooled metallic channel with a ceramic thermal barrier coating is built and analyzed. HOT-SMAC results from this model are compared to recently published results (NASA/TM-2001-210702) for two grid densities. Finally, a prototype integration of HOT-SMAC with the commercially available HyperSizer(R) structural analysis and sizing software is presented. In this integration, local strain results from HyperSizer's structural analysis are fed to a detailed HOT-SMAC model of the flange-to-facesheet bond region of a stiffened panel. HOT-SMAC is then used to determine the peak shear and peel (normal) stresses between the facesheet and bonded flange of the panel and to characterize the "free edge" effects.

  4. Distributed Evaluation of Local Sensitivity Analysis (DELSA), with application to hydrologic models

    USGS Publications Warehouse

    Rakovec, O.; Hill, Mary C.; Clark, M.P.; Weerts, A. H.; Teuling, A. J.; Uijlenhoet, R.

    2014-01-01

    This paper presents a hybrid local-global sensitivity analysis method termed the Distributed Evaluation of Local Sensitivity Analysis (DELSA), which is used here to identify important and unimportant parameters and evaluate how model parameter importance changes as parameter values change. DELSA uses derivative-based “local” methods to obtain the distribution of parameter sensitivity across the parameter space, which promotes consideration of sensitivity analysis results in the context of simulated dynamics. This work presents DELSA, discusses how it relates to existing methods, and uses two hydrologic test cases to compare its performance with the popular global, variance-based Sobol' method. The first test case is a simple nonlinear reservoir model with two parameters. The second test case involves five alternative “bucket-style” hydrologic models with up to 14 parameters applied to a medium-sized catchment (200 km²) in the Belgian Ardennes. Results show that in both examples, Sobol' and DELSA identify similar important and unimportant parameters, with DELSA enabling more detailed insight at much lower computational cost. For example, in the real-world problem the time delay in runoff is the most important parameter in all models, but DELSA shows that for about 20% of parameter sets it is not important at all and alternative mechanisms and parameters dominate. Moreover, the time delay was identified as important in regions producing poor model fits, whereas other parameters were identified as more important in regions of the parameter space producing better model fits. The ability to understand how parameter importance varies through parameter space is critical to inform decisions about, for example, additional data collection and model development. The ability to perform such analyses with modest computational requirements provides exciting opportunities to evaluate complicated models as well as many alternative models.
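
    A hedged sketch of the DELSA idea, not the authors' exact formulation: at many points sampled across parameter space, finite-difference derivatives are scaled by prior parameter variances and normalized into local first-order sensitivity indices, whose distribution is then examined. The toy model and priors are assumptions.

      # Sketch of a DELSA-style analysis: at many points across parameter space,
      # compute finite-difference derivatives of a model and form normalized
      # first-order sensitivity indices. The toy model is illustrative only.
      import numpy as np

      rng = np.random.default_rng(3)

      def model(theta):
          """Toy nonlinear 'reservoir' model; interaction makes importance vary."""
          k, s = theta
          return s * np.exp(-k) + k * s**2

      lo = np.array([0.1, 0.1])
      hi = np.array([2.0, 1.0])
      var_prior = (hi - lo) ** 2 / 12.0        # variance of uniform priors

      n_points, h = 500, 1e-6
      points = lo + (hi - lo) * rng.random((n_points, 2))

      indices = np.empty((n_points, 2))
      for i, theta in enumerate(points):
          grad = np.empty(2)
          for j in range(2):
              step = np.zeros(2)
              step[j] = h
              grad[j] = (model(theta + step) - model(theta - step)) / (2 * h)
          contrib = grad**2 * var_prior            # first-order variance contributions
          indices[i] = contrib / contrib.sum()     # normalized local sensitivity indices

      # Distribution of importance across the space (DELSA inspects these, not a mean)
      print("parameter k: median index %.2f, 5th-95th pct %.2f to %.2f"
            % (np.median(indices[:, 0]), *np.percentile(indices[:, 0], [5, 95])))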

  5. Quantitative analysis of terahertz spectra for illicit drugs using adaptive-range micro-genetic algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Yi; Ma, Yong; Lu, Zheng; Peng, Bei; Chen, Qin

    2011-08-01

    In the field of anti-illicit drug applications, many suspicious mixture samples might consist of various drug components—for example, a mixture of methamphetamine, heroin, and amoxicillin—which makes spectral identification very difficult. A terahertz spectroscopic quantitative analysis method using an adaptive-range micro-genetic algorithm with a variable internal population (ARVIPɛμGA) has been proposed. Five mixture cases are discussed in this paper using ARVIPɛμGA-driven quantitative terahertz spectroscopic analysis. The simulation results agree with previous experimental results, as well as with results obtained using other experimental and numerical techniques, which suggests that the proposed technique has potential applications for terahertz spectral identification of drug mixture components.

  6. How to perform a cost-effectiveness analysis with surrogate endpoint: renal denervation in patients with resistant hypertension (DENERHTN) trial as an example.

    PubMed

    Bulsei, Julie; Darlington, Meryl; Durand-Zaleski, Isabelle; Azizi, Michel

    2018-04-01

    Whilst much uncertainty exists as to the efficacy of renal denervation (RDN), the positive results of the DENERHTN study in France confirmed the value of an economic evaluation to assess the efficiency of RDN and to inform local decision makers about the costs and benefits of this intervention. The uncertainty surrounding both the outcomes and the costs can be described using health economic methods such as the non-parametric bootstrap. Internationally, numerous health economic studies using a cost-effectiveness model to assess the impact of RDN in terms of cost and effectiveness compared to antihypertensive medical treatment have been conducted. The DENERHTN cost-effectiveness study was the first health economic evaluation specifically designed to assess the cost-effectiveness of RDN using individual data. Using the DENERHTN results as an example, we provide here a summary of the principal methods used to perform a cost-effectiveness analysis.
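
    The non-parametric bootstrap mentioned above can be sketched as follows, on synthetic patient-level data rather than the DENERHTN data: resample (cost, effect) pairs within each arm, then summarize incremental costs and effects and the probability of cost-effectiveness at a willingness-to-pay threshold.

      # Sketch of a non-parametric bootstrap for a cost-effectiveness analysis:
      # resample patient-level (cost, effect) pairs in each arm, then summarize
      # incremental costs and effects. All data below are synthetic, not DENERHTN.
      import numpy as np

      rng = np.random.default_rng(4)

      n = 100
      # Synthetic patient-level data: costs [euros] and effects (e.g., mmHg drop)
      cost_rdn = rng.gamma(shape=4.0, scale=2500.0, size=n)
      eff_rdn = rng.normal(10.0, 6.0, size=n)
      cost_med = rng.gamma(shape=4.0, scale=1500.0, size=n)
      eff_med = rng.normal(6.0, 6.0, size=n)

      B, wtp = 5000, 1000.0     # bootstrap replicates; willingness to pay per unit effect
      d_cost = np.empty(B)
      d_eff = np.empty(B)
      for b in range(B):
          i = rng.integers(0, n, n)              # resample with replacement, per arm
          j = rng.integers(0, n, n)
          d_cost[b] = cost_rdn[i].mean() - cost_med[j].mean()
          d_eff[b] = eff_rdn[i].mean() - eff_med[j].mean()

      icer = d_cost.mean() / d_eff.mean()
      # Net monetary benefit handles replicates where the effect difference is ~0
      nmb = wtp * d_eff - d_cost
      print(f"ICER (point estimate): {icer:.0f} per unit effect")
      print(f"P(cost-effective at WTP={wtp:.0f}): {(nmb > 0).mean():.2f}")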

  7. Study on the Spectral Mixing Model for Mineral Pigments Based on Derivative of Ratio Spectroscopy-Take Vermilion and Stone Yellow for Example

    NASA Astrophysics Data System (ADS)

    Zhao, H.; Hao, Y.; Liu, X.; Hou, M.; Zhao, X.

    2018-04-01

    Hyperspectral remote sensing is a completely non-invasive technology for the measurement of cultural relics, and has been successfully applied in the identification and analysis of pigments in Chinese historical paintings. Although mixed pigments are very common in Chinese historical paintings, the quantitative analysis of pigment mixtures in ancient paintings remains unsolved. In this research, we took two typical mineral pigments, vermilion and stone yellow, as examples, made precisely mixed samples of these two pigments, and measured their spectra in the laboratory. For the mixed spectra, both the fully constrained least squares (FCLS) method and derivative of ratio spectroscopy (DRS) were applied. Experimental results showed that the mixed spectra of vermilion and stone yellow had strongly nonlinear mixing characteristics, but at some bands linear unmixing could also achieve satisfactory results. DRS using strongly linear bands can reach much higher accuracy than FCLS using full bands.
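
    A minimal sketch of the FCLS step follows, using the common device of enforcing the sum-to-one constraint through a heavily weighted extra row passed to a non-negative least squares solver; the endmember spectra are synthetic stand-ins for measured vermilion and stone yellow reflectances.

      # Sketch of fully constrained least squares (FCLS) unmixing: non-negative
      # abundances that sum to one, via NNLS on a system augmented with a heavily
      # weighted sum-to-one row. Spectra below are synthetic stand-ins.
      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(5)

      bands = np.linspace(400, 2500, 200)                         # wavelengths [nm]
      vermilion = 0.1 + 0.8 / (1 + np.exp(-(bands - 600) / 30))   # step-like red edge
      stone_yellow = 0.1 + 0.7 / (1 + np.exp(-(bands - 500) / 40))
      E = np.column_stack([vermilion, stone_yellow])              # endmember matrix

      true_abund = np.array([0.7, 0.3])
      mixed = E @ true_abund + 0.005 * rng.normal(size=bands.size)

      delta = 1e3                                            # weight on sum-to-one
      E_aug = np.vstack([E, delta * np.ones((1, 2))])
      y_aug = np.append(mixed, delta * 1.0)

      abund, _ = nnls(E_aug, y_aug)
      print("estimated abundances:", np.round(abund, 3))     # ~ [0.7, 0.3]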

  8. Delamination Modeling of Composites for Improved Crash Analysis

    NASA Technical Reports Server (NTRS)

    Fleming, David C.

    1999-01-01

    Finite element crash modeling of composite structures is limited by the inability of current commercial crash codes to accurately model delamination growth. Efforts are made to implement and assess delamination modeling techniques using a current finite element crash code, MSC/DYTRAN. Three methods are evaluated: a straightforward method based on monitoring forces in elements or constraints representing an interface; a cohesive fracture model proposed in the literature; and the virtual crack closure technique commonly used in fracture mechanics. Results are compared with dynamic double cantilever beam test data from the literature. Examples show that it is possible to accurately model delamination propagation in this case. However, the computational demands required for an accurate solution are great, and reliable property data may not be available to support general crash modeling efforts. Additional examples are modeled, including an impact-loaded beam, damage initiation in laminated crushing specimens, and a scaled aircraft subfloor structure in which composite sandwich structures are used as energy-absorbing elements. These examples illustrate some of the difficulties in modeling delamination as part of a finite element crash analysis.
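
    For orientation, the one-step VCCT estimate referred to above reduces, for mode I, to a simple formula in the crack-tip nodal force and the opening displacement behind the tip; the numbers below are illustrative, not from the crash models discussed.

      # Sketch of the one-step virtual crack closure technique (VCCT) for mode I:
      # G_I is estimated from the crack-tip nodal force and the relative opening
      # displacement one element behind the tip. Values below are illustrative.
      F_y = 120.0        # nodal force at the crack tip, normal to crack plane [N]
      dv = 2.0e-4        # relative opening displacement one node behind the tip [m]
      da = 1.0e-3        # element length along the crack front (crack advance) [m]
      b = 25.0e-3        # width associated with the node along the front [m]

      G_I = F_y * dv / (2.0 * b * da)   # mode I energy release rate [J/m^2]
      print(f"G_I = {G_I:.1f} J/m^2")

      # In a propagation analysis, G_I (plus mode II/III terms) is compared with
      # the fracture toughness G_c; the crack-tip node is released when G >= G_c.
      G_c = 0.3e3        # illustrative toughness [J/m^2]
      print("release node:", G_I >= G_c)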

  9. Meta-Analysis in Higher Education: An Illustrative Example Using Hierarchical Linear Modeling

    ERIC Educational Resources Information Center

    Denson, Nida; Seltzer, Michael H.

    2011-01-01

    The purpose of this article is to provide higher education researchers with an illustrative example of meta-analysis utilizing hierarchical linear modeling (HLM). This article demonstrates the step-by-step process of meta-analysis using a recently-published study examining the effects of curricular and co-curricular diversity activities on racial…

  10. Stress Analysis of Columns and Beam Columns by the Photoelastic Method

    NASA Technical Reports Server (NTRS)

    Ruffner, B F

    1946-01-01

    Principles of similarity and other factors in the design of models for photoelastic testing are discussed. Some approximate theoretical equations, useful in the analysis of results obtained from photoelastic tests, are derived. Examples of the use of photoelastic techniques and the analysis of results, as applied to uniform and tapered beam columns, circular rings, and statically indeterminate frames, are given. It is concluded that this method is an effective tool for the analysis of structures in which column action is present, particularly in tapered beam columns and in statically indeterminate structures in which the distribution of loads is influenced by bending moments due to axial loads in one or more members.

  11. African Primary Care Research: Qualitative data analysis and writing results

    PubMed Central

    Govender, Indiran; Ogunbanjo, Gboyega A.; Mash, Bob

    2014-01-01

    This article is part of a series on African primary care research and gives practical guidance on qualitative data analysis and the presentation of qualitative findings. After an overview of qualitative methods and analytical approaches, the article focuses particularly on content analysis, using the framework method as an example. The steps of familiarisation, creating a thematic index, indexing, charting, interpretation and confirmation are described. Key concepts with regard to establishing the quality and trustworthiness of data analysis are described. Finally, an approach to the presentation of qualitative findings is given. PMID:26245437

  12. African Primary Care Research: qualitative data analysis and writing results.

    PubMed

    Mabuza, Langalibalele H; Govender, Indiran; Ogunbanjo, Gboyega A; Mash, Bob

    2014-06-05

    This article is part of a series on African primary care research and gives practical guidance on qualitative data analysis and the presentation of qualitative findings. After an overview of qualitative methods and analytical approaches, the article focuses particularly on content analysis, using the framework method as an example. The steps of familiarisation, creating a thematic index, indexing, charting, interpretation and confirmation are described. Key concepts with regard to establishing the quality and trustworthiness of data analysis are described. Finally, an approach to the presentation of qualitative findings is given.

  13. A New View of Earthquake Ground Motion Data: The Hilbert Spectral Analysis

    NASA Technical Reports Server (NTRS)

    Huang, Norden; Busalacchi, Antonio J. (Technical Monitor)

    2000-01-01

    A brief description of the newly developed Empirical Mode Decomposition (EMD) and Hilbert Spectral Analysis (HSA) method is given. The decomposition is adaptive and can be applied to both nonlinear and nonstationary data. An example of the method applied to a sample earthquake record is given. The results indicate that low frequency components, totally missed by Fourier analysis, are clearly identified by the new method. Comparisons with wavelet and windowed Fourier analysis show that the new method offers much better temporal and frequency resolution.
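
    The Hilbert half of the EMD/HSA pair can be sketched as follows: given a single oscillatory component (in practice an intrinsic mode function from EMD, whose sifting loop is omitted here), the analytic signal yields instantaneous amplitude and frequency, which is what lets nonstationary low-frequency content appear directly.

      # Sketch of the Hilbert part of Hilbert Spectral Analysis: given one
      # oscillatory component (an IMF from EMD, whose sifting loop is omitted
      # here), the analytic signal yields instantaneous amplitude and frequency.
      import numpy as np
      from scipy.signal import hilbert

      fs = 100.0                          # sampling rate [Hz]
      t = np.arange(0.0, 10.0, 1.0 / fs)

      # A chirp-like component standing in for one intrinsic mode function
      imf = np.cos(2 * np.pi * (1.0 * t + 0.2 * t**2))

      analytic = hilbert(imf)
      amplitude = np.abs(analytic)                         # instantaneous amplitude
      phase = np.unwrap(np.angle(analytic))
      inst_freq = np.gradient(phase) * fs / (2 * np.pi)    # instantaneous freq. [Hz]

      # The Hilbert spectrum is amplitude as a function of (time, inst_freq);
      # nonstationary frequency content shows up directly, unlike in Fourier.
      print("frequency at t=1 s: %.2f Hz" % inst_freq[int(1.0 * fs)])
      print("frequency at t=9 s: %.2f Hz" % inst_freq[int(9.0 * fs)])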

  14. Extra high speed modified Lundell alternator parameters and open/short-circuit characteristics from global 3D-FE magnetic field solutions

    NASA Astrophysics Data System (ADS)

    Wang, R.; Demerdash, N. A.

    1992-06-01

    The combined magnetic vector potential - magnetic scalar potential method of computation of 3D magnetic fields by finite elements, introduced in a companion paper, is used for global 3D field analysis and machine performance computations under open-circuit and short-circuit conditions for an example 14.3 kVA modified Lundell alternator, whose magnetic field is of intrinsic 3D nature. The computed voltages and currents under these machine test conditions were verified and found to be in very good agreement with corresponding test data. Results of using this modelling and computation method in the study of a design alteration example, in which the stator stack length of the example alternator is stretched in order to increase the voltage and volt-ampere rating, are given here. These results demonstrate the inadequacy of conventional 2D-based design concepts and the necessity of using this type of 3D magnetic field modelling in the design and investigation of such machines.

  15. Extra high speed modified Lundell alternator parameters and open/short-circuit characteristics from global 3D-FE magnetic field solutions

    NASA Technical Reports Server (NTRS)

    Wang, R.; Demerdash, N. A.

    1992-01-01

    The combined magnetic vector potential - magnetic scalar potential method of computation of 3D magnetic fields by finite elements, introduced in a companion paper, is used for global 3D field analysis and machine performance computations under open-circuit and short-circuit conditions for an example 14.3 kVA modified Lundell alternator, whose magnetic field is of intrinsic 3D nature. The computed voltages and currents under these machine test conditions were verified and found to be in very good agreement with corresponding test data. Results of using this modelling and computation method in the study of a design alteration example, in which the stator stack length of the example alternator is stretched in order to increase the voltage and volt-ampere rating, are given here. These results demonstrate the inadequacy of conventional 2D-based design concepts and the necessity of using this type of 3D magnetic field modelling in the design and investigation of such machines.

  16. Time-dependent limited penetrable visibility graph analysis of nonstationary time series

    NASA Astrophysics Data System (ADS)

    Gao, Zhong-Ke; Cai, Qing; Yang, Yu-Xuan; Dang, Wei-Dong

    2017-06-01

    Recent years have witnessed the development of visibility graph theory, which allows us to analyze a time series from the perspective of a complex network. In this paper we develop a novel time-dependent limited penetrable visibility graph (TDLPVG). Two examples using nonstationary time series from RR intervals and gas-liquid flows are provided to demonstrate the effectiveness of our approach. The results of the first example suggest that our TDLPVG method allows characterizing the time-varying behaviors and classifying the heart states of healthy subjects, congestive heart failure and atrial fibrillation from RR interval time series. For the second example, we infer TDLPVGs from gas-liquid flow signals and, interestingly, find that the deviation of the node degree of TDLPVGs effectively uncovers the time-varying dynamical flow behaviors of gas-liquid slug and bubble flow patterns. All these results render our TDLPVG method particularly powerful for characterizing the time-varying features underlying realistic complex systems from time series.
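
    A sketch of the underlying limited penetrable visibility criterion follows: two samples are linked if at most L intermediate samples block the line of sight between them. The time-dependent variant applies this within sliding windows; only a single window of synthetic data is shown, and the parameter choices are assumptions.

      # Sketch of a limited penetrable visibility graph (LPVG): nodes i < j are
      # linked if at most L intermediate samples block the line of sight between
      # (i, x_i) and (j, x_j). The time-dependent variant applies this within
      # sliding windows; a single window is shown. Data are synthetic.
      import numpy as np

      def lpvg_edges(x, L=1):
          n = len(x)
          edges = []
          for i in range(n - 1):
              for j in range(i + 1, n):
                  blocked = 0
                  for k in range(i + 1, j):
                      # Height of the sight line at intermediate index k
                      line = x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                      if x[k] >= line:
                          blocked += 1
                  if blocked <= L:
                      edges.append((i, j))
          return edges

      rng = np.random.default_rng(6)
      series = rng.normal(size=50)          # stand-in for an RR-interval window

      edges = lpvg_edges(series, L=1)
      degree = np.zeros(len(series))
      for i, j in edges:
          degree[i] += 1
          degree[j] += 1
      print("mean degree: %.2f, degree deviation: %.2f" % (degree.mean(), degree.std()))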

  17. Computer code for off-design performance analysis of radial-inflow turbines with rotor blade sweep

    NASA Technical Reports Server (NTRS)

    Meitner, P. L.; Glassman, A. J.

    1983-01-01

    The analysis procedure of an existing computer program was extended to include rotor blade sweep, to model the flow more accurately at the rotor exit, and to provide more detail to the loss model. The modeling changes are described and all analysis equations and procedures are presented. Program input and output are described and are illustrated by an example problem. Results obtained from this program and from a previous program are compared with experimental data.

  18. Contact stress analysis of spiral bevel gears using nonlinear finite element static analysis

    NASA Technical Reports Server (NTRS)

    Bibel, G. D.; Kumar, A.; Reddy, S.; Handschuh, R.

    1993-01-01

    A procedure is presented for performing three-dimensional stress analysis of spiral bevel gears in mesh using the finite element method. The procedure involves generating a finite element model by solving equations that identify tooth surface coordinates. Coordinate transformations are used to orient the gear and pinion for gear meshing. Contact boundary conditions are simulated with gap elements. A solution technique for correct orientation of the gap elements is given. Example models and results are presented.

  19. Notes on Piezoelectricity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Redondo, Antonio

    These notes provide a pedagogical discussion of the physics of piezoelectricity. The exposition starts with a brief analysis of the classical (continuum) theory of piezoelectric phenomena in solids. The main subject of the notes is, however, a quantum mechanical analysis. We first derive the Fröhlich Hamiltonian as part of the description of the electron-phonon interaction. The results of this analysis are then employed to derive the equations of piezoelectricity. A couple of examples with the zinc blende and wurtzite structures are presented at the end.

  20. A method for the analysis of nonlinearities in aircraft dynamic response to atmospheric turbulence

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1976-01-01

    An analytical method is developed which combines the equivalent linearization technique for the analysis of the response of nonlinear dynamic systems with the amplitude modulated random process (Press model) for atmospheric turbulence. The method is initially applied to a bilinear spring system. The analysis of the response shows good agreement with exact results obtained by the Fokker-Planck equation. The method is then applied to an example of control-surface displacement limiting in an aircraft with a pitch-hold autopilot.
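
    The equivalent linearization step can be sketched on a toy problem: for a bilinear spring under Gaussian loading, the mean-square-optimal stiffness is k_eq = E[x f(x)] / E[x^2], iterated together with the response variance. The quasi-static single-spring closure below (sigma_x = sigma_F / k_eq) is an illustrative simplification, not the paper's aircraft model.

      # Toy sketch of equivalent (statistical) linearization for a bilinear
      # spring: the mean-square-optimal stiffness is k_eq = E[x f(x)] / E[x^2]
      # under a Gaussian response, iterated with the response variance. A
      # quasi-static single spring is assumed, so sigma_x = sigma_F / k_eq.
      import numpy as np
      from scipy.integrate import quad

      k1, k2, d = 1.0e4, 2.5e3, 0.01   # initial stiffness, post-knee stiffness, knee

      def f(x):
          """Bilinear (softening) restoring force, symmetric about the origin."""
          ax = abs(x)
          core = k1 * ax if ax <= d else k1 * d + k2 * (ax - d)
          return np.sign(x) * core

      def k_equivalent(sigma_x):
          """E[x f(x)] / E[x^2] for x ~ N(0, sigma_x^2)."""
          def pdf(x):
              return np.exp(-0.5 * (x / sigma_x) ** 2) / (sigma_x * np.sqrt(2 * np.pi))
          num, _ = quad(lambda x: x * f(x) * pdf(x), -8 * sigma_x, 8 * sigma_x)
          return num / sigma_x**2

      sigma_F = 150.0                  # std. dev. of the Gaussian force [N]
      k_eq = k1                        # start from the initial stiffness
      for _ in range(50):              # fixed-point iteration
          sigma_x = sigma_F / k_eq
          k_eq = k_equivalent(sigma_x)

      print(f"equivalent stiffness: {k_eq:.0f} N/m (initial {k1:.0f} N/m)")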

  1. Numerical integration of asymptotic solutions of ordinary differential equations

    NASA Technical Reports Server (NTRS)

    Thurston, Gaylen A.

    1989-01-01

    Classical asymptotic analysis of ordinary differential equations derives approximate solutions that are numerically stable. However, the analysis also leads to tedious expansions in powers of the relevant parameter for a particular problem. The expansions are replaced with integrals that can be evaluated by numerical integration. The resulting numerical solutions retain the linear independence that is the main advantage of asymptotic solutions. Examples, including the Falkner-Skan equation from laminar boundary layer theory, illustrate the method of asymptotic analysis with numerical integration.

  2. [Regional differences in the development of hospitalizations: An effect of different demographic trends?]

    PubMed

    Nowossadeck, Enno; Prütz, Franziska

    2018-03-01

    Population aging and population decline in many regions of the Federal Republic of Germany are key elements of demographic change. In the regions concerned there is a rising number of older people and, simultaneously, a declining population. So far, the consequences of regional shrinkage and growth for inpatient care do not seem to have been analysed very well. This paper analyses the influence of population aging and declining/increasing population (demographic factors) as well as other, non-demographic factors on the number of hospitalizations in Germany and the Federal States since 2000. One result of the analysis is that there are major differences between the Federal States. The analysis shows, for example, an increase in hospitalizations in Berlin, while in Saxony-Anhalt the number of hospitalizations declines. The increase in Berlin was the result of population aging and, to a lesser extent, an increase in population. In Saxony-Anhalt the declining population resulted in a decreasing number of hospitalizations; population aging and non-demographic factors were not able to compensate for this trend. Overall, the effect of demographic factors on the number of hospitalizations remains constant over time. Short-term changes in hospitalizations are due to non-demographic factors, such as epidemiological trends (for example, trends in incidence or prevalence) or structural changes in health care services (for example, patients shifting between different sectors of health care or the introduction of new reimbursement systems).

  3. Putting the 1991 census sample of anonymised records on your Unix workstation.

    PubMed

    Turton, I; Openshaw, S

    1995-03-01

    "The authors describe the development of a customised computer software package for easing the analysis of the U.K. 1991 Sample of Anonymised Records. The resulting USAR [Unix Sample of Anonymised Records] package is designed to be portable within the Unix environment. It offers a number of features such as interactive table design, intelligent data interpretation, and fuzzy query. An example of SAR analysis is provided." excerpt

  4. Digital model of a vacuum circuit breaker for the analysis of switching waveforms in electrical circuits

    NASA Astrophysics Data System (ADS)

    Budzisz, Joanna; Wróblewski, Zbigniew

    2016-03-01

    The article presents a method of modelling a vacuum circuit breaker in the ATP/EMTP package, the results of verifying the correct operation of the developed digital circuit breaker model and its practical usefulness for the analysis of overvoltages and overcurrents occurring in commutated capacitive electrical circuits, and examples of digital simulations of overvoltages and overcurrents in selected electrical circuits.

  5. Methods for the Compilation of a Core List of Journals in Toxicology.

    ERIC Educational Resources Information Center

    Kuch, T. D. C.

    Previously reported methods for the compilation of core lists of journals in multidisciplinary areas are first examined, with toxicology used as an example of such an area. Three approaches to the compilation of a core list of journals in toxicology were undertaken and the results analyzed with the aid of models. Analysis of the results of the…

  6. An iterative transformation procedure for numerical solution of flutter and similar characteristic-value problems

    NASA Technical Reports Server (NTRS)

    Gossard, Myron L

    1952-01-01

    An iterative transformation procedure suggested by H. Wielandt for numerical solution of flutter and similar characteristic-value problems is presented. Application of this procedure to ordinary natural-vibration problems and to flutter problems is shown by numerical examples. Comparisons of computed results with experimental values and with results obtained by other methods of analysis are made.

  7. Using Framework Analysis in nursing research: a worked example.

    PubMed

    Ward, Deborah J; Furber, Christine; Tierney, Stephanie; Swallow, Veronica

    2013-11-01

    To demonstrate Framework Analysis using a worked example and to illustrate how criticisms of qualitative data analysis including issues of clarity and transparency can be addressed. Critics of the analysis of qualitative data sometimes cite lack of clarity and transparency about analytical procedures; this can deter nurse researchers from undertaking qualitative studies. Framework Analysis is flexible, systematic, and rigorous, offering clarity, transparency, an audit trail, an option for theme-based and case-based analysis and for readily retrievable data. This paper offers further explanation of the process undertaken which is illustrated with a worked example. Data were collected from 31 nursing students in 2009 using semi-structured interviews. The data collected are not reported directly here but used as a worked example for the five steps of Framework Analysis. Suggestions are provided to guide researchers through essential steps in undertaking Framework Analysis. The benefits and limitations of Framework Analysis are discussed. Nurses increasingly use qualitative research methods and need to use an analysis approach that offers transparency and rigour which Framework Analysis can provide. Nurse researchers may find the detailed critique of Framework Analysis presented in this paper a useful resource when designing and conducting qualitative studies. Qualitative data analysis presents challenges in relation to the volume and complexity of data obtained and the need to present an 'audit trail' for those using the research findings. Framework Analysis is an appropriate, rigorous and systematic method for undertaking qualitative analysis. © 2013 Blackwell Publishing Ltd.

  8. Using qualitative comparative analysis (QCA) in systematic reviews of complex interventions: a worked example.

    PubMed

    Thomas, James; O'Mara-Eves, Alison; Brunton, Ginny

    2014-06-20

    Systematic reviews that address policy and practice questions in relation to complex interventions frequently need not only to assess the efficacy of a given intervention but to identify which intervention - and which intervention components - might be most effective in particular situations. Here, intervention replication is rare, and commonly used synthesis methods are less useful when the focus of analysis is the identification of those components of an intervention that are critical to its success. Having identified initial theories of change in a previous analysis, we explore the potential of qualitative comparative analysis (QCA) to assist with complex syntheses through a worked example. Developed originally in the area of political science and historical sociology, a QCA aims to identify those configurations of participant, intervention and contextual characteristics that may be associated with a given outcome. Analysing studies in these terms facilitates the identification of necessary and sufficient conditions for the outcome to be obtained. Since QCA is predicated on the assumption that multiple pathways might lead to the same outcome and does not assume a linear additive model in terms of changes to a particular condition (that is, it can cope with 'tipping points' in complex interventions), it appears not to suffer from some of the limitations of the statistical methods often used in meta-analysis. The worked example shows how the QCA reveals that our initial theories of change were unable to distinguish between 'effective' and 'highly effective' interventions. Through the iterative QCA process, other intervention characteristics are identified that better explain the observed results. QCA is a promising alternative (or adjunct), particularly to the standard fall-back of a 'narrative synthesis' when a quantitative synthesis is impossible, and should be considered when reviews are broad and heterogeneity is significant. There are very few examples of its use with systematic review data at present, and further methodological work is needed to establish optimal conditions for its use and to document process, practice, and reporting standards.
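
    The core QCA computation can be sketched with the standard fuzzy-set consistency and coverage measures for a candidate sufficient configuration; the membership scores below are synthetic, and real reviews would calibrate them from the included studies.

      # Sketch of the standard fuzzy-set QCA consistency and coverage measures
      # for "configuration X is sufficient for outcome Y". Membership scores
      # below are synthetic; real analyses calibrate them from study data.
      import numpy as np

      rng = np.random.default_rng(7)
      n_studies = 20

      # Fuzzy membership of each study in two conditions and in the outcome
      tailoring = rng.random(n_studies)       # e.g., intervention was tailored
      intensity = rng.random(n_studies)       # e.g., intervention was intensive
      outcome = np.minimum(tailoring, intensity) * 0.9 + 0.1 * rng.random(n_studies)

      # Configuration: tailored AND intensive (fuzzy AND = minimum)
      X = np.minimum(tailoring, intensity)

      consistency = np.minimum(X, outcome).sum() / X.sum()
      coverage = np.minimum(X, outcome).sum() / outcome.sum()
      print(f"consistency: {consistency:.2f}  (values near 1 support sufficiency)")
      print(f"coverage:    {coverage:.2f}  (how much of the outcome X explains)")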

  9. Control system design and analysis using the INteractive Controls Analysis (INCA) program

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Downing, John P.

    1987-01-01

    The INteractive Controls Analysis (INCA) program was developed at the Goddard Space Flight Center to provide a user-friendly, efficient environment for the design and analysis of linear control systems. Since its inception, INCA has found extensive use in the design, development, and analysis of control systems for spacecraft, instruments, robotics, and pointing systems. Moreover, the results of the analytic tools embedded in INCA have been flight proven with at least three currently orbiting spacecraft. This paper describes the INCA program and illustrates, using a flight-proven example, how the package can perform complex design analyses with relative ease.

  10. Optical performance assessment under environmental and mechanical perturbations in large, deployable telescopes

    NASA Astrophysics Data System (ADS)

    Folley, Christopher; Bronowicki, Allen

    2005-09-01

    Prediction of optical performance for large, deployable telescopes under environmental conditions and mechanical disturbances is a crucial part of the design verification process of such instruments for all phases of design and operation: ground testing, commissioning, and on-orbit operation. A Structural-Thermal-Optical-Performance (STOP) analysis methodology is often created that integrates the output of one analysis with the input of another. The integration of thermal environment predictions with structural models is relatively well understood, while the integration of structural deformation results into optical analysis/design software is less straightforward. A Matlab toolbox has been created that effectively integrates the predictions of mechanical deformations of optical elements generated by, for example, finite element analysis, and computes optical path differences for the distorted prescription. The engine of the toolbox is a real ray-tracing algorithm that allows the optical surfaces to be defined in a single, global coordinate system, thereby allowing automatic alignment of the mechanical coordinate system with the optical coordinate system. Therefore, the physical location of the optical surfaces is identical in the optical prescription and the finite element model. The application of rigid body displacements to optical surfaces, however, is more general than for use solely in STOP analysis, such as the analysis of misalignments during the commissioning process. Furthermore, all the functionality of Matlab is available for optimization and control. Since this is a new tool for use on flight programs, it has been verified against CODE V. The toolbox's functionality to date is described, verification results are presented, and, as an example of its utility, results of a thermal distortion analysis are presented using the James Webb Space Telescope (JWST) prescription.

  11. The National Program of Educational Laboratories. Final Report.

    ERIC Educational Resources Information Center

    Chase, Francis S.

    This report presents results of a critical analysis of 20 regional educational laboratories and nine university research and development centers established under ESEA Title IV. Observations, supported by specific examples, are made concerning the laboratories and centers and deal with their roles, programs definitions, impact on educational…

  12. Application of NASTRAN/COSMIC in the analysis of ship structures to underwater explosion shock

    NASA Technical Reports Server (NTRS)

    Fallon, D. J.; Costanzo, F. A.; Handleton, R. T.; Camp, G. C.; Smith, D. C.

    1987-01-01

    The application of NASTRAN/COSMIC in predicting the transient motion of ship structures to underwater, non-contact explosions is discussed. Examples illustrate the finite element models, mathematical formulations of loading functions and, where available, comparisons between analytical and experimental results.

  13. Probabilistic structural analysis using a general purpose finite element program

    NASA Astrophysics Data System (ADS)

    Riha, D. S.; Millwater, H. R.; Thacker, B. H.

    1992-07-01

    This paper presents an accurate and efficient method to predict the probabilistic response of structural response quantities, such as stress, displacement, natural frequencies, and buckling loads, by combining the capabilities of MSC/NASTRAN, including design sensitivity analysis, with fast probability integration. Two probabilistic structural analysis examples have been performed and verified by comparison with Monte Carlo simulation of the analytical solution. The first example consists of a cantilevered plate with several point loads. The second example is a probabilistic buckling analysis of a simply supported composite plate under in-plane loading. The coupling of MSC/NASTRAN and fast probability integration is shown to be orders of magnitude more efficient than Monte Carlo simulation, with excellent accuracy.
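
    The Monte Carlo verification step can be sketched on a toy structure with a closed-form response; the cantilever, load, and material distributions below are illustrative assumptions, not the paper's plate examples, and a fast probability integration method would target the same tail probability at far lower cost.

      # Toy Monte Carlo check of a probabilistic structural response, in the
      # spirit of verifying fast probability integration against simulation:
      # tip deflection of a cantilever with random load and modulus. Values
      # are illustrative, not from the paper's plate examples.
      import numpy as np

      rng = np.random.default_rng(8)
      N = 200_000

      L, I = 1.5, 8.0e-6                       # length [m], second moment of area [m^4]
      P = rng.normal(10e3, 1.5e3, N)           # tip load [N]
      E = rng.lognormal(np.log(70e9), 0.05, N) # Young's modulus [Pa]

      delta = P * L**3 / (3.0 * E * I)         # closed-form tip deflection [m]

      limit = 0.025                            # allowable deflection [m]
      p_fail = (delta > limit).mean()
      print(f"mean deflection: {delta.mean()*1e3:.1f} mm")
      print(f"P(deflection > {limit*1e3:.0f} mm): {p_fail:.4f}")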

  14. Anticipating Forest and Range Land Development in Central Oregon (USA) for Landscape Analysis, with an Example Application Involving Mule Deer

    NASA Astrophysics Data System (ADS)

    Kline, Jeffrey D.; Moses, Alissa; Burcsu, Theresa

    2010-05-01

    Forest policymakers, public lands managers, and scientists in the Pacific Northwest (USA) seek ways to evaluate the landscape-level effects of policies and management through the multidisciplinary development and application of spatially explicit methods and models. The Interagency Mapping and Analysis Project (IMAP) is an ongoing effort to generate landscape-wide vegetation data and models to evaluate the integrated effects of disturbances and management activities on natural resource conditions in Oregon and Washington (USA). In this initial analysis, we characterized the spatial distribution of forest and range land development in a four-county pilot study region in central Oregon. The empirical model describes the spatial distribution of buildings and new building construction as a function of population growth, existing development, topography, land-use zoning, and other factors. We used the model to create geographic information system maps of likely future development based on human population projections to inform complementary landscape analyses underway involving vegetation, habitat, and wildfire interactions. In an example application, we use the model and resulting maps to show the potential impacts of future forest and range land development on mule deer (Odocoileus hemionus) winter range. Results indicate significant development encroachment and habitat loss already in 2000, with development located along key migration routes and increasing through the projection period to 2040. The example application illustrates a simple way for policymakers and public lands managers to combine existing data and preliminary model outputs to begin to consider the potential effects of development on future landscape conditions.

  15. Advancing our thinking in presence-only and used-available analysis.

    PubMed

    Warton, David; Aarts, Geert

    2013-11-01

    1. The problems of analysing used-available data and presence-only data are equivalent, and this paper uses this equivalence as a platform for exploring opportunities for advancing analysis methodology. 2. We suggest some potential methodological advances in used-available analysis, made possible via lessons learnt in the presence-only literature, for example, using modern methods to improve predictive performance. We also consider the converse - potential advances in presence-only analysis inspired by used-available methodology. 3. Notwithstanding these potential advances in methodology, perhaps a greater opportunity is in advancing our thinking about how to apply a given method to a particular data set. 4. It is shown by example that strikingly different results can be achieved for a single data set by applying a given method of analysis in different ways - hence having chosen a method of analysis, the next step of working out how to apply it is critical to performance. 5. We review some key issues to consider in deciding how to apply an analysis method: apply the method in a manner that reflects the study design; consider data properties; and use diagnostic tools to assess how reasonable a given analysis is for the data at hand. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.

  16. Poisson Regression Analysis of Illness and Injury Surveillance Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frome E.L., Watkins J.P., Ellis E.D.

    2012-12-12

    The Department of Energy (DOE) uses illness and injury surveillance to monitor morbidity and assess the overall health of the work force. Data collected from each participating site include health events and a roster file with demographic information. The source data files are maintained in a relational database, and are used to obtain stratified tables of health event counts and person-time at risk that serve as the starting point for Poisson regression analysis. The explanatory variables that define these tables are age, gender, occupational group, and time. Typical response variables of interest are counts, such as the number of absences due to illness or injury. Poisson regression methods are used to describe the effect of the explanatory variables on the health event rates using a log-linear main effects model. Results of fitting the main effects model are summarized in tabular and graphical form, and interpretation of the model parameters is provided. An analysis of deviance table is used to evaluate the importance of each of the explanatory variables on the event rate of interest and to determine if interaction terms should be considered in the analysis. Although Poisson regression methods are widely used in the analysis of count data, there are situations in which over-dispersion occurs. This could be due to lack-of-fit of the regression model, extra-Poisson variation, or both. A score test statistic and regression diagnostics are used to identify over-dispersion. A quasi-likelihood method-of-moments procedure is used to evaluate and adjust for extra-Poisson variation when necessary. Two examples are presented using respiratory disease absence rates at two DOE sites to illustrate the methods and interpretation of the results. In the first example the Poisson main effects model is adequate. In the second example the score test indicates considerable over-dispersion, and a more detailed analysis attributes the over-dispersion to extra-Poisson variation. The R open source software environment for statistical computing and graphics is used for the analysis. Additional details about R and the data used in this report are provided in an appendix, along with information on how to obtain R and utility functions that can be used to duplicate the results in this report.
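
    A sketch of the modeling steps follows, in Python rather than the R environment the report uses: a Poisson log-linear model of event counts with log person-time as an offset, followed by a simple dispersion check. The data are synthetic.

      # Sketch of the modeling steps described above, in Python rather than the
      # R environment the report uses: a Poisson log-linear model of event counts
      # with person-time at risk as an offset, plus a simple over-dispersion check.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(9)
      n = 300

      age_group = rng.integers(0, 4, n)               # stratification variables
      female = rng.integers(0, 2, n)
      person_years = rng.uniform(50.0, 500.0, n)

      # Synthetic absence counts with a known rate structure
      rate = 0.02 * np.exp(0.3 * age_group + 0.1 * female)
      counts = rng.poisson(rate * person_years)

      X = sm.add_constant(np.column_stack([age_group, female]))
      model = sm.GLM(counts, X, family=sm.families.Poisson(),
                     offset=np.log(person_years)).fit()
      print(model.summary())

      # Over-dispersion check: Pearson chi-square / residual df should be ~1 for
      # a well-fitting Poisson model; values well above 1 suggest extra variation.
      phi = model.pearson_chi2 / model.df_resid
      print(f"dispersion estimate: {phi:.2f}")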

  17. Testing for Questionable Research Practices in a Meta-Analysis: An Example from Experimental Parapsychology.

    PubMed

    Bierman, Dick J; Spottiswoode, James P; Bijl, Aron

    2016-01-01

    We describe a method of quantifying the effect of Questionable Research Practices (QRPs) on the results of meta-analyses. As an example we simulated a meta-analysis of a controversial telepathy protocol to assess the extent to which these experimental results could be explained by QRPs. Our simulations used the same numbers of studies and trials as the original meta-analysis and the frequencies with which various QRPs were applied in the simulated experiments were based on surveys of experimental psychologists. Results of both the meta-analysis and simulations were characterized by 4 metrics, two describing the trial and mean experiment hit rates (HR) of around 31%, where 25% is expected by chance, one the correlation between sample-size and hit-rate, and one the complete P-value distribution of the database. A genetic algorithm optimized the parameters describing the QRPs, and the fitness of the simulated meta-analysis was defined as the sum of the squares of Z-scores for the 4 metrics. Assuming no anomalous effect a good fit to the empirical meta-analysis was found only by using QRPs with unrealistic parameter-values. Restricting the parameter space to ranges observed in studies of QRP occurrence, under the untested assumption that parapsychologists use comparable QRPs, the fit to the published Ganzfeld meta-analysis with no anomalous effect was poor. We allowed for a real anomalous effect, be it unidentified QRPs or a paranormal effect, where the HR ranged from 25% (chance) to 31%. With an anomalous HR of 27% the fitness became F = 1.8 (p = 0.47 where F = 0 is a perfect fit). We conclude that the very significant probability cited by the Ganzfeld meta-analysis is likely inflated by QRPs, though results are still significant (p = 0.003) with QRPs. Our study demonstrates that quantitative simulations of QRPs can assess their impact. Since meta-analyses in general might be polluted by QRPs, this method has wide applicability outside the domain of experimental parapsychology.

  18. 26 CFR 1.482-8 - Examples of the best method rule.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... illustrate the comparative analysis required to apply this rule. As with all of the examples in these... case. Example 10. Cost of services plus method preferred to other methods. (i) FP designs and...

  19. 26 CFR 1.482-8 - Examples of the best method rule.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... illustrate the comparative analysis required to apply this rule. As with all of the examples in these... case. Example 10. Cost of services plus method preferred to other methods. (i) FP designs and...

  20. 26 CFR 1.482-8 - Examples of the best method rule.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... illustrate the comparative analysis required to apply this rule. As with all of the examples in these... case. Example 10. Cost of services plus method preferred to other methods. (i) FP designs and...

  1. 26 CFR 1.482-8 - Examples of the best method rule.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... illustrate the comparative analysis required to apply this rule. As with all of the examples in these... case. Example 10. Cost of services plus method preferred to other methods. (i) FP designs and...

  2. Modeling thermoelastic distortion of optics using elastodynamic reciprocity

    NASA Astrophysics Data System (ADS)

    King, Eleanor; Levin, Yuri; Ottaway, David; Veitch, Peter

    2015-07-01

    Thermoelastic distortion resulting from optical absorption by transmissive and reflective optics can cause unacceptable changes in optical systems that employ high-power beams. In advanced-generation laser-interferometric gravitational wave detectors, for example, optical absorption is expected to result in wavefront distortions that would compromise the sensitivity of the detector, thus necessitating the use of adaptive thermal compensation. Unfortunately, these systems have long thermal time constants, and so predictive feed-forward control systems could be required, but the finite-element analysis is computationally expensive. We describe here the use of the Betti-Maxwell elastodynamic reciprocity theorem to calculate the response of linear elastic bodies (optics) to heating that has arbitrary spatial distribution. We demonstrate, using a simple example, that it can yield accurate results in computational times that are significantly less than those required for finite-element analyses.

  3. Attitudinal Changes of the Student Teacher--A Further Analysis. An Example of an Orthogonal Comparisons Analysis Model Applied to Educational Research.

    ERIC Educational Resources Information Center

    Courtney, E. Wayne

    This report was designed to present an example of a research study involving the use of coefficients of orthogonal comparisons in analysis of variance tests of significance. A sample research report and analysis was included so as to lead the reader through the design steps. The sample study was designed to determine the extent of attitudinal…

  4. Assessing the Kansas water-level monitoring program: An example of the application of classical statistics to a geological problem

    USGS Publications Warehouse

    Davis, J.C.

    2000-01-01

    Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.
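
    The kind of analysis of variance described can be sketched with a one-way test of an operator effect on year-to-year water-level changes; the crews and measurements below are synthetic assumptions.

      # Sketch of the kind of analysis of variance described above: testing
      # whether a factor such as the measuring crew/operator explains part of
      # the variation in year-to-year water-level change. Data are synthetic.
      import numpy as np
      from scipy.stats import f_oneway

      rng = np.random.default_rng(10)

      # Year-to-year water-level declines [ft] for wells measured by three crews;
      # crew B has a small systematic bias in this made-up example.
      crew_a = rng.normal(1.0, 0.5, 40)
      crew_b = rng.normal(1.4, 0.5, 40)
      crew_c = rng.normal(1.0, 0.5, 40)

      stat, p = f_oneway(crew_a, crew_b, crew_c)
      print(f"one-way ANOVA: F = {stat:.2f}, p = {p:.4f}")
      # A small p-value flags an operator effect, pointing to wells/crews whose
      # measurements are less reliable and to where procedures need tightening.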

  5. Blade design and analysis using a modified Euler solver

    NASA Technical Reports Server (NTRS)

    Leonard, O.; Vandenbraembussche, R. A.

    1991-01-01

    An iterative method for blade design based on an Euler solver, described in an earlier paper, is used to design compressor and turbine blades providing shock-free transonic flows. The method shows rapid convergence and indicates how sensitive the flow is to small modifications of the blade geometry, which the classical iterative use of analysis methods might not be able to capture. The relationship between the required Mach number distribution and the resulting geometry is discussed. Examples show how geometrical constraints imposed upon the blade shape can be respected by using free geometrical parameters or by relaxing the required Mach number distribution. The same code is used both for the design of the required geometry and for the off-design calculations. Examples illustrate the difficulty of designing blade shapes with optimal performance also outside of the design point.

  6. Characterization and Analysis of Porous, Brittle Solid Structures by X-ray Micro Computed Tomography

    NASA Astrophysics Data System (ADS)

    Lin, C. L.; Videla, A. R.; Yu, Q.; Miller, J. D.

    2010-12-01

    The internal structure of porous, brittle solid structures, such as porous rock, foam metal, and wallboard, is extremely complex. For example, in the case of wallboard, the air bubble size and the thickness/composition of the wall structure are spatial parameters that vary significantly and influence mechanical, thermal, and acoustical properties. In this regard, the complex geometry and internal texture of materials such as wallboard are characterized and analyzed in 3-D using cone beam x-ray micro computed tomography. Geometrical features of the porous brittle structure are quantitatively analyzed based on calibration of the x-ray linear attenuation coefficient, use of a 3-D watershed algorithm, and use of a 3-D skeletonization procedure. Several examples of the 3-D analysis of porous wallboard structures are presented and the results discussed.

  7. NASTRAN cyclic symmetry capability. [application to solid rocket propellant grains and space antennas

    NASA Technical Reports Server (NTRS)

    Macneal, R. H.; Harder, R. L.; Mason, J. B.

    1973-01-01

    A development for NASTRAN which facilitates the analysis of structures made up of identical segments symmetrically arranged with respect to an axis is described. The key operation in the method is the transformation of the degrees of freedom for the structure into uncoupled symmetrical components, thereby greatly reducing the number of equations which are solved simultaneously. A further reduction occurs if each segment has a plane of reflective symmetry. The only required assumption is that the problem be linear. The capability, as developed, will be available in level 16 of NASTRAN for static stress analysis, steady state heat transfer analysis, and vibration analysis. The paper includes a discussion of the theory, a brief description of the data supplied by the user, and the results obtained for two example problems. The first problem concerns the acoustic modes of a long prismatic cavity imbedded in the propellant grain of a solid rocket motor. The second problem involves the deformations of a large space antenna. The latter example is the first application of the NASTRAN Cyclic Symmetry capability to a really large problem.

  8. A Descriptive Evaluation of Long-Term Treatment Integrity

    ERIC Educational Resources Information Center

    Arkoosh, Maire Kathryn; Derby, K. Mark; Wacker, David P.; Berg, Wendy; McLaughlin, T. F.; Barretto, Anjali

    2007-01-01

    The validity of selecting treatment contingencies on the basis of the results obtained through functional analysis is well documented. However, a number of second-generation questions have emerged: For example, what are the parameters required to achieve desired treatment outcomes? More specifically, what is the degree of treatment integrity…

  9. Contextualized Science for Teaching Science and Technology.

    ERIC Educational Resources Information Center

    Koul, Ravinder; Dana, Thomas M.

    1997-01-01

    Discusses science education in India, arguing that a contextualized curriculum is a powerful means of improvement. The paper presents results from an analysis of the treatment of the nature of science and technology in current Indian textbooks and uses India's controversial Sardar Sarovar Hydro-Electric Project as a case example. (SM)

  10. A Quantitative Analysis of Children's Splitting Operations and Fraction Schemes

    ERIC Educational Resources Information Center

    Norton, Anderson; Wilkins, Jesse L. M.

    2009-01-01

    Teaching experiments with pairs of children have generated several hypotheses about students' construction of fractions. For example, Steffe (2004) hypothesized that robust conceptions of improper fractions depend on the development of a splitting operation. Results from teaching experiments that rely on scheme theory and Steffe's hierarchy of…

  11. Uncovering Portuguese Teachers' Difficulties in Implementing Sciences Curriculum

    ERIC Educational Resources Information Center

    Vasconcelos, Clara; Torres, Joana; Moutinho, Sara; Martins, Idalina; Costa, Nilza

    2015-01-01

    Many countries recognize the positive and effective results of improving science education through the introduction of reforms in the science curriculum. However, some important issues are generally neglected, for example, the involvement of the teachers in the reform process. Taking the sciences curriculum reform under analysis and…

  12. Quantifying uncertainty in forest nutrient budgets

    Treesearch

    Ruth D. Yanai; Carrie R. Levine; Mark B. Green; John L. Campbell

    2012-01-01

    Nutrient budgets for forested ecosystems have rarely included error analysis, in spite of the importance of uncertainty to interpretation and extrapolation of the results. Uncertainty derives from natural spatial and temporal variation and also from knowledge uncertainty in measurement and models. For example, when estimating forest biomass, researchers commonly report...

  13. False-Positive Tangible Outcomes of Functional Analyses

    ERIC Educational Resources Information Center

    Rooker, Griffin W.; Iwata, Brian A.; Harper, Jill M.; Fahmie, Tara A.; Camp, Erin M.

    2011-01-01

    Functional analysis (FA) methodology is the most precise method for identifying variables that maintain problem behavior. Occasionally, however, results of an FA may be influenced by idiosyncratic sensitivity to aspects of the assessment conditions. For example, data from several studies suggest that inclusion of a tangible condition during an FA…

  14. Stagnation-Point Shielding by Melting and Vaporization

    NASA Technical Reports Server (NTRS)

    Roberts, Leonard

    1959-01-01

    An approximate theoretical analysis was made of the shielding mechanism whereby the rate of heat transfer to the forward stagnation point of blunt bodies is reduced by melting and evaporation. General qualitative results are given and a numerical example, the melting and evaporation of ice, is presented and discussed in detail.

  15. ANALYSIS OF CONCORDANCE OF PROBABILISTIC AGGREGATE EXPOSURE PREDICTIONS WITH OBSERVED BIOMONITORING RESULTS: AN EXAMPLE USING CTEPP DATA

    EPA Science Inventory

    Three key areas of scientific inquiry in the study of human exposure to environmental contaminants are 1) assessment of aggregate (i.e., multi-pathway, multi-route) exposures, 2) application of probabilistic methods to exposure prediction, and 3) the interpretation of biomarker m...

  16. CAUSAL ANALYSIS OF BIOLOGICAL IMPAIRMENT IN LONG CREEK, A SANDY-BOTTOMED STREAM IN COASTAL SOUTHERN MAINE (Final Report)

    EPA Science Inventory

    This assessment presents results from a complex causal assessment of a biologically impaired, urbanized coastal watershed located primarily in South Portland, Maine, USA—the Long Creek watershed. This case study serves as an example implementation of U.S. Environmental Protectio...

  17. Bayesian analysis of spatially-dependent functional responses with spatially-dependent multi-dimensional functional predictors

    USDA-ARS?s Scientific Manuscript database

    Recent advances in technology have led to the collection of high-dimensional data not previously encountered in many scientific environments. As a result, scientists are often faced with the challenging task of including these high-dimensional data into statistical models. For example, data from sen...

  18. [Clinical research XXIII. From clinical judgment to meta-analyses].

    PubMed

    Rivas-Ruiz, Rodolfo; Castelán-Martínez, Osvaldo D; Pérez-Rodríguez, Marcela; Palacios-Cruz, Lino; Noyola-Castillo, Maura E; Talavera, Juan O

    2014-01-01

    Systematic reviews (SR) are studies designed to answer clinical questions on the basis of original articles. Meta-analysis (MTA) is the mathematical analysis of SR. These analyses are divided into two groups: those that evaluate quantitative outcome variables (for example, the body mass index, BMI) and those that evaluate qualitative variables (for example, whether a patient is alive or dead, or healing or not). Quantitative variables are generally analyzed with the mean difference, while qualitative variables can be analyzed with several measures: odds ratio (OR), relative risk (RR), absolute risk reduction (ARR) and hazard ratio (HR). These analyses are represented through forest plots, which allow the evaluation of each individual study, as well as the heterogeneity between studies and the overall effect of the intervention. These analyses are mainly based on Student's t test and chi-squared. To make appropriate decisions based on the MTA, it is important to understand the characteristics of statistical methods in order to avoid misinterpretations.
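
    The effect measures named above reduce to simple arithmetic on a 2x2 outcome table. A minimal sketch, with all counts invented purely for illustration:

        # Illustrative calculation of OR, RR, and ARR from a hypothetical
        # 2x2 table (all counts are invented).
        a, b = 30, 70   # treatment group: events, non-events
        c, d = 50, 50   # control group:   events, non-events

        risk_treat = a / (a + b)
        risk_ctrl = c / (c + d)

        odds_ratio = (a * d) / (b * c)             # OR
        relative_risk = risk_treat / risk_ctrl     # RR
        arr = risk_ctrl - risk_treat               # absolute risk reduction

        print(f"OR={odds_ratio:.2f}  RR={relative_risk:.2f}  ARR={arr:.2f}")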

  19. Affordability Engineering: Bridging the Gap Between Design and Cost

    NASA Technical Reports Server (NTRS)

    Reeves, J. D.; DePasquale, Dominic; Lim, Evan

    2010-01-01

    Affordability is a commonly used term that takes on numerous meanings depending on the context used. Within conceptual design of complex systems, the term generally implies comparisons between expected costs and expected resources. This characterization is largely correct, but does not convey the many nuances and considerations that are frequently misunderstood and underappreciated. In the most fundamental sense, affordability and cost directly relate to engineering and programmatic decisions made throughout development programs. Systems engineering texts point out that there is a temporal aspect to this relationship, for decisions made earlier in a program dictate design implications much more so than those made during later phases. This paper explores affordability engineering and its many sub-disciplines by discussing how it can be considered an additional engineering discipline to be balanced throughout the systems engineering and systems analysis processes. Example methods of multidisciplinary design analysis with affordability as a key driver will be discussed, as will example methods of data visualization, probabilistic analysis, and other ways of relating design decisions to affordability results.

  20. Analysis of multinomial models with unknown index using data augmentation

    USGS Publications Warehouse

    Royle, J. Andrew; Dorazio, R.M.; Link, W.A.

    2007-01-01

    Multinomial models with unknown index ('sample size') arise in many practical settings. In practice, Bayesian analysis of such models has proved difficult because the dimension of the parameter space is not fixed, being in some cases a function of the unknown index. We describe a data augmentation approach to the analysis of this class of models that provides for a generic and efficient Bayesian implementation. Under this approach, the data are augmented with all-zero detection histories. The resulting augmented dataset is modeled as a zero-inflated version of the complete-data model where an estimable zero-inflation parameter takes the place of the unknown multinomial index. Interestingly, data augmentation can be justified as being equivalent to imposing a discrete uniform prior on the multinomial index. We provide three examples involving estimating the size of an animal population, estimating the number of diabetes cases in a population using the Rasch model, and the motivating example of estimating the number of species in an animal community with latent probabilities of species occurrence and detection.
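
    A rough sense of the augmentation idea can be had from a maximum-likelihood stand-in for the Bayesian fit described above, here for the simplest closed-population model with a constant detection probability. All counts and parameter values below are invented, and the simulation of the observed detection counts is deliberately crude.

        # Sketch of the data-augmentation idea for a simple closed-population
        # model (constant detection probability p). The n observed detection
        # histories are padded with M - n all-zero rows, and a zero-inflation
        # parameter psi takes the place of the unknown index N.
        import numpy as np
        from scipy.optimize import minimize

        T, n, M = 5, 40, 200                       # occasions, observed, augmented size
        rng = np.random.default_rng(1)
        detections = rng.binomial(T, 0.3, size=n)  # detection counts (invented)
        detections = np.maximum(detections, 1)     # observed individuals have >= 1 hit

        def neg_log_lik(theta):
            psi = 1 / (1 + np.exp(-theta[0]))      # parameters on the logit scale
            p = 1 / (1 + np.exp(-theta[1]))
            # observed rows: a real individual (z = 1) detected d times out of T
            ll = np.sum(np.log(psi) + detections * np.log(p)
                        + (T - detections) * np.log(1 - p))
            # augmented all-zero rows: real-but-undetected, or not real at all
            ll += (M - n) * np.log(psi * (1 - p) ** T + (1 - psi))
            return -ll

        fit = minimize(neg_log_lik, x0=[0.0, 0.0])
        psi_hat = 1 / (1 + np.exp(-fit.x[0]))
        print(f"estimated index N ~ {M * psi_hat:.0f}")  # zero-inflated estimate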

  1. An analysis of optical effects caused by thermally induced mirror deformations.

    PubMed

    Ogrodnik, R F

    1970-09-01

    This paper analyzes thermally induced mirror deformations and the resulting wavefront distortions which occur under conditions of radially nonuniform mirror heating. The analysis is adaptable to heating produced by any radially nonuniform incident radiation. Specific examples of radiation distributions which are considered are the cosine-squared, Gaussian, and TEM(0,1) laser distributions. Deformation effects are examined from two aspects, the first of which is the reflected wavefront radial phase distortion profile caused by the thermally induced surface irregularities at the mirror face. These phase distortion effects appear as aberrations in noncoherent optical applications and as the loss of spatial coherence in coherent applications. The second aspect is the gross wavefront bending due to mirror curvature effects. The analysis considers substrate material, geometry, and cooling in order to determine potential deformation controlling factors. Substrate materials are compared, and performance indicators are suggested to aid in selecting an optimum material for a given heating condition. Deformation examples are given for materials of interest and specific absorbed power levels.

  2. Meta-analysis of magnitudes, differences and variation in evolutionary parameters.

    PubMed

    Morrissey, M B

    2016-10-01

    Meta-analysis is increasingly used to synthesize major patterns in the large literatures within ecology and evolution. Meta-analytic methods that do not account for the process of observing data, which we may refer to as 'informal meta-analyses', may have undesirable properties. In some cases, informal meta-analyses may produce results that are unbiased, but do not necessarily make the best possible use of available data. In other cases, unbiased statistical noise in individual reports in the literature can potentially be converted into severe systematic biases in informal meta-analyses. I first present a general description of how failure to account for noise in individual inferences should be expected to lead to biases in some kinds of meta-analysis. In particular, informal meta-analyses of quantities that reflect the dispersion of parameters in nature, for example, the mean absolute value of a quantity, are likely to be generally highly misleading. I then re-analyse three previously published informal meta-analyses, where key inferences were of aspects of the dispersion of values in nature, for example, the mean absolute value of selection gradients. Major biological conclusions in each original informal meta-analysis closely match those that could arise as artefacts due to statistical noise. I present alternative mixed-model-based analyses that are specifically tailored to each situation, but where all analyses may be implemented with widely available open-source software. In each example meta-re-analysis, major conclusions change substantially. © 2016 European Society For Evolutionary Biology.

  3. International perspectives on social media guidance for nurses: a content analysis.

    PubMed

    Ryan, Gemma

    2016-12-01

    Aim: This article reports the results of an analysis of the content of national and international professional guidance on social media for the nursing profession. The aim was to consolidate good-practice examples of social media guidelines and inform the development of comprehensive guidance. Method: A scoping search of professional nursing bodies' and organisations' social media guidance documents was undertaken using a Google search. Results: 34 guidance documents were located, and a content analysis of these was conducted. Conclusion: The results, combined with a review of competency hearings and literature, indicate that guidance should cover the context of social media, and support nurses to navigate and negotiate the differences between the real and online domains to help them translate awareness into actions.

  4. Strong smoker interest in 'setting an example to children' by quitting: national survey data.

    PubMed

    Thomson, George; Wilson, Nick; Weerasekera, Deepa; Edwards, Richard

    2011-02-01

    To further explore smoker views on reasons to quit. As part of the multi-country ITC Project, a national sample of 1,376 New Zealand adult (18+ years) smokers was surveyed in 2007/08. This sample included boosted sampling of Māori, Pacific and Asian New Zealanders. 'Setting an example to children' was given as 'very much' a reason to quit by 51%, compared to 45% giving personal health concerns. However, the 'very much' and 'somewhat' responses (combined) were greater for personal health (81%) than 'setting an example to children' (74%). Price was the third ranked reason (67%). In a multivariate analysis, women were significantly more likely to state that 'setting an example to children' was 'very much' or 'somewhat' a reason to quit; as were Māori and Pacific respondents compared with European respondents, and those suffering financial stress. The relatively high importance of 'example to children' as a reason to quit is an unusual finding, and may have arisen as a result of social marketing campaigns encouraging cessation to protect families in New Zealand. The policy implications could include a need for a greater emphasis on social reasons (e.g. 'example to children'), in pack warnings, and in social marketing for smoking cessation. © 2011 The Authors. ANZJPH © 2010 Public Health Association of Australia.

  5. Ultrascalable Techniques Applied to the Global Intelligence Community Information Awareness Common Operating Picture (IA COP)

    DTIC Science & Technology

    2005-11-01

    more random. Autonomous systems can exchange entropy statistics for packet streams with no confidentiality concerns, potentially enabling timely and... analysis began with simulation results, which were validated by analysis of actual data from an Autonomous System (AS). A scale-free network is one...traffic—for example, time series of flux at given nodes and mean path length Outputs the time series from any node queried Calculates

  6. Uncertainty Prediction in Passive Target Motion Analysis

    DTIC Science & Technology

    2016-05-12

    fundamental property of bearings-only target motion analysis (TMA) is that the bearing B to the target results...the measurements used to estimate them are often non-linear. This is true for the bearing observation B = tan^(-1)( (·)/(·) ) (Eq. 3)...Parameter Evaluation Plot (PEP) is one example of such a grid-based approach. U.S. Patent No. 7,020,046 discloses one version of this method and is

  7. Mapping agroecological zones and time lag in vegetation growth by means of Fourier analysis of time series of NDVI images

    NASA Technical Reports Server (NTRS)

    Menenti, M.; Azzali, S.; Verhoef, W.; Van Swol, R.

    1993-01-01

    Examples are presented of applications of a fast Fourier transform algorithm to analyze time series of images of Normalized Difference Vegetation Index values. The results obtained for a case study on Zambia indicated that differences in vegetation development among map units of an existing agroclimatic map were not significant, while reliable differences were observed among the map units obtained using the Fourier analysis.

  8. Code Analysis and Refactoring with Clang Tools, Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelley, Timothy M.

    2016-12-23

    Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.
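
    The example code itself is not reproduced in the record; the sketch below is a plausible stand-in using Clang's Python bindings (clang.cindex) that shows one way to analyze where a variable is used. The file name, variable name, and compiler flags are assumptions for illustration.

        # A possible sketch using the libclang Python bindings, not
        # necessarily the exact tooling in this package: list the
        # locations where a given variable is referenced in a C file.
        import clang.cindex as ci

        def find_references(filename, var_name):
            index = ci.Index.create()
            tu = index.parse(filename, args=["-std=c11"])  # flags are assumed
            for node in tu.cursor.walk_preorder():
                # DECL_REF_EXPR nodes are uses of previously declared names
                if node.kind == ci.CursorKind.DECL_REF_EXPR and node.spelling == var_name:
                    loc = node.location
                    print(f"{loc.file}:{loc.line}:{loc.column} uses {var_name}")

        find_references("example.c", "counter")  # hypothetical file and variable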

  9. Investigation of the Fracture Behavior of Scaled HY-130 Weldments

    DTIC Science & Technology

    1990-06-01

    RESULTS OF EPFM ANALYSIS OF THREE-POINT BEND SPECIMENS OF DIFFERENT SIZES... RESULTS OF DYNAMIC FRACTURE...characterize the critical loading needed to cause fracture of the welded joint. An example of such a measure of severity applicable to structures with...weldments. Application of the approach to two titanium alloys and the results of that investigation will be presented in another report. The next four

  10. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.

  11. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    PubMed

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that account for data collected over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
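
    A minimal sketch of the segmented-regression form of an interrupted time series, with an invented monthly series and intervention point (the decision-support study itself is not reproduced here):

        # Interrupted time series via segmented regression (statsmodels OLS);
        # the series length, intervention month, and effects are invented.
        import numpy as np
        import statsmodels.api as sm

        n, t0 = 48, 24                      # 48 monthly points, intervention at month 24
        t = np.arange(n)
        post = (t >= t0).astype(float)
        rng = np.random.default_rng(0)
        y = 10 + 0.05 * t - 2.0 * post - 0.10 * (t - t0) * post + rng.normal(0, 0.5, n)

        X = sm.add_constant(np.column_stack([t, post, (t - t0) * post]))
        fit = sm.OLS(y, X).fit()
        # Coefficients: baseline level, baseline trend, level change, trend change
        print(fit.params)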

  12. CXTFIT/Excel A modular adaptable code for parameter estimation, sensitivity analysis and uncertainty analysis for laboratory or field tracer experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Guoping; Mayes, Melanie; Parker, Jack C

    2010-01-01

    We implemented the widely used CXTFIT code in Excel to provide flexibility and added sensitivity and uncertainty analysis functions to improve transport parameter estimation and to facilitate model discrimination for multi-tracer experiments on structured soils. Analytical solutions for one-dimensional equilibrium and nonequilibrium convection dispersion equations were coded as VBA functions so that they could be used as ordinary math functions in Excel for forward predictions. Macros with user-friendly interfaces were developed for optimization, sensitivity analysis, uncertainty analysis, error propagation, response surface calculation, and Monte Carlo analysis. As a result, any parameter with transformations (e.g., dimensionless, log-transformed, species-dependent reactions, etc.) could be estimated with uncertainty and sensitivity quantification for multiple tracer data at multiple locations and times. Prior information and observation errors could be incorporated into the weighted nonlinear least squares method with a penalty function. Users are able to change selected parameter values and view the results via embedded graphics, resulting in a flexible tool applicable to modeling transport processes and to teaching students about parameter estimation. The code was verified by comparing to a number of benchmarks with CXTFIT 2.0. It was applied to improve parameter estimation for four typical tracer experiment data sets in the literature using multi-model evaluation and comparison. Additional examples were included to illustrate the flexibilities and advantages of CXTFIT/Excel. The VBA macros were designed for general purpose and could be used for any parameter estimation/model calibration when the forward solution is implemented in Excel. A step-by-step tutorial, example Excel files and the code are provided as supplemental material.

  13. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  14. Lognormal Uncertainty Estimation for Failure Rates

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain. Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This presentation will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  15. Using Log Linear Analysis for Categorical Family Variables.

    ERIC Educational Resources Information Center

    Moen, Phyllis

    The Goodman technique of log linear analysis is ideal for family research, because it is designed for categorical (non-quantitative) variables. Variables are dichotomized (for example, married/divorced, childless/with children) or otherwise categorized (for example, level of permissiveness, life cycle stage). Contingency tables are then…

  16. NASTRAN as an analytical research tool for composite mechanics and composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.

    1976-01-01

    Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.

  17. Concordance correlation for model performance assessment: An example with reference evapotranspiration

    USDA-ARS?s Scientific Manuscript database

    Procedures for assessing model performance in agronomy are often arbitrary and not always helpful. An omnibus analysis statistic, concordance correlation, is widely known and used in many other sciences. An illustrative example is presented here. The analysis assumes the exact relationship “observat...

  18. Application of computational aerodynamics methods to the design and analysis of transport aircraft

    NASA Technical Reports Server (NTRS)

    Da Costa, A. L.

    1978-01-01

    The application and validation of several computational aerodynamic methods in the design and analysis of transport aircraft is established. An assessment is made concerning more recently developed methods that solve three-dimensional transonic flow and boundary layers on wings. Capabilities of subsonic aerodynamic methods are demonstrated by several design and analysis efforts. Among the examples cited are the B747 Space Shuttle Carrier Aircraft analysis, nacelle integration for transport aircraft, and winglet optimization. The accuracy and applicability of a new three-dimensional viscous transonic method are demonstrated by comparison of computed results to experimental data.

  19. Approximate Analysis for Interlaminar Stresses in Composite Structures with Thickness Discontinuities

    NASA Technical Reports Server (NTRS)

    Rose, Cheryl A.; Starnes, James H., Jr.

    1996-01-01

    An efficient, approximate analysis for calculating complete three-dimensional stress fields near regions of geometric discontinuities in laminated composite structures is presented. An approximate three-dimensional local analysis is used to determine the detailed local response due to far-field stresses obtained from a global two-dimensional analysis. The stress results from the global analysis are used as traction boundary conditions for the local analysis. A generalized plane deformation assumption is made in the local analysis to reduce the solution domain to two dimensions. This assumption allows out-of-plane deformation to occur. The local analysis is based on the principle of minimum complementary energy and uses statically admissible stress functions that have an assumed through-the-thickness distribution. Examples are presented to illustrate the accuracy and computational efficiency of the local analysis. Comparisons of the results of the present local analysis with the corresponding results obtained from a finite element analysis and from an elasticity solution are presented. These results indicate that the present local analysis predicts the stress field accurately. Computer execution-times are also presented. The demonstrated accuracy and computational efficiency of the analysis make it well suited for parametric and design studies.

  20. Observational evidence and strength of evidence domains: case examples

    PubMed Central

    2014-01-01

    Background: Systematic reviews of healthcare interventions most often focus on randomized controlled trials (RCTs). However, certain circumstances warrant consideration of observational evidence, and such studies are increasingly being included as evidence in systematic reviews. Methods: To illustrate the use of observational evidence, we present case examples of systematic reviews in which observational evidence was considered as well as case examples of individual observational studies, and how they demonstrate various strength of evidence domains in accordance with current Agency for Healthcare Research and Quality (AHRQ) Evidence-based Practice Center (EPC) methods guidance. Results: In the presented examples, observational evidence is used when RCTs are infeasible or raise ethical concerns, lack generalizability, or provide insufficient data. Individual study case examples highlight how observational evidence may fulfill required strength of evidence domains, such as study limitations (reduced risk of selection, detection, performance, and attrition); directness; consistency; precision; and reporting bias (publication, selective outcome reporting, and selective analysis reporting), as well as additional domains of dose-response association, plausible confounding that would decrease the observed effect, and strength of association (magnitude of effect). Conclusions: The cases highlighted in this paper demonstrate how observational studies may provide moderate to (rarely) high strength evidence in systematic reviews. PMID:24758494

  1. Analysis of longitudinal "time series" data in toxicology.

    PubMed

    Cox, C; Cory-Slechta, D A

    1987-02-01

    Studies focusing on chronic toxicity or on the time course of toxicant effect often involve repeated measurements or longitudinal observations of endpoints of interest. Experimental design considerations frequently necessitate between-group comparisons of the resulting trends. Typically, procedures such as the repeated-measures analysis of variance have been used for statistical analysis, even though the required assumptions may not be satisfied in some circumstances. This paper describes an alternative analytical approach which summarizes curvilinear trends by fitting cubic orthogonal polynomials to individual profiles of effect. The resulting regression coefficients serve as quantitative descriptors which can be subjected to group significance testing. Randomization tests based on medians are proposed to provide a comparison of treatment and control groups. Examples from the behavioral toxicology literature are considered, and the results are compared to more traditional approaches, such as repeated-measures analysis of variance.
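
    The following sketch illustrates the summarize-then-randomize idea, using ordinary rather than orthogonal cubic polynomials for brevity: fit a cubic to each subject's profile, take one coefficient as the descriptor, and compare group medians with a permutation test. All profiles and group sizes are simulated.

        # Fit a cubic per subject, then permutation-test the group medians
        # of one coefficient (all data below are invented).
        import numpy as np

        rng = np.random.default_rng(3)
        t = np.linspace(0, 1, 12)                      # 12 repeated measurements

        def cubic_coefs(y):                            # descriptors for one profile
            return np.polyfit(t, y, deg=3)             # highest power first

        control = [cubic_coefs(5 + 2*t + rng.normal(0, .3, t.size)) for _ in range(10)]
        treated = [cubic_coefs(5 + 2*t - 3*t**2 + rng.normal(0, .3, t.size)) for _ in range(10)]

        # test the quadratic coefficient (index 1 in polyfit's ordering)
        c_ctrl = np.array([c[1] for c in control])
        c_trt = np.array([c[1] for c in treated])
        observed = abs(np.median(c_trt) - np.median(c_ctrl))

        pooled = np.concatenate([c_ctrl, c_trt])
        n_perm, count = 5000, 0
        for _ in range(n_perm):
            perm = rng.permutation(pooled)
            count += abs(np.median(perm[:10]) - np.median(perm[10:])) >= observed
        print(f"permutation p = {count / n_perm:.4f}")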

  2. Statistical Methods for the Analysis of Discrete Choice Experiments: A Report of the ISPOR Conjoint Analysis Good Research Practices Task Force.

    PubMed

    Hauber, A Brett; González, Juan Marcos; Groothuis-Oudshoorn, Catharina G M; Prior, Thomas; Marshall, Deborah A; Cunningham, Charles; IJzerman, Maarten J; Bridges, John F P

    2016-06-01

    Conjoint analysis is a stated-preference survey method that can be used to elicit responses that reveal preferences, priorities, and the relative importance of individual features associated with health care interventions or services. Conjoint analysis methods, particularly discrete choice experiments (DCEs), have been increasingly used to quantify preferences of patients, caregivers, physicians, and other stakeholders. Recent consensus-based guidance on good research practices, including two recent task force reports from the International Society for Pharmacoeconomics and Outcomes Research, has aided in improving the quality of conjoint analyses and DCEs in outcomes research. Nevertheless, uncertainty regarding good research practices for the statistical analysis of data from DCEs persists. There are multiple methods for analyzing DCE data. Understanding the characteristics and appropriate use of different analysis methods is critical to conducting a well-designed DCE study. This report will assist researchers in evaluating and selecting among alternative approaches to conducting statistical analysis of DCE data. We first present a simplistic DCE example and a simple method for using the resulting data. We then present a pedagogical example of a DCE and one of the most common approaches to analyzing data from such a question format: conditional logit. We then describe some common alternative methods for analyzing these data and the strengths and weaknesses of each alternative. We present the ESTIMATE checklist, which includes a list of questions to consider when justifying the choice of analysis method, describing the analysis, and interpreting the results. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
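
    The conditional logit mentioned above assigns each chosen alternative the probability exp(x'b) / sum_j exp(x_j'b) within its choice task. A self-contained sketch with simulated choice data (not the report's pedagogical example):

        # Conditional logit fit by direct maximum likelihood; the number of
        # tasks, alternatives, attributes, and true coefficients are invented.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(7)
        n_tasks, J, K = 300, 3, 4
        X = rng.normal(size=(n_tasks, J, K))            # attribute levels
        beta_true = np.array([1.0, -0.5, 0.8, 0.2])
        util = X @ beta_true + rng.gumbel(size=(n_tasks, J))
        choice = util.argmax(axis=1)                    # simulated choices

        def neg_log_lik(beta):
            v = X @ beta                                # (n_tasks, J) utilities
            v -= v.max(axis=1, keepdims=True)           # stabilize the softmax
            logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
            return -logp[np.arange(n_tasks), choice].sum()

        fit = minimize(neg_log_lik, x0=np.zeros(K), method="BFGS")
        print(fit.x)                                    # should approach beta_true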

  3. Modelling uncertainty in incompressible flow simulation using Galerkin based generalized ANOVA

    NASA Astrophysics Data System (ADS)

    Chakraborty, Souvik; Chowdhury, Rajib

    2016-11-01

    This paper presents a new algorithm, referred to here as Galerkin based generalized analysis of variance decomposition (GG-ANOVA) for modelling input uncertainties and its propagation in incompressible fluid flow. The proposed approach utilizes ANOVA to represent the unknown stochastic response. Further, the unknown component functions of ANOVA are represented using the generalized polynomial chaos expansion (PCE). The resulting functional form obtained by coupling the ANOVA and PCE is substituted into the stochastic Navier-Stokes equation (NSE) and Galerkin projection is employed to decompose it into a set of coupled deterministic 'Navier-Stokes alike' equations. Temporal discretization of the set of coupled deterministic equations is performed by employing Adams-Bashforth scheme for convective term and Crank-Nicolson scheme for diffusion term. Spatial discretization is performed by employing finite difference scheme. Implementation of the proposed approach has been illustrated by two examples. In the first example, a stochastic ordinary differential equation has been considered. This example illustrates the performance of proposed approach with change in nature of random variable. Furthermore, convergence characteristics of GG-ANOVA has also been demonstrated. The second example investigates flow through a micro channel. Two case studies, namely the stochastic Kelvin-Helmholtz instability and stochastic vortex dipole, have been investigated. For all the problems results obtained using GG-ANOVA are in excellent agreement with benchmark solutions.

  4. Cognitive task analysis: Techniques applied to airborne weapons training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis plays and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  5. EMC analysis of MOS-1

    NASA Astrophysics Data System (ADS)

    Ishizawa, Y.; Abe, K.; Shirako, G.; Takai, T.; Kato, H.

    The electromagnetic compatibility (EMC) control method, system EMC analysis method, and system test method which have been applied to test the components of the MOS-1 satellite are described. The merits and demerits of the problem solving, specification, and system approaches to EMC control are summarized, as are the data requirements of the SEMCAP (specification and electromagnetic compatibility analysis program) computer program for verifying the EMI safety margin of the components. Examples of EMC design are mentioned, and the EMC design process and selection method for EMC critical points are shown along with sample EMC test results.

  6. Synthesis, characterization and biological activity of Schiff bases based on chitosan and arylpyrazole moiety.

    PubMed

    Salama, Hend E; Saad, Gamal R; Sabaa, Magdy W

    2015-08-01

    The Schiff bases of chitosan were synthesized by the reaction of chitosan with 3-(4-substituted-phenyl)-1-phenyl-1H-pyrazole-4-carbaldehyde. The structure of the prepared chitosan derivatives was characterized by FT-IR spectroscopy, elemental analysis, X-ray diffraction studies, and thermogravimetric analysis (TG). The results show that the specific properties of Schiff bases of chitosan can be altered by modifying the molecular structures with proper substituent groups. TG results reveal that the thermal stability of the prepared chitosan Schiff bases was lower than that of chitosan. The activation energy of decomposition was calculated using the Coats-Redfern model. The antimicrobial activity of chitosan and the Schiff bases of chitosan was investigated against Streptococcus pneumoniae, Bacillus subtilis, and Escherichia coli (as examples of bacteria) and Aspergillus fumigatus, Geotrichum candidum, and Syncephalastrum racemosum (as examples of fungi). The results indicated that the antimicrobial activity of the Schiff bases was stronger than that of chitosan and was dependent on the substituent group. The activity of the un-substituted arylpyrazole chitosan derivative toward the investigated bacteria and fungi species was better than that of the other derivatives. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Actuation for simultaneous motions and constraining efforts: an open chain example

    NASA Astrophysics Data System (ADS)

    Perreira, N. Duke

    1997-06-01

    A brief discussion of systems where simultaneous control of forces and velocities is desirable is given, and an example linkage with revolute and prismatic joints is selected for further analysis. The Newton-Euler approach for dynamic system analysis is applied to the example to provide a basis of comparison. Gauge invariant transformations are used to convert the dynamic equations into an invariant form suitable for use in a new dynamic system analysis method known as the motion-effort approach. This approach uses constraint elimination techniques based on singular value decompositions to recast the invariant form of the dynamic system equations into orthogonal sets of motion and effort equations. Desired motions and constraining efforts are partitioned into ideally obtainable and unobtainable portions, which are then used to determine the required actuation. The method is applied to the example system and an analytic estimate of its success is made.

  8. An analysis of satellite state vector observability using SST tracking data

    NASA Technical Reports Server (NTRS)

    Englar, T. S., Jr.; Hammond, C. L.

    1976-01-01

    Observability of satellite state vectors using only SST tracking data was investigated by covariance analysis under a variety of satellite and station configurations. These results indicate very precarious observability in most short-arc cases. The consequences of this are large variances on many state components, such as the downrange component of the relay satellite position. To illustrate the impact of observability problems, an example is given of two distinct satellite orbit pairs generating essentially the same data arc. The physical bases for unobservability are outlined and related to proposed TDRSS configurations. Results are relevant to any mission depending upon TDRSS to determine satellite state. The required mathematical analysis and the software used are described.

  9. Analysis of titanium content in titanium tetrachloride solution

    NASA Astrophysics Data System (ADS)

    Bi, Xiaoguo; Dong, Yingnan; Li, Shanshan; Guan, Duojiao; Wang, Jianyu; Tang, Meiling

    2018-03-01

    Strontium titanate, barium titanate, and lead titanate are new types of functional ceramic materials with good prospects, and titanium tetrachloride is commonly used in the production of such products, which exhibit excellent electrochemical performance and ferroelectric temperature-coefficient effects. In this article, three methods are used to calibrate samples of titanium tetrachloride solution: back titration, replacement titration, and gravimetric analysis. The results show that the back titration method has several advantages, for example, relatively simple operation, easy judgment of the titration end point, and good accuracy and precision of the analytical results, with a relative standard deviation within 0.2%. It is therefore the preferred method for conventional analysis in mass production.
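
    The back-titration arithmetic reduces to a difference of moles. The sketch below encodes the generic formula with hypothetical concentrations and volumes; it is not the paper's exact procedure or reagent system.

        # Hypothetical back-titration arithmetic (generic formula): excess
        # complexing agent is added, the surplus is titrated back, and the
        # difference gives the titanium content. All inputs are invented.
        M_TI = 47.867                        # g/mol, molar mass of titanium

        def titanium_mass_percent(c_added, v_added_ml, c_back, v_back_ml, sample_g):
            """Ti% from moles of reagent consumed = added - back-titrated."""
            mol_ti = c_added * v_added_ml / 1000 - c_back * v_back_ml / 1000
            return 100 * mol_ti * M_TI / sample_g

        print(f"{titanium_mass_percent(0.05, 25.0, 0.05, 6.4, 0.2513):.2f} % Ti")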

  10. Decimal Fraction Arithmetic: Logical Error Analysis and Its Validation.

    ERIC Educational Resources Information Center

    Standiford, Sally N.; And Others

    This report illustrates procedures of item construction for addition and subtraction examples involving decimal fractions. Using a procedural network of skills required to solve such examples, an item characteristic matrix of skills analysis was developed to describe the characteristics of the content domain by projected student difficulties. Then…

  11. Tuning the Voices of a Choir: Detecting Ecological Gradients in Time-Series Populations.

    PubMed

    Buras, Allan; van der Maaten-Theunissen, Marieke; van der Maaten, Ernst; Ahlgrimm, Svenja; Hermann, Philipp; Simard, Sonia; Heinrich, Ingo; Helle, Gerd; Unterseher, Martin; Schnittler, Martin; Eusemann, Pascal; Wilmking, Martin

    2016-01-01

    This paper introduces a new approach, the Principal Component Gradient Analysis (PCGA), to detect ecological gradients in time-series populations, i.e. several time-series originating from different individuals of a population. Detection of ecological gradients is of particular importance when dealing with time-series from heterogeneous populations which express differing trends. PCGA makes use of polar coordinates of loadings from the first two axes obtained by principal component analysis (PCA) to define groups of similar trends. Based on the mean inter-series correlation (rbar), the gain in the strength of a common underlying signal achieved by PCGA groups is quantified using Monte Carlo simulations. In terms of validation, PCGA is compared to three other existing approaches. Focusing on dendrochronological examples, PCGA is shown to correctly determine population gradients and, in particular cases, to be advantageous over the other considered methods. Furthermore, in each example the PCGA groups enhanced the strength of a common underlying signal, performing comparably well to hierarchical cluster analysis. Our results indicate that PCGA potentially allows for a better understanding of mechanisms causing time-series population gradients as well as objectively enhancing the performance of climate transfer functions in dendroclimatology. While our examples highlight the relevance of PCGA to the field of dendrochronology, we believe that other disciplines working with data of comparable structure may also benefit from PCGA.
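
    The grouping step of PCGA can be sketched in a few lines: run a PCA with the individual series as variables, take each series' loadings on the first two components, and group series by the polar angle of that loading pair. The simulated series and the two-group median split below are illustrative only.

        # Sketch of the PCGA grouping step on simulated series with two
        # opposing trends; thresholds and sizes are arbitrary.
        import numpy as np

        rng = np.random.default_rng(11)
        n_years = 100
        t = np.linspace(0, 1, n_years)
        up = 0.8 * t[:, None] + rng.normal(0, .2, (n_years, 20))
        down = -0.8 * t[:, None] + rng.normal(0, .2, (n_years, 20))
        series = np.hstack([up, down])                 # columns = individuals

        Z = (series - series.mean(0)) / series.std(0)  # standardize each series
        cov = np.cov(Z, rowvar=False)
        eigval, eigvec = np.linalg.eigh(cov)           # ascending eigenvalues
        load = eigvec[:, -2:][:, ::-1]                 # PC1 and PC2 directions
        # (eigenvector entries used as loadings here, a simplification)
        theta = np.arctan2(load[:, 1], load[:, 0])     # polar angle per series

        groups = (theta > np.median(theta)).astype(int)  # two trend groups
        for g in (0, 1):
            sub = Z[:, groups == g]
            corr = np.corrcoef(sub, rowvar=False)
            rbar = corr[np.triu_indices_from(corr, 1)].mean()
            print(f"group {g}: {sub.shape[1]} series, rbar = {rbar:.2f}")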

  12. Tuning the Voices of a Choir: Detecting Ecological Gradients in Time-Series Populations

    PubMed Central

    Buras, Allan; van der Maaten-Theunissen, Marieke; van der Maaten, Ernst; Ahlgrimm, Svenja; Hermann, Philipp; Simard, Sonia; Heinrich, Ingo; Helle, Gerd; Unterseher, Martin; Schnittler, Martin; Eusemann, Pascal; Wilmking, Martin

    2016-01-01

    This paper introduces a new approach, the Principal Component Gradient Analysis (PCGA), to detect ecological gradients in time-series populations, i.e. several time-series originating from different individuals of a population. Detection of ecological gradients is of particular importance when dealing with time-series from heterogeneous populations which express differing trends. PCGA makes use of polar coordinates of loadings from the first two axes obtained by principal component analysis (PCA) to define groups of similar trends. Based on the mean inter-series correlation (rbar), the gain in the strength of a common underlying signal achieved by PCGA groups is quantified using Monte Carlo simulations. In terms of validation, PCGA is compared to three other existing approaches. Focusing on dendrochronological examples, PCGA is shown to correctly determine population gradients and, in particular cases, to be advantageous over the other considered methods. Furthermore, in each example the PCGA groups enhanced the strength of a common underlying signal, performing comparably well to hierarchical cluster analysis. Our results indicate that PCGA potentially allows for a better understanding of mechanisms causing time-series population gradients as well as objectively enhancing the performance of climate transfer functions in dendroclimatology. While our examples highlight the relevance of PCGA to the field of dendrochronology, we believe that other disciplines working with data of comparable structure may also benefit from PCGA. PMID:27467508

  13. Maternal mortality as a Millennium Development Goal of the United Nations: a systematic assessment and analysis of available data in threshold countries using Indonesia as example

    PubMed Central

    Reinke, Evelyn; Supriyatiningsih; Haier, Jörg

    2017-01-01

    Background: In 2015, the period proposed for achieving the United Nations Millennium Development Goals (MDG), which targeted lowering maternal mortality worldwide by roughly 75%, came to an end. 99% of these cases occur in developing and threshold countries, but reports mostly rely on incomplete or unrepresentative data. Using Indonesia as an example, currently available data sets for maternal mortality were systematically reviewed. Methods: Besides analysis of international and national data resources, a systematic review was carried out according to Cochrane methodology to identify all data and assessments regarding maternal mortality. Results: Overall, primary data on maternal mortality differed significantly and were hardly comparable. For 1990, results varied between 253/100 000 and 446/100 000. In 2013, data appeared more conclusive (140-199/100 000). An annual reduction rate (ARR) of -2.8% can be calculated. Conclusion: The quality of reported maternal mortality data in Indonesia is very limited with respect to comprehensive availability and methodology. This limitation appears to be of general importance for the countries targeted by the MDG. Primary data are rare, not uniformly obtained, and not evaluated by comparable methods, resulting in very limited comparability. Continuous registration of small data sets should have high priority for the analysis of maternal health activities. PMID:28400953

  14. The analysis of 3-phase squirrel-cage induction motors including space harmonics and mutual slotting in transient and steady state

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paap, G.C.

    1991-03-01

    From general equations which describe the transient electromechanical behavior of the asynchronous squirrel-cage motor, and which include the influence of space harmonics and mutual slotting, simplified models are derived and compared. The models derived are demonstrated in examples where special attention is paid to the influence of the place of the harmonics in the mutual inductance matrix and the influence of mutual slotting. Further, the steady-state equations are derived and the back-transformation for the stator and rotor currents is given. One example is compared with the result of measurements.

  15. Examples of Effective Data Sharing in Scientific Publishing

    DOE PAGES

    Kitchin, John R.

    2015-05-11

    Here, we present a perspective on an approach to data sharing in scientific publications we have been developing in our group. The essence of the approach is that data can be embedded in a human-readable and machine-addressable way within the traditional publishing environment. We show this by example for both computational and experimental data. We articulate a need for new authoring tools to facilitate data sharing, and we discuss the tools we have been developing for this purpose. With these tools, data generation, analysis, and manuscript preparation can be deeply integrated, resulting in easier and better data sharing in scientific publications.

  16. A study of delamination buckling of laminates

    NASA Technical Reports Server (NTRS)

    Mukherjee, Yu-Xie; Xie, Zhi-Cheng; Ingraffea, Anthony

    1990-01-01

    The subject of this paper is the buckling of laminated plates, with a preexisting delamination, subjected to in-plane loading. Each laminate is modelled as an orthotropic Mindlin plate. The analysis is carried out by a combination of the finite element and asymptotic expansion methods. By applying the finite element method, plates with general delamination regions can be studied. The asymptotic expansion method reduces the number of unknown variables of the eigenvalue equation to that of the equation for a single Kirchhoff plate. Numerical results are presented for several examples. The effects of the shape, size, and position of the delamination on the buckling load are studied through these examples.

  17. Examples of Effective Data Sharing in Scientific Publishing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kitchin, John R.

    Here, we present a perspective on an approach to data sharing in scientific publications we have been developing in our group. The essence of the approach is that data can be embedded in a human-readable and machine-addressable way within the traditional publishing environment. We show this by example for both computational and experimental data. We articulate a need for new authoring tools to facilitate data sharing, and we discuss the tools we have been developing for this purpose. With these tools, data generation, analysis, and manuscript preparation can be deeply integrated, resulting in easier and better data sharing in scientific publications.

  18. Parametrically excited non-linear multidegree-of-freedom systems with repeated natural frequencies

    NASA Astrophysics Data System (ADS)

    Tezak, E. G.; Nayfeh, A. H.; Mook, D. T.

    1982-12-01

    A method for analyzing multidegree-of-freedom systems having a repeated natural frequency subjected to a parametric excitation is presented. Attention is given to the ordering of the various terms (linear and non-linear) in the governing equations. The analysis is based on the method of multiple scales. As a numerical example involving a parametric resonance, panel flutter is discussed in detail in order to illustrate the type of results one can expect to obtain with this analysis. Some of the analytical results are verified by a numerical integration of the governing equations.

  19. Fractal-Based Oscillation of Macular Arteriogenesis and Dropout During Progressive Diabetic Retinopathy

    NASA Technical Reports Server (NTRS)

    Radharkrishnan, Krishnan; Kaiser, Peter K.

    2011-01-01

    By both fractal (D1) and branching (Lv) analysis, macular arterial density oscillated with progression from mild NPDR to PDR. Results are consistent with our study reported recently for the entire arterial and venous branching trees within 50 degree FAs by VESGEN generational branching analysis. Current and previous results are important for advances in early-stage regenerative DR therapies, for which reversal of DR progression to a normal vessel density may be possible. For example, potential use of regenerative angiogenesis stimulators to reverse vascular dropout during mild and severe NPDR is not indicated for treatment of moderate NPDR.

  20. Conformational interconversions in peptide beta-turns: analysis of turns in proteins and computational estimates of barriers.

    PubMed

    Gunasekaran, K; Gomathi, L; Ramakrishnan, C; Chandrasekhar, J; Balaram, P

    1998-12-18

    The two most important beta-turn features in peptides and proteins are the type I and type II turns, which differ mainly in the orientation of the central peptide unit. Facile conformational interconversion is possible, in principle, by a flip of the central peptide unit. Homologous crystal structures afford an opportunity to structurally characterize both possible conformational states, thus allowing identification of sites that are potentially stereochemically mobile. A representative data set of 250 high-resolution (

  1. A probabilistic-entropy approach of finding thematically similar documents with creating context-semantic graph for investigating evolution of society opinion

    NASA Astrophysics Data System (ADS)

    Moloshnikov, I. A.; Sboev, A. G.; Rybka, R. B.; Gydovskikh, D. V.

    2016-02-01

    A composite algorithm is presented that integrates, on the one hand, an algorithm for finding documents on a given topic and, on the other, a method for evaluating the emotiveness of topical texts. This method is convenient for analyzing people's opinions expressed in social media and, as a result, for automated analysis of how events evolve in social media. Examples of such analyses are demonstrated and discussed.

  2. Analysis of Particulate and Chemical Residue Resulting from Exposure to Burning and Abrading Composite Materials

    DTIC Science & Technology

    2013-05-31

    Figure 15. Example of a Possible Foreign Object Observed in a Small Number of Slides. This Object May Be a Hair, Thread, or Plant Material that...h)anthracene Fluoranthene Fluorene Indeno(1,2,3-cd)pyrene Naphthalene Phenanthrene Pyrene...material during sampling. These were subject to particle analysis as described above in order to estimate the coverage ratio and particle density of

  3. Analysis and optimization of indicators of energy and resource consumption of gas turbine and electric drives for transportation of hydrocarbons

    NASA Astrophysics Data System (ADS)

    Golik, V. V.; Zemenkova, M. Yu; Seroshtanov, I. V.; Begalko, Z. V.

    2018-05-01

    The paper presents the results of an analysis of statistical indicators of energy and resource consumption in oil and gas transportation, using one of the regions of Russia as an example. The article analyzes the engineering characteristics of compressor station drives. Official statistical bulletins on the fuel and energy resources of the region in the pipeline oil and gas transportation system were used as the initial data.

  4. A collocation-shooting method for solving fractional boundary value problems

    NASA Astrophysics Data System (ADS)

    Al-Mdallal, Qasem M.; Syam, Muhammed I.; Anwar, M. N.

    2010-12-01

    In this paper, we discuss the numerical solution of a special class of fractional boundary value problems of order 2. The method of solution is based on conjugate collocation and spline analysis combined with a shooting method. The existence and uniqueness of the exact solution for the present class is proven theoretically. Two examples involving the Bagley-Torvik equation subject to boundary conditions are also presented; numerical results illustrate the accuracy of the present scheme.
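
    The shooting idea itself is easy to illustrate on an integer-order analogue (the fractional Bagley-Torvik machinery of the paper is not reproduced here): guess the missing initial slope, integrate the initial value problem, and solve for the slope that satisfies the far boundary condition. The equation and boundary values below are invented.

        # Shooting method for y'' + y' + y = cos(pi*x), y(0)=0, y(1)=1
        # (an integer-order stand-in for the paper's fractional problems).
        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import brentq

        def rhs(x, z):                       # z = [y, y']
            return [z[1], np.cos(np.pi * x) - z[1] - z[0]]

        def end_residual(slope):             # y(1) - 1 as a function of y'(0)
            sol = solve_ivp(rhs, (0.0, 1.0), [0.0, slope], rtol=1e-8, atol=1e-10)
            return sol.y[0, -1] - 1.0

        slope = brentq(end_residual, -10.0, 10.0)   # root of the residual
        print(f"required initial slope y'(0) = {slope:.6f}")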

  5. Complex numbers in chemometrics: examples from multivariate impedance measurements on lipid monolayers.

    PubMed

    Geladi, Paul; Nelson, Andrew; Lindholm-Sethson, Britta

    2007-07-09

    Electrical impedance gives multivariate complex-number data as results. Two examples of multivariate electrical impedance data measured on lipid monolayers in different solutions give rise to matrices (16x50 and 38x50) of complex numbers. Multivariate data analysis by principal component analysis (PCA) or singular value decomposition (SVD) can be used for complex data, and the necessary equations are given. The scores and loadings obtained are vectors of complex numbers. It is shown that complex-number PCA and SVD are better at concentrating information in a few components than the naïve juxtaposition method and that Argand diagrams can replace score and loading plots. Different concentrations of Magainin and Gramicidin A give different responses, and the role of the electrolyte medium can also be studied. An interaction of Gramicidin A in the solution with the monolayer over time can be observed.
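
    Because numpy's SVD operates on complex matrices directly, the decomposition described above needs no special handling. A sketch on an invented 16x50 complex impedance-like matrix (not the monolayer data of the paper):

        # Complex-number PCA via the SVD; the matrix is synthetic.
        import numpy as np

        rng = np.random.default_rng(5)
        freq_profile = np.exp(1j * np.linspace(0, np.pi, 50))   # a complex spectrum
        scores_true = rng.normal(size=16)
        X = np.outer(scores_true, freq_profile) + 0.05 * (
            rng.normal(size=(16, 50)) + 1j * rng.normal(size=(16, 50)))

        Xc = X - X.mean(axis=0)                  # column-center
        U, s, Vh = np.linalg.svd(Xc, full_matrices=False)
        scores = U * s                           # complex scores (Argand diagram)
        loadings = Vh.conj().T                   # complex loadings
        explained = s**2 / (s**2).sum()
        print(f"variance captured by PC1: {explained[0]:.1%}")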

  6. The Quantitative Analysis of User Behavior Online - Data, Models and Algorithms

    NASA Astrophysics Data System (ADS)

    Raghavan, Prabhakar

    By blending principles from mechanism design, algorithms, machine learning and massive distributed computing, the search industry has become good at optimizing monetization on sound scientific principles. This represents a successful and growing partnership between computer science and microeconomics. When it comes to understanding how online users respond to the content and experiences presented to them, we have more of a lacuna in the collaboration between computer science and certain social sciences. We will use a concrete technical example from image search results presentation, developing in the process some algorithmic and machine learning problems of interest in their own right. We then use this example to motivate the kinds of studies that need to grow between computer science and the social sciences; a critical element of this is the need to blend large-scale data analysis with smaller-scale eye-tracking and "individualized" lab studies.

  7. Bivariate normal, conditional and rectangular probabilities: A computer program with applications

    NASA Technical Reports Server (NTRS)

Swaroop, R.; Brownlow, J. D.; Ashworth, G. R.; Winter, W. R.

    1980-01-01

Some results for bivariate normal distribution analysis are presented. Computer programs for conditional normal probabilities and marginal probabilities, as well as joint probabilities for rectangular regions, are given; routines for computing fractile points and distribution functions are also presented. Some examples from a closed circuit television experiment are included.
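
    The rectangular-region computation described above reduces to inclusion-exclusion on the joint CDF, which scipy provides directly; the mean and covariance below are arbitrary stand-ins, not values from the report.

    ```python
    from scipy.stats import multivariate_normal, norm

    mvn = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.5], [0.5, 1.0]])

    def rect_prob(a1, b1, a2, b2):
        """P(a1 < X < b1, a2 < Y < b2) by inclusion-exclusion on the CDF."""
        return (mvn.cdf([b1, b2]) - mvn.cdf([a1, b2])
                - mvn.cdf([b1, a2]) + mvn.cdf([a1, a2]))

    print(rect_prob(-1.0, 1.0, -1.0, 1.0))   # probability of the unit square
    # A conditional probability, P(Y < 1 | -1 < X < 1), from the same pieces
    # (-50 stands in for minus infinity):
    print(rect_prob(-1.0, 1.0, -50.0, 1.0) / (norm.cdf(1.0) - norm.cdf(-1.0)))
    ```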

  8. The Application of Electrochemical and Surface Analysis Approaches to Studying Copper Corrosion in Water: Fundamentals, Limitations, and Examples

    EPA Science Inventory

    Corrosion control is a concern for many drinking water utilities. The Lead and Copper Rule established a regulatory need to maintain a corrosion control program. Other corrosion-related issues such as “red” water resulting from excessive iron corrosion and copper pinhole leaks ...

  9. Inside the Black Box: Revealing the Process in Applying a Grounded Theory Analysis

    ERIC Educational Resources Information Center

    Rich, Peter

    2012-01-01

    Qualitative research methods have long set an example of rich description, in which data and researchers' hermeneutics work together to inform readers of findings in specific contexts. Among published works, insight into the analytical process is most often represented in the form of methodological propositions or research results. This paper…

  10. A summary and evaluation of semi-empirical methods for the prediction of helicopter rotor noise

    NASA Technical Reports Server (NTRS)

    Pegg, R. J.

    1979-01-01

    Existing prediction techniques are compiled and described. The descriptions include input and output parameter lists, required equations and graphs, and the range of validity for each part of the prediction procedures. Examples are provided illustrating the analysis procedure and the degree of agreement with experimental results.

  11. Discourse Factors in the Evaluation of Language Ability.

    ERIC Educational Resources Information Center

    Litteral, Robert

    Features of connected discourse that have been identified by discourse analysis may be applied to the evaluation of oral proficiency in a second language. For example, in the area of semantics, a speaker's control of the cause-result relationship involves, among other things, the ability to produce the different grammatical and lexical…

  12. Assigning Cases to Groups Using Taxometric Results: An Empirical Comparison of Classification Techniques

    ERIC Educational Resources Information Center

    Ruscio, John

    2009-01-01

    Determining whether individuals belong to different latent classes (taxa) or vary along one or more latent factors (dimensions) has implications for assessment. For example, no instrument can simultaneously maximize the efficiency of categorical and continuous measurement. Methods such as taxometric analysis can test the relative fit of taxonic…

  13. Environmental Regulations as Drivers of Materials Obsolescence

    NASA Technical Reports Server (NTRS)

    Scroggins, Sharon

    2010-01-01

This slide presentation reviews the operations of the Principal Center for Regulatory Risk Analysis and Communication (RRAC-PC) and the impact of environmental regulations in making some materials obsolete. The center is NASA's resource for identifying and managing risks associated with changing environmental regulations. To this end the center acts as a regulatory early warning system to review, track, and analyze emerging regulations, collaborates with the technical community on regulatory risk analysis and interpretation, and represents NASA's interests to the regulatory agencies. Regulations frequently result in making some materials unavailable, forcing a change to another material. Processes may also be changed due to environmental regulations. For example, some items that were sprayed with a chemical may now have to be painted or dipped with the chemical. Sometimes a regulation changes the use of a certain product in a way that does not affect usage on Earth but has significant implications in space. An example of this is the use of lead-free solders consisting basically of tin, which do not appear to cause any problem on Earth, but in space applications tin whiskers have resulted in several confirmed satellite failures.

  14. Map of low-frequency electromagnetic noise in the sky

    NASA Astrophysics Data System (ADS)

    Füllekrug, Martin; Mezentsev, Andrew; Watson, Robert; Gaffet, Stéphane; Astin, Ivan; Smith, Nathan; Evans, Adrian

    2015-06-01

The Earth's natural electromagnetic environment is disturbed by anthropogenic electromagnetic noise. Here we report the first results from an electromagnetic noise survey of the sky. The locations of electromagnetic noise sources are mapped on the hemisphere above a distributed array of wideband receivers that operate in a small aperture configuration. It is found that the noise sources can be localized at elevation angles up to ~60° in the sky, well above the horizon. The sky also exhibits zones with little or no noise that are found toward the local zenith and the southwest of the array. These results are obtained by a rigorous analysis of the residuals from the classic dispersion relation for electromagnetic waves using an array analysis of electric field measurements in the frequency range from ~20 to 250 kHz. The observed locations of the noise sources enable detailed observations of ionospheric modification, for example, caused by particle precipitation and lightning discharges, while the observed exclusion zones enable the detection of weak natural electromagnetic emissions, for example, from streamers in transient luminous events above thunderclouds.

  15. New method to incorporate Type B uncertainty into least-squares procedures in radionuclide metrology.

    PubMed

    Han, Jubong; Lee, K B; Lee, Jong-Man; Park, Tae Soon; Oh, J S; Oh, Pil-Jei

    2016-03-01

We discuss a new method to incorporate Type B uncertainty into least-squares procedures. The new method is based on an extension of the likelihood function from which a conventional least-squares function is derived. The extended likelihood function is the product of the original likelihood function with additional PDFs (probability density functions) that characterize the Type B uncertainties. The PDFs are considered to describe one's incomplete knowledge of correction factors, called nuisance parameters. We use the extended likelihood function to make point and interval estimates of parameters in essentially the same way as with the least-squares function used in the conventional least-squares method. Since the nuisance parameters are not of interest and should be prevented from appearing in the final result, we eliminate them by using the profile likelihood. As an example, we present a case study for a linear regression analysis with a common component of Type B uncertainty. In this example we compare the analysis results obtained using our procedure with those from conventional methods. Copyright © 2015. Published by Elsevier Ltd.
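
    A minimal numerical sketch of the idea, with invented data: a straight-line fit whose measurements share one common Type B offset theta. The extended likelihood multiplies the least-squares likelihood by a Gaussian PDF for theta, and theta is then profiled out to get an interval for the intercept, where the common offset matters most.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(3)
    x = np.linspace(0.0, 10.0, 12)
    sigma_A, sigma_B = 0.3, 0.5              # Type A noise, Type B (common) scale
    y = 1.0 + 0.5 * x + rng.normal(0, sigma_B) + rng.normal(0, sigma_A, x.size)

    def nll(p):
        a, b, theta = p                      # theta is the nuisance parameter
        resid = y - a - b * x - theta
        # extended likelihood = least-squares part * N(theta; 0, sigma_B^2)
        return (resid ** 2).sum() / (2 * sigma_A ** 2) + theta ** 2 / (2 * sigma_B ** 2)

    fit = minimize(nll, x0=[0.0, 0.0, 0.0])
    a_hat, b_hat, theta_hat = fit.x

    def profile(a):                          # minimize over (b, theta) at fixed a
        return minimize(lambda q: nll([a, q[0], q[1]]), x0=[b_hat, theta_hat]).fun

    grid = np.linspace(a_hat - 1.5, a_hat + 1.5, 121)
    inside = grid[np.array([profile(a) for a in grid]) <= fit.fun + 0.5]
    print(f"a = {a_hat:.3f}, approx 68% interval [{inside.min():.3f}, {inside.max():.3f}]")
    # The interval is visibly wider than a pure Type A analysis would give,
    # since the common offset theta cannot be separated from the intercept.
    ```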

  16. Conducting Simulation Studies in the R Programming Environment.

    PubMed

    Hallgren, Kevin A

    2013-10-12

    Simulation studies allow researchers to answer specific questions about data analysis, statistical power, and best-practices for obtaining accurate results in empirical research. Despite the benefits that simulation research can provide, many researchers are unfamiliar with available tools for conducting their own simulation studies. The use of simulation studies need not be restricted to researchers with advanced skills in statistics and computer programming, and such methods can be implemented by researchers with a variety of abilities and interests. The present paper provides an introduction to methods used for running simulation studies using the R statistical programming environment and is written for individuals with minimal experience running simulation studies or using R. The paper describes the rationale and benefits of using simulations and introduces R functions relevant for many simulation studies. Three examples illustrate different applications for simulation studies, including (a) the use of simulations to answer a novel question about statistical analysis, (b) the use of simulations to estimate statistical power, and (c) the use of simulations to obtain confidence intervals of parameter estimates through bootstrapping. Results and fully annotated syntax from these examples are provided.
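
    Although the paper works in R, the logic of its third example (bootstrap confidence intervals for a parameter estimate) is easy to sketch in Python; the data here are simulated on the spot and all settings are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 60
    x = rng.normal(size=n)
    y = 0.5 * x + rng.normal(size=n)       # simulated data, true r around 0.45

    def corr(a, b):
        return np.corrcoef(a, b)[0, 1]

    boot = []
    for _ in range(5000):                  # resample cases with replacement
        idx = rng.integers(0, n, n)
        boot.append(corr(x[idx], y[idx]))

    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"r = {corr(x, y):.3f}, 95% percentile bootstrap CI [{lo:.3f}, {hi:.3f}]")
    ```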

  17. Methods for calculating confidence and credible intervals for the residual between-study variance in random effects meta-regression models

    PubMed Central

    2014-01-01

    Background Meta-regression is becoming increasingly used to model study level covariate effects. However this type of statistical analysis presents many difficulties and challenges. Here two methods for calculating confidence intervals for the magnitude of the residual between-study variance in random effects meta-regression models are developed. A further suggestion for calculating credible intervals using informative prior distributions for the residual between-study variance is presented. Methods Two recently proposed and, under the assumptions of the random effects model, exact methods for constructing confidence intervals for the between-study variance in random effects meta-analyses are extended to the meta-regression setting. The use of Generalised Cochran heterogeneity statistics is extended to the meta-regression setting and a Newton-Raphson procedure is developed to implement the Q profile method for meta-analysis and meta-regression. WinBUGS is used to implement informative priors for the residual between-study variance in the context of Bayesian meta-regressions. Results Results are obtained for two contrasting examples, where the first example involves a binary covariate and the second involves a continuous covariate. Intervals for the residual between-study variance are wide for both examples. Conclusions Statistical methods, and R computer software, are available to compute exact confidence intervals for the residual between-study variance under the random effects model for meta-regression. These frequentist methods are almost as easily implemented as their established counterparts for meta-analysis. Bayesian meta-regressions are also easily performed by analysts who are comfortable using WinBUGS. Estimates of the residual between-study variance in random effects meta-regressions should be routinely reported and accompanied by some measure of their uncertainty. Confidence and/or credible intervals are well-suited to this purpose. PMID:25196829
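
    The Q-profile idea at the heart of the Methods section can be sketched for the plain meta-analysis case, with no covariates; the paper extends this to meta-regression via a Newton-Raphson implementation. Effect sizes and variances below are invented.

    ```python
    # Q-profile confidence interval for the between-study variance tau^2:
    # invert the heterogeneity statistic Q(tau^2) at chi-square quantiles.
    import numpy as np
    from scipy.optimize import brentq
    from scipy.stats import chi2

    y = np.array([0.30, 0.10, 0.45, 0.25, 0.60, 0.05])   # study effect sizes
    v = np.array([0.01, 0.02, 0.015, 0.01, 0.03, 0.02])  # within-study variances
    k = len(y)

    def Q(tau2):
        w = 1.0 / (v + tau2)
        mu = (w * y).sum() / w.sum()                     # weighted mean at this tau^2
        return (w * (y - mu) ** 2).sum()

    # Q is decreasing in tau^2, so each quantile crossing gives one bound.
    lower = (brentq(lambda t: Q(t) - chi2.ppf(0.975, k - 1), 0.0, 10.0)
             if Q(0.0) > chi2.ppf(0.975, k - 1) else 0.0)
    upper = brentq(lambda t: Q(t) - chi2.ppf(0.025, k - 1), 0.0, 10.0)
    print(f"95% CI for tau^2: [{lower:.4f}, {upper:.4f}]")
    ```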

  18. Neural network representation and learning of mappings and their derivatives

    NASA Technical Reports Server (NTRS)

    White, Halbert; Hornik, Kurt; Stinchcombe, Maxwell; Gallant, A. Ronald

    1991-01-01

    Discussed here are recent theorems proving that artificial neural networks are capable of approximating an arbitrary mapping and its derivatives as accurately as desired. This fact forms the basis for further results establishing the learnability of the desired approximations, using results from non-parametric statistics. These results have potential applications in robotics, chaotic dynamics, control, and sensitivity analysis. An example involving learning the transfer function and its derivatives for a chaotic map is discussed.
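
    A quick sketch of the closing example's theme, under assumptions of our own: the logistic map as the chaotic transfer function, an off-the-shelf scikit-learn network, and derivatives taken by finite differences rather than the paper's analysis.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 1.0, 2000)
    # learn the logistic map x -> 4x(1-x)
    net = MLPRegressor(hidden_layer_sizes=(32, 32), activation="tanh",
                       solver="lbfgs", max_iter=5000,
                       random_state=0).fit(x[:, None], 4 * x * (1 - x))

    xs = np.linspace(0.05, 0.95, 9)
    h = 1e-3                                 # finite-difference step
    d_net = (net.predict((xs + h)[:, None]) - net.predict((xs - h)[:, None])) / (2 * h)
    print(np.c_[xs, d_net, 4 - 8 * xs])      # x, network derivative, true derivative
    ```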

  19. A simple technique investigating baseline heterogeneity helped to eliminate potential bias in meta-analyses.

    PubMed

    Hicks, Amy; Fairhurst, Caroline; Torgerson, David J

    2018-03-01

To perform a worked example of an approach that can be used to identify and remove potentially biased trials from meta-analyses via the analysis of baseline variables. True randomisation produces treatment groups that differ only by chance; therefore, a meta-analysis of a baseline measurement should produce no overall difference and zero heterogeneity. A meta-analysis from the British Medical Journal, known to contain significant heterogeneity and imbalance in baseline age, was chosen. Meta-analyses of baseline variables were performed and trials systematically removed, starting with those with the largest t-statistic, until the I² measure of heterogeneity became 0%, then the outcome meta-analysis repeated with only the remaining trials as a sensitivity check. We argue that heterogeneity in a meta-analysis of baseline variables should not exist, and therefore removing trials which contribute to heterogeneity from a meta-analysis will produce a more valid result. In our example, none of the overall outcomes changed when studies contributing to heterogeneity were removed. We recommend routine use of this technique, using age and a second baseline variable predictive of outcome for the particular study chosen, to help eliminate potential bias in meta-analyses. Copyright © 2017 Elsevier Inc. All rights reserved.
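
    A schematic version of the screening loop, with invented baseline differences; trials are dropped by largest standardized baseline difference (standing in for the paper's t-statistic rule) until I² reaches 0%.

    ```python
    import numpy as np

    diff = np.array([0.2, -0.1, 2.5, 0.0, -0.3, 1.8])   # baseline age differences
    se = np.array([0.4, 0.5, 0.45, 0.3, 0.6, 0.5])      # their standard errors
    keep = list(range(len(diff)))

    def i_squared(d, s):
        w = 1.0 / s ** 2
        mu = (w * d).sum() / w.sum()                    # fixed-effect pooled mean
        Q = (w * (d - mu) ** 2).sum()
        return max(0.0, 100.0 * (Q - (len(d) - 1)) / Q)

    while True:
        d, s = diff[keep], se[keep]
        i2 = i_squared(d, s)
        print(f"trials {keep}: I^2 = {i2:.0f}%")
        if i2 == 0.0 or len(keep) <= 2:
            break
        keep.pop(int(np.argmax(np.abs(d / s))))         # drop most imbalanced trial
    ```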

  20. [EXPERIENCE OF STUDY AND POSSIBLE WAYS OF ELIMINATION OF FALSE POSITIVE AND FALSE NEGATIVE RESULTS DURING EXECUTION OF POLYMERASE CHAIN REACTION ON AN EXAMPLE OF JUNIN VIRUS RNA DETECTION].

    PubMed

    Sizikova, T E; Lebedev, V N; Pantyukhov, V B; Borisevich, S V; Merkulov, V A

    2015-01-01

Experience of study and possible ways of eliminating false positive and false negative results during execution of the polymerase chain reaction are presented using the example of Junin virus RNA detection. MATERIALS AND METHODS: Junin virus, the causative agent of Argentine hemorrhagic fever (AHF), strain XJpR37/5787, was obtained from the State collection of pathogenicity group I causative agents of the 48th Central Research Institute. A reagent kit for detection of Junin virus RNA by RT-PCR was developed in the Institute and consists of 4 sets: for isolation of RNA, execution of the reverse-transcription reaction, execution of PCR, and electrophoretic detection of PCR products. RT-PCR was carried out by a standard technique. Continuous cell cultures of African green monkey Vero B and GMK-AH-1(D) were obtained from the museum of the cell culture department of the Centre. The effect of various factors acting on the sample under investigation (thawing-freezing cycles, presence of formaldehyde, presence of heparin) on the occurrence of false negative results during Junin virus RNA detection by RT-PCR was studied experimentally. Addition of 0.01% heparin to the samples was shown to completely inhibit PCR. Addition of 0.05% formaldehyde significantly reduces the sensitivity of the method. A possibility of reducing the analysis time from 15 to 5 days was shown for detection of the causative agent in samples with low concentration, by culturing the samples and subsequently analyzing the obtained material by RT-PCR. During detection of the causative agent by RT-PCR, false negative results can appear in the presence of formaldehyde or heparin in the sample. The possibility of eliminating false negative PCR results caused by a concentration of the causative agent below the sensitivity threshold was shown on the example of Junin virus RNA detection, by growing the pathogen in an appropriate accumulation system with subsequent PCR analysis of the obtained material.

  1. An intelligent computer tutor to guide self-explanation while learning from examples

    NASA Astrophysics Data System (ADS)

    Conati, Cristina

    1999-11-01

Many studies in cognitive science show that self-explanation---the process of clarifying and making more complete to oneself the solution of an example---improves learning, and that guiding self-explanation extends these benefits. This thesis presents an intelligent computer tutor that aims to improve learning from examples by supporting self-explanation. The tutor, known as the SE (self-explanation) Coach, is innovative in two ways. First, it represents the first attempt to develop a computer tutor that supports example studying instead of problem solving. Second, it explicitly guides a domain-general, meta-cognitive skill: self-explanation. The SE-Coach is part of the Andes tutoring system for college physics and is meant to be used in conjunction with the problem solving tasks that Andes supports. In order to maximize the system's capability to trigger these beneficial cognitive processes, every element of the SE-Coach embeds existing hypotheses about the features that make self-explanation effective for learning. Designing the SE-Coach involved finding solutions for three main challenges: (1) to design an interface that effectively monitors and supports self-explanation; (2) to devise a student model that allows the assessment of example understanding from reading and self-explanation actions; (3) to effectively elicit further self-explanation that improves the student's example understanding. In this work we present our solutions to these challenges: (1) an interface including principled, interactive tools to explore examples and build self-explanations under the SE-Coach's supervision; (2) a probabilistic student model based on a Bayesian network, which integrates a model of correct self-explanation and information on the student's knowledge and studying actions to generate a probabilistic assessment of the student's example understanding; (3) tutorial interventions that rely on the student model to detect deficits in the student's example understanding and elicit self-explanations that overcome them. In this thesis we also present the results of a formal study with 56 college students to evaluate the effectiveness of the SE-Coach. We discuss some hypotheses to explain the obtained results, based on the analysis of the data collected during the experiment.

  2. Enhanced definition and required examples of common datum imposed by ISO standard

    NASA Astrophysics Data System (ADS)

    Yan, Yiqing; Bohn, Martin

    2017-12-01

According to the ISO Geometrical Product Specifications (GPS), the establishment and definition of common datums for geometrical components are not fully defined. There are two main limitations of this standard: first, the explanations of the ISO examples of common datums do not match their corresponding definitions; second, a full definition of common datum is missing. This paper suggests a new approach for an enhanced definition and concrete examples of common datums and proposes a holistic methodology for the establishment of a common datum for each geometrical component. This research is based on the analysis of the physical behaviour of geometrical components, orientation constraints and invariance classes of datums. The approach fills the definition gaps of common datum in the ISO GPS standard, thereby eliminating those deficits. As a result, an improved methodology for a fully functional definition of common datum was formulated.

  3. `So, What Do Men and Women Want? Is It any Different from What Animals Want?' Sex Education in an Upper Secondary School

    NASA Astrophysics Data System (ADS)

    Orlander, Auli Arvola

    2016-12-01

    The aim of the study is to discuss and problematise notions of femininity and masculinity constructed in teaching situations among 16-year-old upper-secondary students studying science. The empirical examples originate from a teaching session with the theme of `sex and relationships'. The analysis is focused on metaphors inherent in a lesson that has its origins in the animal world. The findings show that the lesson `sex in the animal world' is full of anthropomorphism, metaphors that humanise animal behaviour. Teachers and students compare the animals' sexual behaviour with human behaviour, with the result that the animal world can be perceived as representative of natural sexual behaviour. The survey illustrates problems with how the examples are permeated by cultural values in the presentation of the animal world and how these examples form constructions of femininity and masculinity in the classroom.

  4. School-wide PBIS: An Example of Applied Behavior Analysis Implemented at a Scale of Social Importance.

    PubMed

    Horner, Robert H; Sugai, George

    2015-05-01

    School-wide Positive Behavioral Interventions and Supports (PBIS) is an example of applied behavior analysis implemented at a scale of social importance. In this paper, PBIS is defined and the contributions of behavior analysis in shaping both the content and implementation of PBIS are reviewed. Specific lessons learned from implementation of PBIS over the past 20 years are summarized.

  5. High temperature flow-through device for rapid solubilization and analysis

    DOEpatents

    West, Jason A. A. [Castro Valley, CA; Hukari, Kyle W [San Ramon, CA; Patel, Kamlesh D [Dublin, CA; Peterson, Kenneth A [Albuquerque, NM; Renzi, Ronald F [Tracy, CA

    2009-09-22

Devices and methods for thermal lysing of biological material, for example vegetative bacterial cells and bacterial spores, are provided. Hot-solution methods for solubilizing bacterial spores are described. Systems for direct analysis are disclosed, including thermal lysers coupled to sample preparation stations. Integrated systems capable of performing sample lysis, labeling, and protein fingerprint analysis of biological material, for example vegetative bacterial cells, bacterial spores, and viruses, are provided.

  6. High temperature flow-through device for rapid solubilization and analysis

    DOEpatents

    West, Jason A. A.; Hukari, Kyle W.; Patel, Kamlesh D.; Peterson, Kenneth A.; Renzi, Ronald F.

    2013-04-23

Devices and methods for thermal lysing of biological material, for example vegetative bacterial cells and bacterial spores, are provided. Hot-solution methods for solubilizing bacterial spores are described. Systems for direct analysis are disclosed, including thermal lysers coupled to sample preparation stations. Integrated systems capable of performing sample lysis, labeling, and protein fingerprint analysis of biological material, for example vegetative bacterial cells, bacterial spores, and viruses, are provided.

  7. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  8. 40 CFR 1065.275 - N2O measurement devices.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... procedures for interpretation of infrared spectra. For example, EPA Test Method 320 is considered a valid... infrared analyzer. Examples of laser infrared analyzers are pulsed-mode high-resolution narrow band mid... for analysis. Examples of acceptable columns are a PLOT column consisting of bonded polystyrene...

  9. NASA Systems Analysis and Concepts Directorate Mission and Trade Study Analysis

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell; Guynn, Mark; Hahn, Andrew; Lepsch, Roger; Mazanek, Dan; Dollyhigh, Sam

    2006-01-01

Mission analysis, as practiced by the NASA Langley Research Center's Systems Analysis and Concepts Directorate (SACD), consists of activities used to define, assess, and evaluate a wide spectrum of aerospace systems for given requirements. The missions for these systems encompass a broad range from aviation to space exploration. The customer, who is usually another NASA organization or another government agency, often predefines the mission. Once a mission is defined, the goals and objectives that the system will need to meet are delineated and quantified. A number of alternative systems are then typically developed and assessed relative to these goals and objectives. This is done in order to determine the most favorable design approaches for further refinement. Trade studies are performed in order to understand the impact of a requirement on each system and to select among competing design options. Items varied in trade studies typically include: design variables or design constraints; technology and subsystem options; and operational approaches. The results of trade studies are often used to refine the mission and system requirements. SACD studies have been integral to the decision processes of many organizations for decades. Many recent examples of SACD mission and trade study analyses illustrate their excellence and influence. The SACD-led, Agency-wide effort to analyze a broad range of future human lunar exploration scenarios for NASA's Exploration Systems Mission Directorate (ESMD) and the Mars airplane design study in support of the Aerial Regional-scale Environment Survey of Mars (ARES) mission are two such examples. This paper describes SACD's mission and trade study analysis activities in general and presents the lunar exploration and Mars airplane studies as examples of the type of work performed by SACD.

  10. Analysis of Recent Corporal Punishment Cases Reported in National Newspapers.

    ERIC Educational Resources Information Center

    Clarke, Jacqueline; And Others

    This paper presents examples of types of corporal punishment and a content analysis of newspaper articles since 1977 dealing with corporal punishment in public and nonpublic schools. Examples are used to illustrate types of punishment, paddling injuries, injuries to other parts of the body, special punishments devised by teachers, deaths due to…

  11. Discovering Reliable Sources of Biochemical Thermodynamic Data to Aid Students' Understanding

    ERIC Educational Resources Information Center

Méndez, Eduardo; Cerdá, María F.

    2016-01-01

    Students of physical chemistry in biochemical disciplines need biochemical examples to capture the need, not always understood, of a difficult area in their studies. The use of thermodynamic data in the chemical reference state may lead to incorrect interpretations in the analysis of biochemical examples when the analysis does not include relevant…

  12. Communicating Comparative Findings from Meta-Analysis in Educational Research: Some Examples and Suggestions

    ERIC Educational Resources Information Center

    Higgins, Steve; Katsipataki, Maria

    2016-01-01

    This article reviews some of the strengths and limitations of the comparative use of meta-analysis findings, using examples from the Sutton Trust-Education Endowment Foundation Teaching and Learning "Toolkit" which summarizes a range of educational approaches to improve pupil attainment in schools. This comparative use of quantitative…

  13. Melding Leadership Lessons with Data Collection and Analysis Lessons: Two Classroom Examples

    ERIC Educational Resources Information Center

    Lindahl, Ronald

    2014-01-01

    The purpose of this module is to illustrate examples of how courses in educational leadership programs can effectively and efficiently meld lessons on leadership with lessons on data collection and analysis. The rationale behind emphasizing this combination is very straightforward: America's schools need leaders who are adept with data-based…

  14. High Resolution Melting (HRM) for High-Throughput Genotyping-Limitations and Caveats in Practical Case Studies.

    PubMed

    Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz; Strapagiel, Dominik

    2017-11-03

    High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup.

  15. High Resolution Melting (HRM) for High-Throughput Genotyping—Limitations and Caveats in Practical Case Studies

    PubMed Central

    Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz

    2017-01-01

    High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup. PMID:29099791

  16. Scope of inextensible frame hypothesis in local action analysis of spherical reservoirs

    NASA Astrophysics Data System (ADS)

    Vinogradov, Yu. I.

    2017-05-01

Spherical reservoirs, as objects that are nearly perfect with respect to weight, are used in spacecraft, where thin-walled elements are joined by frames into multifunction structures. The junctions are local, which results in the origination of stress concentration regions and the corresponding rigidity problems. The thin-walled elements are reinforced by frames to decrease the stresses in them. To simplify the analysis of the mathematical model of the common deformation of the shell (a mathematical idealization of the reservoir) and the frame, the assumption that the frame axial line is inextensible is widely used (in particular, in the handbook literature). Unjustified use of this assumption significantly distorts the picture of the stress-strain state. In this paper, the example of a lens-shaped structure formed by two spherical shell segments connected by a frame of square profile is used to carry out a numerical comparative analysis of the solutions with and without the inextensible-frame hypothesis. The scope of the hypothesis is shown as a function of the structure's geometric parameters and the degree of load localization. The obtained results can be used to determine the stress-strain state of a thin-walled structure with an a priori prescribed error, for example, in the research and experimental design of aerospace systems.

  17. Formalization, Annotation and Analysis of Diverse Drug and Probe Screening Assay Datasets Using the BioAssay Ontology (BAO)

    PubMed Central

    Vempati, Uma D.; Przydzial, Magdalena J.; Chung, Caty; Abeyruwan, Saminda; Mir, Ahsan; Sakurai, Kunie; Visser, Ubbo; Lemmon, Vance P.; Schürer, Stephan C.

    2012-01-01

Huge amounts of high-throughput screening (HTS) data for probe and drug development projects are being generated in the pharmaceutical industry and more recently in the public sector. The resulting experimental datasets are increasingly being disseminated via publicly accessible repositories. However, existing repositories lack sufficient metadata to describe the experiments and are often difficult to navigate by non-experts. The lack of standardized descriptions and semantics of biological assays and screening results hinder targeted data retrieval, integration, aggregation, and analyses across different HTS datasets, for example to infer mechanisms of action of small molecule perturbagens. To address these limitations, we created the BioAssay Ontology (BAO). BAO has been developed with a focus on data integration and analysis enabling the classification of assays and screening results by concepts that relate to format, assay design, technology, target, and endpoint. Previously, we reported on the higher-level design of BAO and on the semantic querying capabilities offered by the ontology-indexed triple store of HTS data. Here, we report on our detailed design, annotation pipeline, substantially enlarged annotation knowledgebase, and analysis results. We used BAO to annotate assays from the largest public HTS data repository, PubChem, and demonstrate its utility to categorize and analyze diverse HTS results from numerous experiments. BAO is publicly available from the NCBO BioPortal at http://bioportal.bioontology.org/ontologies/1533. BAO provides controlled terminology and uniform scope to report probe and drug discovery screening assays and results. BAO leverages description logic to formalize the domain knowledge and facilitate the semantic integration with diverse other resources. As a consequence, BAO offers the potential to infer new knowledge from a corpus of assay results, for example molecular mechanisms of action of perturbagens. PMID:23155465

  18. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods.

    PubMed

    Towers, Sherry; Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), that examined how many mass killings fell within a 14 day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle.
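
    The statistical point is easy to reproduce with a toy experiment: estimate an exponential rate from the same simulated waiting times, once with the closed-form unbinned MLE and once from counts in 14-day bins; every number below is invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(7)
    rate = 1.0 / 10.0                        # hypothetical mean wait of 10 days
    err_unbinned, err_binned = [], []

    for _ in range(500):
        t = rng.exponential(1.0 / rate, 200)
        err_unbinned.append(1.0 / t.mean() - rate)   # unbinned MLE, closed form
        # binned alternative: counts in 14-day windows, least-squares fit
        edges = np.arange(0.0, 71.0, 14.0)
        counts, _ = np.histogram(t, bins=edges)
        def loss(r):
            expect = t.size * (np.exp(-r * edges[:-1]) - np.exp(-r * edges[1:]))
            return ((counts - expect) ** 2).sum()
        fit = minimize_scalar(loss, bounds=(1e-3, 1.0), method="bounded")
        err_binned.append(fit.x - rate)

    rms = lambda e: float(np.sqrt(np.mean(np.square(e))))
    print("RMS error, unbinned MLE:", rms(err_unbinned))
    print("RMS error, binned fit  :", rms(err_binned))
    ```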

  19. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods

    PubMed Central

    Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    Background When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. Methods In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. Conclusions When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), that examined how many mass killings fell within a 14 day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle. PMID:29742115

  20. Safety of serotonin (5-HT3) receptor antagonists in patients undergoing surgery and chemotherapy: protocol for a systematic review and network meta-analysis.

    PubMed

    Tricco, Andrea C; Soobiah, Charlene; Antony, Jesmin; Hemmelgarn, Brenda; Moher, David; Hutton, Brian; Straus, Sharon E

    2013-06-28

Serotonin (5-HT3) receptor antagonists are a class of antiemetic medications often used to prevent nausea and vomiting among patients undergoing chemotherapy, radiotherapy or surgery. However, recent studies suggest that these agents might be associated with increased cardiac harm. To examine this further, we are proposing to conduct a systematic review and network meta-analysis on the comparative safety of 5-HT3 receptor antagonists among patients undergoing chemotherapy or surgery. Studies reporting one or more safety outcomes of interest for 5-HT3 receptor antagonists compared with each other, placebo, and/or other antiemetic agents (for example, benzamides, phenothiazines, butyrophenones, antihistamines, and anticholinergics) among children and adult patients undergoing surgery or chemotherapy will be included. Our primary outcome of interest is arrhythmia. Our secondary outcomes include cardiac death, QT prolongation, PR prolongation, all-cause mortality, nausea, and vomiting. We will include experimental studies, quasi-experimental studies (namely controlled before-after and interrupted time series), and observational studies (namely cohort studies). We will not limit inclusion by publication status, time period, duration of follow-up or language of dissemination. Electronic databases (for example, MEDLINE, EMBASE) will be searched from inception onwards. These main searches will be supplemented by searching for difficult-to-locate and unpublished studies, such as dissertations and governmental reports. The eligibility criteria will be pilot-tested and subsequently used to screen the literature search results by two reviewers in duplicate. A similar process will be followed for full-text screening, data abstraction, and risk of bias/methodological quality appraisal. The Cochrane Risk of Bias tool will be used to appraise experimental and quasi-experimental studies, and cohort studies will be assessed using the Newcastle-Ottawa Scale. If the data allow, random effects meta-analysis and a network (that is, mixed treatment comparisons) meta-analysis will be conducted. All analyses will be conducted separately for different study designs, patient populations (for example, children and adults), and reasons for administering 5-HT3 receptor antagonists (for example, post-surgery and chemotherapy). Our results will help inform patients, clinicians, and health policy-makers about the potential safety concerns, as well as the comparative safety, of using these antiemetic agents. PROSPERO registry number: CRD42013003564.

  1. Moving Beyond Univariate Post-Hoc Testing in Exercise Science: A Primer on Descriptive Discriminate Analysis.

    PubMed

    Barton, Mitch; Yeatts, Paul E; Henson, Robin K; Martin, Scott B

    2016-12-01

    There has been a recent call to improve data reporting in kinesiology journals, including the appropriate use of univariate and multivariate analysis techniques. For example, a multivariate analysis of variance (MANOVA) with univariate post hocs and a Bonferroni correction is frequently used to investigate group differences on multiple dependent variables. However, this univariate approach decreases power, increases the risk for Type 1 error, and contradicts the rationale for conducting multivariate tests in the first place. The purpose of this study was to provide a user-friendly primer on conducting descriptive discriminant analysis (DDA), which is a post-hoc strategy to MANOVA that takes into account the complex relationships among multiple dependent variables. A real-world example using the Statistical Package for the Social Sciences syntax and data from 1,095 middle school students on their body composition and body image are provided to explain and interpret the results from DDA. While univariate post hocs increased the risk for Type 1 error to 76%, the DDA identified which dependent variables contributed to group differences and which groups were different from each other. For example, students in the very lean and Healthy Fitness Zone categories for body mass index experienced less pressure to lose weight, more satisfaction with their body, and higher physical self-concept than the Needs Improvement Zone groups. However, perceived pressure to gain weight did not contribute to group differences because it was a suppressor variable. Researchers are encouraged to use DDA when investigating group differences on multiple correlated dependent variables to determine which variables contributed to group differences.
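
    A bare-bones sketch of the DDA mechanics on fabricated data (not the article's SPSS syntax or body-image dataset): fit the discriminant functions, then read structure coefficients and group centroids.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    groups = np.repeat([0, 1, 2], 50)                 # three groups, n = 150
    X = rng.normal(size=(150, 4)) + groups[:, None] * [0.8, 0.5, 0.1, 0.0]

    lda = LinearDiscriminantAnalysis(n_components=2).fit(X, groups)
    scores = lda.transform(X)                         # discriminant function scores

    # Structure coefficients: correlation of each dependent variable with each
    # function; large |r| marks the variables driving group separation.
    structure = np.corrcoef(np.hstack([X, scores]).T)[:4, 4:]
    print(structure.round(2))
    for g in range(3):                                # group centroids per function
        print("centroid", g, scores[groups == g].mean(axis=0).round(2))
    ```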

  2. Using mind mapping techniques for rapid qualitative data analysis in public participation processes.

    PubMed

    Burgess-Allen, Jilla; Owen-Smith, Vicci

    2010-12-01

    In a health service environment where timescales for patient participation in service design are short and resources scarce, a balance needs to be achieved between research rigour and the timeliness and utility of the findings of patient participation processes. To develop a pragmatic mind mapping approach to managing the qualitative data from patient participation processes. While this article draws on experience of using mind maps in a variety of participation processes, a single example is used to illustrate the approach. In this example mind maps were created during the course of patient participation focus groups. Two group discussions were also transcribed verbatim to allow comparison of the rapid mind mapping approach with traditional thematic analysis of qualitative data. The illustrative example formed part of a local alcohol service review which included consultation with local alcohol service users, their families and staff groups. The mind mapping approach provided a pleasing graphical format for representing the key themes raised during the focus groups. It helped stimulate and galvanize discussion and keep it on track, enhanced transparency and group ownership of the data analysis process, allowed a rapid dynamic between data collection and feedback, and was considerably faster than traditional methods for the analysis of focus groups, while resulting in similar broad themes. This study suggests that the use of a mind mapping approach to managing qualitative data can provide a pragmatic resolution of the tension between limited resources and quality in patient participation processes. © 2010 The Authors. Health Expectations © 2010 Blackwell Publishing Ltd.

  3. Medical Image Analysis by Cognitive Information Systems - a Review.

    PubMed

    Ogiela, Lidia; Takizawa, Makoto

    2016-10-01

This publication presents a review of medical image analysis systems. The paradigms of cognitive information systems are presented through examples of medical image analysis systems, and the semantic processes involved are shown as they apply to different types of medical images. Cognitive information systems are defined on the basis of methods for the semantic analysis and interpretation of information, here medical images, applied to the cognitive meaning of the medical images contained in the analyzed data sets. Semantic analysis is proposed to analyze the meaning of the data; meaning is carried by information, for example by medical images. Medical image analysis is presented and discussed as applied to various types of medical images, showing selected human organs with different pathologies, analyzed using different classes of cognitive information systems. Cognitive information systems dedicated to medical image analysis are also defined for decision-support tasks. This is important, for example, in diagnostic and therapy processes and in the selection of semantically relevant features from the analyzed data sets; those features allow the creation of a new way of analysis.

  4. Fundamental principles of conducting a surgery economic analysis study.

    PubMed

    Kotsis, Sandra V; Chung, Kevin C

    2010-02-01

    The use of economic evaluation in surgery is scarce. Economic evaluation is used even less so in plastic surgery, in which health-related quality of life is of particular importance. This article, part of a tutorial series on evidence-based medicine, focuses on the fundamental principles of conducting a surgery economic analysis. The authors include the essential aspects of conducting a surgical cost-utility analysis by considering perspectives, costs, outcomes, and utilities. The authors also describe and give examples of how to conduct the analyses (including calculating quality-adjusted life-years and discounting), how to interpret the results, and how to report the results. Although economic analyses are not simple to conduct, a well-conducted one provides many rewards, such as recommending the adoption of a more effective treatment. For comparing and interpreting economic analysis publications, it is important that all studies use consistent methodology and report the results in a similar manner.
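
    To make the QALY and discounting step concrete, here is a toy calculation with invented utility weights and costs; the 3% rate and five-year horizon are arbitrary choices, not recommendations from the article.

    ```python
    # Discounted QALYs and an incremental cost-effectiveness ratio (ICER)
    # for two hypothetical treatments.
    def discounted_qalys(utilities, rate=0.03):
        """Sum yearly utility weights, discounting year t by 1/(1+rate)**t."""
        return sum(u / (1 + rate) ** t for t, u in enumerate(utilities))

    # Utility weights (0 = death, 1 = perfect health) over five years.
    standard = [0.70, 0.70, 0.68, 0.65, 0.60]
    new_op   = [0.85, 0.84, 0.82, 0.80, 0.78]

    q_std, q_new = discounted_qalys(standard), discounted_qalys(new_op)
    cost_std, cost_new = 8000.0, 15000.0     # hypothetical total costs

    icer = (cost_new - cost_std) / (q_new - q_std)
    print(f"QALYs: {q_std:.2f} vs {q_new:.2f}; ICER = ${icer:,.0f} per QALY")
    ```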

  5. A Study of the NASS-CDS System for Injury/Fatality Rates of Occupants in Various Restraints and A Discussion of Alternative Presentation Methods

    PubMed Central

    Stucki, Sheldon Lee; Biss, David J.

    2000-01-01

    An analysis was performed using the National Automotive Sampling System Crashworthiness Data System (NASS-CDS) database to compare the injury/fatality rates of variously restrained driver occupants as compared to unrestrained driver occupants in the total database of drivers/frontals, and also by Delta-V. A structured search of the NASS-CDS was done using the SAS® statistical analysis software to extract the data for this analysis and the SUDAAN software package was used to arrive at statistical significance indicators. In addition, this paper goes on to investigate different methods for presenting results of accident database searches including significance results; a risk versus Delta-V format for specific exposures; and, a percent cumulative injury versus Delta-V format to characterize injury trends. These alternative analysis presentation methods are then discussed by example using the present study results. PMID:11558105

  6. Meta-Analysis of Effect Sizes Reported at Multiple Time Points Using General Linear Mixed Model.

    PubMed

    Musekiwa, Alfred; Manda, Samuel O M; Mwambi, Henry G; Chen, Ding-Geng

    2016-01-01

Meta-analysis of longitudinal studies combines effect sizes measured at pre-determined time points. The most common approach involves performing separate univariate meta-analyses at individual time points. This simplistic approach ignores dependence between longitudinal effect sizes, which might result in less precise parameter estimates. In this paper, we show how to conduct a meta-analysis of longitudinal effect sizes where we contrast different covariance structures for dependence between effect sizes, both within and between studies. We propose new combinations of covariance structures for the dependence between effect sizes and utilize a practical example involving meta-analysis of 17 trials comparing postoperative treatments for a type of cancer, where survival is measured at 6, 12, 18 and 24 months post randomization. Although the results from this particular data set show the benefit of accounting for within-study serial correlation between effect sizes, simulations are required to confirm these results.
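
    The simplest version of the idea, on invented numbers: a fixed-effect multivariate pooling that keeps the within-study covariance between effect sizes at two time points. The paper's general linear mixed model additionally estimates between-study covariance structures.

    ```python
    import numpy as np

    ys = [np.array([0.40, 0.30]),            # per-study effects at 6 and 12 months
          np.array([0.55, 0.45]),
          np.array([0.35, 0.20])]
    Vs = [np.array([[0.020, 0.012], [0.012, 0.030]]),   # within-study covariances
          np.array([[0.015, 0.009], [0.009, 0.025]]),
          np.array([[0.030, 0.018], [0.018, 0.040]])]

    W = sum(np.linalg.inv(V) for V in Vs)               # total precision
    mu = np.linalg.solve(W, sum(np.linalg.inv(V) @ y for y, V in zip(ys, Vs)))
    cov_mu = np.linalg.inv(W)
    print("pooled effects:", mu.round(3))
    print("standard errors:", np.sqrt(np.diag(cov_mu)).round(3))
    ```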

  7. Sentiment analysis of political communication: combining a dictionary approach with crowdcoding.

    PubMed

    Haselmayer, Martin; Jenny, Marcelo

    2017-01-01

Sentiment is important in studies of news values, public opinion, negative campaigning and political polarization. The explosive expansion of digital textual data, together with fast progress in automated text analysis, provides vast opportunities for innovative social science research. Unfortunately, tools currently available for automated sentiment analysis are mostly restricted to English texts and require considerable contextual adaptation to produce valid results. We present a procedure for collecting fine-grained sentiment scores through crowdcoding to build a negative sentiment dictionary in a language and for a domain of choice. The dictionary enables the analysis of large text corpora that resource-intensive hand-coding struggles to cope with. We calculate the tonality of sentences from dictionary words and we validate these estimates with results from manual coding. The results show that the crowd-based dictionary provides efficient and valid measurement of sentiment. Empirical examples illustrate its use by analyzing the tonality of party statements and media reports.
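
    The scoring step itself is simple once the crowdcoded dictionary exists; the sketch below uses a made-up miniature dictionary and a mean-score rule, which is one of several plausible ways sentence tonality could be aggregated.

    ```python
    # Invented stand-in weights for crowdcoded negative sentiment scores.
    negative_dict = {"crisis": -0.9, "failure": -0.8, "attack": -0.7,
                     "weak": -0.5, "problem": -0.4}

    def tonality(sentence):
        """Mean dictionary score of matched words; 0.0 if none match."""
        words = sentence.lower().split()
        scores = [negative_dict[w] for w in words if w in negative_dict]
        return sum(scores) / len(scores) if scores else 0.0

    statements = ["The government response to the crisis was a failure",
                  "The new policy supports families"]
    for s in statements:
        print(f"{tonality(s):+.2f}  {s}")
    ```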

  8. Fundamental Travel Demand Model Example

    NASA Technical Reports Server (NTRS)

    Hanssen, Joel

    2010-01-01

    Instances of transportation models are abundant and detailed "how to" instruction is available in the form of transportation software help documentation. The purpose of this paper is to look at the fundamental inputs required to build a transportation model by developing an example passenger travel demand model. The example model reduces the scale to a manageable size for the purpose of illustrating the data collection and analysis required before the first step of the model begins. This aspect of the model development would not reasonably be discussed in software help documentation (it is assumed the model developer comes prepared). Recommendations are derived from the example passenger travel demand model to suggest future work regarding the data collection and analysis required for a freight travel demand model.
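
    As a taste of the fundamentals discussed above, the snippet below runs the trip-distribution step of a classic four-step model (a singly constrained gravity model) on a made-up two-origin, three-destination network; all numbers are invented.

    ```python
    import numpy as np

    productions = np.array([1000.0, 500.0])        # trips produced per origin zone
    attractions = np.array([600.0, 400.0, 500.0])  # attractiveness per destination
    cost = np.array([[5.0, 10.0, 15.0],            # travel cost (min), origin x dest
                     [12.0, 4.0, 8.0]])

    friction = np.exp(-0.1 * cost)                 # impedance function f(c)
    weights = attractions * friction               # A_j * f(c_ij)
    trips = productions[:, None] * weights / weights.sum(axis=1, keepdims=True)
    print(trips.round(0))                          # origin-destination trip table
    ```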

  9. Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support

    NASA Astrophysics Data System (ADS)

    Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.

    2016-12-01

    Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the uncertainty framing approach used, using a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to better address uncertainty.

  10. Optimization Based Efficiencies in First Order Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Peck, Jeffrey A.; Mahadevan, Sankaran

    2003-01-01

    This paper develops a method for updating the gradient vector of the limit state function in reliability analysis using Broyden's rank one updating technique. In problems that use a commercial code as a black box, the gradient calculations are usually done using a finite difference approach, which becomes very expensive for large system models. The proposed method replaces the finite difference gradient calculations in a standard first order reliability method (FORM) with Broyden's Quasi-Newton technique. The resulting algorithm of Broyden updates within a FORM framework (BFORM) is used to run several example problems, and the results compared to standard FORM results. It is found that BFORM typically requires fewer functional evaluations than FORM to converge to the same answer.
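
    The BFORM algorithm itself is not reproduced in the record, but its core ingredient, Broyden's rank-one secant update of the limit-state gradient, is standard. The sketch below shows that single update step in Python on a toy limit state function; the variable names are invented.

      import numpy as np

      # Broyden rank-one update: correct the old gradient estimate grad0 so it
      # reproduces the observed change dg = g(x1) - g(x0) along the step dx,
      # avoiding the extra evaluations a finite-difference gradient would need.
      def broyden_gradient_update(grad0, dx, dg):
          return grad0 + ((dg - grad0 @ dx) / (dx @ dx)) * dx

      g = lambda x: x[0] ** 2 + x[1] - 4.0        # toy limit state function
      x0, x1 = np.array([1.0, 1.0]), np.array([1.2, 1.1])
      grad0 = np.array([2.0, 1.0])                # exact gradient at x0
      dg = g(x1) - g(x0)
      print(broyden_gradient_update(grad0, x1 - x0, dg))  # approx gradient at x1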

  11. Multistability and instability analysis of recurrent neural networks with time-varying delays.

    PubMed

    Zhang, Fanghai; Zeng, Zhigang

    2018-01-01

    This paper provides new theoretical results on the multistability and instability analysis of recurrent neural networks with time-varying delays. It is shown that such n-neuronal recurrent neural networks have exactly [Formula: see text] equilibria, [Formula: see text] of which are locally exponentially stable and the others are unstable, where k_0 is a nonnegative integer such that k_0 ≤ n. By using the combination method of two different divisions, recurrent neural networks can possess more dynamic properties. This method improves and extends the existing results in the literature. Finally, one numerical example is provided to show the superiority and effectiveness of the presented results. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. The convergence study of the homotopy analysis method for solving nonlinear Volterra-Fredholm integrodifferential equations.

    PubMed

    Ghanbari, Behzad

    2014-01-01

    We aim to study the convergence of the homotopy analysis method (HAM in short) for solving special nonlinear Volterra-Fredholm integrodifferential equations. The sufficient condition for the convergence of the method is briefly addressed. Some illustrative examples are also presented to demonstrate the validity and applicability of the technique. Comparison of the results obtained by HAM with the exact solution shows that the method is reliable and capable of providing analytic treatment for solving such equations.

  13. Research of processes of reception and analysis of dynamic digital medical images in hardware/software complexes used for diagnostics and treatment of cardiovascular diseases

    NASA Astrophysics Data System (ADS)

    Karmazikov, Y. V.; Fainberg, E. M.

    2005-06-01

    Work with DICOM-compatible equipment integrated into hardware and software systems for medical purposes is considered. The structure of the data reception and transformation process is presented using the example of the digital roentgenography and angiography systems included in the hardware-software complex DIMOL-IK. Algorithms for the reception and analysis of the data are proposed, and questions of the further processing and storage of the received data are considered.

  14. Free vibrations and buckling analysis of laminated plates by oscillatory radial basis functions

    NASA Astrophysics Data System (ADS)

    Neves, A. M. A.; Ferreira, A. J. M.

    2015-12-01

    In this paper the free vibration and buckling analysis of laminated plates is performed using a global meshless method. A refined version of Kant's theory, which accounts for transverse normal stress and through-the-thickness deformation, is used. The innovation is the use of oscillatory radial basis functions. Numerical examples are performed and results are presented and compared to available references. Such functions proved to be an alternative to the traditional non-oscillatory radial basis functions.

  15. Effects of Barometric Fluctuations on Well Water-Level Measurements and Aquifer Test Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spane, Frank A.

    1999-12-16

    This report examines the effects of barometric fluctuations on well water-level measurements and evaluates adjustment and removal methods for determining areal aquifer head conditions and aquifer test analysis. Two examples of Hanford Site unconfined aquifer tests are examined that demonstrate barometric response analysis and illustrate the predictive/removal capabilities of various methods for well water-level and aquifer total head values. Good predictive/removal characteristics were demonstrated, with the best corrective results provided by multiple-regression deconvolution methods.

  16. An efficient solution procedure for the thermoelastic analysis of truss space structures

    NASA Technical Reports Server (NTRS)

    Givoli, D.; Rand, O.

    1992-01-01

    A solution procedure is proposed for the thermal and thermoelastic analysis of truss space structures in periodic motion. In this method, the spatial domain is first discretized using a consistent finite element formulation. Then the resulting semi-discrete equations in time are solved analytically by using Fourier decomposition. Full advantage is taken of geometrical symmetry. An algorithm is presented for the calculation of heat flux distribution. The method is demonstrated via a numerical example of a cylindrically shaped space structure.

  17. Risks to Navigation at the Matagorda Ship Channel Entrance, Texas, Phase 2: Evaluation of Significant Risk Factors

    DTIC Science & Technology

    2011-08-01

    jetties are deteriorating. As a result of this deterioration and lowered beach and dunes adjacent to the jetties, there are overwash occurrences during...the toe. An example slope stability analysis is presented in Figure 51. This figure shows a typical cross section of model properties (soil layers...depth caused by the ship passage. Any area of influence will be localized and, in light of a critical gradient analysis, near-toe scouring effects

  18. Structural and Functional Analysis of HIV-1 Coreceptors: Roles of Charged Residues and Posttranslational Modifications on Coreceptor Activity

    DTIC Science & Technology

    2000-01-01

    to sites of inflammation. They may have additional functions. For example, analysis of CXCR4 knockout mice shows that CXCR4, which is chemotactic for... mice had similar phenotypes (195). Homozygous knockout of CXCR4 or SDF-1 results in embryonic lethality. Though CCR5 appears to be dispensable, other...chemokine receptors have vital functions. CXCR5 knockout mice have B-cell homing defects (118), and CXCR2 knockout mice overproduce B-cells and

  19. Structural and Functional Analysis of HIV-1 Coreceptors: Roles of Charged Residues and Posttranslational Modifications on Coreceptor Activity

    DTIC Science & Technology

    2000-01-01

    various organs and to sites of inflammation. They may have additional functions. For example, analysis of CXCR4 knockout mice shows that CXCR4, which...SDF-1 knockout mice had similar phenotypes (195). Homozygous knockout of CXCR4 or SDF-1 results in embryonic lethality. Though CCR5 appears to be...dispensable, other chemokine receptors have vital functions. CXCR5 knockout mice have B-cell homing defects (118), and CXCR2 knockout mice

  20. Stability analysis of piecewise non-linear systems and its application to chaotic synchronisation with intermittent control

    NASA Astrophysics Data System (ADS)

    Wang, Qingzhi; Tan, Guanzheng; He, Yong; Wu, Min

    2017-10-01

    This paper considers a stability analysis issue of piecewise non-linear systems and applies it to intermittent synchronisation of chaotic systems. First, based on piecewise Lyapunov function methods, more general and less conservative stability criteria of piecewise non-linear systems in periodic and aperiodic cases are presented, respectively. Next, intermittent synchronisation conditions of chaotic systems are derived which extend existing results. Finally, Chua's circuit is taken as an example to verify the validity of our methods.

  1. Flexibility evaluation of multiechelon supply chains.

    PubMed

    Almeida, João Flávio de Freitas; Conceição, Samuel Vieira; Pinto, Luiz Ricardo; de Camargo, Ricardo Saraiva; Júnior, Gilberto de Miranda

    2018-01-01

    Multiechelon supply chains are complex logistics systems that require flexibility and coordination at a tactical level to cope with environmental uncertainties in an efficient and effective manner. To cope with these challenges, mathematical programming models are developed to evaluate supply chain flexibility. However, under uncertainty, supply chain models become complex and the scope of flexibility analysis is generally reduced. This paper presents a unified approach that can evaluate the flexibility of a four-echelon supply chain via a robust stochastic programming model. The model simultaneously considers the plans of multiple business divisions such as marketing, logistics, manufacturing, and procurement, whose goals are often conflicting. A numerical example with deterministic parameters is presented to introduce the analysis, and then, the model stochastic parameters are considered to evaluate flexibility. The results of the analysis on supply, manufacturing, and distribution flexibility are presented. Tradeoff analysis of demand variability and service levels is also carried out. The proposed approach facilitates the adoption of different management styles, thus improving supply chain resilience. The model can be extended to contexts pertaining to supply chain disruptions; for example, the model can be used to explore operation strategies when subtle events disrupt supply, manufacturing, or distribution.

  2. Constructing Scientific Explanations: a System of Analysis for Students' Explanations

    NASA Astrophysics Data System (ADS)

    de Andrade, Vanessa; Freire, Sofia; Baptista, Mónica

    2017-08-01

    This article describes a system of analysis aimed at characterizing students' scientific explanations. Science education literature and reform documents have been highlighting the importance of scientific explanations for students' conceptual understanding and for their understanding of the nature of scientific knowledge. Nevertheless, and despite general agreement regarding the potential of having students construct their own explanations, a consensual notion of scientific explanation has still not been reached. As a result, within science education literature, there are several frameworks defining scientific explanations, with different foci as well as different notions of what counts as a good explanation. Considering this, and based on a broader project, we developed a system of analysis to characterize students' explanations. It was conceptualized and developed based on theories and models of scientific explanations, science education literature, and examples of students' explanations collected by an open-ended questionnaire. With this paper, it is our goal to present the system of analysis, illustrating it with specific examples of students' collected explanations. In addition, we expect to point out its adequacy and utility for analyzing and characterizing students' scientific explanations as well as for tracing their progression.

  3. Flexibility evaluation of multiechelon supply chains

    PubMed Central

    Conceição, Samuel Vieira; Pinto, Luiz Ricardo; de Camargo, Ricardo Saraiva; Júnior, Gilberto de Miranda

    2018-01-01

    Multiechelon supply chains are complex logistics systems that require flexibility and coordination at a tactical level to cope with environmental uncertainties in an efficient and effective manner. To cope with these challenges, mathematical programming models are developed to evaluate supply chain flexibility. However, under uncertainty, supply chain models become complex and the scope of flexibility analysis is generally reduced. This paper presents a unified approach that can evaluate the flexibility of a four-echelon supply chain via a robust stochastic programming model. The model simultaneously considers the plans of multiple business divisions such as marketing, logistics, manufacturing, and procurement, whose goals are often conflicting. A numerical example with deterministic parameters is presented to introduce the analysis, and then, the model stochastic parameters are considered to evaluate flexibility. The results of the analysis on supply, manufacturing, and distribution flexibility are presented. Tradeoff analysis of demand variability and service levels is also carried out. The proposed approach facilitates the adoption of different management styles, thus improving supply chain resilience. The model can be extended to contexts pertaining to supply chain disruptions; for example, the model can be used to explore operation strategies when subtle events disrupt supply, manufacturing, or distribution. PMID:29584755

  4. An overview of STRUCTURE: applications, parameter settings, and supporting software

    PubMed Central

    Porras-Hurtado, Liliana; Ruiz, Yarimar; Santos, Carla; Phillips, Christopher; Carracedo, Ángel; Lareu, Maria V.

    2013-01-01

    Objectives: We present an up-to-date review of STRUCTURE software: one of the most widely used population analysis tools that allows researchers to assess patterns of genetic structure in a set of samples. STRUCTURE can identify subsets of the whole sample by detecting allele frequency differences within the data and can assign individuals to those sub-populations based on analysis of likelihoods. The review covers STRUCTURE's most commonly used ancestry and frequency models, plus an overview of the main applications of the software in human genetics including case-control association studies (CCAS), population genetics, and forensic analysis. The review is accompanied by supplementary material providing a step-by-step guide to running STRUCTURE. Methods: With reference to a worked example, we explore the effects of changing the principal analysis parameters on STRUCTURE results when analyzing a uniform set of human genetic data. Use of the supporting software: CLUMPP and distruct is detailed and we provide an overview and worked example of STRAT software, applicable to CCAS. Conclusion: The guide offers a simplified view of how STRUCTURE, CLUMPP, distruct, and STRAT can be applied to provide researchers with an informed choice of parameter settings and supporting software when analyzing their own genetic data. PMID:23755071

  5. FEAMAC/CARES Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Bhatt, Ramakrishna

    2016-01-01

    Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code, and an example problem (taken from the open literature) of a laminated CMC under off-axis loading is shown. FEAMAC/CARES performs stochastic-strength-based damage simulation of the response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.

  6. Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composite

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu

    2015-01-01

    Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code, and an example problem (taken from the open literature) of a laminated CMC under off-axis loading is shown. FEAMAC/CARES performs stochastic-strength-based damage simulation of the response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.

  7. [Discourse analysis: research potentialities to gender violence].

    PubMed

    de Azambuja, Mariana Porto Ruwer; Nogueira, Conceição

    2009-01-01

    In recent years we have seen the growing use of the terms 'discourse' and 'discourse analysis' in academic and research contexts, frequently without a precise definition. This fact opens space for criticism and mistakes. The aim of this paper is to provide a brief contextualization of discursive studies, as well as the tasks/steps of the Discourse Analysis process from the Social Constructionism perspective. As examples we use fragments of an interview with a family doctor about gender violence. In the results we highlight the potential of Discourse Analysis to deconstruct existing discourses for their subsequent (re)construction toward a more holistic view of the problem of gender violence.

  8. Bayesian Methods for the Physical Sciences. Learning from Examples in Astronomy and Physics.

    NASA Astrophysics Data System (ADS)

    Andreon, Stefano; Weaver, Brian

    2015-05-01

    Chapter 1: This chapter presents some basic steps for performing a good statistical analysis, all summarized in about one page. Chapter 2: This short chapter introduces the basics of probability theory in an intuitive fashion using simple examples. It also illustrates, again with examples, how to propagate errors and the difference between marginal and profile likelihoods. Chapter 3: This chapter introduces the computational tools and methods that we use for sampling from the posterior distribution. Since all numerical computations, and Bayesian ones are no exception, may end in errors, we also provide a few tips to check that the numerical computation is sampling from the posterior distribution. Chapter 4: Many of the concepts of building, running, and summarizing the results of a Bayesian analysis are described with this step-by-step guide using a basic (Gaussian) model. The chapter also introduces examples using Poisson and Binomial likelihoods, and how to combine repeated independent measurements. Chapter 5: All statistical analyses make assumptions, and Bayesian analyses are no exception. This chapter emphasizes that results depend on data and priors (assumptions). We illustrate this concept with examples where the prior plays greatly different roles, from major to negligible. We also provide some advice on how to look for information useful for sculpting the prior. Chapter 6: In this chapter we consider examples for which we want to estimate more than a single parameter. These common problems include estimating location and spread. We also consider examples that require the modeling of two populations (one we are interested in and a nuisance population) or averaging incompatible measurements. We also introduce quite complex examples dealing with upper limits and with a larger-than-expected scatter. Chapter 7: Rarely is a sample randomly selected from the population we wish to study. Often, samples are affected by selection effects, e.g., easier-to-collect events or objects are over-represented in samples and difficult-to-collect ones are under-represented if not missing altogether. In this chapter we show how to account for non-random data collection to infer the properties of the population from the studied sample. Chapter 8: In this chapter we introduce regression models, i.e., how to fit (regress) one or more quantities against each other through a functional relationship and estimate any unknown parameters that dictate this relationship. Questions of interest include: how to deal with samples affected by selection effects? How does a rich data structure influence the fitted parameters? And what about non-linear multiple-predictor fits, upper/lower limits, measurement errors of different amplitudes, and an intrinsic variety in the studied populations or an extra source of variability? A number of examples illustrate how to answer these questions and how to predict the value of an unavailable quantity by exploiting the existence of a trend with another, available, quantity. Chapter 9: This chapter provides some advice on how the careful scientist should perform model checking and sensitivity analysis, i.e., how to answer the following questions: is the considered model at odds with the currently available data (the fitted data), for example because it is over-simplified compared to some specific complexity pointed out by the data? Furthermore, are the data informative about the quantity being measured, or do results depend sensitively on details of the fitted model? And, finally, what if assumptions are uncertain? A number of examples illustrate how to answer these questions. Chapter 10: This chapter compares the performance of Bayesian methods against simple, non-Bayesian alternatives, such as maximum likelihood, minimal chi square, ordinary and weighted least squares, bivariate correlated errors and intrinsic scatter, and robust estimates of location and scale. Performances are evaluated in terms of quality of the prediction, accuracy of the estimates, and fairness and noisiness of the quoted errors. We also focus on three failures of maximum likelihood methods occurring with small samples, with mixtures, and with regressions with errors in the predictor quantity.
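
    To give a concrete flavour of the sampling machinery discussed in Chapters 3 and 4, here is a minimal Metropolis sketch, not code from the book, that samples the posterior of a Gaussian mean with known sigma under a flat prior.

      import numpy as np

      # Minimal Metropolis sampler for the posterior of a Gaussian mean mu
      # (sigma known, flat prior); a toy stand-in for the book's examples.
      rng = np.random.default_rng(0)
      data = rng.normal(5.0, 2.0, size=50)
      sigma = 2.0

      def log_posterior(mu):
          return -0.5 * np.sum((data - mu) ** 2) / sigma ** 2

      mu, samples = 0.0, []
      for _ in range(20000):
          prop = mu + rng.normal(0.0, 0.5)          # random-walk proposal
          if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(mu):
              mu = prop                             # accept
          samples.append(mu)

      burned = np.array(samples[5000:])             # discard burn-in
      print(burned.mean(), burned.std())            # ~5.0 and ~sigma/sqrt(50)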

  9. Deployment Process, Mechanization, and Testing for the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Iskenderian, Ted

    2004-01-01

    NASA's Mars Exploration Rover (MER) robotic prospectors were produced in an environment of unusually challenging schedule, volume, and mass restrictions. The technical challenges pushed the system's design towards extensive integration of function, which resulted in complex system engineering issues. One example of the system's integrated complexity can be found in the deployment process for the rover. Part of this process, rover "standup", is outlined in this paper. Particular attention is given to the Rover Lift Mechanism's (RLM) role and its design. Analysis methods are presented and compared to test results. It is shown that because prudent design principles were followed, a robust mechanism was created that minimized the duration of integration and test, and enabled recovery without perturbing related systems when reasonably foreseeable problems did occur. Examples of avoidable, unnecessary difficulty are also presented.

  10. Selecting predictors for discriminant analysis of species performance: an example from an amphibious softwater plant.

    PubMed

    Vanderhaeghe, F; Smolders, A J P; Roelofs, J G M; Hoffmann, M

    2012-03-01

    Selecting an appropriate variable subset in linear multivariate methods is an important methodological issue for ecologists. Interest often exists in obtaining general predictive capacity or in finding causal inferences from predictor variables. Because of a lack of solid knowledge on a studied phenomenon, scientists explore predictor variables in order to find the most meaningful (i.e. discriminating) ones. As an example, we modelled the response of the amphibious softwater plant Eleocharis multicaulis using canonical discriminant function analysis. We asked how variables can be selected through comparison of several methods: univariate Pearson chi-square screening, principal components analysis (PCA) and step-wise analysis, as well as combinations of some methods. We expected PCA to perform best. The selected methods were evaluated through fit and stability of the resulting discriminant functions and through correlations between these functions and the predictor variables. The chi-square subset, at P < 0.05, followed by a step-wise sub-selection, gave the best results. In contrast to expectations, PCA performed poorly, as did step-wise analysis. The different chi-square subset methods all yielded ecologically meaningful variables, while probable noise variables were also selected by PCA and step-wise analysis. We advise against the simple use of PCA or step-wise discriminant analysis to obtain an ecologically meaningful variable subset; the former because it does not take into account the response variable, the latter because noise variables are likely to be selected. We suggest that univariate screening techniques are a worthwhile alternative for variable selection in ecology. © 2011 German Botanical Society and The Royal Botanical Society of the Netherlands.
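
    As a rough sketch of the favoured two-stage strategy, univariate screening followed by a step-wise sub-selection ahead of a discriminant analysis, here is a scikit-learn version on synthetic data. Note that scikit-learn's chi2 scorer (which requires non-negative features) stands in for the paper's Pearson chi-square screening, and all cut-offs are arbitrary.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.feature_selection import SelectKBest, SequentialFeatureSelector, chi2

      # Synthetic stand-in for the ecological data (not public here).
      X, y = make_classification(n_samples=200, n_features=20,
                                 n_informative=4, random_state=1)
      X = X - X.min(axis=0)                       # chi2 needs non-negative features

      screen = SelectKBest(chi2, k=10).fit(X, y)  # stage 1: univariate screening
      X_screened = screen.transform(X)

      lda = LinearDiscriminantAnalysis()          # stage 2: step-wise sub-selection
      stepwise = SequentialFeatureSelector(lda, n_features_to_select=4).fit(X_screened, y)
      X_final = stepwise.transform(X_screened)

      print(lda.fit(X_final, y).score(X_final, y))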

  11. A single-vendor and a single-buyer integrated inventory model with ordering cost reduction dependent on lead time

    NASA Astrophysics Data System (ADS)

    Vijayashree, M.; Uthayakumar, R.

    2017-09-01

    Lead time is one of the major limits that affect planning at every stage of the supply chain system. In this paper, we study a continuous review inventory model in which ordering cost reductions depend on lead time, addressing a two-echelon supply chain problem consisting of a single vendor and a single buyer. The main contribution of this study is that the integrated total cost of the vendor-buyer system is analyzed by adopting two different (linear and logarithmic) types of lead-time-dependent ordering cost reduction. For each case, we develop an effective algorithmic procedure for finding the optimal solution: the order quantity, ordering cost, lead time, and number of deliveries from the vendor to the buyer in one production run that minimize the integrated total cost. The mathematical model is solved analytically by minimizing the integrated total cost, and numerical examples, solved using Matlab software, validate the model and illustrate the results. A sensitivity analysis with respect to the major parameters of the system is included, and graphical representations and computer flowcharts illustrate each model. The results reveal that the proposed integrated inventory model is well suited to supply chain manufacturing systems.

  12. Ship Maintenance Processes with Collaborative Product Lifecycle Management and 3D Terrestrial Laser Scanning Tools: Reducing Costs and Increasing Productivity

    DTIC Science & Technology

    2011-09-20

    optimal portfolio point on the efficient frontier, for example, Portfolio B on the chart in Figure A1. Then, by subsequently changing some of the ... optimized portfolio controlling for risk using the IRM methodology and tool suite. Results indicate that both rapid and incremental implementation...Results of the KVA and SD scenario analysis provided the financial information required to forecast an optimized

  13. Use of aerial thermography in Canadian energy conservation programs

    NASA Technical Reports Server (NTRS)

    Cihlar, J.; Brown, R. J.; Lawrence, G.; Barry, J. N.; James, R. B.

    1977-01-01

    Recent developments in the use of aerial thermography in energy conservation programs within Canada were summarized. Following a brief review of studies conducted during the last three years, methodologies of data acquisition, processing, analysis and interpretation were discussed. Examples of results from an industrially oriented project were presented and recommendations for future basic work were outlined.

  14. The Study of Two-Dimensional Oscillations Using a Smartphone Acceleration Sensor: Example of Lissajous Curves

    ERIC Educational Resources Information Center

    Tuset-Sanchis, Luis; Castro-Palacio, Juan C.; Gómez-Tejedor, José A.; Manjón, Francisco J.; Monsoriu, Juan A.

    2015-01-01

    A smartphone acceleration sensor is used to study two-dimensional harmonic oscillations. The data recorded by the free android application, Accelerometer Toy, is used to determine the periods of oscillation by graphical analysis. Different patterns of the Lissajous curves resulting from the superposition of harmonic motions are illustrated for…

  15. Development and Analysis of Models for Handling the Refrigerated Containerized Cargoes

    NASA Astrophysics Data System (ADS)

    Nyrkov, A.; Pavlova, L.; Nikiforov, V.; Sokolov, S.; Budnik, V.

    2017-07-01

    This paper considers an open multi-channel queuing system with unlimited waiting time that receives an irregular flow of homogeneous or heterogeneous service requests. The system is regarded as a model of a container terminal having conditionally functional sections with a certain duty cycle, which receives an irregular, non-uniform flow of vessels with a resultant intensity.

  16. Aerospace technology as a source of new ideas.

    NASA Technical Reports Server (NTRS)

    Hamilton, J. T.

    1972-01-01

    It is shown that technological products and processes resulting from aeronautical and space research and development can be a significant source of new product or product improvement ideas. The problems associated with technology transfer are discussed. As an example, the commercialization of NASTRAN, NASA's structural analysis computer program, is discussed. Some other current application projects are also outlined.

  17. Phenomenography and Grounded Theory as Research Methods in Computing Education Research Field

    ERIC Educational Resources Information Center

    Kinnunen, Paivi; Simon, Beth

    2012-01-01

    This paper discusses two qualitative research methods, phenomenography and grounded theory. We introduce both methods' data collection and analysis processes and the type of results you may get at the end, using examples from computing education research. We highlight some of the similarities and differences between the aim, data collection and…

  18. An Applied Example of Pooled Time Series Analysis: Cardiovascular Reactivity to Stressors in Children with Autism

    ERIC Educational Resources Information Center

    Hoeppner, Bettina B.; Goodwin, Matthew S.; Velicer, Wayne F.; Heltshe, James

    2007-01-01

    The advent of telemetric devices that sample data extensively over time has facilitated single subject or idiographic research to intensively study a single person over time. One of the challenges of idiographic research is combining single subject results to determine generalizability across subjects. This article demonstrates the first…

  19. How to test validity in orthodontic research: a mixed dentition analysis example.

    PubMed

    Donatelli, Richard E; Lee, Shin-Jae

    2015-02-01

    The data used to test the validity of a prediction method should be different from the data used to generate the prediction model. In this study, we explored whether an independent data set is mandatory for testing the validity of a new prediction method and how validity can be tested without independent new data. Several validation methods were compared in an example using the data from a mixed dentition analysis with a regression model. The validation errors of real mixed dentition analysis data and simulation data were analyzed for increasingly large data sets. The validation results of both the real and the simulation studies demonstrated that the leave-1-out cross-validation method had the smallest errors. The largest errors occurred in the traditional simple validation method. The differences between the validation methods diminished as the sample size increased. The leave-1-out cross-validation method seems to be an optimal validation method for improving the prediction accuracy in a data set with limited sample sizes. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
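
    A minimal sketch of the winning scheme, leave-one-out cross-validation of a regression predictor, using scikit-learn on simulated tooth-width-like data (the study's orthodontic measurements are not public here):

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import LeaveOneOut, cross_val_score

      # Simulated stand-in: predict one width sum from another, then estimate
      # prediction error with leave-one-out cross-validation.
      rng = np.random.default_rng(0)
      X = rng.normal(22.0, 1.5, size=(60, 1))
      y = 1.1 * X[:, 0] + rng.normal(0.0, 0.8, size=60)

      scores = cross_val_score(LinearRegression(), X, y,
                               cv=LeaveOneOut(),
                               scoring="neg_mean_absolute_error")
      print(-scores.mean())   # average leave-one-out absolute prediction error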

  20. Small rural hospitals: an example of market segmentation analysis.

    PubMed

    Mainous, A G; Shelby, R L

    1991-01-01

    In recent years, market segmentation analysis has shown increased popularity among health care marketers, although marketers tend to focus upon hospitals as sellers. The present analysis suggests that there is merit to viewing hospitals as a market of consumers. Employing a random sample of 741 small rural hospitals, the present investigation sought to determine, through the use of segmentation analysis, the variables associated with hospital success (occupancy). The results of a discriminant analysis yielded a model which classifies hospitals with a high degree of predictive accuracy. Successful hospitals have more beds and employees, and are generally larger and have more resources. However, there was no significant relationship between organizational success and number of services offered by the institution.

  1. Principal component and spatial correlation analysis of spectroscopic-imaging data in scanning probe microscopy.

    PubMed

    Jesse, Stephen; Kalinin, Sergei V

    2009-02-25

    An approach for the analysis of multi-dimensional, spectroscopic-imaging data based on principal component analysis (PCA) is explored. PCA selects and ranks relevant response components based on variance within the data. It is shown that for examples with small relative variations between spectra, the first few PCA components closely coincide with results obtained using model fitting, and this is achieved at rates approximately four orders of magnitude faster. For cases with strong response variations, PCA allows an effective approach to rapidly process, de-noise, and compress data. The prospects for PCA combined with correlation function analysis of component maps as a universal tool for data analysis and representation in microscopy are discussed.
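
    A minimal sketch of the approach on synthetic spectra: PCA ranks variance-carrying components, and the per-pixel scores form the component maps mentioned above. The peak shapes, noise level, and array sizes are invented.

      import numpy as np
      from sklearn.decomposition import PCA

      # 500 "pixels", each with a 200-channel spectrum: a fixed peak plus a
      # second peak of varying amplitude plus noise.
      rng = np.random.default_rng(0)
      x = np.linspace(0.0, 1.0, 200)
      base = np.exp(-((x - 0.4) / 0.05) ** 2)
      varying = np.exp(-((x - 0.6) / 0.05) ** 2)
      spectra = (base[None, :]
                 + rng.uniform(0, 1, size=(500, 1)) * varying[None, :]
                 + rng.normal(0, 0.01, size=(500, 200)))

      pca = PCA(n_components=5).fit(spectra)
      print(pca.explained_variance_ratio_.round(3))  # variance per component
      scores = pca.transform(spectra)                # per-pixel component maps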

  2. Using integrated models to minimize environmentally induced wavefront error in optomechanical design and analysis

    NASA Astrophysics Data System (ADS)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate design goal of an optical system subjected to dynamic loads is to minimize system level wavefront error (WFE). In random response analysis, system WFE is difficult to predict from finite element results due to the loss of phase information. In the past, the use of system WFE was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for determining system level WFE using a linear optics model is presented. An error estimate is included in the analysis output based on fitting errors of mode shapes. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.

  3. An example problem illustrating the application of the national lime association mixture design and testing protocol (MDTP) to ascertain engineering properties of lime-treated subgrades for mechanistic pavement design/analysis.

    DOT National Transportation Integrated Search

    2001-09-01

    This document presents an example of mechanistic design and analysis using a mix design and testing protocol. More specifically, it addresses the structural properties of lime-treated subgrade, subbase, and base layers through mechanistic design ...

  4. Safety, effectiveness, and cost of dipeptidyl peptidase-4 inhibitors versus intermediate acting insulin for type 2 diabetes: protocol for a systematic review and network meta-analysis.

    PubMed

    Tricco, Andrea C; Antony, Jesmin; Soobiah, Charlene; Hemmelgarn, Brenda; Moher, David; Hutton, Brian; Yu, Catherine H; Majumdar, Sumit R; Straus, Sharon E

    2013-06-28

    Type 2 diabetes mellitus (T2DM) results from insulin resistance and relative insulin deficiency. T2DM treatment is a step-wise approach beginning with lifestyle modifications (for example, diet, exercise), followed by the addition of oral hypoglycemic agents (for example, metformin). Patients who do not respond to first-line therapy are offered second-line therapy (for example, sulfonylureas). Third-line therapy may include insulin and/or dipeptidyl peptidase-4 (DPP-4) inhibitors. It is unclear whether DPP-4 inhibitors are safer and more effective than intermediate acting insulin for third-line management of T2DM. As such, our objective is to evaluate the comparative effectiveness, safety and cost-effectiveness of DPP-4 inhibitors versus intermediate acting insulin for T2DM patients who have failed both first- and second-line diabetes treatments. Electronic searches of MEDLINE, Cochrane Central Register of Controlled Trials, EMBASE, and grey literature (for example, trial registries, public health websites) will be conducted to identify studies examining DPP-4 inhibitors compared with each other, intermediate acting insulin, no treatment, or placebo for adults with T2DM. The outcomes of interest include glycosylated hemoglobin (A1C) (primary outcome), as well as emergency department visits, physician visits, hospital admissions, weight gain, quality of life, microvascular complications, macrovascular complications, all-cause mortality, and cost (secondary outcomes). Randomized clinical trials (RCTs), quasi-RCTs, non-RCTs, controlled before-after, interrupted time series, cohort studies, and cost studies reporting data on these outcomes will be included. Eligibility will not be restricted by publication status, language of dissemination, duration of study follow-up, or time period of study conduct. Two reviewers will screen the titles and abstracts resulting from the literature search, as well as potentially relevant full-text articles, in duplicate. Data will be abstracted and quality will be appraised by two team members independently. Conflicts at all levels of screening and abstraction will be resolved through team discussion. Our results will be described narratively. Random effects meta-analysis and network meta-analysis will be conducted, if feasible and appropriate. Our systematic review results can be used to determine the most effective, safe and cost-effective third-line strategies for managing T2DM. This information will be of great use to health policy-makers and clinicians, as well as patients living with T2DM and their families. PROSPERO registry number: CRD42013003624.

  5. Testing for Questionable Research Practices in a Meta-Analysis: An Example from Experimental Parapsychology

    PubMed Central

    Bierman, Dick J.; Spottiswoode, James P.; Bijl, Aron

    2016-01-01

    We describe a method of quantifying the effect of Questionable Research Practices (QRPs) on the results of meta-analyses. As an example we simulated a meta-analysis of a controversial telepathy protocol to assess the extent to which these experimental results could be explained by QRPs. Our simulations used the same numbers of studies and trials as the original meta-analysis and the frequencies with which various QRPs were applied in the simulated experiments were based on surveys of experimental psychologists. Results of both the meta-analysis and simulations were characterized by 4 metrics, two describing the trial and mean experiment hit rates (HR) of around 31%, where 25% is expected by chance, one the correlation between sample-size and hit-rate, and one the complete P-value distribution of the database. A genetic algorithm optimized the parameters describing the QRPs, and the fitness of the simulated meta-analysis was defined as the sum of the squares of Z-scores for the 4 metrics. Assuming no anomalous effect a good fit to the empirical meta-analysis was found only by using QRPs with unrealistic parameter-values. Restricting the parameter space to ranges observed in studies of QRP occurrence, under the untested assumption that parapsychologists use comparable QRPs, the fit to the published Ganzfeld meta-analysis with no anomalous effect was poor. We allowed for a real anomalous effect, be it unidentified QRPs or a paranormal effect, where the HR ranged from 25% (chance) to 31%. With an anomalous HR of 27% the fitness became F = 1.8 (p = 0.47 where F = 0 is a perfect fit). We conclude that the very significant probability cited by the Ganzfeld meta-analysis is likely inflated by QRPs, though results are still significant (p = 0.003) with QRPs. Our study demonstrates that quantitative simulations of QRPs can assess their impact. Since meta-analyses in general might be polluted by QRPs, this method has wide applicability outside the domain of experimental parapsychology. PMID:27144889
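
    As a toy version of such a simulation, the sketch below Monte-Carlos a single QRP, optional stopping, in a four-choice (25% chance) guessing task; the stopping rule and thresholds are invented and are not those of the genetic-algorithm study above.

      import numpy as np

      # Optional stopping: end an experiment early whenever the running hit
      # rate looks good, which inflates the mean reported hit rate above chance.
      rng = np.random.default_rng(0)

      def run_experiment(max_trials=100, min_trials=20):
          hits = 0
          for n in range(1, max_trials + 1):
              hits += rng.uniform() < 0.25       # pure chance performance
              if n >= min_trials and hits / n > 0.32:
                  return hits / n                # stop on a lucky streak
          return hits / max_trials

      rates = [run_experiment() for _ in range(2000)]
      print(np.mean(rates))   # noticeably above the 25% chance rate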

  6. Sharing methodology: a worked example of theoretical integration with qualitative data to clarify practical understanding of learning and generate new theoretical development.

    PubMed

    Yardley, Sarah; Brosnan, Caragh; Richardson, Jane

    2013-01-01

    Theoretical integration is a necessary element of study design if clarification of experiential learning is to be achieved. There are few published examples demonstrating how this can be achieved. This methodological article provides a worked example of research methodology that achieved clarification of authentic early experiences (AEEs) through a bi-directional approach to theory and data. Bi-directional refers to our simultaneous use of theory to guide and interrogate empirical data and the use of empirical data to refine theory. We explain the five steps of our methodological approach: (1) understanding the context; (2) critique on existing applications of socio-cultural models to inform study design; (3) data generation; (4) analysis and interpretation and (5) theoretical development through a novel application of Metis. These steps resulted in understanding of how and why different outcomes arose from students participating in AEE. Our approach offers a mechanism for clarification without which evidence-based effective ways to maximise constructive learning cannot be developed. In our example it also contributed to greater theoretical understanding of the influence of social interactions. By sharing this example of research undertaken to develop both theory and educational practice we hope to assist others seeking to conduct similar research.

  7. Statistical analysis and application of quasi experiments to antimicrobial resistance intervention studies.

    PubMed

    Shardell, Michelle; Harris, Anthony D; El-Kamary, Samer S; Furuno, Jon P; Miller, Ram R; Perencevich, Eli N

    2007-10-01

    Quasi-experimental study designs are frequently used to assess interventions that aim to limit the emergence of antimicrobial-resistant pathogens. However, previous studies using these designs have often used suboptimal statistical methods, which may result in researchers making spurious conclusions. Methods used to analyze quasi-experimental data include 2-group tests, regression analysis, and time-series analysis, and they all have specific assumptions, data requirements, strengths, and limitations. An example of a hospital-based intervention to reduce methicillin-resistant Staphylococcus aureus infection rates and reduce overall length of stay is used to explore these methods.
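
    As an illustration of one recommended method, segmented regression for an interrupted time series, here is a sketch with simulated monthly infection rates; the intervention point, effect sizes, and noise are invented.

      import numpy as np
      import statsmodels.api as sm

      # Interrupted time series: level and slope may change at the intervention
      # month, and OLS recovers the four segmented-regression terms.
      rng = np.random.default_rng(0)
      months = np.arange(48)
      post = (months >= 24).astype(float)        # intervention at month 24
      rate = (10.0 + 0.05 * months               # baseline level and trend
              - 2.0 * post                       # level change after intervention
              - 0.10 * post * (months - 24)      # slope change after intervention
              + rng.normal(0, 0.5, size=48))

      X = sm.add_constant(np.column_stack([months, post, post * (months - 24)]))
      fit = sm.OLS(rate, X).fit()
      print(fit.params)  # intercept, baseline trend, level change, slope change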

  8. Quantitative subpixel spectral detection of targets in multispectral images. [terrestrial and planetary surfaces

    NASA Technical Reports Server (NTRS)

    Sabol, Donald E., Jr.; Adams, John B.; Smith, Milton O.

    1992-01-01

    The conditions that affect the spectral detection of target materials at the subpixel scale are examined. Two levels of spectral mixture analysis for determining threshold detection limits of target materials in a spectral mixture are presented, the cases where the target is detected as: (1) a component of a spectral mixture (continuum threshold analysis) and (2) residuals (residual threshold analysis). The results of these two analyses are compared under various measurement conditions. The examples illustrate the general approach that can be used for evaluating the spectral detectability of terrestrial and planetary targets at the subpixel scale.
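
    A minimal sketch of the linear spectral mixture step underlying both threshold analyses: solve for non-negative endmember fractions of a mixed pixel and inspect the residual. The endmember spectra and noise level are invented.

      import numpy as np
      from scipy.optimize import nnls

      # Columns are endmember spectra over four bands (invented values).
      endmembers = np.array([[0.10, 0.30, 0.50, 0.60],   # e.g. soil
                             [0.05, 0.08, 0.40, 0.45],   # e.g. vegetation
                             [0.02, 0.03, 0.04, 0.05]]).T

      rng = np.random.default_rng(0)
      true_fractions = np.array([0.6, 0.3, 0.1])
      pixel = endmembers @ true_fractions + rng.normal(0, 0.002, size=4)

      fractions, residual_norm = nnls(endmembers, pixel)
      print(fractions.round(3), residual_norm)  # small residual: target detectable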

  9. FEAMAC-CARES Software Coupling Development Effort for CMC Stochastic-Strength-Based Damage Simulation

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Bednarcyk, Brett A.; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Walton, Owen

    2015-01-01

    Reported here is a coupling of two NASA developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MACGMC composite material analysis code. The resulting code is called FEAMACCARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMACCARES code and an example problem (taken from the open literature) of a laminated CMC in off-axis loading is shown. FEAMACCARES performs stochastic-strength-based damage simulation response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.

  10. Classical linear-control analysis applied to business-cycle dynamics and stability

    NASA Technical Reports Server (NTRS)

    Wingrove, R. C.

    1983-01-01

    Linear control analysis is applied as an aid in understanding the fluctuations of business cycles in the past, and to examine monetary policies that might improve stabilization. The analysis shows how different policies change the frequency and damping of the economic system dynamics, and how they modify the amplitude of the fluctuations that are caused by random disturbances. Examples are used to show how policy feedbacks and policy lags can be incorporated, and how different monetary strategies for stabilization can be analytically compared. Representative numerical results are used to illustrate the main points.
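
    As a toy version of this kind of analysis, the sketch below treats the economy as a second-order linear system and shows how increasing a policy feedback gain changes the closed-loop eigenvalues, and hence the cycle's frequency and damping; the matrices and gains are illustrative, not an estimated macroeconomic model.

      import numpy as np

      # Open-loop dynamics: a lightly damped oscillation (the "business cycle").
      A = np.array([[0.0, 1.0],
                    [-0.4, -0.1]])
      B = np.array([[0.0], [1.0]])

      for k in [0.0, 0.3, 0.6]:                  # increasing policy feedback gain
          K = np.array([[0.0, k]])               # feedback on the rate variable
          eig = np.linalg.eigvals(A - B @ K)
          wn = abs(eig[0])                       # natural frequency
          zeta = -eig[0].real / wn               # damping ratio
          print(f"k={k}: eigenvalues {np.round(eig, 3)}, damping {zeta:.2f}")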

  11. Payload design requirements analysis (study 2.2). Volume 3. Guideline analysis. [economic analysis of payloads for space shuttles and space tugs

    NASA Technical Reports Server (NTRS)

    Shiokari, T.

    1973-01-01

    Payloads to be launched on the space shuttle/space tug/sortie lab combinations are discussed. The payloads are of four types: (1) expendable, (2) ground refurbishable, (3) on-orbit maintainable, and (4) sortie. Economic comparisons are limited to the four types of payloads described. Additional system guidelines were developed by analyzing two payloads parameterically and demonstrating the results on an example satellite. In addition to analyzing the selected guidelines, emphasis was placed on providing economic tradeoff data and identifying payload parameters influencing the low cost approaches.

  12. Computational model for the analysis of cartilage and cartilage tissue constructs

    PubMed Central

    Smith, David W.; Gardiner, Bruce S.; Davidson, John B.; Grodzinsky, Alan J.

    2013-01-01

    We propose a new non-linear poroelastic model that is suited to the analysis of soft tissues. In this paper the model is tailored to the analysis of cartilage and the engineering design of cartilage constructs. The proposed continuum formulation of the governing equations enables the strain of the individual material components within the extracellular matrix (ECM) to be followed over time, as the individual material components are synthesized, assembled and incorporated within the ECM or lost through passive transport or degradation. The material component analysis developed here naturally captures the effect of time-dependent changes of ECM composition on the deformation and internal stress states of the ECM. For example, it is shown that increased synthesis of aggrecan by chondrocytes embedded within a decellularized cartilage matrix initially devoid of aggrecan results in osmotic expansion of the newly synthesized proteoglycan matrix and tension within the structural collagen network. Specifically, we predict that the collagen network experiences a tensile strain, with a maximum of ~2% at the fixed base of the cartilage. The analysis of an example problem demonstrates the temporal and spatial evolution of the stresses and strains in each component of a self-equilibrating composite tissue construct, and the role played by the flux of water through the tissue. PMID:23784936

  13. Measurement uncertainty analysis techniques applied to PV performance measurements

    NASA Astrophysics Data System (ADS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
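
    A minimal sketch of the basic propagation step behind such an analysis: combining the relative standard uncertainties of a PV efficiency measurement in quadrature (first-order propagation for a product/quotient). The instrument values are invented.

      import numpy as np

      # eta = P / (G * A); for a pure product/quotient the relative standard
      # uncertainties combine in quadrature (root-sum-square).
      P, u_P = 180.0, 1.0      # module power [W] and standard uncertainty
      G, u_G = 1000.0, 15.0    # irradiance [W/m^2]
      A, u_A = 1.60, 0.005     # module area [m^2]

      eta = P / (G * A)
      rel_u = np.sqrt((u_P / P) ** 2 + (u_G / G) ** 2 + (u_A / A) ** 2)
      print(f"efficiency = {eta:.4f} +/- {eta * rel_u:.4f} (k=1)")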

  14. Some applications of categorical data analysis to epidemiological studies.

    PubMed Central

    Grizzle, J E; Koch, G G

    1979-01-01

    Several examples of categorized data from epidemiological studies are analyzed to illustrate that more informative analyses than tests of independence can be performed by fitting models. All of the analyses fit into a unified conceptual framework and can be performed by weighted least squares. The methods presented show how to calculate point estimates of parameters, asymptotic variances, and asymptotically valid chi-square tests. The examples presented are: analysis of relative risks estimated from several 2 x 2 tables, analysis of selected features of life tables, construction of synthetic life tables from cross-sectional studies, and analysis of dose-response curves. PMID:540590
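
    As a small illustration of the first example, here is an inverse-variance (weighted least squares) pooling of log relative risks from several 2 x 2 tables, with a chi-square test of homogeneity; the counts are invented.

      import numpy as np
      from scipy import stats

      # Each table: (events_exposed, n_exposed, events_unexposed, n_unexposed).
      tables = [(30, 100, 20, 100),
                (45, 150, 25, 140),
                (12, 80, 10, 90)]

      logs, weights = [], []
      for a, n1, c, n0 in tables:
          log_rr = np.log((a / n1) / (c / n0))
          var = 1 / a - 1 / n1 + 1 / c - 1 / n0   # delta-method variance of log RR
          logs.append(log_rr)
          weights.append(1 / var)

      logs, weights = np.array(logs), np.array(weights)
      pooled = np.sum(weights * logs) / np.sum(weights)   # WLS estimate
      q = np.sum(weights * (logs - pooled) ** 2)          # homogeneity statistic
      print(f"pooled RR = {np.exp(pooled):.2f}, "
            f"homogeneity p = {stats.chi2.sf(q, df=len(tables) - 1):.2f}")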

  15. A tutorial of diverse genome analysis tools found in the CoGe web-platform using Plasmodium spp. as a model

    PubMed Central

    Castillo, Andreina I; Nelson, Andrew D L; Haug-Baltzell, Asher K; Lyons, Eric

    2018-01-01

    Integrated platforms for storage, management, analysis and sharing of large quantities of omics data have become fundamental to comparative genomics. CoGe (https://genomevolution.org/coge/) is an online platform designed to manage and study genomic data, enabling both data- and hypothesis-driven comparative genomics. CoGe's tools and resources can be used to organize and analyse both publicly available and private genomic data from any species. Here, we demonstrate the capabilities of CoGe through three example workflows using 17 Plasmodium genomes as a model. Plasmodium genomes present unique challenges for comparative genomics due to their rapidly evolving and highly variable genomic AT/GC content. These example workflows are intended to serve as templates to help guide researchers who would like to use CoGe to examine diverse aspects of genome evolution. In the first workflow, trends in genome composition and amino acid usage are explored. In the second, changes in genome structure and the distribution of synonymous (Ks) and non-synonymous (Kn) substitution values are evaluated across species with different levels of evolutionary relatedness. In the third workflow, microsyntenic analyses of multigene families' genomic organization are conducted using two Plasmodium-specific gene families, serine repeat antigen and cytoadherence-linked asexual gene, as models. In general, these example workflows show how to achieve quick, reproducible and shareable results using the CoGe platform. We were able to replicate previously published results, as well as leverage CoGe's tools and resources to gain additional insight into various aspects of Plasmodium genome evolution. Our results highlight the usefulness of the CoGe platform, particularly in understanding complex features of genome evolution. Database URL: https://genomevolution.org/coge/

  16. 40 CFR 1065.275 - N2O measurement devices.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... for interpretation of infrared spectra. For example, EPA Test Method 320 is considered a valid method... and length to achieve adequate resolution of the N2O peak for analysis. Examples of acceptable columns....550(b) that would otherwise apply. For example, you may perform a span gas measurement before and...

  17. 40 CFR 1065.275 - N2O measurement devices.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... for interpretation of infrared spectra. For example, EPA Test Method 320 is considered a valid method... and length to achieve adequate resolution of the N2O peak for analysis. Examples of acceptable columns....550(b) that would otherwise apply. For example, you may perform a span gas measurement before and...

  18. Examples in College Algebra Textbooks: Opportunities for Students' Learning

    ERIC Educational Resources Information Center

    Mesa, Vilma; Suh, Heejoo; Blake, Tyler; Whittemore, Timothy

    2012-01-01

    We present an analysis of several characteristics of examples in 10 college algebra textbooks used in community colleges or 4-year institutions. We analyzed the examples along four dimensions: cognitive demand, the responses expected, the use of representations, and the strategies available for verifying the correctness of the solutions. We found…

  19. EAC: A program for the error analysis of STAGS results for plates

    NASA Technical Reports Server (NTRS)

    Sistla, Rajaram; Thurston, Gaylen A.; Bains, Nancy Jane C.

    1989-01-01

    A computer code is now available for estimating the error in results from the STAGS finite element code for a shell unit consisting of a rectangular orthotropic plate. This memorandum contains basic information about the computer code EAC (Error Analysis and Correction) and describes the connection between the input data for the STAGS shell units and the input data necessary to run the error analysis code. The STAGS code returns a set of nodal displacements and a discrete set of stress resultants; the EAC code returns a continuous solution for displacements and stress resultants. The continuous solution is defined by a set of generalized coordinates computed in EAC. The theory and the assumptions that determine the continuous solution are also outlined in this memorandum. An example of application of the code is presented and instructions on its usage on the Cyber and the VAX machines have been provided.

  20. Analysis of "D" regions of RC structures based on example of frame corners

    NASA Astrophysics Data System (ADS)

    Michał, Szczecina; Andrzej, Winnicki

    2018-01-01

    Calculation of the reinforcement of "D" regions of reinforced concrete structures is more difficult than for "B" regions and demands specific approaches. The authors of the paper suggest using both the Strut-and-Tie method (S&T) and the Finite Element Method (FEM). The first of these methods allows one to calculate the required reinforcement and the efficiency factor. In turn, FEM can not only confirm the S&T results but also give information about crack width and pattern, strains, and nodal displacements. Sample calculations were performed on the example of frame corners under an opening bending moment. The parameters of the Concrete Damaged Plasticity model of concrete implemented in Abaqus were calibrated in tension and compression tests.

  1. Estimating the Economic Value of Information for Screening in Disseminating and Targeting Effective School-based Preventive Interventions: An Illustrative Example.

    PubMed

    Johnston, Stephen S; Salkever, David S; Ialongo, Nicholas S; Slade, Eric P; Stuart, Elizabeth A

    2017-11-01

    When candidates for school-based preventive interventions are heterogeneous in their risk of poor outcomes, an intervention's expected economic net benefits may be maximized by targeting candidates for whom the intervention is most likely to yield benefits, such as those at high risk of poor outcomes. Although increasing amounts of information about candidates may facilitate more accurate targeting, collecting information can be costly. We present an illustrative example to show how cost-benefit analysis results from effective intervention demonstrations can help us to assess whether improved targeting accuracy justifies the cost of collecting additional information needed to make this improvement.

  2. Fast and Cost-Effective Biochemical Spectrophotometric Analysis of Solution of Insect "Blood" and Body Surface Elution.

    PubMed

    Łoś, Aleksandra; Strachecka, Aneta

    2018-05-09

    Using insect hemolymph ("blood") and insect body surface elutions, researchers can perform rapid and cheap biochemical analyses to determine the insect's immunological status. The authors of this publication describe a detailed methodology for quick determination of total protein concentration and evaluation of proteolytic system activity (acid, neutral, and alkaline proteases and protease inhibitors), as well as a methodology for quick "liver" tests in insects: alanine aminotransferase (ALT), aspartate aminotransferase (AST), alkaline phosphatase (ALP), and urea and glucose concentration analyses. The meaning of the results of the presented methodology for biochemical parameter determination, and examples of their interpretation, are described using honey bees as an example.

  3. Analysis of truss, beam, frame, and membrane components. [composite structures

    NASA Technical Reports Server (NTRS)

    Knoell, A. C.; Robinson, E. Y.

    1975-01-01

    Truss components are considered, taking into account composite truss structures, truss analysis, column members, and truss joints. Beam components are discussed, giving attention to composite beams, laminated beams, and sandwich beams. Composite frame components and composite membrane components are examined. A description is given of examples of flat membrane components and examples of curved membrane elements. It is pointed out that composite structural design and analysis is a highly interactive, iterative procedure which does not lend itself readily to characterization by design or analysis function only.

  4. Examples of sex/gender sensitivity in epidemiological research: results of an evaluation of original articles published in JECH 2006-2014.

    PubMed

    Jahn, Ingeborg; Börnhorst, Claudia; Günther, Frauke; Brand, Tilman

    2017-02-15

    During the last decades, sex and gender biases have been identified in various areas of biomedical and public health research, leading to compromised validity of research findings. As a response, methodological requirements were developed, but these are rarely translated into research practice. The aim of this study is to provide good practice examples of sex/gender sensitive health research. We conducted a systematic search of research articles published in JECH between 2006 and 2014. An instrument was constructed to evaluate sex/gender sensitivity in four stages of the research process (background, study design, statistical analysis, discussion). In total, 37 articles covering diverse topics were included. Of these, 22 were evaluated as good practice examples in at least one stage; two articles achieved highest ratings across all stages. Good examples of the background referred to available knowledge on sex/gender differences and sex/gender informed theoretical frameworks. Related to the study design, good examples calculated sample sizes to be able to detect sex/gender differences, selected sex/gender sensitive outcome/exposure indicators, or chose different cut-off values for male and female participants. Good examples of statistical analyses used interaction terms with sex/gender or different shapes of the estimated relationship for men and women. Examples of good discussions interpreted their findings related to social and biological explanatory models or questioned the statistical methods used to detect sex/gender differences. The identified good practice examples may inspire researchers to critically reflect on the relevance of sex/gender issues in their studies and help them to translate methodological recommendations of sex/gender sensitivity into research practice.

  5. STAGS Example Problems Manual

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Rankin, Charles C.

    2006-01-01

    This document summarizes the STructural Analysis of General Shells (STAGS) development effort, STAGS performance for selected demonstration problems, and STAGS application problems illustrating selected advanced features available in STAGS Version 5.0. Each problem is discussed, including selected background information and reference solutions when available. The modeling and solution approach for each problem is described and illustrated. Numerical results are presented and compared with reference solutions, test data, and/or results obtained from mesh refinement studies. These solutions provide an indication of the overall capabilities of the STAGS nonlinear finite element analysis tool and provide users with representative cases, including input files, for exploring capabilities that may then be tailored to other applications.

  6. Testing the Consolidated Framework for Implementation Research on health care innovations from South Yorkshire.

    PubMed

    Ilott, Irene; Gerrish, Kate; Booth, Andrew; Field, Becky

    2013-10-01

    There is an international imperative to implement research into clinical practice to improve health care. Understanding the dynamics of change requires knowledge from theoretical and empirical studies. This paper presents a novel approach to testing a new meta theoretical framework: the Consolidated Framework for Implementation Research. The utility of the Framework was evaluated using a post hoc, deductive analysis of 11 narrative accounts of innovation in health care services and practice from England, collected in 2010. A matrix, comprising the five domains and 39 constructs of the Framework was developed to examine the coherence of the terminology, to compare results across contexts and to identify new theoretical developments. The Framework captured the complexity of implementation across 11 diverse examples, offering theoretically informed, comprehensive coverage. The Framework drew attention to relevant points in individual cases together with patterns across cases; for example, all were internally developed innovations that brought direct or indirect patient advantage. In 10 cases, the change was led by clinicians. Most initiatives had been maintained for several years and there was evidence of spread in six examples. Areas for further development within the Framework include sustainability and patient/public engagement in implementation. Our analysis suggests that this conceptual framework has the potential to offer useful insights, whether as part of a situational analysis or by developing context-specific propositions for hypothesis testing. Such studies are vital now that innovation is being promoted as core business for health care. © 2012 John Wiley & Sons Ltd.

  7. Conditional robustness analysis for fragility discovery and target identification in biochemical networks and in cancer systems biology.

    PubMed

    Bianconi, Fortunato; Baldelli, Elisa; Ludovini, Vienna; Petricoin, Emanuel F; Crinò, Lucio; Valigi, Paolo

    2015-10-19

    The study of cancer therapy is a key issue in the field of oncology research and the development of target therapies is one of the main problems currently under investigation. This is particularly relevant in different types of tumor where traditional chemotherapy approaches often fail, such as lung cancer. We started from the general definition of robustness introduced by Kitano and applied it to the analysis of dynamical biochemical networks, proposing a new algorithm based on moment independent analysis of input/output uncertainty. The framework utilizes novel computational methods which enable evaluating the model fragility with respect to quantitative performance measures and parameters such as reaction rate constants and initial conditions. The algorithm generates a small subset of parameters that can be used to act on complex networks and to obtain the desired behaviors. We have applied the proposed framework to the EGFR-IGF1R signal transduction network, a crucial pathway in lung cancer, as an example of Cancer Systems Biology application in drug discovery. Furthermore, we have tested our framework on a pulse generator network as an example of Synthetic Biology application, thus proving the suitability of our methodology to the characterization of the input/output synthetic circuits. The achieved results are of immediate practical application in computational biology, and while we demonstrate their use in two specific examples, they can in fact be used to study a wider class of biological systems.

  8. Visual modeling in an analysis of multidimensional data

    NASA Astrophysics Data System (ADS)

    Zakharova, A. A.; Vekhter, E. V.; Shklyar, A. V.; Pak, A. J.

    2018-01-01

    The article proposes an approach to solving visualization problems and the subsequent analysis of multidimensional data. Requirements for the properties of visual models created to solve analysis problems are described. As a promising direction for the development of visual analysis tools for multidimensional and voluminous data, active use of factors of subjective perception and of dynamic visualization is suggested. Practical results of solving a multidimensional data analysis problem are shown using the example of a visual model of empirical data on the current state of research into producing silicon carbide by an electric arc method. Solving this problem yielded several results: first, an idea of the possibilities for determining a development strategy for the domain; second, an assessment of the reliability of the published data on this subject; and third, insight into how the areas of researchers' attention have changed over time.

  9. Graphical tools for network meta-analysis in STATA.

    PubMed

    Chaimani, Anna; Higgins, Julian P T; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

  10. Graphical Tools for Network Meta-Analysis in STATA

    PubMed Central

    Chaimani, Anna; Higgins, Julian P. T.; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results. PMID:24098547

  11. Structural bioinformatics of the human spliceosomal proteome

    PubMed Central

    Korneta, Iga; Magnus, Marcin; Bujnicki, Janusz M.

    2012-01-01

    In this work, we describe the results of a comprehensive structural bioinformatics analysis of the spliceosomal proteome. We used fold recognition analysis to complement prior data on the ordered domains of 252 human splicing proteins. Examples of newly identified domains include a PWI domain in the U5 snRNP protein 200K (hBrr2, residues 258–338), while examples of previously known domains with a newly determined fold include the DUF1115 domain of the U4/U6 di-snRNP protein 90K (hPrp3, residues 540–683). We also established a non-redundant set of experimental models of spliceosomal proteins, as well as constructed in silico models for regions without an experimental structure. The combined set of structural models is available for download. Altogether, over 90% of the ordered regions of the spliceosomal proteome can be represented structurally with a high degree of confidence. We analyzed the reduced spliceosomal proteome of the intron-poor organism Giardia lamblia, and as a result, we proposed a candidate set of ordered structural regions necessary for a functional spliceosome. The results of this work will aid experimental and structural analyses of the spliceosomal proteins and complexes, and can serve as a starting point for multiscale modeling of the structure of the entire spliceosome. PMID:22573172

  12. Acoustic-articulatory mapping in vowels by locally weighted regression

    PubMed Central

    McGowan, Richard S.; Berger, Michael A.

    2009-01-01

    A method for mapping between simultaneously measured articulatory and acoustic data is proposed. The method uses principal components analysis on the articulatory and acoustic variables, and mapping between the domains by locally weighted linear regression, or loess [Cleveland, W. S. (1979). J. Am. Stat. Assoc. 74, 829–836]. The latter method permits local variation in the slopes of the linear regression, assuming that the function being approximated is smooth. The methodology is applied to vowels of four speakers in the Wisconsin X-ray Microbeam Speech Production Database, with formant analysis. Results are examined in terms of (1) examples of forward (articulation-to-acoustics) mappings and inverse mappings, (2) distributions of local slopes and constants, (3) examples of correlations among slopes and constants, (4) root-mean-square error, and (5) sensitivity of formant frequencies to articulatory change. It is shown that the results are qualitatively correct and that loess performs better than global regression. The forward mappings show different root-mean-square error properties than the inverse mappings indicating that this method is better suited for the forward mappings than the inverse mappings, at least for the data chosen for the current study. Some preliminary results on sensitivity of the first two formant frequencies to the two most important articulatory principal components are presented. PMID:19813812
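
    The locally weighted regression at the heart of this method is easy to reproduce on toy data. The following Python sketch, using the lowess smoother from statsmodels on synthetic one-dimensional data, stands in for the articulatory-to-acoustic mapping; the data and the frac setting are invented for illustration.

      # Locally weighted regression (loess/lowess) on synthetic data.
      import numpy as np
      from statsmodels.nonparametric.smoothers_lowess import lowess

      rng = np.random.default_rng(0)
      x = np.sort(rng.uniform(0.0, 1.0, 200))               # toy "articulatory" variable
      y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 200)   # toy "acoustic" response

      # frac sets the local neighborhood; smaller values permit more local
      # variation in the fitted slopes, the key property of loess noted above.
      fitted = lowess(y, x, frac=0.3, return_sorted=True)
      print(fitted[:5])  # columns: x, smoothed y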

  13. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich

    2003-01-01

    We report on a study to determine the maturity of different verification and validation (V&V) technologies on a representative example of NASA flight software. The study consisted of a controlled experiment where three technologies (static analysis, runtime analysis and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is the first (to the best of our knowledge) to do a controlled experiment to compare formal-methods-based tools to testing on a realistic, industrial-size example, where the emphasis was on collecting as much data as possible on the performance of the tools and the participants. The paper includes a description of the Rover code that was analyzed, the tools used, and a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results cannot be generalized, but we believe they can still serve as a valuable point of reference for future studies of this kind. The experiment did confirm our belief that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore, the results of the experiment inspired a novel framework for testing the next generation of the Rover.

  14. Diagnostics for generalized linear hierarchical models in network meta-analysis.

    PubMed

    Zhao, Hong; Hodges, James S; Carlin, Bradley P

    2017-09-01

    Network meta-analysis (NMA) combines direct and indirect evidence comparing more than 2 treatments. Inconsistency arises when these 2 information sources differ. Previous work focuses on inconsistency detection, but little has been done on how to proceed after identifying inconsistency. The key issue is whether inconsistency changes an NMA's substantive conclusions. In this paper, we examine such discrepancies from a diagnostic point of view. Our methods seek to detect influential and outlying observations in NMA at a trial-by-arm level. These observations may have a large effect on the parameter estimates in NMA, or they may deviate markedly from other observations. We develop formal diagnostics for a Bayesian hierarchical model to check the effect of deleting any observation. Diagnostics are specified for generalized linear hierarchical NMA models and investigated for both published and simulated datasets. Results from our example dataset using either contrast- or arm-based models and from the simulated datasets indicate that the sources of inconsistency in NMA tend not to be influential, though results from the example dataset suggest that they are likely to be outliers. This mimics a familiar result from linear model theory, in which outliers with low leverage are not influential. Future extensions include incorporating baseline covariates and individual-level patient data. Copyright © 2017 John Wiley & Sons, Ltd.

  15. Meta‐analysis using individual participant data: one‐stage and two‐stage approaches, and why they may differ

    PubMed Central

    Ensor, Joie; Riley, Richard D.

    2016-01-01

    Meta‐analysis using individual participant data (IPD) obtains and synthesises the raw, participant‐level data from a set of relevant studies. The IPD approach is becoming an increasingly popular tool as an alternative to traditional aggregate data meta‐analysis, especially as it avoids reliance on published results and provides an opportunity to investigate individual‐level interactions, such as treatment‐effect modifiers. There are two statistical approaches for conducting an IPD meta‐analysis: one‐stage and two‐stage. The one‐stage approach analyses the IPD from all studies simultaneously, for example, in a hierarchical regression model with random effects. The two‐stage approach derives aggregate data (such as effect estimates) in each study separately and then combines these in a traditional meta‐analysis model. There have been numerous comparisons of the one‐stage and two‐stage approaches via theoretical consideration, simulation and empirical examples, yet there remains confusion regarding when each approach should be adopted, and indeed why they may differ. In this tutorial paper, we outline the key statistical methods for one‐stage and two‐stage IPD meta‐analyses, and provide 10 key reasons why they may produce different summary results. We explain that most differences arise because of different modelling assumptions, rather than the choice of one‐stage or two‐stage itself. We illustrate the concepts with recently published IPD meta‐analyses, summarise key statistical software and provide recommendations for future IPD meta‐analyses. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:27747915

  16. Applying spatial analysis tools in public health: an example using SaTScan to detect geographic targets for colorectal cancer screening interventions.

    PubMed

    Sherman, Recinda L; Henry, Kevin A; Tannenbaum, Stacey L; Feaster, Daniel J; Kobetz, Erin; Lee, David J

    2014-03-20

    Epidemiologists are gradually incorporating spatial analysis into health-related research as geocoded cases of disease become widely available and health-focused geospatial computer applications are developed. One health-focused application of spatial analysis is cluster detection. Using cluster detection to identify geographic areas with high-risk populations and then screening those populations for disease can improve cancer control. SaTScan is a free cluster-detection software application used by epidemiologists around the world to describe spatial clusters of infectious and chronic disease, as well as disease vectors and risk factors. The objectives of this article are to describe how spatial analysis can be used in cancer control to detect geographic areas in need of colorectal cancer screening intervention, identify issues commonly encountered by SaTScan users, detail how to select the appropriate methods for using SaTScan, and explain how method selection can affect results. As an example, we used various methods to detect areas in Florida where the population is at high risk for late-stage diagnosis of colorectal cancer. We found that much of our analysis was underpowered and that no single method detected all clusters of statistical or public health significance. However, all methods detected 1 area as high risk; this area is potentially a priority area for a screening intervention. Cluster detection can be incorporated into routine public health operations, but the challenge is to identify areas in which the burden of disease can be alleviated through public health intervention. Reliance on SaTScan's default settings does not always produce pertinent results.

  17. Characterization methods for liquid interfacial layers

    NASA Astrophysics Data System (ADS)

    Javadi, A.; Mucic, N.; Karbaschi, M.; Won, J. Y.; Lotfi, M.; Dan, A.; Ulaganathan, V.; Gochev, G.; Makievski, A. V.; Kovalchuk, V. I.; Kovalchuk, N. M.; Krägel, J.; Miller, R.

    2013-05-01

    Liquid interfaces are encountered everywhere in our daily life. The corresponding interfacial properties and their modification play an important role in many modern technologies. The most prominent examples are the processes involved in the formation of foams and emulsions, as they are based on the fast creation of new surfaces, often of immense extension. During the formation of an emulsion, for example, all freshly created and already existing interfaces are permanently subject to all types of deformation. This clearly entails the need for quantitative knowledge of the relevant dynamic interfacial properties and their changes under conditions pertinent to the technological processes. We report on the state of the art of interfacial layer characterization, including the determination of thermodynamic quantities as a baseline for further quantitative analysis of the more important dynamic interfacial characteristics. The main focus of the presented work is on the experimental possibilities currently available for obtaining dynamic interfacial parameters, such as interfacial tensions, adsorbed amounts, interfacial composition, and visco-elastic parameters, at the shortest available surface ages and the fastest possible interfacial perturbations. The experimental opportunities are presented along with examples for selected systems and theoretical models for optimal data analysis. We also report on simulation results and on concepts for necessary refinements and developments in this important field of interfacial dynamics.

  18. Rural water supply and related services in developing countries — Comparative analysis of several approaches

    NASA Astrophysics Data System (ADS)

    Bajard, Y.; Draper, M.; Viens, P.

    1981-05-01

    The proposed paper deals with a comparative analysis of several approaches, both possible and actually used, for joint action of local institutions and foreign aid in the field of water supply and related services, such as sanitation, to villages and small rural agglomerations (market towns, etc.) in developing countries. This comparative analysis is based on examples of actual programmes in this field. The authors have participated in most of the programmes selected as examples, at various levels and in various capacities, from conception to design, implementation and/or evaluation (i.e. rural development programmes in Ivory Coast, Ghana (upper region), Benin and Ethiopia). The authors were not involved in other examples, such as water supply and/or sanitation to small urban centres in Benin, Ivory Coast, etc. They have, however, witnessed them directly and have obtained, therefore, first-hand information on their organization, execution and results. Several typical examples of actual projects are briefly defined and characterized. The paper then undertakes to compare, in a clinical fashion, the advantages and drawbacks of the approaches taken in the various examples presented. The paper finally proposes a recommendation for a realistic approach to joint action between local/domestic and foreign financing/assistance agencies and executing bodies (consultants, contractors) in the field of rural water supply, sanitation and, more generally, health improvement. This approach is defined in terms of a logical framework, i.e. goals, purposes, outputs and inputs at the various stages of the project, up to the actual evaluation of execution and impact where possible, with a description of practical indicators for the two types of evaluation. Particular attention is given to the problems of technological choices, in view of the constraints imposed by the natural environment and by human and social patterns, as well as by the institutions and the economy. Another important point taken into consideration by the paper is the problem of information, education, and support to users for the introduction, implementation, operation and maintenance of technical developments at village level. Conclusions are drawn as to the relative advantages of this approach over the "classical" approach and its replicability.

  19. Representation of scientific methodology in secondary science textbooks

    NASA Astrophysics Data System (ADS)

    Binns, Ian C.

    The purpose of this investigation was to assess the representation of scientific methodology in secondary science textbooks. More specifically, this study looked at how textbooks introduced scientific methodology and to what degree the examples from the rest of the textbook, the investigations, and the images were consistent with the text's description of scientific methodology, if at all. The sample included eight secondary science textbooks from two publishers, McGraw-Hill/Glencoe and Harcourt/Holt, Rinehart & Winston. Data consisted, first, of all student text and teacher text that referred to scientific methodology. Second, all investigations in the textbooks were analyzed. Finally, any images that depicted scientists working were also collected and analyzed. The text analysis and activity analysis used the ethnographic content analysis approach developed by Altheide (1996). The rubrics used for the text analysis and activity analysis were initially guided by the Benchmarks (AAAS, 1993), the NSES (NRC, 1996), and the nature of science literature. Preliminary analyses helped to refine each of the rubrics and grounded them in the data. Image analysis used stereotypes identified in the DAST literature. Findings indicated that all eight textbooks presented mixed views of scientific methodology in their initial descriptions. Five textbooks placed more emphasis on the traditional view and three placed more emphasis on the broad view. Results also revealed that the initial descriptions, examples, investigations, and images all emphasized the broad view for Glencoe Biology and the traditional view for Chemistry: Matter and Change. The initial descriptions, examples, investigations, and images in the other six textbooks were not consistent. Overall, the textbook with the most appropriate depiction of scientific methodology was Glencoe Biology and the textbook with the least appropriate depiction of scientific methodology was Physics: Principles and Problems. These findings suggest that, compared to earlier investigations, textbooks have begun to improve in how they represent scientific methodology. However, there is still much room for improvement. Future research needs to consider how textbooks impact teachers' and students' understandings of scientific methodology.

  20. Promoting College Students' Construction of Problem Schemata in Statistics Using Schema-Emphasizing Worked Examples

    ERIC Educational Resources Information Center

    Yan, Jie

    2010-01-01

    In this study, the effectiveness of worked examples that emphasizes problem features (data type, number of groups, purpose of analysis) associated with specific problem types (t-test, chi-square, correlation) were examined on students' construction of problem schemata compared to traditional solution-only worked examples. A sample of 96 students…

  1. Analysis and synthesis of abstract data types through generalization from examples

    NASA Technical Reports Server (NTRS)

    Wild, Christian

    1987-01-01

    The discovery of general patterns of behavior from a set of input/output examples can be a useful technique in the automated analysis and synthesis of software systems. These generalized descriptions of the behavior form a set of assertions which can be used for validation, program synthesis, program testing and run-time monitoring. Describing the behavior is characterized as a learning process in which general patterns can be easily characterized. The learning algorithm must choose a transform function and define a subset of the transform space which is related to equivalence classes of behavior in the original domain. An algorithm for analyzing the behavior of abstract data types is presented and several examples are given. The use of the analysis for purposes of program synthesis is also discussed.

  2. Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Y.J.; Reich, M.

    Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.

  3. A Bayesian approach to meta-analysis of plant pathology studies.

    PubMed

    Mila, A L; Ngugi, H K

    2011-01-01

    Bayesian statistical methods are used for meta-analysis in many disciplines, including medicine, molecular biology, and engineering, but have not yet been applied for quantitative synthesis of plant pathology studies. In this paper, we illustrate the key concepts of Bayesian statistics and outline the differences between Bayesian and classical (frequentist) methods in the way parameters describing population attributes are considered. We then describe a Bayesian approach to meta-analysis and present a plant pathological example based on studies evaluating the efficacy of plant protection products that induce systemic acquired resistance for the management of fire blight of apple. In a simple random-effects model assuming a normal distribution of effect sizes and no prior information (i.e., a noninformative prior), the results of the Bayesian meta-analysis are similar to those obtained with classical methods. Implementing the same model with a Student's t distribution and a noninformative prior for the effect sizes, instead of a normal distribution, yields similar results for all but acibenzolar-S-methyl (Actigard) which was evaluated only in seven studies in this example. Whereas both the classical (P = 0.28) and the Bayesian analysis with a noninformative prior (95% credibility interval [CRI] for the log response ratio: -0.63 to 0.08) indicate a nonsignificant effect for Actigard, specifying a t distribution resulted in a significant, albeit variable, effect for this product (CRI: -0.73 to -0.10). These results confirm the sensitivity of the analytical outcome (i.e., the posterior distribution) to the choice of prior in Bayesian meta-analyses involving a limited number of studies. We review some pertinent literature on more advanced topics, including modeling of among-study heterogeneity, publication bias, analyses involving a limited number of studies, and methods for dealing with missing data, and show how these issues can be approached in a Bayesian framework. Bayesian meta-analysis can readily include information not easily incorporated in classical methods, and allow for a full evaluation of competing models. Given the power and flexibility of Bayesian methods, we expect them to become widely adopted for meta-analysis of plant pathology studies.
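
    A simple random-effects model of the kind described can be fitted with a few lines of self-written sampling code. The Python sketch below runs a random-walk Metropolis sampler over the mean effect and the between-study standard deviation; the effect sizes, standard errors, and priors are invented for illustration and are not the fire blight data analyzed in the paper.

      # Bayesian random-effects meta-analysis via random-walk Metropolis.
      import numpy as np

      y  = np.array([-0.41, -0.30, -0.62, -0.10, -0.55])  # hypothetical log response ratios
      se = np.array([ 0.20,  0.25,  0.30,  0.15,  0.35])  # hypothetical standard errors

      def log_post(mu, tau):
          if tau <= 0:
              return -np.inf                     # between-study SD must be positive
          var = se**2 + tau**2                   # within- plus between-study variance
          loglik = -0.5 * np.sum(np.log(var) + (y - mu)**2 / var)
          return loglik - 0.5 * (mu / 10)**2     # weak N(0, 10) prior on mu, flat on tau

      rng = np.random.default_rng(1)
      mu, tau, draws = 0.0, 0.1, []
      for i in range(50_000):
          mu_new, tau_new = mu + rng.normal(0, 0.1), tau + rng.normal(0, 0.05)
          if np.log(rng.uniform()) < log_post(mu_new, tau_new) - log_post(mu, tau):
              mu, tau = mu_new, tau_new          # accept the proposed move
          if i >= 10_000:                        # discard burn-in draws
              draws.append(mu)

      lo, hi = np.percentile(draws, [2.5, 97.5])
      print(f"95% credibility interval for the mean effect: ({lo:.2f}, {hi:.2f})")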

  4. An Example-Based Brain MRI Simulation Framework.

    PubMed

    He, Qing; Roy, Snehashis; Jog, Amod; Pham, Dzung L

    2015-02-21

    The simulation of magnetic resonance (MR) images plays an important role in the validation of image analysis algorithms such as image segmentation, due to lack of sufficient ground truth in real MR images. Previous work on MRI simulation has focused on explicitly modeling the MR image formation process. However, because of the overwhelming complexity of MR acquisition these simulations must involve simplifications and approximations that can result in visually unrealistic simulated images. In this work, we describe an example-based simulation framework, which uses an "atlas" consisting of an MR image and its anatomical models derived from the hard segmentation. The relationships between the MR image intensities and its anatomical models are learned using a patch-based regression that implicitly models the physics of the MR image formation. Given the anatomical models of a new brain, a new MR image can be simulated using the learned regression. This approach has been extended to also simulate intensity inhomogeneity artifacts based on the statistical model of training data. Results show that the example based MRI simulation method is capable of simulating different image contrasts and is robust to different choices of atlas. The simulated images resemble real MR images more than simulations produced by a physics-based model.
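
    The core of such an example-based approach, regression from anatomical-label patches to image intensities, can be sketched compactly. The toy Python example below uses a k-nearest-neighbours regressor on 2-D patches; the synthetic atlas data, patch size, and choice of regressor are assumptions for illustration, not the authors' implementation.

      # Patch-based regression: label patches -> center-pixel intensity.
      import numpy as np
      from sklearn.neighbors import KNeighborsRegressor

      def patches(img):
          """Flattened 3 x 3 patches around every interior pixel."""
          h, w = img.shape
          return np.array([img[i - 1:i + 2, j - 1:j + 2].ravel()
                           for i in range(1, h - 1) for j in range(1, w - 1)])

      rng = np.random.default_rng(0)
      atlas_labels = rng.integers(0, 3, (32, 32)).astype(float)      # toy segmentation
      atlas_image  = atlas_labels * 40 + rng.normal(0, 2, (32, 32))  # toy intensities

      X = patches(atlas_labels)                   # "anatomical model" features
      t = atlas_image[1:-1, 1:-1].ravel()         # intensity at each patch center
      model = KNeighborsRegressor(n_neighbors=5).fit(X, t)

      new_labels = rng.integers(0, 3, (32, 32)).astype(float)        # a "new brain"
      simulated = model.predict(patches(new_labels)).reshape(30, 30)
      print(simulated.shape)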

  5. The mathematics textbook at tertiary level as curriculum material - exploring the teacher's decision-making process

    NASA Astrophysics Data System (ADS)

    Randahl, Mira

    2016-08-01

    This paper reports on a study about how the mathematics textbook was perceived and used by the teacher in the context of a calculus part of a basic mathematics course for first-year engineering students. The focus was on the teacher's choices and the use of definitions, examples and exercises in a sequence of lectures introducing the derivative concept. Data were collected during observations of lectures and an interview, and informal talks with the teacher. The introduction and the treatment of the derivative as proposed by the teacher during the lectures were analysed in relation to the results of the content text analysis of the textbook. The teacher's decisions were explored through the lens of intended learning goals for engineering students taking the mathematics course. The results showed that the sequence of concepts and the formal introduction of the derivative as proposed by the textbook were closely followed during the lectures. The examples and tasks offered to the students focused strongly on procedural knowledge. Although the textbook proposes both examples and exercises that promote conceptual knowledge, these opportunities were not fully utilized during the observed lectures. Possible reasons for the teacher's choices and decisions are discussed.

  6. Differential Amplifier with Current-Mirror Load: Influence of Current Gain, Early Voltage, and Supply Voltage on the DC Output Voltage

    ERIC Educational Resources Information Center

    Paulik, G. F.; Mayer, R. P.

    2012-01-01

    A differential amplifier composed of an emitter-coupled pair is useful as an example in lecture presentations and laboratory experiments in electronic circuit analysis courses. However, in an active circuit with zero input load V[subscript id], both laboratory measurements and PSPICE and LTspice simulation results for the output voltage…

  7. Modeling post-fire woody carbon dynamics with data from remeasured inventory plots

    Treesearch

    Bianca N.I. Eskelson; Jeremy Fried; Vicente Monleon

    2015-01-01

    In California, the Forest Inventory and Analysis (FIA) plots within large fires were visited one year after the fire occurred resulting in a time series of measurements before and after fire. During this additional plot visit, the standard inventory measurements were augmented for these burned plots to assess fire effects. One example of the additional measurements is...

  8. Flight dynamics analysis and simulation of heavy lift airships. Volume 2: Technical manual

    NASA Technical Reports Server (NTRS)

    Ringland, R. F.; Tischler, M. B.; Jex, H. R.; Emmen, R. D.; Ashkenas, I. L.

    1982-01-01

    The mathematical models embodied in the simulation are described in considerable detail and with supporting evidence for the model forms chosen. In addition the trimming and linearization algorithms used in the simulation are described. Appendices to the manual identify reference material for estimating the needed coefficients for the input data and provide example simulation results.

  9. Measuring the contour of a wavefront using the Irradiance Transport Equation (ITE)

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, Luis; Granados-Agustín, Fermín; Fernández-Guasti, Manuel; Cornejo-Rodríguez, Alejandro

    2006-01-01

    The Irradiance Transport Equation (ITE), derived by Teague, has been used in optics for different applications. One of the fields where it has been used is optical testing, for example, with the method developed by Takeda. In this paper, following the idea of using different optical and mathematical analysis methods, theoretical and experimental results are presented.

  10. Using Case-Mix Adjustment Methods To Measure the Effectiveness of Substance Abuse Treatment: Three Examples Using Client Employment Outcomes.

    ERIC Educational Resources Information Center

    Koenig, Lane; Fields, Errol L.; Dall, Timothy M.; Ameen, Ansari Z.; Harwood, Henrick J.

    This report demonstrates three applications of case-mix methods using regression analysis. The results are used to assess the relative effectiveness of substance abuse treatment providers. The report also examines the ability of providers to improve client employment outcomes, an outcome domain relatively unexamined in the assessment of provider…

  11. Performance-Based Seismic Design of Steel Frames Utilizing Colliding Bodies Algorithm

    PubMed Central

    Veladi, H.

    2014-01-01

    A pushover analysis method based on semirigid connection concept is developed and the colliding bodies optimization algorithm is employed to find optimum seismic design of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared to the conventional design methods to show the power or weakness of the algorithm. PMID:25202717

  12. Performance-based seismic design of steel frames utilizing colliding bodies algorithm.

    PubMed

    Veladi, H

    2014-01-01

    A pushover analysis method based on semirigid connection concept is developed and the colliding bodies optimization algorithm is employed to find optimum seismic design of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared to the conventional design methods to show the power or weakness of the algorithm.

  13. Agricultural Research Service research highlights in remote sensing for calendar year 1981

    NASA Technical Reports Server (NTRS)

    Ritchie, J. C. (Compiler)

    1982-01-01

    Selected examples of research accomplishments related to remote sensing are compiled. A brief statement is given to highlight the significant results of each research project. A list of 1981 publications and location contacts is also given. The projects cover emission and reflectance analysis, identification of crop and soil parameters, and the utilization of remote sensing data.

  14. Nonclassicality thresholds for multiqubit states: Numerical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruca, Jacek; Zukowski, Marek; Laskowski, Wieslaw

    2010-07-15

    States that strongly violate Bell's inequalities are required in many quantum-informational protocols as, for example, in cryptography, secret sharing, and the reduction of communication complexity. We investigate families of such states with a numerical method which allows us to reveal nonclassicality even without direct knowledge of Bell's inequalities for the given problem. An extensive set of numerical results is presented and discussed.

  15. Methodology for Examining Effects of Arms Control Reduction on Tactical Air Forces. An Example from Conventional Forces in Europe (CFE) Treaty Analysis

    DTIC Science & Technology

    1993-01-01

    H. Wegner for developing the tactical air and ground force databases and producing the campaign results. Thanks are also due to Group Captain Michael ... Jackson, RAF, for developing the evaluation criteria for NATO’s tactical air force reductions during his stay at RAND.

  16. A critical review of published methods for analysis of red cell antigen-antibody reactions by flow cytometry, and approaches for resolving problems with red cell agglutination.

    PubMed

    Arndt, Patricia A; Garratty, George

    2010-07-01

    Flow cytometry operators often apply familiar white blood cell (WBC) methods when studying red blood cell (RBC) antigens and antibodies. Some WBC methods are not appropriate for RBCs, as the analysis of RBCs requires special considerations, for example, avoidance of agglutination. One hundred seventy-six published articles from 88 groups studying RBC interactions were reviewed. Three fourths of groups used at least one unnecessary WBC procedure for RBCs, and about one fourth did not use any method to prevent/disperse RBC agglutination. Flow cytometric studies were performed to determine the effect of RBC agglutination on results and compare different methods of preventing and/or dispersing agglutination. The presence of RBC agglutinates has been shown to be affected by the type of pipette tip used for mixing RBC suspensions, the number of antigen sites/RBC, the type and concentration of primary antibody, and the type of secondary antibody. For quantitation methods, for example, fetal maternal hemorrhage, the presence of agglutinates has been shown to adversely affect results (fewer fetal D+ RBCs detected). Copyright 2010 Elsevier Inc. All rights reserved.

  17. Characteristics Desired in Clinical Data Warehouse for Biomedical Research

    PubMed Central

    Shin, Soo-Yong; Kim, Woo Sung

    2014-01-01

    Objectives Due to the unique characteristics of clinical data, clinical data warehouses (CDWs) have not been successful so far. In particular, the use of CDWs for biomedical research has been relatively unsuccessful. The characteristics necessary for the successful implementation and operation of a CDW for biomedical research have not yet been clearly defined. Methods Three examples of CDWs were reviewed: a multipurpose CDW in a hospital, a CDW for independent multi-institutional research, and a CDW for research use in an institution. After reviewing the three CDW examples, we propose some key characteristics needed in a CDW for biomedical research. Results A CDW for research should include an honest broker system and an Institutional Review Board approval interface to comply with governmental regulations. It should also include a simple query interface, an anonymized data review tool, and a data extraction tool. Also, it should be a biomedical research platform for data repository use as well as data analysis. Conclusions The proposed characteristics desired in a CDW may have limited transfer value to organizations in other countries. However, these analysis results are still valid in Korea, and we have developed a clinical research data warehouse based on these desiderata. PMID:24872909

  18. Application of Different Statistical Techniques in Integrated Logistics Support of the International Space Station Alpha

    NASA Technical Reports Server (NTRS)

    Sepehry-Fard, F.; Coulthard, Maurice H.

    1995-01-01

    The process used to predict the values of time-dependent maintenance parameters, such as mean time between failures (MTBF), must not in turn introduce uncontrolled deviation into the results of the ILS analysis, such as life cycle cost and spares calculations. A minor deviation in the values of time-dependent maintenance parameters such as MTBF will have a significant impact on logistics resource demands, International Space Station availability, and maintenance support costs. It is the objective of this report to identify, by providing examples, the magnitude of the expected enhancement in the accuracy of the results for the International Space Station reliability and maintainability data packages. These examples partially portray the necessary information by evaluating the impact of the said enhancements on the life cycle cost and the availability of the International Space Station.

  19. Statistical analysis of QC data and estimation of fuel rod behaviour

    NASA Astrophysics Data System (ADS)

    Heins, L.; Groß, H.; Nissen, K.; Wunderlich, F.

    1991-02-01

    The behaviour of fuel rods while in reactor is influenced by many parameters. As far as fabrication is concerned, fuel pellet diameter and density, and inner cladding diameter are important examples. Statistical analyses of quality control data show a scatter of these parameters within the specified tolerances. At present it is common practice to use a combination of superimposed unfavorable tolerance limits (worst case dataset) in fuel rod design calculations. Distributions are not considered. The results obtained in this way are very conservative but the degree of conservatism is difficult to quantify. Probabilistic calculations based on distributions allow the replacement of the worst case dataset by a dataset leading to results with known, defined conservatism. This is achieved by response surface methods and Monte Carlo calculations on the basis of statistical distributions of the important input parameters. The procedure is illustrated by means of two examples.
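
    The probabilistic idea can be illustrated with a few lines of Monte Carlo code. In the Python sketch below, fabrication parameters are sampled from truncated normal distributions inside their tolerance bands and propagated through a toy response (the pellet-cladding gap); all numbers are invented and the response is a stand-in, not a fuel rod model.

      # Monte Carlo propagation of fabrication tolerances (toy example).
      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      def trunc_normal(mean, sd, lo, hi, size):
          """Rejection-sample a normal restricted to the tolerance band [lo, hi]."""
          x = rng.normal(mean, sd, size * 2)   # oversample, then keep in-band draws
          x = x[(x >= lo) & (x <= hi)]
          return x[:size]

      pellet_diam = trunc_normal(8.19, 0.01, 8.16, 8.22, n)  # mm, hypothetical
      clad_inner  = trunc_normal(8.36, 0.01, 8.33, 8.39, n)  # mm, hypothetical
      gap = clad_inner - pellet_diam                         # toy response

      worst_case = 8.33 - 8.22      # superimposed unfavorable tolerance limits
      print(f"worst-case gap:               {worst_case:.3f} mm")
      print(f"0.1% quantile of sampled gap: {np.quantile(gap, 0.001):.3f} mm")

    Comparing the worst-case value with a low quantile of the sampled distribution shows how a worst-case dataset can be replaced by one whose conservatism is known and defined.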

  20. An adaptive front tracking technique for three-dimensional transient flows

    NASA Astrophysics Data System (ADS)

    Galaktionov, O. S.; Anderson, P. D.; Peters, G. W. M.; van de Vosse, F. N.

    2000-01-01

    An adaptive technique, based on both surface stretching and surface curvature analysis, for tracking strongly deforming fluid volumes in three-dimensional flows is presented. The efficiency and accuracy of the technique are demonstrated for two- and three-dimensional flow simulations. For the two-dimensional test example, the results are compared with results obtained using a different tracking approach based on the advection of a passive scalar. Although for both techniques roughly the same structures are found, the resolution for the front tracking technique is much higher. In the three-dimensional test example, a spherical blob is tracked in a chaotic mixing flow. For this problem, the accuracy of the adaptive tracking is demonstrated by the volume conservation for the advected blob. Adaptive front tracking is suitable for simulation of the initial stages of fluid mixing, where the interfacial area can grow exponentially with time. The efficiency of the algorithm significantly benefits from parallelization of the code.

  1. Semantic Features for Classifying Referring Search Terms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, Chandler J.; Henry, Michael J.; McGrath, Liam R.

    2012-05-11

    When an internet user clicks on a result in a search engine, a request is submitted to the destination web server that includes a referrer field containing the search terms given by the user. Using this information, website owners can analyze the search terms leading to their websites to better understand their visitors' needs. This work explores some of the features that can be used for classification-based analysis of such referring search terms. We present initial results for the example task of classifying HTTP requests by country of origin. A system that can accurately predict the country of origin from query text may be a valuable complement to IP lookup methods, which are susceptible to the obfuscation of dereferrers or proxies. We suggest that the addition of semantic features improves classifier performance in this example application. We begin by looking at related work and presenting our approach. After describing initial experiments and results, we discuss paths forward for this work.
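
    A minimal version of such classification-based analysis can be put together with standard tools. The Python sketch below uses character n-gram TF-IDF features (a crude stand-in for the semantic features discussed in the paper) and logistic regression; the example queries and country labels are invented.

      # Classifying referring search terms by (toy) country of origin.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      queries = ["cheap flights london", "wetterbericht berlin",
                 "meteo demain paris", "train times manchester",
                 "recette crepes faciles", "bundesliga ergebnisse heute"]
      labels  = ["GB", "DE", "FR", "GB", "FR", "DE"]   # hypothetical origins

      clf = make_pipeline(
          TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
          LogisticRegression(max_iter=1000),
      )
      clf.fit(queries, labels)
      print(clf.predict(["horaires sncf paris"]))      # expected: ['FR']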

  2. On the statistical assessment of classifiers using DNA microarray data

    PubMed Central

    Ancona, N; Maglietta, R; Piepoli, A; D'Addabbo, A; Cotugno, R; Savino, M; Liuni, S; Carella, M; Pesole, G; Perri, F

    2006-01-01

    Background In this paper we present a method for the statistical assessment of cancer predictors which make use of gene expression profiles. The methodology is applied to a new data set of microarray gene expression data collected in Casa Sollievo della Sofferenza Hospital, Foggia – Italy. The data set is made up of normal (22) and tumor (25) specimens extracted from 25 patients affected by colon cancer. We propose to give answers to some questions which are relevant for the automatic diagnosis of cancer, such as: Is the size of the available data set sufficient to build accurate classifiers? What is the statistical significance of the associated error rates? In what ways can accuracy be considered dependent on the adopted classification scheme? How many genes are correlated with the pathology and how many are sufficient for an accurate colon cancer classification? The method we propose answers these questions whilst avoiding the potential pitfalls hidden in the analysis and interpretation of microarray data. Results We estimate the generalization error, evaluated through the Leave-K-Out Cross Validation error, for three different classification schemes by varying the number of training examples and the number of genes used. The statistical significance of the error rate is measured by using a permutation test. We provide a statistical analysis in terms of the frequencies of the genes involved in the classification. Using the whole set of genes, we found that the Weighted Voting Algorithm (WVA) classifier learns the distinction between normal and tumor specimens with 25 training examples, providing an error rate of e = 21% (p = 0.045). This remains constant even when the number of examples increases. Moreover, Regularized Least Squares (RLS) and Support Vector Machines (SVM) classifiers can learn with only 15 training examples, with error rates of e = 19% (p = 0.035) and e = 18% (p = 0.037), respectively. Moreover, the error rate decreases as the training set size increases, reaching its best performance with 35 training examples. In this case, RLS and SVM have error rates of e = 14% (p = 0.027) and e = 11% (p = 0.019). Concerning the number of genes, we found about 6000 genes (p < 0.05) correlated with the pathology, based on the signal-to-noise statistic. Moreover, the performance of the RLS and SVM classifiers does not change when 74% of the genes are used; it progressively degrades, with the error rate rising to e = 16% (p < 0.05), when only 2 genes are employed. The biological relevance of a set of genes determined by our statistical analysis and the major roles they play in colorectal tumorigenesis are discussed. Conclusions The method proposed provides statistically significant answers to precise questions relevant for the diagnosis and prognosis of cancer. We found that, with as few as 15 examples, it is possible to train statistically significant classifiers for colon cancer diagnosis. As for the number of genes sufficient for a reliable classification of colon cancer, our results suggest that it depends on the accuracy required. PMID:16919171
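
    The permutation-test assessment of an error rate described above is straightforward to reproduce. The following Python sketch uses scikit-learn's permutation_test_score with a linear SVM on synthetic data shaped like the study (47 specimens, many genes); the data and parameter choices are invented for illustration.

      # Permutation test for the significance of a cross-validated accuracy.
      import numpy as np
      from sklearn.model_selection import StratifiedKFold, permutation_test_score
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.normal(size=(47, 500))       # 47 specimens, 500 toy "genes"
      y = np.array([0] * 22 + [1] * 25)    # normal vs tumor labels
      X[y == 1, :10] += 1.0                # plant a signal in 10 genes

      score, perm_scores, pvalue = permutation_test_score(
          SVC(kernel="linear"), X, y,
          cv=StratifiedKFold(5), n_permutations=200, random_state=0)
      print(f"cross-validated accuracy = {score:.2f}, permutation p = {pvalue:.3f}")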

  3. A Multiscale, Nonlinear, Modeling Framework Enabling the Design and Analysis of Composite Materials and Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2012-01-01

    A framework for the multiscale design and analysis of composite materials and structures is presented. The ImMAC software suite, developed at NASA Glenn Research Center, embeds efficient, nonlinear micromechanics capabilities within higher scale structural analysis methods such as finite element analysis. The result is an integrated, multiscale tool that relates global loading to the constituent scale, captures nonlinearities at this scale, and homogenizes local nonlinearities to predict their effects at the structural scale. Example applications of the multiscale framework are presented for the stochastic progressive failure of a SiC/Ti composite tensile specimen and the effects of microstructural variations on the nonlinear response of woven polymer matrix composites.

  4. STAGS Developments for Residual Strength Analysis Methods for Metallic Fuselage Structures

    NASA Technical Reports Server (NTRS)

    Young, Richard D.; Rose, Cheryl A.

    2014-01-01

    A summary of advances in the Structural Analysis of General Shells (STAGS) finite element code for the residual strength analysis of metallic fuselage structures, realized through collaboration between the structures group at NASA Langley and Dr. Charles Rankin, is presented. The majority of the advancements described were made in the 1990's under the NASA Airframe Structural Integrity Program (NASIP). Example results from studies that were conducted using the STAGS code to develop improved understanding of the nonlinear response of cracked fuselage structures subjected to combined loads are presented. An integrated residual strength analysis methodology for metallic structures that models crack growth to predict the effect of cracks on structural integrity is demonstrated.

  5. Net present value analysis to select public R&D programs and valuate expected private sector participation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hinman, N.D.; Yancey, M.A.

    1997-12-31

    One of the main functions of government is to invest taxpayers' dollars in projects, programs, and properties that will result in social benefit. Public programs focused on the development of technology are examples of such opportunities. Selecting these programs requires the same investment analysis approaches that private companies and individuals use. Sound application of investment analysis to these programs will minimize tax costs and maximize the public benefit from the tax dollars invested. This article describes the use of the net present value (NPV) analysis approach to select public R&D programs and to valuate expected private sector participation in those programs.
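
    As a brief illustration of the NPV approach the article applies, the sketch below discounts a hypothetical public R&D cash-flow stream; the cost, benefit, and rate figures are invented for illustration.

      def npv(rate, cash_flows):
          """Net present value, where cash_flows[t] occurs at the end of year t."""
          return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

      # Year 0: program cost; years 1-5: assumed social benefits (dollars).
      program = [-10e6, 1e6, 3e6, 4e6, 5e6, 5e6]
      print(f"NPV at a 7% discount rate: ${npv(0.07, program):,.0f}")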

  6. A Multiscale, Nonlinear, Modeling Framework Enabling the Design and Analysis of Composite Materials and Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2011-01-01

    A framework for the multiscale design and analysis of composite materials and structures is presented. The ImMAC software suite, developed at NASA Glenn Research Center, embeds efficient, nonlinear micromechanics capabilities within higher scale structural analysis methods such as finite element analysis. The result is an integrated, multiscale tool that relates global loading to the constituent scale, captures nonlinearities at this scale, and homogenizes local nonlinearities to predict their effects at the structural scale. Example applications of the multiscale framework are presented for the stochastic progressive failure of a SiC/Ti composite tensile specimen and the effects of microstructural variations on the nonlinear response of woven polymer matrix composites.

  7. A data analysis framework for biomedical big data: Application on mesoderm differentiation of human pluripotent stem cells

    PubMed Central

    Karlsson, Alexander; Riveiro, Maria; Améen, Caroline; Åkesson, Karolina; Andersson, Christian X.; Sartipy, Peter; Synnergren, Jane

    2017-01-01

    The development of high-throughput biomolecular technologies has resulted in generation of vast omics data at an unprecedented rate. This is transforming biomedical research into a big data discipline, where the main challenges relate to the analysis and interpretation of data into new biological knowledge. The aim of this study was to develop a framework for biomedical big data analytics, and apply it for analyzing transcriptomics time series data from early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. To this end, transcriptome profiling by microarray was performed on differentiating human pluripotent stem cells sampled at eleven consecutive days. The gene expression data was analyzed using the five-stage analysis framework proposed in this study, including data preparation, exploratory data analysis, confirmatory analysis, biological knowledge discovery, and visualization of the results. Clustering analysis revealed several distinct expression profiles during differentiation. Genes with an early transient response were strongly related to embryonic- and mesendoderm development, for example CER1 and NODAL. Pluripotency genes, such as NANOG and SOX2, exhibited substantial downregulation shortly after onset of differentiation. Rapid induction of genes related to metal ion response, cardiac tissue development, and muscle contraction were observed around day five and six. Several transcription factors were identified as potential regulators of these processes, e.g. POU1F1, TCF4 and TBP for muscle contraction genes. Pathway analysis revealed temporal activity of several signaling pathways, for example the inhibition of WNT signaling on day 2 and its reactivation on day 4. This study provides a comprehensive characterization of biological events and key regulators of the early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. The proposed analysis framework can be used to structure data analysis in future research, both in stem cell differentiation, and more generally, in biomedical big data analytics. PMID:28654683

  8. A data analysis framework for biomedical big data: Application on mesoderm differentiation of human pluripotent stem cells.

    PubMed

    Ulfenborg, Benjamin; Karlsson, Alexander; Riveiro, Maria; Améen, Caroline; Åkesson, Karolina; Andersson, Christian X; Sartipy, Peter; Synnergren, Jane

    2017-01-01

    The development of high-throughput biomolecular technologies has resulted in generation of vast omics data at an unprecedented rate. This is transforming biomedical research into a big data discipline, where the main challenges relate to the analysis and interpretation of data into new biological knowledge. The aim of this study was to develop a framework for biomedical big data analytics, and apply it for analyzing transcriptomics time series data from early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. To this end, transcriptome profiling by microarray was performed on differentiating human pluripotent stem cells sampled at eleven consecutive days. The gene expression data was analyzed using the five-stage analysis framework proposed in this study, including data preparation, exploratory data analysis, confirmatory analysis, biological knowledge discovery, and visualization of the results. Clustering analysis revealed several distinct expression profiles during differentiation. Genes with an early transient response were strongly related to embryonic- and mesendoderm development, for example CER1 and NODAL. Pluripotency genes, such as NANOG and SOX2, exhibited substantial downregulation shortly after onset of differentiation. Rapid induction of genes related to metal ion response, cardiac tissue development, and muscle contraction were observed around day five and six. Several transcription factors were identified as potential regulators of these processes, e.g. POU1F1, TCF4 and TBP for muscle contraction genes. Pathway analysis revealed temporal activity of several signaling pathways, for example the inhibition of WNT signaling on day 2 and its reactivation on day 4. This study provides a comprehensive characterization of biological events and key regulators of the early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. The proposed analysis framework can be used to structure data analysis in future research, both in stem cell differentiation, and more generally, in biomedical big data analytics.

  9. Potential of knowledge discovery using workflows implemented in the C3Grid

    NASA Astrophysics Data System (ADS)

    Engel, Thomas; Fink, Andreas; Ulbrich, Uwe; Schartner, Thomas; Dobler, Andreas; Fritzsch, Bernadette; Hiller, Wolfgang; Bräuer, Benny

    2013-04-01

    With the increasing number of climate simulations, reanalyses and observations, new infrastructures to search and analyse distributed data are necessary. In recent years, the Grid architecture became an important technology for fulfilling these demands. For the German project "Collaborative Climate Community Data and Processing Grid" (C3Grid), computer scientists and meteorologists developed a system that offers its users a web interface to search and download climate data and to apply implemented analysis tools (called workflows) for further investigation. In this contribution, two workflows that are implemented in the C3Grid architecture are presented: the Cyclone Tracking (CT) and Stormtrack workflows. They serve as examples of how to perform numerous investigations of midlatitude winter storms on large volumes of analysis and climate model data without insight into the data source or program code, and with only a low-to-moderate understanding of the theoretical background. CT is based on the work of Murray and Simmonds (1991) and identifies and tracks local minima in the mean sea level pressure (MSLP) field of the selected dataset. Adjustable thresholds for the curvature of the isobars as well as the minimum lifetime of a cyclone allow the distinction of weak subtropical heat lows from stronger midlatitude cyclones, e.g. in the North Atlantic. The user obtains the resulting track data, including statistics on track density, average central pressure, average central curvature, cyclogenesis and cyclolysis, as well as pre-built visualizations of these results. Stormtrack calculates the 2.5-6 day bandpass-filtered standard deviation of the geopotential height on a selected pressure level. Although this workflow needs much less computational effort than CT, it shows structures that are in good agreement with the track density of the CT workflow. Combining the two workflows makes it possible to study to what extent changes in the mid-level tropospheric storm track are reflected in the density and intensity of surface cyclones. A specific feature of C3Grid is the flexible Workflow Scheduling Service (WSS), which also allows automated nightly analysis runs of CT, Stormtrack, etc. with different input parameter sets. The statistical results of these workflows can be accumulated afterwards by a scheduled final analysis step, thereby providing a tool for data-intensive analytics on the massive amounts of climate model data accessible through C3Grid. First tests with these automated analysis workflows show promising results for speeding up the investigation of high-volume modeling data. This example is relevant to the thorough analysis of future changes in storminess in Europe and is just one example of the potential of knowledge discovery using automated workflows implemented in the C3Grid architecture.
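
    The Stormtrack diagnostic described above reduces to a standard signal-processing step: bandpass-filter a geopotential height series to the 2.5-6 day synoptic band and take its standard deviation. A minimal single-grid-point sketch follows; the synthetic 6-hourly series and the Butterworth filter design are assumptions, not the C3Grid implementation.

      import numpy as np
      from scipy.signal import butter, sosfiltfilt

      fs = 4.0                                  # samples per day (6-hourly data)
      rng = np.random.default_rng(1)
      # Synthetic 500 hPa geopotential height series (geopotential meters).
      z500 = 5500 + np.cumsum(rng.normal(0, 5, size=4 * 360))

      low, high = 1 / 6.0, 1 / 2.5              # cycles/day for 2.5-6 day periods
      sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
      z_bp = sosfiltfilt(sos, z500)             # zero-phase bandpass filtering
      print(f"2.5-6 day bandpass-filtered std dev: {z_bp.std():.1f} gpm")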

  10. Analysis of Surface Charging for a Candidate Solar Sail Mission Using NASCAP-2K

    NASA Technical Reports Server (NTRS)

    Parker, Linda Neergaard; Minow, Joseph L.; Davis, V. A.; Mandell, Myron; Gardner, Barbara

    2005-01-01

    The characterization of the electromagnetic interaction for a solar sail in the solar wind environment and identification of viable charging mitigation strategies are critical solar sail mission design tasks. Spacecraft charging has important implications both for science applications and for lifetime and reliability issues of sail propulsion systems. To that end, surface charging calculations of a candidate 150-meter-class solar sail spacecraft for the 0.5 AU solar polar and 1.0 AU L1 solar wind environments are performed. A model of the spacecraft with candidate materials having appropriate electrical properties is constructed using Object Toolkit. The spacecraft charging analysis is performed using Nascap-2k, the NASA/AFRL-sponsored spacecraft charging analysis tool. Nominal and atypical solar wind environments appropriate for the 0.5 AU and 1.0 AU missions are used to establish current collection of solar wind ions and electrons. Finally, a geostationary orbit environment case is included to demonstrate a bounding example of extreme (negative) charging of a solar sail spacecraft. Results from the charging analyses demonstrate that minimal differential potentials (and resulting threat of electrostatic discharge) occur when the spacecraft is constructed entirely of conducting materials, as anticipated from standard guidelines for mitigation of spacecraft charging issues. Examples with dielectric materials exposed to the space environment exhibit differential potentials ranging from a few volts to extreme potentials in the kilovolt range.

  11. Analysis of Surface Charging for a Candidate Solar Sail Mission Using Nascap-2k

    NASA Technical Reports Server (NTRS)

    Parker, Linda Neergaard; Minow, Joseph I.; Davis, Victoria; Mandell, Myron; Gardner, Barbara

    2005-01-01

    The characterization of the electromagnetic interaction for a solar sail in the solar wind environment and identification of viable charging mitigation strategies are critical solar sail mission design tasks. Spacecraft charging has important implications both for science applications and for lifetime and reliability issues of sail propulsion systems. To that end, surface charging calculations of a candidate 150-meter-class solar sail spacecraft for the 0.5 AU solar polar and 1.0 AU L1 solar wind environments are performed. A model of the spacecraft with candidate materials having appropriate electrical properties is constructed using Object Toolkit. The spacecraft charging analysis is performed using Nascap-2k, the NASA/AFRL-sponsored spacecraft charging analysis tool. Nominal and atypical solar wind environments appropriate for the 0.5 AU and 1.0 AU missions are used to establish current collection of solar wind ions and electrons. Finally, a geostationary orbit environment case is included to demonstrate a bounding example of extreme (negative) charging of a solar sail spacecraft. Results from the charging analyses demonstrate that minimal differential potentials (and resulting threat of electrostatic discharge) occur when the spacecraft is constructed entirely of conducting materials, as anticipated from standard guidelines for mitigation of spacecraft charging issues. Examples with dielectric materials exposed to the space environment exhibit differential potentials ranging from a few volts to extreme potentials in the kilovolt range.

  12. A methodological investigation of hominoid craniodental morphology and phylogenetics.

    PubMed

    Bjarnason, Alexander; Chamberlain, Andrew T; Lockwood, Charles A

    2011-01-01

    The evolutionary relationships of extant great apes and humans have been largely resolved by molecular studies, yet morphology-based phylogenetic analyses continue to provide conflicting results. In order to further investigate this discrepancy we present bootstrap clade support of morphological data based on two quantitative datasets, one dataset consisting of linear measurements of the whole skull from 5 hominoid genera and the second dataset consisting of 3D landmark data from the temporal bone of 5 hominoid genera, including 11 sub-species. Using similar protocols for both datasets, we were able to 1) compare distance-based phylogenetic methods to cladistic parsimony of quantitative data converted into discrete character states, 2) vary outgroup choice to observe its effect on phylogenetic inference, and 3) analyse male and female data separately to observe the effect of sexual dimorphism on phylogenies. Phylogenetic analysis was sensitive to methodological decisions, particularly outgroup selection, where designation of Pongo as an outgroup and removal of Hylobates resulted in greater congruence with the proposed molecular phylogeny. The performance of distance-based methods also justifies their use in phylogenetic analysis of morphological data. It is clear from our analyses that hominoid phylogenetics ought not to be used as an example of conflict between the morphological and molecular, but as an example of how outgroup and methodological choices can affect the outcome of phylogenetic analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.

  13. Human Error: A Concept Analysis

    NASA Technical Reports Server (NTRS)

    Hansen, Frederick D.

    2007-01-01

    Human error is the subject of research in almost every industry and profession of our times. The term is part of our daily language and intuitively understood by most people; however, it would be premature to assume that everyone's understanding of human error is the same. For example, human error is used to describe the outcome or consequence of human action, the causal factor of an accident, deliberate violations, and the actual action taken by a human being. As a result, researchers rarely agree on either a specific definition of human error or how to prevent it. The purpose of this article is to explore the specific concept of human error using Concept Analysis as described by Walker and Avant (1995). The concept of human error is examined as currently used in the literature of a variety of industries and professions. Defining attributes and examples of model, borderline, and contrary cases are described. The antecedents and consequences of human error are also discussed, and a definition of human error is offered.

  14. Why bundled payments could drive innovation: an example from interventional oncology.

    PubMed

    Steele, Joseph R; Jones, A Kyle; Ninan, Elizabeth P; Clarke, Ryan K; Odisio, Bruno C; Avritscher, Rony; Murthy, Ravi; Mahvash, Armeen

    2015-03-01

    Some have suggested that the current fee-for-service health care payment system in the United States stifles innovation. However, there are few published examples supporting this concept. We implemented an innovative temporary balloon occlusion technique for yttrium 90 radioembolization of nonresectable liver cancer. Although our balloon occlusion technique was associated with similar patient outcomes, lower cost, and faster procedure times compared with the standard-of-care coil embolization technique, our technique failed to gain widespread acceptance. Financial analysis revealed that because the balloon occlusion technique avoided a procedural step associated with a lucrative Current Procedural Terminology billing code, this new technique resulted in a significant decrease in hospital and physician revenue in the current fee-for-service payment system, even though the new technique would provide a revenue enhancement through cost savings in a bundled payment system. Our analysis illustrates how in a fee-for-service payment system, financial disincentives can stifle innovation and advancement of health care delivery. Copyright © 2015 by American Society of Clinical Oncology.

  15. An Energy-Based Limit State Function for Estimation of Structural Reliability in Shock Environments

    DOE PAGES

    Guthrie, Michael A.

    2013-01-01

    A limit state function is developed for the estimation of structural reliability in shock environments. This limit state function uses peak modal strain energies to characterize environmental severity and modal strain energies at failure to characterize the structural capacity. The Hasofer-Lind reliability index is briefly reviewed and its computation for the energy-based limit state function is discussed. Applications to two-degree-of-freedom mass-spring systems and to a simple finite element model are considered. For these examples, computation of the reliability index requires little effort beyond a modal analysis, but still accounts for relevant uncertainties in both the structure and environment. For both examples, the reliability index is observed to agree well with the results of Monte Carlo analysis. In situations where fast, qualitative comparison of several candidate designs is required, the reliability index based on the proposed limit state function provides an attractive metric which can be used to compare and control reliability.
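
    To make the Hasofer-Lind index concrete: it is the shortest distance from the origin to the limit state surface g(u) = 0 in standard normal space. The sketch below computes it for a made-up quadratic limit state, not the paper's energy-based function.

      import numpy as np
      from scipy.optimize import minimize

      def g(u):
          """Example limit state: capacity minus demand (g < 0 means failure)."""
          return 3.0 - u[0] - 0.5 * u[1] ** 2

      res = minimize(
          lambda u: np.dot(u, u),               # minimize squared distance to origin
          x0=np.array([1.0, 1.0]),
          constraints={"type": "eq", "fun": g}, # stay on the limit state surface
      )
      beta = np.sqrt(res.fun)
      print(f"Hasofer-Lind reliability index beta = {beta:.3f}")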

  16. On holographic Rényi entropy in some modified theories of gravity

    NASA Astrophysics Data System (ADS)

    Dey, Anshuman; Roy, Pratim; Sarkar, Tapobrata

    2018-04-01

    We perform a detailed analysis of holographic entanglement Rényi entropy in some modified theories of gravity with four-dimensional conformal field theory duals. First, we construct perturbative black hole solutions in a recently proposed model of Einsteinian cubic gravity in five dimensions, and compute the Rényi entropy as well as the scaling dimension of the twist operators in the dual field theory. The consistency of these results is verified from the AdS/CFT correspondence, via a corresponding computation of the Weyl anomaly on the gravity side. Similar analyses are then carried out for three other examples of modified gravity in five dimensions that include a chemical potential, namely Born-Infeld gravity, charged quasi-topological gravity and a class of Weyl-corrected gravity theories with a gauge field, with the last example being treated perturbatively. Some interesting bounds on the dual conformal field theory parameters in quasi-topological gravity are pointed out. We also provide arguments on the validity of our perturbative analysis, whenever applicable.

  17. Analysis on LID for highly urbanized areas' waterlogging control: demonstrated on the example of Caohejing in Shanghai.

    PubMed

    Liao, Z L; He, Y; Huang, F; Wang, S; Li, H Z

    2013-01-01

    Although a commonly applied measure across the United States and Europe for alleviating the negative impacts of urbanization on the hydrological cycle, low impact development (LID) has not been widely used in highly urbanized areas, especially in rapidly urbanizing cities in developing countries like China. In this paper, given five LID practices including Bio-Retention, Infiltration Trench, Porous Pavement, Rain Barrels, and Green Swale, an analysis of LID for waterlogging control in highly urbanized areas is demonstrated using the example of Caohejing in Shanghai, China. Design storm events and storm water management models are employed to simulate the total waterlogging volume reduction, peak flow rate reduction and runoff coefficient reduction of different scenarios. Cost-effectiveness is calculated for the five practices. The results show that LID practices can have significant effects on storm water management in a highly urbanized area, and the comparative results reveal that Rain Barrels and Infiltration Trench are the two most suitable cost-effective measures for the study area.

  18. Corrosion of Highly Specular Vapor Deposited Aluminum (VDA) on Earthshade Door Sandwich Structure

    NASA Technical Reports Server (NTRS)

    Plaskon, Daniel; Hsieh, Cheng

    2003-01-01

    High-resolution infrared (IR) imaging requires spacecraft instrument design that is tightly coupled with the overall thermal control design. The JPL Tropospheric Emission Spectrometer (TES) instrument measures the 3-dimensional distribution of ozone and its precursors in the lower atmosphere on a global scale. The TES earthshade must protect the 180-K radiator and the 230-K radiator from Earth IR and albedo. Requirements for specularity, emissivity, and solar absorptance of inner surfaces could only be met with vapor deposited aluminum (VDA). Circumstances leading to corrosion of the VDA are described. Innovative materials and processing to meet the optical and thermal cycle requirements were developed. Examples of scanning electron microscope (SEM), atomic force microscope (AFM), and other surface analysis techniques used in failure analysis, problem solving, and process development are given. Materials and process selection criteria and development test results are presented in a decision matrix. Examples of conditions promoting and preventing galvanic corrosion between VDA and graphite fiber-reinforced laminates are provided.

  19. Application of parameter estimation to aircraft stability and control: The output-error approach

    NASA Technical Reports Server (NTRS)

    Maine, Richard E.; Iliff, Kenneth W.

    1986-01-01

    The practical application of parameter estimation methodology to the problem of estimating aircraft stability and control derivatives from flight test data is examined. The primary purpose of the document is to present a comprehensive and unified picture of the entire parameter estimation process and its integration into a flight test program. The document concentrates on the output-error method to provide a focus for detailed examination and to allow us to give specific examples of situations that have arisen. The document first derives the aircraft equations of motion in a form suitable for application to estimation of stability and control derivatives. It then discusses the issues that arise in adapting the equations to the limitations of analysis programs, using a specific program for an example. The roles and issues relating to mass distribution data, preflight predictions, maneuver design, flight scheduling, instrumentation sensors, data acquisition systems, and data processing are then addressed. Finally, the document discusses evaluation and the use of the analysis results.

  20. Analysis of group-velocity dispersion of high-frequency Rayleigh waves for near-surface applications

    USGS Publications Warehouse

    Luo, Y.; Xia, J.; Xu, Y.; Zeng, C.

    2011-01-01

    The Multichannel Analysis of Surface Waves (MASW) method is an efficient tool to obtain the vertical shear (S)-wave velocity profile using the dispersive characteristic of Rayleigh waves. Most MASW researchers mainly apply Rayleigh-wave phase-velocity dispersion for S-wave velocity estimation, with a few exceptions applying Rayleigh-wave group-velocity dispersion. Herein, we first compare the sensitivities of fundamental-mode surface-wave phase velocities with those of group velocities, using three four-layer models that include a low-velocity layer or a high-velocity layer. Then synthetic data are simulated by a finite difference method. Images of group-velocity dispersive energy of the synthetic data are generated using the Multiple Filter Analysis (MFA) method. Finally we invert a high-frequency surface-wave group-velocity dispersion curve from a real-world example. Results demonstrate that (1) the sensitivities of group velocities are higher than those of phase velocities and their usable frequency ranges are wider, which is very helpful in improving inversion stability, because for a stable inversion system small changes in phase velocities do not result in a large fluctuation in inverted S-wave velocities; (2) group-velocity dispersive energy can be measured using single-trace data if Rayleigh-wave fundamental-mode energy is dominant, which suggests that the number of shots required in data acquisition can be dramatically reduced and the horizontal resolution can be greatly improved using analysis of group-velocity dispersion; and (3) the suspension logging results of the real-world example demonstrate that inversion of group velocities generated by the MFA method can successfully estimate near-surface S-wave velocities. © 2011 Elsevier B.V.
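
    The MFA step mentioned above can be sketched compactly: apply narrow Gaussian filters centered at successive frequencies, take the envelope of each filtered trace, and read the group arrival time from the envelope peak. The test signal, filter widths, and source-receiver distance below are illustrative assumptions.

      import numpy as np
      from scipy.signal import hilbert

      fs, distance = 500.0, 100.0           # sampling rate (Hz) and offset (m), assumed
      t = np.arange(0, 2.0, 1 / fs)
      # Dispersed test signal: instantaneous frequency sweeps upward from 10 Hz.
      trace = np.sin(2 * np.pi * (10 + 15 * t) * t) * np.exp(-((t - 0.8) / 0.4) ** 2)

      freqs = np.fft.rfftfreq(t.size, 1 / fs)
      spectrum = np.fft.rfft(trace)
      for fc in (15.0, 20.0, 25.0):         # filter center frequencies (Hz)
          gauss = np.exp(-0.5 * ((freqs - fc) / (0.1 * fc)) ** 2)  # narrowband filter
          filtered = np.fft.irfft(spectrum * gauss, n=t.size)
          envelope = np.abs(hilbert(filtered))
          t_group = t[np.argmax(envelope)]  # group arrival time at fc
          print(f"fc = {fc:4.1f} Hz: apparent group velocity {distance / t_group:6.1f} m/s")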

  1. Assessing the Use of Employment Screening for Sexual Assault Prevention

    DTIC Science & Technology

    2017-01-01

    designed to address. For example, one meta-analysis found the average validity coefficient, or association, between integrity tests and job performance...or a more-general trait), and the group that is tested. For example, one meta-analysis found stronger associations between integrity tests and self... Tests Overt and personality-based integrity tests use questions designed to address somewhat different but overlapping content areas (Ones

  2. 26 CFR 1.894-1 - Income affected by treaty.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... same as in Example 3, except that in year 2, A makes the interest payment of $25 to FB, a Country Y unrelated foreign bank, on a loan from FB to A. (ii) Analysis. The analysis is the same as in Example 1 with respect to the $100 dividend payment from S to A. With respect to the payment from A to FB, paragraph (d...

  3. Non-linear analytic and coanalytic problems ( L_p-theory, Clifford analysis, examples)

    NASA Astrophysics Data System (ADS)

    Dubinskii, Yu A.; Osipenko, A. S.

    2000-02-01

    Two kinds of new mathematical model of variational type are put forward: non-linear analytic and coanalytic problems. The formulation of these non-linear boundary-value problems is based on a decomposition of the complete scale of Sobolev spaces into the "orthogonal" sum of analytic and coanalytic subspaces. A similar decomposition is considered in the framework of Clifford analysis. Explicit examples are presented.

  4. [Gender-sensitive epidemiological data analysis: methodological aspects and empirical outcomes. Illustrated by a health reporting example].

    PubMed

    Jahn, I; Foraita, R

    2008-01-01

    In Germany, gender-sensitive approaches are part of the guidelines for good epidemiological practice as well as of health reporting. They are increasingly demanded for realizing the gender mainstreaming strategy in research funded by the federation and the federal states. This paper focuses on methodological aspects of data analysis; the health report of Bremen, a population-based cross-sectional study, serves as the empirical example. Health reporting requires analysis and reporting methods that can, on the one hand, uncover the sex/gender dimensions of a research question and, on the other hand, consider how results can be communicated adequately. The core question is: what consequences does the different inclusion of the category sex in different statistical analyses for identifying potential target groups have on the results? As evaluation methods, logistic regressions as well as a two-stage procedure were conducted exploratively. This procedure combines graphical models with CHAID decision trees and allows complex results to be visualised. Both methods are analysed stratified by sex/gender as well as adjusted for sex/gender and compared with each other. As a result, only stratified analyses are able to detect differences between the sexes and within the sex/gender groups, as long as one cannot resort to previous knowledge. Adjusted analyses can detect sex/gender differences only if interaction terms have been included in the model. The results are discussed from a statistical-epidemiological perspective as well as in the context of health reporting. In conclusion, whether a statistical method is gender-sensitive can only be judged for concrete research questions under known conditions. Often, an appropriate statistical procedure can be chosen after conducting a separate analysis for women and men. Future gender studies require innovative study designs as well as conceptual clarity regarding the biological and the sociocultural elements of the category sex/gender.
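
    The methodological point above, that stratified analyses reveal sex/gender differences directly while adjusted analyses see them only through interaction terms, can be illustrated with a small simulation. The sketch below uses statsmodels with invented data and variable names, not the Bremen data.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(5)
      n = 2000
      sex = rng.integers(0, 2, n)               # 0 = male, 1 = female (simulated)
      x = rng.normal(size=n)                    # exposure
      logit_p = -0.5 + (0.2 + 0.8 * sex) * x    # the effect of x differs by sex
      y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)
      df = pd.DataFrame({"y": y, "x": x, "sex": sex})

      for s in (0, 1):                          # stratified analyses
          m = smf.logit("y ~ x", data=df[df.sex == s]).fit(disp=0)
          print(f"sex={s}: OR per unit x = {np.exp(m.params['x']):.2f}")
      pooled = smf.logit("y ~ x * sex", data=df).fit(disp=0)  # adjusted + interaction
      print(f"interaction coefficient x:sex = {pooled.params['x:sex']:.2f}")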

  5. Further studies on stability analysis of nonlinear Roesser-type two-dimensional systems

    NASA Astrophysics Data System (ADS)

    Dai, Xiao-Lin

    2014-04-01

    This paper is concerned with further relaxations of the stability analysis of nonlinear Roesser-type two-dimensional (2D) systems in the Takagi-Sugeno fuzzy form. To achieve the goal, a novel slack matrix variable technique, which is homogenous polynomially parameter-dependent on the normalized fuzzy weighting functions with arbitrary degree, is developed and the algebraic properties of the normalized fuzzy weighting functions are collected into a set of augmented matrices. Consequently, more information about the normalized fuzzy weighting functions is involved and the relaxation quality of the stability analysis is significantly improved. Moreover, the obtained result is formulated in the form of linear matrix inequalities, which can be easily solved via standard numerical software. Finally, a numerical example is provided to demonstrate the effectiveness of the proposed result.

  6. Applying phylogenetic analysis to viral livestock diseases: moving beyond molecular typing.

    PubMed

    Olvera, Alex; Busquets, Núria; Cortey, Marti; de Deus, Nilsa; Ganges, Llilianne; Núñez, José Ignacio; Peralta, Bibiana; Toskano, Jennifer; Dolz, Roser

    2010-05-01

    Changes in livestock production systems in recent years have altered the presentation of many diseases resulting in the need for more sophisticated control measures. At the same time, new molecular assays have been developed to support the diagnosis of animal viral disease. Nucleotide sequences generated by these diagnostic techniques can be used in phylogenetic analysis to infer phenotypes by sequence homology and to perform molecular epidemiology studies. In this review, some key elements of phylogenetic analysis are highlighted, such as the selection of the appropriate neutral phylogenetic marker, the proper phylogenetic method and different techniques to test the reliability of the resulting tree. Examples are given of current and future applications of phylogenetic reconstructions in viral livestock diseases. Copyright 2009 Elsevier Ltd. All rights reserved.

  7. Con: Meta-analysis: some key limitations and potential solutions.

    PubMed

    Esterhuizen, Tonya M; Thabane, Lehana

    2016-06-01

    Meta-analysis, a statistical combination of results of several trials to produce a summary effect, has been subject to criticism in the past, mainly for the reasons of poor quality of included studies, heterogeneity between studies meta-analyzed and failing to address publication bias. These limitations can cause the results to be misleading, which is important if policy and practice decisions are based on systematic reviews and meta-analyses. We elaborate on these limitations and illustrate them with examples from the nephrology literature. Finally, we present some potential solutions, notably, education in meta-analysis for evidence producers and consumers as well as the use of individual patient data for meta-analyses. © The Author 2016. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
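
    For readers unfamiliar with the mechanics being critiqued, the core of a meta-analysis is an inverse-variance weighted combination of trial effects; a minimal fixed-effect sketch with invented numbers follows.

      import numpy as np

      effects = np.array([0.30, 0.10, 0.25])    # e.g., log risk ratios from three trials
      se = np.array([0.12, 0.20, 0.15])         # their standard errors (invented)

      w = 1.0 / se**2                           # inverse-variance weights
      summary = np.sum(w * effects) / np.sum(w)
      summary_se = np.sqrt(1.0 / np.sum(w))
      print(f"summary effect = {summary:.3f} +/- {1.96 * summary_se:.3f} (95% CI half-width)")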

  8. Application of the boundary element method to the micromechanical analysis of composite materials

    NASA Technical Reports Server (NTRS)

    Goldberg, R. K.; Hopkins, D. A.

    1995-01-01

    A new boundary element formulation for the micromechanical analysis of composite materials is presented in this study. A unique feature of the formulation is the use of circular shape functions to convert the two-dimensional integrations over the composite fibers to one-dimensional integrations. To demonstrate the applicability of the formulation, several example problems, including elastic and thermal analyses of laminated composites and elastic analyses of woven composites, are presented, and the boundary element results are compared to experimental observations and/or results obtained through alternate analytical procedures. While several issues remain to be addressed in order to make the methodology more robust, the formulations presented here show potential as an alternative to traditional finite element methods, particularly for complex composite architectures.

  9. Propagation Constant of a Rectangular Waveguide Completely Full of Ferrite Magnetized Longitudinally

    NASA Astrophysics Data System (ADS)

    Sakli, Hedi; Benzina, Hafedh; Aguili, Taoufik; Tao, Jun Wu

    2009-08-01

    This paper presents an analysis of a rectangular waveguide completely filled with longitudinally magnetized ferrite. The analysis is based on the formulation of the transverse operator method (TOM), followed by the application of the Galerkin method, which yields an eigenvalue equation system. The propagation constants of several homogeneous, anisotropic waveguide structures with ferrite have been obtained. The results presented here show that the transverse operator formulation is not only an elegant theoretical form but also a powerful and efficient analysis method, useful for solving a number of propagation problems in electromagnetics. One advantage of this method is its fast convergence. Numerical examples are given for different cases and compared with published results; good agreement is obtained.

  10. Multi-Disciplinary, Multi-Fidelity Discrete Data Transfer Using Degenerate Geometry Forms

    NASA Technical Reports Server (NTRS)

    Olson, Erik D.

    2016-01-01

    In a typical multi-fidelity design process, different levels of geometric abstraction are used for different analysis methods, and transitioning from one phase of design to the next often requires a complete re-creation of the geometry. To maintain consistency between lower-order and higher-order analysis results, Vehicle Sketch Pad (OpenVSP) recently introduced the ability to generate and export several degenerate forms of the geometry, representing the type of abstraction required to perform low- to medium-order analysis for a range of aeronautical disciplines. In this research, the functionality of these degenerate models was extended, so that in addition to serving as repositories for the geometric information that is required as input to an analysis, the degenerate models can also store the results of that analysis mapped back onto the geometric nodes. At the same time, the results are also mapped indirectly onto the nodes of lower-order degenerate models using a process called aggregation, and onto higher-order models using a process called disaggregation. The mapped analysis results are available for use by any subsequent analysis in an integrated design and analysis process. A simple multi-fidelity analysis process for a single-aisle subsonic transport aircraft is used as an example case to demonstrate the value of the approach.

  11. Fragmentary and incidental behaviour of columns, slabs and crystals

    PubMed Central

    Whiteley, Walter

    2014-01-01

    Between the study of small finite frameworks and infinite incidentally periodic frameworks, we find the real materials which are large, but finite, fragments that fit into the infinite periodic frameworks. To understand these materials, we seek insights from both (i) their analysis as large frameworks with associated geometric and combinatorial properties (including the geometric repetitions) and (ii) embedding them into appropriate infinite periodic structures with motions that may break the periodic structure. A review of real materials identifies a number of examples with a local appearance of ‘unit cells’ which repeat under isometries but perhaps in unusual forms. These examples also refocus attention on several new classes of infinite ‘periodic’ frameworks: (i) columns—three-dimensional structures generated with one repeating isometry and (ii) slabs—three-dimensional structures with two independent repeating translations. With this larger vision of structures to be studied, we find some patterns and partial results that suggest new conjectures as well as many additional open questions. These invite a search for new examples and additional theorems. PMID:24379423

  12. [The genetic control of mouse coat color and its applications in genetics teaching].

    PubMed

    Xing, Wanjin; Morigen, Morigen

    2014-10-01

    Mice are the most commonly used mammalian model. The coat colors of mice are typical Mendelian traits, with various colors such as white, black, yellow and agouti. The inheritance of mouse coat color is usually cited as an example only when teaching recessive lethal alleles. After searching the related literature and summarizing the molecular mechanisms of mouse coat color inheritance, we further expanded the application of this example to the introduction of the basic concepts of alleles and Mendelian laws, demonstration of gene structure and function, regulation of gene expression, gene interaction, epigenetic modification, quantitative genetics, as well as evolutionary genetics. By running this example through the whole series of genetics lectures, we help students form a systematic and developmental view of genetic analysis. At the same time, this teaching approach not only highlights the advancement and integrity of genetics, but also inspires students' interest and holds their attention.
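
    The recessive lethal allele example mentioned above can itself be demonstrated computationally: crossing two yellow (Ay/a) mice yields a 2:1 yellow:non-yellow ratio among live-born offspring because Ay/Ay embryos die. A minimal simulation follows.

      import random

      random.seed(4)

      def offspring():
          """One pup from an Ay/a x Ay/a cross: one random allele from each parent."""
          return tuple(sorted(random.choice(("Ay", "a")) for _ in range(2)))

      litters = (offspring() for _ in range(10000))
      survivors = [g for g in litters if g != ("Ay", "Ay")]  # Ay/Ay is embryonic lethal
      yellow = sum(g == ("Ay", "a") for g in survivors)
      agouti = len(survivors) - yellow
      print(f"yellow : non-yellow among survivors ~ {yellow / agouti:.2f} : 1")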

  13. Multivariate Analysis, Retrieval, and Storage System (MARS). Volume 6: MARS System - A Sample Problem (Gross Weight of Subsonic Transports)

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Woodbury, N. W.

    1975-01-01

    The MARS system is a tool for rapid prediction of aircraft or engine characteristics based on correlation-regression analysis of past designs stored in its data bases. An example of output obtained from the MARS system is given, involving the derivation of an expression for the gross weight of subsonic transport aircraft in terms of nine independent variables. The example illustrates the need for careful selection of correlation variables and for continual review of the resulting estimation equations. For Vol. 1, see N76-10089.

  14. Data Unfolding with Wiener-SVD Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, W.; Li, X.; Qian, X.

    Here, data unfolding is a common analysis technique used in HEP data analysis. Inspired by the deconvolution technique in digital signal processing, a new unfolding technique based on the SVD technique and the well-known Wiener filter is introduced. The Wiener-SVD unfolding approach achieves the unfolding by maximizing the signal-to-noise ratios in the effective frequency domain given expectations of signal and noise, and is free of regularization parameters. Through a couple of examples, the pros and cons of the Wiener-SVD approach as well as the nature of the unfolded results are discussed.
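
    A full Wiener-SVD implementation requires expectations of signal and noise; the sketch below substitutes simpler Tikhonov-style filter factors s_i^2 / (s_i^2 + tau) in the SVD basis to show the shape of such an unfolding. The response matrix, true spectrum, and regularization strength are made-up test inputs.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 20
      # Assumed smearing response: each true bin leaks into its neighbors.
      R = np.eye(n) * 0.6 + np.eye(n, k=1) * 0.2 + np.eye(n, k=-1) * 0.2
      s_true = np.exp(-0.5 * ((np.arange(n) - 8) / 3.0) ** 2)   # true spectrum
      m = R @ s_true + rng.normal(0, 0.01, size=n)              # measured, with noise

      U, sv, Vt = np.linalg.svd(R)
      tau = 0.05 ** 2                           # regularization strength (assumed)
      filt = sv**2 / (sv**2 + tau)              # filter factors in the SVD basis
      s_unfolded = Vt.T @ (filt / sv * (U.T @ m))
      print("max unfolding error:", np.abs(s_unfolded - s_true).max())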

  15. Pinning synchronization of delayed complex dynamical networks with nonlinear coupling

    NASA Astrophysics Data System (ADS)

    Cheng, Ranran; Peng, Mingshu; Yu, Weibin

    2014-11-01

    In this paper, we find that complex networks with the Watts-Strogatz or scale-free BA random topological architecture can be synchronized more easily by pin-controlling fewer nodes than regular systems. Theoretical analysis is included by means of Lyapunov functions and linear matrix inequalities (LMI) to make all nodes reach complete synchronization. Numerical examples are also provided to illustrate the importance of our theoretical analysis, which implies that there exists a gap between the theoretical prediction and numerical results about the minimum number of pinning controlled nodes.

  16. Williams Element with Generalized Degrees of Freedom for Fracture Analysis of Multiple-Cracked Beam

    NASA Astrophysics Data System (ADS)

    Xu, Hua; Wei, Quyang; Yang, Lufeng

    2017-10-01

    In this paper, the finite element method with generalized degrees of freedom (FEDOFs) is used to calculate the stress intensity factor (SIF) of a multiple-cracked beam and to analyse the effect of minor cracks on the SIF of the main crack in different cases. The Williams element is insensitive to the size of the singular region, so computational efficiency is greatly improved. Example analyses validate that the SIF near the crack tip can be obtained directly through FEDOFs, and the results agree well with ANSYS solutions with satisfactory accuracy.

  17. Spectral analysis for nonstationary and nonlinear systems: a discrete-time-model-based approach.

    PubMed

    He, Fei; Billings, Stephen A; Wei, Hua-Liang; Sarrigiannis, Ptolemaios G; Zhao, Yifan

    2013-08-01

    A new frequency-domain analysis framework for nonlinear time-varying systems is introduced based on parametric time-varying nonlinear autoregressive with exogenous input models. It is shown how the time-varying effects can be mapped to the generalized frequency response functions (FRFs) to track nonlinear features in frequency, such as intermodulation and energy transfer effects. A new mapping to the nonlinear output FRF is also introduced. A simulated example and the application to intracranial electroencephalogram data are used to illustrate the theoretical results.

  18. Spatial analysis on future housing markets: economic development and housing implications.

    PubMed

    Liu, Xin; Wang, Lizhe

    2014-01-01

    A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand.

  19. Data Unfolding with Wiener-SVD Method

    DOE PAGES

    Tang, W.; Li, X.; Qian, X.; ...

    2017-10-04

    Here, data unfolding is a common analysis technique used in HEP data analysis. Inspired by the deconvolution technique in digital signal processing, a new unfolding technique based on the SVD technique and the well-known Wiener filter is introduced. The Wiener-SVD unfolding approach achieves the unfolding by maximizing the signal-to-noise ratios in the effective frequency domain given expectations of signal and noise, and is free of regularization parameters. Through a couple of examples, the pros and cons of the Wiener-SVD approach as well as the nature of the unfolded results are discussed.

  20. Selecting supplier combination based on fuzzy multicriteria analysis

    NASA Astrophysics Data System (ADS)

    Han, Zhi-Qiu; Luo, Xin-Xing; Chen, Xiao-Hong; Yang, Wu-E.

    2015-07-01

    Existing multicriteria analysis (MCA) methods are probably ineffective in selecting a supplier combination. Thus, an MCA-based fuzzy 0-1 programming method is introduced. The programming relates to a simple MCA matrix that is used to select a single supplier. By solving the programming, the most feasible combination of suppliers is selected. Importantly, this result differs from selecting suppliers one by one according to a single-selection order, which is used to rank sole suppliers in existing MCA methods. An example highlights such difference and illustrates the proposed method.
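
    A minimal sketch of the combination-selection idea follows: enumerate the 0-1 choices and score each feasible combination, instead of ranking suppliers one by one. The crisp scores, weights, capacity constraint, and additive objective are toy assumptions standing in for the paper's fuzzy formulation.

      from itertools import product

      # Criteria scores in [0, 1] for suppliers A-D: (quality, delivery, cost).
      scores = {"A": (0.9, 0.6, 0.4), "B": (0.7, 0.8, 0.7),
                "C": (0.5, 0.9, 0.9), "D": (0.8, 0.5, 0.6)}
      weights = (0.5, 0.3, 0.2)                 # criteria weights (assumed)
      capacity = {"A": 40, "B": 30, "C": 25, "D": 35}
      demand = 60                               # units the buyer must source

      best, best_val = None, -1.0
      for bits in product((0, 1), repeat=4):    # all 0-1 supplier selections
          combo = [n for n, b in zip("ABCD", bits) if b]
          if sum(capacity[n] for n in combo) < demand:
              continue                          # infeasible: cannot meet demand
          # Additive value; a toy objective that favors larger combinations.
          val = sum(w * s for n in combo for w, s in zip(weights, scores[n]))
          if val > best_val:
              best, best_val = combo, val
      print("best feasible combination:", best, f"(value {best_val:.2f})")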

  1. Conserved directed percolation: exact quasistationary distribution of small systems and Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    César Mansur Filho, Júlio; Dickman, Ronald

    2011-05-01

    We study symmetric sleepy random walkers, a model exhibiting an absorbing-state phase transition in the conserved directed percolation (CDP) universality class. Unlike most examples of this class studied previously, this model possesses a continuously variable control parameter, facilitating analysis of critical properties. We study the model using two complementary approaches: analysis of the numerically exact quasistationary (QS) probability distribution on rings of up to 22 sites, and Monte Carlo simulation of systems of up to 32 000 sites. The two approaches yield estimates for critical exponents such as β and β/ν.

  2. Spatial Analysis on Future Housing Markets: Economic Development and Housing Implications

    PubMed Central

    Liu, Xin; Wang, Lizhe

    2014-01-01

    A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand. PMID:24892097

  3. On the Possibility of Studying the Reactions of the Thermal Decomposition of Energy Substances by the Methods of High-Resolution Terahertz Spectroscopy

    NASA Astrophysics Data System (ADS)

    Vaks, V. L.; Domracheva, E. G.; Chernyaeva, M. B.; Pripolzin, S. I.; Revin, L. S.; Tretyakov, I. V.; Anfertyev, V. A.; Yablokov, A. A.; Lukyanenko, I. A.; Sheikov, Yu. V.

    2018-02-01

    We show prospects for using the method of high-resolution terahertz spectroscopy for a continuous analysis of the decomposition products of energy substances in the gas phase (including short-lived ones) in a wide temperature range. The experimental setup, which includes a terahertz spectrometer for studying the thermal decomposition reactions, is described. The results of analysis of the gaseous decomposition products of energy substances by the example of ammonium nitrate heated from room temperature to 167°C are presented.

  4. Statistical Validation of Image Segmentation Quality Based on a Spatial Overlap Index

    PubMed Central

    Zou, Kelly H.; Warfield, Simon K.; Bharatha, Aditya; Tempany, Clare M.C.; Kaus, Michael R.; Haker, Steven J.; Wells, William M.; Jolesz, Ferenc A.; Kikinis, Ron

    2005-01-01

    Rationale and Objectives To examine a statistical validation method based on the spatial overlap between two sets of segmentations of the same anatomy. Materials and Methods The Dice similarity coefficient (DSC) was used as a statistical validation metric to evaluate the performance of both the reproducibility of manual segmentations and the spatial overlap accuracy of automated probabilistic fractional segmentation of MR images, illustrated on two clinical examples. Example 1: 10 consecutive cases of prostate brachytherapy patients underwent both preoperative 1.5T and intraoperative 0.5T MR imaging. For each case, 5 repeated manual segmentations of the prostate peripheral zone were performed separately on preoperative and on intraoperative images. Example 2: A semi-automated probabilistic fractional segmentation algorithm was applied to MR imaging of 9 cases with 3 types of brain tumors. DSC values were computed and logit-transformed values were compared in the mean with the analysis of variance (ANOVA). Results Example 1: The mean DSCs of 0.883 (range, 0.876–0.893) with 1.5T preoperative MRI and 0.838 (range, 0.819–0.852) with 0.5T intraoperative MRI (P < .001) were within and at the margin of the range of good reproducibility, respectively. Example 2: Wide ranges of DSC were observed in brain tumor segmentations: Meningiomas (0.519–0.893), astrocytomas (0.487–0.972), and other mixed gliomas (0.490–0.899). Conclusion The DSC value is a simple and useful summary measure of spatial overlap, which can be applied to studies of reproducibility and accuracy in image segmentation. We observed generally satisfactory but variable validation results in two clinical applications. This metric may be adapted for similar validation tasks. PMID:14974593
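
    The DSC itself is a one-line formula, DSC = 2|A ∩ B| / (|A| + |B|); a minimal sketch on two toy binary masks follows.

      import numpy as np

      def dice(a: np.ndarray, b: np.ndarray) -> float:
          """Dice similarity coefficient of two binary segmentation masks."""
          a, b = a.astype(bool), b.astype(bool)
          return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

      seg_manual = np.zeros((64, 64), dtype=bool)
      seg_manual[20:44, 20:44] = True           # "manual" segmentation (toy)
      seg_auto = np.zeros((64, 64), dtype=bool)
      seg_auto[22:46, 22:46] = True             # "automated" segmentation, shifted
      print(f"DSC = {dice(seg_manual, seg_auto):.3f}")  # values > 0.7 are often read as good overlap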

  5. MicroRNAs show a wide diversity of expression profiles in the developing and mature central nervous system

    PubMed Central

    Kapsimali, Marika; Kloosterman, Wigard P; de Bruijn, Ewart; Rosa, Frederic; Plasterk, Ronald HA; Wilson, Stephen W

    2007-01-01

    Background MicroRNA (miRNA) encoding genes are abundant in vertebrate genomes but very few have been studied in any detail. Bioinformatic tools allow prediction of miRNA targets and this information coupled with knowledge of miRNA expression profiles facilitates formulation of hypotheses of miRNA function. Although the central nervous system (CNS) is a prominent site of miRNA expression, virtually nothing is known about the spatial and temporal expression profiles of miRNAs in the brain. To provide an overview of the breadth of miRNA expression in the CNS, we performed a comprehensive analysis of the neuroanatomical expression profiles of 38 abundant conserved miRNAs in developing and adult zebrafish brain. Results Our results show miRNAs have a wide variety of different expression profiles in neural cells, including: expression in neuronal precursors and stem cells (for example, miR-92b); expression associated with transition from proliferation to differentiation (for example, miR-124); constitutive expression in mature neurons (miR-124 again); expression in both proliferative cells and their differentiated progeny (for example, miR-9); regionally restricted expression (for example, miR-222 in telencephalon); and cell-type specific expression (for example, miR-218a in motor neurons). Conclusion The data we present facilitate prediction of likely modes of miRNA function in the CNS and many miRNA expression profiles are consistent with the mutual exclusion mode of function in which there is spatial or temporal exclusion of miRNAs and their targets. However, some miRNAs, such as those with cell-type specific expression, are more likely to be co-expressed with their targets. Our data provide an important resource for future functional studies of miRNAs in the CNS. PMID:17711588

  6. Conical wave propagation and diffraction in two-dimensional hexagonally packed granular lattices

    DOE PAGES

    Chong, C.; Kevrekidis, P. G.; Ablowitz, M. J.; ...

    2016-01-25

    We explore linear and nonlinear mechanisms for conical wave propagation in two-dimensional lattices in the realm of phononic crystals. As a prototypical example, a statically compressed granular lattice of spherical particles arranged in a hexagonal packing configuration is analyzed. Upon identifying the dispersion relation of the underlying linear problem, the resulting diffraction properties are considered. Analysis both via a heuristic argument for the linear propagation of a wave packet and via asymptotic analysis leading to the derivation of a Dirac system suggests the occurrence of conical diffraction. This analysis is valid for strong precompression, i.e., near the linear regime. For weak precompression, conical wave propagation is still possible, but the resulting expanding circular wave front is of a nonoscillatory nature, resulting from the complex interplay among the discreteness, nonlinearity, and geometry of the packing. Lastly, the transition between these two types of propagation is explored.

  7. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    NASA Technical Reports Server (NTRS)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm for time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 273 is considered as an example.
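
    As a minimal illustration of the two model classes discussed, the sketch below simulates an MA(2) and an AR(1) process from the same white-noise innovations and checks their variances against theory; the orders and coefficients are arbitrary choices, not those of the paper.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 1000
      eps = rng.normal(size=n)                  # white-noise innovations

      # MA(2): x_t = eps_t + b1*eps_{t-1} + b2*eps_{t-2}
      b1, b2 = 0.5, 0.25
      x_ma = eps.copy()
      x_ma[1:] += b1 * eps[:-1]
      x_ma[2:] += b2 * eps[:-2]

      # AR(1): y_t = a*y_{t-1} + eps_t
      a, y = 0.8, np.zeros(n)
      for t in range(1, n):
          y[t] = a * y[t - 1] + eps[t]

      print(f"MA(2) variance ~ {x_ma.var():.2f}, AR(1) variance ~ {y.var():.2f}")
      # Theory: Var(MA2) = 1 + b1^2 + b2^2 = 1.3125; Var(AR1) = 1/(1 - a^2) ~ 2.78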

  8. Solar electric geocentric transfer with attitude constraints: Analysis

    NASA Technical Reports Server (NTRS)

    Sackett, L. L.; Malchow, H. L.; Delbaum, T. N.

    1975-01-01

    A time-optimal or nearly time-optimal trajectory program was developed for solar electric geocentric transfer with or without attitude constraints and with an optional initial high-thrust stage. The method of averaging reduces computation time. A nonsingular set of orbital elements is used. The constraints, which are those of one of the SERT-C designs, introduce complexities into the analysis, and the solution admits possible discontinuous changes in thrust direction. The power degradation due to Van Allen radiation is modeled analytically. A wide range of solar cell characteristics is assumed. Effects such as oblateness and shadowing are included. The analysis and the results of many example runs are presented.

  9. A forestry application simulation of man-machine techniques for analyzing remotely sensed data

    NASA Technical Reports Server (NTRS)

    Berkebile, J.; Russell, J.; Lube, B.

    1976-01-01

    The typical steps in the analysis of remotely sensed data are simulated for a forestry application example. The example uses numerically-oriented pattern recognition techniques and emphasizes man-machine interaction.

  10. Lipids of aquatic sediments, recent and ancient

    NASA Technical Reports Server (NTRS)

    Eglinton, G.; Hajibrahim, S. K.; Maxwell, J. R.; Quirke, J. M. E.; Shaw, G. J.; Volkman, J. K.; Wardroper, A. M. K.

    1979-01-01

    Computerized gas chromatography-mass spectrometry (GC-MS) is now an essential tool in the analysis of the complex mixtures of lipids (geolipids) encountered in aquatic sediments, both 'recent' (less than 1 million years old) and ancient. The application of MS, and particularly GC-MS, has been instrumental in the rapid development of organic geochemistry and environmental organic chemistry in recent years. The techniques used have resulted in the identification of numerous compounds of a variety of types in sediments. Most attention has been concentrated on molecules of limited size, mainly below 500 molecular mass, and of limited functionality, for example hydrocarbons, fatty acids and alcohols. Examples from recent studies (at Bristol) of contemporary, 'recent' and ancient sediments are presented and discussed.

  11. Space physics education via examples in the undergraduate physics curriculum

    NASA Astrophysics Data System (ADS)

    Martin, R.; Holland, D. L.

    2011-12-01

    The field of space physics is rich with examples of basic physics and analysis techniques, yet it is rarely seen in physics courses or textbooks. As space physicists in an undergraduate physics department we like to use research to inform teaching, and we find that students respond well to examples from magnetospheric science. While we integrate examples into general education courses as well, this talk will focus on physics major courses. Space physics examples are typically selected to illustrate a particular concept or method taught in the course. Four examples will be discussed, from an introductory electricity and magnetism course, a mechanics/nonlinear dynamics course, a computational physics course, and a plasma physics course. Space physics provides examples of many concepts from introductory E&M, including the application of Faraday's law to terrestrial magnetic storm effects and the use of the basic motion of charged particles as a springboard to discussion of the inner magnetosphere and the aurora. In the mechanics and nonlinear dynamics courses, the motion of charged particles in a magnetotail current sheet magnetic field is treated as a Newtonian dynamical system, illustrating the Poincaré surface-of-section technique, the partitioning of phase space, and the KAM theorem. Neural network time series analysis of AE data is used as an example in the computational physics course. Finally, among several examples, current sheet particle dynamics is utilized in the plasma physics course to illustrate the notion of adiabatic/guiding center motion and the breakdown of the adiabatic approximation. We will present short descriptions of our pedagogy and student assignments in this "backdoor" method of space physics education.
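
    As a concrete illustration of the current-sheet example mentioned above, the sketch below integrates charged-particle motion in a simple field-reversal model B = (B0 tanh(z/L), 0, Bn) and collects a Poincaré surface of section at the midplane. The field model and all parameter values are generic stand-ins, not taken from the course materials.

      # Charged-particle motion in a model current sheet and a Poincare
      # surface of section at z = 0. Field model and parameters are
      # illustrative stand-ins, not from the course materials.
      import numpy as np
      from scipy.integrate import solve_ivp

      B0, Bn, L, qm = 1.0, 0.1, 1.0, 1.0     # field scales, charge/mass

      def lorentz(t, y):
          x, yy, z, vx, vy, vz = y
          Bx, Bz = B0 * np.tanh(z / L), Bn   # By = 0 in this model
          ax = qm * (vy * Bz)                # (q/m) v x B, componentwise
          ay = qm * (vz * Bx - vx * Bz)
          az = qm * (-vy * Bx)
          return [vx, vy, vz, ax, ay, az]

      def midplane(t, y):                    # event: crossing z = 0
          return y[2]
      midplane.direction = 1.0               # upward crossings only

      y0 = [0.0, 0.0, 0.5, 0.1, 0.2, 0.0]
      sol = solve_ivp(lorentz, (0.0, 500.0), y0, events=midplane,
                      max_step=0.05, rtol=1e-8)
      section = sol.y_events[0]              # states at the crossings
      print(section[:, 0], section[:, 3])    # plot x vs vx for the map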

  12. Interactive Visualization of Healthcare Data Using Tableau.

    PubMed

    Ko, Inseok; Chang, Hyejung

    2017-10-01

    Big data analysis is receiving increasing attention in many industries, including healthcare. Visualization plays an important role not only in intuitively showing the results of data analysis but also in the whole process of collecting, cleaning, analyzing, and sharing data. This paper presents a procedure for the interactive visualization and analysis of healthcare data using Tableau as a business intelligence tool. Starting with installation of the Tableau Desktop Personal version 10.3, this paper describes the process of understanding and visualizing healthcare data using an example. The example data of colon cancer patients were obtained from health insurance claims for the years 2012 and 2013, provided by the Health Insurance Review and Assessment Service. To introduce beginners to the visualization of healthcare data using Tableau, this paper describes the creation of a simple view of the average length of stay of colon cancer patients. Since Tableau provides various visualizations and customizations, the level of analysis can be increased with small multiples, view filtering, mark cards, and Tableau charts. Tableau is a software tool that helps users explore and understand their data by creating interactive visualizations. Its advantages are that it can be used in conjunction with almost any database and that visualizations can be built easily by dragging and dropping fields into the desired format.

  13. Modal energy analysis for mechanical systems excited by spatially correlated loads

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Fei, Qingguo; Li, Yanbin; Wu, Shaoqing; Chen, Qiang

    2018-10-01

    MODal ENergy Analysis (MODENA) is an energy-based method proposed to deal with vibroacoustic problems. Its performance in the energy analysis of a mechanical system under spatially correlated excitation is investigated here. A plate/cavity coupling system excited by a pressure field is studied in a numerical example involving four kinds of pressure fields: a purely random pressure field, a perfectly correlated pressure field, an incident diffuse field, and turbulent-boundary-layer pressure fluctuations. The total energies of the subsystems differ from the reference solution only in the case of the purely random pressure field, and only for the non-excited subsystem (the cavity). A deeper analysis at the scale of individual modal energies is then conducted in a second numerical example, in which two structural modes excited by correlated forces are coupled with one acoustic mode. A dimensionless correlation strength factor is proposed to quantify the correlation strength between modal forces. Results show that the error in modal energy increases with the correlation strength factor. A criterion linking the error to the correlation strength factor is proposed; according to it, the error is negligible when the correlation is weak, that is, when the correlation strength factor is less than a critical value.

  14. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    NASA Technical Reports Server (NTRS)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.
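
    For readers who want a feel for this kind of test/analysis comparison, a common correlation measure between measured and NASTRAN mode shapes is the modal assurance criterion (MAC); the sketch below is a generic illustration and is not claimed to be the statistic used in this report.

      # Modal assurance criterion (MAC) between test and analytical
      # mode shapes. Generic illustration; the report's own statistical
      # correlation measure may differ.
      import numpy as np

      def mac(phi_test, phi_fem):
          """MAC matrix between the columns of two mode-shape matrices."""
          num = np.abs(phi_test.T @ phi_fem) ** 2
          den = np.outer(np.sum(phi_test**2, axis=0),
                         np.sum(phi_fem**2, axis=0))
          return num / den

      rng = np.random.default_rng(1)
      phi_a = rng.random((20, 3))                 # 20 DOFs, 3 FEM modes
      phi_t = phi_a + 0.05 * rng.random((20, 3))  # "measured" shapes
      print(np.round(mac(phi_t, phi_a), 3))       # near-identity = good match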

  15. Heat loading of hoist brakes by example of drum brakes

    NASA Astrophysics Data System (ADS)

    Vöth, S.; Vasilyeva, M. A.

    2017-10-01

    Owing to developments in drive technology, frequency-inverter-controlled drives are now almost standard in crane hoists. Because these drives permit electric braking, the duty of the mechanical brakes changes: they are engaged more rarely and, when they are, more often under critical operating conditions. In this paper, an analysis of the changes that occur in the brake structure under the influence of thermal load is presented.

  16. Automatic Adviser on stationary devices status identification and anticipated change

    NASA Astrophysics Data System (ADS)

    Shabelnikov, A. N.; Liabakh, N. N.; Gibner, Ya M.; Pushkarev, E. A.

    2018-05-01

    The task of synthesizing an Automatic Adviser that identifies the status of stationary automation-system devices using an autoregressive model of their key parameters is defined. The choice of model type is justified, and an algorithm for monitoring the objects under study is developed. A complex for simulating object status and analyzing prediction results is proposed. The research results are illustrated with the specific example of a hump yard compressor station. The work was supported by the Russian Fundamental Research Fund, project No. 17-20-01040.

  17. Similarity analysis between quantum images

    NASA Astrophysics Data System (ADS)

    Zhou, Ri-Gui; Liu, XingAo; Zhu, Changming; Wei, Lai; Zhang, Xiafen; Ian, Hou

    2018-06-01

    Similarity analysis between quantum images is essential in quantum image processing, providing a foundation for other fields such as quantum image matching and quantum pattern recognition. In this paper, a quantum scheme based on a novel quantum image representation and the quantum amplitude amplification algorithm is proposed. Three examples and simulation experiments show that the measurement result must be 0 when two images are the same, and has a high probability of being 1 when the two images are different.

  18. Quantum adiabatic computation and adiabatic conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei Zhaohui; Ying Mingsheng

    2007-08-15

    Recently, quantum adiabatic computation has attracted more and more attention in the literature. It is a novel quantum computation model based on adiabatic approximation, and the analysis of a quantum adiabatic algorithm depends highly on the adiabatic conditions. However, it has been pointed out that the traditional adiabatic conditions are problematic. Thus, results obtained previously should be checked and sufficient adiabatic conditions applicable to adiabatic computation should be proposed. Based on a result of Tong et al. [Phys. Rev. Lett. 98, 150402 (2007)], we propose a modified adiabatic criterion which is more applicable to the analysis of adiabatic algorithms. As an example, we prove the validity of the local adiabatic search algorithm by employing our criterion.
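
    For orientation, the traditional adiabatic condition referred to here is commonly quoted in the textbook form below (a standard form, not a quotation from this paper):

      \[
        \max_{t\in[0,T]}
        \frac{\bigl|\langle E_1(t)\,|\,\dot{H}(t)\,|\,E_0(t)\rangle\bigr|}
             {\bigl[E_1(t)-E_0(t)\bigr]^{2}} \;\ll\; 1 ,
      \]

    where |E_0(t)> and |E_1(t)> are the instantaneous ground and first excited states of the Hamiltonian H(t); the criticisms mentioned in the abstract concern precisely the sufficiency of conditions of this form.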

  19. Modeling and Analysis of Large Amplitude Flight Maneuvers

    NASA Technical Reports Server (NTRS)

    Anderson, Mark R.

    2004-01-01

    Analytical methods for stability analysis of large amplitude aircraft motion have been slow to develop because many nonlinear system stability assessment methods are restricted to a state-space dimension of less than three. The proffered approach is to create regional cell-to-cell maps for strategically located two-dimensional subspaces within the higher-dimensional model state-space. These regional solutions capture nonlinear behavior better than linearized point solutions. They also avoid the computational difficulties that emerge when attempting to create a cell map for the entire state-space. Example stability results are presented for a general aviation aircraft and a micro-aerial vehicle configuration. The analytical results are consistent with characteristics that were discovered during previous flight testing.

  20. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    DOE PAGES

    Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...

    2008-01-01

    The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to managing and processing the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering, and correlation analysis of individual trials of large dimensions, and which can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results and discuss the future development of the framework, including the encoding and processing of expert performance rules and the increasing use of performance metadata.
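
    The kind of scripted analysis described, dimension reduction followed by clustering of per-process profiles, can be sketched generically as below. This uses scikit-learn on synthetic data and is not PerfExplorer's actual Python API.

      # Dimension reduction + clustering of per-process performance
      # profiles, in the spirit of the analyses PerfExplorer automates.
      # Generic scikit-learn calls on synthetic data; not the tool's API.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      # rows = MPI ranks, columns = exclusive time per code region
      profiles = np.vstack([rng.normal(10, 1, (500, 40)),
                            rng.normal(14, 1, (12, 40))])  # 12 outlier ranks

      reduced = PCA(n_components=2).fit_transform(profiles)
      labels = KMeans(n_clusters=2, n_init=10).fit_predict(reduced)
      print(np.bincount(labels))   # expect one small cluster of outliers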

  1. Meshless methods in shape optimization of linear elastic and thermoelastic solids

    NASA Astrophysics Data System (ADS)

    Bobaru, Florin

    This dissertation proposes a meshless approach to problems in shape optimization of elastic and thermoelastic solids. The Element-Free Galerkin (EFG) method is used for this purpose. The ability of the EFG method to avoid the remeshing that is normally required in a finite element approach to correct highly distorted meshes is clearly demonstrated by several examples. The shape optimization example of a thermal cooling fin shows a dramatic improvement in the objective compared to a previous FEM analysis. More importantly, the new solution, displaying large shape changes relative to the initial design, was completely missed by the FEM analysis. The EFG formulation given here for shape optimization "uncovers" new solutions that are, apparently, unobtainable via an FEM approach; this is one of the main achievements of the work. The variational formulations for the analysis problem and for the sensitivity problems are obtained with a penalty method for imposing the displacement boundary conditions. The continuum formulation is general, so the 2D and 3D treatments differ only slightly, and transient thermoelastic shape optimization problems can apply the present development at each time step. For the elasticity framework, displacement sensitivity is obtained in the EFG context, with excellent agreement against analytical solutions for some test problems. The shape optimization of a fillet is carried out in great detail, and results show significant improvement of the EFG solution over the FEM and Boundary Element Method solutions. The approach avoids differentiating the complicated EFG shape functions with respect to the shape design parameters by using a particular discretization for the sensitivity calculations. Displacement and temperature sensitivities are formulated for the shape optimization of a linear thermoelastic solid. Two important examples considered in this work, the optimization of a thermal fin and of a uniformly loaded thermoelastic beam, reveal new characteristics of the EFG method in shape optimization applications. Among the advantages of the EFG method over traditional FEM treatments of shape optimization, the most important are shown to be: elimination of post-processing for stress and strain recovery, which directly gives more accurate results in critical locations (near the boundaries, for example); and the flexibility of node movement, which permits new, better shapes (previously missed by an FEM analysis) to be discovered. Several new research directions that need further consideration are identified.

  2. Global asymptotic stability analysis of bidirectional associative memory neural networks with time delays.

    PubMed

    Arik, Sabri

    2005-05-01

    This paper presents a sufficient condition for the existence, uniqueness, and global asymptotic stability of the equilibrium point for bidirectional associative memory (BAM) neural networks with distributed time delays. The results impose constraint conditions on the network parameters of the neural system that are independent of the delay parameter, and they are applicable to all continuous nonmonotonic neuron activation functions. It is shown that in some special cases the stability criteria can be easily checked. Some examples are also given to compare the results with previous results derived in the literature.

  3. MSFC crack growth analysis computer program, version 2 (users manual)

    NASA Technical Reports Server (NTRS)

    Creager, M.

    1976-01-01

    An updated version of the George C. Marshall Space Flight Center Crack Growth Analysis Program is described. The updated computer program has significantly expanded capabilities over the original one. This increased capability includes an extensive expansion of the library of stress intensity factors, plotting capability, increased design iteration capability, and the capability of performing proof test logic analysis. The technical approaches used within the computer program are presented, and the input and output formats and options are described. Details of the stress intensity equations, example data, and example problems are presented.

  4. Framework for Comparative Risk Analysis of Dredged Material Disposal Options.

    DTIC Science & Technology

    1986-10-01

    [Scanned abstract largely illegible. Recoverable fragments identify a Tetra Tech, Inc. report prepared for the Puget Sound Dredged Disposal Analysis (PSDDA), c/o U.S. Army Corps of Engineers, Seattle District, October 1986, with appendix entries covering priority pollutants and a hypothetical example of total or bulk contaminant concentrations in four Puget Sound sediments.]

  5. Conformations of Substituted Ethanes.

    ERIC Educational Resources Information Center

    Kingsbury, Charles A.

    1979-01-01

    Reviews the state of the art of conformational analysis and the factors which affect it. Emphasizes sp3-hybridized acyclic molecules. Provides examples of the importance of certain factors in determining conformation. The purpose is to provide examples for examination questions. (Author/SA)

  6. Interactive K-Means Clustering Method Based on User Behavior for Different Analysis Target in Medicine.

    PubMed

    Lei, Yang; Yu, Dai; Bin, Zhang; Yang, Yang

    2017-01-01

    Clustering algorithms are a basis of data analysis and are widely used in analysis systems. However, with high-dimensional data a clustering algorithm may overlook the domain relationships between dimensions, especially in medical fields, so the clustering result often fails to meet the business goals of the users. If the clustering process can incorporate the users' knowledge, that is, the doctor's expertise or the analysis intent, the result can be more satisfactory. In this paper, we propose an interactive K-means clustering method to improve the user's satisfaction with the result. The core of this method is to obtain the user's feedback on the clustering result and use it to optimize that result. A particle swarm optimization algorithm is then used to optimize the parameters, especially the weight settings in the clustering algorithm, so that they reflect the user's business preferences as closely as possible. After this parameter optimization and adjustment, the clustering result can be closer to the user's requirements. Finally, we use a breast cancer example to test our method. The experiments show the improved performance of our algorithm.
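
    A minimal sketch of the core loop, assuming invented data and a hand-set weight update in place of the paper's particle swarm optimization step:

      # Feedback-driven weighted K-means: user feedback adjusts
      # per-feature weights and clustering is re-run on the re-weighted
      # data. The paper tunes the weights by particle swarm
      # optimization; here one weight is bumped by hand for brevity.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(2)
      X = rng.normal(size=(200, 4))      # stand-in patient features

      weights = np.ones(4)               # initial, uninformed weights
      for _ in range(3):
          labels = KMeans(n_clusters=3, n_init=10).fit_predict(X * weights)
          # ... present `labels` to the clinician; feedback would be
          # translated into new weights (via PSO in the paper) ...
          weights[0] *= 1.5              # stand-in for one feedback step
      print(weights)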

  7. Multiscale integral analysis of a HT leakage in a fusion nuclear power plant

    NASA Astrophysics Data System (ADS)

    Velarde, M.; Fradera, J.; Perlado, J. M.; Zamora, I.; Martínez-Saban, E.; Colomer, C.; Briani, P.

    2016-05-01

    This work presents an example of the application of an integral methodology based on a multiscale analysis covering the whole tritium cycle within a nuclear fusion power plant, from the micro scale, analyzing key components where tritium leaks through permeation, to the macro scale, considering its atmospheric transport. A leak from the nuclear power plant's (NPP) primary to secondary side of a heat exchanger (HEX) is considered in the present example. Both primary- and secondary-loop coolants are assumed to be He. The leak is located inside the HEX, releasing tritium in elemental (HT) form into the secondary loop, from which it permeates through the piping structural material to the exterior. The Heating, Ventilation and Air Conditioning (HVAC) system removes the leaked tritium towards the NPP exhaust. The HEX is modelled with system codes coupled to Computational Fluid Dynamics (CFD) codes to account for tritium dispersion inside the plant buildings and the site environment. Finally, tritium dispersion is calculated with an atmospheric transport code and a dosimetry analysis is carried out. Results show that the implemented methodology is capable of assessing the impact of tritium from the micro scale to the atmospheric scale, including the dosimetric aspect.

  8. Dome diagnostics system of optical parameters and characteristics of LEDs

    NASA Astrophysics Data System (ADS)

    Peretyagin, Vladimir S.; Pavlenko, Nikita A.

    2017-09-01

    Scientific and technological progress in recent years in the production of light-emitting diodes (LEDs) has led to the expansion of their areas of application, from the simplest systems to high-precision lighting devices used in various fields of human activity. However, the development and (especially mass) production of LED lighting devices are impossible without a thorough analysis of their parameters and characteristics. There are many methods and devices for analyzing the spatial, energy, and colorimetric parameters of LEDs. Most methods are intended to determine only one parameter (for example, luminous flux) or one characteristic (for example, the angular distribution of energy or the spectral characteristics). Moreover, devices using these methods measure parameters at only one point or in one plane. This problem can be solved by a dome diagnostics system for the optical parameters and characteristics of LEDs, developed by specialists of the OEDS chair of ITMO University in Russia. The paper presents the theoretical aspects of the analysis of LEDs' spatial (angular), energy, and color parameters using the mentioned diagnostics system. The article also presents the results of spatial (angular), energy, and color parameter measurements for some LED brands.

  9. Introduction to methodology of dose-response meta-analysis for binary outcome: With application on software.

    PubMed

    Zhang, Chao; Jia, Pengli; Yu, Liu; Xu, Chang

    2018-05-01

    Dose-response meta-analysis (DRMA) is widely applied to investigate the dose-specific relationship between independent and dependent variables. Such methods have been in use for over 30 years and are increasingly employed in healthcare and clinical decision-making. In this article, we give an overview of the methodology used in DRMA, summarizing the commonly used regression models and pooling methods, and we use an example to illustrate how to carry out a DRMA with these methods. Five regression models, linear regression, piecewise regression, natural polynomial regression, fractional polynomial regression, and restricted cubic spline regression, are illustrated for fitting the dose-response relationship, and two types of pooling approaches, the one-stage approach and the two-stage approach, are illustrated for pooling the dose-response relationship across studies. The example showed similar results among these models. Several dose-response meta-analysis methods can thus be used to investigate the relationship between exposure level and the risk of an outcome; however, the methodology of DRMA still needs to be improved. © 2018 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
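
    A toy sketch of the two-stage approach mentioned above, assuming invented study-level numbers: a linear log-relative-risk slope is estimated within each study and the slopes are then pooled by inverse-variance (fixed-effect) weighting.

      # Two-stage DRMA sketch: pool per-study linear dose-response
      # slopes (log relative risk per unit dose) by inverse-variance
      # weighting. The study-level values are invented.
      import numpy as np

      # (slope, variance of slope) per study -- hypothetical values
      studies = [(0.08, 0.0009), (0.05, 0.0004), (0.11, 0.0025)]

      slopes = np.array([s for s, v in studies])
      weights = 1.0 / np.array([v for s, v in studies])

      pooled = np.sum(weights * slopes) / np.sum(weights)
      se = np.sqrt(1.0 / np.sum(weights))
      print(f"pooled slope = {pooled:.3f} (SE {se:.3f}) per unit dose")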

  10. [Effects decomposition in mediation analysis: a numerical example].

    PubMed

    Zugna, Daniela; Richiardi, Lorenzo

    2018-01-01

    Mediation analysis aims to decompose the total effect of the exposure on the outcome into a direct effect (unmediated) and an indirect effect (mediated by a mediator). When the interest also lies in understanding whether the exposure effect differs in sub-groups of the study population or under different scenarios, the mediation analysis needs to be integrated with interaction analysis. In this setting it is necessary to decompose the total effect not into two components only, the direct and indirect effects, but into two further components linked to interaction. The interaction between the exposure and the mediator in their effect on the outcome can indeed act through the effect of the exposure on the mediator, or through the mediator when the mediator is not totally explained by the exposure. We describe options proposed in the literature for decomposing the total effect, and we illustrate them through a hypothetical example of the effect of age at cancer diagnosis on survival, mediated and unmediated by the therapeutic approach, and through a numerical example.
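
    One widely used option of this kind (VanderWeele's four-way decomposition; the article may present others) writes the total effect as

      \[
        \mathrm{TE} \;=\; \mathrm{CDE} \;+\; \mathrm{INT_{ref}} \;+\; \mathrm{INT_{med}} \;+\; \mathrm{PIE},
      \]

    that is, a controlled direct effect, a reference interaction (interaction operating even without an exposure effect on the mediator), a mediated interaction, and a pure indirect effect; the first two terms are unmediated and the last two are mediated.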

  11. Reverse engineering the physical chemistry of making Egyptian faience through compositional analysis of the cementation process

    NASA Astrophysics Data System (ADS)

    Pourattar, Parisa

    The cementation process of making Egyptian faience, reported by Hans Wulff from a workshop in Qom, Iran, has not been easy to replicate and various views have been set forth to understand the transport of materials from the glazing powder to the surfaces of the crushed quartz beads. Replications of the process fired to 950 °C and under-fired to 850 °C were characterized by electron beam microprobe analysis (EPMA), petrographic thin section analysis, and scanning electron microscopy with energy dispersive x-ray analysis (SEM-EDS). Chemical variations were modeled using thermal data, phase diagrams, and copper vaporization experiments. These replications were compared to 52 examples from various collections, including 20th century ethnographic collections of beads, glazing powder and plant ash, 12th century CE beads and glazing powder from Fustat (Old Cairo), Egypt, and to an earlier example from Abydos, Egypt in the New Kingdom and to an ash example from the Smithsonian Institution National Museum of Natural History.

  12. Studies on the properties of an epithermal-neutron hydrogen analyzer.

    PubMed

    Papp, A; Csikai, J

    2010-09-01

    Systematic investigations have proved the advantages of the Epithermal Neutron Analyzer (ETNA) for bulk hydrogen analysis as compared to the thermal neutron techniques. Results can contribute, for example, to the design and construction of instruments needed for the detection and identification of plastic anti-personnel landmines, explosives hidden in airline baggage and cargo containers via hydrogen contents as an indicator of their presence.

  13. Definite Integral Automatic Analysis Mechanism Research and Development Using the "Find the Area by Integration" Unit as an Example

    ERIC Educational Resources Information Center

    Ting, Mu Yu

    2017-01-01

    Using the capabilities of expert knowledge structures, the researcher prepared test questions on the university calculus topic of "finding the area by integration." The quiz is divided into two types of multiple choice items (one out of four and one out of many). After the calculus course was taught and tested, the results revealed that…

  14. A CFBPN Artificial Neural Network Model for Educational Qualitative Data Analyses: Example of Students' Attitudes Based on Kellerts' Typologies

    ERIC Educational Resources Information Center

    Yorek, Nurettin; Ugulu, Ilker

    2015-01-01

    In this study, artificial neural networks are suggested as a model that can be "trained" to yield qualitative results out of a huge amount of categorical data. It can be said that this is a new approach applied in educational qualitative data analysis. In this direction, a cascade-forward back-propagation neural network (CFBPN) model was…

  15. Characterization and Dynamic Analysis of Long-Cavity Multi-Section Gain- Levered Quantum-Dot Lasers

    DTIC Science & Technology

    2013-03-01

    [Scanned excerpt; only fragments are recoverable. They refer to a figure of the test setup and to Figure 5, a comparison of Fabry–Perot and distributed-feedback laser designs, each possessing advantages and disadvantages discussed in detail in the report; in contrast to distributed-feedback designs, Fabry–Perot cavities (two discrete mirrors) lase over multiple longitudinal modes supported by the cavity.]

  16. Additive Manufacturing in the Marine Corps

    DTIC Science & Technology

    2015-06-01

    commonly referred to as 3D printing. This thesis answers the question of how additive manufacturing can improve the effectiveness of the Marine Corps... analysis of current and future 3D-printing processes, examination of several civilian and military examples, and examination of the impact across... fully integrating 3D printers, such as the lack of certification and qualification standards, unreliable end-product results, and determining ownership

  17. Geochronological and lead-isotope evidences for rapid crust formation in middle-proterozoic time: The Labrador example

    NASA Technical Reports Server (NTRS)

    Schaerer, Urs

    1988-01-01

    Extensive U-Pb geochronological studies in the Grenville and Makkovik provinces have shown that eastern Labrador is underlain by two distinct crustal blocks. In order to substantiate the juvenile character of the middle-Proterozoic crustal block, the isotopic composition of lead in leached K-feldspars from the same rocks was analyzed. The results of the analysis are briefly discussed.

  18. Simplification of multiple Fourier series - An example of algorithmic approach

    NASA Technical Reports Server (NTRS)

    Ng, E. W.

    1981-01-01

    This paper describes one example of multiple Fourier series originating from a problem of spectral analysis of time series data. The example is worked through here with an algorithmic approach which can be generalized to other series manipulations on a computer. The generalized approach is presently being pursued towards applications to a variety of multiple series and towards a general-purpose algorithm for computer algebra implementation.

  19. Reference Model for Project Support Environments Version 1.0

    DTIC Science & Technology

    1993-02-28

    [Scanned excerpt; only fragments are recoverable. They describe relationships with the framework's Process Support services, the Lifecycle Process Engineering services, and the Design services, citing as examples ORCA (Object-based Requirements Capture and Analysis) and RETRAC (REquirements TRACeability), and note under section 4.3 (Life-Cycle Process...) that examples of audio and video processing operations on "traditional" computer tools include creating, modifying, and deleting sound and video data.]

  20. Wave chaos in a randomly inhomogeneous waveguide: spectral analysis of the finite-range evolution operator.

    PubMed

    Makarov, D V; Kon'kov, L E; Uleysky, M Yu; Petrov, P S

    2013-01-01

    The problem of sound propagation in a randomly inhomogeneous oceanic waveguide is considered. An underwater sound channel in the Sea of Japan is taken as an example. Our attention is concentrated on the domains of finite-range ray stability in phase space and their influence on wave dynamics. These domains can be found by means of the one-step Poincaré map. To study manifestations of finite-range ray stability, we introduce the finite-range evolution operator (FREO) describing the transformation of a wave field in the course of propagation along a finite segment of a waveguide. Carrying out statistical analysis of the FREO spectrum, we estimate the contribution of regular domains and explore their evanescence with increasing length of the segment. We utilize several methods of spectral analysis: analysis of eigenfunctions by expanding them over modes of the unperturbed waveguide, approximation of level-spacing statistics by means of the Berry-Robnik distribution, and the procedure used by A. Relano and coworkers [Relano et al., Phys. Rev. Lett. 89, 244102 (2002); Relano, Phys. Rev. Lett. 100, 224101 (2008)]. Comparing the results obtained with the different methods, we find that the method based on the statistical analysis of FREO eigenfunctions is the most favorable for estimating the contribution of regular domains. It allows one to find directly the waveguide modes whose refraction is regular despite the random inhomogeneity. For example, it is found that near-axial sound propagation in the Sea of Japan preserves stability even over distances of hundreds of kilometers, owing to the presence of a shearless torus in the classical phase space. Increasing the acoustic wavelength weakens scattering, resulting in recovery of eigenfunction localization near periodic orbits of the one-step Poincaré map.

  1. [Database supported electronic retrospective analyses in radiation oncology: establishing a workflow using the example of pancreatic cancer].

    PubMed

    Kessel, K A; Habermehl, D; Bohn, C; Jäger, A; Floca, R O; Zhang, L; Bougatf, N; Bendl, R; Debus, J; Combs, S E

    2012-12-01

    Especially in the field of radiation oncology, efficiently handling a large variety of voluminous datasets from various information systems in different documentation styles is crucial for patient care and research. To date, conducting retrospective clinical analyses is rather difficult and time-consuming. Taking patients with pancreatic cancer treated with radio-chemotherapy as an example, we performed a therapy evaluation using an analysis system connected to a documentation system. A total of 783 patients were documented in a professional, database-based documentation system. Information about radiation therapy, diagnostic images, and dose distributions was imported into the web-based system. For 36 patients with disease progression after neoadjuvant chemoradiation, we designed and established an analysis workflow. After an automatic registration of the radiation plans with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose-volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence. All results are saved in the database and included in statistical calculations. The main goal of using an automatic analysis tool is to reduce the time and effort of conducting clinical analyses, especially with large patient groups. We showed a first approach using some existing tools; however, manual interaction is still necessary. Further steps need to be taken to enhance automation. Already, it has become apparent that the benefits of digital data management and analysis lie in the central storage of data and the reusability of results. We therefore intend to adapt the analysis system to other tumor types in radiation oncology.
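
    A compact sketch of the DVH statistic computed in this workflow, assuming a synthetic dose grid and recurrence mask in place of the registered clinical data:

      # Cumulative dose-volume histogram (DVH) for a segmented
      # recurrence volume. Dose grid and mask are synthetic stand-ins
      # for the registered clinical data.
      import numpy as np

      dose = np.random.default_rng(7).gamma(20.0, 2.5, size=(64, 64, 32))  # Gy
      mask = np.zeros(dose.shape, dtype=bool)
      mask[20:30, 20:30, 10:15] = True          # "recurrence" voxels

      d = dose[mask]
      bins = np.linspace(0.0, d.max(), 200)
      # V(D): fraction of the recurrence volume receiving at least dose D
      v = np.array([(d >= b).mean() for b in bins])

      print(f"D_mean = {d.mean():.1f} Gy, D_min = {d.min():.1f} Gy")
      print(f"V(50 Gy) = {np.interp(50.0, bins, v):.2f}")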

  2. On summary measure analysis of linear trend repeated measures data: performance comparison with two competing methods.

    PubMed

    Vossoughi, Mehrdad; Ayatollahi, S M T; Towhidi, Mina; Ketabchi, Farzaneh

    2012-03-22

    The summary measure approach (SMA) is sometimes the only applicable tool for the analysis of repeated measurements in medical research, especially when the number of measurements is relatively large. This study aimed to describe techniques based on summary measures for the analysis of linear-trend repeated measures data and to compare the performance of the SMA with that of the linear mixed model (LMM) and the unstructured multivariate approach (UMA). Practical guidelines based on the least-squares regression slope and the mean response over time for each subject are provided for testing time, group, and interaction effects. Through Monte Carlo simulation studies, the efficacy of the SMA vs. the LMM and the traditional UMA, under different types of covariance structures, is illustrated. All the methods were also employed to analyze two real data examples. Based on the simulation and example results, the SMA completely dominated the traditional UMA and performed convincingly close to the best-fitting LMM in testing all the effects. However, the LMM was often not robust and led to non-sensible results when the covariance structure for the errors was misspecified. The results argue for discarding the UMA, which often yielded extremely conservative inferences for such data. Summary measures are shown to be a simple, safe, and powerful approach in which the loss of efficiency compared to the best-fitting LMM is generally negligible. The SMA is recommended as the first choice for reliably analyzing linear-trend data with a moderate to large number of measurements and/or small to moderate sample sizes.
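
    A minimal sketch of the slope-based summary measure described here, with simulated data: each subject's measurements are reduced to a least-squares slope and the group effect on the slopes is tested with a two-sample t-test.

      # Summary-measure analysis: per-subject least-squares slopes,
      # then a two-sample t-test on the slopes. Data are simulated.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      times = np.arange(6, dtype=float)

      def subject_slopes(n, trend):
          y = trend * times + rng.normal(0, 1, size=(n, times.size))
          return np.array([np.polyfit(times, yi, 1)[0] for yi in y])

      slopes_a = subject_slopes(25, trend=0.5)   # treatment group
      slopes_b = subject_slopes(25, trend=0.2)   # control group

      t, p = stats.ttest_ind(slopes_a, slopes_b) # group-by-time effect
      print(f"t = {t:.2f}, p = {p:.4f}")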

  3. [Computer-assisted analysis of the results of training in internal medicine].

    PubMed

    Vrbová, H; Spunda, M

    1991-06-01

    Analysis of the results of teaching clinical disciplines has, in the long run, an impact on the standard and value of medical care. It requires the processing of quantitative and qualitative data, so the selection of the indicators to be followed and of the procedures used for their processing is of fundamental importance. The investigation presented here is an example of how computer techniques can be used to process the results of an effectiveness analysis in teaching internal medicine. As an indicator of effectiveness, the authors selected the percentage of students who had an opportunity during the given period of their studies to observe a certain pathological condition; data were collected through a questionnaire survey. The system differentiates the students' experience (whether the student examined the patient himself or the patient was only demonstrated) and the place of observation (university teaching hospital or regional non-teaching hospital attachment). It also permits forming sub-groups of respondents, combining them as desired, and comparing their results. The described computer programme support comprises primary processing of the questionnaire survey output: the questionnaires are transformed and stored by groups of respondents in data files of a suitable format (SDFORM programme), and the processing of the results and their presentation as an output listing or interactively on the display are handled by the SDRESULT programme. Using these programmes, the authors processed the results of a survey made among students during and after completion of their studies, covering a series of 70 recommended pathological conditions. As an example, the authors compare the results of observations of 20 selected pathological conditions important for diagnosis and therapy in primary care in the final stage of the medical course in 1981 and 1985.

  4. LSENS, a general chemical kinetics and sensitivity analysis code for homogeneous gas-phase reactions. 2: Code description and usage

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan; Bittker, David A.

    1994-01-01

    LSENS, the Lewis General Chemical Kinetics Analysis Code, has been developed for solving complex, homogeneous, gas-phase chemical kinetics problems and contains sensitivity analysis for a variety of problems, including nonisothermal situations. This report is part 2 of a series of three reference publications that describe LSENS, provide a detailed guide to its usage, and present many example problems. Part 2 describes the code, how to modify it, and its usage, including preparation of the problem data file required to execute LSENS. Code usage is illustrated by several example problems, which further explain preparation of the problem data file and show how to obtain desired accuracy in the computed results. LSENS is a flexible, convenient, accurate, and efficient solver for chemical reaction problems such as static systems; steady, one-dimensional, inviscid flow; reaction behind an incident shock wave, including boundary layer correction; and the perfectly stirred (highly backmixed) reactor. In addition, the chemical equilibrium state can be computed for the following assigned states: temperature and pressure, enthalpy and pressure, temperature and volume, and internal energy and volume. For static problems the code computes the sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of the chemical reactions. Part 1 (NASA RP-1328) derives the governing equations and describes the numerical solution procedures for the types of problems that can be solved by LSENS. Part 3 (NASA RP-1330) explains the kinetics and kinetics-plus-sensitivity-analysis problems supplied with LSENS and presents sample results.

  5. Determining characteristics of artificial near-Earth objects using observability analysis

    NASA Astrophysics Data System (ADS)

    Friedman, Alex M.; Frueh, Carolin

    2018-03-01

    Observability analysis is a method for determining whether a chosen state of a system can be determined from the output or measurements. Knowledge of state information availability resulting from observability analysis leads to improved sensor tasking for observation of orbital debris and better control of active spacecraft. This research performs numerical observability analysis of artificial near-Earth objects. Analysis of linearization methods and state transition matrices is performed to determine the viability of applying linear observability methods to the nonlinear orbit problem. Furthermore, pre-whitening is implemented to reformulate classical observability analysis. In addition, the state in observability analysis is typically composed of position and velocity; however, including object characteristics beyond position and velocity can be crucial for precise orbit propagation. For example, solar radiation pressure has a significant impact on the orbit of high area-to-mass ratio objects in geosynchronous orbit. Therefore, determining the time required for solar radiation pressure parameters to become observable is important for understanding debris objects. In order to compare observability analysis results with and without measurement noise and an extended state, quantitative measures of observability are investigated and implemented.
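
    For readers unfamiliar with the linear check underlying this kind of study, the sketch below tests observability of a toy linearized system by the rank of the observability matrix; the matrices and dimensions are generic, not the orbit dynamics used in the paper.

      # Linear observability check: the pair (A, C) is observable when
      # the observability matrix [C; CA; ...; CA^(n-1)] has full rank.
      # Toy matrices, not the paper's linearized orbit dynamics.
      import numpy as np

      def observability_matrix(A, C):
          n = A.shape[0]
          blocks = [C]
          for _ in range(n - 1):
              blocks.append(blocks[-1] @ A)
          return np.vstack(blocks)

      A = np.array([[0.0, 1.0], [-1.0, 0.0]])   # toy dynamics
      C = np.array([[1.0, 0.0]])                # position-only measurement
      O = observability_matrix(A, C)
      print("observable:", np.linalg.matrix_rank(O) == A.shape[0])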

  6. Amniotic Fluid Analysis

    MedlinePlus

    ... serious consequences for the developing fetus. A few examples include tests for: TORCH: toxoplasmosis, rubella, cytomegalovirus (CMV), herpes simplex virus (HSV); Parvovirus B19; cultures for bacterial ... may be performed in select situations, for example, if a woman is very unsure of her ...

  7. The application of the electrodynamic separator in minerals beneficiation

    NASA Astrophysics Data System (ADS)

    Skowron, M.; Syrek, P.; Surowiak, A.

    2017-05-01

    The aim of the presented paper is to elaborate a methodology for upgrading natural minerals, using the example of chalcocite and bornite samples. The results were obtained by means of a laboratory drum separator. This device operates according to material properties, in this case electrical conductivity. The study contains an analysis of the forces occurring inside the electrodynamic separator chamber that act on particles of various electrical properties. Both the potential and the electric field strength distributions were calculated for a set of separator setpoints. The theoretical analysis informed the choice of separator parameters and hence influenced the empirical results as well. Next, the authors conducted empirical research on chalcocite and bornite beneficiation by means of electrodynamic separation. The results of this process are shown graphically in the form of upgrading curves of chalcocite with respect to elemental copper and lead.

  8. Standardized residual as response function for order identification of multi input intervention analysis

    NASA Astrophysics Data System (ADS)

    Suhartono; Lee, Muhammad Hisyam; Rezeki, Sri

    2017-05-01

    Intervention analysis is a statistical model in the group of time series analysis methods which is widely used to describe the effect of an intervention caused by external or internal factors. An example of an external factor that often occurs in Indonesia is a disaster, whether natural or man-made. The main purpose of this paper is to provide the results of theoretical studies on the identification step for determining the order of a multi-input intervention analysis, for evaluating the magnitude and duration of the impact of interventions on time series data. The theoretical results showed that the standardized residuals can be used properly as the response function for determining the order of a multi-input intervention model. These results are then applied to evaluate the impact of a disaster in a real case in Indonesia, i.e., the magnitude and duration of the impact of the Lapindo mud on the volume of vehicles on the highway. The empirical results showed that the multi-input intervention model can accurately describe and explain the magnitude and duration of the impact of disasters on time series data.

  9. Caring presence in practice: facilitating an appreciative discourse in nursing.

    PubMed

    du Plessis, E

    2016-09-01

    To report on an appreciation of caring presence practised by nurses in South Africa in order to facilitate an appreciative discourse in nursing and a return to caring values and attitudes. Appreciative reports on caring presence are often overlooked. Media may provide a platform for facilitating appreciation of caring presence practised by nurses. Such an appreciation may foster further practice of caring presence and re-ignite a caring ethos in nursing. This article provides an appreciative discourse on caring presence in nursing in the form of examples of caring presence practised by nurses. An anecdotal approach was followed. Social media, namely narratives on caring presence shared by nurses on a Facebook page, and formal media, namely news reports in which nurses are appreciated for their efforts, were used. Deductive content analysis was applied to analyse the narratives and news reports in relation to a definition of caring presence and types of caring presence. The analysis of the narratives and news reports resulted in an appreciative discourse in which examples of nurses practising caring presence could be provided. Examples of nurses practising caring presence could be found, and an appreciative discourse could be initiated. Appreciation ignites positive action and ownership of high-quality health care. Leadership should thus cultivate a culture of appreciating nurses, through the media, and encourage nurses to share how caring presence impacts quality in health care. © 2016 International Council of Nurses.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, F. S.

    Functionally graded components exhibit spatial variations of mechanical properties, in contrast with, and as an alternative to, purely homogeneous components. A large class of graded materials, however, are in fact mostly homogeneous materials with property variations (chemical or mechanical) restricted to a specific area or layer, produced by applying, for example, a coating or by introducing sub-surface residual stresses. However, it is also possible to obtain graded materials with a smooth transition of mechanical properties along the entire component, for example a 40 mm component, by using the centrifugal casting technique or the incremental melting and solidification technique. In this paper we study fully metallic functionally graded components with a smooth gradient, focusing on fatigue crack propagation. Fatigue propagation is assessed in the direction parallel to the gradation (in different homogeneous layers of the functionally graded component) in order to infer fatigue crack propagation in the direction perpendicular to the gradation. The fatigue crack growth rate (standard mode I fatigue crack growth) is correlated to the mode I stress intensity factor range. Other mechanical properties of the different layers of the component (Young's modulus) are also considered in this analysis. The effect of residual stresses along the component gradation on crack propagation is also taken into account. A qualitative analysis of the effects of some important features present in functionally graded materials is made based on the obtained results.

  11. Comparative analysis of time-scaling properties about water pH in Poyang Lake Inlet and Outlet on the basis of fractal methods.

    PubMed

    Shi, K; Liu, C Q; Huang, Z W; Zhang, B; Su, Y

    2010-01-01

    Detrended fluctuation analysis (DFA) and multifractal methods are applied to the analysis of the time-scaling properties of water pH series at the Poyang Lake Inlet and Outlet in China. The results show that these pH series are characterised by long-term memory and multifractal scaling, and that these characteristics differ markedly between the Lake Inlet and Outlet. The comparison suggests that monofractal and multifractal parameters can serve as quantitative dynamical indexes reflecting the anti-acidification capability of Poyang Lake. Furthermore, we investigated the frequency-size distribution of the pH series at the Inlet and Outlet. Our findings suggest that water pH is an example of a self-organised criticality (SOC) process, and that different SOC behaviours produce the differences in the power-law relations between the pH series at the Inlet and Outlet. This work can be helpful for improving the modelling of lake water quality.
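
    A compact sketch of standard (first-order) DFA as applied to a series like the pH record; the series here is synthetic white noise, for which the scaling exponent should come out near 0.5 (long-term memory would give a value above 0.5):

      # First-order detrended fluctuation analysis (DFA). The input is
      # synthetic white noise; a real pH record would replace `x`.
      import numpy as np

      def dfa_alpha(x, scales):
          y = np.cumsum(x - np.mean(x))        # integrated profile
          F = []
          for s in scales:
              n = len(y) // s
              msq = []
              for i in range(n):
                  seg = y[i * s:(i + 1) * s]
                  t = np.arange(s)
                  trend = np.polyval(np.polyfit(t, seg, 1), t)
                  msq.append(np.mean((seg - trend) ** 2))
              F.append(np.sqrt(np.mean(msq)))  # fluctuation at scale s
          # slope of log F(s) vs log s is the DFA exponent alpha
          return np.polyfit(np.log(scales), np.log(F), 1)[0]

      x = np.random.default_rng(4).normal(size=5000)
      print(dfa_alpha(x, scales=[16, 32, 64, 128, 256]))  # ~0.5 expected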

  12. Analysis of defects of overhead facade systems and other light thin-walled structures

    NASA Astrophysics Data System (ADS)

    Endzhievskiy, L.; Frolovskaia, A.; Petrova, Y.

    2017-04-01

    This paper analyzes the defects, and the causes behind them, of contemporary design solutions, using the example of overhead facade systems with ventilated air gaps and light steel thin-walled structures, on the basis of field experiments. The analysis covers all stages of work: design, manufacture (including quality), construction, and operation. Practical examples are given. The main causes of accidents and the prediction of accident rates are examined and discussed.

  13. Analysis and synthesis of abstract data types through generalization from examples

    NASA Technical Reports Server (NTRS)

    Wild, Christian

    1987-01-01

    The discovery of general patterns of behavior from a set of input/output examples can be a useful technique in the automated analysis and synthesis of software systems. These generalized descriptions of the behavior form a set of assertions which can be used for validation, program synthesis, program testing, and run-time monitoring. Describing the behavior is characterized as a learning process in which the set of inputs is mapped into an appropriate transform space such that general patterns can be easily characterized. The learning algorithm must choose a transform function and define a subset of the transform space which is related to equivalence classes of behavior in the original domain. An algorithm for analyzing the behavior of abstract data types is presented and several examples are given. The use of the analysis for purposes of program synthesis is also discussed.

  14. Ablative Rayleigh Taylor instability in the limit of an infinitely large density ratio

    NASA Astrophysics Data System (ADS)

    Clavin, Paul; Almarcha, Christophe

    2005-05-01

    The instability of ablation fronts strongly accelerated toward the dense medium under the conditions of inertial confinement fusion (ICF) is addressed in the limit of an infinitely large density ratio. The analysis demonstrates that the flow is irrotational to first order, reducing the nonlinear analysis to a two-potential-flow problem; vorticity appears only at subsequent orders of the perturbation analysis. This result greatly simplifies the analysis, and the possibility of using boundary integral methods opens new perspectives in the nonlinear theory of the ablative RT instability in ICF. A few examples are given at the end of the Note. To cite this article: P. Clavin, C. Almarcha, C. R. Mecanique 333 (2005).

  15. 3D analysis of semiconductor devices: A combination of 3D imaging and 3D elemental analysis

    NASA Astrophysics Data System (ADS)

    Fu, Bianzhu; Gribelyuk, Michael A.

    2018-04-01

    3D analysis of semiconductor devices using a combination of scanning transmission electron microscopy (STEM) Z-contrast tomography and energy dispersive spectroscopy (EDS) elemental tomography is presented. 3D STEM Z-contrast tomography is useful for revealing depth information in the sample, but it suffers from contrast problems between materials with similar atomic numbers. Examples of EDS elemental tomography are presented using an automated EDS tomography system with batch data processing, which greatly reduces the data collection and processing time. 3D EDS elemental tomography reveals more in-depth information about the defect origin in semiconductor failure analysis. The influence of detector shadowing and X-ray absorption on the EDS tomography results is also discussed.

  16. An advanced probabilistic structural analysis method for implicit performance functions

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.

  17. Approximate analysis for repeated eigenvalue problems with applications to controls-structure integrated design

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Hou, Gene J. W.

    1994-01-01

    A method for eigenvalue and eigenvector approximate analysis for the case of repeated eigenvalues with distinct first derivatives is presented. The approximate analysis method developed involves a reparameterization of the multivariable structural eigenvalue problem in terms of a single positive-valued parameter. The resulting equations yield first-order approximations to changes in the eigenvalues and the eigenvectors associated with the repeated eigenvalue problem. This work also presents a numerical technique that facilitates the definition of an eigenvector derivative for the case of repeated eigenvalues with repeated eigenvalue derivatives (of all orders). Examples are given which demonstrate the application of such equations for sensitivity and approximate analysis. Emphasis is placed on the application of sensitivity analysis to large-scale structural and controls-structures optimization problems.
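
    For contrast with the repeated case treated here, the familiar first-order result for a distinct eigenvalue of the generalized problem K phi = lambda M phi (with mass-normalized eigenvector) is

      \[
        \frac{\partial \lambda_i}{\partial p}
        \;=\;
        \boldsymbol{\phi}_i^{T}
        \left(\frac{\partial K}{\partial p} - \lambda_i \frac{\partial M}{\partial p}\right)
        \boldsymbol{\phi}_i ;
      \]

    when an eigenvalue repeats, this formula no longer applies directly because the eigenvectors within the repeated subspace are not unique, and a standard remedy (not necessarily the reparameterization used here) is to solve a reduced eigenproblem over that subspace to select differentiable eigenvector combinations.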

  18. Galileo environmental test and analysis program summary

    NASA Technical Reports Server (NTRS)

    Hoffman, A. R.

    1991-01-01

    This paper presents an overview of the Galileo Project's environmental test and analysis program during the spacecraft development phase, October 1978 through launch in October 1989. After describing the top-level objectives of the program, summaries of the approach, requirements, and margins are provided. Examples of assembly- and system-level test results are given for both the pre-1986 (direct mission) testing and the post-1986 (Venus-Earth-Earth gravity assist mission) testing, including dynamic, thermal, electromagnetic compatibility (EMC), and magnetic tests. The approaches and results for verifying by analysis that the requirements of certain environments (e.g., radiation, micrometeoroids, and single event upsets) are satisfied are presented. The environmental program implemented on Galileo satisfied the spirit and intent of the requirements imposed by the Project during the spacecraft's development. The lessons learned from the Galileo environmental program are discussed in this paper.

  19. Determining association constants from titration experiments in supramolecular chemistry.

    PubMed

    Thordarson, Pall

    2011-03-01

    The most common approach for quantifying interactions in supramolecular chemistry is a titration of the guest into a solution of the host, noting the changes in some physical property through NMR, UV-Vis, fluorescence, or other techniques. Despite the apparent simplicity of this approach, several issues need to be carefully addressed to ensure that the final results are reliable. These include the use of non-linear rather than linear regression methods, careful choice of the stoichiometric binding model, the choice of method (e.g., NMR vs. UV-Vis) and host concentration, the application of advanced data analysis methods such as global analysis, and finally the estimation of uncertainties and confidence intervals for the results obtained. This tutorial review gives a systematic overview of all these issues, highlighting some of the key messages with simulated data analysis examples.
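
    A minimal sketch of the recommended nonlinear-regression route for the simplest (1:1) binding model, fitting the exact isotherm with scipy on synthetic titration data; the concentrations, Ka, and dmax values are invented.

      # Nonlinear fit of the exact 1:1 host-guest binding isotherm,
      # rather than a linearized (e.g., Benesi-Hildebrand) form.
      # All concentrations and parameter values are invented.
      import numpy as np
      from scipy.optimize import curve_fit

      H0 = 1e-3                                  # fixed host concentration (M)

      def isotherm(G0, Ka, dmax):
          """Observed shift for 1:1 binding at total guest G0."""
          b = G0 + H0 + 1.0 / Ka
          HG = (b - np.sqrt(b**2 - 4.0 * G0 * H0)) / 2.0
          return dmax * HG / H0

      G0 = np.linspace(0.0, 1e-2, 15)
      data = isotherm(G0, Ka=5e3, dmax=1.2)      # synthetic "NMR shifts"
      data += np.random.default_rng(5).normal(0, 0.01, G0.size)

      popt, pcov = curve_fit(isotherm, G0, data, p0=[1e3, 1.0])
      print(f"Ka = {popt[0]:.0f} M^-1, dmax = {popt[1]:.2f}")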

  20. Rainy Day: A Remote Sensing-Driven Extreme Rainfall Simulation Approach for Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Wright, Daniel; Yatheendradas, Soni; Peters-Lidard, Christa; Kirschbaum, Dalia; Ayalew, Tibebu; Mantilla, Ricardo; Krajewski, Witold

    2015-04-01

    Progress on the assessment of rainfall-driven hazards such as floods and landslides has been hampered by the challenge of characterizing the frequency, intensity, and structure of extreme rainfall at the watershed or hillslope scale. Conventional approaches rely on simplifying assumptions and are strongly dependent on the location, the availability of long-term rain gage measurements, and the subjectivity of the analyst. Regional and global-scale rainfall remote sensing products provide an alternative, but are limited by relatively short (~15-year) observational records. To overcome this, we have coupled these remote sensing products with a space-time resampling framework known as stochastic storm transposition (SST). SST "lengthens" the rainfall record by resampling from a catalog of observed storms from a user-defined region, effectively recreating the regional extreme rainfall hydroclimate. This coupling has been codified in Rainy Day, a Python-based platform for quickly generating large numbers of probabilistic extreme rainfall "scenarios" at any point on the globe. Rainy Day is readily compatible with any gridded rainfall dataset. The user can optionally incorporate regional rain gage or weather radar measurements for bias correction using the Precipitation Uncertainties for Satellite Hydrology (PUSH) framework. Results from Rainy Day using the CMORPH satellite precipitation product are compared with local observations in two examples. The first example is peak discharge estimation in a medium-sized (~4000 square km) watershed in the central United States performed using CUENCAS, a parsimonious physically-based distributed hydrologic model. The second example is rainfall frequency analysis for Saint Lucia, a small volcanic island in the eastern Caribbean that is prone to landslides and flash floods. The distinct rainfall hydroclimates of the two example sites illustrate the flexibility of the approach and its usefulness for hazard analysis in data-poor regions.
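
    A purely conceptual sketch of the SST resampling loop described above, with an invented storm catalog standing in for the remote-sensing archive; Rainy Day's actual data structures and options are not reproduced here.

      # Conceptual stochastic storm transposition (SST): resample storms
      # from a regional catalog, transpose each in space, and keep the
      # rainfall the watershed receives. All numbers are invented.
      import numpy as np

      rng = np.random.default_rng(6)
      # catalog: watershed-average storm depths (mm) from the archive
      catalog_depths = rng.gamma(shape=2.0, scale=30.0, size=150)

      n_years, storms_per_year = 1000, 4
      annual_maxima = []
      for _ in range(n_years):
          picks = rng.choice(catalog_depths, size=storms_per_year)
          # random transposition: the spatial shift scales the depth
          # the basin actually receives (crude stand-in for geometry)
          shift = rng.uniform(0.2, 1.0, size=storms_per_year)
          annual_maxima.append((picks * shift).max())

      # empirical 100-year rainfall estimate from the resampled series
      print(np.quantile(annual_maxima, 1.0 - 1.0 / 100.0))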
