Process Improvement Through Tool Integration in Aero-Mechanical Design
NASA Technical Reports Server (NTRS)
Briggs, Clark
2010-01-01
Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.
Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Y.J.; Reich, M.
Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.
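The simulation chain the abstract describes (wind-field sampling, missile injection, flight, impact) can be sketched in miniature. This is a hedged illustration only: the wind distribution, injection fraction, and drag-free flight model below are hypothetical stand-ins, not the paper's actual models.

```python
import math
import random

def missile_range(v0, angle_deg, g=9.81):
    # Drag-free ballistic range for a missile launched at speed v0 (m/s).
    a = math.radians(angle_deg)
    return v0**2 * math.sin(2 * a) / g

def simulate(n=10000, target=100.0, seed=1):
    # Sample a wind speed and injection angle per trial, then count
    # how often the resulting impact point reaches the target distance.
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        wind = rng.gauss(60.0, 15.0)   # hypothetical wind speed (m/s)
        v0 = max(0.0, 0.5 * wind)      # assumed injection-velocity fraction
        angle = rng.uniform(10.0, 45.0)
        if missile_range(v0, angle) >= target:
            hits += 1
    return hits / n
```

A production analysis would replace the closed-form range with an integration of the full flight equations including drag, and would propagate modeling-error uncertainties as the paper describes.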
Code Analysis and Refactoring with Clang Tools, Version 0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelley, Timothy M.
2016-12-23
Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.
ERIC Educational Resources Information Center
Courtney, E. Wayne
This report was designed to present an example of a research study involving the use of coefficients of orthogonal comparisons in analysis of variance tests of significance. A sample research report and analysis was included so as to lead the reader through the design steps. The sample study was designed to determine the extent of attitudinal…
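The core arithmetic behind coefficients of orthogonal comparisons is compact enough to sketch. The following fragment is illustrative only, not the report's own analysis; the contrast coefficients, group means, and group size are made up.

```python
def orthogonal(c1, c2):
    # Two contrasts are orthogonal when the dot product
    # of their coefficient vectors is zero (equal group sizes assumed).
    return sum(a * b for a, b in zip(c1, c2)) == 0

def contrast_ss(coeffs, means, n_per_group):
    # Sum of squares for a single contrast with equal group sizes:
    # SS = n * (sum c_i * ybar_i)^2 / sum(c_i^2)
    num = sum(c * m for c, m in zip(coeffs, means)) ** 2
    return n_per_group * num / sum(c * c for c in coeffs)
```

Each contrast sum of squares has one degree of freedom, so it can be tested directly against the within-group mean square in the ANOVA table.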
A Selection of Composites Simulation Practices at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Ratcliffe, James G.
2007-01-01
One of the major areas of study at NASA Langley Research Center is the development of technologies that support the use of advanced composite materials in aerospace applications. Amongst the supporting technologies are analysis tools used to simulate the behavior of these materials. This presentation will discuss a number of examples of analysis tools and simulation practices conducted at NASA Langley. The presentation will include examples of damage tolerance analyses for both interlaminar and intralaminar failure modes. Tools for modeling interlaminar failure modes include fracture mechanics and cohesive methods, whilst tools for modeling intralaminar failure involve the development of various progressive failure analyses. Other examples of analyses developed at NASA Langley include a thermo-mechanical model of an orthotropic material and the simulation of delamination growth in z-pin reinforced laminates.
Discovering Reliable Sources of Biochemical Thermodynamic Data to Aid Students' Understanding
ERIC Educational Resources Information Center
Méndez, Eduardo; Cerdá, María F.
2016-01-01
Students of physical chemistry in biochemical disciplines need biochemical examples to capture the need, not always understood, of a difficult area in their studies. The use of thermodynamic data in the chemical reference state may lead to incorrect interpretations in the analysis of biochemical examples when the analysis does not include relevant…
NASA Technical Reports Server (NTRS)
Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi
1994-01-01
An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.
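ADIFOR transforms Fortran source so that derivatives are computed alongside function values; a tiny forward-mode analogue in Python conveys the underlying idea. This dual-number sketch is illustrative only and supports just addition and multiplication.

```python
class Dual:
    # Minimal forward-mode automatic differentiation via dual numbers:
    # each Dual carries a value and its derivative w.r.t. one chosen input.
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def derivative(f, x):
    # Seed the input derivative with 1.0 and read off the output derivative.
    return f(Dual(x, 1.0)).der
```

For design sensitivity analysis the same seeding idea is applied once per design variable (or in a vectorized sweep, as in the parallel-vector implementation the abstract describes).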
Methods for the evaluation of alternative disaster warning systems
NASA Technical Reports Server (NTRS)
Agnew, C. E.; Anderson, R. J., Jr.; Lanen, W. N.
1977-01-01
For each of the methods identified, a theoretical basis is provided and an illustrative example is described. The example includes sufficient realism and detail to enable an analyst to conduct an evaluation of other systems. The methods discussed in the study include equal capability cost analysis, consumers' surplus, and statistical decision theory.
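The statistical decision theory method mentioned above reduces, in its simplest form, to choosing the action with the lowest expected loss given the event probability. The loss values below are hypothetical.

```python
def best_action(p_event, loss):
    # loss[(action, outcome)] gives the loss for each action/outcome pair.
    # Choose the warning decision minimizing expected loss.
    def expected(action):
        return (p_event * loss[(action, 'event')]
                + (1 - p_event) * loss[(action, 'no_event')])
    return min(['warn', 'ignore'], key=expected)
```

With costly false alarms and very costly misses, the decision flips from "ignore" to "warn" as the event probability crosses a threshold determined by the loss ratios.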
Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
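A minimal sketch of the statistical process control side of the comparison: a Shewhart-style individuals chart flagging points outside three-sigma limits. The data are invented, and the article's actual charts are more elaborate.

```python
import statistics

def control_limits(baseline):
    # Center line +/- 3 sample standard deviations from baseline data.
    m = statistics.mean(baseline)
    s = statistics.stdev(baseline)
    return m - 3 * s, m + 3 * s

def out_of_control(baseline, new_points):
    # Flag post-intervention points falling outside the control limits.
    lo, hi = control_limits(baseline)
    return [x for x in new_points if x < lo or x > hi]
```

Interrupted time series with segmented regression instead fits separate level and slope terms before and after the intervention, which is why the two methods can disagree on the same data.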
High temperature flow-through device for rapid solubilization and analysis
West, Jason A. A. [Castro Valley, CA; Hukari, Kyle W [San Ramon, CA; Patel, Kamlesh D [Dublin, CA; Peterson, Kenneth A [Albuquerque, NM; Renzi, Ronald F [Tracy, CA
2009-09-22
Devices and methods for thermally lysing biological material, for example vegetative bacterial cells and bacterial spores, are provided. Hot solution methods for solubilizing bacterial spores are described. Systems for direct analysis are disclosed, including thermal lysers coupled to sample preparation stations. Integrated systems capable of performing sample lysis, labeling, and protein fingerprint analysis of biological material, for example vegetative bacterial cells, bacterial spores, and viruses, are provided.
High temperature flow-through device for rapid solubilization and analysis
West, Jason A. A.; Hukari, Kyle W.; Patel, Kamlesh D.; Peterson, Kenneth A.; Renzi, Ronald F.
2013-04-23
Devices and methods for thermally lysing biological material, for example vegetative bacterial cells and bacterial spores, are provided. Hot solution methods for solubilizing bacterial spores are described. Systems for direct analysis are disclosed, including thermal lysers coupled to sample preparation stations. Integrated systems capable of performing sample lysis, labeling, and protein fingerprint analysis of biological material, for example vegetative bacterial cells, bacterial spores, and viruses, are provided.
Probabilistic structural analysis using a general purpose finite element program
NASA Astrophysics Data System (ADS)
Riha, D. S.; Millwater, H. R.; Thacker, B. H.
1992-07-01
This paper presents an accurate and efficient method to predict the probabilistic response of structural quantities, such as stress, displacement, natural frequencies, and buckling loads, by combining the capabilities of MSC/NASTRAN, including design sensitivity analysis, with fast probability integration. Two probabilistic structural analysis examples have been performed and verified by comparison with Monte Carlo simulation of the analytical solution. The first example consists of a cantilevered plate with several point loads. The second example is a probabilistic buckling analysis of a simply supported composite plate under in-plane loading. The coupling of MSC/NASTRAN and fast probability integration is shown to be orders of magnitude more efficient than Monte Carlo simulation with excellent accuracy.
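The Monte Carlo baseline that the paper benchmarks against can be sketched for a scalar stress-versus-strength check. The distribution parameters here are hypothetical, and a real analysis would draw the response from the finite element model rather than a closed form.

```python
import random

def failure_probability(n=20000, seed=42):
    # Crude Monte Carlo estimate of P(stress > strength), the baseline
    # that fast probability integration is compared against.
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        stress = rng.gauss(300.0, 30.0)    # hypothetical load stress (MPa)
        strength = rng.gauss(400.0, 25.0)  # hypothetical strength (MPa)
        if stress > strength:
            failures += 1
    return failures / n
```

Because small failure probabilities require very many samples for a stable estimate, methods like fast probability integration can be orders of magnitude cheaper, as the abstract reports.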
... serious consequences for the developing fetus. A few examples include tests for: TORCH : toxoplasmosis, rubella, cytomegalovirus (CMV), herpes simplex virus (HSV) Parvovirus B19 Cultures for bacterial ... may be performed in select situations, for example, if a woman is very unsure of her ...
Benefit-Cost Analysis of Integrated Paratransit Systems : Volume 6. Technical Appendices.
DOT National Transportation Integrated Search
1979-09-01
This last volume, includes five technical appendices which document the methodologies used in the benefit-cost analysis. They are the following: Scenario analysis methodology; Impact estimation; Example of impact estimation; Sensitivity analysis; Agg...
Reference Model for Project Support Environments Version 1.0
1993-02-28
relationship with the framework's Process Support services and with the Lifecycle Process Engineering services. Examples: ORCA (Object-based... Design services. Examples: ORCA (Object-based Requirements Capture and Analysis); RETRAC (REquirements TRACeability). 4.3 Life-Cycle Process... "traditional" computer tools. Operations: Examples of audio and video processing operations include: create, modify, and delete sound and video data
Designing for fiber composite structural durability in hygrothermomechanical environment
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1985-01-01
A methodology is described which can be used to design/analyze fiber composite structures subjected to complex hygrothermomechanical environments. This methodology includes composite mechanics and advanced structural analysis methods (finite element). Select examples are described to illustrate the application of the available methodology. The examples include: (1) composite progressive fracture; (2) composite design for high cycle fatigue combined with hot-wet conditions; and (3) general laminate design.
Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures
NASA Technical Reports Server (NTRS)
Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel
2005-01-01
Contents include the following: JWST/ISIM introduction. Design and analysis challenges for ISIM bonded joints. JWST/ISIM joint designs. Bonded joint analysis. Finite element modeling. Failure criteria and margin calculation. Analysis/test correlation procedure. Example of test data and analysis.
Quantification of Microbial Phenotypes
Martínez, Verónica S.; Krömer, Jens O.
2016-01-01
Metabolite profiling technologies have improved to generate close to quantitative metabolomics data, which can be employed to quantitatively describe the metabolic phenotype of an organism. Here, we review the current technologies available for quantitative metabolomics, present their advantages and drawbacks, and the current challenges to generate fully quantitative metabolomics data. Metabolomics data can be integrated into metabolic networks using thermodynamic principles to constrain the directionality of reactions. Here we explain how to estimate Gibbs energy under physiological conditions, including examples of the estimations, and the different methods for thermodynamics-based network analysis. The fundamentals of the methods and how to perform the analyses are described. Finally, an example applying quantitative metabolomics to a yeast model by 13C fluxomics and thermodynamics-based network analysis is presented. The example shows that (1) these two methods are complementary to each other; and (2) there is a need to take into account Gibbs energy errors. Better estimations of metabolic phenotypes will be obtained when further constraints are included in the analysis. PMID:27941694
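The Gibbs energy estimation step described above follows the standard relation ΔG = ΔG°′ + RT ln Q; the sketch below applies it directly. The transformed standard energy and concentration ratio used in the test are illustrative values, not measured data.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def gibbs_energy(dg0_prime, q, temp_k=310.15):
    # Reaction Gibbs energy under physiological conditions:
    # dG = dG0' + R*T*ln(Q), where Q is the ratio of product
    # to substrate concentrations (activities approximated by
    # concentrations) and dG0' is the transformed standard value.
    return dg0_prime + R * temp_k * math.log(q)
```

Thermodynamics-based network analysis then uses the sign of ΔG (with its estimation error, as the review stresses) to constrain the feasible direction of each reaction.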
Analysis of defects of overhead facade systems and other light thin-walled structures
NASA Astrophysics Data System (ADS)
Endzhievskiy, L.; Frolovskaia, A.; Petrova, Y.
2017-04-01
This paper analyzes the defects and the causes of contemporary design solutions with an example of overhead facade systems with ventilated air gaps and light steel thin-walled structures on the basis of field experiments. The analysis is performed at all stages of work: design, manufacture, including quality, construction, and operation. Practical examples are given. The main causes of accidents and the accident rate prediction are looked upon and discussed.
Fortran programs for reliability analysis
John J. Zahn
1992-01-01
This report contains a set of FORTRAN subroutines written to calculate the Hasofer-Lind reliability index. Nonlinear failure criteria and correlated basic variables are permitted. Users may incorporate these routines into their own calling program (an example program, RELANAL, is included) and must provide a failure criterion subroutine (two example subroutines,...
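For the special case of a linear limit state with independent normal variables, the Hasofer-Lind index those subroutines compute reduces to a closed form, sketched below. The general FORTRAN routines handle nonlinear failure criteria and correlated basic variables, which this fragment does not.

```python
import math

def hasofer_lind_linear(mu_r, sd_r, mu_s, sd_s):
    # For the linear limit state g = R - S with independent normal
    # resistance R and load S, the Hasofer-Lind reliability index is
    # beta = (mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2)
    return (mu_r - mu_s) / math.sqrt(sd_r**2 + sd_s**2)
```

In the general case the index is found iteratively as the minimum distance from the origin to the limit-state surface in standardized variable space.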
MSFC crack growth analysis computer program, version 2 (users manual)
NASA Technical Reports Server (NTRS)
Creager, M.
1976-01-01
An updated version of the George C. Marshall Space Flight Center Crack Growth Analysis Program is described. The updated computer program has significantly expanded capabilities over the original one. This increased capability includes an extensive expansion of the library of stress intensity factors, plotting capability, increased design iteration capability, and the capability of performing proof test logic analysis. The technical approaches used within the computer program are presented, and the input and output formats and options are described. Details of the stress intensity equations, example data, and example problems are presented.
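A crack growth iteration of the kind such programs perform can be sketched with the Paris law. The constants C, m, and geometry factor Y below are hypothetical placeholders; the MSFC program's stress intensity factor library is far richer than this single expression.

```python
import math

def grow_crack(a0, dsigma, cycles, c=1e-12, m=3.0, y=1.12, step=1000):
    # Integrate the Paris law da/dN = C * (dK)^m, with the stress
    # intensity range dK = Y * dsigma * sqrt(pi * a), using forward-Euler
    # steps of `step` cycles. Units: a in m, dsigma in MPa, dK in MPa*sqrt(m).
    a = a0
    for _ in range(0, cycles, step):
        dk = y * dsigma * math.sqrt(math.pi * a)
        a += c * dk**m * step
    return a
```

A full analysis marches this integration against a critical crack length derived from fracture toughness, which is where proof test logic of the kind the program supports enters.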
POLO2: a user's guide to multiple Probit Or LOgit analysis
Robert M. Russell; N. E. Savin; Jacqueline L. Robertson
1981-01-01
This guide provides instructions for the use of POLO2, a computer program for multivariate probit or logit analysis of quantal response data. As many as 3000 test subjects may be included in a single analysis. Including the constant term, up to nine explanatory variables may be used. Examples illustrating input, output, and uses of the program's special features...
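The logit model POLO2 fits can be written down directly. The sketch below evaluates the fitted dose-response curve and its median effective dose for hypothetical coefficients; the actual coefficient fitting is by maximum likelihood, which POLO2 performs.

```python
import math

def logit_response(dose, b0, b1):
    # Quantal response probability under the logit model:
    # p = 1 / (1 + exp(-(b0 + b1 * dose)))
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * dose)))

def ld50(b0, b1):
    # Dose at which the modeled response probability equals 0.5.
    return -b0 / b1
```

A probit analysis is identical in structure but replaces the logistic link with the normal cumulative distribution function.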
Architectural Analysis of Dynamically Reconfigurable Systems
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Godfrey, Sally; Ackermann, Chris; Ray, Arnab; Yonkwa, Lyly
2010-01-01
Topics include: the problem (increased flexibility of architectural styles decreases analyzability, behavior emerges and varies depending on the configuration, does the resulting system run according to the intended design, and architectural decisions can impede or facilitate testing); top-down approach to architecture analysis, detection of defects and deviations, and architecture and its testability; currently targeted projects GMSEC and CFS; analyzing software architectures; analyzing runtime events; actual architecture recognition; GMPUB in Dynamic SAVE; sample output from new approach; taking message timing delays into account; CFS examples of architecture and testability; some recommendations for improved testability; CFS examples of abstract interfaces and testability; and CFS example of opening some internal details.
Reported Barriers to Source Reduction in the 2015 TRI National Analysis
Source Reduction/Pollution Prevention - barriers to implementing source reduction activities as reported by facilities for the 2015 Toxics Release Inventory National Analysis, including examples of each type of barrier
Reported Barriers to Source Reduction in the 2016 TRI National Analysis
Source Reduction/Pollution Prevention - barriers to implementing source reduction activities as reported by facilities for the 2016 Toxics Release Inventory National Analysis, including examples of each type of barrier
Rapid Harmonic Analysis of Piezoelectric MEMS Resonators.
Puder, Jonathan M; Pulskamp, Jeffrey S; Rudy, Ryan Q; Cassella, Cristian; Rinaldi, Matteo; Chen, Guofeng; Bhave, Sunil A; Polcawich, Ronald G
2018-06-01
This paper reports on a novel simulation method combining the speed of analytical evaluation with the accuracy of finite-element analysis (FEA). This method is known as the rapid analytical-FEA technique (RAFT). The ability of the RAFT to accurately predict frequency response orders of magnitude faster than conventional simulation methods while providing deeper insights into device design not possible with other types of analysis is detailed. Simulation results from the RAFT across wide bandwidths are compared to measured results of resonators fabricated with various materials, frequencies, and topologies with good agreement. These include resonators targeting beam extension, disk flexure, and Lamé beam modes. An example scaling analysis is presented and other applications enabled are discussed as well. The supplemental material includes example code for implementation in ANSYS, although any commonly employed FEA package may be used.
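The frequency responses such analyses predict follow the classic damped-resonator shape; a normalized single-mode amplitude is sketched below. This toy expression is not the RAFT itself, which couples analytical evaluation to FEA-derived quantities across wide bandwidths.

```python
import math

def amplitude(freq, f0, zeta):
    # Normalized steady-state amplitude of a damped harmonic resonator
    # driven at `freq`, with natural frequency f0 and damping ratio zeta.
    r = freq / f0
    return 1.0 / math.sqrt((1 - r * r)**2 + (2 * zeta * r)**2)
```

At resonance (freq = f0) the normalized amplitude peaks near 1/(2*zeta), which is why lightly damped MEMS resonators show such sharp response peaks.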
Three Techniques for Task Analysis: Examples from the Nuclear Utilities.
ERIC Educational Resources Information Center
Carlisle, Kenneth E.
1984-01-01
Discusses three task analysis techniques utilized at the Palo Verde Nuclear Generating Station to review training programs: analysis of (1) job positions, (2) procedures, and (3) instructional presentations. All of these include task breakdown, relationship determination, and task restructuring. (MBR)
Facility Measurement Uncertainty Analysis at NASA GRC
NASA Technical Reports Server (NTRS)
Stephens, Julia; Hubbard, Erin
2016-01-01
This presentation provides an overview of the measurement uncertainty analysis currently being implemented in various facilities at NASA GRC. The presentation includes examples pertinent to the turbine engine community (mass flow and fan efficiency calculation uncertainties).
Choosing estimands in clinical trials with missing data.
Mallinckrodt, Craig; Molenberghs, Geert; Rathmann, Suchitrita
2017-01-01
Recent research has fostered new guidance on preventing and treating missing data. Consensus exists that clear objectives should be defined along with the causal estimands; trial design and conduct should maximize adherence to the protocol specified interventions; and a sensible primary analysis should be used along with plausible sensitivity analyses. Two general categories of estimands are effects of the drug as actually taken (de facto, effectiveness) and effects of the drug if taken as directed (de jure, efficacy). Motivated by examples, we argue that no single estimand is likely to meet the needs of all stakeholders and that each estimand has strengths and limitations. Therefore, stakeholder input should be part of an iterative study development process that includes choosing estimands that are consistent with trial objectives. To this end, an example is used to illustrate the benefit from assessing multiple estimands in the same study. A second example illustrates that maximizing adherence reduces sensitivity to missing data assumptions for de jure estimands but may reduce generalizability of results for de facto estimands if efforts to maximize adherence in the trial are not feasible in clinical practice. A third example illustrates that whether or not data after initiation of rescue medication should be included in the primary analysis depends on the estimand to be tested and the clinical setting. We further discuss the sample size and total exposure to placebo implications of including post-rescue data in the primary analysis. Copyright © 2016 John Wiley & Sons, Ltd.
Using Framework Analysis in nursing research: a worked example.
Ward, Deborah J; Furber, Christine; Tierney, Stephanie; Swallow, Veronica
2013-11-01
To demonstrate Framework Analysis using a worked example and to illustrate how criticisms of qualitative data analysis including issues of clarity and transparency can be addressed. Critics of the analysis of qualitative data sometimes cite lack of clarity and transparency about analytical procedures; this can deter nurse researchers from undertaking qualitative studies. Framework Analysis is flexible, systematic, and rigorous, offering clarity, transparency, an audit trail, an option for theme-based and case-based analysis and for readily retrievable data. This paper offers further explanation of the process undertaken which is illustrated with a worked example. Data were collected from 31 nursing students in 2009 using semi-structured interviews. The data collected are not reported directly here but used as a worked example for the five steps of Framework Analysis. Suggestions are provided to guide researchers through essential steps in undertaking Framework Analysis. The benefits and limitations of Framework Analysis are discussed. Nurses increasingly use qualitative research methods and need to use an analysis approach that offers transparency and rigour which Framework Analysis can provide. Nurse researchers may find the detailed critique of Framework Analysis presented in this paper a useful resource when designing and conducting qualitative studies. Qualitative data analysis presents challenges in relation to the volume and complexity of data obtained and the need to present an 'audit trail' for those using the research findings. Framework Analysis is an appropriate, rigorous and systematic method for undertaking qualitative analysis. © 2013 Blackwell Publishing Ltd.
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2014-01-01
This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight, and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
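The overall-value selection step can be illustrated with a simple weighted-sum model. The criteria names, weights, and scores below are hypothetical, and the report's actual method iterates the evaluation as described above.

```python
def weighted_score(scores, weights):
    # Overall value of one alternative: a weighted sum of its
    # normalized criterion scores (higher is better for every criterion).
    return sum(scores[k] * weights[k] for k in weights)

def best_alternative(alternatives, weights):
    # Choose the architecture with the highest overall value.
    return max(alternatives,
               key=lambda name: weighted_score(alternatives[name], weights))
```

Varying the weights exposes how sensitive the choice is to stakeholder priorities, which is what allows an alternative dominant under all weightings to be identified.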
Modern CACSD using the Robust-Control Toolbox
NASA Technical Reports Server (NTRS)
Chiang, Richard Y.; Safonov, Michael G.
1989-01-01
The Robust-Control Toolbox is a collection of 40 M-files which extend the capability of PC/PRO-MATLAB to do modern multivariable robust control system design. Included are robust analysis tools like singular values and structured singular values, robust synthesis tools like continuous/discrete H(exp 2)/H infinity synthesis and Linear Quadratic Gaussian Loop Transfer Recovery methods, and a variety of robust model reduction tools such as Hankel approximation, balanced truncation, and balanced stochastic truncation. The capabilities of the toolbox are described and illustrated with examples to show how easily they can be used in practice. Examples include structured singular value analysis, H infinity loop-shaping, and large space structure model reduction.
A Users Guide for the NASA ANOPP Propeller Analysis System
NASA Technical Reports Server (NTRS)
Nguyen, L. Cathy; Kelly, Jeffrey J.
1997-01-01
The purpose of this report is to document improvements to the Propeller Analysis System of the Aircraft Noise Prediction Program (PAS-ANOPP) and to serve as a users guide. An overview of the functional modules and modifications made to the Propeller ANOPP system are described. Propeller noise predictions are made by executing a sequence of functional modules through the use of ANOPP control statements. The most commonly used ANOPP control statements are discussed with detailed examples demonstrating the use of each control statement. Originally, the Propeller Analysis System included the angle-of-attack only in the performance module. Recently, modifications have been made to also include angle-of-attack in the noise prediction module. A brief description of PAS prediction capabilities is presented which illustrates the input requirements necessary to run the code by way of ten templates. The purpose of the templates is to provide PAS users with complete examples which can be modified to serve their particular purposes. The examples include the use of different approximations in the computation of the noise and the effects of synchrophasing. Since modifications have been made to the original PAS-ANOPP, comparisons of the modified ANOPP and wind tunnel data are also included. Two appendices are attached at the end of this report which provide useful reference material. One appendix summarizes the PAS functional modules while the second provides a detailed discussion of the TABLE control statement.
Medical Image Analysis by Cognitive Information Systems - a Review.
Ogiela, Lidia; Takizawa, Makoto
2016-10-01
This publication presents a review of medical image analysis systems. The paradigms of cognitive information systems are illustrated with examples of medical image analysis systems, and semantic processes are shown as they apply to different types of medical images. Cognitive information systems were defined on the basis of methods for the semantic analysis and interpretation of information, here medical images, applied to the cognitive meaning contained in the analyzed data sets. Semantic analysis was proposed to analyze the meaning of the data; meaning is included in information, for example in medical images. Medical image analysis is presented and discussed as it is applied to various types of medical images showing selected human organs with different pathologies, analyzed using different classes of cognitive information systems. Cognitive information systems dedicated to medical image analysis were also defined for decision-support tasks, which are important, for example, in diagnostic and therapy processes and in the selection of semantic aspects/features from analyzed data sets. Those features allow the creation of new ways of analysis.
29 CFR 2520.104-20 - Limited exemption for certain small welfare plans.
Code of Federal Regulations, 2012 CFR
2012-07-01
... from employee benefit plans for research and analysis (section 513). (d) Examples. (1) A welfare plan... administrator of an employee benefit plan from any other requirement of title I of the Act, including the... and maintained in the same way as the plan described in example (1), except that a trade association...
29 CFR 2520.104-20 - Limited exemption for certain small welfare plans.
Code of Federal Regulations, 2010 CFR
2010-07-01
... from employee benefit plans for research and analysis (section 513). (d) Examples. (1) A welfare plan... administrator of an employee benefit plan from any other requirement of title I of the Act, including the... and maintained in the same way as the plan described in example (1), except that a trade association...
A New Methodology for Systematic Exploitation of Technology Databases.
ERIC Educational Resources Information Center
Bedecarrax, Chantal; Huot, Charles
1994-01-01
Presents the theoretical aspects of a data analysis methodology that can help transform sequential raw data from a database into useful information, using the statistical analysis of patents as an example. Topics discussed include relational analysis and a technology watch approach. (Contains 17 references.) (LRW)
Sierra Toolkit Manual Version 4.48.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sierra Toolkit Team
This report provides documentation for the SIERRA Toolkit (STK) modules. STK modules are intended to provide infrastructure that assists the development of computational engineering software such as finite-element analysis applications. STK includes modules for unstructured-mesh data structures, reading/writing mesh files, geometric proximity search, and various utilities. This document contains a chapter for each module, and each chapter contains overview descriptions and usage examples. Usage examples are primarily code listings which are generated from working test programs that are included in the STK code-base. A goal of this approach is to ensure that the usage examples will not fall out of date.
ERIC Educational Resources Information Center
Gearhart, William B.; Shultz, Harris S.
1990-01-01
Presents some examples from geometry: area of a circle; centroid of a sector; Buffon's needle problem; and expression for pi. Describes several roles of the trigonometric function in mathematics and applications, including Fourier analysis, spectral theory, approximation theory, and numerical analysis. (YP)
Open web system of Virtual labs for nuclear and applied physics
NASA Astrophysics Data System (ADS)
Saldikov, I. S.; Afanasyev, V. V.; Petrov, V. I.; Ternovykh, M. Yu
2017-01-01
An example of a virtual lab built on unique experimental equipment is presented. The virtual lab is software based on a model of the real equipment, and such labs can be used in the educational process in the nuclear safety and analysis field. The example presented is the virtual lab "Experimental determination of the material parameter depending on the pitch of a uranium-water lattice". This paper includes a general description of this lab, as well as a description of a database supporting laboratory work on unique experimental equipment and the concept of its development.
Crash Certification by Analysis - Are We There Yet?
NASA Technical Reports Server (NTRS)
Jackson, Karen E.; Fasanella, Edwin L.; Lyle, Karen H.
2006-01-01
This paper addresses the issue of crash certification by analysis. This broad topic encompasses many ancillary issues including model validation procedures, uncertainty in test data and analysis models, probabilistic techniques for test-analysis correlation, verification of the mathematical formulation, and establishment of appropriate qualification requirements. This paper will focus on certification requirements for crashworthiness of military helicopters; capabilities of the current analysis codes used for crash modeling and simulation, including some examples of simulations from the literature to illustrate the current approach to model validation; and future directions needed to achieve "crash certification by analysis."
Optimal guidance law development for an advanced launch system
NASA Technical Reports Server (NTRS)
Calise, Anthony J.; Hodges, Dewey H.
1990-01-01
A regular perturbation analysis is presented. Closed-loop simulations were performed with a first order correction including all of the atmospheric terms. In addition, a method was developed for independently checking the accuracy of the analysis and the rather extensive programming required to implement the complete first order correction with all of the aerodynamic effects included. This amounted to developing an equivalent Hamiltonian computed from the first order analysis. A second order correction was also completed for the neglected spherical Earth and back-pressure effects. Finally, an analysis was begun on a method for dealing with control inequality constraints. The results on including higher order corrections do show some improvement for this application; however, it is not known at this stage if significant improvement will result when the aerodynamic forces are included. The weak formulation for solving optimal problems was extended in order to account for state inequality constraints. The formulation was tested on three example problems and numerical results were compared to the exact solutions. Development of a general purpose computational environment for the solution of a large class of optimal control problems is under way. An example, along with the necessary input and the output, is given.
NASA Astrophysics Data System (ADS)
Lee, Kwan Chul
2017-11-01
Three examples of electric field formation in plasmas are analyzed based on a new mechanism driven by ion-neutral collisions. The Gyro-Center Shift analysis uses the iteration of three equations, including the perpendicular current induced by momentum exchange between ions and neutrals when there is asymmetry over the gyro-motion. The method accounts for a non-zero divergence of the current, which leads to a time-dependent solution. The first example is radial electric field formation at the boundary of a nuclear fusion device, a key factor in the high-confinement mode operation of future fusion reactors. The second example is the reversed rotation of the arc discharge cathode spot, which has been a mystery for more than one hundred years. The third example is electric field formation in the Earth's ionosphere, an important component of the equatorial electrojet and black aurora. The use of one method to explain examples from such different plasmas is reported, along with a discussion of applications.
Deelman, E.; Callaghan, S.; Field, E.; Francoeur, H.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T.H.; Kesselman, C.; Maechling, P.; Mehringer, J.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.
2006-01-01
This paper discusses the process of building an environment where large-scale, complex, scientific analysis can be scheduled onto a heterogeneous collection of computational and storage resources. The example application is the Southern California Earthquake Center (SCEC) CyberShake project, an analysis designed to compute probabilistic seismic hazard curves for sites in the Los Angeles area. We explain which software tools were used to build the system and describe their functionality and interactions. We show the results of running the CyberShake analysis, which included over 250,000 jobs, using resources available through SCEC and the TeraGrid. © 2006 IEEE.
smwrData—An R package of example hydrologic data, version 1.1.1
Lorenz, David L.
2015-11-06
A collection of 24 datasets, including streamflow, well characteristics, groundwater elevations, and discrete water-quality concentrations, is provided to produce a consistent set of example data to demonstrate typical data manipulations or statistical analysis of hydrologic data. These example data are provided in an R package called smwrData. The data in the package have been collected by the U.S. Geological Survey or published in its reports, for example Helsel and Hirsch (2002). The R package provides a convenient mechanism for distributing the data to users of R within the U.S. Geological Survey and other users in the R community.
Mapping Candidate Ecological Restoration Areas Using Morphological Spatial Pattern Analysis (MSPA)
Morphological Spatial Pattern Analysis (MSPA) has been widely adopted by landscape ecologists over the past decade. A few examples of its many uses include: 1) quantifying landscape indicators and fragmentation in continental forest assessments, 2) explaining interior-exterior p...
Rocket Engine Oscillation Diagnostics
NASA Technical Reports Server (NTRS)
Nesman, Tom; Turner, James E. (Technical Monitor)
2002-01-01
Rocket engine oscillating data can reveal many physical phenomena ranging from unsteady flow and acoustics to rotordynamics and structural dynamics. Because of this, engine diagnostics based on oscillation data should employ both signal analysis and physical modeling. This paper describes an approach to rocket engine oscillation diagnostics, types of problems encountered, and example problems solved. Determination of design guidelines and environments (or loads) from oscillating phenomena is required during initial stages of rocket engine design, while the additional tasks of health monitoring, incipient failure detection, and anomaly diagnostics occur during engine development and operation. Oscillations in rocket engines are typically related to flow driven acoustics, flow excited structures, or rotational forces. Additional sources of oscillatory energy are combustion and cavitation. Included in the example problems is a sampling of signal analysis tools employed in diagnostics. The rocket engine hardware includes combustion devices, valves, turbopumps, and ducts. Simple models of an oscillating fluid system or structure can be constructed to estimate pertinent dynamic parameters governing the unsteady behavior of engine systems or components. In the example problems it is shown that simple physical modeling when combined with signal analysis can be successfully employed to diagnose complex rocket engine oscillatory phenomena.
Military applications and examples of near-surface seismic surface wave methods (Invited)
NASA Astrophysics Data System (ADS)
sloan, S.; Stevens, R.
2013-12-01
Although not always widely known or publicized, the military uses a variety of geophysical methods for a wide range of applications--some that are already common practice in the industry while others are truly novel. Some of those applications include unexploded ordnance detection, general site characterization, anomaly detection, countering improvised explosive devices (IEDs), and security monitoring, to name a few. Techniques used may include, but are not limited to, ground penetrating radar, seismic, electrical, gravity, and electromagnetic methods. Seismic methods employed include surface wave analysis, refraction tomography, and high-resolution reflection methods. Although the military employs geophysical methods, that does not necessarily mean that those methods enable or support combat operations--often they are used for humanitarian applications within the military's area of operations to support local populations. The work presented here will focus on the applied use of seismic surface wave methods, including multichannel analysis of surface waves (MASW) and backscattered surface waves, often in conjunction with other methods such as refraction tomography or body-wave diffraction analysis. Multiple field examples will be shown, including explosives testing, tunnel detection, pre-construction site characterization, and cavity detection.
NASA Astrophysics Data System (ADS)
Pourattar, Parisa
The cementation process of making Egyptian faience, reported by Hans Wulff from a workshop in Qom, Iran, has not been easy to replicate, and various views have been set forth to explain the transport of materials from the glazing powder to the surfaces of the crushed quartz beads. Replications of the process fired to 950 °C and under-fired to 850 °C were characterized by electron beam microprobe analysis (EPMA), petrographic thin section analysis, and scanning electron microscopy with energy dispersive x-ray analysis (SEM-EDS). Chemical variations were modeled using thermal data, phase diagrams, and copper vaporization experiments. These replications were compared to 52 examples from various collections, including 20th century ethnographic collections of beads, glazing powder, and plant ash, and 12th century CE beads and glazing powder from Fustat (Old Cairo), Egypt, as well as to an earlier example from Abydos, Egypt, from the New Kingdom and to an ash example from the Smithsonian Institution National Museum of Natural History.
CELSS scenario analysis: Breakeven calculations
NASA Technical Reports Server (NTRS)
Mason, R. M.
1980-01-01
A model of the relative mass requirements of food production components in a controlled ecological life support system (CELSS) based on regenerative concepts is described. Included are a discussion of model scope, structure, and example calculations. Computer programs for cultivar and breakeven calculations are also included.
NASA Technical Reports Server (NTRS)
Cosentino, Gary B.
2007-01-01
Several examples from the past decade of success stories involving the design and flight test of three true X-planes will be described: in particular, X-plane design techniques that relied heavily upon computational fluid dynamics (CFD). Three specific examples chosen from the author's personal experience are presented: the X-36 Tailless Fighter Agility Research Aircraft, the X-45A Unmanned Combat Air Vehicle, and, most recently, the X-48B Blended Wing Body Demonstrator Aircraft. An overview will be presented of the uses of CFD analysis, comparisons and contrasts with wind tunnel testing, and information derived from the CFD analysis that directly related to successful flight test. Some lessons learned on the proper application, and misapplication, of CFD are illustrated. Finally, some highlights of the flight-test results of the three example X-planes will be presented. This overview paper will discuss some of the author's experience with taking an aircraft shape from early concept and three-dimensional modeling through CFD analysis, wind tunnel testing, further refined CFD analysis, and, finally, flight. An overview of the key roles in which CFD plays well during this process, and some other roles in which it does not, are discussed. How wind tunnel testing complements, calibrates, and verifies CFD analysis is also covered. Lessons learned on where CFD results can be misleading are also given. Strengths and weaknesses of the various types of flow solvers, including panel methods, Euler, and Navier-Stokes techniques, are discussed. The paper concludes with the three specific examples, including some flight test video footage of the X-36, the X-45A, and the X-48B.
Hardware Acceleration for Cyber Security
2010-11-01
...perform different approaches. It includes behavioral analysis, by means of NetFlow monitoring, as well as packet content analysis, so-called Deep... Interface (API). An example of such an application is the NetFlow exporter described in [5]. • We provide a modified libpcap library using the libsze2 API. This... cards. The software applications using NIFIC include the FlowMon NetFlow/IPFIX generator, the Wireshark packet analyzer, iptables (the Linux kernel firewall), deep...
Digital Circuit Analysis Using an 8080 Processor.
ERIC Educational Resources Information Center
Greco, John; Stern, Kenneth
1983-01-01
Presents the essentials of a program written in Intel 8080 assembly language for the steady state analysis of a combinatorial logic gate circuit. Program features and potential modifications are considered. For example, the program could also be extended to include clocked/unclocked sequential circuits. (JN)
Interfaces for End-User Information Seeking.
ERIC Educational Resources Information Center
Marchionini, Gary
1992-01-01
Discusses essential features of interfaces to support end-user information seeking. Highlights include cognitive engineering; task models and task analysis; the problem-solving nature of information seeking; examples of systems for end-users, including online public access catalogs (OPACs), hypertext, and help systems; and suggested research…
NASA Astrophysics Data System (ADS)
Reiterer, Alexander; Egly, Uwe; Vicovac, Tanja; Mai, Enrico; Moafipoor, Shahram; Grejner-Brzezinska, Dorota A.; Toth, Charles K.
2010-12-01
Artificial Intelligence (AI) is one of the key technologies in many of today's novel applications. It is used to add knowledge and reasoning to systems. This paper presents a review of AI methods, including examples of their practical application in Geodesy such as data analysis, deformation analysis, navigation, network adjustment, and optimization of complex measurement procedures. We focus on three examples, namely, a geo-risk assessment system supported by a knowledge base, an intelligent dead reckoning personal navigator, and evolutionary strategies for the determination of Earth gravity field parameters. Some of the authors are members of IAG Sub-Commission 4.2 - Working Group 4.2.3, whose main goal is to study and report on the application of AI in Engineering Geodesy.
Computational Fluid Dynamics Analysis Success Stories of X-Plane Design to Flight Test
NASA Technical Reports Server (NTRS)
Cosentino, Gary B.
2008-01-01
Examples of the design and flight test of three true X-planes are described, particularly X-plane design techniques that relied heavily on computational fluid dynamics(CFD) analysis. Three examples are presented: the X-36 Tailless Fighter Agility Research Aircraft, the X-45A Unmanned Combat Air Vehicle, and the X-48B Blended Wing Body Demonstrator Aircraft. An overview is presented of the uses of CFD analysis, comparison and contrast with wind tunnel testing, and information derived from CFD analysis that directly related to successful flight test. Lessons learned on the proper and improper application of CFD analysis are presented. Highlights of the flight-test results of the three example X-planes are presented. This report discusses developing an aircraft shape from early concept and three-dimensional modeling through CFD analysis, wind tunnel testing, further refined CFD analysis, and, finally, flight. An overview of the areas in which CFD analysis does and does not perform well during this process is presented. How wind tunnel testing complements, calibrates, and verifies CFD analysis is discussed. Lessons learned revealing circumstances under which CFD analysis results can be misleading are given. Strengths and weaknesses of the various flow solvers, including panel methods, Euler, and Navier-Stokes techniques, are discussed.
Concept analysis of culture applied to nursing.
Marzilli, Colleen
2014-01-01
Culture is an important concept, especially when applied to nursing. A concept analysis of culture is essential to understanding the meaning of the word. This article applies Rodgers' (2000) concept analysis template and provides a definition of the word culture as it applies to nursing practice. This article supplies examples of the concept of culture to aid the reader in understanding its application to nursing and includes a case study demonstrating components of culture that must be respected and included when providing health care.
ERIC Educational Resources Information Center
Barton, Mitch; Yeatts, Paul E.; Henson, Robin K.; Martin, Scott B.
2016-01-01
There has been a recent call to improve data reporting in kinesiology journals, including the appropriate use of univariate and multivariate analysis techniques. For example, a multivariate analysis of variance (MANOVA) with univariate post hocs and a Bonferroni correction is frequently used to investigate group differences on multiple dependent…
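The follow-up pattern the abstract describes (univariate tests with a Bonferroni correction after a multivariate comparison) can be sketched in Python. The data here are synthetic and scipy's f_oneway merely stands in for the univariate post hocs; it is an illustration of the correction, not the article's own analysis:

```python
# Hedged illustration (synthetic data): Bonferroni-corrected univariate
# ANOVAs as a follow-up to a multivariate group comparison.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Two groups measured on three dependent variables (DVs)
group_a = rng.normal(loc=[0.0, 0.0, 1.0], scale=1.0, size=(30, 3))
group_b = rng.normal(loc=[0.0, 0.5, 0.0], scale=1.0, size=(30, 3))

alpha = 0.05
n_tests = group_a.shape[1]
bonferroni_alpha = alpha / n_tests  # corrected per-test threshold

for j in range(n_tests):
    f_stat, p_val = stats.f_oneway(group_a[:, j], group_b[:, j])
    verdict = "significant" if p_val < bonferroni_alpha else "n.s."
    print(f"DV{j + 1}: F = {f_stat:.2f}, p = {p_val:.4f} ({verdict})")
```

The correction simply tightens the per-test threshold from 0.05 to 0.05/3 so the family-wise error rate stays near the nominal level.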
ERIC Educational Resources Information Center
Torres y Torres, Janelle L.; Hiley, Shauna L.; Lorimor, Steven P.; Rhoad, Jonathan S.; Caldwell, Benjamin D.; Zweerink, Gerald L.; Ducey, Michael
2015-01-01
The Characterization and Analysis of a Product (CAP) project is used to introduce first-semester general chemistry students to chemical instrumentation through the analysis of caffeine-containing beverage products. Some examples of these products have included coffee, tea, and energy drinks. Students perform at least three instrumental experiments…
Examples of Data Analysis with SPSS-X.
ERIC Educational Resources Information Center
MacFarland, Thomas W.
Intended for classroom use only, these unpublished notes contain computer lessons on descriptive statistics using SPSS-X Release 3.0 for VAX/UNIX. Statistical measures covered include Chi-square analysis; Spearman's rank correlation coefficient; Student's t-test with two independent samples; Student's t-test with a paired sample; One-way analysis…
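The same lessons translate directly to any modern statistics library. A minimal Python/scipy analogue of the tests listed above, with made-up numbers purely to show the calls, might look like:

```python
# Sketch of the statistical measures covered in the SPSS-X lessons,
# reproduced with scipy (assumption: scipy >= 1.0; data are invented).
import numpy as np
from scipy import stats

# Chi-square test of independence on a 2x2 contingency table
table = np.array([[20, 10], [15, 25]])
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

# Spearman's rank correlation coefficient
rho, p_rho = stats.spearmanr([1, 2, 3, 4, 5], [2, 1, 4, 3, 5])

# Student's t-test, two independent samples
t_ind, p_ind = stats.ttest_ind([5.1, 4.9, 5.3, 5.0], [4.2, 4.5, 4.1, 4.4])

# Student's t-test, paired sample
t_rel, p_rel = stats.ttest_rel([10, 12, 11, 13], [11, 14, 12, 15])

print(f"chi2={chi2:.3f} (dof={dof}), spearman rho={rho:.2f}, "
      f"t_ind={t_ind:.2f}, t_paired={t_rel:.2f}")
```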
Global Sensitivity and Data-Worth Analyses in iTOUGH2: User's Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wainwright, Haruko Murakami; Finsterle, Stefan
2016-07-15
This manual explains the use of local sensitivity analysis, the global Morris OAT and Sobol’ methods, and a related data-worth analysis as implemented in iTOUGH2. In addition to input specification and output formats, it includes some examples to show how to interpret results.
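For readers unfamiliar with the Morris method, the one-at-a-time (OAT) idea can be illustrated with a short, self-contained sketch. This is not iTOUGH2 code; the toy model and step size are invented for illustration:

```python
# Morris OAT in miniature: perturb each parameter in turn and record the
# elementary effect on a toy model y = 2*x1 + 0.1*x2 + 0*x3.
import numpy as np

def model(x):
    return 2.0 * x[0] + 0.1 * x[1] + 0.0 * x[2]

def elementary_effects(f, x0, delta=0.1):
    """One Morris trajectory: change one parameter at a time by delta."""
    base = f(x0)
    effects = []
    for i in range(len(x0)):
        x = x0.copy()
        x[i] += delta
        effects.append((f(x) - base) / delta)
    return effects

x0 = np.array([0.5, 0.5, 0.5])
print(elementary_effects(model, x0))  # x1 dominates, x3 is inert
```

A full Morris analysis repeats this over many random trajectories and screens parameters by the mean and spread of their elementary effects; Sobol' indices then apportion output variance among the parameters that survive the screening.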
Thermo-elastoviscoplastic snapthrough behavior of cylindrical panels
NASA Technical Reports Server (NTRS)
Song, Y.; Simitses, G. J.
1992-01-01
The thermo-elastoviscoplastic snapthrough behavior of simply supported cylindrical panels is investigated. The analysis is based on nonlinear kinematic relations and nonlinear rate-dependent unified constitutive equations which include both Bodner-Partom's and Walker's material models. A finite element approach is employed to predict the inelastic buckling behavior. Numerical examples are given to demonstrate the effects of several parameters which include the temperature, thickness and flatness of the panel. Comparisons of buckling responses between Bodner-Partom's model and Walker's model are given. The creep buckling behavior, as an example of time-dependent inelastic deformation, is also presented.
Delamination Modeling of Composites for Improved Crash Analysis
NASA Technical Reports Server (NTRS)
Fleming, David C.
1999-01-01
Finite element crash modeling of composite structures is limited by the inability of current commercial crash codes to accurately model delamination growth. Efforts are made to implement and assess delamination modeling techniques using a current finite element crash code, MSC/DYTRAN. Three methods are evaluated, including a straightforward method based on monitoring forces in elements or constraints representing an interface; a cohesive fracture model proposed in the literature; and the virtual crack closure technique commonly used in fracture mechanics. Results are compared with dynamic double cantilever beam test data from the literature. Examples show that it is possible to accurately model delamination propagation in this case. However, the computational demands required for accurate solution are great, and reliable property data may not be available to support general crash modeling efforts. Additional examples are modeled, including an impact-loaded beam, damage initiation in laminated crushing specimens, and scaled aircraft subfloor structures in which composite sandwich structures are used as energy-absorbing elements. These examples illustrate some of the difficulties in modeling delamination as part of a finite element crash analysis.
NASA Astrophysics Data System (ADS)
Bhartia, R.; Wanger, G.; Orphan, V. J.; Fries, M.; Rowe, A. R.; Nealson, K. H.; Abbey, W. J.; DeFlores, L. P.; Beegle, L. W.
2014-12-01
Detection of in situ biosignatures on terrestrial and planetary missions is becoming increasingly important. Missions that target the Earth's deep biosphere, Mars, the moons of Jupiter (including Europa), the moons of Saturn (Titan and Enceladus), and small bodies such as asteroids or comets require methods that enable detection of materials both for in-situ analysis that preserves context and as a means to select high-priority samples for return to Earth. In situ instrumentation for biosignature detection spans a wide range of analytical and spectroscopic methods that capitalize on amino acid distribution, chirality, lipid composition, isotopic fractionation, or textures that persist in the environment. Many of the existing analytical instruments are bulk analysis methods; while highly sensitive, they require sample acquisition and sample processing. By combining them with triaging spectroscopic methods, however, biosignatures can be targeted on a surface while preserving spatial context (including mineralogy, textures, and organic distribution). To provide spatially correlated chemical analysis at multiple spatial scales (meters to microns), we have employed a dual spectroscopic approach that capitalizes on high-sensitivity deep UV native fluorescence detection and high-specificity deep UV Raman analysis. Recently selected as a payload on the Mars 2020 mission, SHERLOC incorporates these optical methods for potential biosignature detection on Mars. We present data both from Earth analogs, which serve as our only known examples of biosignatures, and from meteorite samples, which provide an example of abiotic organic formation, and demonstrate how provenance affects the spatial distribution and composition of organics.
Memorable Exemplification in Undergraduate Biology: Instructor Strategies and Student Perceptions
NASA Astrophysics Data System (ADS)
Oliveira, Alandeom W.; Bretzlaff, Tiffany; Brown, Adam O.
2018-03-01
The present study examines the exemplification practices of a university biology instructor during a semester-long course. Attention is given specifically to how the instructor approaches memorable exemplification—classroom episodes identified by students as a source of memorable learning experiences. A mixed-method research approach is adopted wherein descriptive statistics is combined with qualitative multimodal analysis of video recordings and survey data. Our findings show that memorable experiencing of examples may depend on a multiplicity of factors, including whether students can relate to the example, how unique and extreme the example is, how much detail is provided, whether the example is enacted rather than told, and whether the example makes students feel sad, surprised, shocked, and/or amused. It is argued that, rather than simply assuming that all examples are equally effective, careful consideration needs to be given to how exemplification can serve as an important source of memorable science learning experiences.
Scientific Software: How to Find What You Need and Get What You Pay for.
ERIC Educational Resources Information Center
Gabaldon, Diana J.
1984-01-01
Provides examples of software for the sciences, including: packages for pathology/toxicology laboratories (costing over $15,000), DNA sequencing, and data acquisition/analysis; general-purpose software for scientific uses; and "custom" packages, including a program to maintain a listing of "Escherichia coli" strains and a…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The purpose of this supporting analysis is to provide a foundation for developing a model of an international or multinational institution capable of accommodating the back end of the fuel cycle while meeting US nonproliferation goals. The analysis is based on a review of selected, defunct and extant institutions which, although not necessarily concerned with nonproliferation, have faced a trade-off between acceptability and effectiveness in meeting their objectives. Discussion of the various institutions is divided into three categories: international organizations, multinational consortia, and cartels or producer associations. Examples of international organizations include the International Seabed Authority, Intelsat, the United Nations, and the International Atomic Energy Agency (IAEA). The International Seabed Authority is discussed. Multinational consortia are organizations that have been developed primarily to meet common commercial objectives. Membership includes at least three member nations. Examples include the Scandinavian Airline System (SAS), URENCO, Unilever, Royal Dutch Shell, Eurochemic, Eurodif, Euratom, the European Coal and Steel Community, and Serena. Cartels or producer associations are multinational agreements that restrict market forces; viz, production, market share, customers or prices. Examples include the Intergovernmental Council of Copper Exporting Countries (CIPEC), the Organization of Petroleum Exporting Countries (OPEC), and the Fifth International Tin Agreement (ITA), as well as agreements governing diamonds and uranium, bauxite and coffee. OPEC, CIPEC and ITA are discussed.
Analysis of pilot control strategy
NASA Technical Reports Server (NTRS)
Heffley, R. K.; Hanson, G. D.; Jewell, W. F.; Clement, W. F.
1983-01-01
Methods for nonintrusive identification of pilot control strategy and task execution dynamics are presented along with examples based on flight data. The specific analysis technique is the Nonintrusive Parameter Identification Procedure (NIPIP), which is described in a companion user's guide (NASA CR-170398). Quantification of pilot control strategy and task execution dynamics is discussed in general terms followed by a more detailed description of how NIPIP can be applied. The examples are based on flight data obtained from the NASA F-8 digital fly-by-wire airplane. These examples involve various piloting tasks and control axes as well as a demonstration of how the dynamics of the aircraft itself are identified using NIPIP. Application of NIPIP to the AFTI/F-16 flight test program is discussed. Recommendations are made for flight test applications in general and refinement of NIPIP to include interactive computer graphics.
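At its core, nonintrusive identification of this kind is an ordinary least-squares fit of the recorded pilot output against candidate cues. A toy sketch of the idea (the pilot model, gains, and signals here are hypothetical, not the NIPIP implementation):

```python
# Toy nonintrusive identification: recover assumed pilot gains Kp, Kd
# from "flight data" alone, i.e. u = Kp*error + Kd*error_rate + noise.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 500)
error = np.sin(t)            # tracking error seen by the pilot
error_rate = np.cos(t)       # its rate
Kp_true, Kd_true = 2.0, 0.5  # hypothetical "true" pilot gains
u = Kp_true * error + Kd_true * error_rate \
    + 0.01 * rng.standard_normal(t.size)  # recorded stick command

# Regress the recorded command on the candidate cues; only measured
# signals are used, which is the "nonintrusive" part.
A = np.column_stack([error, error_rate])
(Kp_hat, Kd_hat), *_ = np.linalg.lstsq(A, u, rcond=None)
print(f"Kp_hat = {Kp_hat:.2f}, Kd_hat = {Kd_hat:.2f}")
```

NIPIP applies this kind of regression over a sliding time window so that changes in control strategy during a task show up as changes in the identified gains.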
Molecular plant breeding: methodology and achievements.
Varshney, Rajeev K; Hoisington, Dave A; Nayak, Spurthi N; Graner, Andreas
2009-01-01
The progress made in DNA marker technology has been remarkable and exciting in recent years. DNA markers have proved valuable tools in various analyses in plant breeding, for example, early generation selection, enrichment of complex F(1)s, choice of donor parent in backcrossing, recovery of recurrent parent genotype in backcrossing, linkage block analysis and selection. Other main areas of applications of molecular markers in plant breeding include germplasm characterization/fingerprinting, determining seed purity, systematic sampling of germplasm, and phylogenetic analysis. Molecular markers, thus, have proved powerful tools in replacing the bioassays and there are now many examples available to show the efficacy of such markers. We have illustrated some basic concepts and methodology of applying molecular markers for enhancing the selection efficiency in plant breeding. Some successful examples of product developments of molecular breeding have also been presented.
Modal analysis applied to circular, rectangular, and coaxial waveguides
NASA Technical Reports Server (NTRS)
Hoppe, D. J.
1988-01-01
Recent developments in the analysis of various waveguide components and feedhorns using Modal Analysis (Mode Matching Method) are summarized. A brief description of the theory is presented, and the important features of the method are pointed out. Specific examples in circular, rectangular, and coaxial waveguides are included, with comparisons between the theory and experimental measurements. Extensions to the methods are described.
The Columbia Debris Loan Program; Examples of Microscopic Analysis
NASA Technical Reports Server (NTRS)
Russell, Rick; Thurston, Scott; Smith, Stephen; Marder, Arnold; Steckel, Gary
2006-01-01
Following the tragic loss of the Space Shuttle Columbia, NASA formed The Columbia Recovery Office (CRO). The CRO was initially formed at the Johnson Space Center after the conclusion of recovery operations on May 1, 2003, then transferred to the Kennedy Space Center on October 6, 2003, and renamed The Columbia Recovery Office and Preservation. An integral part of the preservation project was the development of a process to loan Columbia debris to qualified researchers and technical educators. The purposes of this program include aiding in the advancement of spacecraft design and flight safety development, advancing the study of hypersonic re-entry to enhance ground safety, training and instructing accident investigators, and establishing an enduring legacy for Space Shuttle Columbia and her crew. Along with a summary of the debris loan process, examples of microscopic analysis of Columbia debris items will be presented. The first example will be from the reconstruction following the STS-107 accident and how the Materials and Processes team used microscopic analysis to confirm the accident scenario. Additionally, three examples of microstructural results from the debris loan process from NASA internal, academia, and private industry will be presented.
[Visual representation of biological structures in teaching material].
Morato, M A; Struchiner, M; Bordoni, E; Ricciardi, R M
1998-01-01
Parameters must be defined for presenting and handling scientific information presented in the form of teaching materials. Through library research and consultations with specialists in the health sciences and in graphic arts and design, this study undertook a comparative description of the first examples of scientific illustrations of anatomy and the evolution of visual representations of knowledge on the cell. The study includes significant examples of illustrations which served as elements of analysis.
Building Training Curricula for Accelerating the Use of NOAA Climate Products and Tools
NASA Astrophysics Data System (ADS)
Timofeyeva-Livezey, M. M.; Meyers, J. C.; Stevermer, A.; Abshire, W. E.; Beller-Simms, N.; Herring, D.
2016-12-01
The National Oceanic and Atmospheric Administration (NOAA) plays a leading role in U.S. intergovernmental efforts on the Climate Data Initiative and the Climate Resilience Toolkit (CRT). CRT (http://toolkit.climate.gov/) is a valuable resource that provides tools, information, and subject matter expertise to decision makers in various sectors, such as agriculture, water resources, and transportation, to help them build resilience to our changing climate. In order to make best use of the toolkit and all the resources within it, a training component is critical. The training section helps build users' understanding of the data, science, and impacts of climate variability and change. CRT identifies five steps in building resilience that include the use of appropriate tools to support decision makers, depending on their needs. One tool that can potentially be integrated into CRT is NOAA's Local Climate Analysis Tool (LCAT), which provides access to trusted NOAA data and scientifically sound analysis techniques for regional and local studies of climate variability and climate change. However, for LCAT to be used effectively, we have found that an iterative learning approach using specific examples is needed to train users. For example, for LCAT application in analysis of water resources, we use existing CRT case studies for Arizona and Florida water supply users. The Florida example demonstrates primary sensitivity to climate variability impacts, whereas the Arizona example takes into account longer-term climate change. The types of analyses included in LCAT are time series analysis of local climate and the estimated rate of change in the local climate. It also provides a composite analysis to evaluate the relationship between local climate and climate variability events such as the El Niño Southern Oscillation, the Pacific North American Index, and other modes of climate variability.
This paper describes the development of a training module for the use of LCAT and its integration into CRT. An iterative approach was used that incorporates specific examples of decision making, developed while working with subject matter experts within the water supply community. The recommended strategy is a "stepping stone" learning structure that builds users' knowledge of best practices for using LCAT.
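The composite analysis described above can be sketched in a few lines: average a local climate variable separately over years falling in different ENSO phases, then compare the composites. Everything below is hypothetical for illustration (the station values and the ENSO-phase classification are invented); it is not LCAT's actual implementation.

```python
# Hypothetical sketch of an ENSO composite analysis: compare the mean of a
# local climate variable across years grouped by ENSO phase.
from statistics import mean

# Hypothetical winter-mean temperatures (deg F) for one station, keyed by year.
temps = {1997: 54.2, 1998: 56.1, 1999: 52.8, 2000: 53.0,
         2010: 55.4, 2011: 52.5, 2015: 56.8, 2016: 55.9}

# Assumed (illustrative) ENSO phase classification for those winters.
el_nino_years = {1997, 1998, 2010, 2015, 2016}
la_nina_years = {1999, 2000, 2011}

def composite(years):
    """Mean of the climate variable over the given set of years."""
    return mean(temps[y] for y in years if y in temps)

el_nino_mean = composite(el_nino_years)
la_nina_mean = composite(la_nina_years)
signal = el_nino_mean - la_nina_mean  # composite difference between phases
```

In a real study the composite difference would be tested for statistical significance before being offered to decision makers.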
The Azimuth Structure of Nuclear Collisions — I
NASA Astrophysics Data System (ADS)
Trainor, Thomas A.; Kettler, David T.
We describe azimuth structure commonly associated with elliptic and directed flow in the context of 2D angular autocorrelations for the purpose of precise separation of so-called nonflow (mainly minijets) from flow. We extend the Fourier-transform description of azimuth structure to include power spectra and autocorrelations related by the Wiener-Khintchine theorem. We analyze several examples of conventional flow analysis in that context and question the relevance of reaction plane estimation to flow analysis. We introduce the 2D angular autocorrelation with examples from data analysis and describe a simulation exercise which demonstrates precise separation of flow and nonflow using the 2D autocorrelation method. We show that an alternative correlation measure based on Pearson's normalized covariance provides a more intuitive measure of azimuth structure.
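The Wiener-Khintchine relation invoked above can be checked numerically: the circular autocorrelation of a signal equals the inverse Fourier transform of its power spectrum. A minimal sketch using a plain O(N²) DFT on an invented signal (illustrative only, not the authors' analysis code):

```python
# Verify the Wiener-Khintchine theorem for a short periodic signal:
# IDFT(|DFT(x)|^2) equals the circular autocorrelation of x.
import cmath

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)) / n
            for k in range(n)]

def circular_autocorr(x):
    """Lag-by-lag definition: r[m] = sum_k x[k] * x[(k+m) mod n]."""
    n = len(x)
    return [sum(x[k] * x[(k + lag) % n] for k in range(n)) for lag in range(n)]

signal = [0.0, 1.0, 0.0, -1.0, 0.5, 1.0, -0.5, -1.0]  # arbitrary test signal
power = [abs(c) ** 2 for c in dft(signal)]    # power spectrum
via_wk = [c.real for c in idft(power)]        # autocorrelation via the theorem
direct = circular_autocorr(signal)            # autocorrelation by definition
```

The two lists agree to machine precision, which is the relationship the authors exploit when moving between power spectra and angular autocorrelations.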
NASA Technical Reports Server (NTRS)
Freisinger, R. R.; Petersen, B. T.
1981-01-01
The assumption that teachers of technical writing agree on a definition of good writing was found to be without basis. The resolution of disagreements arising from close reading and textual analysis is described. Writing samples were requested from corporate sources including IBM, ALCOA, Exxon, Weyerhaeuser, Bell Labs, Underwriters Laboratories, Dow Chemical, and US Steel. A mixture of informative and persuasive examples, directed at both lay and specialist audiences, was received. Analyses of 16 writing samples are reported, covering the word, sentence, and paragraph levels. Features analyzed include syllabism, verb selection, nominalizations, vocabulary choices, t-units, subordination, sentence and clause length, syntactic order, patterns of development, topic sentences, propositional order, and transitions.
ERIC Educational Resources Information Center
Alonzo, Julie; Tindal, Gerald
2011-01-01
This technical document provides guidance to educators on the creation and interpretation of survey instruments, particularly as they relate to an analysis of program implementation. Illustrative examples are drawn from a survey of educators related to the use of the easyCBM learning system. This document includes specific sections on…
29 CFR 4.51 - Prevailing in the locality determinations.
Code of Federal Regulations, 2010 CFR
2010-07-01
... applied in a given case will be determined after a careful analysis of the overall survey, separate... where, after analysis, it is determined that the median is not a reliable indicator. Examples where the mean may be used include situations where: (1) The number of workers studied for the job classification...
A Tutorial on Conducting Meta-Analyses of Clinical Outcome Research.
ERIC Educational Resources Information Center
Robey, Randall R.; Dalebout, Susan D.
1998-01-01
The purpose of this tutorial is to enhance the familiarity and accessibility of meta-analyses in the domains of audiology and speech-language pathology for investigating questions of treatment efficacy and treatment effectiveness. Steps to conducting a meta-analysis are explained and an example of meta-analysis using published data is included.…
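The core combining step of a fixed-effect meta-analysis of the kind the tutorial explains can be sketched briefly: weight each study's effect size by its inverse variance and pool. The effect sizes and variances below are hypothetical, not drawn from the tutorial's published data.

```python
# Fixed-effect pooling of per-study effect sizes with inverse-variance
# weights (hypothetical numbers, for illustration only).
studies = [  # (effect size d, variance of d)
    (0.40, 0.04),
    (0.25, 0.09),
    (0.60, 0.02),
]

weights = [1.0 / var for _, var in studies]          # inverse-variance weights
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
pooled_var = 1.0 / sum(weights)                      # variance of pooled effect
```

Precise studies (small variance) dominate the pooled estimate, which is why it lands nearer 0.60 than a simple average of the three effect sizes would.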
Species-level Analysis of Biological Literature for Storage and Retrieval
ERIC Educational Resources Information Center
Shervis, L. J.; And Others
1972-01-01
Describes an information retrieval system in entomology which could also be used for other biological literature. With the examples of coding information into the system, a user might get some idea of how to search and what kind of information might be found. No cost analysis for running the program is included. (PS)
Giles, Tracey M; de Lacey, Sheryl; Muir-Cochrane, Eimear
2016-01-01
Grounded theory method has been described extensively in the literature. Yet, the varying processes portrayed can be confusing for novice grounded theorists. This article provides a worked example of the data analysis phase of a constructivist grounded theory study that examined family presence during resuscitation in acute health care settings. Core grounded theory methods are exemplified, including initial and focused coding, constant comparative analysis, memo writing, theoretical sampling, and theoretical saturation. The article traces the construction of the core category "Conditional Permission" from initial and focused codes, subcategories, and properties, through to its position in the final substantive grounded theory.
The Contribution of Particle Swarm Optimization to Three-Dimensional Slope Stability Analysis
Kalatehjari, Roohollah; Rashid, Ahmad Safuan A; Ali, Nazri; Hajihassani, Mohsen
2014-01-01
Over the last few years, particle swarm optimization (PSO) has been extensively applied in various geotechnical engineering including slope stability analysis. However, this contribution was limited to two-dimensional (2D) slope stability analysis. This paper applied PSO in three-dimensional (3D) slope stability problem to determine the critical slip surface (CSS) of soil slopes. A detailed description of adopted PSO was presented to provide a good basis for more contribution of this technique to the field of 3D slope stability problems. A general rotating ellipsoid shape was introduced as the specific particle for 3D slope stability analysis. A detailed sensitivity analysis was designed and performed to find the optimum values of parameters of PSO. Example problems were used to evaluate the applicability of PSO in determining the CSS of 3D slopes. The first example presented a comparison between the results of PSO and PLAXIS 3D finite element software and the second example compared the ability of PSO to determine the CSS of 3D slopes with other optimization methods from the literature. The results demonstrated the efficiency and effectiveness of PSO in determining the CSS of 3D soil slopes. PMID:24991652
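A minimal PSO sketch of the kind the paper applies, here minimizing a simple 2D quadratic as a stand-in for the factor-of-safety evaluation of a candidate slip surface. The swarm size, iteration count, and coefficients are common textbook values, not the paper's tuned parameters.

```python
# Minimal particle swarm optimization on a 2D test function whose
# minimum is at (1, -2); a stand-in for a slope-stability objective.
import random

random.seed(0)

def objective(x, y):
    # Placeholder for evaluating a candidate slip surface's factor of safety.
    return (x - 1.0) ** 2 + (y + 2.0) ** 2

n_particles, n_iters = 20, 60
w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive, social coefficients

pos = [[random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(n_particles)]
vel = [[0.0, 0.0] for _ in range(n_particles)]
pbest = [p[:] for p in pos]                      # each particle's best position
pbest_val = [objective(*p) for p in pos]
g = min(range(n_particles), key=lambda i: pbest_val[i])
gbest, gbest_val = pbest[g][:], pbest_val[g]     # swarm-wide best

for _ in range(n_iters):
    for i in range(n_particles):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        val = objective(*pos[i])
        if val < pbest_val[i]:
            pbest[i], pbest_val[i] = pos[i][:], val
            if val < gbest_val:
                gbest, gbest_val = pos[i][:], val
```

The paper's contribution is the search space, a rotating ellipsoidal slip surface in 3D, rather than the PSO update rule itself, which is the standard form shown here.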
Symmetry analysis for hyperbolic equilibria using a TB/dengue fever model
NASA Astrophysics Data System (ADS)
Massoukou, R. Y. M.'Pika; Govinder, K. S.
2016-08-01
We investigate the interplay between Lie symmetry analysis and dynamical systems analysis. As an example, we take a toy model describing the spread of TB and dengue fever. We first undertake a comprehensive dynamical systems analysis, including a discussion of local stability. For those regions in which such analyses cannot be translated to global behavior, we undertake a Lie symmetry analysis. It is shown that the Lie analysis can be useful in providing information for systems where the (local) dynamical systems analysis breaks down.
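The local-stability step of such a dynamical systems analysis amounts to linearizing at an equilibrium and inspecting the Jacobian's trace and determinant. A sketch for a hypothetical 2D system (not the TB/dengue model itself):

```python
# Local stability of a toy 2D system via its Jacobian:
#   x' = x(1 - x - y),  y' = y(x - 0.5)
# which has an equilibrium at (0.5, 0.5).
x0, y0 = 0.5, 0.5

# Jacobian entries evaluated at the equilibrium (by hand differentiation):
a = 1 - 2 * x0 - y0   # d(x')/dx = 1 - 2x - y
b = -x0               # d(x')/dy = -x
c = y0                # d(y')/dx = y
d = x0 - 0.5          # d(y')/dy = x - 0.5

trace, det = a + d, a * d - b * c
disc = trace ** 2 - 4 * det  # discriminant of the eigenvalue equation
# trace < 0 and det > 0  => locally asymptotically stable;
# disc < 0 additionally  => complex eigenvalues, i.e. a stable spiral.
```

As the abstract notes, conclusions from this linearization hold only near the equilibrium; it is precisely where such local results cannot be extended globally that the Lie symmetry analysis is brought in.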
Screening combinatorial arrays of inorganic materials with spectroscopy or microscopy
Schultz, Peter G.; Xiang, Xiaodong; Goldwasser, Isy
2004-02-03
Methods and apparatus for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, non-biological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties.
Combinatorial synthesis and screening of non-biological polymers
Schultz, Peter G.; Xiang, Xiao-Dong; Goldwasser, Isy; Briceno, Gabriel; Sun, Xiao-Dong; Wang, Kai-An
2006-04-25
Methods and apparatus for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, non-biological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties.
Combinatorial synthesis of novel materials
Schultz, Peter G.; Xiang, Xiaodong; Goldwasser, Isy
1999-01-01
Methods and apparatus for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, non-biological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties.
Combinatorial synthesis of organometallic materials
Schultz, Peter G.; Xiang, Xiaodong; Goldwasser, Isy
2002-07-16
Methods and apparatus for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, non-biological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties.
Polymer arrays from the combinatorial synthesis of novel materials
Schultz, Peter G.; Xiang, Xiao-Dong; Goldwasser, Isy; Briceno, Gabriel; Sun, Xiao-Dong
2004-09-21
Methods and apparatus for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, non-biological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties.
Giant magnetoresistive cobalt oxide compounds
Schultz, Peter G.; Xiang, Xiaodong; Goldwasser, Isy
1998-01-01
Methods and apparatus for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, non-biological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties.
Combinatorial synthesis of novel materials
Schultz, Peter G.; Xiang, Xiaodong; Goldwasser, Isy
2002-02-12
Methods and apparatus for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, non-biological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties.
Giant magnetoresistive cobalt oxide compounds
Schultz, P.G.; Xiang, X.; Goldwasser, I.
1998-07-07
Methods and apparatus are disclosed for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, non-biological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties. 58 figs.
Preparation and screening of crystalline inorganic materials
Schultz, Peter G. [La Jolla, CA]; Xiang, Xiaodong [Danville, CA]; Goldwasser, Isy [Palo Alto, CA]; Briceño, Gabriel; Sun, Xiao-Dong [Fremont, CA]; Wang, Kai-An [Cupertino, CA]
2008-10-28
Methods and apparatus for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, non-biological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties.
Synthesis and screening combinatorial arrays of zeolites
Schultz, Peter G.; Xiang, Xiaodong; Goldwasser, Isy
2003-11-18
Methods and apparatus for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, non-biological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties.
Combinatorial synthesis of novel materials
Schultz, Peter G.; Xiang, Xiaodong; Goldwasser, Isy
1999-12-21
Methods and apparatus for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, non-biological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties.
Combinatorial synthesis of novel materials
Schultz, Peter G.; Xiang, Xiaodong; Goldwasser, Isy
2001-01-01
Methods and apparatus for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, non-biological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties.
Combinatorial screening of inorganic and organometallic materials
Schultz, Peter G.; Xiang, Xiaodong; Goldwasser, Isy
2002-01-01
Methods and apparatus for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, non-biological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties.
Preparation and screening of crystalline zeolite and hydrothermally-synthesized materials
Schultz, Peter G.; Xiang, Xiaodong; Goldwasser, Isy; Briceno, Gabriel; Sun, Xiao-Dong; Wang, Kai-An
2005-03-08
Methods and apparatus for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, non-biological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties.
Space physics education via examples in the undergraduate physics curriculum
NASA Astrophysics Data System (ADS)
Martin, R.; Holland, D. L.
2011-12-01
The field of space physics is rich with examples of basic physics and analysis techniques, yet it is rarely seen in physics courses or textbooks. As space physicists in an undergraduate physics department we like to use research to inform teaching, and we find that students respond well to examples from magnetospheric science. While we integrate examples into general education courses as well, this talk will focus on physics major courses. Space physics examples are typically selected to illustrate a particular concept or method taught in the course. Four examples will be discussed, from an introductory electricity and magnetism course, a mechanics/nonlinear dynamics course, a computational physics course, and a plasma physics course. Space physics provides examples of many concepts from introductory E&M, including the application of Faraday's law to terrestrial magnetic storm effects and the use of the basic motion of charged particles as a springboard to discussion of the inner magnetosphere and the aurora. In the mechanics and nonlinear dynamics courses, the motion of charged particles in a magnetotail current sheet magnetic field is treated as a Newtonian dynamical system, illustrating the Poincaré surface-of-section technique, the partitioning of phase space, and the KAM theorem. Neural network time series analysis of AE data is used as an example in the computational physics course. Finally, among several examples, current sheet particle dynamics is utilized in the plasma physics course to illustrate the notion of adiabatic/guiding center motion and the breakdown of the adiabatic approximation. We will present short descriptions of our pedagogy and student assignments in this "backdoor" method of space physics education.
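The charged-particle examples above rest on integrating the Lorentz force; a standard classroom choice is the Boris rotation, sketched here for a uniform field with q = m = 1 and E = 0. This is an illustration of the technique, not the authors' course material.

```python
# Boris rotation for a charged particle in a uniform magnetic field
# (B along z, E = 0, q = m = 1). The update is an exact rotation of the
# velocity, so it conserves speed and produces gyration about B.
import math

B = (0.0, 0.0, 1.0)
dt = 0.05

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def boris_step(v):
    """One Boris velocity update: rotate v about B by the half-angle construction."""
    t = tuple(0.5 * dt * c for c in B)
    s_fac = 2.0 / (1.0 + sum(c * c for c in t))
    s = tuple(s_fac * c for c in t)
    v_prime = tuple(vc + cc for vc, cc in zip(v, cross(v, t)))
    return tuple(vc + cc for vc, cc in zip(v, cross(v_prime, s)))

v = (1.0, 0.0, 0.2)                          # perpendicular + parallel velocity
speed0 = math.sqrt(sum(c * c for c in v))
pos = (0.0, 0.0, 0.0)
for _ in range(500):
    v = boris_step(v)
    pos = tuple(p + dt * vc for p, vc in zip(pos, v))
speed = math.sqrt(sum(c * c for c in v))
```

The trajectory is a helix: x and y stay bounded by the gyroradius while z advances uniformly with the parallel velocity, the picture that underlies the guiding-center discussion mentioned for the plasma physics course.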
Strong smoker interest in 'setting an example to children' by quitting: national survey data.
Thomson, George; Wilson, Nick; Weerasekera, Deepa; Edwards, Richard
2011-02-01
To further explore smoker views on reasons to quit. As part of the multi-country ITC Project, a national sample of 1,376 New Zealand adult (18+ years) smokers was surveyed in 2007/08. This sample included boosted sampling of Māori, Pacific and Asian New Zealanders. 'Setting an example to children' was given as 'very much' a reason to quit by 51%, compared to 45% giving personal health concerns. However, the 'very much' and 'somewhat' responses (combined) were greater for personal health (81%) than for 'setting an example to children' (74%). Price was the third-ranked reason (67%). In a multivariate analysis, women were significantly more likely to state that 'setting an example to children' was 'very much' or 'somewhat' a reason to quit, as were Māori or Pacific respondents compared to European respondents, and those suffering financial stress. The relatively high importance of 'example to children' as a reason to quit is an unusual finding, and may have arisen as a result of social marketing campaigns encouraging cessation to protect families in New Zealand. The policy implications could include a need for a greater emphasis on social reasons (e.g. 'example to children') in pack warnings and in social marketing for smoking cessation. © 2011 The Authors. ANZJPH © 2010 Public Health Association of Australia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paap, G.C.
1991-03-01
From general equations which describe the transient electromechanical behavior of the asynchronous squirrel-cage motor, and which include the influence of space harmonics and mutual slotting, simplified models are derived and compared. The models derived are demonstrated in examples where special attention is paid to the influence of the place of the harmonics in the mutual inductance matrix and the influence of mutual slotting. Further, the steady-state equations are derived and the back-transformation for the stator and rotor currents is given. One example is compared with the result of measurements.
Engineering issues for hand-held sensing devices, with examples
NASA Astrophysics Data System (ADS)
Freiwald, David A.; Freiwald, Joyce
1994-03-01
It is now U.S. defense policy that there will be no new platform starts. The emphasis for platforms will be on O&M cost reduction, life-extension improvements, and force-multiplier- device upgrades. There is also an increasing emphasis on hand-held force-multiplier devices for individuals, which is the focus of this paper. Engineering issues include operations analysis, weight, cube, cost, prime power, ease of use, data storage, reliability, fault tolerance, data communications and human factors. Two examples of hand-held devices are given. Applications include USMC, Army, SOCOM, DEA, FBI, SS, Border Patrol and others. Barriers to adoption of such technology are also discussed.
Analysis of precision and accuracy in a simple model of machine learning
NASA Astrophysics Data System (ADS)
Lee, Julian
2017-12-01
Machine learning is a procedure where a model of the world is constructed from a training set of examples. It is important that the model capture relevant features of the training set and, at the same time, make correct predictions for examples not included in the training set. I consider polynomial regression, the simplest method of learning, and analyze the accuracy and precision for different levels of model complexity.
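The trade-off the abstract describes can be sketched numerically. In this hedged illustration (the cubic target function, the noise level, and the chosen degrees are my assumptions, not values from the paper), polynomials of increasing degree are fit to noisy training data and evaluated on held-out examples:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: noisy samples of a cubic target function.
x_train = np.linspace(-1, 1, 20)
y_train = x_train**3 - x_train + rng.normal(0.0, 0.05, x_train.size)

# Held-out examples not included in the training set (noise-free target).
x_test = np.linspace(-1, 1, 101)
y_test = x_test**3 - x_test

errs = {}
for degree in (1, 3, 12):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    errs[degree] = (train_mse, test_mse)
```

Degree 1 underfits (large error on both sets); degree 3 matches the underlying function; a much higher degree typically drives training error still lower while held-out error grows, which is the precision-versus-accuracy tension the paper analyzes.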
NASA Astrophysics Data System (ADS)
Herzfeld, U. C.; Hunke, E. C.; Trantow, T.; Greve, R.; McDonald, B.; Wallin, B.
2014-12-01
Understanding of the state of the cryosphere and its relationship to other components of the Earth system requires both models of geophysical processes and observations of geophysical properties and processes; however, linking observations and models is far from trivial. This paper looks at examples from sea ice and land ice model-observation linkages to examine some approaches, challenges and solutions. In a sea-ice example, ice deformation is analyzed as a key process that indicates fundamental changes in the Arctic sea ice cover. Simulation results from the Los Alamos Sea-Ice Model CICE, which is also the sea-ice component of the Community Earth System Model (CESM), are compared to parameters indicative of deformation as derived from mathematical analysis of remote sensing data. Data include altimeter, micro-ASAR and image data from manned and unmanned aircraft campaigns (NASA OIB and Characterization of Arctic Sea Ice Experiment, CASIE). The key problem in linking data and model results is the derivation of matching parameters on both the model and observation side. For terrestrial glaciology, we include an example of a surge process in a glacier system and an example of a dynamic ice sheet model for Greenland. To investigate the surge of the Bering Bagley Glacier System, we use numerical forward modeling experiments and, on the data analysis side, a connectionist approach to analyze crevasse provinces. In the Greenland ice sheet example, we look at the influence of ice surface and bed topography, as derived from remote sensing data, on results from a dynamic ice sheet model.
Liver CT image processing: a short introduction of the technical elements.
Masutani, Y; Uozumi, K; Akahane, Masaaki; Ohtomo, Kuni
2006-05-01
In this paper, we describe the technical aspects of image analysis for liver diagnosis and treatment, including the state-of-the-art of liver image analysis and its applications. After discussion on modalities for liver image analysis, various technical elements for liver image analysis such as registration, segmentation, modeling, and computer-assisted detection are covered with examples performed with clinical data sets. Perspective in the imaging technologies is also reviewed and discussed.
NASTRAN user's guide: Level 15
NASA Technical Reports Server (NTRS)
1975-01-01
The NASTRAN structural analysis system is presented. This user's guide is an essential addition to the original four NASTRAN manuals. Clear, brief descriptions of capabilities with example input are included, with references to the location of more complete information.
An Economic Analysis of a Change in an Excise Tax
ERIC Educational Resources Information Center
Barron, John M.; Blanchard, Kelly Hunt; Umbeck, John R.
2004-01-01
The authors present an example of the effect a change in the excise tax can have on retail gasoline prices. The findings provide support for standard economic theory, as well as provide a vehicle for illustrating some of the subtleties of the analysis, including the implicit assumptions regarding the implications for the buying and selling prices…
Benefits of Using Planned Comparisons Rather Than Post Hoc Tests: A Brief Review with Examples.
ERIC Educational Resources Information Center
DuRapau, Theresa M.
The rationale behind analysis of variance (including analysis of covariance and multiple analyses of variance and covariance) methods is reviewed, and unplanned and planned methods of evaluating differences between means are briefly described. Two advantages of using planned or a priori tests over unplanned or post hoc tests are presented. In…
Examples of Data Analysis with SPSS/PC+ Studentware.
ERIC Educational Resources Information Center
MacFarland, Thomas W.
Intended for classroom use only, these unpublished notes contain computer lessons on descriptive statistics with files previously created in WordPerfect 4.2 and Lotus 1-2-3 Version 1.A for the IBM PC+. The statistical measures covered include Student's t-test with two independent samples; Student's t-test with a paired sample; Chi-square analysis;…
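The same analyses the lessons cover can be reproduced today with SciPy rather than SPSS/PC+; a minimal sketch, with invented data values for illustration:

```python
from scipy import stats

# Invented example data: two independent samples (Student's t test).
group_a = [12.1, 11.8, 12.5, 13.0, 12.2, 11.9, 12.7, 12.4]
group_b = [10.2, 10.8, 9.9, 10.5, 10.1, 10.7, 10.3, 10.4]
t_ind, p_ind = stats.ttest_ind(group_a, group_b)

# Paired t test: the same subjects measured twice.
before = [80, 85, 78, 90, 84, 88]
after = [76, 82, 75, 85, 80, 83]
t_rel, p_rel = stats.ttest_rel(before, after)

# Chi-square test of independence on a 2x2 contingency table.
chi2, p_chi, dof, expected = stats.chi2_contingency([[30, 10], [15, 25]])
```

Each call returns the test statistic and a p-value, mirroring the output of the corresponding SPSS procedures described in the notes.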
Code of Federal Regulations, 2010 CFR
2010-04-01
... transaction price initially reflected in the taxpayer's books and records. The results of controlled..., including an analysis of the economic and legal factors that affect the pricing of its property or services... the economic analysis and projections relied upon in developing the method. For example, if a profit...
Skylab S-191 spectrometer single spectral scan analysis program. [user manual
NASA Technical Reports Server (NTRS)
Downes, E. L.
1974-01-01
Documentation and user information for the S-191 single spectral scan analysis program are reported. A breakdown of the computational algorithms is supplied, followed by the program listing and examples of sample output. A copy of the flow chart which describes the driver routine in the body of the main program segment is included.
Infrared thermography for examination of paper structure
NASA Astrophysics Data System (ADS)
Kiiskinen, Harri T.; Pakarinen, Pekka I.
1998-03-01
The paper industry has used IR cameras primarily for troubleshooting, where the most common examples include the examination of the condition of dryer fabrics and dryer cylinders and the analysis of moisture variations in a paper web. Another application extensively using IR thermography is non-destructive testing of composite materials. This paper presents some recently developed laboratory methods using an IR camera to examine paper structure. Specific areas include cockling, moisture content, thermal uniformity, mechanism of failure, and an analysis of the copying process.
Space radiator simulation manual for computer code
NASA Technical Reports Server (NTRS)
Black, W. Z.; Wulff, W.
1972-01-01
A computer program that simulates the performance of a space radiator is presented. The program basically consists of a rigorous analysis which analyzes a symmetrical fin panel and an approximate analysis that predicts system characteristics for cases of non-symmetrical operation. The rigorous analysis accounts for both transient and steady state performance including aerodynamic and radiant heating of the radiator system. The approximate analysis considers only steady state operation with no aerodynamic heating. A description of the radiator system and instructions to the user for program operation is included. The input required for the execution of all program options is described. Several examples of program output are contained in this section. Sample output includes the radiator performance during ascent, reentry and orbit.
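The steady-state, no-aerodynamic-heating case covered by the approximate analysis can be sketched as a one-dimensional fin radiating from both faces to deep space. This is a minimal stand-in, not the program's method, and every property value below (aluminum-like conductivity, thickness, emissivity, base temperature) is an assumption for illustration:

```python
import numpy as np

SIGMA = 5.670e-8                   # Stefan-Boltzmann constant, W/m^2 K^4
k, th, eps = 170.0, 0.002, 0.85    # assumed conductivity, thickness, emissivity
L, n = 0.3, 61                     # fin length (m) and grid points
T_base = 400.0                     # fixed base temperature, K

x = np.linspace(0.0, L, n)
dx = x[1] - x[0]
T = np.full(n, T_base)

# Steady fin equation  k*th*T'' = 2*eps*SIGMA*T^4  (radiating from both faces),
# discretized centrally and relaxed with 50% damping for stability.
for _ in range(20000):
    Tn = T.copy()
    Tn[1:-1] = 0.5 * (T[:-2] + T[2:]
                      - dx**2 * 2 * eps * SIGMA * T[1:-1]**4 / (k * th))
    Tn[-1] = Tn[-2]                # adiabatic tip
    Tn[0] = T_base                 # fixed base
    T = 0.5 * T + 0.5 * Tn
```

The damped relaxation converges slowly but reliably for this mildly nonlinear problem; the production code's rigorous analysis additionally handles transients and aerodynamic and radiant heating.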
Simulation Tools for Forest Health Analysis: An Application in the Red River Watershed, Idaho
Andrew J. McMahan; Eric L. Smith
2006-01-01
Software tools for landscape analyses--including FVS model extensions, and a number of FVS-related pre- and post-processing "tools"--are presented, using an analysis in the Red River Watershed, Nez Perce National Forest as an example. We present (1) a discussion of pre-simulation data analysis; (2) the Physiographic Information Extraction System (PIES), a tool that can...
45 CFR 96.125 - Primary prevention.
Code of Federal Regulations, 2013 CFR
2013-10-01
... following: (i) Clearinghouse/information resource center(s); (ii) Resource directories; (iii) Media... under this strategy aim to affect critical life and social skills, including decision-making, refusal skills, critical analysis (e.g. of media messages) and systematic judgment abilities. Examples of...
45 CFR 96.125 - Primary prevention.
Code of Federal Regulations, 2011 CFR
2011-10-01
... following: (i) Clearinghouse/information resource center(s); (ii) Resource directories; (iii) Media... under this strategy aim to affect critical life and social skills, including decision-making, refusal skills, critical analysis (e.g. of media messages) and systematic judgment abilities. Examples of...
45 CFR 96.125 - Primary prevention.
Code of Federal Regulations, 2014 CFR
2014-10-01
... following: (i) Clearinghouse/information resource center(s); (ii) Resource directories; (iii) Media... under this strategy aim to affect critical life and social skills, including decision-making, refusal skills, critical analysis (e.g. of media messages) and systematic judgment abilities. Examples of...
45 CFR 96.125 - Primary prevention.
Code of Federal Regulations, 2012 CFR
2012-10-01
... following: (i) Clearinghouse/information resource center(s); (ii) Resource directories; (iii) Media... under this strategy aim to affect critical life and social skills, including decision-making, refusal skills, critical analysis (e.g. of media messages) and systematic judgment abilities. Examples of...
ERIC Educational Resources Information Center
Grosser, Arthur E.
1984-01-01
Suggests chemistry of cooking and analysis of culinary recipes as subject matter for introducing chemistry to an audience, especially to individuals with neutral or negative attitudes toward science. Includes sample recipes and experiments and a table listing scientific topics with related cooking examples. (JN)
Knowledge Representation Standards and Interchange Formats for Causal Graphs
NASA Technical Reports Server (NTRS)
Throop, David R.; Malin, Jane T.; Fleming, Land
2005-01-01
In many domains, automated reasoning tools must represent graphs of causally linked events. These include fault-tree analysis, probabilistic risk assessment (PRA), planning, procedures, medical reasoning about disease progression, and functional architectures. Each of these fields has its own requirements for the representation of causation, events, actors and conditions. The representations include ontologies of function and cause, data dictionaries for causal dependency, failure and hazard, and interchange formats between some existing tools. In none of the domains has a generally accepted interchange format emerged. The paper makes progress towards interoperability across the wide range of causal analysis methodologies. We survey existing practice and emerging interchange formats in each of these fields. Setting forth a set of terms and concepts that are broadly shared across the domains, we examine the several ways in which current practice represents them. Some phenomena are difficult to represent or to analyze in several domains. These include mode transitions, reachability analysis, positive and negative feedback loops, conditions correlated but not causally linked and bimodal probability distributions. We work through examples and contrast the differing methods for addressing them. We detail recent work in knowledge interchange formats for causal trees in aerospace analysis applications in early design, safety and reliability. Several examples are discussed, with a particular focus on reachability analysis and mode transitions. We generalize the aerospace analysis work across the several other domains. We also recommend features and capabilities for the next generation of causal knowledge representation standards.
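As a small illustration of the reachability analysis discussed above, a causal graph can be stored as an adjacency map and queried with breadth-first search to answer "which downstream events can this fault cause?" The event names and links here are hypothetical, not taken from any of the surveyed tools:

```python
from collections import deque

# Hypothetical causal graph: each key is an event, values are events it can cause.
causes = {
    "valve_stuck": ["low_flow"],
    "low_flow": ["pump_cavitation", "overheat"],
    "pump_cavitation": ["pump_failure"],
    "overheat": ["pump_failure"],
    "pump_failure": [],
    "sensor_drift": ["false_alarm"],
    "false_alarm": [],
}

def reachable(graph, start):
    """All events reachable from `start` by following causal links (BFS)."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {start}

print(sorted(reachable(causes, "valve_stuck")))
# → ['low_flow', 'overheat', 'pump_cavitation', 'pump_failure']
```

An interchange format that preserves this link structure lets any of the surveyed tools recompute such effect closures, which is one reason reachability is a useful common denominator across the domains.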
Reduction procedures for accurate analysis of MSX surveillance experiment data
NASA Technical Reports Server (NTRS)
Gaposchkin, E. Mike; Lane, Mark T.; Abbot, Rick I.
1994-01-01
Technical challenges of the Midcourse Space Experiment (MSX) science instruments require careful characterization and calibration of these sensors for analysis of surveillance experiment data. Procedures for reduction of Resident Space Object (RSO) detections will be presented which include refinement and calibration of the metric and radiometric (and photometric) data and calculation of a precise MSX ephemeris. Examples will be given which support the reduction, and these are taken from ground-test data similar in characteristics to the MSX sensors and from the IRAS satellite RSO detections. Examples to demonstrate the calculation of a precise ephemeris will be provided from satellites in similar orbits which are equipped with S-band transponders.
Analysis of crack propagation as an energy absorption mechanism in metal matrix composites
NASA Technical Reports Server (NTRS)
Adams, D. F.; Murphy, D. P.
1981-01-01
The crack initiation and crack propagation capability was extended to the previously developed generalized plane strain, finite element micromechanics analysis. Also, an axisymmetric analysis was developed, which contains all of the general features of the plane analysis, including elastoplastic material behavior, temperature-dependent material properties, and crack propagation. These analyses were used to generate various example problems demonstrating the inelastic response of, and crack initiation and propagation in, a boron/aluminum composite.
The NASA Monographs on Shell Stability Design Recommendations: A Review and Suggested Improvements
NASA Technical Reports Server (NTRS)
Nemeth, Michael P.; Starnes, James H., Jr.
1998-01-01
A summary of existing NASA design criteria monographs for the design of buckling-resistant thin-shell structures is presented. Subsequent improvements in the analysis for nonlinear shell response are reviewed, and current issues in shell stability analysis are discussed. Examples of nonlinear shell responses that are not included in the existing shell design monographs are presented, and an approach for including reliability based analysis procedures in the shell design process is discussed. Suggestions for conducting future shell experiments are presented, and proposed improvements to the NASA shell design criteria monographs are discussed.
Chaudhry, Jehanzeb Hameed; Estep, Don; Tavener, Simon; Carey, Varis; Sandelin, Jeff
2016-01-01
We consider numerical methods for initial value problems that employ a two stage approach consisting of solution on a relatively coarse discretization followed by solution on a relatively fine discretization. Examples include adaptive error control, parallel-in-time solution schemes, and efficient solution of adjoint problems for computing a posteriori error estimates. We describe a general formulation of two stage computations then perform a general a posteriori error analysis based on computable residuals and solution of an adjoint problem. The analysis accommodates various variations in the two stage computation and in formulation of the adjoint problems. We apply the analysis to compute "dual-weighted" a posteriori error estimates, to develop novel algorithms for efficient solution that take into account cancellation of error, and to the Parareal Algorithm. We test the various results using several numerical examples.
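A deliberately minimal sketch of the coarse-then-fine pattern, using forward Euler on y' = -y; this stands in for, and is far simpler than, the paper's adjoint-based a posteriori machinery:

```python
import math

def euler(f, y0, t0, t1, n):
    """Forward Euler with n uniform steps; returns the final value."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

f = lambda t, y: -y                    # exact solution: y(1) = exp(-1)
coarse = euler(f, 1.0, 0.0, 1.0, 10)   # stage 1: coarse discretization
fine = euler(f, 1.0, 0.0, 1.0, 1000)   # stage 2: fine discretization
indicator = abs(fine - coarse)         # computable surrogate for the coarse error
```

Here the fine solve plays the role of the reference: the indicator (about 0.019) closely tracks the true coarse error |exp(-1) - coarse|. The paper's analysis replaces this brute-force comparison with adjoint solves that weight computable residuals.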
Barreto, Goncalo; Soininen, Antti; Sillat, Tarvo; Konttinen, Yrjö T; Kaivosoja, Emilia
2014-01-01
Time-of-flight secondary ion mass spectrometry (ToF-SIMS) is increasingly being used in the analysis of biological samples. For example, it has been applied to distinguish healthy and osteoarthritic human cartilage. This chapter discusses the ToF-SIMS principle and instrumentation, including the three modes of analysis in ToF-SIMS. ToF-SIMS sets certain requirements for the samples to be analyzed; for example, the samples have to be vacuum compatible. Accordingly, sample processing steps for different biological samples, i.e., proteins, cells, frozen and paraffin-embedded tissues, and extracellular matrix, for ToF-SIMS are presented. Multivariate analysis of the ToF-SIMS data and the necessary data preprocessing steps (peak selection, data normalization, mean-centering, and scaling and transformation) are discussed in this chapter.
Sample size considerations for clinical research studies in nuclear cardiology.
Chiuzan, Cody; West, Erin A; Duong, Jimmy; Cheung, Ken Y K; Einstein, Andrew J
2015-12-01
Sample size calculation is an important element of research design that investigators need to consider in the planning stage of the study. Funding agencies and research review panels request a power analysis, for example, to determine the minimum number of subjects needed for an experiment to be informative. Calculating the right sample size is crucial to gaining accurate information and ensures that research resources are used efficiently and ethically. The simple question "How many subjects do I need?" does not always have a simple answer. Before calculating the sample size requirements, a researcher must address several aspects, such as purpose of the research (descriptive or comparative), type of samples (one or more groups), and data being collected (continuous or categorical). In this article, we describe some of the most frequent methods for calculating the sample size with examples from nuclear cardiology research, including for t tests, analysis of variance (ANOVA), non-parametric tests, correlation, Chi-squared tests, and survival analysis. For the ease of implementation, several examples are also illustrated via user-friendly free statistical software.
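For the two-sample comparison the article describes, the usual normal-approximation formula is n per group = 2((z_{1-α/2} + z_{1-β})·σ/δ)². A minimal sketch, with an invented example rather than one from the article:

```python
from statistics import NormalDist
from math import ceil

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sample t test,
    detecting a mean difference `delta` with common standard deviation `sigma`."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # desired power
    return ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

# Assumed example: detect a 5-unit difference with sigma = 10,
# alpha = 0.05 two-sided, 80% power.
print(n_per_group(5, 10))  # → 63
```

Exact t-based calculations add a few subjects to this approximation; dedicated software or the article's worked examples should be consulted for final study designs.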
First integrals and parametric solutions of third-order ODEs admitting sl(2, R)
NASA Astrophysics Data System (ADS)
Ruiz, A.; Muriel, C.
2017-05-01
A complete set of first integrals for any third-order ordinary differential equation admitting a Lie symmetry algebra isomorphic to sl(2, R) is explicitly computed. These first integrals are derived from two linearly independent solutions of a linear second-order ODE, without additional integration. The general solution in parametric form can be obtained by using the computed first integrals. The study includes a parallel analysis of the four inequivalent realizations of sl(2, R), and it is applied to several particular examples. These include the generalized Chazy equation, as well as an example of an equation which admits the most complicated of the four inequivalent realizations.
Immortal time bias in observational studies of time-to-event outcomes.
Jones, Mark; Fowler, Robert
2016-12-01
The purpose of the study is to show, through simulation and example, the magnitude and direction of immortal time bias when an inappropriate analysis is used. We compare 4 methods of analysis for observational studies of time-to-event outcomes: logistic regression, standard Cox model, landmark analysis, and time-dependent Cox model using an example data set of patients critically ill with influenza and a simulation study. For the example data set, logistic regression, standard Cox model, and landmark analysis all showed some evidence that treatment with oseltamivir provides protection from mortality in patients critically ill with influenza. However, when the time-dependent nature of treatment exposure is taken account of using a time-dependent Cox model, there is no longer evidence of a protective effect of treatment. The simulation study showed that, under various scenarios, the time-dependent Cox model consistently provides unbiased treatment effect estimates, whereas standard Cox model leads to bias in favor of treatment. Logistic regression and landmark analysis may also lead to bias. To minimize the risk of immortal time bias in observational studies of survival outcomes, we strongly suggest time-dependent exposures be included as time-dependent variables in hazard-based analyses. Copyright © 2016 Elsevier Inc. All rights reserved.
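The bias mechanism is easy to reproduce in a toy simulation (all rates below are invented): survival is independent of treatment, but treatment can only begin in patients who survive long enough to receive it, so a naive grouped comparison makes treatment look protective:

```python
import random

rng = random.Random(42)
n = 20000
follow_up = 30.0

deaths = {True: 0, False: 0}
counts = {True: 0, False: 0}
for _ in range(n):
    death = rng.expovariate(1 / 10.0)      # survival time; treatment has NO true effect
    treat_time = rng.expovariate(1 / 5.0)  # delay until treatment would start
    # "Treated" patients must survive past treat_time: the immortal time.
    treated = treat_time < death and treat_time < follow_up
    counts[treated] += 1
    deaths[treated] += death < follow_up

rate_treated = deaths[True] / counts[True]
rate_untreated = deaths[False] / counts[False]
# rate_treated comes out well below rate_untreated despite a null effect.
```

A time-dependent Cox model avoids this by classifying each patient's person-time as untreated before treatment starts and treated afterwards, which is exactly the correction the study recommends.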
Linear Discriminant Analysis on a Spreadsheet.
ERIC Educational Resources Information Center
Busbey, Arthur Bresnahan III
1989-01-01
Described is a software package, "Trapeze," within which a routine called LinDis can be used. Discussed are teaching methods, the linear discriminant model and equations, the LinDis worksheet, and an example. The set up for this routine is included. (CW)
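The worksheet's core computation can be sketched in a few lines. This is the standard two-class Fisher linear discriminant, not LinDis itself, and the cluster data are invented:

```python
import numpy as np

def fisher_lda(X0, X1):
    """Two-class Fisher discriminant: pooled within-class scatter,
    discriminant direction, and a midpoint threshold."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    w = np.linalg.solve(Sw, m1 - m0)  # direction maximizing class separation
    threshold = w @ (m0 + m1) / 2     # midpoint between projected means
    return w, threshold

def classify(x, w, threshold):
    return int(np.asarray(x) @ w > threshold)

# Invented clusters: class 0 near (0.5, 0.5), class 1 shifted by +3.
X0 = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.5, 0.5]])
X1 = X0 + 3.0
w, c = fisher_lda(X0, X1)
```

Projecting onto `w` reduces the problem to a single cut point, which is why the method fits naturally on a spreadsheet.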
34 CFR 668.10 - Direct assessment programs.
Code of Federal Regulations, 2010 CFR
2010-07-01
... such as creativity, analysis or synthesis associated with the subject matter of the program. Examples... measurement apply to direct assessment programs. Because a direct assessment program does not utilize credit... program includes regularly scheduled learning sessions, faculty-guided independent study, consultations...
Guidelines for the design of subsurface drainage systems for highway structural sections
DOT National Transportation Integrated Search
1972-06-01
Design criteria and a design method for pavement subsurface drainage systems include inflow-outflow method of analysis, open graded drainage layers, collector drains, pipe outlets and markers. Design examples are given for embankment sections, cut se...
The use of multivariate statistics in studies of wildlife habitat
David E. Capen
1981-01-01
This report contains edited and reviewed versions of papers presented at a workshop held at the University of Vermont in April 1980. Topics include sampling avian habitats, multivariate methods, applications, examples, and new approaches to analysis and interpretation.
ERIC Educational Resources Information Center
Ross, Peter
1987-01-01
Discusses intelligent tutoring systems (ITS), one application of artificial intelligence to computers used in education. Basic designs of ITSs are described; examples are given including PROUST, GREATERP, and the use of simulation with ITSs; protocol analysis is discussed; and 38 prototype ITSs are listed. (LRW)
46 CFR 501.27 - Delegation to and redelegation by the Director, Bureau of Trade Analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
... category of agreement or modification includes, for example, the following: a restatement filed to conform... agreement; a correction of typographical or grammatical errors in the text of an agreement; a change in the...
ERIC Educational Resources Information Center
Reifel, Stuart
2009-01-01
The purpose of this study was to explore an example of girls' doll play in contemporary US culture, including its virtual, political, marketing, and other contextual meanings. The narrative that provoked the analysis was a brief news report about a controversial school function--a school fund-raiser fashion show featuring American Girl doll…
NASA Technical Reports Server (NTRS)
McGowan, David M.; Anderson, Melvin S.
1998-01-01
The analytical formulation of curved-plate non-linear equilibrium equations that include transverse-shear-deformation effects is presented. A unified set of non-linear strains that contains terms from both physical and tensorial strain measures is used. Using several simplifying assumptions, linearized stability equations are derived that describe the response of the plate just after bifurcation buckling occurs. These equations are then modified to allow the plate reference surface to be located a distance z_c from the centroid surface, which is convenient for modeling stiffened-plate assemblies. The implementation of the new theory into the VICONOPT buckling and vibration analysis and optimum design program is described. Either classical plate theory (CPT) or first-order shear-deformation plate theory (SDPT) may be selected in VICONOPT. Comparisons of numerical results for several example problems with different loading states are made. Results from the new curved-plate analysis compare well with closed-form solutions and with results from known example problems in the literature. Finally, a design-optimization study of two different cylindrical shells subject to uniform axial compression is presented.
Genome Data Exploration Using Correspondence Analysis
Tekaia, Fredj
2016-01-01
Recent developments of sequencing technologies that allow the production of massive amounts of genomic and genotyping data have highlighted the need for synthetic data representation and pattern recognition methods that can mine and help discovering biologically meaningful knowledge included in such large data sets. Correspondence analysis (CA) is an exploratory descriptive method designed to analyze two-way data tables, including some measure of association between rows and columns. It constructs linear combinations of variables, known as factors. CA has been used for decades to study high-dimensional data, and remarkable inferences from large data tables were obtained by reducing the dimensionality to a few orthogonal factors that correspond to the largest amount of variability in the data. Herein, I review CA and highlight its use by considering examples in handling high-dimensional data that can be constructed from genomic and genetic studies. Examples in amino acid compositions of large sets of species (viruses, phages, yeast, and fungi) as well as an example related to pairwise shared orthologs in a set of yeast and fungal species, as obtained from their proteome comparisons, are considered. For the first time, results show striking segregations between yeasts and fungi as well as between viruses and phages. Distributions obtained from shared orthologs show clusters of yeast and fungal species corresponding to their phylogenetic relationships. A direct comparison with the principal component analysis method is discussed using a recently published example of genotyping data related to newly discovered traces of an ancient hominid that was compared to modern human populations in the search for ancestral similarities. CA offers more detailed results highlighting links between modern humans and the ancient hominid and their characterizations. 
Compared to the popular principal component analysis method, CA allows easier and more effective interpretation of results, particularly by the ability of relating individual patterns with their corresponding characteristic variables. PMID:27279736
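The core of CA can be sketched as an SVD of the standardized residuals of a contingency table. This is my own minimal condensation, applied to an invented 2x2 table, not code from the review:

```python
import numpy as np

def correspondence_analysis(N, n_factors=2):
    """CA of a two-way contingency table N via SVD of standardized residuals."""
    P = N / N.sum()                # correspondence matrix
    r = P.sum(axis=1)              # row masses
    c = P.sum(axis=0)              # column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    rows = (U * s) / np.sqrt(r)[:, None]      # principal row coordinates
    cols = (Vt.T * s) / np.sqrt(c)[:, None]   # principal column coordinates
    inertia = s**2                            # association carried by each factor
    return rows[:, :n_factors], cols[:, :n_factors], inertia[:n_factors]

# Invented contingency table.
N = np.array([[10.0, 5.0], [5.0, 10.0]])
rows, cols, inertia = correspondence_analysis(N)
```

The total inertia equals the table's chi-square statistic divided by the grand total, which is why CA factor maps decompose association between rows and columns rather than raw variance, the property that distinguishes it from PCA in the review's comparison.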
An introduction to kernel-based learning algorithms.
Müller, K R; Mika, S; Rätsch, G; Tsuda, K; Schölkopf, B
2001-01-01
This paper provides an introduction to support vector machines, kernel Fisher discriminant analysis, and kernel principal component analysis, as examples for successful kernel-based learning methods. We first give a short background about Vapnik-Chervonenkis theory and kernel feature spaces and then proceed to kernel based learning in supervised and unsupervised scenarios including practical and algorithmic considerations. We illustrate the usefulness of kernel algorithms by discussing applications such as optical character recognition and DNA analysis.
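One of the three methods, kernel PCA with an RBF kernel, can be sketched compactly in NumPy. This is my own minimal version with arbitrary parameter values, not the authors' implementation:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gaussian (RBF) kernel matrix for the rows of X."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_pca(X, n_components=2, gamma=1.0):
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one  # center in feature space
    vals, vecs = np.linalg.eigh(Kc)             # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Projected coordinates of the training points.
    return vecs * np.sqrt(np.clip(vals, 0.0, None))

# Invented data: two tight clusters in the plane.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (5, 2)), rng.normal(3.0, 0.1, (5, 2))])
Z = kernel_pca(X, n_components=2, gamma=0.5)
```

Centering the kernel matrix in feature space is the step most often omitted in naive implementations; without it, the leading component largely encodes the feature-space mean rather than the directions of maximal variance.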
Jahn, Beate; Theurl, Engelbert; Siebert, Uwe; Pfeiffer, Karl-Peter
2010-01-01
In most decision-analytic models in health care, it is assumed that there is treatment without delay and availability of all required resources. Therefore, waiting times caused by limited resources and their impact on treatment effects and costs often remain unconsidered. Queuing theory enables mathematical analysis and the derivation of several performance measures of queuing systems. Nevertheless, an analytical approach with closed formulas is not always possible. Therefore, simulation techniques are used to evaluate systems that include queuing or waiting, for example, discrete event simulation. To include queuing in decision-analytic models requires a basic knowledge of queuing theory and of the underlying interrelationships. This tutorial introduces queuing theory. Analysts and decision-makers get an understanding of queue characteristics, modeling features, and its strength. Conceptual issues are covered, but the emphasis is on practical issues like modeling the arrival of patients. The treatment of coronary artery disease with percutaneous coronary intervention including stent placement serves as an illustrative queuing example. Discrete event simulation is applied to explicitly model resource capacities, to incorporate waiting lines and queues in the decision-analytic modeling example.
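For the simplest queue with closed formulas, the M/M/1 system, the analytical measures and a short Lindley-recursion simulation can be compared directly. This is a minimal sketch of the tutorial's theme; the clinical stent-placement example requires full discrete event simulation with capacity constraints:

```python
import random

def mm1_measures(lam, mu):
    """Analytical M/M/1: utilization, mean number in system, mean sojourn, mean wait."""
    rho = lam / mu
    assert rho < 1, "queue is unstable"
    L = rho / (1 - rho)   # mean number in system (Little's law: L = lam * W)
    W = 1 / (mu - lam)    # mean time in system
    Wq = rho / (mu - lam) # mean waiting time before service
    return rho, L, W, Wq

def simulate_mm1(lam, mu, n=200_000, seed=1):
    """Estimate mean time in system for a FIFO M/M/1 queue (Lindley recursion)."""
    rng = random.Random(seed)
    arrival = server_free = total = 0.0
    for _ in range(n):
        arrival += rng.expovariate(lam)
        start = max(arrival, server_free)  # wait if the server is busy
        server_free = start + rng.expovariate(mu)
        total += server_free - arrival
    return total / n
```

At lam = 0.5 and mu = 1 the formulas give W = 2, and the simulation agrees to within a few percent; once balking, priorities, or finite capacity enter the picture, simulation becomes the practical route, which is exactly the tutorial's motivation for discrete event simulation.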
Combinatorial synthesis of inorganic or composite materials
Goldwasser, Isy; Ross, Debra A.; Schultz, Peter G.; Xiang, Xiao-Dong; Briceno, Gabriel; Sun, Xian-Dong; Wang, Kai-An
2010-08-03
Methods and apparatus for the preparation and use of a substrate having an array of diverse materials in predefined regions thereon. A substrate having an array of diverse materials thereon is generally prepared by delivering components of materials to predefined regions on a substrate, and simultaneously reacting the components to form at least two materials or, alternatively, allowing the components to interact to form at least two different materials. Materials which can be prepared using the methods and apparatus of the present invention include, for example, covalent network solids, ionic solids and molecular solids. More particularly, materials which can be prepared using the methods and apparatus of the present invention include, for example, inorganic materials, intermetallic materials, metal alloys, ceramic materials, organic materials, organometallic materials, nonbiological organic polymers, composite materials (e.g., inorganic composites, organic composites, or combinations thereof), etc. Once prepared, these materials can be screened for useful properties including, for example, electrical, thermal, mechanical, morphological, optical, magnetic, chemical, or other properties. Thus, the present invention provides methods for the parallel synthesis and analysis of novel materials having useful properties.
Why don't dentists talk to patients about oral cancer?
Awojobi, O; Newton, J T; Scott, S E
2015-05-08
Up to half of oral cancer patients are diagnosed with advanced lesions. One route to early diagnosis could involve dentists raising awareness of oral cancer through discussions with patients, emphasising prompt help-seeking. This study explores the opinions and practices of dentists regarding discussing oral cancer with patients, including views on barriers and facilitators. Qualitative in-depth interviews. Setting: Dentists working in general dental practices in the United Kingdom were interviewed in 2013. In-depth interviews with dentists (n = 16) were conducted. Interviews were audio-recorded and transcribed. Data were analysed using framework analysis. Dentists recognised the importance of raising awareness but identified several barriers to discussions, including system factors (for example, time constraints and a lack of financial incentive), patient factors (for example, fear of invoking undue anxiety) and dentist factors (for example, a lack of sufficient knowledge, training and self-confidence). Facilitators included developing practice standards and good dentist-patient relationships. The identified barriers may hold back efforts to raise awareness of oral cancer and could be targeted in future initiatives to encourage early detection.
NASA Technical Reports Server (NTRS)
Raju, I. S.
1986-01-01
Q3DG is a computer program developed to perform a quasi-three-dimensional stress analysis of composite laminates which may contain delaminations. The laminates may be subjected to mechanical, thermal, and hygroscopic loads. The program uses the finite element method and models the laminates with eight-noded parabolic isoparametric elements. The program computes the strain-energy-release components in all three modes and the total strain-energy release for delamination growth. A rectangular mesh and data file generator, DATGEN, is included. The DATGEN program can be executed interactively and is user friendly. The documentation includes sections dealing with the Q3D analysis theory and the derivation of element stiffness matrices and consistent load vectors for the parabolic element. Several sample problems, with the input for Q3DG and output from the program, are included. The capabilities of the DATGEN program are illustrated with examples of interactive sessions. A microfiche of all the examples is included. The Q3DG and DATGEN programs have been implemented on CYBER 170 class computers. Q3DG and DATGEN were developed at the Langley Research Center in the early 1980s and documented in 1984-1985.
Modular reweighting software for statistical mechanical analysis of biased equilibrium data
NASA Astrophysics Data System (ADS)
Sindhikara, Daniel J.
2012-07-01
Here a simple, useful, modular approach and software suite designed for statistical reweighting and analysis of equilibrium ensembles is presented. Statistical reweighting is useful, and sometimes necessary, for the analysis of equilibrium enhanced-sampling methods such as umbrella sampling or replica exchange, and also in experimental cases where biasing factors are explicitly known. Essentially, statistical reweighting allows extrapolation of data from one or more equilibrium ensembles to another. Here, the fundamental separable steps of statistical reweighting are broken up into modules, allowing for application to the general case and avoiding the black-box nature of some "all-inclusive" reweighting programs. Additionally, the programs included are, by design, written with few dependencies. The compilers required are either pre-installed on most systems or freely available for download with minimal trouble. Examples of the use of this suite applied to umbrella sampling and replica exchange molecular dynamics simulations are shown, along with advice on how to apply it in the general case.
New version program summary
Program title: Modular reweighting version 2
Catalogue identifier: AEJH_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJH_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public License, version 3
No. of lines in distributed program, including test data, etc.: 179 118
No. of bytes in distributed program, including test data, etc.: 8 518 178
Distribution format: tar.gz
Programming language: C++, Python 2.6+, Perl 5+
Computer: Any
Operating system: Any
RAM: 50-500 MB
Supplementary material: An updated version of the original manuscript (Comput. Phys. Commun. 182 (2011) 2227) is available
Classification: 4.13
Catalogue identifier of previous version: AEJH_v1_0
Journal reference of previous version: Comput. Phys. Commun. 182 (2011) 2227
Does the new version supersede the previous version?: Yes
Nature of problem: While equilibrium reweighting is ubiquitous, there are no public programs available to perform the reweighting in the general case. Further, specific programs often suffer from many library dependencies and numerical instability.
Solution method: This package is written in a modular format that allows for easy applicability of reweighting in the general case. Modules are small, numerically stable, and require minimal libraries.
Reasons for new version: Some minor bugs, some needed upgrades, and the addition of error analysis. analyzeweight.py/analyzeweight.py2 has been replaced by "multihist.py"; this new program performs all the functions of its predecessor while being versatile enough to handle other types of histograms and probability analysis. "bootstrap.py" was added; this script performs basic bootstrap resampling, allowing for error analysis of data. "avg_dev_distribution.py" was added; this program computes the averages and standard deviations of multiple distributions, making error analysis (e.g. from bootstrap resampling) easier to visualize. WRE.cpp was slightly modified, purely for cosmetic reasons. The manual was updated for clarity and to reflect version updates. Examples were removed from the manual in favor of online tutorials (packaged examples remain) and were updated to reflect the new format; an additional example is included to demonstrate error analysis.
Running time: Preprocessing scripts 1-5 minutes, WHAM engine <1 minute, postprocessing script ∼1-5 minutes.
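The core reweighting step that the suite modularizes can be sketched in a few lines. The example below is an illustrative stand-alone sketch, not code from the package itself: samples drawn under a known bias energy V(x) are reweighted by exp(+V(x)/kT) to recover an unbiased average. The toy data are constructed so that biased counts of 8:4:2:1 exactly cancel a bias with exp(V(x)/kT) = 2^x, leaving a uniform unbiased distribution with mean 1.5.

```python
import math

def reweighted_mean(samples, bias_energy, kT=1.0):
    """Unbiased ensemble average of x from samples drawn under a known bias.

    Each sample is weighted by exp(+V(x)/kT), which cancels the Boltzmann
    factor exp(-V(x)/kT) that the bias contributed during sampling.
    """
    weights = [math.exp(bias_energy(x) / kT) for x in samples]
    norm = sum(weights)
    return sum(w * x for w, x in zip(weights, samples)) / norm

# Toy biased ensemble: bias V(x) = x * ln 2, so exp(V/kT) = 2**x.
# Counts 8:4:2:1 over x = 0..3 reweight to 8:8:8:8 (uniform).
samples = [0] * 8 + [1] * 4 + [2] * 2 + [3] * 1
bias = lambda x: x * math.log(2.0)

biased_mean = sum(samples) / len(samples)       # about 0.733
unbiased_mean = reweighted_mean(samples, bias)  # 1.5
```

Real applications (e.g. WHAM across many umbrella windows) combine several such biased ensembles, but each window's contribution reduces to exactly this weighting step.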
Descriptive approaches to landscape analysis
R. Burton Litton Jr.
1979-01-01
Descriptive landscape analyses include various procedures used to document visual/scenic resources. Historic and regional examples of landscape description represent desirable insight for contemporary professional inventory work. Routed and areal landscape inventories are discussed as basic tools. From them, qualitative and quantitative evaluations can be developed...
Survey of International Trade/Economics Textbooks.
ERIC Educational Resources Information Center
Lucier, Richard L.
1992-01-01
Reviews 14 international economics textbooks to help instructors with selection. Includes organization and structure, topics covered, and characteristics of the texts. Suggests considerations such as course length, level of abstraction desired, opinion of numerically based graphical analysis, extensiveness of examples and applications, and whether…
ERIC Educational Resources Information Center
Thoms, Karen J.; Kellerman, Debra K.
1995-01-01
Provides eight guidelines for the effective design of questionnaires to be used in the assessment of training needs. Highlights include cost effectiveness; the use of surveys; front-end analysis; and examples for each guideline. (LRW)
Development of stable isotope mixing models in ecology - Dublin
More than 40 years ago, stable isotope analysis methods used in geochemistry began to be applied to ecological studies. One common application is using mathematical mixing models to sort out the proportional contributions of various sources to a mixture. Examples include contri...
Historical development of stable isotope mixing models in ecology
More than 40 years ago, stable isotope analysis methods used in geochemistry began to be applied to ecological studies. One common application is using mathematical mixing models to sort out the proportional contributions of various sources to a mixture. Examples include contri...
Development of stable isotope mixing models in ecology - Perth
More than 40 years ago, stable isotope analysis methods used in geochemistry began to be applied to ecological studies. One common application is using mathematical mixing models to sort out the proportional contributions of various sources to a mixture. Examples include contri...
Development of stable isotope mixing models in ecology - Fremantle
More than 40 years ago, stable isotope analysis methods used in geochemistry began to be applied to ecological studies. One common application is using mathematical mixing models to sort out the proportional contributions of various sources to a mixture. Examples include contri...
Development of stable isotope mixing models in ecology - Sydney
More than 40 years ago, stable isotope analysis methods used in geochemistry began to be applied to ecological studies. One common application is using mathematical mixing models to sort out the proportional contributions of various sources to a mixture. Examples include contri...
Reconceptualising risk: Perceptions of risk in rural and remote maternity service planning.
Barclay, Lesley; Kornelsen, Jude; Longman, Jo; Robin, Sarah; Kruske, Sue; Kildea, Sue; Pilcher, Jennifer; Martin, Tanya; Grzybowski, Stefan; Donoghue, Deborah; Rolfe, Margaret; Morgan, Geoff
2016-07-01
This study explores perceptions and examples of risk related to pregnancy and childbirth in rural and remote Australia and how these influence the planning of maternity services. Data collection in this qualitative component of a mixed-methods study included 88 semi-structured individual and group interviews (n=102), three focus groups (n=22) and one group information session (n=17). Researchers identified two categories of risk for exploration: health services risk (including clinical and corporate risks) and social risk (including cultural, emotional and financial risks). Data were aggregated and thematically analysed to identify perceptions and examples of risk related to each category. Fieldwork was conducted in four jurisdictions at nine sites in rural (n=3) and remote (n=6) Australia, with 117 health service employees and 24 consumers participating. Examples and perceptions relating to each category of risk were identified from the data. Most medical practitioners and health service managers perceived clinical risks related to rural birthing services without access to caesarean section, whereas consumer participants were more likely to emphasise social risks arising from a lack of local birthing services. The analysis demonstrated that the closure of services adds social risk, which exacerbates clinical risk, and highlighted that perceptions of clinical risk are privileged over social risk in decisions about rural and remote maternity service planning. A comprehensive analysis of risk that identifies how social and other forms of risk contribute to adverse clinical outcomes would benefit rural and remote people and their health services. Formal risk analyses should consider the risks associated with failure to provide birthing services in rural and remote communities as well as the risks of maintaining services. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
General Mission Analysis Tool (GMAT)
NASA Technical Reports Server (NTRS)
Hughes, Steven P. (Compiler)
2016-01-01
This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project, with SBU and ITAR data removed by the TESS project.
NASA Astrophysics Data System (ADS)
Vaks, V. L.; Domracheva, E. G.; Chernyaeva, M. B.; Pripolzin, S. I.; Revin, L. S.; Tretyakov, I. V.; Anfertyev, V. A.; Yablokov, A. A.; Lukyanenko, I. A.; Sheikov, Yu. V.
2018-02-01
We show the prospects of using high-resolution terahertz spectroscopy for continuous analysis of the gaseous decomposition products of energetic materials (including short-lived ones) over a wide temperature range. The experimental setup, which includes a terahertz spectrometer for studying thermal decomposition reactions, is described. The results of analysis of the gaseous decomposition products of energetic materials are presented for the example of ammonium nitrate heated from room temperature to 167°C.
Analysis of longitudinal data from animals with missing values using SPSS.
Duricki, Denise A; Soleman, Sara; Moon, Lawrence D F
2016-06-01
Testing of therapies for disease or injury often involves the analysis of longitudinal data from animals. Modern analytical methods have advantages over conventional methods (particularly when some data are missing), yet they are not used widely by preclinical researchers. Here we provide an easy-to-use protocol for the analysis of longitudinal data from animals, and we present a click-by-click guide for performing suitable analyses using the statistical package IBM SPSS Statistics software (SPSS). We guide readers through the analysis of a real-life data set obtained when testing a therapy for brain injury (stroke) in elderly rats. If a few data points are missing, as in this example data set (for example, because of animal dropout), repeated-measures analysis of covariance may fail to detect a treatment effect. An alternative analysis method, such as the use of linear models (with various covariance structures), and analysis using restricted maximum likelihood estimation (to include all available data) can be used to better detect treatment effects. This protocol takes 2 h to carry out.
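The core problem the protocol addresses can be illustrated outside SPSS. In the hypothetical sketch below (plain Python, invented data), a repeated-measures analysis with listwise deletion discards every animal that has any missing week, while an "all available data" approach of the kind used by linear mixed models with REML still uses every recorded observation.

```python
# Hypothetical weekly scores for three animals; None marks a missed session.
scores = {
    "rat1": [10.0, 12.0, 14.0],
    "rat2": [11.0, None, 15.0],
    "rat3": [9.0, 10.0, None],
}

# Listwise deletion (as in classic repeated-measures ANCOVA): any missing
# value drops the whole animal from the analysis.
complete_cases = {a: s for a, s in scores.items() if None not in s}

# "All available data": every recorded observation contributes to the
# estimate for its week, as in mixed-model estimation.
def week_mean(week):
    vals = [s[week] for s in scores.values() if s[week] is not None]
    return sum(vals) / len(vals)

print(len(complete_cases))  # only 1 animal survives listwise deletion
print(week_mean(0))         # 10.0, estimated from all three animals
```

With two of three animals excluded, a listwise analysis can easily lack the power to detect a real treatment effect, which is the failure mode the protocol's mixed-model approach avoids.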
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, Richard O.
The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs, and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
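A standard nonparametric trend test of the kind described in the book is the Mann-Kendall test. The sketch below is an illustrative stand-alone implementation (not the book's Appendix B code), using the no-ties variance formula n(n-1)(2n+5)/18 and the conventional continuity correction.

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend statistic S and its normal approximation Z.

    S sums the signs of all pairwise differences x[j] - x[i] for i < j.
    Assumes no tied values, so Var(S) = n(n-1)(2n+5)/18.
    """
    n = len(x)
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A strictly increasing series of length 5: all 10 pairs concordant, S = 10.
s, z = mann_kendall([1.0, 2.0, 3.0, 4.0, 5.0])
```

Here z exceeds 1.96, so even this short monotone series is flagged as a significant upward trend at the 5% level.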
Multidimensional bioseparation with modular microfluidics
Chirica, Gabriela S.; Renzi, Ronald F.
2013-08-27
A multidimensional chemical separation and analysis system is described including a prototyping platform and modular microfluidic components capable of rapid and convenient assembly, alteration and disassembly of numerous candidate separation systems. Partial or total computer control of the separation system is possible. Single or multiple alternative processing trains can be tested, optimized and/or run in parallel. Examples related to the separation and analysis of human bodily fluids are given.
Transportable, Low-Dose Active Fast-Neutron Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mihalczo, John T.; Wright, Michael C.; McConchie, Seth M.
2017-08-01
This document contains a description of the method of transportable, low-dose active fast-neutron imaging as developed by ORNL. The discussion begins with the technique and instrumentation and continues with the image reconstruction and analysis. The analysis discussion includes an example of how a gap smaller than the neutron production spot size and detector size can be detected and characterized depending upon the measurement time.
Solar electric geocentric transfer with attitude constraints: Analysis
NASA Technical Reports Server (NTRS)
Sackett, L. L.; Malchow, H. L.; Delbaum, T. N.
1975-01-01
A time-optimal or nearly time-optimal trajectory program was developed for solar electric geocentric transfer with or without attitude constraints and with an optional initial high-thrust stage. The method of averaging reduces computation time. A nonsingular set of orbital elements is used. The constraints, which are those of one of the SERT-C designs, introduce complexities into the analysis, and the solution admits discontinuous changes in thrust direction. The power degradation due to Van Allen radiation is modeled analytically. A wide range of solar cell characteristics is assumed. Effects such as oblateness and shadowing are included. The analysis and the results of many example runs are included.
11th Annual CMMI Technology Conference and User Group
2011-11-17
Examples of triggers may include: – Cost performance – Schedule performance – Results of management reviews – Occurrence of the risk • as a ... Analysis (PHA) – Method 3 – Through bottom-up analysis of design data (e.g., flow diagrams, Failure Mode Effects and Criticality Analysis (FMECA) ... of formal reviews and the setting up of delta or follow-up reviews can be used to give the organization more places to look at the products as they
The Crew Earth Observations Experiment: Earth System Science from the ISS
NASA Technical Reports Server (NTRS)
Stefanov, William L.; Evans, Cynthia A.; Robinson, Julie A.; Wilkinson, M. Justin
2007-01-01
This viewgraph presentation reviews the use of Astronaut Photography (AP), as taken from the International Space Station (ISS), in Earth System Science (ESS). Included are slides covering basic remote sensing theory, data characteristics of astronaut photography, astronaut training and operations, the Crew Earth Observations group, site targeting and acquisition, cataloging and databases, analysis and applications for ESS, and image analysis of areas of particular interest: urban areas, megafans, deltas, and coral reefs. Examples of the photographs and the analysis are included.
Combination of Thin Lenses--A Computer Oriented Method.
ERIC Educational Resources Information Center
Flerackers, E. L. M.; And Others
1984-01-01
Suggests a method for treating geometric optics, using a microcomputer to do the calculations of image formation. Calculations are based on the connection between the composition of lenses and the mathematics of fractional linear equations. The logic of the analysis and an example problem are included. (JM)
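The lens-composition idea the abstract refers to can be sketched with 2x2 ray-transfer matrices, whose products realize the same fractional linear (Möbius) composition rule. The example below is an illustrative sketch, not the article's program: composing two thin lenses separated by a gap d reproduces the familiar formula 1/f = 1/f1 + 1/f2 - d/(f1*f2).

```python
def matmul2(a, b):
    """Multiply two 2x2 ray-transfer matrices."""
    return [
        [a[0][0] * b[0][0] + a[0][1] * b[1][0], a[0][0] * b[0][1] + a[0][1] * b[1][1]],
        [a[1][0] * b[0][0] + a[1][1] * b[1][0], a[1][0] * b[0][1] + a[1][1] * b[1][1]],
    ]

def thin_lens(f):
    """Thin lens of focal length f (paraxial ray-transfer matrix)."""
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

def gap(d):
    """Free-space propagation over distance d."""
    return [[1.0, d], [0.0, 1.0]]

def effective_focal_length(f1, f2, d):
    """Two thin lenses separated by d: system matrix L2 * T(d) * L1.

    The C element of the system matrix gives f_eff = -1/C, equal to
    1 / (1/f1 + 1/f2 - d/(f1*f2)).
    """
    system = matmul2(thin_lens(f2), matmul2(gap(d), thin_lens(f1)))
    return -1.0 / system[1][0]

# Two 100 mm lenses, 50 mm apart: 1/f = 10 + 10 - 5 = 15, so f = 1/15 m.
f_eff = effective_focal_length(0.1, 0.1, 0.05)
```

Setting d = 0 collapses the result to the contact-lens formula 1/f = 1/f1 + 1/f2, a quick sanity check on the matrix bookkeeping.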
Enhanced Reality Visualization in a Surgical Environment.
1995-01-13
perhaps by a significant amount following calibration. Examples of these methods include [Brown, 1965, Lenz and Tsai, 1988, Maybank and Faugeras ... Analysis And Machine Intelligence, 10(5):713-720, September 1988. [Maybank and Faugeras, 1992] Stephen J. Maybank and Olivier D. Faugeras. A
Visual Basic programs for spreadsheet analysis.
Hunt, Bruce
2005-01-01
A collection of Visual Basic programs, entitled Function.xls, has been written for ground water spreadsheet calculations. This collection includes programs for calculating mathematical functions and for evaluating analytical solutions in ground water hydraulics and contaminant transport. Several spreadsheet examples are given to illustrate their use.
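A representative function from ground water hydraulics of the kind such a collection evaluates is the Theis well function W(u). The sketch below is an illustrative Python version (not code from Function.xls), using the standard series W(u) = -γ - ln(u) + Σ (-1)^(n+1) u^n / (n·n!), which converges quickly for the small u typical of pumping-test analysis.

```python
import math

EULER_GAMMA = 0.5772156649015329

def well_function(u, terms=30):
    """Theis well function W(u) via its convergent series expansion.

    W(u) = -gamma - ln(u) + sum_{n>=1} (-1)^(n+1) * u^n / (n * n!).
    Accurate for small u; pumping tests typically have u << 1.
    """
    total = -EULER_GAMMA - math.log(u)
    term = 1.0  # running value of u^n / n!
    for n in range(1, terms + 1):
        term *= u / n
        total += (-1) ** (n + 1) * term / n
    return total

w_small = well_function(0.01)  # tabulated value is about 4.038
w_mid = well_function(0.1)     # tabulated value is about 1.823
```

In a spreadsheet setting, drawdown then follows from s = Q·W(u)/(4πT), which is exactly the kind of cell formula these Visual Basic functions make available.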
COMBINING SOURCES IN STABLE ISOTOPE MIXING MODELS: ALTERNATIVE METHODS
Stable isotope mixing models are often used to quantify source contributions to a mixture. Examples include pollution source identification; trophic web studies; analysis of water sources for soils, plants, or water bodies; and many others. A common problem is having too many s...
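In the simplest case of two sources and one isotope, the mixing model is a single mass-balance equation that can be solved directly. The sketch below is illustrative, with hypothetical δ13C-style values; it is when sources outnumber isotope tracers that the "too many sources" problem in the abstract arises and alternative methods are needed.

```python
def two_source_mixing(delta_mix, delta_1, delta_2):
    """Proportion of source 1 in a mixture, from one isotope signature.

    Mass balance: delta_mix = f1*delta_1 + (1 - f1)*delta_2, solved for f1.
    """
    return (delta_mix - delta_2) / (delta_1 - delta_2)

# Hypothetical delta-13C values: C3 plant source (-25 per mil), C4 plant
# source (-12 per mil), consumer tissue measured at -20 per mil.
f1 = two_source_mixing(-20.0, -25.0, -12.0)
f2 = 1.0 - f1  # the two proportions must sum to one
```

Here f1 comes out to about 0.62: the consumer's signature sits closer to the C3 end-member, so the C3 source contributes the larger share.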
Computer program for the analysis of the cross flow in a radial inflow turbine scroll
NASA Technical Reports Server (NTRS)
Hamed, A.; Abdallah, S.; Tabakoff, W.
1977-01-01
A computer program was used to solve the governing equations of the potential flow in the cross-sectional planes of a radial inflow turbine scroll. A listing of the main program and the subroutines, and a typical output example, are included.
The Pragmatic University: A Feasible Utopia?
ERIC Educational Resources Information Center
Badley, Graham
2016-01-01
"Imaginings" of the modern university include such ideas as "the ecological university" and "the pragmatic university". In his attempt to separate utopian from dystopian visions of the university, Ronald Barnett concentrates on an analysis of the ecological university and ignores, for example, the case of the…
Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.
2005-01-01
Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous cross-product terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.
Acquisition and analysis of accelerometer data
NASA Astrophysics Data System (ADS)
Verges, Keith R.
1990-08-01
Acceleration data reduction must be undertaken with a complete understanding of the physical process, the means by which the data are acquired, and finally, the calculations necessary to put the data into a meaningful format. Discussed here are the acceleration sensor requirements dictated by the measurements desired. Sensor noise, dynamic range, and linearity will be determined from the physical parameters of the experiment. The digitizer requirements are discussed. Here the system from sensor to digital storage medium will be integrated, and rules of thumb for experiment duration, filter response, and number of bits are explained. Data reduction techniques after storage are also discussed. Time domain operations including decimating, digital filtering, and averaging are covered, as well as frequency domain methods, including windowing and the difference between power and amplitude spectra, and simple noise determination via coherence analysis. Finally, an example experiment using the Teledyne Geotech Model 44000 Seismometer to measure from 1 Hz to 10(exp -6) Hz is discussed. The sensor, data acquisition system, and example spectra are presented.
Acquisition and analysis of accelerometer data
NASA Technical Reports Server (NTRS)
Verges, Keith R.
1990-01-01
Acceleration data reduction must be undertaken with a complete understanding of the physical process, the means by which the data are acquired, and finally, the calculations necessary to put the data into a meaningful format. Discussed here are the acceleration sensor requirements dictated by the measurements desired. Sensor noise, dynamic range, and linearity will be determined from the physical parameters of the experiment. The digitizer requirements are discussed. Here the system from sensor to digital storage medium will be integrated, and rules of thumb for experiment duration, filter response, and number of bits are explained. Data reduction techniques after storage are also discussed. Time domain operations including decimating, digital filtering, and averaging are covered, as well as frequency domain methods, including windowing and the difference between power and amplitude spectra, and simple noise determination via coherence analysis. Finally, an example experiment using the Teledyne Geotech Model 44000 Seismometer to measure from 1 Hz to 10(exp -6) Hz is discussed. The sensor, data acquisition system, and example spectra are presented.
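The distinction the abstract draws between power and amplitude spectra can be made concrete with a toy example. The sketch below (stdlib Python, naive DFT, rectangular window, sine placed exactly on a bin) is illustrative only: the one-sided amplitude at the signal bin is 2|X_k|/N, recovering the sine's amplitude A, and the corresponding mean-square power is A²/2.

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform; O(N^2) but fine for a toy example."""
    n = len(x)
    return [
        sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        for k in range(n)
    ]

# Sine placed exactly at bin k = 5 of an N = 64 record, amplitude A = 1.
N, K, A = 64, 5, 1.0
signal = [A * math.sin(2 * math.pi * K * t / N) for t in range(N)]
spectrum = dft(signal)

amplitude = 2 * abs(spectrum[K]) / N  # one-sided amplitude spectrum -> A
power = amplitude ** 2 / 2            # mean-square power of the sine -> A^2/2
```

Off-bin frequencies leak into neighbouring bins, which is where the windowing discussed in the text (e.g. Hann) earns its keep.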
[Pharmacogenomics in routine medical care].
Rosskopf, D; Meyer zu Schwabedissen, H E; Kroemer, H K; Siegmund, W
2010-01-01
Pharmacogenomics investigates inherited differences in drug response, including beneficial and adverse reactions. While a considerable amount of evidence for genetic influences on drug response has been accumulated within the last decade, predominantly in small studies, its value in routine therapy is still a matter of debate. The aim of this review is to discuss well-established examples where pharmacogenomic techniques can improve routine treatment. Examples include genotyping of CYP2D6 in the context of antidepressant therapy, analysis of TPMT variants for the prediction of mercaptopurine-induced bone marrow depression, VKORC1 and CYP2C9 analyses for better control of anticoagulant administration, and the SLCO1B1 variant in the context of statin-induced myopathies. © Georg Thieme Verlag KG Stuttgart, New York.
Cartographic services contract...for everything geographic
2003-01-01
The U.S. Geological Survey's (USGS) Cartographic Services Contract (CSC) is used to award work for photogrammetric and mapping services under the umbrella of Architect-Engineer (A&E) contracting. The A&E contract is broad in scope and can accommodate any activity related to standard, nonstandard, graphic, and digital cartographic products. Services provided may include, but are not limited to, photogrammetric mapping and aerotriangulation; orthophotography; thematic mapping (for example, land characterization); analog and digital imagery applications; geographic information systems development; surveying and control acquisition, including ground-based and airborne Global Positioning System; analog and digital image manipulation, analysis, and interpretation; raster and vector map digitizing; data manipulations (for example, transformations, conversions, generalization, integration, and conflation); primary and ancillary data acquisition (for example, aerial photography, satellite imagery, multispectral, multitemporal, and hyperspectral data); image scanning and processing; metadata production, revision, and creation; and production or revision of standard USGS products defined by formal and informal specification and standards, such as those for digital line graphs, digital elevation models, digital orthophoto quadrangles, and digital raster graphics.
A Design Architecture for an Integrated Training System Decision Support System
1990-07-01
Sensory modes include visual, auditory, tactile, or kinesthetic; performance categories include time to complete, speed of response, or correct action ... procedures, and finally application and examples from the aviation proponency with emphasis on the LHX program. Appendix B is a complete bibliography ... integrated analysis of ITS development. The approach was designed to provide an accurate and complete representation of the ITS development process and
NASA Astrophysics Data System (ADS)
Wagener, Thorsten; Pianosi, Francesca
2016-04-01
Sensitivity Analysis (SA) investigates how the variation in the output of a numerical model can be attributed to variations of its input factors. SA is increasingly being used in earth and environmental modelling for a variety of purposes, including uncertainty assessment, model calibration and diagnostic evaluation, dominant control analysis and robust decision-making. Here we provide some practical advice regarding best practice in SA and discuss important open questions based on a detailed recent review of the existing body of work in SA. Open questions relate to the consideration of input factor interactions, methods for factor mapping and the formal inclusion of discrete factors in SA (for example for model structure comparison). We will analyse these questions using relevant examples and discuss possible ways forward. We aim at stimulating the discussion within the community of SA developers and users regarding the setting of good practices and on defining priorities for future research.
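One of the simplest SA practices the abstract alludes to, attributing output variation to input factors from a Monte Carlo sample, can be sketched with correlation coefficients as a screening measure. The model below is invented purely for illustration: a linear response in which the first factor drives the output five times harder than the second.

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(0)
n = 2000
x1 = [random.random() for _ in range(n)]  # input factor 1, uniform on [0, 1]
x2 = [random.random() for _ in range(n)]  # input factor 2, uniform on [0, 1]

# Invented test model: y = 5*x1 + x2, so x1 dominates the output variance.
y = [5.0 * a + b for a, b in zip(x1, x2)]

r1 = pearson(x1, y)  # near 5/sqrt(26), about 0.98
r2 = pearson(x2, y)  # near 1/sqrt(26), about 0.20
```

Correlation-based screening like this only captures linear, non-interacting effects; the interaction and factor-mapping questions raised in the abstract are exactly where more elaborate (e.g. variance-based) methods are needed.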
2007-10-01
1984. Complex principal component analysis: Theory and examples. Journal of Climate and Applied Meteorology 23: 1660-1673. Hotelling, H. 1933 ... Sediments 99. ASCE: 2,566-2,581. Von Storch, H., and A. Navarra. 1995. Analysis of climate variability: Applications of statistical techniques. Berlin ... ERDC TN-SWWRP-07-9 October 2007 Regional Morphology Empirical Analysis Package (RMAP): Orthogonal Function Analysis, Background and Examples by
Racism as a determinant of health: a protocol for conducting a systematic review and meta-analysis.
Paradies, Yin; Priest, Naomi; Ben, Jehonathan; Truong, Mandy; Gupta, Arpana; Pieterse, Alex; Kelaher, Margaret; Gee, Gilbert
2013-09-23
Racism is increasingly recognized as a key determinant of health. A growing body of epidemiological evidence shows strong associations between self-reported racism and poor health outcomes across diverse minority groups in developed countries. While the relationship between racism and health has received increasing attention over the last two decades, a comprehensive meta-analysis focused on the health effects of racism has yet to be conducted. The aim of this review protocol is to provide a structure from which to conduct a systematic review and meta-analysis of studies that assess the relationship between racism and health. This research will consist of a systematic review and meta-analysis. Studies will be considered for review if they are empirical studies reporting quantitative data on the association between racism and health for adults and/or children of all ages from any racial/ethnic/cultural groups. Outcome measures will include general health and well-being, physical health, mental health, healthcare use and health behaviors. Scientific databases (for example, Medline) will be searched using a comprehensive search strategy and reference lists will be manually searched for relevant studies. In addition, use of online search engines (for example, Google Scholar), key websites, and personal contact with experts will also be undertaken. Screening of search results and extraction of data from included studies will be independently conducted by at least two authors, including assessment of inter-rater reliability. Studies included in the review will be appraised for quality using tools tailored to each study design. Summary statistics of study characteristics and findings will be compiled and findings synthesized in a narrative summary as well as a meta-analysis. This review aims to examine associations between reported racism and health outcomes. 
This comprehensive and systematic review and meta-analysis of empirical research will provide a rigorous and reliable evidence base for future research, policy and practice, including information on the extent of available evidence for a range of racial/ethnic minority groups.
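The pooling step at the heart of such a meta-analysis can be sketched in a few lines. The example below is an illustrative fixed-effect inverse-variance pooling (the protocol itself would likely also fit random-effects models); the study effects and variances are hypothetical.

```python
import math

def fixed_effect_pool(effects, variances):
    """Inverse-variance (fixed-effect) pooled estimate and standard error.

    Each study is weighted by 1/variance; the pooled standard error is
    sqrt(1 / sum of weights).
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical study effects (e.g. standardized associations between
# self-reported racism and a health outcome) with their variances.
pooled, se = fixed_effect_pool([0.40, 0.60], [0.04, 0.04])
```

With equal variances the pooled estimate is simply the mean of the two effects (0.50), and the pooled standard error, sqrt(1/50) ≈ 0.14, is smaller than either study's own standard error of 0.20, which is the whole point of pooling.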
NASA Technical Reports Server (NTRS)
Kvaternik, R. G.
1976-01-01
The manner of representing a flight vehicle structure as an assembly of beam, spring, and rigid-body components for vibration analysis is described. The development is couched in terms of a substructures methodology which is based on the finite-element stiffness method. The particular manner of employing beam, spring, and rigid-body components to model such items as wing structures, external stores, pylons supporting engines or external stores, and sprung masses associated with launch vehicle fuel slosh is described by means of several simple qualitative examples. A detailed numerical example consisting of a tilt-rotor VTOL aircraft is included to provide a unified illustration of the procedure for representing a structure as an equivalent system of beams, springs, and rigid bodies, the manner of forming the substructure mass and stiffness matrices, and the mechanics of writing the equations of constraint which enforce deflection compatibility at the junctions of the substructures. Since many structures, or selected components of structures, can be represented in this manner for vibration analysis, the modeling concepts described and their application in the numerical example shown should prove generally useful to the dynamicist.
LCSH and PRECIS in Music: A Comparison.
ERIC Educational Resources Information Center
Gabbard, Paula Beversdorf
1985-01-01
By studying examples of their applications by two major English language bibliographic agencies, this article compares strengths and weaknesses of PRECIS and Library of Congress Subject Headings for books about music. Highlights include quantitative and qualitative analysis, comparison of number of subject statements, and terminology problems in…
The Expanding Role of the Atom in the Humanities
ERIC Educational Resources Information Center
Seaborg, Glenn T.
1970-01-01
The techniques of radioactive dating, thermoluminescence dating, cesium magnetometer detecting, x-ray fluorescence analysis, and neutron radiography are briefly explained. Examples are given of the use of these techniques in determining the age and composition of paintings, ceramics, and archeological finds. Included is a history of Lawrence Radiation…
Lexical Innovation in Ghanaian English.
ERIC Educational Resources Information Center
Bamiro, Edmund O.
An analysis of lexical innovation in Ghanaian English uses ten linguistic categories identified in earlier research on Nigerian English, offering an explanation of each category and a number of examples. The categories include: loanshifts (English words manipulated to produce and transmit meanings beyond purely denotative reference and conveying a…
40 CFR 63.2406 - What definitions apply to this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... the stored organic liquid as determined from a design analysis of the storage tank. (2) For ambient... stored, transported, treated, disposed of, or otherwise handled. Examples of containers include, but are... or any other forms of transportation. Design evaluation means a procedure for evaluating control...
40 CFR 63.2406 - What definitions apply to this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
... the stored organic liquid as determined from a design analysis of the storage tank. (2) For ambient... stored, transported, treated, disposed of, or otherwise handled. Examples of containers include, but are... or any other forms of transportation. Design evaluation means a procedure for evaluating control...
The Social Relevance of Montessori in the First Plane
ERIC Educational Resources Information Center
Andrews, Sarah Werner
2015-01-01
This article represents an amazing reversal of linguistic analysis. Usually Montessori language is translated into "state" terminology. In this case, Sarah Werner Andrews puts state quality assessment terms into Montessori language. For example, domains for school readiness include 1) physical wellbeing and motor development, 2) social…
Moodog: Tracking Student Activity in Online Course Management Systems
ERIC Educational Resources Information Center
Zhang, Hangjin; Almeroth, Kevin
2010-01-01
Many universities are currently using Course Management Systems (CMSes) to conduct online learning, for example, by distributing course materials or submitting homework assignments. However, most CMSes do not include comprehensive activity tracking and analysis capabilities. This paper describes a method to track students' online learning…
Tricco, Andrea C; Soobiah, Charlene; Antony, Jesmin; Hemmelgarn, Brenda; Moher, David; Hutton, Brian; Straus, Sharon E
2013-06-28
Serotonin (5-HT3) receptor antagonists are a class of antiemetic medications often used to prevent nausea and vomiting among patients undergoing chemotherapy, radiotherapy or surgery. However, recent studies suggest that these agents might be associated with increased cardiac harm. To examine this further, we are proposing to conduct a systematic review and network meta-analysis on the comparative safety of 5-HT3 receptor antagonists among patients undergoing chemotherapy or surgery. Studies reporting one or more safety outcomes of interest for 5-HT3 receptor antagonists compared with each other, placebo, and/or other antiemetic agents (for example, benzamides, phenothiazines, butyrophenones, antihistamines, and anticholinergics) among children and adult patients undergoing surgery or chemotherapy will be included. Our primary outcome of interest is arrhythmia. Our secondary outcomes include cardiac death, QT prolongation, PR prolongation, all-cause mortality, nausea, and vomiting. We will include experimental studies, quasi-experimental studies (namely controlled before-after and interrupted time series), and observational studies (namely cohort studies). We will not limit inclusion by publication status, time period, duration of follow-up or language of dissemination. Electronic databases (for example, MEDLINE, EMBASE) will be searched from inception onwards. These main searches will be supplemented by searching for difficult-to-locate and unpublished studies, such as dissertations and governmental reports. The eligibility criteria will be pilot-tested and subsequently used to screen the literature search results by two reviewers in duplicate. A similar process will be followed for full-text screening, data abstraction, and risk of bias/methodological quality appraisal. The Cochrane Risk of Bias tool will be used to appraise experimental and quasi-experimental studies, and cohort studies will be assessed using the Newcastle-Ottawa Scale.
If the data allow, random-effects meta-analysis and a network meta-analysis (that is, mixed treatment comparisons) will be conducted. All analyses will be conducted separately for different study designs, patient populations (for example, children and adults), and reasons for administering 5-HT3 receptor antagonists (for example, post-surgery and chemotherapy). Our results will help inform patients, clinicians, and health policy-makers about the potential safety concerns, as well as the comparative safety, of using these antiemetic agents. PROSPERO registry number: CRD42013003564.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1990-01-01
Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
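The linear, quadratic, and exponential trend fitting the standard describes can be sketched with ordinary least-squares polynomial fits; the time-series data below are synthetic, not from any NASA program:

```python
import numpy as np

# Synthetic monthly time series with a quadratic trend plus noise
rng = np.random.default_rng(0)
t = np.arange(48, dtype=float)                    # 48 months
data = 5.0 + 0.8 * t + 0.05 * t**2 + rng.normal(0, 2.0, t.size)

# Fit linear and quadratic trend models; compare residual sums of squares
fits = {}
for degree in (1, 2):
    coeffs = np.polyfit(t, data, degree)
    resid = data - np.polyval(coeffs, t)
    fits[degree] = (coeffs, np.sum(resid**2))

# An exponential trend can be fit the same way on log-transformed data
log_coeffs = np.polyfit(t, np.log(np.clip(data, 1e-9, None)), 1)
growth_rate = log_coeffs[0]                       # per-month log growth
```

Comparing residuals across candidate models is the simplest version of the qualitative and quantitative trend assessment the standard calls for.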
Three-Dimensional Analysis and Surgical Planning in Craniomaxillofacial Surgery.
Steinbacher, Derek M
2015-12-01
Three-dimensional (3D) analysis and planning are powerful tools in craniofacial and reconstructive surgery. The elements include 1) analysis, 2) planning, 3) virtual surgery, 4) 3D printouts of guides or implants, and 5) verification of actual to planned results. The purpose of this article is to review different applications of 3D planning in craniomaxillofacial surgery. Case examples involving 3D analysis and planning were reviewed. Common threads pertaining to all types of reconstruction are highlighted and contrasted with unique aspects specific to new applications in craniomaxillofacial surgery. Six examples of 3D planning are described: 1) cranial reconstruction, 2) craniosynostosis, 3) midface advancement, 4) mandibular distraction, 5) mandibular reconstruction, and 6) orthognathic surgery. Planning in craniomaxillofacial surgery is useful and has applicability across different procedures and reconstructions. Three-dimensional planning and virtual surgery enhance efficiency, accuracy, creativity, and reproducibility in craniomaxillofacial surgery. Copyright © 2015 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Banta, E.R.; Hill, M.C.; Poeter, E.; Doherty, J.E.; Babendreier, J.
2008-01-01
The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input and output conventions allow application users to access various applications and the analysis methods they embody with a minimum of time and effort. Process models simulate, for example, physical, chemical, and (or) biological systems of interest using phenomenological, theoretical, or heuristic approaches. The types of model analyses supported by the JUPITER API include, but are not limited to, sensitivity analysis, data needs assessment, calibration, uncertainty analysis, model discrimination, and optimization. The advantages provided by the JUPITER API for users and programmers allow for rapid programming and testing of new ideas. Application-specific coding can be in languages other than the Fortran-90 of the API. This article briefly describes the capabilities and utility of the JUPITER API, lists existing applications, and uses UCODE_2005 as an example.
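One of the analyses the API supports, sensitivity analysis, reduces in its simplest form to perturbing a parameter and differencing model output. The sketch below uses a stand-in process model, not a JUPITER API call:

```python
def process_model(p, x):
    """Stand-in process model: output depends quadratically on parameter p."""
    return p * p * x

def sensitivity(model, p, x, h=1e-4):
    """Central-difference sensitivity of model output to parameter p."""
    return (model(p + h, x) - model(p - h, x)) / (2.0 * h)

# Exact derivative of p*p*x with respect to p is 2*p*x = 12 at p=3, x=2
s = sensitivity(process_model, p=3.0, x=2.0)
```

Central differences are exact for quadratic dependence; real applications would loop this over every parameter to build a sensitivity matrix.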
Thermal-stress analysis for wood composite blade. [horizontal axis wind turbines
NASA Technical Reports Server (NTRS)
Fu, K. C.; Harb, A.
1984-01-01
The thermal-stress induced by solar insolation on a wood composite blade of a Mod-OA wind turbine was investigated. The temperature distribution throughout the blade (a heat conduction problem) was analyzed and the thermal-stress distribution of the blades caused by the temperature distribution (a thermal-stress analysis problem) was then determined. The computer programs used for both problems are included along with output examples.
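The heat-conduction step can be illustrated with a one-dimensional steady-state finite-difference model; the grid size and boundary temperatures below are placeholders, not the Mod-OA blade data:

```python
import numpy as np

n = 21                                  # grid points through the blade section
t_hot, t_cool = 60.0, 20.0              # boundary temperatures, deg C (hypothetical)

# Assemble the tridiagonal system for d2T/dx2 = 0 with Dirichlet BCs
a = np.zeros((n, n))
b = np.zeros(n)
a[0, 0] = a[-1, -1] = 1.0               # fixed-temperature boundary rows
b[0], b[-1] = t_hot, t_cool
for i in range(1, n - 1):
    a[i, i - 1], a[i, i], a[i, i + 1] = 1.0, -2.0, 1.0

temperature = np.linalg.solve(a, b)     # linear profile for constant conductivity
```

The resulting temperature field would then feed a thermal-stress calculation, as in the blade analysis described above.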
Sierra Structural Dynamics User's Notes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reese, Garth M.
2015-10-19
Sierra/SD provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of weapons systems. This document provides a users guide to the input for Sierra/SD. Details of input specifications for the different solution types, output options, element types and parameters are included. The appendices contain detailed examples, and instructions for running the software on parallel platforms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munday, Lynn Brendon; Day, David M.; Bunting, Gregory
Sierra/SD provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of weapons systems. This document provides a users guide to the input for Sierra/SD. Details of input specifications for the different solution types, output options, element types and parameters are included. The appendices contain detailed examples, and instructions for running the software on parallel platforms.
Computer code for off-design performance analysis of radial-inflow turbines with rotor blade sweep
NASA Technical Reports Server (NTRS)
Meitner, P. L.; Glassman, A. J.
1983-01-01
The analysis procedure of an existing computer program was extended to include rotor blade sweep, to model the flow more accurately at the rotor exit, and to provide more detail to the loss model. The modeling changes are described and all analysis equations and procedures are presented. Program input and output are described and are illustrated by an example problem. Results obtained from this program and from a previous program are compared with experimental data.
Numerical integration of asymptotic solutions of ordinary differential equations
NASA Technical Reports Server (NTRS)
Thurston, Gaylen A.
1989-01-01
Classical asymptotic analysis of ordinary differential equations derives approximate solutions that are numerically stable. However, the analysis also leads to tedious expansions in powers of the relevant parameter for a particular problem. The expansions are replaced with integrals that can be evaluated by numerical integration. The resulting numerical solutions retain the linear independence that is the main advantage of asymptotic solutions. Examples, including the Falkner-Skan equation from laminar boundary layer theory, illustrate the method of asymptotic analysis with numerical integration.
NASA Astrophysics Data System (ADS)
Bohmann, Jonathan A.; Weinhold, Frank; Farrar, Thomas C.
1997-07-01
Nuclear magnetic shielding tensors computed by the gauge including atomic orbital (GIAO) method in the Hartree-Fock self-consistent-field (HF-SCF) framework are partitioned into magnetic contributions from chemical bonds and lone pairs by means of natural chemical shielding (NCS) analysis, an extension of natural bond orbital (NBO) analysis. NCS analysis complements the description provided by alternative localized orbital methods by directly calculating chemical shieldings due to delocalized features in the electronic structure, such as bond conjugation and hyperconjugation. Examples of NCS tensor decomposition are reported for CH4, CO, and H2CO, for which a graphical mnemonic due to Cornwell is used to illustrate the effect of hyperconjugative delocalization on the carbon shielding.
Integration of electro-optical mechanical systems and medicine: where are we and where can we go?
NASA Astrophysics Data System (ADS)
Gourley, Mark F.; Gourley, Paul L.
1997-03-01
The marriage of microfabricated materials with microbiological systems will allow advances in medicine to proceed at an unprecedented pace. Biomedical research is placing new demands on the speed and limits of detection of assays of body tissues and fluids. Emerging microfabricated chip technologies from the engineering community offer researchers novel types of analysis of human samples. In guiding these developments, the ability to swiftly and accurately gain useful information for identification and to establish a diagnosis is of utmost importance. Current examples of such technology include DNA amplification and analysis, and fluorescent cell analysis by flow cytometry. Potential applications include the development of rapid techniques for examining large numbers of cells in tissue or in blood. These could serve as screening tools for the detection and quantification of abnormal cell types, for example malignant or HIV-infected cells. Micro/nanofabrication methods will make these devices compact, providing access to this technology for point-of-care providers in a clinic, in an ambulance, or on a battlefield. Currently, these tools are in the construction phase. Upon delivery to researchers, validation of these instruments leads to clinical demand that requires approval from the Food and Drug Administration. This paper outlines criteria that successful devices must satisfy.
Recent developments of the NESSUS probabilistic structural analysis computer program
NASA Technical Reports Server (NTRS)
Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.
1992-01-01
The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
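A toy version of the probabilistic response calculation: estimating a failure probability by plain Monte Carlo sampling. The load and strength distributions are invented, and NESSUS itself uses more efficient algorithms such as the advanced mean value method rather than this brute-force sketch:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 400_000
stress = rng.normal(300.0, 30.0, n)     # applied stress, MPa (hypothetical)
strength = rng.normal(400.0, 40.0, n)   # material strength, MPa (hypothetical)

# Failure occurs when stress exceeds strength
p_fail = np.mean(stress > strength)
# Analytic check: stress - strength ~ N(-100, 50), so p = Phi(-2) ~ 0.0228
```

For the rare-event tail probabilities typical of structural reliability, adaptive importance sampling (mentioned in the abstract) concentrates samples near the failure surface instead of sampling blindly as done here.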
Paradoxical Behavior of Granger Causality
NASA Astrophysics Data System (ADS)
Witt, Annette; Battaglia, Demian; Gail, Alexander
2013-03-01
Granger causality is a standard tool for the description of directed interaction of network components and is popular in many scientific fields including econometrics, neuroscience and climate science. For time series that can be modeled as bivariate auto-regressive processes we analytically derive an expression for spectrally decomposed Granger Causality (SDGC) and show that this quantity depends only on two out of four groups of model parameters. Then we present examples of such processes whose SDGC expose paradoxical behavior in the sense that causality is high for frequency ranges with low spectral power. For avoiding misinterpretations of Granger causality analysis we propose to complement it by partial spectral analysis. Our findings are illustrated by an example from brain electrophysiology. Finally, we draw implications for the conventional definition of Granger causality. Bernstein Center for Computational Neuroscience Goettingen
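For a bivariate autoregressive process of the kind analyzed in the abstract, time-domain Granger causality can be estimated by comparing residual variances of restricted and unrestricted regressions. The data below are synthetic, with coupling in one direction only, and lag order 1 is used for brevity:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):                   # x drives y; y does not drive x
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()

def granger(target, source):
    """Lag-1 Granger causality source -> target as a log variance ratio."""
    yt, y1, s1 = target[1:], target[:-1], source[:-1]
    ones = np.ones_like(y1)
    xr = np.column_stack([y1, ones])        # restricted: own past only
    res_r = yt - xr @ np.linalg.lstsq(xr, yt, rcond=None)[0]
    xu = np.column_stack([y1, s1, ones])    # unrestricted: add source's past
    res_u = yt - xu @ np.linalg.lstsq(xu, yt, rcond=None)[0]
    return float(np.log(np.var(res_r) / np.var(res_u)))

gc_xy = granger(y, x)   # substantially positive: x Granger-causes y
gc_yx = granger(x, y)   # near zero
```

The spectral decomposition studied in the abstract refines this time-domain quantity by frequency, which is where the paradoxical behavior arises.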
GeoPAT: A toolbox for pattern-based information retrieval from large geospatial databases
NASA Astrophysics Data System (ADS)
Jasiewicz, Jarosław; Netzel, Paweł; Stepinski, Tomasz
2015-07-01
Geospatial Pattern Analysis Toolbox (GeoPAT) is a collection of GRASS GIS modules for carrying out pattern-based geospatial analysis of images and other spatial datasets. The need for pattern-based analysis arises when images/rasters contain rich spatial information either because of their very high resolution or their very large spatial extent. Elementary units of pattern-based analysis are scenes - patches of surface consisting of a complex arrangement of individual pixels (patterns). GeoPAT modules implement popular GIS algorithms, such as query, overlay, and segmentation, to operate on the grid of scenes. To achieve these capabilities GeoPAT includes a library of scene signatures - compact numerical descriptors of patterns, and a library of distance functions - providing numerical means of assessing dissimilarity between scenes. Ancillary GeoPAT modules use these functions to construct a grid of scenes or to assign signatures to individual scenes having regular or irregular geometries. Thus GeoPAT combines knowledge retrieval from patterns with mapping tasks within a single integrated GIS environment. GeoPAT is designed to identify and analyze complex, highly generalized classes in spatial datasets. Examples include distinguishing between different styles of urban settlements using VHR images, delineating different landscape types in land cover maps, and mapping physiographic units from DEM. The concept of pattern-based spatial analysis is explained and the roles of all modules and functions are described. A case study example pertaining to delineation of landscape types in a subregion of NLCD is given. Performance evaluation is included to highlight GeoPAT's applicability to very large datasets. The GeoPAT toolbox is available for download from
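The scene-signature idea can be mimicked with category histograms and a simple dissimilarity measure. This is only a schematic stand-in: GeoPAT provides its own library of signatures and distance functions, and the rasters below are random placeholders:

```python
import numpy as np

def signature(scene, n_classes):
    """Normalized class histogram of a categorical raster patch (a 'scene')."""
    counts = np.bincount(scene.ravel(), minlength=n_classes)
    return counts / counts.sum()

def js_divergence(p, q):
    """Jensen-Shannon divergence between two signatures (0 = identical)."""
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

rng = np.random.default_rng(7)
urban = rng.choice(4, size=(50, 50), p=[0.7, 0.2, 0.05, 0.05])
forest = rng.choice(4, size=(50, 50), p=[0.05, 0.05, 0.2, 0.7])

d_same = js_divergence(signature(urban, 4), signature(urban, 4))
d_diff = js_divergence(signature(urban, 4), signature(forest, 4))
```

Query and segmentation over a grid of scenes then reduce to nearest-neighbor searches and clustering in this signature space.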
77 FR 21637 - Authority To Require Supervision and Regulation of Certain Nonbank Financial Companies
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-11
... ``threat to financial stability''; The uniform quantitative thresholds that the Council intends to use to... a determination, including examples of quantitative metrics for assessing each category; and The... potential determination with respect to a nonbank financial company, a comparative cost-benefit analysis of...
On Transitions between Representations: The Role of Contextual Reasoning in Calculus Problem Solving
ERIC Educational Resources Information Center
Zazkis, Dov
2016-01-01
This article argues for a shift in how researchers discuss and examine students' uses and understandings of multiple representations within a calculus context. An extension of Zazkis, Dubinsky, and Dautermann's (1996) visualization/analysis framework to include contextual reasoning is proposed. Several examples that detail transitions between…
Pendulum Rides, Rotations and the Coriolis Effect
ERIC Educational Resources Information Center
Pendrill, Ann-Marie; Modig, Conny
2018-01-01
An amusement park is full of examples that can be made into challenging problems for students, combining mathematical modelling with video analysis, as well as measurements in the rides. Traditional amusement ride related textbook problems include free-fall, circular motion, pendula and energy conservation in roller coasters, where the moving…
The Personal Living Space Cue Inventory: An Analysis and Evaluation
ERIC Educational Resources Information Center
Gosling, Samuel D.; Craik, Kenneth H.; Martin, Nicholas R.; Pryor, Michelle R.
2005-01-01
The authors introduce the Personal Living Space Cue Inventory (PLSCI), designed to document comprehensively features of personal living spaces (PLSs); common examples of PLSs include rooms in family households, dormitories, or residential centers. The article describes the PLSCI's development and provides evidence for its reliability and…
Network analysis: a new tool for resource managers
Ruth H. Allen
1980-01-01
Resource managers manipulate ecosystems for direct or indirect human uses. Examples of relatively well studied resource management issues include familiar biological products such as: forests, ranges, fish and wildlife; or physical products such as air, water and soil. Until very recently, urban environments received much less scholarly attention. However, as Spurr (...
Asking the Right Questions: Techniques for Collaboration and School Change. 2nd Edition.
ERIC Educational Resources Information Center
Holcomb, Edie L.
This work provides school change leaders with tools, techniques, tips, examples, illustrations, and stories about promoting school change. Tools provided include histograms, surveys, run charts, weighted voting, force-field analysis, decision matrices, and many others. Chapter 1, "Introduction," applies a matrix for asking questions…
Molecular analysis of killer DNA from Neurospora Spore killer-2
USDA-ARS?s Scientific Manuscript database
In standard Mendelian inheritance, each allele in a sexual cross has an equal probability of being transmitted to the next generation. However, there are certain “selfish” genes that are able to propagate themselves at a higher frequency than others in a population. Examples include the Neurospora S...
40 CFR 63.2406 - What definitions apply to this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
... average temperature of the stored organic liquid as determined from a design analysis of the storage tank.... Examples of containers include, but are not limited to, drums and portable cargo containers known as... transfer facilities to pipelines or any other forms of transportation. Design evaluation means a procedure...
40 CFR 63.2406 - What definitions apply to this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
... average temperature of the stored organic liquid as determined from a design analysis of the storage tank.... Examples of containers include, but are not limited to, drums and portable cargo containers known as... transfer facilities to pipelines or any other forms of transportation. Design evaluation means a procedure...
USDA-ARS?s Scientific Manuscript database
The fungal family Clavicipitaceae includes plant symbionts and pathogens that produce neurotropic alkaloids with diverse effects on vertebrate and invertebrate animals. For example, ergot alkaloids are historically linked to mass poisonings (St. Anthony's fire) and sociological effects such as the ...
Quantifying uncertainty in forest nutrient budgets
Ruth D. Yanai; Carrie R. Levine; Mark B. Green; John L. Campbell
2012-01-01
Nutrient budgets for forested ecosystems have rarely included error analysis, in spite of the importance of uncertainty to interpretation and extrapolation of the results. Uncertainty derives from natural spatial and temporal variation and also from knowledge uncertainty in measurement and models. For example, when estimating forest biomass, researchers commonly report...
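The kind of uncertainty propagation such budgets call for can be sketched by Monte Carlo: propagate measurement uncertainty in, say, wood density and tree volume into a biomass estimate. All numbers below are invented, not from the cited work:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
density = rng.normal(0.50, 0.05, n)    # Mg/m^3, 10% CV (hypothetical)
volume = rng.normal(2.0, 0.3, n)       # m^3 per tree, 15% CV (hypothetical)
biomass = density * volume             # Mg per tree, sampled

mc_cv = biomass.std() / biomass.mean()
# First-order analytic propagation for a product: cv ~ sqrt(cv_d^2 + cv_v^2)
analytic_cv = np.sqrt(0.10**2 + 0.15**2)
```

The Monte Carlo and first-order analytic estimates agree closely here; for strongly nonlinear budget terms the sampling approach is the safer choice.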
Structural Equations and Path Analysis for Discrete Data.
ERIC Educational Resources Information Center
Winship, Christopher; Mare, Robert D.
1983-01-01
Presented is an approach to causal models in which some or all variables are discretely measured, showing that path analytic methods permit quantification of causal relationships among variables with the same flexibility and power of interpretation as is feasible in models including only continuous variables. Examples are provided. (Author/IS)
University Student Conceptual Resources for Understanding Energy
ERIC Educational Resources Information Center
Sabo, Hannah C.; Goodhew, Lisa M.; Robertson, Amy D.
2016-01-01
We report some of the common, prevalent conceptual resources that students used to reason about energy, based on our analysis of written responses to questions given to 807 introductory physics students. These resources include, for example, associating forms of energy with indicators, relating forces and energy, and representing energy…
Students Investigate Local Communities with Geographic Information Systems (GIS).
ERIC Educational Resources Information Center
Carlstrom, Dick; Quinlan, Laurie A.
1997-01-01
Describes the use of Geographic Information Systems (GIS) in elementary and secondary school classrooms to analyze neighborhoods, cities, and regions. Discusses GIS software, databases, graphing data, and spatial analysis, and includes an example of a project for secondary school students investigating the local economy for summer jobs. (LRW)
Experiential Learning through Integrated Project Work: An Example from Soil Science.
ERIC Educational Resources Information Center
Mellor, Antony
1991-01-01
Describes the planning, implementation, and evaluation of an integrated student soil science project. Reports that the course was designed to develop student-centered approaches to learning and to develop transferable skills and personal qualities at the same time. Explains that the project included fieldwork, laboratory analysis, data…
Educating Boys: Tempering Rhetoric with Research
ERIC Educational Resources Information Center
Froese-Germain, Bernie
2006-01-01
In the context of boys' declining academic achievement in schools in relation to girls, this article highlights some critical issues arising from the debate on boys' education. The emphasis is on the contribution of feminist analysis and other perspectives to broaden and contextualize the debate. This includes, for example, the need to carefully…
USDA-ARS?s Scientific Manuscript database
Recent advances in technology have led to the collection of high-dimensional data not previously encountered in many scientific environments. As a result, scientists are often faced with the challenging task of including these high-dimensional data into statistical models. For example, data from sen...
Advances in molecular quantum chemistry contained in the Q-Chem 4 program package
NASA Astrophysics Data System (ADS)
Shao, Yihan; Gan, Zhengting; Epifanovsky, Evgeny; Gilbert, Andrew T. B.; Wormit, Michael; Kussmann, Joerg; Lange, Adrian W.; Behn, Andrew; Deng, Jia; Feng, Xintian; Ghosh, Debashree; Goldey, Matthew; Horn, Paul R.; Jacobson, Leif D.; Kaliman, Ilya; Khaliullin, Rustam Z.; Kuś, Tomasz; Landau, Arie; Liu, Jie; Proynov, Emil I.; Rhee, Young Min; Richard, Ryan M.; Rohrdanz, Mary A.; Steele, Ryan P.; Sundstrom, Eric J.; Woodcock, H. Lee, III; Zimmerman, Paul M.; Zuev, Dmitry; Albrecht, Ben; Alguire, Ethan; Austin, Brian; Beran, Gregory J. O.; Bernard, Yves A.; Berquist, Eric; Brandhorst, Kai; Bravaya, Ksenia B.; Brown, Shawn T.; Casanova, David; Chang, Chun-Min; Chen, Yunqing; Chien, Siu Hung; Closser, Kristina D.; Crittenden, Deborah L.; Diedenhofen, Michael; DiStasio, Robert A., Jr.; Do, Hainam; Dutoi, Anthony D.; Edgar, Richard G.; Fatehi, Shervin; Fusti-Molnar, Laszlo; Ghysels, An; Golubeva-Zadorozhnaya, Anna; Gomes, Joseph; Hanson-Heine, Magnus W. D.; Harbach, Philipp H. P.; Hauser, Andreas W.; Hohenstein, Edward G.; Holden, Zachary C.; Jagau, Thomas-C.; Ji, Hyunjun; Kaduk, Benjamin; Khistyaev, Kirill; Kim, Jaehoon; Kim, Jihan; King, Rollin A.; Klunzinger, Phil; Kosenkov, Dmytro; Kowalczyk, Tim; Krauter, Caroline M.; Lao, Ka Un; Laurent, Adèle D.; Lawler, Keith V.; Levchenko, Sergey V.; Lin, Ching Yeh; Liu, Fenglai; Livshits, Ester; Lochan, Rohini C.; Luenser, Arne; Manohar, Prashant; Manzer, Samuel F.; Mao, Shan-Ping; Mardirossian, Narbe; Marenich, Aleksandr V.; Maurer, Simon A.; Mayhall, Nicholas J.; Neuscamman, Eric; Oana, C. Melania; Olivares-Amaya, Roberto; O'Neill, Darragh P.; Parkhill, John A.; Perrine, Trilisa M.; Peverati, Roberto; Prociuk, Alexander; Rehn, Dirk R.; Rosta, Edina; Russ, Nicholas J.; Sharada, Shaama M.; Sharma, Sandeep; Small, David W.; Sodt, Alexander; Stein, Tamar; Stück, David; Su, Yu-Chuan; Thom, Alex J. 
W.; Tsuchimochi, Takashi; Vanovschi, Vitalii; Vogt, Leslie; Vydrov, Oleg; Wang, Tao; Watson, Mark A.; Wenzel, Jan; White, Alec; Williams, Christopher F.; Yang, Jun; Yeganeh, Sina; Yost, Shane R.; You, Zhi-Qiang; Zhang, Igor Ying; Zhang, Xing; Zhao, Yan; Brooks, Bernard R.; Chan, Garnet K. L.; Chipman, Daniel M.; Cramer, Christopher J.; Goddard, William A., III; Gordon, Mark S.; Hehre, Warren J.; Klamt, Andreas; Schaefer, Henry F., III; Schmidt, Michael W.; Sherrill, C. David; Truhlar, Donald G.; Warshel, Arieh; Xu, Xin; Aspuru-Guzik, Alán; Baer, Roi; Bell, Alexis T.; Besley, Nicholas A.; Chai, Jeng-Da; Dreuw, Andreas; Dunietz, Barry D.; Furlani, Thomas R.; Gwaltney, Steven R.; Hsu, Chao-Ping; Jung, Yousung; Kong, Jing; Lambrecht, Daniel S.; Liang, WanZhen; Ochsenfeld, Christian; Rassolov, Vitaly A.; Slipchenko, Lyudmila V.; Subotnik, Joseph E.; Van Voorhis, Troy; Herbert, John M.; Krylov, Anna I.; Gill, Peter M. W.; Head-Gordon, Martin
2015-01-01
A summary of the technical advances that are incorporated in the fourth major release of the Q-Chem quantum chemistry program is provided, covering approximately the last seven years. These include developments in density functional theory methods and algorithms, nuclear magnetic resonance (NMR) property evaluation, coupled cluster and perturbation theories, methods for electronically excited and open-shell species, tools for treating extended environments, algorithms for walking on potential surfaces, analysis tools, energy and electron transfer modelling, parallel computing capabilities, and graphical user interfaces. In addition, a selection of example case studies that illustrate these capabilities is given. These include extensive benchmarks of the comparative accuracy of modern density functionals for bonded and non-bonded interactions, tests of attenuated second order Møller-Plesset (MP2) methods for intermolecular interactions, a variety of parallel performance benchmarks, and tests of the accuracy of implicit solvation models. Some specific chemical examples include calculations on the strongly correlated Cr2 dimer, exploring zeolite-catalysed ethane dehydrogenation, energy decomposition analysis of a charged ter-molecular complex arising from glycerol photoionisation, and natural transition orbitals for a Frenkel exciton state in a nine-unit model of a self-assembling nanotube.
An introduction to Bayesian statistics in health psychology.
Depaoli, Sarah; Rus, Holly M; Clifton, James P; van de Schoot, Rens; Tiemensma, Jitske
2017-09-01
The aim of the current article is to provide a brief introduction to Bayesian statistics within the field of health psychology. Bayesian methods are increasing in prevalence in applied fields, and they have been shown in simulation research to improve the estimation accuracy of structural equation models, latent growth curve (and mixture) models, and hierarchical linear models. Likewise, Bayesian methods can be used with small sample sizes since they do not rely on large sample theory. In this article, we discuss several important components of Bayesian statistics as they relate to health-based inquiries. We discuss the incorporation and impact of prior knowledge into the estimation process and the different components of the analysis that should be reported in an article. We present an example implementing Bayesian estimation in the context of blood pressure changes after participants experienced an acute stressor. We conclude with final thoughts on the implementation of Bayesian statistics in health psychology, including suggestions for reviewing Bayesian manuscripts and grant proposals. We have also included an extensive amount of online supplementary material to complement the content presented here, including Bayesian examples using many different software programmes and an extensive sensitivity analysis examining the impact of priors.
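The conjugate normal-mean update at the heart of such a Bayesian analysis can be sketched in a few lines; the blood-pressure numbers below are hypothetical, chosen only to illustrate how a prior is combined with data (the article's own examples rely on dedicated Bayesian software).

```python
import math

# Conjugate normal-normal update for a mean: prior N(mu0, tau0^2),
# known sampling sd sigma, observed sample mean ybar from n subjects.
# All numbers below are hypothetical, for illustration only.
def posterior_mean_sd(mu0, tau0, sigma, n, ybar):
    prior_prec = 1.0 / tau0**2        # precision of the prior
    data_prec = n / sigma**2          # precision contributed by the data
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * mu0 + data_prec * ybar)
    return post_mean, math.sqrt(post_var)

# Prior belief: mean systolic BP rise of 10 mmHg (sd 5) after a stressor;
# data: 25 participants with an observed mean rise of 14 mmHg (sigma = 8).
mean, sd = posterior_mean_sd(mu0=10.0, tau0=5.0, sigma=8.0, n=25, ybar=14.0)
print(round(mean, 2), round(sd, 2))
```

The posterior mean lands between the prior mean and the sample mean, weighted by their precisions, which is the sensitivity-to-priors behaviour the supplementary material examines.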
NASA Technical Reports Server (NTRS)
Almroth, B. O.; Brogan, F. A.
1978-01-01
Basic information about the computer code STAGS (Structural Analysis of General Shells) is presented to describe to potential users the scope of the code and the solution procedures that are incorporated. Primarily, STAGS is intended for analysis of shell structures, although it has been extended to more complex shell configurations through the inclusion of springs and beam elements. The formulation is based on a variational approach in combination with local two-dimensional power series representations of the displacement components. The computer code includes options for analysis of linear or nonlinear static stress, stability, vibrations, and transient response. Material as well as geometric nonlinearities are included. A few examples of applications of the code are presented for further illustration of its scope.
Principal component regression analysis with SPSS.
Liu, R X; Kuang, J; Gong, Q; Hou, X L
2003-06-01
The paper introduces the indices used to diagnose multicollinearity, the basic principle of principal component regression, and a method for determining the 'best' equation. A worked example shows how to carry out principal component regression analysis with SPSS 10.0, covering the full calculation and the SPSS 10.0 procedures involved: linear regression, factor analysis, descriptives, compute variable, and bivariate correlations. Principal component regression can be used to overcome the disturbance of multicollinearity, and performing it in SPSS makes the analysis simpler, faster, and accurate.
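The core computation behind principal component regression can also be sketched with plain linear algebra, outside SPSS; the synthetic data and the choice of two retained components below are illustrative, not taken from the paper.

```python
import numpy as np

# Principal component regression sketch: regress y on the first k
# principal components of X (dropping the weak component created by
# collinearity), then map the coefficients back to the predictors.
rng = np.random.default_rng(0)
n = 100
z = rng.normal(size=n)
X = np.column_stack([z + 0.01 * rng.normal(size=n),   # collinear pair
                     z + 0.01 * rng.normal(size=n),
                     rng.normal(size=n)])
y = 2.0 * z + 0.5 * X[:, 2] + 0.1 * rng.normal(size=n)

Xc = X - X.mean(axis=0)                  # centre the predictors
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2                                    # keep 2 components
scores = Xc @ Vt[:k].T                   # principal component scores
gamma = np.linalg.lstsq(scores, y - y.mean(), rcond=None)[0]
beta = Vt[:k].T @ gamma                  # coefficients on original scale
print(beta.shape)
```

Dropping the near-zero component removes the unstable direction that ordinary least squares would amplify under multicollinearity.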
Epigenetics and Psychoneuroimmunology: Mechanisms and Models
Mathews, Herbert L.; Janusek, Linda Witek
2010-01-01
In this Introduction to the Named Series “Epigenetics, Brain, Behavior, and Immunity” an overview of epigenetics is provided with a consideration of the nature of epigenetic regulation including DNA methylation, histone modification and chromatin remodeling. Illustrative examples of recent scientific developments are highlighted to demonstrate the influence of epigenetics in areas of research relevant to those who investigate phenomena within the scientific discipline of psychoneuroimmunology. These examples are presented in order to provide a perspective on how epigenetic analysis will add insight into the molecular processes that connect the brain with behavior, neuroendocrine responsivity and immune outcome. PMID:20832468
Complex Synchronization Phenomena in Ecological Systems
NASA Astrophysics Data System (ADS)
Stone, Lewi; Olinky, Ronen; Blasius, Bernd; Huppert, Amit; Cazelles, Bernard
2002-07-01
Ecological and biological systems provide us with many striking examples of synchronization phenomena. Here we discuss a number of intriguing cases and attempt to explain them taking advantage of a modelling framework. One main focus will concern synchronized ecological and epidemiological cycles which have Uniform Phase growth associated with their regular recurrence, and Chaotic Amplitudes - a feature we term UPCA. Examples come from different areas and include decadal cycles of small mammals, recurrent viral epidemics such as childhood infections (e.g., measles), and seasonally driven phytoplankton blooms observed in lakes and the oceans. A more detailed theoretical analysis of seasonally synchronized chaotic population cycles is presented.
Comments on the "Byzantine Self-Stabilizing Pulse Synchronization" Protocol: Counter-examples
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.; Siminiceanu, Radu
2006-01-01
Embedded distributed systems have become an integral part of many safety-critical applications. There have been many attempts to solve the self-stabilization problem of clocks across a distributed system. An analysis of one such protocol called the Byzantine Self-Stabilizing Pulse Synchronization (BSS-Pulse-Synch) protocol from a paper entitled "Linear Time Byzantine Self-Stabilizing Clock Synchronization" by Daliot, et al., is presented in this report. This report also includes a discussion of the complexity and pitfalls of designing self-stabilizing protocols and provides counter-examples for the claims of the above protocol.
ParallABEL: an R library for generalized parallelization of genome-wide association studies.
Sangket, Unitsa; Mahasirimongkol, Surakameth; Chantratita, Wasun; Tandayya, Pichaya; Aulchenko, Yurii S
2010-04-29
Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. Acquiring the programming skills needed to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files is arduous. Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP), or trait, such as SNP characterization statistics or association test statistics. The input data of this group includes the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example, the summary statistics of genotype quality for each sample. The input data of this group includes individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses. The input data of this group includes pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as linkage disequilibrium characterisation. The input data of this group includes pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), comprising 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses.
For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance, and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL.
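The pairwise identity-by-state computation that ParallABEL distributes across processors can be sketched serially as follows; the 0/1/2 genotype coding and the function name are illustrative, not ParallABEL's API.

```python
import numpy as np

# Identity-by-state (IBS) sketch: genotypes coded as 0/1/2 copies of the
# reference allele, one row per individual. For each pair, IBS is the
# mean fraction of shared alleles per SNP, scaled to [0, 1].
def ibs_matrix(genotypes):
    g = np.asarray(genotypes, dtype=float)
    n = g.shape[0]
    ibs = np.empty((n, n))
    for i in range(n):
        ibs[i] = 1.0 - np.abs(g - g[i]).mean(axis=1) / 2.0
    return ibs

g = [[0, 1, 2, 2],
     [0, 1, 2, 0],
     [2, 1, 0, 0]]
m = ibs_matrix(g)
print(m[0, 0], m[0, 1])
```

Each row of the loop is independent of the others, which is what makes the pairwise-individual group of statistics straightforward to split across processors.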
Saccomani, Maria Pia; Audoly, Stefania; Bellu, Giuseppina; D'Angiò, Leontina
2010-04-01
DAISY (Differential Algebra for Identifiability of SYstems) is a recently developed computer algebra software tool which can be used to automatically check global identifiability of (linear and) nonlinear dynamic models described by differential equations involving polynomial or rational functions. Global identifiability is a fundamental prerequisite for model identification which is important not only for biological or medical systems but also for many physical and engineering systems derived from first principles. Lack of identifiability implies that the parameter estimation techniques may not fail, but any numerical estimates obtained will be meaningless. The software does not require understanding of the underlying mathematical principles and can be used by researchers in applied fields with a minimum of mathematical background. We illustrate the DAISY software by checking the a priori global identifiability of two benchmark nonlinear models taken from the literature. The analysis of these two examples includes comparison with other methods and demonstrates how identifiability analysis is simplified by this tool. We then illustrate the identifiability analysis of two further examples, including discussion of some specific aspects related to the role of observability and knowledge of initial conditions in testing identifiability and to the computational complexity of the software. The main focus of this paper is not on the description of the mathematical background of the algorithm, which has been presented elsewhere, but on illustrating its use and on some of its more interesting features. DAISY is available on the web site http://www.dei.unipd.it/~pia/. 2010 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Heffley, R. K.; Jewell, W. F.; Whitbeck, R. F.; Schulman, T. M.
1980-01-01
The effects of spurious delays in real time digital computing systems are examined. Various sources of spurious delays are defined and analyzed using an extant simulator system as an example. A specific analysis procedure is set forth and four cases are viewed in terms of their time and frequency domain characteristics. Numerical solutions are obtained for three single rate one- and two-computer examples, and the analysis problem is formulated for a two-rate, two-computer example.
Tuomisto, Martti T; Parkkinen, Lauri
2012-05-01
Verbal behavior, as in the use of terms, is an important part of scientific activity in general and behavior analysis in particular. Many glossaries and dictionaries of behavior analysis have been published in English, but few in any other language. Here we review the area of behavior analytic terminology, its translations, and development in languages other than English. As an example, we use our own mother tongue, Finnish, which provides a suitable example of the process of translation and development of behavior analytic terminology, because it differs from Indo-European languages and entails specific advantages and challenges in the translation process. We have published three editions of a general dictionary of behavior analysis including 801 terms relevant to the experimental analysis of behavior and applied behavior analysis and one edition of a dictionary of applied and clinical behavior analysis containing 280 terms. Because this work has been important to us, we hope this review will encourage similar work by behavior analysts in other countries whose native language is not English. Behavior analysis as an advanced science deserves widespread international dissemination and proper translations are essential to that goal.
Stakeholder Perceptions of Cyberbullying Cases: Application of the Uniform Definition of Bullying.
Moreno, Megan A; Suthamjariya, Nina; Selkie, Ellen
2018-04-01
The Uniform Definition of Bullying was developed to address bullying and cyberbullying, and to promote consistency in measurement and policy. The purpose of this study was to understand community stakeholder perceptions of typical cyberbullying cases, and to evaluate how these case descriptions align with the Uniform Definition. In this qualitative case analysis we recruited stakeholders commonly involved in cyberbullying. We used purposeful sampling to identify and recruit adolescents and young adults, parents, and professionals representing education and health care. Participants were asked to write a typical case of cyberbullying and descriptors in the context of a group discussion. We applied content analysis to case excerpts using inductive and deductive approaches, and chi-squared tests for mixed methods analyses. A total of 68 participants contributed; participants included 73% adults and 27% adolescents and young adults. A total of 650 excerpts were coded from participants' example cases and 362 (55.6%) were consistent with components of the Uniform Definition. The most frequently mentioned component of the Uniform Definition was Aggressive Behavior (n = 218 excerpts), whereas Repeated was mentioned infrequently (n = 19). Most participants included two to three components of the Uniform Definition within an example case; none of the example cases included all components of the Uniform Definition. We found that most participants described cyberbullying cases using few components of the Uniform Definition. Findings can be applied toward considering refinement of the Uniform Definition to ensure stakeholders find it applicable to cyberbullying. Copyright © 2017 The Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
Optical holographic structural analysis of Kevlar rocket motor cases
NASA Astrophysics Data System (ADS)
Harris, W. J.
1981-05-01
The methodology of applying optical holography to evaluation of subscale Kevlar 49 composite pressure vessels is explored. The results and advantages of the holographic technique are discussed. The cases utilized were of similar design, but each had specific design features, the effects of which are reviewed. Burst testing results are presented in conjunction with the holographic fringe patterns obtained during progressive pressurization. Examples of quantitative data extracted by analysis of fringe fields are included.
Exercise and Bone Density: Meta-Analysis
2003-10-01
[Abstract garbled in source; only fragments are recoverable: outcomes were estimated using previously developed methods; unpublished work was excluded because it has not gone through the peer review process; a change from 12.6% in the placebo group to 9.0% in the alendronate group; and a citation of Olkin I., Statistical Methods for Meta-Analysis, San Diego, CA: Academic Press; 1985.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmittroth, F.
1979-09-01
A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples.
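The kind of generalized least-squares combination FERRET performs, with explicit uncertainties propagated to the result, can be sketched as follows; the model matrix and covariance values are invented for illustration and do not reflect FERRET's input formats.

```python
import numpy as np

# Generalized least-squares sketch: combine measurements y with
# covariance V (uncertainties and correlations) under a linear model
# y = A x, and propagate the uncertainty of the estimate.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])        # third measurement sees the sum x1 + x2
y = np.array([2.0, 3.1, 5.3])
V = np.diag([0.04, 0.04, 0.09])  # measurement covariance (uncorrelated here)

Vinv = np.linalg.inv(V)
cov_x = np.linalg.inv(A.T @ Vinv @ A)  # covariance of the estimate
x = cov_x @ (A.T @ Vinv @ y)           # GLS estimate
print(np.round(x, 3))
```

The redundant third measurement pulls both estimates toward consistency, with weights set by the stated uncertainties; off-diagonal terms in V would handle correlated measurements the same way.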
NASA Technical Reports Server (NTRS)
Thompson, E.
1979-01-01
A finite element computer code for the analysis of mantle convection is described. The coupled equations for creeping viscous flow and heat transfer can be solved for either a transient analysis or steady-state analysis. For transient analyses, either a control volume or a control mass approach can be used. Non-Newtonian fluids with viscosities which have thermal and spatial dependencies can be easily incorporated. All material parameters may be written as function statements by the user or simply specified as constants. A wide range of boundary conditions, both for the thermal analysis and the viscous flow analysis can be specified. For steady-state analyses, elastic strain rates can be included. Although this manual was specifically written for users interested in mantle convection, the code is equally well suited for analysis in a number of other areas including metal forming, glacial flows, and creep of rock and soil.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1988-01-01
This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
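The model-fitting techniques the Standard names (linear, quadratic, and exponential models) can be sketched with NumPy; the time-series data below are synthetic, for illustration only.

```python
import numpy as np

# Trend-fitting sketch: linear and quadratic fits via polyfit, and an
# exponential trend fitted linearly in log space (log y = log b + a t).
t = np.arange(10, dtype=float)
y = 3.0 * np.exp(0.2 * t)                 # synthetic exponential trend

lin = np.polyfit(t, y, 1)                 # linear model coefficients
quad = np.polyfit(t, y, 2)                # quadratic model coefficients
loga, logb = np.polyfit(t, np.log(y), 1)  # exponential model, log space
print(round(loga, 3), round(np.exp(logb), 3))
```

Comparing the residuals of the three fits is the usual way to decide which descriptive model a trend warrants.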
Experimental analysis and modeling of melt growth processes
NASA Astrophysics Data System (ADS)
Müller, Georg
2002-04-01
Melt growth processes provide the basic crystalline materials for many applications. The research and development of crystal growth processes is therefore driven by the demands which arise from these specific applications; however, common goals include an increased uniformity of the relevant crystal properties at the micro- and macro-scale, a decrease of deleterious crystal defects, and an increase of crystal dimensions. As melt growth equipment and experimentation becomes more and more expensive, little room remains for improvements by trial and error procedures. A more successful strategy is to optimize the crystal growth process by a combined use of experimental process analysis and computer modeling. This will be demonstrated in this paper by several examples from the bulk growth of silicon, gallium arsenide, indium phosphide, and calcium fluoride. These examples also involve the most important melt growth techniques, crystal pulling (Czochralski methods) and vertical gradient freeze (Bridgman-type methods). The power and success of the above optimization strategy, however, is not limited only to the given examples but can be generalized and applied to many types of bulk crystal growth.
A Biosequence-based Approach to Software Characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oehmen, Christopher S.; Peterson, Elena S.; Phillips, Aaron R.
For many applications, it is desirable to have some process for recognizing when software binaries are closely related without relying on them to be identical or have identical segments. Some examples include monitoring utilization of high performance computing centers or service clouds, detecting freeware in licensed code, and enforcing application whitelists. But doing so in a dynamic environment is a nontrivial task because most approaches to software similarity require extensive and time-consuming analysis of a binary, or they fail to recognize executables that are similar but nonidentical. Presented herein is a novel biosequence-based method for quantifying similarity of executable binaries. Using this method, it is shown in an example application on large-scale multi-author codes that 1) the biosequence-based method has a statistical performance in recognizing and distinguishing between a collection of real-world high performance computing applications better than 90% of ideal; and 2) an example of using family tree analysis to tune identification for a code subfamily can achieve better than 99% of ideal performance.
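The basic encoding step behind a biosequence representation can be sketched as follows; the byte-to-residue mapping shown is an arbitrary illustration, not the encoding actually used in the paper.

```python
# Sketch of the biosequence idea: map executable bytes onto a 20-letter
# amino-acid alphabet so that off-the-shelf sequence-alignment tools can
# score similarity between binaries without requiring identical segments.
AMINO = "ACDEFGHIKLMNPQRSTVWY"

def bytes_to_sequence(blob):
    # Collapse each byte (0-255) onto one of the 20 residue letters.
    return "".join(AMINO[b % 20] for b in blob)

seq = bytes_to_sequence(b"\x00\x01\x14\xff")
print(seq)
```

Once binaries are strings over a residue alphabet, alignment scores between them give a similarity measure that tolerates insertions, deletions, and substitutions.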
A concept for holistic whole body MRI data analysis, Imiomics
Malmberg, Filip; Johansson, Lars; Lind, Lars; Sundbom, Magnus; Ahlström, Håkan; Kullberg, Joel
2017-01-01
Purpose: To present and evaluate a whole-body image analysis concept, Imiomics (imaging–omics), and an image registration method that enables Imiomics analyses by deforming all image data to a common coordinate system, so that the information in each voxel can be compared between persons or within a person over time and integrated with non-imaging data. Methods: The presented image registration method utilizes relative elasticity constraints of different tissue obtained from whole-body water-fat MRI. The registration method is evaluated by inverse consistency and Dice coefficients and the Imiomics concept is evaluated by example analyses of importance for metabolic research using non-imaging parameters where we know what to expect. The example analyses include whole body imaging atlas creation, anomaly detection, and cross-sectional and longitudinal analysis. Results: The image registration method evaluation on 128 subjects shows low inverse consistency errors and high Dice coefficients. Also, the statistical atlas with fat content intensity values shows low standard deviation values, indicating successful deformations to the common coordinate system. The example analyses show expected associations and correlations which agree with explicit measurements, and thereby illustrate the usefulness of the proposed Imiomics concept. Conclusions: The registration method is well-suited for Imiomics analyses, which enable analyses of relationships to non-imaging data, e.g. clinical data, in new types of holistic targeted and untargeted big-data analysis. PMID:28241015
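The Dice coefficient used to evaluate the registration is a standard overlap measure for binary segmentations; the function and the toy masks below are illustrative, not the study's data.

```python
import numpy as np

# Dice coefficient sketch for two binary masks: twice the intersection
# divided by the sum of the mask sizes, ranging from 0 (disjoint) to 1
# (identical).
def dice(a, b):
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

a = np.array([[1, 1, 0],
              [0, 1, 0]])
b = np.array([[1, 0, 0],
              [0, 1, 1]])
print(dice(a, b))
```

High Dice values between corresponding tissue masks after deformation indicate the registration has mapped structures onto the common coordinate system successfully.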
User's Guide to Handlens - A Computer Program that Calculates the Chemistry of Minerals in Mixtures
Eberl, D.D.
2008-01-01
HandLens is a computer program, written in Excel macro language, that calculates the chemistry of minerals in mineral mixtures (for example, in rocks, soils and sediments) for related samples from inputs of quantitative mineralogy and chemistry. For best results, the related samples should contain minerals having the same chemical compositions; that is, the samples should differ only in the proportions of minerals present. This manual describes how to use the program, discusses the theory behind its operation, and presents test results of the program's accuracy. Required input for HandLens includes quantitative mineralogical data, obtained, for example, by RockJock analysis of X-ray diffraction (XRD) patterns, and quantitative chemical data, obtained, for example, by X-ray fluorescence (XRF) analysis of the same samples. Other quantitative data, such as sample depth, temperature, and surface area, also can be entered. The minerals present in the samples are selected from a list, and the program is started. The results of the calculation include: (1) a table of linear coefficients of determination (r2 values) which relate pairs of input data (for example, Si versus quartz weight percents); (2) a utility for plotting all input data, either as pairs of variables, or as sums of up to eight variables; (3) a table that presents the calculated chemical formulae for minerals in the samples; (4) a table that lists the calculated concentrations of major, minor, and trace elements in the various minerals; and (5) a table that presents chemical formulae for the minerals that have been corrected for possible systematic errors in the mineralogical and/or chemical analyses. In addition, the program contains a method for testing the assumption of constant chemistry of the minerals within a sample set.
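The linear r2 table that HandLens produces for pairs of inputs reduces to a standard coefficient-of-determination computation; the Si and quartz values below are made up for illustration.

```python
import numpy as np

# Coefficient-of-determination sketch for one pair of HandLens inputs:
# elemental Si (e.g. from XRF) versus quartz weight percent (e.g. from
# XRD) across related samples. Values are hypothetical.
si = np.array([20.1, 25.4, 30.2, 35.0, 40.3])      # wt% Si
quartz = np.array([30.0, 41.0, 52.5, 63.0, 74.5])  # wt% quartz

r = np.corrcoef(si, quartz)[0, 1]
print(round(r * r, 4))
```

A high r2 for such a pair supports the program's working assumption that the samples differ only in mineral proportions, not in mineral chemistry.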
NASA Technical Reports Server (NTRS)
Deng, Xiaomin; Newman, James C., Jr.
1997-01-01
ZIP2DL is a two-dimensional, elastic-plastic finite element program for stress analysis and crack growth simulations, developed for the NASA Langley Research Center. It has many of the salient features of the ZIP2D program. For example, ZIP2DL contains five material models (linearly elastic, elastic-perfectly plastic, power-law hardening, linear hardening, and multi-linear hardening models), and it can simulate mixed-mode crack growth for prescribed crack growth paths under plane stress, plane strain, and mixed state of stress conditions. Further, as an extension of ZIP2D, it also includes a number of new capabilities. The large-deformation kinematics in ZIP2DL allow it to handle elastic problems with large strains and large rotations, and elastic-plastic problems with small strains and large rotations. Loading conditions in terms of surface traction, concentrated load, and nodal displacement can be applied with a default linear time dependence, or they can be programmed according to a user-defined time dependence through a user subroutine. The restart capability of ZIP2DL makes it possible to stop the execution of the program at any time, analyze the results and/or modify execution options, and resume and continue the execution of the program. This report includes three sections: a theoretical manual section, a user manual section, and an example manual section. In the theoretical section, the mathematics behind the various aspects of the program are concisely outlined. In the user manual section, a line-by-line explanation of the input data is given. In the example manual section, three types of examples are presented to demonstrate the accuracy and illustrate the use of this program.
Bivariate normal, conditional and rectangular probabilities: A computer program with applications
NASA Technical Reports Server (NTRS)
Swaroop, R.; Brownlow, J. D.; Ashworth, G. R.; Winter, W. R.
1980-01-01
Some results for the bivariate normal distribution analysis are presented. Computer programs for conditional normal probabilities, marginal probabilities, as well as joint probabilities for rectangular regions are given; routines for computing fractile points and distribution functions are also presented. Some examples from a closed circuit television experiment are included.
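A rectangular bivariate normal probability of the kind these programs compute can be estimated by simple Monte Carlo simulation; the correlation and region below are arbitrary, and the programs described here use numerical routines rather than simulation.

```python
import numpy as np

# Monte Carlo sketch of a rectangular bivariate normal probability,
# P(0 < X < 1, 0 < Y < 1) for standard normals with correlation rho.
rng = np.random.default_rng(42)
rho = 0.5
cov = [[1.0, rho], [rho, 1.0]]
xy = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000)
inside = (xy > 0.0).all(axis=1) & (xy < 1.0).all(axis=1)
print(round(inside.mean(), 3))
```

With zero correlation the rectangle probability factors into a product of one-dimensional probabilities; positive correlation raises it, which the simulation makes easy to check.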
Evaluating Public Libraries Using Standard Scores: The Library Quotient.
ERIC Educational Resources Information Center
O'Connor, Daniel O.
1982-01-01
Describes a method for assessing the performance of public libraries using a standardized scoring system and provides an analysis of public library data from New Jersey as an example. Library standards and the derivation of measurement ratios are also discussed. A 33-item bibliography and three data tables are included. (JL)
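The standard-score idea behind such a quotient can be sketched as follows; the mean-100 rescaling and the sample data are illustrative assumptions, not the article's actual constants or New Jersey figures.

```python
# Standard-score sketch of a quotient-style library measure: convert
# each library's raw value to a z-score against the group distribution,
# then rescale to a mean-100 scale for easier comparison.
def standard_scores(values, mean_=100.0, sd_=15.0):
    n = len(values)
    mu = sum(values) / n
    var = sum((v - mu) ** 2 for v in values) / n
    sd = var ** 0.5
    return [mean_ + sd_ * (v - mu) / sd for v in values]

circulation_per_capita = [4.0, 6.0, 8.0, 10.0, 12.0]
print([round(s, 1) for s in standard_scores(circulation_per_capita)])
```

On this scale a library scoring 115 sits one standard deviation above the group on that measure, regardless of the measure's raw units.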
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burns, Kimberly A.
2009-08-01
The accurate and efficient simulation of coupled neutron-photon problems is necessary for several important radiation detection applications. Examples include the detection of nuclear threats concealed in cargo containers and prompt gamma neutron activation analysis for nondestructive determination of elemental composition of unknown samples.
The Treatment of Propaganda in Selected Social Studies Texts.
ERIC Educational Resources Information Center
Fleming, Dan B.
1985-01-01
A survey found that secondary U.S. history textbooks provided the most coverage of propaganda and included the largest number of examples for student analysis. Very little coverage of propaganda was found in world geography and world history texts. A few government texts provided excellent coverage, but most gave the subject little attention. (RM)
New chalcones bearing a long-chain alkylphenol from the rhizomes of Alpinia galanga.
Yang, Wan-Qiu; Gao, Yuan; Li, Ming; Miao, De-Ren; Wang, Fei
2015-01-01
Three novel chalcones bearing a long-chain alkylphenol, galanganones A-C (1-3), were isolated from the rhizomes of Alpinia galanga. Their structures were elucidated by extensive spectroscopic analysis including 2D NMR experiments. Compounds 1-3 represent the first examples of long-chain alkylphenol-coupled chalcones.
30 CFR 250.905 - How do I get approval for the installation, modification, or repair of my platform?
Code of Federal Regulations, 2010 CFR
2010-07-01
..., DEPARTMENT OF THE INTERIOR OFFSHORE OIL AND GAS AND SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF...., cathodic protection systems; jacket design; pile foundations; drilling, production, and pipeline risers and... design or analysis of the platform. Examples of relevant data include information on waves, wind, current...
NASA Technical Reports Server (NTRS)
Arnold, J.; Cheatwood, N.; Powell, D.; Wolf, A.; Guensey, C.; Rivellini, T.; Venkatapathy, E.; Beard, T.; Beutter, B.; Laub, B.
2005-01-01
Contents include the following: listing of critical capabilities (knowledge, procedures, training, facilities) and metrics for validating that they are mission ready; examples of critical capabilities and validation metrics (ground test and simulations); flight testing to prove capabilities are mission ready; and issues and recommendations.
Access to Information in Both CitaDel and FirstSearch: A Comparative Study of Dissertation Coverage.
ERIC Educational Resources Information Center
Perry, Stephen; Salisbury, Lutishoor
1995-01-01
Presents a comparative analysis of electronic access to theses and dissertations through CitaDel and FirstSearch. Highlights include the effectiveness and ease of use in providing end-user access; strengths and weaknesses of searching capabilities; coverage; pricing; and examples of direct retrieval comparison. (LRW)
New Technologies in Maritime Education and Training, Turkish Experiment
ERIC Educational Resources Information Center
Erdogan, Oral; Demirel, Ergun
2017-01-01
The aim of this study is to introduce new technologies and approaches in the maritime education and training (MET) and Turkish experiment/acquisitions/contributions including some analysis which may be helpful for the future studies on this subject. As an example of such an effort, Turkish experiment/contribution on seafaring officer education…
Innovation in the Community College.
ERIC Educational Resources Information Center
O'Banion, Terry
An analysis is provided of the innovations that have marked the community college movement during the 1980s, including speculations about their effect on postsecondary education in the 1990s. The authors of the 13 chapters of the book were directed to cite examples of innovative practices from a range of community colleges to illustrate their…
A Decade of Charter Schools: From Theory to Practice.
ERIC Educational Resources Information Center
Bulkley, Katrina; Fisler, Jennifer
2003-01-01
Analysis of a selected set of charter-school research reports through late 2001. Finds, for example, that charter schools are more autonomous than other public schools, but that the jury is still out on some of the most important questions, including those about innovation, accountability, equity, and outcomes. Provides a framework for examining…
TMF design considerations in turbine airfoils of advanced turbine engines
NASA Astrophysics Data System (ADS)
Date, C. G.; Zamrik, S. Y.; Adams, J. H.; Frani, N. E.
A review of thermal-mechanical fatigue (TMF) in advanced turbine engines is presented. The review includes examples of typical thermal-mechanical loadings encountered in the design of hot section blades and vanes. Specific issues related to TMF behavior are presented, and the associated impact on component life analysis and design is discussed.
29 CFR 2520.104-20 - Limited exemption for certain small welfare plans.
Code of Federal Regulations, 2014 CFR
2014-07-01
... any other requirement of title I of the Act, including the provisions which require that plan... authorize the Secretary of Labor to collect information and data from employee benefit plans for research and analysis (section 513). (d) Examples. (1) A welfare plan has 75 participants at the beginning of...
29 CFR 2520.104-20 - Limited exemption for certain small welfare plans.
Code of Federal Regulations, 2013 CFR
2013-07-01
... any other requirement of title I of the Act, including the provisions which require that plan... authorize the Secretary of Labor to collect information and data from employee benefit plans for research and analysis (section 513). (d) Examples. (1) A welfare plan has 75 participants at the beginning of...
Emergent Complex Behavior in Social Networks: Examples from the Ktunaxa Speech Community
ERIC Educational Resources Information Center
Horsethief, Christopher
2012-01-01
Language serves as a primary tool for structuring identity and loss of language represents the loss of that identity. This study utilizes a social network analysis of Ktunaxa speech community activities for evidence of internally generated revitalization efforts. These behaviors include instances of self-organized emergence. Such emergent behavior…
A summary and evaluation of semi-empirical methods for the prediction of helicopter rotor noise
NASA Technical Reports Server (NTRS)
Pegg, R. J.
1979-01-01
Existing prediction techniques are compiled and described. The descriptions include input and output parameter lists, required equations and graphs, and the range of validity for each part of the prediction procedures. Examples are provided illustrating the analysis procedure and the degree of agreement with experimental results.
Beyond Teachers' Sight Lines: Using Video Modeling to Examine Peer Discourse
ERIC Educational Resources Information Center
Kotsopoulos, Donna
2008-01-01
This article introduces readers to various examples of discourse analysis in mathematics education. Highlighted is interactional sociolinguistics, used in a present study to investigate peer discourse in a middle-school setting. Key findings from this study include the benefits of video modeling as a mechanism for fostering inclusive peer group…
Strategic Long Range Planning for Universities. AIR Forum 1980 Paper.
ERIC Educational Resources Information Center
Baker, Michael E.
The use of strategic long-range planning at Carnegie-Mellon University (CMU) is discussed. A structure for strategic planning analysis that integrates existing techniques is presented, and examples of planning activities at CMU are included. The key concept in strategic planning is competitive advantage: if a university has a competitive…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dionne, B.J.; Morris, S.C. III; Baum, J.W.
1998-01-01
The Department of Energy's (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in their guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory's Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an "As Low As Reasonably Achievable" (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique. This document contains the Appendices for the report.
Fast and accurate modeling of stray light in optical systems
NASA Astrophysics Data System (ADS)
Perrin, Jean-Claude
2017-11-01
The first problem to be solved in most optical designs with respect to stray light is that of internal reflections on the several surfaces of individual lenses and mirrors, and on the detector itself. The stray light ratio can be considerably reduced by taking stray light into account during optimization, seeking solutions in which the irradiance due to these ghosts is kept to the minimum possible value. Unfortunately, the routines available in most optical design software packages, for example CODE V, do not by themselves permit exact quantitative calculation of the stray light due to these ghosts. The engineer in charge of the optical design is therefore confronted with the problem of using two different packages: one for design and optimization, for example CODE V, and one for stray light analysis, for example ASAP. This makes a complete optimization very complex. Nevertheless, using special techniques and combinations of the routines available in CODE V, it is possible to build a software macro tool that performs such an analysis quickly and accurately, including Monte-Carlo ray tracing, and taking into account diffraction effects. This analysis can be done in a few minutes, compared to hours with other packages.
Chapter 5: Modulation Excitation Spectroscopy with Phase-Sensitive Detection for Surface Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shulda, Sarah; Richards, Ryan M.
Advancements in in situ spectroscopic techniques have led to significant progress in elucidating heterogeneous reaction mechanisms. The potential of these methods is often limited only by the complexity of the system and noise in the data. Short-lived intermediates can be challenging, if not impossible, to identify with conventional spectral analysis. Often equally difficult is separating signals that arise from active and inactive species. Modulation excitation spectroscopy combined with phase-sensitive detection analysis is a powerful tool for removing noise from the data while simultaneously revealing the underlying kinetics of the reaction. A stimulus is applied at a constant frequency to the reaction system, for example, a reactant cycled with an inert phase. Through mathematical manipulation of the data, any signal contributing to the overall spectra but not oscillating with the same frequency as the stimulus will be dampened or removed. With phase-sensitive detection, signals oscillating with the stimulus frequency but with various lag times are amplified, providing valuable kinetic information. In this chapter, some examples are provided from the literature that have successfully used modulation excitation spectroscopy with phase-sensitive detection to uncover previously unobserved reaction intermediates and kinetics. Examples from a broad range of spectroscopic methods are included to provide perspective to the reader.
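The phase-sensitive detection step described above is, at its core, a demodulation integral: the time-resolved signal is projected onto a reference oscillating at the stimulus frequency, so components not locked to that frequency average toward zero over an integer number of periods. A minimal numerical sketch, generic and not tied to any particular spectrometer; the function name and trapezoidal integration are illustrative assumptions:

```python
import numpy as np

def psd_demodulate(signal, t, f_stim, phases):
    """Phase-sensitive detection: project a time-resolved signal onto
    sin(2*pi*f_stim*t + phi) for each demodulation phase phi.
    Components not oscillating at f_stim are suppressed when the record
    spans an integer number of stimulus periods."""
    T = t[-1] - t[0]
    dt = np.diff(t)
    out = {}
    for phi in phases:
        ref = np.sin(2.0 * np.pi * f_stim * t + phi)
        y = signal * ref
        # trapezoidal approximation of (2/T) * integral y(t) dt
        out[phi] = (2.0 / T) * np.sum(0.5 * (y[1:] + y[:-1]) * dt)
    return out
```

Sweeping the phase angle phi then amplifies species whose response lags the stimulus by different amounts, which is the kinetic information the chapter describes.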
A modal analysis of lamellar diffraction gratings in conical mountings
NASA Technical Reports Server (NTRS)
Li, Lifeng
1992-01-01
A rigorous modal analysis of lamellar gratings, i.e., gratings having rectangular grooves, in conical mountings is presented. It is an extension of the analysis of Botten et al., which considered non-conical mountings. A key step in the extension is a decomposition of the electromagnetic field in the grating region into two orthogonal components. A computer program implementing this extended modal analysis is capable of dealing with plane-wave diffraction by dielectric and metallic gratings with deep grooves, at arbitrary angles of incidence, and with arbitrary incident polarizations. Some numerical examples are included.
2013-03-31
certainly remain comingled with other solid waste. For example, some bases provided containers for segregation of recyclables including plastic and...prevalent types of solid waste are food (19.1% by average sample weight), wood (18.9%), and plastics (16.0%) based on analysis of bases in...within the interval shown. Food and wood wastes are the largest components of the average waste stream (both at ~19% by weight), followed by plastic
Extension of a System Level Tool for Component Level Analysis
NASA Technical Reports Server (NTRS)
Majumdar, Alok; Schallhorn, Paul
2002-01-01
This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear driven flow in a rectangular cavity) are presented as benchmark for the verification of the numerical scheme.
Extension of a System Level Tool for Component Level Analysis
NASA Technical Reports Server (NTRS)
Majumdar, Alok; Schallhorn, Paul; McConnaughey, Paul K. (Technical Monitor)
2001-01-01
This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow, and shear driven flow in a rectangular cavity) are presented as benchmark for the verification of the numerical scheme.
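The Poiseuille-flow benchmark named above has a closed-form parabolic solution, which is what makes it useful for verifying a discretization of the momentum equation. A sketch of such a verification in Python, using a generic finite-difference solve rather than the network flow analysis code itself:

```python
import numpy as np

def poiseuille_fd(n, h, dpdx, mu):
    """Finite-difference solution of laminar plane Poiseuille flow:
    mu * d2u/dy2 = dp/dx with no-slip walls u(+-h) = 0.
    The analytic benchmark is u(y) = (-dpdx / (2*mu)) * (h**2 - y**2)."""
    y = np.linspace(-h, h, n)
    dy = y[1] - y[0]
    # tridiagonal Laplacian with Dirichlet boundary conditions
    A = np.zeros((n, n))
    b = np.full(n, dpdx / mu * dy**2)
    for i in range(1, n - 1):
        A[i, i - 1] = 1.0
        A[i, i] = -2.0
        A[i, i + 1] = 1.0
    A[0, 0] = A[-1, -1] = 1.0
    b[0] = b[-1] = 0.0
    u = np.linalg.solve(A, b)
    return y, u
```

Because the exact profile is quadratic, the second-order stencil reproduces it to machine precision, making the comparison a sharp pass/fail test of the scheme.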
FRAP Analysis: Accounting for Bleaching during Image Capture
Wu, Jun; Shekhar, Nandini; Lele, Pushkar P.; Lele, Tanmay P.
2012-01-01
The analysis of Fluorescence Recovery After Photobleaching (FRAP) experiments involves mathematical modeling of the fluorescence recovery process. An important feature of FRAP experiments that tends to be ignored in the modeling is that there can be a significant loss of fluorescence due to bleaching during image capture. In this paper, we explicitly include the effects of bleaching during image capture in the model for the recovery process, instead of correcting for the effects of bleaching using reference measurements. Using experimental examples, we demonstrate the usefulness of such an approach in FRAP analysis. PMID:22912750
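The idea of folding capture bleaching into the recovery model, rather than correcting for it with reference measurements, can be sketched with a generic single-exponential recovery multiplied by a per-frame bleaching factor. This is an illustrative stand-in, not the specific model of the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

def frap_with_capture_bleach(n, f0, finf, tau, k, dt=1.0):
    """Single-exponential FRAP recovery, attenuated by exp(-k*n) to model
    the bleaching caused by capturing frame n. Generic sketch: f0 is the
    post-bleach intensity, finf the recovery plateau, tau the recovery
    time constant, k the per-frame capture-bleach rate."""
    t = n * dt
    recovery = finf - (finf - f0) * np.exp(-t / tau)
    return recovery * np.exp(-k * n)
```

Fitting this joint model to the raw frames recovers both the recovery time constant and the bleaching rate in one step, which is the approach the abstract advocates.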
NASA Technical Reports Server (NTRS)
Fertis, D. G.; Simon, A. L.
1981-01-01
The requisite methodology to solve linear and nonlinear problems associated with the static and dynamic analysis of rotating machinery, their static and dynamic behavior, and the interaction between the rotating and nonrotating parts of an engine is developed. Linear and nonlinear structural engine problems are investigated by developing solution strategies and interactive computational methods whereby the man and computer can communicate directly in making analysis decisions. Representative examples include modifying structural models, changing material parameters, selecting analysis options, and coupling with interactive graphical display for pre- and postprocessing capability.
Hauber, A Brett; González, Juan Marcos; Groothuis-Oudshoorn, Catharina G M; Prior, Thomas; Marshall, Deborah A; Cunningham, Charles; IJzerman, Maarten J; Bridges, John F P
2016-06-01
Conjoint analysis is a stated-preference survey method that can be used to elicit responses that reveal preferences, priorities, and the relative importance of individual features associated with health care interventions or services. Conjoint analysis methods, particularly discrete choice experiments (DCEs), have been increasingly used to quantify preferences of patients, caregivers, physicians, and other stakeholders. Recent consensus-based guidance on good research practices, including two recent task force reports from the International Society for Pharmacoeconomics and Outcomes Research, has aided in improving the quality of conjoint analyses and DCEs in outcomes research. Nevertheless, uncertainty regarding good research practices for the statistical analysis of data from DCEs persists. There are multiple methods for analyzing DCE data. Understanding the characteristics and appropriate use of different analysis methods is critical to conducting a well-designed DCE study. This report will assist researchers in evaluating and selecting among alternative approaches to conducting statistical analysis of DCE data. We first present a simplistic DCE example and a simple method for using the resulting data. We then present a pedagogical example of a DCE and one of the most common approaches to analyzing data from such a question format: conditional logit. We then describe some common alternative methods for analyzing these data and the strengths and weaknesses of each alternative. We present the ESTIMATE checklist, which includes a list of questions to consider when justifying the choice of analysis method, describing the analysis, and interpreting the results. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
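Conditional logit, the common DCE analysis approach named above, maximizes the likelihood that each respondent's chosen alternative has the highest utility. A minimal sketch of the negative log-likelihood under generic attribute coding; the function and variable names are illustrative assumptions, not the report's notation:

```python
import numpy as np
from scipy.optimize import minimize

def conditional_logit_nll(beta, X, chosen):
    """Negative log-likelihood of McFadden's conditional logit.
    X: (n_tasks, n_alternatives, n_attributes) attribute levels.
    chosen: index of the alternative selected in each choice task."""
    beta = np.asarray(beta)
    v = X @ beta                        # deterministic utilities
    v = v - v.max(axis=1, keepdims=True)  # numerical stability
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(chosen)), chosen].sum()
```

Minimizing this over beta yields attribute weights whose ratios are the relative-importance estimates that DCE studies report.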
Calculation of the aerodynamic loading of swept and unswept flexible wings of arbitrary stiffness
NASA Technical Reports Server (NTRS)
Diederich, Franklin W
1950-01-01
A method is presented for calculating the aerodynamic loading, the divergence speed, and certain stability derivatives of swept and unswept wings and tail surfaces of arbitrary stiffness. Provision is made for using either stiffness curves and root rotation constants or structural influence coefficients in the analysis. Computing forms, tables of numerical constants required in the analysis, and an illustrative example are included to facilitate calculations by means of the method.
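The divergence speed arises where the coupled aeroelastic loading equation, schematically l = l_rigid + q*S*A*l, loses a unique solution, i.e. where (I - q*S*A) becomes singular. A hedged matrix sketch of that condition; the influence-coefficient matrices here are generic, not Diederich's computing forms:

```python
import numpy as np

def divergence_pressure(S, A):
    """Smallest dynamic pressure q for which (I - q*S*A) is singular,
    the static divergence condition for l = l_rigid + q*S*A*l.
    S: structural influence coefficients (twist per unit load).
    A: aerodynamic influence coefficients (load per unit twist, per q).
    Returns np.inf when no positive real eigenvalue exists."""
    eig = np.linalg.eigvals(S @ A)
    real = eig[np.abs(np.imag(eig)) < 1e-12].real
    pos = real[real > 0]
    return 1.0 / pos.max() if pos.size else np.inf
```

The reciprocal of the largest positive eigenvalue of S*A gives the lowest divergence pressure, which is the quantity a stiffness-curve or influence-coefficient analysis must deliver.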
Evidence from machines that learn and think like people.
Forbus, Kenneth D; Gentner, Dedre
2017-01-01
We agree with Lake et al.'s trenchant analysis of deep learning systems, including that they are highly brittle and that they need vastly more examples than do people. We also agree that human cognition relies heavily on structured relational representations. However, we differ in our analysis of human cognitive processing. We argue that (1) analogical comparison processes are central to human cognition; and (2) intuitive physical knowledge is captured by qualitative representations, rather than quantitative simulations.
Solution of elastic-plastic stress analysis problems by the p-version of the finite element method
NASA Technical Reports Server (NTRS)
Szabo, Barna A.; Actis, Ricardo L.; Holzer, Stefan M.
1993-01-01
The solution of small strain elastic-plastic stress analysis problems by the p-version of the finite element method is discussed. The formulation is based on the deformation theory of plasticity and the displacement method. Practical realization of controlling discretization errors for elastic-plastic problems is the main focus. Numerical examples which include comparisons between the deformation and incremental theories of plasticity under tight control of discretization errors are presented.
Data Mining: The Art of Automated Knowledge Extraction
NASA Astrophysics Data System (ADS)
Karimabadi, H.; Sipes, T.
2012-12-01
Data mining algorithms are used routinely in a wide variety of fields and are gaining adoption in the sciences. The realities of real-world data analysis are that (a) data has flaws, and (b) the models and assumptions that we bring to the data are inevitably flawed, biased, or misspecified in some way. Data mining can improve data analysis by detecting anomalies in the data, checking the consistency of the user's model assumptions, and deciphering complex patterns and relationships that would not be accessible otherwise. The common form of data collected from in situ spacecraft measurements is multivariate time series, which represent one of the most challenging problems in data mining. We have successfully developed algorithms to deal with such data and have extended the algorithms to handle streaming data. In this talk, we illustrate the utility of our algorithms through several examples, including automated detection of reconnection exhausts in the solar wind and flux ropes in the magnetotail. We also show examples from successful applications of our technique to the analysis of 3D kinetic simulations. With an eye to the future, we provide an overview of our upcoming plans, which include collaborative data mining, expert outsourcing of data mining, computer vision for image analysis, among others. Finally, we discuss the integration of data mining algorithms with web-based services such as VxOs and other Heliophysics data centers and the resulting capabilities it would enable.
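As a concrete, if greatly simplified, example of anomaly detection in a time series, a trailing z-score detector flags points that fall far from the recent mean. The abstract does not specify its algorithms, so this is a generic stand-in rather than the authors' method:

```python
import numpy as np

def rolling_zscore_anomalies(x, window, thresh=4.0):
    """Flag samples whose deviation from the trailing `window`-sample
    mean exceeds `thresh` trailing standard deviations. A minimal
    single-channel anomaly detector; real multivariate event detection
    (e.g., reconnection exhausts) requires far richer features."""
    x = np.asarray(x, dtype=float)
    flags = np.zeros(len(x), dtype=bool)
    for i in range(window, len(x)):
        w = x[i - window:i]
        s = w.std()
        if s > 0 and abs(x[i] - w.mean()) > thresh * s:
            flags[i] = True
    return flags
```

Because only trailing samples are used, the same logic applies unchanged to streaming data, one of the extensions the talk highlights.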
NASA Technical Reports Server (NTRS)
Radhakrishnan, Krishnan; Bittker, David A.
1994-01-01
LSENS, the Lewis General Chemical Kinetics Analysis Code, has been developed for solving complex, homogeneous, gas-phase chemical kinetics problems and contains sensitivity analysis for a variety of problems, including nonisothermal situations. This report is part 2 of a series of three reference publications that describe LSENS, provide a detailed guide to its usage, and present many example problems. Part 2 describes the code, how to modify it, and its usage, including preparation of the problem data file required to execute LSENS. Code usage is illustrated by several example problems, which further explain preparation of the problem data file and show how to obtain desired accuracy in the computed results. LSENS is a flexible, convenient, accurate, and efficient solver for chemical reaction problems such as static systems; steady, one-dimensional, inviscid flow; reaction behind an incident shock wave, including boundary layer correction; and the perfectly stirred (highly backmixed) reactor. In addition, the chemical equilibrium state can be computed for the following assigned states: temperature and pressure, enthalpy and pressure, temperature and volume, and internal energy and volume. For static problems the code computes the sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of the chemical reactions. Part 1 (NASA RP-1328) derives the governing equations and describes the numerical solution procedures for the types of problems that can be solved by LSENS. Part 3 (NASA RP-1330) explains the kinetics and kinetics-plus-sensitivity-analysis problems supplied with LSENS and presents sample results.
Particle analysis using laser ablation mass spectroscopy
Parker, Eric P.; Rosenthal, Stephen E.; Trahan, Michael W.; Wagner, John S.
2003-09-09
The present invention provides a method of quickly identifying bioaerosols by class, even if the subject bioaerosol has not been previously encountered. The method begins by collecting laser ablation mass spectra from known particles. The spectra are correlated with the known particles, including the species of particle and the classification (e.g., bacteria). The spectra can then be used to train a neural network, for example using genetic algorithm-based training, to recognize each spectrum and to recognize characteristics of the classifications. The spectra can also be used in a multivariate patch algorithm. Laser ablation mass spectra from unknown particles can be presented as inputs to the trained neural net for identification as to classification. The description below first describes suitable intelligent algorithms and multivariate patch algorithms, then presents an example of the present invention, including results.
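The patent describes training a neural network on labeled spectra so that unknown particles can be assigned to a class. A minimal stand-in is a softmax classifier trained by gradient descent on synthetic two-peak "spectra"; the genetic-algorithm training and the multivariate patch algorithm of the patent are not reproduced here:

```python
import numpy as np

def train_softmax_classifier(X, y, n_classes, lr=0.5, epochs=300, seed=0):
    """Minimal softmax (multinomial logistic) classifier trained by
    batch gradient descent on cross-entropy loss. X: (n_samples, n_bins)
    spectra; y: integer class labels."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.01, size=(X.shape[1], n_classes))
    b = np.zeros(n_classes)
    Y = np.eye(n_classes)[y]                     # one-hot labels
    for _ in range(epochs):
        z = X @ W + b
        z = z - z.max(axis=1, keepdims=True)     # numerical stability
        p = np.exp(z)
        p /= p.sum(axis=1, keepdims=True)
        g = (p - Y) / len(X)                     # cross-entropy gradient
        W -= lr * (X.T @ g)
        b -= lr * g.sum(axis=0)
    return W, b

def classify(W, b, X):
    """Assign each spectrum to the highest-scoring class."""
    return np.argmax(X @ W + b, axis=1)
```

The key property the patent relies on, generalization to class-level features rather than memorized species, comes from training on many labeled spectra per class.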
The Philosophical Basis of Bioethics.
Horn, Peter
2015-09-01
In this article, I consider in what sense bioethics is philosophical. Philosophy includes both analysis and synthesis. Analysis focuses on central concepts in a domain, for example, informed consent, death, medical futility, and health. It is argued that analysis should avoid oversimplification. The synthesis or synoptic dimension prompts people to explain how their views have logical assumptions and implications. In addition to the conceptual elements are the evaluative and empirical dimensions. Among its functions, philosophy can be a form of prophylaxis--helping people avoid some commonly accepted questionable theories. Generally, recent philosophy has steered away from algorithms and deductivist approaches to ethical justification. In bioethics, philosophy works in partnership with a range of other disciplines, including pediatrics and neurology. Copyright © 2015 Elsevier Inc. All rights reserved.
Enhancements of Bayesian Blocks; Application to Large Light Curve Databases
NASA Technical Reports Server (NTRS)
Scargle, Jeff
2015-01-01
Bayesian Blocks are optimal piecewise constant representations (step function fits) of light curves. The simple algorithm implementing this idea, using dynamic programming, has been extended to include more data modes and fitness metrics, multivariate analysis, and data on the circle (Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations, Scargle, Norris, Jackson and Chiang 2013, ApJ, 764, 167), as well as new results on background subtraction and refinement of the procedure for precise timing of transient events in sparse data. Example demonstrations will include exploratory analysis of the Kepler light curve archive in a search for "star-tickling" signals from extraterrestrial civilizations. (The Cepheid Galactic Internet, Learned, Kudritzki, Pakvasa, and Zee, 2008, arXiv:0809.0339; Walkowicz et al., in progress).
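The dynamic-programming recurrence behind Bayesian Blocks is short enough to sketch directly. This follows the event-data fitness function and false-positive prior of Scargle et al. 2013 (ApJ, 764, 167); the prior calibration constants are taken from that paper, and extensions such as other data modes are omitted:

```python
import numpy as np

def bayesian_blocks(t, p0=0.05):
    """Optimal piecewise-constant segmentation of event (time-tag) data.
    Returns the edges of the best partition; p0 is the false-positive
    probability controlling the penalty per extra block."""
    t = np.sort(np.asarray(t, dtype=float))
    n = len(t)
    # candidate edges: data range split at midpoints between events
    edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
    block_len = t[-1] - edges
    # penalty per change point (Scargle et al. 2013, eq. 21)
    ncp_prior = 4.0 - np.log(73.53 * p0 * n ** -0.478)
    best = np.zeros(n)
    last = np.zeros(n, dtype=int)
    for k in range(n):
        width = block_len[:k + 1] - block_len[k + 1]
        count = np.arange(k + 1, 0, -1)          # events in trailing block
        fitness = count * (np.log(count) - np.log(width))
        total = fitness - ncp_prior
        total[1:] += best[:k]
        last[k] = np.argmax(total)
        best[k] = total[last[k]]
    # backtrack the optimal change points
    cp, ind = [], n
    while True:
        cp.append(ind)
        if ind == 0:
            break
        ind = last[ind - 1]
    return edges[np.array(cp[::-1])]
```

Each pass adds one datum and reuses all previously computed optima, which is what makes the exact global optimum reachable in O(n^2) time.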
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunn, W.N.
1998-03-01
LUG and Sway brace ANalysis (LUGSAN) II is an analysis and database computer program that is designed to calculate store lug and sway brace loads for aircraft captive carriage. LUGSAN II combines the rigid body dynamics code, SWAY85, with a Macintosh Hypercard database to function both as an analysis and archival system. This report describes the LUGSAN II application program, which operates on the Macintosh System (Hypercard 2.2 or later), and includes function descriptions, layout examples, and sample sessions. Although this report is primarily a user's manual, a brief overview of the LUGSAN II computer code is included with suggested resources for programmers.
Human Factors Operability Timeline Analysis to Improve the Processing Flow of the Orion Spacecraft
NASA Technical Reports Server (NTRS)
Stambolian, Damon B.; Schlierf, Roland; Miller, Darcy; Posada, Juan; Haddock, Mike; Haddad, Mike; Tran, Donald; Henderon, Gena; Barth, Tim
2011-01-01
This slide presentation reviews the use of human factors and timeline analysis to achieve a more efficient and effective processing flow. The solution involved developing a written timeline of events that included each activity within each functional flow block. Each activity had computer animation videos and pictures of the people and hardware involved. The Human Factors Engineering Analysis Tool (HFEAT) was improved by modifying it to include the timeline of events. The HFEAT was used to define the human factors requirements, and design solutions were developed for these requirements. An example of a functional flow block diagram is shown, and a view from one of the animations (i.e., the short stack pallet) is shown and explained.
Analysis Tools in Geant4 10.2 and 10.3
NASA Astrophysics Data System (ADS)
Hřivnáčová, I.; Barrand, G.
2017-10-01
A new analysis category based on g4tools was added in Geant4 release 9.5 (2011). The aim was to provide users with a lightweight analysis tool available as part of the Geant4 installation without the need to link to an external analysis package. It has progressively been included in all Geant4 examples. Frequent questions in the Geant4 users forum show its increasing popularity in the Geant4 users community. In this presentation, we will give a brief overview of g4tools and the analysis category. We report on new developments since our CHEP 2013 contribution as well as mention upcoming new features.
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Rankin, Charles C.
2006-01-01
This document summarizes the STructural Analysis of General Shells (STAGS) development effort, STAGS performance for selected demonstration problems, and STAGS application problems illustrating selected advanced features available in the STAGS Version 5.0. Each problem is discussed including selected background information and reference solutions when available. The modeling and solution approach for each problem is described and illustrated. Numerical results are presented and compared with reference solutions, test data, and/or results obtained from mesh refinement studies. These solutions provide an indication of the overall capabilities of the STAGS nonlinear finite element analysis tool and provide users with representative cases, including input files, to explore these capabilities that may then be tailored to other applications.
Space station integrated wall design and penetration damage control
NASA Technical Reports Server (NTRS)
Coronado, A. R.; Gibbins, M. N.; Wright, M. A.; Stern, P. H.
1987-01-01
The analysis code BUMPER executes a numerical solution to the problem of calculating the probability of no penetration (PNP) of a spacecraft subject to man-made orbital debris or meteoroid impact. The code was developed on a DEC VAX 11/780 computer running the Virtual Memory System (VMS) operating system and is written in FORTRAN 77 with no VAX extensions. To help illustrate the steps involved, a single sample analysis is performed. The example used is the space station reference configuration. The finite element model (FEM) of this configuration is relatively complex but demonstrates many BUMPER features. The computer tools and guidelines are described for constructing a FEM for the space station under consideration. The methods used to analyze the sensitivity of PNP to variations in design are described. Ways are suggested for developing contour plots of the sensitivity study data. Additional BUMPER analysis examples are provided, including FEMs, command inputs, and data outputs. The mathematical theory used as the basis for the code is described, illustrating the data flow within the analysis.
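The end product of such an analysis, the probability of no penetration, is commonly modeled as a Poisson survival probability summed over the exposed surface elements. A sketch of that quantity only; the element data and flux model here are illustrative assumptions, not BUMPER's algorithm:

```python
import math

def probability_no_penetration(elements, mission_years):
    """Poisson model of PNP: exp(-sum_i flux_i * area_i * T * p_pen_i).
    `elements` is a list of (flux, area, p_pen) tuples, where flux is
    the impact flux on the element (impacts/m^2/yr), area its exposed
    area (m^2), and p_pen the conditional probability that an impact
    penetrates the element's shielding. Hypothetical inputs for
    illustration only."""
    expected_pens = sum(flux * area * mission_years * p_pen
                        for flux, area, p_pen in elements)
    return math.exp(-expected_pens)
```

A sensitivity study of the kind described then amounts to re-evaluating this expression as shielding (p_pen) or geometry (area, flux orientation) is varied element by element.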
Arc_Mat: a Matlab-based spatial data analysis toolbox
NASA Astrophysics Data System (ADS)
Liu, Xingjian; Lesage, James
2010-03-01
This article presents an overview of Arc_Mat, a Matlab-based spatial data analysis software package whose source code has been placed in the public domain. An earlier version of the Arc_Mat toolbox was developed to extract map polygon and database information from ESRI shapefiles and provide high-quality mapping in the Matlab software environment. We discuss revisions to the toolbox that utilize enhanced computing and graphing capabilities of more recent versions of Matlab, restructure the toolbox with object-oriented programming features, and provide more comprehensive functions for spatial data analysis. The Arc_Mat toolbox functionality includes basic choropleth mapping; exploratory spatial data analysis that provides exploratory views of spatial data through various graphs, for example, histogram, Moran scatterplot, three-dimensional scatterplot, density distribution plot, and parallel coordinate plots; and more formal spatial data modeling that draws on the extensive Spatial Econometrics Toolbox functions. A brief review of the design aspects of the revised Arc_Mat is given, and we provide some illustrative examples that highlight representative uses of the toolbox. Finally, we discuss programming with and customizing the Arc_Mat toolbox functionalities.
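The Moran scatterplot named among the exploratory graphs is built around Moran's I, which is easy to sketch. Shown here in Python rather than Matlab purely to keep the example self-contained; the statistic is the same one the toolbox plots:

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I for observations x under spatial weight matrix W
    (zero diagonal). Positive values indicate that neighboring regions
    tend to have similar values; the Moran scatterplot graphs each z_i
    against its spatially lagged value (W z)_i."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    n = len(x)
    return (n / W.sum()) * (z @ W @ z) / (z @ z)
```

For example, values increasing smoothly along a chain of neighboring regions yield a positive I, the pattern a choropleth map of clustered data would show.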
A methodology for the semi-automatic digital image analysis of fragmental impactites
NASA Astrophysics Data System (ADS)
Chanou, A.; Osinski, G. R.; Grieve, R. A. F.
2014-04-01
A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware software ImageJ developed by the National Institute of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated and modal abundances can be deduced, where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations of the physical characteristics of some examples of fragmental impactites.
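One of the shape measures mentioned, the average box-counting dimension, can be sketched for a binary component mask as a regression of box counts against box size. This is a generic implementation, not the ImageJ workflow of the paper:

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Estimate the box-counting (fractal) dimension of a 2-D binary
    mask: count occupied s-by-s boxes N(s) for several box sizes s,
    then regress log N(s) on log(1/s)."""
    counts = []
    for s in sizes:
        h = (mask.shape[0] // s) * s       # crop to a multiple of s
        w = (mask.shape[1] // s) * s
        m = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(m.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)),
                          np.log(counts), 1)
    return slope
```

A filled region gives a dimension near 2 and a thin curve near 1, so intermediate values quantify the outline complexity of melt fragments.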
NASA Technical Reports Server (NTRS)
Macneal, R. H.; Harder, R. L.; Mason, J. B.
1973-01-01
A development for NASTRAN which facilitates the analysis of structures made up of identical segments symmetrically arranged with respect to an axis is described. The key operation in the method is the transformation of the degrees of freedom for the structure into uncoupled symmetrical components, thereby greatly reducing the number of equations which are solved simultaneously. A further reduction occurs if each segment has a plane of reflective symmetry. The only required assumption is that the problem be linear. The capability, as developed, will be available in level 16 of NASTRAN for static stress analysis, steady state heat transfer analysis, and vibration analysis. The paper includes a discussion of the theory, a brief description of the data supplied by the user, and the results obtained for two example problems. The first problem concerns the acoustic modes of a long prismatic cavity imbedded in the propellant grain of a solid rocket motor. The second problem involves the deformations of a large space antenna. The latter example is the first application of the NASTRAN Cyclic Symmetry capability to a really large problem.
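The key idea, that transforming to symmetrical components uncouples the equations of a structure built from identical segments around an axis, is the statement that the discrete Fourier transform diagonalizes circulant operators. A scalar-per-segment sketch of that fact; NASTRAN's actual cyclic transforms operate on full blocks of segment degrees of freedom:

```python
import numpy as np

def solve_circulant(c, f):
    """Solve K u = f where K is circulant with first column c, i.e. the
    scalar analogue of a rotationally periodic stiffness matrix. The DFT
    diagonalizes any circulant, so each Fourier harmonic (symmetrical
    component) is solved independently instead of as one coupled system."""
    eig = np.fft.fft(c)                  # eigenvalues of the circulant
    return np.real(np.fft.ifft(np.fft.fft(f) / eig))
```

With N segments and m degrees of freedom per segment, this is the reduction from one Nm-by-Nm solve to N independent m-by-m solves that the paper exploits.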
Parameter estimation for terrain modeling from gradient data. [navigation system for Martian rover
NASA Technical Reports Server (NTRS)
Dangelo, K. R.
1974-01-01
A method is developed for modeling terrain surfaces for use on an unmanned Martian roving vehicle. The modeling procedure employs a two-step process which uses gradient as well as height data in order to improve the accuracy of the model's gradient. Least square approximation is used in order to stochastically determine the parameters which describe the modeled surface. A complete error analysis of the modeling procedure is included, which determines the effect of instrumental measurement errors on the model's accuracy. Computer simulation is used as a means of testing the entire modeling process, which includes the acquisition of data points, the two-step modeling process, and the error analysis. Finally, to illustrate the procedure, a numerical example is included.
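Using gradient as well as height data amounts to stacking extra rows in the least-squares system, one pair per measured slope. A simplified one-shot sketch with a quadratic surface model; the paper's two-step procedure and stochastic error weighting are not reproduced:

```python
import numpy as np

def fit_surface(pts, heights, grads):
    """Least-squares fit of z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2
    using both height and gradient measurements: each sample point
    contributes one row for z and two rows for (dz/dx, dz/dy), so the
    slope data directly constrain the model's gradient."""
    rows, rhs = [], []
    for (x, y), z, (gx, gy) in zip(pts, heights, grads):
        rows.append([1, x, y, x * x, x * y, y * y]); rhs.append(z)
        rows.append([0, 1, 0, 2 * x, y, 0]);         rhs.append(gx)
        rows.append([0, 0, 1, 0, x, 2 * y]);         rhs.append(gy)
    coef, *_ = np.linalg.lstsq(np.array(rows, dtype=float),
                               np.array(rhs, dtype=float), rcond=None)
    return coef
```

Because the gradient rows are exact derivatives of the height rows, a consistent data set is recovered exactly, and noisy slopes simply tilt the fit less than they would if only heights were used.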
A computer program for predicting nonlinear uniaxial material responses using viscoplastic models
NASA Technical Reports Server (NTRS)
Chang, T. Y.; Thompson, R. L.
1984-01-01
A computer program was developed for predicting nonlinear uniaxial material responses using viscoplastic constitutive models. Four specific models, i.e., those due to Miller, Walker, Krieg-Swearengen-Rhode, and Robinson, are included. Any other unified model can easily be implemented in the program in the form of subroutines. Analysis features include stress-strain cycling, creep response, stress relaxation, thermomechanical fatigue loops, or any combination of these responses. An outline is given of the theoretical background of uniaxial constitutive models, the analysis procedure, and the numerical integration methods for solving the nonlinear constitutive equations. In addition, a discussion of the computer program implementation is given. Finally, seven numerical examples are included to demonstrate the versatility of the computer program developed.
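The kind of integration such a program performs can be sketched with a deliberately simple model. A minimal sketch, assuming a generic Norton-type flow law with illustrative constants; the Miller, Walker, Krieg-Swearengen-Rhode, and Robinson models carry internal state variables and much stiffer equations, which is why the program's numerical integration methods matter:

```python
import math

# Strain-controlled uniaxial tension with a one-parameter Norton-type
# viscoplastic flow law, advanced with explicit Euler:
#   sigma_dot = E * (eps_dot_total - eps_dot_inelastic)
#   eps_dot_inelastic = A * sign(sigma) * |sigma|^n
# All constants are illustrative, not from any of the four named models.

E = 200e3    # elastic modulus, MPa
A = 1e-14    # Norton coefficient
n = 5.0      # Norton exponent

def run_tension(strain_rate=1e-4, t_end=50.0, dt=1e-3):
    sigma, history = 0.0, []
    for _ in range(int(t_end / dt)):
        eps_in_rate = A * math.copysign(abs(sigma) ** n, sigma)  # inelastic flow
        sigma += E * (strain_rate - eps_in_rate) * dt            # elastic update
        history.append(sigma)
    return history

hist = run_tension()
```

The stress ramps elastically and then saturates where the inelastic strain rate balances the applied strain rate (here near (1e-4/A)^(1/n) = 100 MPa), the qualitative behavior a unified model reproduces in a tension test.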
The Utility of EEG Band Power Analysis in the Study of Infancy and Early Childhood
Saby, Joni N.; Marshall, Peter J.
2012-01-01
Research employing electroencephalographic (EEG) techniques with infants and young children has flourished in recent years due to increased interest in understanding the neural processes involved in early social and cognitive development. This review focuses on the functional characteristics of the alpha, theta, and gamma frequency bands in the developing EEG. Examples of how analyses of EEG band power have been applied to specific lines of developmental research are also discussed. These examples include recent work on the infant mu rhythm and action processing, frontal alpha asymmetry and approach-withdrawal tendencies, and EEG power measures in the study of early psychosocial adversity. PMID:22545661
Using Perilog to Explore "Decision Making at NASA"
NASA Technical Reports Server (NTRS)
McGreevy, Michael W.
2005-01-01
Perilog, a context-intensive text mining system, is used as a discovery tool to explore topics and concerns in "Decision Making at NASA," chapter 6 of the Columbia Accident Investigation Board (CAIB) Report, Volume I. Two examples illustrate how Perilog can be used to discover highly significant safety-related information in the text without prior knowledge of the contents of the document. A third example illustrates how "if-then" statements found by Perilog can be used in logical analysis of decision making. In addition, in order to serve as a guide for future work, the technical details of preparing a PDF document for input to Perilog are included in an appendix.
DataToText: A Consumer-Oriented Approach to Data Analysis
ERIC Educational Resources Information Center
Kenny, David A.
2010-01-01
DataToText is a project in which the user communicates the relevant information for an analysis and the DataToText computer routine produces text output that describes in words, tables, and figures the results from the analyses. Two extended examples are given, one an example of a moderator analysis and the other an example of a dyadic data…
Cyber-Physical Security Assessment (CyPSA) Toolset
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia, Luis; Patapanchala, Panini; Zonouz, Saman
CyPSA seeks to organize and gain insight into the diverse sets of data that a critical infrastructure provider must manage. Specifically, CyPSA inventories, manages, and analyzes assets and the relations among those assets, and a variety of interfaces are provided. CyPSA inventories assets (both cyber and physical); this may include the cataloging of assets through a common interface. Data sources used to generate a catalogue of assets include PowerWorld, NPView, NMap scans, and device configurations. Depending upon the role of the person using the tool, the types of assets accessed, as well as the data sources through which asset information is accessed, may vary. CyPSA allows practitioners to catalogue relations among assets, and these may be either manually or programmatically generated. For example, some common relations among assets include the following. Topological network data: which devices and assets are connected, and how? Data sources for this kind of information include NMap scans and NPView topologies (via firewall rule analysis). Security metrics outputs: the output of various security metrics, such as overall exposure. Configure assets: CyPSA may eventually include the ability to configure assets, including relays and switches; for example, a system administrator would be able to configure and alter the state of a relay via the CyPSA interface. Annotate assets: CyPSA also allows practitioners to manually and programmatically annotate assets. Sources of information with which to annotate assets include provenance metadata regarding the data source from which the asset was loaded, vulnerability information from vulnerability databases, configuration information, and the output of an analysis in general.
Racism as a determinant of health: a protocol for conducting a systematic review and meta-analysis
2013-01-01
Background Racism is increasingly recognized as a key determinant of health. A growing body of epidemiological evidence shows strong associations between self-reported racism and poor health outcomes across diverse minority groups in developed countries. While the relationship between racism and health has received increasing attention over the last two decades, a comprehensive meta-analysis focused on the health effects of racism has yet to be conducted. The aim of this review protocol is to provide a structure from which to conduct a systematic review and meta-analysis of studies that assess the relationship between racism and health. Methods This research will consist of a systematic review and meta-analysis. Studies will be considered for review if they are empirical studies reporting quantitative data on the association between racism and health for adults and/or children of all ages from any racial/ethnic/cultural groups. Outcome measures will include general health and well-being, physical health, mental health, healthcare use and health behaviors. Scientific databases (for example, Medline) will be searched using a comprehensive search strategy and reference lists will be manually searched for relevant studies. In addition, use of online search engines (for example, Google Scholar), key websites, and personal contact with experts will also be undertaken. Screening of search results and extraction of data from included studies will be independently conducted by at least two authors, including assessment of inter-rater reliability. Studies included in the review will be appraised for quality using tools tailored to each study design. Summary statistics of study characteristics and findings will be compiled and findings synthesized in a narrative summary as well as a meta-analysis. Discussion This review aims to examine associations between reported racism and health outcomes. 
This comprehensive and systematic review and meta-analysis of empirical research will provide a rigorous and reliable evidence base for future research, policy and practice, including information on the extent of available evidence for a range of racial/ethnic minority groups. PMID:24059279
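The pooling step of such a meta-analysis can be sketched concretely. A minimal sketch of the DerSimonian-Laird random-effects estimator, a standard choice for this kind of synthesis; the effect sizes and variances below are hypothetical, not from any study in the review:

```python
import math

# DerSimonian-Laird random-effects pooling of study effect sizes
# (e.g., log odds ratios) with known within-study variances.

def random_effects_pool(effects, variances):
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, effects))  # heterogeneity
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * ei for wi, ei in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# Four hypothetical studies of a racism-health association:
pooled, se, tau2 = random_effects_pool(
    effects=[0.42, 0.31, 0.55, 0.20], variances=[0.04, 0.02, 0.06, 0.03])
```

When the heterogeneity statistic Q does not exceed its degrees of freedom, tau-squared is truncated to zero and the estimate reduces to the fixed-effect pooled value.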
American Guild of Musical Artists: A Case for System Development, Data Modeling, and Analytics
ERIC Educational Resources Information Center
Harris, Ranida; Wedel, Thomas
2017-01-01
This article presents a case scenario that may be used in system analysis and design, database management, and business analytics classes. The case document includes realistic, detailed information on the operations at the American Guild of Musical Artists (AGMA). Examples of assignments for each class and suggested reading are presented. In each…
Time-Frequency Domain Analysis of Helicopter Transmission Vibration
1991-08-01
Applications of the Wigner-Ville distribution (WVD) have been reported in areas including speech processing. The report surveys time-frequency distributions and treats the Wigner-Ville distribution in detail: its history, definition, and a discrete-time/frequency form. Simulated signals are examined to indicate how various forms of modulation are portrayed using the Wigner-Ville distribution, together with practical examples from helicopter transmission vibration.
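The discrete-time Wigner-Ville distribution can be sketched directly. A minimal pure-Python slice at one time instant, assuming an analytic (complex) input signal; the factor-of-two frequency scaling noted in the comments is a well-known property of the discrete WVD, and the windowed (pseudo-WVD) form here is a simplification of the report's treatment:

```python
import cmath, math

# Discrete pseudo-Wigner-Ville slice at time index n:
# form the lag kernel x[n+m] * conj(x[n-m]) for |m| <= h,
# then take its DFT over the lag variable (naive DFT, small h only).
# A tone at normalized frequency f0 appears at 2*f0 on this axis.

def wvd_slice(x, n, h):
    kernel = [x[n + m] * x[n - m].conjugate() for m in range(-h, h + 1)]
    L = len(kernel)
    spectrum = []
    for k in range(L):
        s = sum(kernel[i] * cmath.exp(-2j * math.pi * k * (i - h) / L)
                for i in range(L))
        spectrum.append(abs(s))
    return spectrum  # bin k corresponds to normalized frequency k / L

# Analytic tone at normalized frequency f0 = 0.1
f0 = 0.1
x = [cmath.exp(2j * math.pi * f0 * t) for t in range(128)]
spec = wvd_slice(x, n=64, h=20)
peak_bin = max(range(len(spec)), key=lambda k: spec[k])  # expect k/L near 2*f0
```

For a pure tone the slice peaks at the bin nearest 2·f0·L (here bin 8 of 41), illustrating why WVD frequency axes are rescaled by one half in practice.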
On Improving the Experiment Methodology in Pedagogical Research
ERIC Educational Resources Information Center
Horakova, Tereza; Houska, Milan
2014-01-01
The paper shows how the methodology for a pedagogical experiment can be improved by including a pre-research stage. If the experiment takes the form of a test procedure, an improvement in methodology can be achieved using, for example, the methods of statistical and didactic analysis of tests which are traditionally used in other areas, i.e.…
Problem-Based Teaching in International Management: A Political/Economic Risk Assessment Exercise
ERIC Educational Resources Information Center
Daly, Paula S.; White, Marion M.; Zisk, Daniel S.; Cavazos, David E.
2013-01-01
This article draws from the current literature to examine problem-based learning (PBL) as a management education tool, and provides an example of how to incorporate PBL into an undergraduate international management course. Also included are an explanation of, and specific guidelines for, a PBL exercise focused on the analysis of "country risk"…
A Multifaceted Approach to Investigating Pre-Task Planning Effects on Paired Oral Test Performance
ERIC Educational Resources Information Center
Nitta, Ryo; Nakatsuhara, Fumiyo
2014-01-01
Despite the growing popularity of paired format speaking assessments, the effects of pre-task planning time on performance in these formats are not yet well understood. For example, some studies have revealed the benefits of planning but others have not. Using a multifaceted approach including analysis of the process of speaking performance, the…
Understanding the evidence for historical fire across eastern forests
Charles M. Ruffner
2006-01-01
Evidence for historical fire across the eastern deciduous biome spans several fields, including paleoecology, fire scar analysis, witness tree studies, historical documents and ethnographic sources. In this paper I provide an overview of many of these methods as well as the limitations and examples of each. While the use of any single approach has its cautions and...
ERIC Educational Resources Information Center
Coursen, David
Modern educators and playground designers are increasingly recognizing that play is a part, perhaps the decisive part, of the entire learning process. Theories of playground equipment design, planning the playground, financial considerations, and equipment suggestions are featured in this review. Examples of playgrounds include innovative…
The MicronEye Motion Monitor: A New Tool for Class and Laboratory Demonstrations.
ERIC Educational Resources Information Center
Nissan, M.; And Others
1988-01-01
Describes a special camera that can be directly linked to a computer that has been adapted for studying movement. Discusses capture, processing, and analysis of two-dimensional data with either IBM PC or Apple II computers. Gives examples of a variety of mechanical tests including pendulum motion, air track, and air table. (CW)
Researching Literacy in Context: Using Video Analysis to Explore School Literacies
ERIC Educational Resources Information Center
Blikstad-Balas, Marte; Sørvik, Gard Ove
2015-01-01
This article addresses how methodological approaches relying on video can be included in literacy research to capture changing literacies. In addition to arguing why literacy is best studied in context, we provide empirical examples of how small, head-mounted video cameras have been used in two different research projects that share a common aim:…
Some developing concepts of engineering education
NASA Technical Reports Server (NTRS)
Perkins, C. D.
1975-01-01
An analysis of the circumstances which have created a shortage of aeronautical engineering undergraduate students in the universities is presented. Suggestions for motivating students to enter aeronautical engineering are examined. The support of the aeronautical industry for graduate education funding is recommended. Examples of actions taken by governmental agencies to promote increased interest in aeronautical engineering are included.
Jiang, Jing [Nanjing University; Walters, Diane M [University of Wisconsin-Madison; Zhou, Dongshan [Nanjing University; Ediger, Mark D [University of Wisconsin-Madison
2016-08-18
Data set for work presented in Jiang, J.; Walters, D. M.; Zhou, D.; Ediger, M. D. “Substrate Temperature Controls Molecular Orientation in Two-Component Vapor-deposited Glasses.” Soft Matt. 2016, 12, 3265. Includes all data presented in the manuscript as well as example raw data and analysis.
The Mysterious Box: Nuclear Science and Art.
ERIC Educational Resources Information Center
Keisch, Bernard
In this booklet intended for junior high school science students a short story format is used to provide examples of the use of nuclear chemistry and physics in the analysis of paints and pigments for authentication of paintings. The techniques discussed include the measurement of the relative amounts of lead-210 and radium-226 in white-lead…
Post-earthquake dilatancy recovery
NASA Technical Reports Server (NTRS)
Scholz, C. H.
1974-01-01
Geodetic measurements of the 1964 Niigata, Japan earthquake and of three other examples are briefly examined. They show exponentially decaying subsidence for a year after the quakes. The observations confirm the dilatancy-fluid diffusion model of earthquake precursors and clarify the extent and properties of the dilatant zone. An analysis using one-dimensional consolidation theory is included which agrees well with this interpretation.
Understanding and Using the Fermi Science Tools
NASA Astrophysics Data System (ADS)
Asercion, Joseph
2018-01-01
The Fermi Science Support Center (FSSC) provides information, documentation, and tools for the analysis of Fermi science data, including both the Large Area Telescope (LAT) and the Gamma-ray Burst Monitor (GBM). Source and binary versions of the Fermi Science Tools can be downloaded from the FSSC website and are supported on multiple platforms. An overview document, the Cicerone, provides details of the Fermi mission, the science instruments and their response functions, the science data preparation and analysis process, and the interpretation of results. Analysis Threads and a reference manual available on the FSSC website provide the user with step-by-step instructions for many different types of data analysis: point-source analysis (generating maps, spectra, and light curves), pulsar timing analysis, source identification, and the use of Python for scripting customized analysis chains. We present an overview of the structure of the Fermi science tools and documentation, and how to acquire them. We also provide examples of standard analyses, including tips and tricks for improving Fermi science analysis.
Task 2 Report: Algorithm Development and Performance Analysis
1993-07-01
Example GC data for Schedule 3 phosphites show an analysis method that integrates separated peaks, a method that more closely follows the baseline, and a method that results in unwanted integrations. Correlated chromatography, on the other hand, separates out much of the ambiguity that can arise in GC/MS with trace environmental samples.
MO-F-211-01: Methods for Completing Practice Quality Improvement (PQI).
Johnson, J; Brown, K; Ibbott, G; Pawlicki, T
2012-06-01
Practice Quality Improvement (PQI) is becoming an expected part of routine practice in healthcare as an approach to provide more efficient, effective and high quality care. Additionally, as part of the ABR's Maintenance of Certification (MOC) pathway, medical physicists are now expected to complete a PQI project. This session will describe the history behind and benefits of the ABR's MOC program, provide details of quality improvement methods, and explain how to successfully complete a PQI project. PQI methods include various commonly used engineering and management tools. The Plan-Do-Study-Act (PDSA) cycle will be presented as one project planning and implementation tool. Other PQI analysis instruments such as flowcharts, Pareto charts, process control charts and fishbone diagrams will also be explained with examples. Cause analysis, solution development and implementation, and post-implementation measurement will be presented, as will project identification and definition and appropriate measurement tool selection. Methods to choose key quality metrics (key quality indicators) will also be addressed. Several sample PQI projects and templates available through the AAPM and other organizations will be described, and at least three examples of completed PQI projects will be shared. Learning objectives: (1) identify and define a PQI project; (2) identify and select measurement methods/techniques for use with the PQI project; (3) describe example(s) of completed projects. © 2012 American Association of Physicists in Medicine.
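One of the PQI analysis instruments mentioned above, the process control chart, reduces to a short computation. A minimal sketch of control limits for a Shewhart individuals chart using the average moving range; the measurements below are hypothetical daily QA values, not data from the session:

```python
import statistics

# Shewhart individuals (X) chart limits from the average moving range.
# sigma is estimated as MR-bar / d2, with d2 = 1.128 for subgroups of 2.

def individuals_chart_limits(data):
    center = statistics.fmean(data)
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = statistics.fmean(moving_ranges)
    sigma_est = mr_bar / 1.128
    return center - 3 * sigma_est, center, center + 3 * sigma_est

# Hypothetical daily QA measurements of some process metric:
data = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7, 10.0, 10.2]
lcl, center, ucl = individuals_chart_limits(data)
```

Points falling outside the (lcl, ucl) band signal special-cause variation, the trigger for the cause-analysis step described in the session.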
Oeck, Sebastian; Malewicz, Nathalie M; Hurst, Sebastian; Al-Refae, Klaudia; Krysztofiak, Adam; Jendrossek, Verena
2017-07-01
The quantitative analysis of foci plays an important role in various cell biological methods. In the fields of radiation biology and experimental oncology, the effect of ionizing radiation, chemotherapy or molecularly targeted drugs on DNA damage induction and repair is frequently assessed by the analysis of protein clusters or phosphorylated proteins recruited to so-called repair foci at DNA damage sites, involving for example γ-H2A.X, 53BP1 or RAD51. We recently developed "The Focinator" as a reliable and fast tool for automated quantitative and qualitative analysis of nuclei and DNA damage foci. The refined software is now even more user-friendly due to a graphical interface and further features. Thus, we included an R-script-based mode for automated image opening, file naming, progress monitoring and error reporting. Consequently, the evaluation no longer requires the attendance of the operator after initial parameter definition. Moreover, the Focinator v2-0 is now able to perform multi-channel analysis of four channels and evaluation of protein-protein colocalization by comparison of up to three foci channels. This enables, for example, the quantification of foci in cells of a specific cell cycle phase.
NASA Astrophysics Data System (ADS)
Karmazikov, Y. V.; Fainberg, E. M.
2005-06-01
Work with DICOM-compatible equipment integrated into hardware and software systems for medical purposes is considered. The structure of the data acquisition and transformation process is illustrated using the example of the digital radiography and angiography systems included in the DIMOL-IK hardware-software complex. Algorithms for data acquisition and analysis are proposed, and questions of further processing and storage of the received data are considered.
Instrumental biosensors: new perspectives for the analysis of biomolecular interactions.
Nice, E C; Catimel, B
1999-04-01
The use of instrumental biosensors in basic research to measure biomolecular interactions in real time is increasing exponentially. Applications include protein-protein, protein-peptide, DNA-protein, DNA-DNA, and lipid-protein interactions. Such techniques have been applied to, for example, antibody-antigen, receptor-ligand, signal transduction, and nuclear receptor studies. This review outlines the principles of two of the most commonly used instruments and highlights specific operating parameters that will assist in optimising experimental design, data generation, and analysis.
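The real-time interaction data such instruments produce are conventionally fitted to a 1:1 Langmuir kinetic model. A minimal sketch of the association-phase response under that model; the rate constants, concentration, and maximal response below are illustrative values, not parameters from the review:

```python
import math

# 1:1 Langmuir binding kinetics for a biosensor association phase:
#   R(t) = R_eq * (1 - exp(-(ka*C + kd)*t)),
#   R_eq = Rmax * C / (C + KD),  KD = kd/ka.

def association_response(t, conc, ka, kd, r_max):
    """Sensor response at time t (s) for analyte concentration conc (M)."""
    r_eq = r_max * conc / (conc + kd / ka)        # steady-state response
    return r_eq * (1.0 - math.exp(-(ka * conc + kd) * t))

# Illustrative system: ka = 1e5 /M/s, kd = 1e-3 /s  =>  KD = 10 nM
ka, kd, r_max = 1e5, 1e-3, 100.0
conc = 1e-8                                       # 10 nM analyte
r_late = association_response(300.0, conc, ka, kd, r_max)
```

At analyte concentration equal to KD the steady-state response is half of Rmax, which is how equilibrium analysis of sensorgrams recovers the dissociation constant.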
Accommodating complexity and human behaviors in decision analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Backus, George A.; Siirola, John Daniel; Schoenwald, David Alan
2007-11-01
This is the final report for a LDRD effort to address human behavior in decision support systems. One sister LDRD effort reports the extension of this work to include actual human choices and additional simulation analyses. Another provides the background for this effort and the programmatic directions for future work. This specific effort considered the feasibility of five aspects of model development required for analysis viability. To avoid the use of classified information, healthcare decisions and the system embedding them became the illustrative example for assessment.
On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods
NASA Technical Reports Server (NTRS)
Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.
2003-01-01
Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
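The variance-reduction idea can be sketched with a toy response in place of the expensive analysis. A minimal sketch, assuming a scalar input and using the first-order Taylor expansion as a control variate whose mean is known exactly; the functions f and df below are stand-ins, not anything from the paper:

```python
import math, random

# Control-variate Monte Carlo using a cheap sensitivity derivative.
# g(x) = f(mu) + f'(mu)*(x - mu) has known mean f(mu) for X ~ N(mu, s^2),
# so  E[f] = E[f - g] + f(mu), and f - g has much smaller variance.

random.seed(1)

def f(x):
    return math.sin(x) + 0.05 * x * x      # toy "expensive" response

def df(x):
    return math.cos(x) + 0.1 * x           # cheap sensitivity derivative

mu, sigma, n = 0.3, 0.2, 2000
samples = [random.gauss(mu, sigma) for _ in range(n)]

plain = sum(f(x) for x in samples) / n     # ordinary Monte Carlo estimate

g = lambda x: f(mu) + df(mu) * (x - mu)    # linear control variate
corrected = sum(f(x) - g(x) for x in samples) / n + f(mu)
```

Both estimators are unbiased for E[f(X)] (about 0.2962 here), but the residual f - g is second order in (x - mu), so the corrected estimate converges with far fewer samples, mirroring the order-of-magnitude accuracy gains reported for the aircraft-wing example.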
Battery Test Manual For 48 Volt Mild Hybrid Electric Vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Lee Kenneth
2017-03-01
This manual details the U.S. Advanced Battery Consortium and U.S. Department of Energy Vehicle Technologies Program goals, test methods, and analysis techniques for a 48 Volt Mild Hybrid Electric Vehicle system. The test methods are outlined starting with characterization tests, followed by life tests. The final section details standardized analysis techniques for 48 V systems that allow for the comparison of different programs that use this manual. An example test plan is included, along with guidance for filling in gap table numbers.
Analysing uncertainties of supply and demand in the future use of hydrogen as an energy vector
NASA Astrophysics Data System (ADS)
Lenel, U. R.; Davies, D. G. S.; Moore, M. A.
An analytical technique (Analysis with Uncertain Quantities), developed at Fulmer, is being used to examine the sensitivity of the outcome to uncertainties in input quantities, in order to highlight which input quantities critically affect the potential role of hydrogen. The work presented here includes an outline of the model and the analysis technique, along with basic considerations of the input quantities to the model (demand, supply and constraints). Some examples are given of probabilistic estimates of input quantities.
Discourse analysis: towards an understanding of its place in nursing.
Crowe, Marie
2005-07-01
This paper describes how discourse analysis, and in particular critical discourse analysis, can be used in nursing research, and provides an example to illustrate the techniques involved. Discourse analysis rose to prominence in the 1980s and 1990s in disciplines such as the social sciences, literary theory and cultural studies, and is increasingly used in nursing. This paper investigates discourse analysis as a useful methodology for conducting nursing research. Effective clinical reasoning relies on employing several different kinds of knowledge and research that draw on different perspectives, methodologies and techniques to generate breadth of knowledge and depth of understanding of clinical practices and patients' experiences of those practices. The steps in a discourse analysis include: choosing the text; identifying the explicit purpose of the text; the processes used for claiming authority; connections to other discourses; construction of major concepts; processes of naming and categorizing; construction of subject positions; construction of reality and social relations; and implications for the practice of nursing. The limitations of discourse analysis, its relationship to other qualitative approaches and questions for evaluating the rigour of research using discourse analysis are also explored. The example of discourse analysis shows how a text influences the practice of nursing by shaping knowledge, values and beliefs. Discourse analysis can make a contribution to the development of nursing knowledge by providing a research strategy to examine dominant discourses that influence nursing practice.
Sekuła, Justyna; Nizioł, Joanna; Rode, Wojciech; Ruman, Tomasz
2015-05-22
The preparation is described of a durable surface of cationic gold nanoparticles (AuNPs) covering commercial and custom-made MALDI targets, along with characterization of the nanoparticle surface properties and examples of its use in MS analyses and MS imaging (IMS) of low-molecular-weight (LMW) organic compounds. Tested compounds include nucleosides, saccharides, amino acids, glycosides, and nucleic bases for MS measurements, as well as over one hundred endogenous compounds in an imaging experiment. The nanoparticles covering the target plate were enriched in sodium in order to promote sodium-adduct formation. The new surface allows fast analysis, high sensitivity of detection and high mass determination accuracy. An example of the application of the new Au nanoparticle-enhanced target for fast and simple MS imaging of a fingerprint is also presented. Copyright © 2015 Elsevier B.V. All rights reserved.
Enhanced reproducibility of SADI web service workflows with Galaxy and Docker.
Aranguren, Mikel Egaña; Wilkinson, Mark D
2015-01-01
Semantic Web technologies have been widely applied in the life sciences, for example by data providers such as OpenLifeData and through web services frameworks such as SADI. The recently reported OpenLifeData2SADI project offers access to the vast OpenLifeData data store through SADI services. This article describes how to merge data retrieved from OpenLifeData2SADI with other SADI services using the Galaxy bioinformatics analysis platform, thus making this semantic data more amenable to complex analyses. This is demonstrated using a working example, which is made distributable and reproducible through a Docker image that includes SADI tools, along with the data and workflows that constitute the demonstration. The combination of Galaxy and Docker offers a solution for faithfully reproducing and sharing complex data retrieval and analysis workflows based on the SADI Semantic web service design patterns.
Acoustic emission non-destructive testing of structures using source location techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beattie, Alan G.
2013-09-01
The technology of acoustic emission (AE) testing has been advanced and used at Sandia for the past 40 years. AE has been used on structures including pressure vessels, fire bottles, wind turbines, gas wells, nuclear weapons, and solar collectors. This monograph begins with background topics in acoustics and instrumentation and then focuses on current acoustic emission technology. It covers the overall design and system setups for a test, with a wind turbine blade as the object. Test analysis is discussed with an emphasis on source location. Three test examples are presented, two on experimental wind turbine blades and one on aircraft fire extinguisher bottles. Finally, the code for a FORTRAN source location program is given as an example of a working analysis program. Throughout the document, the stress is on actual testing of real structures, not on laboratory experiments.
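The source-location computation at the heart of such a program can be sketched in its simplest form. A minimal sketch for two sensors on a linear structure with a known wave speed; real AE codes, like the FORTRAN program in the monograph, handle two- and three-dimensional sensor arrays and over-determined arrival-time sets:

```python
# 1-D acoustic emission source location from an arrival-time difference.
# With sensors at x1 < x2 and wave speed v:
#   x - x1 = v*t1  and  x2 - x = v*t2   =>   x = midpoint + v*(t1 - t2)/2.

def locate_1d(x1, x2, t1, t2, v):
    """Source position on the line between sensors at x1 and x2 (meters)."""
    midpoint = 0.5 * (x1 + x2)
    # an earlier arrival at sensor 1 (t1 < t2) shifts the source toward x1
    return midpoint + 0.5 * v * (t1 - t2)

# Synthetic check: source at 0.3 m, sensors at 0.0 and 1.0 m, v = 5000 m/s
t1 = 0.3 / 5000.0          # arrival time at sensor 1
t2 = 0.7 / 5000.0          # arrival time at sensor 2
x_src = locate_1d(0.0, 1.0, t1, t2, 5000.0)
```

Only the time difference t1 - t2 enters the answer, which is why AE systems need relative, not absolute, event timing.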
Observational evidence and strength of evidence domains: case examples
2014-01-01
Background Systematic reviews of healthcare interventions most often focus on randomized controlled trials (RCTs). However, certain circumstances warrant consideration of observational evidence, and such studies are increasingly being included as evidence in systematic reviews. Methods To illustrate the use of observational evidence, we present case examples of systematic reviews in which observational evidence was considered as well as case examples of individual observational studies, and how they demonstrate various strength of evidence domains in accordance with current Agency for Healthcare Research and Quality (AHRQ) Evidence-based Practice Center (EPC) methods guidance. Results In the presented examples, observational evidence is used when RCTs are infeasible or raise ethical concerns, lack generalizability, or provide insufficient data. Individual study case examples highlight how observational evidence may fulfill required strength of evidence domains, such as study limitations (reduced risk of selection, detection, performance, and attrition); directness; consistency; precision; and reporting bias (publication, selective outcome reporting, and selective analysis reporting), as well as additional domains of dose-response association, plausible confounding that would decrease the observed effect, and strength of association (magnitude of effect). Conclusions The cases highlighted in this paper demonstrate how observational studies may provide moderate to (rarely) high strength evidence in systematic reviews. PMID:24758494
Energy-Water System Solutions | Energy Analysis | NREL
Example projects include: an energy, water, and renewable opportunities assessment at Bagram Air Force Base; identification of critical water and campus-level opportunities; and a Net Zero Energy-Water-Waste analysis for Fort Carson.
Artificial Intelligence in Sports Biomechanics: New Dawn or False Hope?
Bartlett, Roger
2006-01-01
This article reviews developments in the use of Artificial Intelligence (AI) in sports biomechanics over the last decade. It outlines possible uses of Expert Systems as diagnostic tools for evaluating faults in sports movements (‘techniques’) and presents some example knowledge rules for such an expert system. It then compares the analysis of sports techniques, in which Expert Systems have found little place to date, with gait analysis, in which they are routinely used. Consideration is then given to the use of Artificial Neural Networks (ANNs) in sports biomechanics, focusing on Kohonen self-organizing maps, which have been the most widely used in technique analysis, and multi-layer networks, which have been far more widely used in biomechanics in general. Examples of the use of ANNs in sports biomechanics are presented for javelin and discus throwing, shot putting and football kicking. I also present an example of the use of Evolutionary Computation in movement optimization in the soccer throw in, which predicted an optimal technique close to that in the coaching literature. After briefly overviewing the use of AI in both sports science and biomechanics in general, the article concludes with some speculations about future uses of AI in sports biomechanics. Key Points Expert Systems remain almost unused in sports biomechanics, unlike in the similar discipline of gait analysis. Artificial Neural Networks, particularly Kohonen Maps, have been used, although their full value remains unclear. Other AI applications, including Evolutionary Computation, have received little attention. PMID:24357939
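The Kohonen self-organizing map highlighted above can be sketched in a few dozen lines. A minimal sketch of a 1-D map trained on 2-D "movement feature" vectors; the data, map size, and learning schedule are illustrative only, not from any study in the review:

```python
import math, random

# 1-D Kohonen self-organizing map on 2-D feature vectors.
# Each sample pulls its best-matching unit (BMU) and, with a Gaussian
# neighborhood that shrinks over training, the BMU's map neighbors too.

random.seed(0)

def train_som(data, n_units=5, epochs=200, lr0=0.5, radius0=2.0):
    w = [[random.random(), random.random()] for _ in range(n_units)]
    for e in range(epochs):
        lr = lr0 * (1 - e / epochs)                      # decaying learning rate
        radius = max(0.5, radius0 * (1 - e / epochs))    # shrinking neighborhood
        for x in data:
            bmu = min(range(n_units),
                      key=lambda i: (w[i][0]-x[0])**2 + (w[i][1]-x[1])**2)
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                w[i][0] += lr * h * (x[0] - w[i][0])
                w[i][1] += lr * h * (x[1] - w[i][1])
    return w

# Two clusters of hypothetical technique features (e.g., release angle
# and speed, rescaled to [0, 1]):
data = [(0.1 + random.random()*0.1, 0.1 + random.random()*0.1) for _ in range(20)] \
     + [(0.8 + random.random()*0.1, 0.8 + random.random()*0.1) for _ in range(20)]
weights = train_som(data)
```

After training, map units migrate into the data-dense regions, so distinct movement patterns occupy distinct parts of the map, the property exploited when SOMs are used to classify sports techniques.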
NASA Astrophysics Data System (ADS)
Cich, Matthew J.; Guillaume, Alexandre; Drouin, Brian; Benner, D. Chris
2017-06-01
Multispectrum analysis can be a challenge for a variety of reasons. It can be computationally intensive to fit a proper line shape model, especially for high resolution experimental data. Band-wide analyses that include many transitions and their interactions, across many pressures and temperatures, are essential to accurately model, for example, atmospherically relevant systems. Labfit is a fast multispectrum analysis program originally developed by D. Chris Benner with a text-based interface. More recently, at JPL, a graphical user interface was developed with the goal of increasing the ease of use and also the number of potential users. The HTP lineshape model has been added to Labfit, keeping it up-to-date with community standards. Recent analyses using Labfit will be shown to demonstrate its ability to handle large experimental datasets, including high-order lineshape effects, that would otherwise be unmanageable.
Bringing a transgenic crop to market: where compositional analysis fits.
Privalle, Laura S; Gillikin, Nancy; Wandelt, Christine
2013-09-04
In the process of developing a biotechnology product, thousands of genes and transformation events are evaluated to select the event that will be commercialized. The ideal event is identified on the basis of multiple characteristics including trait efficacy, the molecular characteristics of the insert, and agronomic performance. Once selected, the commercial event is subjected to a rigorous safety evaluation taking a multipronged approach including examination of the safety of the gene and gene product - the protein, plant performance, impact of cultivating the crop on the environment, agronomic performance, and equivalence of the crop/food to conventional crops/food - by compositional analysis. The compositional analysis is composed of a comparison of the nutrient and antinutrient composition of the crop containing the event, its parental line (variety), and other conventional lines (varieties). Different geographies have different requirements for the compositional analysis studies. Parameters that vary include the number of years (seasons) and locations (environments) to be evaluated, the appropriate comparator(s), analytes to be evaluated, and statistical analysis. Specific examples of compositional analysis results will be presented.
Confronting Space Debris: Strategies and Warnings from Comparable Examples Including Deepwater Horizon
Baiocchi, Dave
2010-01-01
Deepwater Horizon (DH) was an ultra-deepwater, semisubmersible offshore drilling rig contracted to BP by its owner, Transocean. The rig was capable of…
Regression Analysis by Example. 5th Edition
ERIC Educational Resources Information Center
Chatterjee, Samprit; Hadi, Ali S.
2012-01-01
Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. "Regression Analysis by Example, Fifth Edition" has been expanded and thoroughly…
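The core computation of simple linear regression is compact enough to state directly. The Python sketch below (not taken from the book) fits y = a + b*x by ordinary least squares:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x (illustrative sketch)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # slope = covariance(x, y) / variance(x); intercept from the means
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    a = my - b * mx
    return a, b
```

For example, `fit_line([0, 1, 2, 3], [1, 3, 5, 7])` recovers the exact line y = 1 + 2x. The balance of theory, empirical rules, and judgment the authors stress enters in what this sketch omits: diagnostics, transformations, and variable selection.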
Methylxanthines: properties and determination in various objects
NASA Astrophysics Data System (ADS)
Andreeva, Elena Yu; Dmitrienko, Stanislava G.; Zolotov, Yurii A.
2012-05-01
Published data on the properties and determination of caffeine, theophylline, theobromine and some other methylxanthines in various objects are surveyed and described systematically. Different sample preparation procedures such as liquid extraction from solid matrices and liquid-liquid, supercritical fluid and solid-phase extraction are compared. The key methods of analysis including chromatography, electrophoresis, spectrometry and electrochemical methods are discussed. Examples of methylxanthine determination in plants, food products, energy beverages, pharmaceuticals, biological fluids and natural and waste waters are given. The bibliography includes 393 references.
The SPAN cookbook: A practical guide to accessing SPAN
NASA Technical Reports Server (NTRS)
Mason, Stephanie; Tencati, Ronald D.; Stern, David M.; Capps, Kimberly D.; Dorman, Gary; Peters, David J.
1990-01-01
This is a manual for remote users who wish to send electronic mail messages from the Space Physics Analysis Network (SPAN) to scientific colleagues on other computer networks and vice versa. In several instances more than one gateway has been included for the same network. Users are provided with an introduction to each network listed with helpful details about accessing the system and mail syntax examples. Also included is information on file transfers, remote logins, and help telephone numbers.
Species data: National inventory of range maps and distribution models
Gergely, Kevin J.; McKerrow, Alexa
2013-01-01
The Gap Analysis Program (GAP) produces data and tools that help meet critical national challenges such as biodiversity conservation, renewable energy development, climate change adaptation, and infrastructure investment. The GAP species data includes vertebrate range maps and distribution models for the continental United States, as well as Alaska, Hawaii, Puerto Rico, and U.S. Virgin Islands. The vertebrate species include amphibians, birds, mammals, and reptiles. Furthermore, data used to create the distribution models (for example, percent canopy cover, elevation, and so forth) also are available.
NASA Technical Reports Server (NTRS)
Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.
1983-01-01
The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical test results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples, including complete listings of programs and input and output data.
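A common statistic for correlating analytical and test mode shapes is the modal assurance criterion (MAC); this abstract does not specify the report's exact statistic, so the following Python sketch is illustrative rather than a reproduction of the system described:

```python
def mac(phi_a, phi_t):
    """Modal assurance criterion between an analysis mode shape and a
    test mode shape: 1.0 for parallel vectors, 0.0 for orthogonal ones."""
    dot = sum(a * t for a, t in zip(phi_a, phi_t))
    return dot * dot / (sum(a * a for a in phi_a) *
                        sum(t * t for t in phi_t))
```

A correlation report would typically tabulate MAC values for every analysis/test mode pair, flagging pairs below some acceptance threshold.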
Non-isothermal elastoviscoplastic snap-through and creep buckling of shallow arches
NASA Technical Reports Server (NTRS)
Simitses, G. J.; Riff, R.
1987-01-01
The problem of buckling of shallow arches under transient thermomechanical loads is investigated. The analysis is based on nonlinear geometric and constitutive relations and is expressed in a rate form. The material constitutive equations are capable of reproducing all non-isothermal, elasto-viscoplastic characteristics. The solution scheme is capable of predicting response that includes pre- and post-buckling with creep and plastic effects. The solution procedure is demonstrated through several examples which include both creep and snap-through behavior.
Microwave techniques for measuring complex permittivity and permeability of materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guillon, P.
1995-08-01
Different materials are of fundamental importance to the aerospace, microwave, electronics and communications industries, and include, for example, microwave absorbing materials, antennas, lenses and radomes, substrates for MMICs, and microwave components and antennas. Basic measurements of the complex permittivity and permeability of these homogeneous solid materials in the microwave spectral region are described, including hardware, instrumentation and analysis. Elevated-temperature measurements as well as measurement intercomparisons, with a discussion of the strengths and weaknesses of each technique, are also presented.
Method of multi-dimensional moment analysis for the characterization of signal peaks
Pfeifer, Kent B; Yelton, William G; Kerr, Dayle R; Bouchier, Francis A
2012-10-23
A method of multi-dimensional moment analysis for the characterization of signal peaks can be used to optimize the operation of an analytical system. With a two-dimensional Peclet analysis, the quality and signal fidelity of peaks in a two-dimensional experimental space can be analyzed and scored. This method is particularly useful in determining optimum operational parameters for an analytical system which requires the automated analysis of large numbers of analyte data peaks. For example, the method can be used to optimize analytical systems including an ion mobility spectrometer that uses a temperature stepped desorption technique for the detection of explosive mixtures.
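A moment-based peak score can be illustrated in a few lines. In the Python sketch below, the Peclet-style figure of merit (mean squared over variance) is one common definition; the patented scoring method may differ:

```python
def peak_moments(ts, ys):
    """Zeroth, first, and second central moments of a signal peak.

    ts: sample times; ys: non-negative signal amplitudes.
    Returns (area, mean, variance)."""
    area = sum(ys)
    mean = sum(t * y for t, y in zip(ts, ys)) / area
    var = sum((t - mean) ** 2 * y for t, y in zip(ts, ys)) / area
    return area, mean, var

def peclet(mean, var):
    # dimensionless sharpness score: large values indicate a narrow,
    # well-located peak; low values indicate a smeared, low-fidelity peak
    return mean ** 2 / var
```

An automated analyzer could score each detected analyte peak this way and tune operating parameters (for example, the desorption temperature steps) to maximize the score.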
A new u-statistic with superior design sensitivity in matched observational studies.
Rosenbaum, Paul R
2011-09-01
In an observational or nonrandomized study of treatment effects, a sensitivity analysis indicates the magnitude of bias from unmeasured covariates that would need to be present to alter the conclusions of a naïve analysis that presumes adjustments for observed covariates suffice to remove all bias. The power of sensitivity analysis is the probability that it will reject a false hypothesis about treatment effects allowing for a departure from random assignment of a specified magnitude; in particular, if this specified magnitude is "no departure" then this is the same as the power of a randomization test in a randomized experiment. A new family of u-statistics is proposed that includes Wilcoxon's signed rank statistic but also includes other statistics with substantially higher power when a sensitivity analysis is performed in an observational study. Wilcoxon's statistic has high power to detect small effects in large randomized experiments (that is, it often has good Pitman efficiency), but small effects are invariably sensitive to small unobserved biases. Members of this family of u-statistics that emphasize medium to large effects can have substantially higher power in a sensitivity analysis. For example, in one situation with 250 pair differences that are Normal with expectation 1/2 and variance 1, the power of a sensitivity analysis that uses Wilcoxon's statistic is 0.08 while the power of another member of the family of u-statistics is 0.66. The topic is examined by performing a sensitivity analysis in three observational studies, using an asymptotic measure called the design sensitivity, and by simulating power in finite samples. The three examples are drawn from epidemiology, clinical medicine, and genetic toxicology. © 2010, The International Biometric Society.
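The idea of estimating power by simulation can be illustrated with a much simpler statistic. The Python sketch below Monte-Carlo-estimates the power of a one-sided paired sign test for the Normal(1/2, 1) pair differences mentioned above; the sign test merely stands in for the paper's u-statistics, and the normal approximation to the binomial critical value is a simplification:

```python
import math
import random

def sign_test_power(n=250, effect=0.5, alpha_z=1.645, sims=500, seed=1):
    """Monte-Carlo power of a one-sided paired sign test when pair
    differences are Normal(effect, 1) -- a simple stand-in for the
    u-statistics discussed in the abstract."""
    rng = random.Random(seed)
    # normal-approximation critical count for Binomial(n, 1/2), one-sided
    crit = n / 2 + alpha_z * math.sqrt(n / 4.0)
    rejections = 0
    for _ in range(sims):
        positives = sum(rng.gauss(effect, 1.0) > 0 for _ in range(n))
        if positives > crit:
            rejections += 1
    return rejections / sims
```

With 250 pairs and an effect of 1/2, nearly every simulated sample rejects; the paper's point is that this ordering of statistics can reverse once sensitivity to unobserved bias is taken into account.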
ParallABEL: an R library for generalized parallelization of genome-wide association studies
2010-01-01
Background Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. It is arduous acquiring the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files. Results Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP), or trait, such as SNP characterization statistics or association test statistics. The input data of this group includes the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example, the summary statistics of genotype quality for each sample. The input data of this group includes individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses. The input data of this group includes pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as the linkage disequilibrium characterisation. The input data of this group includes pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses.
For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Conclusions Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance, and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL. PMID:20429914
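The partition-by-SNP idea behind the first group of computations can be sketched without R or Rmpi. In the Python sketch below, a thread pool stands in for the compute cluster, and minor-allele frequency is an arbitrarily chosen per-SNP statistic:

```python
from concurrent.futures import ThreadPoolExecutor

def snp_stat(genotypes):
    """Per-SNP statistic (group-1 style): minor-allele frequency.

    genotypes are coded 0/1/2 copies of the reference allele."""
    n = len(genotypes)
    freq = sum(genotypes) / (2 * n)
    return min(freq, 1 - freq)

def parallel_snp_stats(snp_matrix, workers=4):
    """Partition independent per-SNP computations across workers,
    ParallABEL-style; results merge back in input order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(snp_stat, snp_matrix))
```

Because each SNP's statistic is independent of the others, the work divides cleanly and the speed-up scales with the number of workers, which is the property the library exploits.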
EduGATE - basic examples for educative purpose using the GATE simulation platform.
Pietrzyk, Uwe; Zakhnini, Abdelhamid; Axer, Markus; Sauerzapf, Sophie; Benoit, Didier; Gaens, Michaela
2013-02-01
EduGATE is a collection of basic examples to introduce students to the fundamental physical aspects of medical imaging devices. It is based on the GATE platform, which has received wide acceptance in the field of simulating medical imaging devices including SPECT, PET, CT and also applications in radiation therapy. GATE can be configured by commands, which are, for the sake of simplicity, listed in a collection of one or more macro files to set up phantoms, multiple types of sources, the detection device, and acquisition parameters. The aim of EduGATE is to use all these helpful features of GATE to provide insights into the physics of medical imaging by means of a collection of very basic and simple GATE macros in connection with analysis programs based on ROOT, a framework for data processing. A graphical user interface to define a configuration is also included. Copyright © 2012. Published by Elsevier GmbH.
Hydrogen bonding in phytohormone-auxin (IAA) and its derivatives
NASA Astrophysics Data System (ADS)
Kojić-Prodić, Biserka; Kroon, Jan; Puntarec, Vitomir
1994-06-01
The significant importance of hydrogen bonds in biological structures and enzymatic reactions has been demonstrated in many examples. As a part of the molecular recognition study of auxins (plant growth hormones), the influence of hydrogen bonding on molecular conformation, particularly of the carboxyl group, which is one of the biologically active ligand sites, has been studied by X-ray diffraction and computational chemistry methods. The survey includes about 40 crystal structures of free auxins such as indol-3-ylacetic acid and its n-alkylated and halogenated derivatives, as well as bound auxins such as N-(indol-3-ylacetyl)- L-amino acids and carbohydrate conjugates. The study includes hydrogen bonds of the NH⋯O and OH⋯O types. The classification of hydrogen bond patterns based on the discrimination between centrosymmetric and non-centrosymmetric space groups, and several examples of hydrogen-bond systematics based on graph-set analysis, are also shown.
NASA Technical Reports Server (NTRS)
Walley, J. L.; Nunes, A. C.; Clounch, J. L.; Russell, C. K.
2007-01-01
This study presents examples and considerations for differentiating linear radiographic indications produced by gas tungsten arc welds in a 0.05-in-thick sheet of Inconel 718. A series of welds with different structural features, including the enigma indications and other defect indications such as lack of fusion and penetration, were produced, radiographed, and examined metallographically. The enigma indications were produced by a large columnar grain running along the center of the weld nugget, which occurred when the weld speed was reduced sufficiently below nominal. Examples of respective indications, including the effect of changing the x-ray source location, are presented as an aid to differentiation. Enigma, nominal, and hot-weld specimens were tensile tested to demonstrate the harmlessness of the enigma indication. Statistical analysis showed that there is no difference between the strengths of these three weld conditions.
Advanced analysis technique for the evaluation of linear alternators and linear motors
NASA Technical Reports Server (NTRS)
Holliday, Jeffrey C.
1995-01-01
A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.
NASTRAN nonlinear vibration analysis of beam and frame structures
NASA Technical Reports Server (NTRS)
Mei, C.; Rogers, J. L., Jr.
1975-01-01
A capability for the nonlinear vibration analysis of beam and frame structures suitable for use with NASTRAN level 15.5 is described. The nonlinearity considered is due to the presence of axial loads induced by longitudinal end restraints and lateral displacements that are large compared to the beam height. A brief discussion is included of the mathematical analysis and the geometrical stiffness matrix for a prismatic beam (BAR) element. Also included are a brief discussion of the equivalent linearization iterative process used to determine the nonlinear frequency and the required modifications to subroutines DBAR and XMPLBD of the NASTRAN code. To demonstrate the nonlinear vibration capability, four example problems are presented. Comparisons with existing experimental and analytical results show that excellent accuracy is achieved with NASTRAN in all cases.
Shardell, Michelle; Harris, Anthony D; El-Kamary, Samer S; Furuno, Jon P; Miller, Ram R; Perencevich, Eli N
2007-10-01
Quasi-experimental study designs are frequently used to assess interventions that aim to limit the emergence of antimicrobial-resistant pathogens. However, previous studies using these designs have often used suboptimal statistical methods, which may result in researchers making spurious conclusions. Methods used to analyze quasi-experimental data include 2-group tests, regression analysis, and time-series analysis, and they all have specific assumptions, data requirements, strengths, and limitations. An example of a hospital-based intervention to reduce methicillin-resistant Staphylococcus aureus infection rates and reduce overall length of stay is used to explore these methods.
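One of the methods mentioned, time-series analysis of an intervention, can be illustrated with a minimal segmented-trend fit. The Python sketch below is not the authors' method; it simply fits separate linear trends to the pre- and post-intervention periods and reports the fitted level change at the intervention point:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
         sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def its_level_change(ys, t0):
    """Interrupted time-series sketch: fit separate trends before and
    after the intervention at index t0, return the level change at t0."""
    a1, b1 = fit_line(list(range(t0)), ys[:t0])
    a2, b2 = fit_line(list(range(t0, len(ys))), ys[t0:])
    return (a2 + b2 * t0) - (a1 + b1 * t0)   # fitted jump at intervention
```

For an MRSA-rate series, `ys` would hold monthly infection rates and `t0` the month the intervention began; a naive two-group test would miss the pre-existing trend that this fit accounts for.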
Analysis of an inventory model for both linearly decreasing demand and holding cost
NASA Astrophysics Data System (ADS)
Malik, A. K.; Singh, Parth Raj; Tomar, Ajay; Kumar, Satish; Yadav, S. K.
2016-03-01
This study proposes the analysis of an inventory model with linearly decreasing demand and holding cost for non-instantaneous deteriorating items. The model focuses on commodities having linearly decreasing demand without shortages. The holding cost does not remain uniform over time, owing to variation in the time value of money; here we consider a holding cost that decreases with respect to time. The optimal time interval for the total profit and the optimal order quantity are determined. The developed inventory model is illustrated through a numerical example, and a sensitivity analysis is included.
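The model's structure can be made concrete numerically. In the Python sketch below, the linear demand and holding-cost functions and all parameter values are hypothetical stand-ins, not the paper's formulation or data; profit per unit time is integrated over one replenishment cycle and the cycle length is chosen by grid search:

```python
def total_profit_rate(T, a=100.0, b=2.0, margin=2.0, h0=0.5, h1=0.02,
                      setup=50.0, steps=1000):
    """Average profit per unit time over one cycle of length T (sketch).

    Demand D(t) = a - b*t and holding-cost rate h(t) = h0 - h1*t both
    decrease linearly. No shortages: the order covers the whole cycle's
    demand, so the stock at time t is the demand still to come."""
    dt = T / steps
    revenue = holding = 0.0
    for k in range(steps):
        t = (k + 0.5) * dt                              # midpoint rule
        demand = a - b * t
        stock = a * (T - t) - b * (T * T - t * t) / 2   # integral of D over [t, T]
        revenue += margin * demand * dt
        holding += (h0 - h1 * t) * stock * dt
    return (revenue - holding - setup) / T

def best_cycle(candidates):
    """Grid search for the cycle length with the highest profit rate."""
    return max(candidates, key=total_profit_rate)
```

Varying one parameter at a time around the optimum (the sensitivity analysis the paper includes) shows, for example, how quickly the optimal cycle shortens as the base holding cost h0 rises.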
Improving Public Perception of Behavior Analysis.
Freedman, David H
2016-05-01
The potential impact of behavior analysis is limited by the public's dim awareness of the field. The mass media rarely cover behavior analysis, other than to echo inaccurate negative stereotypes about control and punishment. The media instead play up appealing but less-evidence-based approaches to problems, a key example being the touting of dubious diets over behavioral approaches to losing excess weight. These sorts of claims distort or skirt scientific evidence, undercutting the fidelity of behavior analysis to scientific rigor. Strategies for better connecting behavior analysis with the public might include reframing the field's techniques and principles in friendlier, more resonant form; pushing direct outcome comparisons between behavior analysis and its rivals in simple terms; and playing up the "warm and fuzzy" side of behavior analysis.
NASA Astrophysics Data System (ADS)
Genberg, Victor L.; Michels, Gregory J.
2017-08-01
The ultimate design goal of an optical system subjected to dynamic loads is to minimize system level wavefront error (WFE). In random response analysis, system WFE is difficult to predict from finite element results due to the loss of phase information. In the past, the use of system WFE was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for determining system level WFE using a linear optics model is presented. An error estimate is included in the analysis output based on fitting errors of mode shapes. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
Jacobson, Steven D.
2014-08-19
Certain examples provide optical contact micrometers and methods of use. An example optical contact micrometer includes a pair of opposable lenses to receive an object and immobilize the object in a position. The example optical contact micrometer includes a pair of opposable mirrors positioned with respect to the pair of lenses to facilitate viewing of the object through the lenses. The example optical contact micrometer includes a microscope to facilitate viewing of the object through the lenses via the mirrors; and an interferometer to obtain one or more measurements of the object.
Simulation analysis of an integrated model for dynamic cellular manufacturing system
NASA Astrophysics Data System (ADS)
Hao, Chunfeng; Luan, Shichao; Kong, Jili
2017-05-01
Application of a dynamic cellular manufacturing system (DCMS) is a well-known strategy to improve manufacturing efficiency in production environments with high variety and low volume of production. Often, neither the trade-off between inter- and intra-cell material movements nor the trade-off between hiring and firing of operators is examined in detail. This paper presents simulation results of an integrated mixed-integer model, including sensitivity analysis, for several numerical examples. The comprehensive model includes cell formation, inter- and intracellular materials handling, inventory and backorder holding, operator assignment (including resource adjustment) and flexible production routing. The model considers multi-period production planning with flexible resources (machines and operators), where each period has different demands. The results verify the validity and sensitivity of the proposed model using a genetic algorithm.
NASA Technical Reports Server (NTRS)
Radhakrishnan, Krishnan; Bittker, David A.
1994-01-01
LSENS, the Lewis General Chemical Kinetics and Sensitivity Analysis Code, has been developed for solving complex, homogeneous, gas-phase chemical kinetics problems and contains sensitivity analysis for a variety of problems, including nonisothermal situations. This report is part II of a series of three reference publications that describe LSENS, provide a detailed guide to its usage, and present many example problems. Part II describes the code, how to modify it, and its usage, including preparation of the problem data file required to execute LSENS. Code usage is illustrated by several example problems, which further explain preparation of the problem data file and show how to obtain desired accuracy in the computed results. LSENS is a flexible, convenient, accurate, and efficient solver for chemical reaction problems such as static system; steady, one-dimensional, inviscid flow; reaction behind incident shock wave, including boundary layer correction; and perfectly stirred (highly backmixed) reactor. In addition, the chemical equilibrium state can be computed for the following assigned states: temperature and pressure, enthalpy and pressure, temperature and volume, and internal energy and volume. For static problems the code computes the sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of the chemical reactions. Part I (NASA RP-1328) derives the governing equations and describes the numerical solution procedures for the types of problems that can be solved by LSENS. Part III (NASA RP-1330) explains the kinetics and kinetics-plus-sensitivity-analysis problems supplied with LSENS and presents sample results.
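The kind of problem LSENS solves can be illustrated on the smallest possible kinetics system. The Python sketch below integrates a single first-order reaction A → B with classical RK4 and estimates the sensitivity of the final concentration to the rate coefficient by central finite differences; LSENS itself handles full reaction mechanisms with dedicated sensitivity solvers, not finite differences:

```python
import math

def integrate_decay(k, a0=1.0, t_end=1.0, steps=1000):
    """RK4 integration of dA/dt = -k*A, a toy stand-in for the
    gas-phase kinetics systems LSENS solves. Returns A(t_end)."""
    dt = t_end / steps
    a = a0
    f = lambda y: -k * y
    for _ in range(steps):
        k1 = f(a)
        k2 = f(a + dt * k1 / 2)
        k3 = f(a + dt * k2 / 2)
        k4 = f(a + dt * k3)
        a += dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    return a

def sensitivity(k, eps=1e-6, **kw):
    """Central-difference estimate of dA(t_end)/dk, the kind of
    sensitivity coefficient LSENS reports for rate parameters."""
    return (integrate_decay(k + eps, **kw) -
            integrate_decay(k - eps, **kw)) / (2 * eps)
```

For this toy system the analytic answers are A(t) = A0·exp(-k·t) and dA/dk = -A0·t·exp(-k·t), which the numerics reproduce closely.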
Statistical analysis and interpolation of compositional data in materials science.
Pesenson, Misha Z; Suram, Santosh K; Gregoire, John M
2015-02-09
Compositional data are ubiquitous in chemistry and materials science: analysis of elements in multicomponent systems, combinatorial problems, etc., lead to data that are non-negative and sum to a constant (for example, atomic concentrations). The constant sum constraint restricts the sampling space to a simplex instead of the usual Euclidean space. Since statistical measures such as mean and standard deviation are defined for the Euclidean space, traditional correlation studies, multivariate analysis, and hypothesis testing may lead to erroneous dependencies and incorrect inferences when applied to compositional data. Furthermore, composition measurements that are used for data analytics may not include all of the elements contained in the material; that is, the measurements may be subcompositions of a higher-dimensional parent composition. Physically meaningful statistical analysis must yield results that are invariant under the number of composition elements, requiring the application of specialized statistical tools. We present specifics and subtleties of compositional data processing through discussion of illustrative examples. We introduce basic concepts, terminology, and methods required for the analysis of compositional data and utilize them for the spatial interpolation of composition in a sputtered thin film. The results demonstrate the importance of this mathematical framework for compositional data analysis (CDA) in the fields of materials science and chemistry.
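The simplex constraint and the specialized tools it requires can be made concrete. The Python sketch below applies the closure operation (renormalizing raw amounts to a constant sum) and the centered log-ratio (clr) transform, a standard device for moving compositional data off the simplex into ordinary Euclidean space where the usual statistics apply:

```python
import math

def closure(parts):
    """Renormalize strictly positive raw amounts to a constant-sum
    composition (here summing to 1)."""
    s = sum(parts)
    return [p / s for p in parts]

def clr(composition):
    """Centered log-ratio transform: log of each part divided by the
    geometric mean. Components are strictly positive; the result is
    scale-invariant and sums to zero."""
    g = math.exp(sum(math.log(x) for x in composition) / len(composition))
    return [math.log(x / g) for x in composition]
```

Scale invariance is the key property: measuring the same material in different total amounts yields identical clr coordinates, so correlations computed in this space are not artifacts of the constant-sum constraint.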
The Boeing plastic analysis capability for engines
NASA Technical Reports Server (NTRS)
Vos, R. G.
1976-01-01
The current BOPACE program is described as a nonlinear stress analysis program, which is based on a family of isoparametric finite elements. The theoretical, user, programmer, and preprocessing aspects are discussed, and example problems are included. New features in the current program version include substructuring, an out-of-core Gauss wavefront equation solver, multipoint constraints, combined material and geometric nonlinearities, automatic calculation of inertia effects, provision for distributed as well as concentrated mechanical loads, follower forces, singular crack-tip elements, the SAIL automatic generation capability, and expanded user control over input quantity definition, output selection, and program execution. BOPACE is written in FORTRAN 4 and is currently available for both the IBM 360/370 and the UNIVAC 1108 machines.
NASA Technical Reports Server (NTRS)
Glassman, A. J.
1974-01-01
A computer program to analyze power systems having any number of shafts up to a maximum of five is presented. On each shaft there can be as many as five compressors and five turbines, along with any specified number of intervening intercoolers and reheaters. A recuperator can be included. Turbine coolant flow can be accounted for. Any fuel consisting entirely of hydrogen and/or carbon can be used. The program is valid for maximum temperatures up to about 2000 K (3600 R). The system description, the analysis method, a detailed explanation of program input and output including an illustrative example, a dictionary of program variables, and the program listing are explained.
Advances in Additive Manufacturing
2016-07-14
…of 3D-printed structures. Analysis examples will include quantification of tolerance differences between the designed and manufactured parts, void… Subject terms: 3-D printing, validation and verification, nondestructive inspection, print-on-the-move, prototyping. …1) researching the formation of AM-grade metal powder from battlefield scrap and operating base waste, 2) the potential of 3-D printing with sand to make…
X-ray and synchrotron methods in studies of cultural heritage sites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koval’chuk, M. V.; Yatsishina, E. B.; Blagov, A. E.
2016-09-15
X-ray and synchrotron methods that are most widely used in studies of cultural heritage objects (including archaeological sites), namely X-ray diffraction analysis, X-ray spectroscopy, and visualization techniques, have been considered. The reported examples show the high efficiency and informativeness of natural-science studies in solving most diverse problems of archaeology, history, the study of art, museology, etc.
In-flight thrust determination
NASA Technical Reports Server (NTRS)
Abernethy, Robert B.; Adams, Gary R.; Ascough, John C.; Baer-Riedhart, Jennifer L.; Balkcom, George H.; Biesiadny, Thomas
1986-01-01
The major aspects of processes that may be used for the determination of in-flight thrust are reviewed. Basic definitions are presented as well as analytical and ground-test methods for gathering data and calculating the thrust of the propulsion system during the flight development program of the aircraft. Test analysis examples include a single-exhaust turbofan, an intermediate-cowl turbofan, and a mixed-flow afterburning turbofan.
ERIC Educational Resources Information Center
Gryphon, Marie
In 2002, the Supreme Court upheld an Ohio school choice program designed to help children leave Cleveland's failing public schools. This paper explains the history of the Cleveland program upheld in Zelman v. Simmons-Harris, describing the rules that the Supreme Court established for school choice. It includes examples and strategy to help…
Comparison of Fixed-Item and Response-Sensitive Versions of an Online Tutorial
ERIC Educational Resources Information Center
Grant, Lyle K.; Courtoreille, Marni
2007-01-01
This study is a comparison of 2 versions of an Internet-based tutorial that teaches the behavior-analysis concept of positive reinforcement. A fixed-item group of students studied a version of the tutorial that included 14 interactive examples and nonexamples of the concept. A response-sensitive group of students studied a different version of the…
Combining computer and manual overlays—Willamette River Greenway Study
Asa Hanamoto; Lucille Biesbroeck
1979-01-01
We will present a method of combining computer mapping with manual overlays. An example of its use is the Willamette River Greenway Study produced for the State of Oregon Department of Transportation in 1974. This one year planning study included analysis of data relevant to a 286-mile river system. The product is a "wise use" plan which conserves the basic...
ERIC Educational Resources Information Center
Taub, Edward
2012-01-01
Constraint-induced (CI) therapy is a term given to a family of efficacious neurorehabilitation treatments including, to date, upper extremity CI movement therapy, lower extremity CI movement therapy, pediatric CI therapy, and CI aphasia therapy. The purpose of this article is to outline the behavior analysis origins of CI therapy and the ways in…
Vulnerability Analysis of an All-Electric Warship
2010-06-01
active. Damage Control: Fire fighting, dewatering, lighting, electrical receptacles (for powering damage control equipment such as submersible pumps) ... sufficient radar not available. This also requires an increase in chill water capacity by adding pump, compressor, and ASW pump. Remaining ventilation systems ... Activate towed-array sonar, if applicable. Increase speed to 25 knots. Non-Vital Loads: All non-vital loads. Examples include galley equipment, heat
NASA Astrophysics Data System (ADS)
Wu, Qing-Chu; Fu, Xin-Chu; Sun, Wei-Gang
2010-01-01
In this paper a class of networks with multiple connections is discussed. The multiple connections comprise two different types of links between nodes in complex networks. For this new model, we give a simple generating procedure. Furthermore, we investigate dynamical synchronization behavior in a delayed two-layer network, giving the corresponding theoretical analysis and numerical examples.
Information-Decay Pursuit of Dynamic Parameters in Student Models
1994-04-01
simple worked-through example). Commercially available computer programs for structuring and using Bayesian inference include ERGO (Noetic Systems) ... Tukey, J.W. (1977). Data Analysis and Regression: A Second Course in Statistics. Reading, MA: Addison-Wesley. Noetic Systems, Inc. (1991). ERGO ...
An Analysis of Botnet Vulnerabilities
2007-06-01
Currently, the primary defense against botnets is prompt patching of vulnerable systems and antivirus software. Network monitoring can identify ... IRCd software, none were identified during this effort. ... Bots are software agents designed to automatically perform tasks. Examples include web-spiders that catalog the Internet and bots found in popular online
Gregorich, Steven E
2006-11-01
Comparative public health research makes wide use of self-report instruments. For example, research identifying and explaining health disparities across demographic strata may seek to understand the health effects of patient attitudes or private behaviors. Such personal attributes are difficult or impossible to observe directly and are often best measured by self-reports. Defensible use of self-reports in quantitative comparative research requires not only that the measured constructs have the same meaning across groups, but also that group comparisons of sample estimates (e.g., means and variances) reflect true group differences and are not contaminated by group-specific attributes that are unrelated to the construct of interest. Evidence for these desirable properties of measurement instruments can be established within the confirmatory factor analysis (CFA) framework; a nested hierarchy of hypotheses is tested that addresses the cross-group invariance of the instrument's psychometric properties. By name, these hypotheses include configural, metric (or pattern), strong (or scalar), and strict factorial invariance. The CFA model and each of these hypotheses are described in nontechnical language. A worked example and technical appendices are included.
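The nested invariance hypotheses above are conventionally compared with chi-square difference (likelihood-ratio) tests between adjacent models in the hierarchy. As a minimal sketch, assuming hypothetical fit statistics (the numbers below are illustrative, not taken from the article), a comparison of a configural model against the more restricted metric-invariance model might look like:

```python
from scipy.stats import chi2

def lr_test(chisq_restricted, df_restricted, chisq_free, df_free):
    """Chi-square difference test between two nested CFA models.

    A small p-value indicates the more restricted model (e.g. metric
    invariance) fits significantly worse than the freer model
    (e.g. configural), so the invariance constraint is rejected.
    """
    delta_chisq = chisq_restricted - chisq_free
    delta_df = df_restricted - df_free
    p = chi2.sf(delta_chisq, delta_df)
    return delta_chisq, delta_df, p

# Hypothetical fit statistics for configural vs. metric invariance
d_chisq, d_df, p = lr_test(chisq_restricted=112.4, df_restricted=54,
                           chisq_free=98.1, df_free=48)
print(f"delta chi2 = {d_chisq:.1f} on {d_df} df, p = {p:.3f}")
```

In practice the chi-square statistics would come from whatever SEM package fit the group models; only the difference test is shown here.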
Analysis and remediation of aphasia in the U.S.S.R: the contribution of A. R. Luria.
Hatfield, F M
1981-11-01
This paper surveys the contribution of A. R. Luria to aphasiology, emphasising the unique extent to which he integrated theory and therapeutic practice. The influence exerted by two prominent Russian figures, Pavlov and Vygotskii, is discussed. Luria's view of the primary defects underlying the main forms of aphasia is summarised; this is followed by a brief account of his application of certain notions of structural linguistics, including Jakobson's interpretations of the breakdown of language following brain damage. Examples are given of the wide range of simple tests included in Luria's neuropsychological investigations. The factual part of the article culminates in some examples of his methods of restoring higher cortical functions, in particular, verbal skills. The summary criticises certain aspects of Luria's analysis as being too mechanistic and simplistic, and cites criticisms of details from other workers, but considers many of his insights and the total coherence of his view of cortical functioning and cortical disturbance to be still of the utmost importance for clinicians undertaking aphasia therapy. The need for therapists everywhere to develop language rehabilitation with as systematic a basis as Luria's is stressed.
Theoretical study of air forces on an oscillating or steady thin wing in a supersonic main stream
NASA Technical Reports Server (NTRS)
Garrick, I E; Rubinow, S I
1947-01-01
A theoretical study, based on the linearized equations of motion for small disturbances, is made of the air forces on wings of general plan forms moving forward at a constant supersonic speed. The boundary problem is set up for both the harmonically oscillating and the steady conditions. Two types of boundary conditions are distinguished, which are designated "purely supersonic" and "mixed supersonic." The purely supersonic case involves independence of action of the upper and lower surfaces of the airfoil, and the present analysis is mainly concerned with this case. A discussion is first given of the fundamental or elementary solution corresponding to a moving source. The solutions for the velocity potential are then synthesized by means of integration of the fundamental solution for the moving source. The method is illustrated by applications to a number of examples for both the steady and the oscillating cases and for various plan forms, including swept wings and rectangular and triangular plan forms. The special results of a number of authors are shown to be included in the analysis.
Characteristics Desired in Clinical Data Warehouse for Biomedical Research
Shin, Soo-Yong; Kim, Woo Sung
2014-01-01
Objectives: Due to the unique characteristics of clinical data, clinical data warehouses (CDWs) have so far seen limited success, particularly when used for biomedical research. The characteristics necessary for the successful implementation and operation of a CDW for biomedical research have not yet been clearly defined. Methods: Three examples of CDWs were reviewed: a multipurpose CDW in a hospital, a CDW for independent multi-institutional research, and a CDW for research use within an institution. After reviewing the three examples, we propose some key characteristics needed in a CDW for biomedical research. Results: A CDW for research should include an honest broker system and an Institutional Review Board approval interface to comply with governmental regulations. It should also include a simple query interface, an anonymized data review tool, and a data extraction tool. In addition, it should serve as a biomedical research platform for both data repository and data analysis use. Conclusions: The proposed characteristics may have limited transfer value to organizations in other countries. However, these analysis results remain valid in Korea, and we have developed a clinical research data warehouse based on these desiderata. PMID:24872909
Development of an integrated BEM approach for hot fluid structure interaction
NASA Technical Reports Server (NTRS)
Dargush, G. F.; Banerjee, P. K.; Shi, Y.
1991-01-01
The development of a comprehensive fluid-structure interaction capability within a boundary element computer code is described. This new capability is implemented in a completely general manner, so that quite arbitrary geometry, material properties and boundary conditions may be specified. Thus, a single analysis code can be used to run structures-only problems, fluids-only problems, or the combined fluid-structure problem. In all three cases, steady or transient conditions can be selected, with or without thermal effects. Nonlinear analyses can be solved via direct iteration or by employing a modified Newton-Raphson approach. A number of detailed numerical examples are included to validate the formulations and to emphasize both the accuracy and generality of the computer code. A brief review of the recent applicable boundary element literature is included for completeness. The fluid-structure interaction facility is discussed, and several examples are provided to highlight this unique capability. A collection of potential boundary element applications uncovered as a result of work related to the present grant is given. For most of those problems, satisfactory analysis techniques do not currently exist.
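The modified Newton-Raphson approach mentioned above is a standard device in nonlinear boundary element and finite element codes: the Jacobian (tangent matrix) is assembled and factored once and then reused across iterations, trading quadratic convergence for fewer expensive assemblies. A generic sketch, with a toy two-equation system that is illustrative rather than drawn from the paper:

```python
import numpy as np

def modified_newton_raphson(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Modified Newton-Raphson: the Jacobian is evaluated once at the
    starting point and reused every iteration, which converges linearly
    rather than quadratically but avoids repeated assembly."""
    x = np.asarray(x0, dtype=float)
    J = jacobian(x)                    # assembled once, reused below
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            break
        x = x - np.linalg.solve(J, r)  # same factorizable J each pass
    return x

# Toy nonlinear system: x0^2 + x1 - 3 = 0, x0 + x1^2 - 5 = 0
res = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
jac = lambda x: np.array([[2 * x[0], 1.0], [1.0, 2 * x[1]]])
x = modified_newton_raphson(res, jac, x0=[1.5, 1.5])
```

Starting close enough to the root (1, 2), the fixed-Jacobian iteration still converges, just more slowly than a full Newton update would.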
Guidance for using mixed methods design in nursing practice research.
Chiang-Hanisko, Lenny; Newman, David; Dyess, Susan; Piyakong, Duangporn; Liehr, Patricia
2016-08-01
The mixed methods approach purposefully combines both quantitative and qualitative techniques, enabling a multi-faceted understanding of nursing phenomena. The purpose of this article is to introduce three mixed methods designs (parallel; sequential; conversion) and highlight interpretive processes that occur with the synthesis of qualitative and quantitative findings. Real-world examples of research studies conducted by the authors demonstrate the processes leading to the merger of data. The examples include research questions, data collection procedures, and analysis with a focus on synthesizing findings. Based on experience with mixed methods studies, the authors introduce two synthesis patterns (complementary; contrasting), considering application for practice and implications for research. Copyright © 2015 Elsevier Inc. All rights reserved.
Pendulum rides, rotations and the Coriolis effect
NASA Astrophysics Data System (ADS)
Pendrill, Ann-Marie; Modig, Conny
2018-07-01
An amusement park is full of examples that can be made into challenging problems for students, combining mathematical modelling with video analysis, as well as measurements in the rides. Traditional amusement ride related textbook problems include free-fall, circular motion, pendula and energy conservation in roller coasters, where the moving bodies are typically considered point-like. However, an amusement park can offer many more examples that are useful in physics and engineering education, many of them with strong mathematical content. This paper analyses forces on riders in a large rotating pendulum ride, where the Coriolis effect is sufficiently large to be visible in accelerometer data from the rides and leads to different ride experiences in different positions.
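The Coriolis contribution analyzed above follows from a = 2Ωv for a rider moving with speed v in a frame rotating at angular velocity Ω. A back-of-the-envelope sketch with hypothetical ride numbers (not the specific ride measured in the paper):

```python
import math

def coriolis_acceleration(omega, speed):
    """Magnitude of the Coriolis acceleration a = 2*omega*v for motion
    at speed v perpendicular to the rotation axis of a frame turning
    at angular velocity omega (rad/s)."""
    return 2.0 * omega * speed

# Hypothetical numbers: the ride platform turns once per minute while
# the pendulum swing carries riders at about 10 m/s in that frame.
omega = 2.0 * math.pi / 60.0           # 1 rev/min in rad/s
a_cor = coriolis_acceleration(omega, 10.0)
g_fraction = a_cor / 9.81              # compare with accelerometer scale
```

Even these modest numbers give a Coriolis term of a few tenths of g, consistent with the paper's point that the effect is visible in ride accelerometer data.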
NASA Astrophysics Data System (ADS)
Shatravin, V.; Shashev, D. V.
2018-05-01
Currently, robots are increasingly being used in every industry. One of the most high-tech areas is the creation of completely autonomous robotic devices, including vehicles. The results of various global research efforts prove the efficiency of vision systems in autonomous robotic devices. However, the use of these systems is limited by the computational and energy resources available on the robotic device. The paper describes the results of applying an original approach to image processing on reconfigurable computing environments, using the example of morphological operations over grayscale images. This approach is promising for realizing complex image processing algorithms and real-time image analysis in autonomous robotic devices.
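Grayscale morphological operations of the kind used as the example above reduce to local minimum (erosion) and maximum (dilation) filters over a structuring element. A plain NumPy sketch, independent of any reconfigurable-hardware implementation:

```python
import numpy as np

def grey_erode(img, size=3):
    """Grayscale erosion with a flat square structuring element:
    each pixel is replaced by the minimum over its neighborhood."""
    pad = size // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + size, j:j + size].min()
    return out

def grey_dilate(img, size=3):
    """Grayscale dilation: maximum over the neighborhood."""
    pad = size // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + size, j:j + size].max()
    return out

# A single bright pixel (noise) is removed by an opening:
# erosion followed by dilation.
img = np.zeros((5, 5), dtype=np.uint8)
img[2, 2] = 255
opened = grey_dilate(grey_erode(img))
```

The opening suppresses the isolated bright pixel entirely, which is the classic use of this operator pair for speckle removal.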
Analysis of Local Structure, Chemistry and Bonding by Electron Energy Loss Spectroscopy
NASA Astrophysics Data System (ADS)
Mayer, Joachim
In the present chapter, the reader will first be introduced briefly to the basic principles of analytical transmission electron microscopy (ATEM) with special emphasis on electron energy-loss spectroscopy (EELS) and energy-filtering TEM. The quantification of spectra to obtain chemical information and the origin and interpretation of near-edge fine structures in EELS (ELNES) are discussed. Special attention will be given to the characterization of internal interfaces and the literature in this area will be reviewed. Selected examples of the application of ATEM in the investigation of internal interfaces will be given. These examples include both EELS in the energy-filtering TEM and in the scanning transmission electron microscope (STEM).
Althuis, Michelle D; Weed, Douglas L; Frankenfeld, Cara L
2014-07-23
Assessment of design heterogeneity conducted prior to meta-analysis is infrequently reported; it is often presented post hoc to explain statistical heterogeneity. However, design heterogeneity determines the mix of included studies and how they are analyzed in a meta-analysis, which in turn can importantly influence the results. The goal of this work is to introduce ways to improve the assessment and reporting of design heterogeneity prior to statistical summarization of epidemiologic studies. In this paper, we use an assessment of sugar-sweetened beverages (SSB) and type 2 diabetes (T2D) as an example to show how a technique called 'evidence mapping' can be used to organize studies and evaluate design heterogeneity prior to meta-analysis. Employing a systematic and reproducible approach, we evaluated the following elements across 11 selected cohort studies: variation in definitions of SSB, T2D, and co-variables; design features and population characteristics associated with specific definitions of SSB; and diversity in modeling strategies. Evidence mapping strategies effectively organized complex data and clearly depicted design heterogeneity. For example, across 11 studies of SSB and T2D, 7 measured diet only once (with 7 to 16 years of disease follow-up), 5 included primarily low SSB consumers, and 3 defined the study variable (SSB) as consumption of either sugar- or artificially-sweetened beverages. This exercise also identified diversity in analysis strategies, such as adjustment for 11 to 17 co-variables and a large degree of fluctuation in SSB-T2D risk estimates depending on variables selected for multivariable models (2 to 95% change in the risk estimate from the age-adjusted model). Meta-analysis seeks to understand heterogeneity in addition to computing a summary risk estimate.
This strategy effectively documents design heterogeneity, thus improving the practice of meta-analysis by aiding in: 1) protocol and analysis planning, 2) transparent reporting of differences in study designs, and 3) interpretation of pooled estimates. We recommend expanding the practice of meta-analysis reporting to include a table that summarizes design heterogeneity. This would provide readers with more evidence to interpret the summary risk estimates.
Symbolic computer vector analysis
NASA Technical Reports Server (NTRS)
Stoutemyer, D. R.
1977-01-01
A MACSYMA program is described which performs symbolic vector algebra and vector calculus. The program can combine and simplify symbolic expressions including dot products and cross products, together with the gradient, divergence, curl, and Laplacian operators. The distribution of these operators over sums or products is under user control, as are various other expansions, including expansion into components in any specific orthogonal coordinate system. There is also a capability for deriving the scalar or vector potential of a vector field. Examples include derivation of the partial differential equations describing fluid flow and magnetohydrodynamics, for 12 different classic orthogonal curvilinear coordinate systems.
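MACSYMA itself is rarely available today, but the same symbolic vector-calculus operations can be reproduced in SymPy's vector module. The fields below are illustrative examples, not drawn from the report:

```python
from sympy.vector import CoordSys3D, Vector, gradient, divergence, curl

N = CoordSys3D('N')   # Cartesian system with base vectors N.i, N.j, N.k

f = N.x**2 * N.y + N.z                  # a scalar field
v = N.x * N.i + N.y * N.j + N.z * N.k   # a vector field

grad_f = gradient(f)      # 2*x*y i + x**2 j + 1 k
div_v = divergence(v)     # 3
curl_grad = curl(grad_f)  # zero vector: the curl of a gradient vanishes
```

SymPy's vector module also supports other orthogonal coordinate systems via `CoordSys3D` transformations, echoing the MACSYMA program's coverage of curvilinear coordinates.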
Fundamental principles of conducting a surgery economic analysis study.
Kotsis, Sandra V; Chung, Kevin C
2010-02-01
The use of economic evaluation in surgery is scarce, and it is used even less in plastic surgery, in which health-related quality of life is of particular importance. This article, part of a tutorial series on evidence-based medicine, focuses on the fundamental principles of conducting a surgery economic analysis. The authors include the essential aspects of conducting a surgical cost-utility analysis by considering perspectives, costs, outcomes, and utilities. The authors also describe and give examples of how to conduct the analyses (including calculating quality-adjusted life-years and discounting), how to interpret the results, and how to report the results. Although economic analyses are not simple to conduct, a well-conducted one provides many rewards, such as recommending the adoption of a more effective treatment. For comparing and interpreting economic analysis publications, it is important that all studies use consistent methodology and report the results in a similar manner.
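The core arithmetic of a cost-utility analysis, discounted quality-adjusted life-years and an incremental cost-effectiveness ratio, can be sketched in a few lines. All figures below (utility weights, costs, the 3% discount rate) are hypothetical, chosen only to illustrate the calculation:

```python
def discounted_qalys(utilities, rate=0.03):
    """Sum of quality-adjusted life-years, discounted at `rate` per year.
    utilities[t] is the utility weight (0..1) lived during year t."""
    return sum(u / (1.0 + rate) ** t for t, u in enumerate(utilities))

# Hypothetical comparison of two treatments over a 5-year horizon
qaly_new = discounted_qalys([0.9] * 5)
qaly_old = discounted_qalys([0.7] * 5)
cost_new, cost_old = 18_000.0, 10_000.0

# Incremental cost-effectiveness ratio: extra cost per QALY gained
icer = (cost_new - cost_old) / (qaly_new - qaly_old)
```

The resulting ICER would then be compared against a willingness-to-pay threshold from the chosen analytic perspective.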
Modeling and Analysis of Power Processing Systems (MAPPS), initial phase 2
NASA Technical Reports Server (NTRS)
Yu, Y.; Lee, F. C.; Wangenheim, H.; Warren, D.
1977-01-01
The overall objective of the program is to provide the engineering tools to reduce the analysis, design, and development effort, and thus the cost, in achieving the required performances for switching regulators and dc-dc converter systems. The program was both tutorial and application oriented. Various analytical methods were described in detail and supplemented with examples, and those with standardization appeals were reduced into computer-based subprograms. Major program efforts included those concerning small and large signal control-dependent performance analysis and simulation, control circuit design, power circuit design and optimization, system configuration study, and system performance simulation. Techniques including discrete time domain, conventional frequency domain, Lagrange multiplier, nonlinear programming, and control design synthesis were employed in these efforts. To enhance interactive conversation between the modeling and analysis subprograms and the user, a working prototype of the Data Management Program was also developed to facilitate expansion as future subprogram capabilities increase.
Neyman, Markov processes and survival analysis.
Yang, Grace
2013-07-01
J. Neyman used stochastic processes extensively in his applied work. One example is the Fix and Neyman (F-N) competing risks model (1951) that uses finite homogeneous Markov processes to analyse clinical trials with breast cancer patients. We revisit the F-N model, and compare it with the Kaplan-Meier (K-M) formulation for right censored data. The comparison offers a way to generalize the K-M formulation to include risks of recovery and relapses in the calculation of a patient's survival probability. The generalization is to extend the F-N model to a nonhomogeneous Markov process. Closed-form solutions of the survival probability are available in special cases of the nonhomogeneous processes, like the popular multiple decrement model (including the K-M model) and Chiang's staging model, but these models do not consider recovery and relapses while the F-N model does. An analysis of sero-epidemiological current status data with recurrent events is illustrated. Fix and Neyman used Neyman's RBAN (regular best asymptotic normal) estimates for the risks, and provided a numerical example showing the importance of considering both the survival probability and the length of time a patient lives a normal life in the evaluation of clinical trials. This extension would result in a complicated model, and it is unlikely that analytical closed-form solutions exist for survival analysis. With ever increasing computing power, numerical methods offer a viable way of investigating the problem.
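For reference, the Kaplan-Meier formulation discussed above multiplies, at each distinct event time, the factor (1 - d/n): deaths over the number still at risk. A self-contained sketch with made-up right-censored follow-up data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate for right-censored data.
    times[i]: follow-up time; events[i]: True for death/failure,
    False for censoring. Returns (event times, survival just after each)."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk, surv = n, 1.0
    out_t, out_s = [], []
    i = 0
    while i < n:
        t = times[order[i]]
        deaths = removed = 0
        while i < n and times[order[i]] == t:   # group ties at time t
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk      # the K-M product factor
            out_t.append(t)
            out_s.append(surv)
        at_risk -= removed                      # drop deaths and censored
    return out_t, out_s

# Illustrative data: 7 patients, mixed events (True) and censorings (False)
times  = [2, 3, 3, 5, 8, 8, 12]
events = [True, True, False, True, True, False, False]
t, s = kaplan_meier(times, events)
```

Note that, as the abstract emphasizes, this estimator has no notion of recovery or relapse; states only leave the risk set, which is exactly the limitation the F-N generalization addresses.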
NASA Technical Reports Server (NTRS)
Mcgreevy, Michael W.
1995-01-01
An objective and quantitative method has been developed for deriving models of complex and specialized spheres of activity (domains) from domain-generated verbal data. The method was developed for analysis of interview transcripts, incident reports, and other text documents whose original source is people who are knowledgeable about, and participate in, the domain in question. To test the method, it is applied here to a report describing a remote sensing project within the scope of the Earth Observing System (EOS). The method has the potential to improve the designs of domain-related computer systems and software by quickly providing developers with explicit and objective models of the domain in a form which is useful for design. Results of the analysis include a network model of the domain, and an object-oriented relational analysis report which describes the nodes and relationships in the network model. Other products include a database of relationships in the domain, and an interactive concordance. The analysis method utilizes a newly developed relational metric, a proximity-weighted frequency of co-occurrence. The metric is applied to relations between the most frequently occurring terms (words or multiword entities) in the domain text, and the terms found within the contexts of these terms. Contextual scope is selectable. Because of the discriminating power of the metric, data reduction from the association matrix to the network is simple. In addition to their value for design, the models produced by the method are also useful for understanding the domains themselves. They can, for example, be interpreted as models of presence in the domain.
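A proximity-weighted frequency of co-occurrence can be illustrated with a simple decay kernel. Since the abstract does not specify the weighting function, the 1/distance weight and the window size below are assumptions made purely for illustration:

```python
from collections import defaultdict

def proximity_weighted_cooccurrence(tokens, window=3):
    """Accumulate, for each ordered (term, context-term) pair, a weight
    that decays with distance: here 1/distance within a +/- `window`
    contextual scope. Kernel and window are illustrative assumptions."""
    weights = defaultdict(float)
    for i, term in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                weights[(term, tokens[j])] += 1.0 / abs(i - j)
    return weights

# Tiny toy text; in the method this would be the full domain document
tokens = "earth observing system data earth observing system".split()
w = proximity_weighted_cooccurrence(tokens)
```

The resulting association matrix is what the method then thresholds into a network model; nearby co-occurrences dominate, which is the discriminating property the abstract notes.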
48 CFR 1845.7101-1 - Property classification.
Code of Federal Regulations, 2010 CFR
2010-10-01
... aeronautical and space programs, which are capable of stand-alone operation. Examples include research aircraft... characteristics. (ii) Examples of NASA heritage assets include buildings and structures designated as National...., it no longer provides service to NASA operations). Examples of obsolete property are items in...
Isolating the Effects of Training Using Simple Regression Analysis: An Example of the Procedure.
ERIC Educational Resources Information Center
Waugh, C. Keith
This paper provides a case example of simple regression analysis, a forecasting procedure used to isolate the effects of training from an identified extraneous variable. This case example focuses on results of a three-day sales training program to improve bank loan officers' knowledge, skill-level, and attitude regarding solicitation and sale of…
Data analysis in emission tomography using emission-count posteriors
NASA Astrophysics Data System (ADS)
Sitek, Arkadiusz
2012-11-01
A novel approach to the analysis of emission tomography data using the posterior probability of the number of emissions per voxel (emission count) conditioned on acquired tomographic data is explored. The posterior is derived from the prior and the Poisson likelihood of the emission-count data by marginalizing voxel activities. Based on emission-count posteriors, examples of Bayesian analysis including estimation and classification tasks in emission tomography are provided. The application of the method to computer simulations of 2D tomography is demonstrated. In particular, the minimum-mean-square-error point estimator of the emission count is demonstrated. The process of finding this estimator can be considered as a tomographic image reconstruction technique since the estimates of the number of emissions per voxel divided by voxel sensitivities and acquisition time are the estimates of the voxel activities. As an example of a classification task, a hypothesis stating that some region of interest (ROI) emitted at least or at most r-times the number of events in some other ROI is tested. The ROIs are specified by the user. The analysis described in this work provides new quantitative statistical measures that can be used in decision making in diagnostic imaging using emission tomography.
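The minimum-mean-square-error point estimator mentioned above is simply the posterior mean. A much simpler conjugate analogue, a single Poisson rate with a Gamma prior rather than the paper's marginalized emission-count model, makes the idea concrete:

```python
def poisson_gamma_posterior_mean(k, t, alpha=1.0, beta=1.0):
    """Posterior mean (the MMSE estimate) of a Poisson rate given k
    observed counts over exposure t, under a Gamma(alpha, beta) prior:
    lambda | k ~ Gamma(alpha + k, beta + t)."""
    return (alpha + k) / (beta + t)

# Illustrative voxel: 12 detected events over 4 time units,
# weak Gamma(1, 1) prior on the emission rate
lam_mmse = poisson_gamma_posterior_mean(k=12, t=4.0)
```

In the paper's setting the posterior is over the emission count itself, conditioned on tomographic data; the structure (posterior mean as the MMSE estimator) is the same.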
An overview of STRUCTURE: applications, parameter settings, and supporting software
Porras-Hurtado, Liliana; Ruiz, Yarimar; Santos, Carla; Phillips, Christopher; Carracedo, Ángel; Lareu, Maria V.
2013-01-01
Objectives: We present an up-to-date review of STRUCTURE software: one of the most widely used population analysis tools that allows researchers to assess patterns of genetic structure in a set of samples. STRUCTURE can identify subsets of the whole sample by detecting allele frequency differences within the data and can assign individuals to those sub-populations based on analysis of likelihoods. The review covers STRUCTURE's most commonly used ancestry and frequency models, plus an overview of the main applications of the software in human genetics including case-control association studies (CCAS), population genetics, and forensic analysis. The review is accompanied by supplementary material providing a step-by-step guide to running STRUCTURE. Methods: With reference to a worked example, we explore the effects of changing the principal analysis parameters on STRUCTURE results when analyzing a uniform set of human genetic data. Use of the supporting software CLUMPP and distruct is detailed, and we provide an overview and worked example of STRAT software, applicable to CCAS. Conclusion: The guide offers a simplified view of how STRUCTURE, CLUMPP, distruct, and STRAT can be applied to provide researchers with an informed choice of parameter settings and supporting software when analyzing their own genetic data. PMID:23755071
Results of an integrated structure/control law design sensitivity analysis
NASA Technical Reports Server (NTRS)
Gilbert, Michael G.
1989-01-01
A design sensitivity analysis method for Linear Quadratic Cost, Gaussian (LQG) optimal control laws, which predicts change in the optimal control law due to changes in fixed problem parameters using analytical sensitivity equations is discussed. Numerical results of a design sensitivity analysis for a realistic aeroservoelastic aircraft example are presented. In this example, the sensitivity of the optimally controlled aircraft's response to various problem formulation and physical aircraft parameters is determined. These results are used to predict the aircraft's new optimally controlled response if the parameter was to have some other nominal value during the control law design process. The sensitivity results are validated by recomputing the optimal control law for discrete variations in parameters, computing the new actual aircraft response, and comparing with the predicted response. These results show an improvement in sensitivity accuracy for integrated design purposes over methods which do not include changes in the optimal control law. Use of the analytical LQG sensitivity expressions is also shown to be more efficient than finite difference methods for the computation of the equivalent sensitivity information.
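The finite-difference baseline that the analytical sensitivity equations are compared against can be sketched for a scalar LQR problem (a scalar LQR example stands in for the full LQG design here for brevity; the system and cost below are illustrative, not the aeroservoelastic aircraft model):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(a, b, q, r):
    """Continuous-time LQR gain K = R^-1 B^T P from the Riccati equation."""
    p = solve_continuous_are(a, b, q, r)
    return np.linalg.solve(r, b.T @ p)

def gain_sensitivity(make_system, theta, h=1e-6):
    """Central finite-difference sensitivity dK/dtheta of the optimal
    gain with respect to a scalar design parameter theta. Each evaluation
    re-solves the Riccati equation, which is the cost the analytical
    sensitivity equations avoid."""
    k_plus = lqr_gain(*make_system(theta + h))
    k_minus = lqr_gain(*make_system(theta - h))
    return (k_plus - k_minus) / (2.0 * h)

# Scalar example: xdot = -theta*x + u, cost = integral of x^2 + u^2
def system(theta):
    return (np.array([[-theta]]), np.array([[1.0]]),
            np.array([[1.0]]), np.array([[1.0]]))

k = lqr_gain(*system(1.0))       # analytically, K = sqrt(2) - 1 here
dk = gain_sensitivity(system, 1.0)
```

For this scalar case the Riccati solution is P = -theta + sqrt(theta^2 + 1), so dK/dtheta = -1 + 1/sqrt(2) at theta = 1, which the finite difference reproduces.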
NASA Astrophysics Data System (ADS)
Jouzel, Jean
2003-06-01
Studies of past climate have, over the last 15 years, provided a wealth of information directly relevant to its evolution in the future. These results include, in particular, the discovery of a link between greenhouse gases and climate in the past and the characterization of rapid climate changes. They are, for example, based on the analysis of deep ice cores such as the one drilled at the Vostok site, which allows us to describe the evolution of the Antarctic climate and of the atmospheric composition over more than 400 thousand years (kyr). This period is also now increasingly well documented from the analysis of oceanic and continental records. Through examples based on recent studies in which French teams are deeply involved, we illustrate the most important results obtained from the analysis of polar ice cores, deep-sea cores and continental archives. To cite this article: J. Jouzel, C. R. Geoscience 335 (2003).
USEPA EXAMPLE EXIT LEVEL ANALYSIS RESULTS
Developed by NERL/ERD for the Office of Solid Waste, the enclosed product provides an example uncertainty analysis (UA) and initial process-based sensitivity analysis (SA) of hazardous waste "exit" concentrations for 7 chemicals and metals using the 3MRA Version 1.0 Modeling Syst...
NASA Astrophysics Data System (ADS)
Auken, Esben; Christiansen, Anders Vest; Kirkegaard, Casper; Fiandaca, Gianluca; Schamper, Cyril; Behroozmand, Ahmad Ali; Binley, Andrew; Nielsen, Emil; Effersø, Flemming; Christensen, Niels Bøie; Sørensen, Kurt; Foged, Nikolaj; Vignoli, Giulio
2015-07-01
We present an overview of a mature, robust and general algorithm providing a single framework for the inversion of most electromagnetic and electrical data types and instrument geometries. The implementation mainly uses a 1D earth formulation for electromagnetics and magnetic resonance sounding (MRS) responses, while the geoelectric responses are both 1D and 2D and the sheet's response models a 3D conductive sheet in a conductive host with an overburden of varying thickness and resistivity. In all cases, the focus is placed on delivering full system forward modelling across all supported types of data. Our implementation is modular, meaning that the bulk of the algorithm is independent of data type, making it easy to add support for new types. Having implemented forward response routines and file I/O for a given data type provides access to a robust and general inversion engine. This engine includes support for mixed data types, arbitrary model parameter constraints, integration of prior information and calculation of both model parameter sensitivity analysis and depth of investigation. We present a review of our implementation and methodology and show four different examples illustrating the versatility of the algorithm. The first example is a laterally constrained joint inversion (LCI) of surface time domain induced polarisation (TDIP) data and borehole TDIP data. The second example shows a spatially constrained inversion (SCI) of airborne transient electromagnetic (AEM) data. The third example is an inversion and sensitivity analysis of MRS data, where the electrical structure is constrained with AEM data. The fourth example is an inversion of AEM data, where the model is described by a 3D sheet in a layered conductive host.
NASA Systems Analysis and Concepts Directorate Mission and Trade Study Analysis
NASA Technical Reports Server (NTRS)
Ricks, Wendell; Guynn, Mark; Hahn, Andrew; Lepsch, Roger; Mazanek, Dan; Dollyhigh, Sam
2006-01-01
Mission analysis, as practiced by the NASA Langley Research Center's Systems Analysis and Concepts Directorate (SACD), consists of activities used to define, assess, and evaluate a wide spectrum of aerospace systems for given requirements. The missions for these systems encompass a broad range from aviation to space exploration. The customer, who is usually another NASA organization or another government agency, often predefines the mission. Once a mission is defined, the goals and objectives that the system will need to meet are delineated and quantified. A number of alternative systems are then typically developed and assessed relative to these goals and objectives. This is done in order to determine the most favorable design approaches for further refinement. Trade studies are performed in order to understand the impact of a requirement on each system and to select among competing design options. Items varied in trade studies typically include: design variables or design constraints; technology and subsystem options; and operational approaches. The results of trade studies are often used to refine the mission and system requirements. SACD studies have been integral to the decision processes of many organizations for decades. Many recent examples of SACD mission and trade study analyses illustrate their excellence and influence. The SACD-led, Agency-wide effort to analyze a broad range of future human lunar exploration scenarios for NASA's Exploration Systems Mission Directorate (ESMD) and the Mars airplane design study in support of the Aerial Regional-scale Environment Survey of Mars (ARES) mission are two such examples. This paper describes SACD's mission and trade study analysis activities in general and presents the lunar exploration and Mars airplane studies as examples of the type of work performed by SACD.
Automated processing of zebrafish imaging data: a survey.
Mikut, Ralf; Dickmeis, Thomas; Driever, Wolfgang; Geurts, Pierre; Hamprecht, Fred A; Kausler, Bernhard X; Ledesma-Carbayo, María J; Marée, Raphaël; Mikula, Karol; Pantazis, Periklis; Ronneberger, Olaf; Santos, Andres; Stotzka, Rainer; Strähle, Uwe; Peyriéras, Nadine
2013-09-01
Due to the relative transparency of its embryos and larvae, the zebrafish is an ideal model organism for bioimaging approaches in vertebrates. Novel microscope technologies allow the imaging of developmental processes in unprecedented detail, and they enable the use of complex image-based read-outs for high-throughput/high-content screening. Such applications can easily generate Terabytes of image data, the handling and analysis of which becomes a major bottleneck in extracting the targeted information. Here, we describe the current state of the art in computational image analysis in the zebrafish system. We discuss the challenges encountered when handling high-content image data, especially with regard to data quality, annotation, and storage. We survey methods for preprocessing image data for further analysis, and describe selected examples of automated image analysis, including the tracking of cells during embryogenesis, heartbeat detection, identification of dead embryos, recognition of tissues and anatomical landmarks, and quantification of behavioral patterns of adult fish. We review recent examples for applications using such methods, such as the comprehensive analysis of cell lineages during early development, the generation of a three-dimensional brain atlas of zebrafish larvae, and high-throughput drug screens based on movement patterns. Finally, we identify future challenges for the zebrafish image analysis community, notably those concerning the compatibility of algorithms and data formats for the assembly of modular analysis pipelines.
Automated Processing of Zebrafish Imaging Data: A Survey
Dickmeis, Thomas; Driever, Wolfgang; Geurts, Pierre; Hamprecht, Fred A.; Kausler, Bernhard X.; Ledesma-Carbayo, María J.; Marée, Raphaël; Mikula, Karol; Pantazis, Periklis; Ronneberger, Olaf; Santos, Andres; Stotzka, Rainer; Strähle, Uwe; Peyriéras, Nadine
2013-01-01
Abstract Due to the relative transparency of its embryos and larvae, the zebrafish is an ideal model organism for bioimaging approaches in vertebrates. Novel microscope technologies allow the imaging of developmental processes in unprecedented detail, and they enable the use of complex image-based read-outs for high-throughput/high-content screening. Such applications can easily generate Terabytes of image data, the handling and analysis of which becomes a major bottleneck in extracting the targeted information. Here, we describe the current state of the art in computational image analysis in the zebrafish system. We discuss the challenges encountered when handling high-content image data, especially with regard to data quality, annotation, and storage. We survey methods for preprocessing image data for further analysis, and describe selected examples of automated image analysis, including the tracking of cells during embryogenesis, heartbeat detection, identification of dead embryos, recognition of tissues and anatomical landmarks, and quantification of behavioral patterns of adult fish. We review recent examples for applications using such methods, such as the comprehensive analysis of cell lineages during early development, the generation of a three-dimensional brain atlas of zebrafish larvae, and high-throughput drug screens based on movement patterns. Finally, we identify future challenges for the zebrafish image analysis community, notably those concerning the compatibility of algorithms and data formats for the assembly of modular analysis pipelines. PMID:23758125
Barton, Mitch; Yeatts, Paul E; Henson, Robin K; Martin, Scott B
2016-12-01
There has been a recent call to improve data reporting in kinesiology journals, including the appropriate use of univariate and multivariate analysis techniques. For example, a multivariate analysis of variance (MANOVA) with univariate post hocs and a Bonferroni correction is frequently used to investigate group differences on multiple dependent variables. However, this univariate approach decreases power, increases the risk for Type 1 error, and contradicts the rationale for conducting multivariate tests in the first place. The purpose of this study was to provide a user-friendly primer on conducting descriptive discriminant analysis (DDA), which is a post-hoc strategy to MANOVA that takes into account the complex relationships among multiple dependent variables. A real-world example using the Statistical Package for the Social Sciences syntax and data from 1,095 middle school students on their body composition and body image are provided to explain and interpret the results from DDA. While univariate post hocs increased the risk for Type 1 error to 76%, the DDA identified which dependent variables contributed to group differences and which groups were different from each other. For example, students in the very lean and Healthy Fitness Zone categories for body mass index experienced less pressure to lose weight, more satisfaction with their body, and higher physical self-concept than the Needs Improvement Zone groups. However, perceived pressure to gain weight did not contribute to group differences because it was a suppressor variable. Researchers are encouraged to use DDA when investigating group differences on multiple correlated dependent variables to determine which variables contributed to group differences.
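The DDA post-hoc strategy can be sketched with numpy on synthetic data (the SPSS syntax and student body-image data from the paper are not reproduced here): the discriminant functions are eigenvectors of W⁻¹B built from the within- and between-group scatter matrices, and the structure coefficients show which dependent variables drive group separation.

```python
import numpy as np

# Synthetic 3-group, 3-variable data: variables 0 and 1 separate the groups
# strongly, variable 2 only weakly (all values are illustrative).
rng = np.random.default_rng(0)
means = {0: [0, 0, 0], 1: [1.5, 1.0, 0.2], 2: [3.0, 2.0, 0.4]}
X = np.vstack([rng.normal(means[g], 1.0, size=(100, 3)) for g in means])
y = np.repeat([0, 1, 2], 100)

grand = X.mean(axis=0)
B = np.zeros((3, 3))   # between-group scatter
W = np.zeros((3, 3))   # within-group scatter
for g in np.unique(y):
    Xg = X[y == g]
    d = (Xg.mean(axis=0) - grand)[:, None]
    B += len(Xg) * d @ d.T
    W += (Xg - Xg.mean(axis=0)).T @ (Xg - Xg.mean(axis=0))

# Discriminant functions: eigenvectors of W^-1 B, ordered by eigenvalue
evals, evecs = np.linalg.eig(np.linalg.solve(W, B))
order = np.argsort(evals.real)[::-1]
w1 = evecs[:, order[0]].real        # first discriminant function

# Structure coefficients: correlations of each dependent variable with the
# discriminant scores -- these indicate which variables contribute to the
# group differences, as in the DDA interpretation described above.
scores = X @ w1
structure = [np.corrcoef(X[:, j], scores)[0, 1] for j in range(3)]
print(structure)
```

Unlike a battery of univariate post hocs, this single analysis respects the correlations among the dependent variables, which is the rationale the abstract gives for preferring DDA after a significant MANOVA.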
Using mind mapping techniques for rapid qualitative data analysis in public participation processes.
Burgess-Allen, Jilla; Owen-Smith, Vicci
2010-12-01
In a health service environment where timescales for patient participation in service design are short and resources scarce, a balance needs to be achieved between research rigour and the timeliness and utility of the findings of patient participation processes. To develop a pragmatic mind mapping approach to managing the qualitative data from patient participation processes. While this article draws on experience of using mind maps in a variety of participation processes, a single example is used to illustrate the approach. In this example mind maps were created during the course of patient participation focus groups. Two group discussions were also transcribed verbatim to allow comparison of the rapid mind mapping approach with traditional thematic analysis of qualitative data. The illustrative example formed part of a local alcohol service review which included consultation with local alcohol service users, their families and staff groups. The mind mapping approach provided a pleasing graphical format for representing the key themes raised during the focus groups. It helped stimulate and galvanize discussion and keep it on track, enhanced transparency and group ownership of the data analysis process, allowed a rapid dynamic between data collection and feedback, and was considerably faster than traditional methods for the analysis of focus groups, while resulting in similar broad themes. This study suggests that the use of a mind mapping approach to managing qualitative data can provide a pragmatic resolution of the tension between limited resources and quality in patient participation processes. © 2010 The Authors. Health Expectations © 2010 Blackwell Publishing Ltd.
Matrix Perturbation Techniques in Structural Dynamics
NASA Technical Reports Server (NTRS)
Caughey, T. K.
1973-01-01
Matrix perturbation techniques are developed which can be used in the dynamical analysis of structures where the range of numerical values in the matrices is extreme, or where the nature of the damping matrix requires that complex-valued eigenvalues and eigenvectors be used. The techniques can be advantageously used in a variety of fields such as earthquake engineering, ocean engineering, aerospace engineering and other fields concerned with the dynamical analysis of large complex structures or systems of second order differential equations. A number of simple examples are included to illustrate the techniques.
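The basic first-order idea can be illustrated for a symmetric, undamped system (the complex-eigenvalue machinery of the paper is not reproduced here; the matrices below are random stand-ins for a structural model): a small symmetric perturbation dA shifts an eigenvalue by approximately vᵀ dA v, avoiding a full re-analysis.

```python
import numpy as np

# First-order eigenvalue perturbation for a symmetric matrix: if A v = lam v
# with unit eigenvector v, then the lowest eigenvalue of A + dA is
# approximately lam + v.T @ dA @ v for small symmetric dA.
rng = np.random.default_rng(1)
A = rng.normal(size=(5, 5)); A = A + A.T            # symmetric "structure" matrix
dA = rng.normal(size=(5, 5)); dA = 1e-4 * (dA + dA.T)

lam, V = np.linalg.eigh(A)
v = V[:, 0]
pred = lam[0] + v @ dA @ v                          # perturbation prediction
exact = np.linalg.eigh(A + dA)[0][0]                # direct re-analysis
print(pred, exact)
```

The prediction agrees with the recomputed eigenvalue to second order in the perturbation size, which is what makes such techniques attractive for large structural models.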
Pinning synchronization of delayed complex dynamical networks with nonlinear coupling
NASA Astrophysics Data System (ADS)
Cheng, Ranran; Peng, Mingshu; Yu, Weibin
2014-11-01
In this paper, we find that complex networks with the Watts-Strogatz or scale-free BA random topological architecture can be synchronized more easily by pin-controlling fewer nodes than regular systems. Theoretical analysis is included by means of Lyapunov functions and linear matrix inequalities (LMI) to make all nodes reach complete synchronization. Numerical examples are also provided to illustrate the importance of our theoretical analysis, which implies that there exists a gap between the theoretical prediction and numerical results about the minimum number of pinning controlled nodes.
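A rough numerical illustration of the pinning idea (this is not the paper's LMI analysis; the graph, gain, and single pinned node below are arbitrary choices): a common sufficient condition ties synchronizability to the smallest eigenvalue of L + D, where L is the graph Laplacian and D carries the pinning gains, and pinning even one node makes that eigenvalue strictly positive.

```python
import numpy as np

# Build a k-regular ring-lattice Laplacian (the starting point of the
# Watts-Strogatz construction mentioned in the abstract).
rng = np.random.default_rng(5)
n, k = 50, 4

def ring_laplacian(n, k):
    A = np.zeros((n, n))
    for i in range(n):
        for d in range(1, k // 2 + 1):
            A[i, (i + d) % n] = A[i, (i - d) % n] = 1
    return np.diag(A.sum(axis=1)) - A

L = ring_laplacian(n, k)
D = np.zeros(n); D[0] = 10.0                   # pin a single node with gain 10
lmin_unpinned = np.linalg.eigvalsh(L).min()    # 0 for any connected Laplacian
lmin_pinned = np.linalg.eigvalsh(L + np.diag(D)).min()
print(lmin_unpinned, lmin_pinned)
```

Rewiring edges (Watts-Strogatz) or using a scale-free topology typically raises this smallest eigenvalue further, which is consistent with the abstract's observation that such networks need fewer pinned nodes.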
A computer-controlled instrumentation system for third octave analysis
NASA Technical Reports Server (NTRS)
Faulcon, N. D.; Monteith, J. H.
1978-01-01
An instrumentation system is described which employs a minicomputer, a one-third octave band analyzer, and a time code/tape search unit for the automatic control and analysis of third-octave data. With this system the information necessary for data adjustment is formatted in such a way as to eliminate much operator interface, thereby substantially reducing the probability for error. A description of a program for the calculation of effective perceived noise level from aircraft noise data is included as an example of how this system can be used.
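The band arithmetic underlying such an analyzer can be shown with a small helper (hypothetical, not part of the described system): nominal one-third-octave centre frequencies are spaced by factors of 2^(1/3) about the 1 kHz reference band.

```python
# One-third-octave band centre frequencies, base-2 convention:
# f_n = 1000 * 2**(n/3) relative to the 1 kHz reference band.
def third_octave_center(n):
    """Centre frequency of the band n one-third-octave steps from 1000 Hz."""
    return 1000.0 * 2.0 ** (n / 3.0)

# Bands from 500 Hz up to 2 kHz
centers = [round(third_octave_center(n), 1) for n in range(-3, 4)]
print(centers)
```

Three steps in either direction give exactly one octave (500 Hz and 2 kHz around the 1 kHz reference), which is why the bands are called one-third octaves.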
Application of abstract harmonic analysis to the high-speed recognition of images
NASA Technical Reports Server (NTRS)
Usikov, D. A.
1979-01-01
Methods are constructed for rapidly computing correlation functions using the theory of abstract harmonic analysis. The theory developed includes as a particular case the familiar Fourier transform method for a correlation function which makes it possible to find images which are independent of their translation in the plane. Two examples of the application of the general theory described are the search for images, independent of their rotation and scale, and the search for images which are independent of their translations and rotations in the plane.
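The Fourier special case mentioned above can be sketched in a few lines (1-D for brevity, whereas the paper's setting is 2-D images; signal lengths and the offset are illustrative): cross-correlation computed through the FFT peaks at the template's offset, making the match independent of translation.

```python
import numpy as np

# Place a random template at an arbitrary offset inside a longer signal.
rng = np.random.default_rng(2)
template = rng.normal(size=64)
signal = np.zeros(256)
shift = 75
signal[shift:shift + 64] = template            # translated copy of the template

# Correlation theorem: corr = IFFT( FFT(signal) * conj(FFT(template)) )
F = np.fft.fft(signal)
G = np.fft.fft(template, n=256)
corr = np.fft.ifft(F * np.conj(G)).real

found = int(np.argmax(corr))                   # recovers the translation
print(found)
```

The abstract's generalization replaces the Fourier basis with other group-theoretic transforms so that the same trick also factors out rotation and scale.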
The nail and hair in forensic science.
Daniel, C Ralph; Piraccini, Bianca Maria; Tosti, Antonella
2004-02-01
Drugs, chemicals, and biological substances accumulate and are stored in hair and nails where they can be detected and measured. Advantages of analyzing hair and nail samples also include their easy and non-invasive collection, the small sample size required for analysis, and their easy storage at room temperature. We report 3 examples of heavy metal poisoning diagnosed because of the hair or nail symptoms. Drugs and toxins that can be detected in hair and nails are reviewed and the application of hair/nail analysis in general and in forensic medicine is discussed.
Analysis technique for controlling system wavefront error with active/adaptive optics
NASA Astrophysics Data System (ADS)
Genberg, Victor L.; Michels, Gregory J.
2017-08-01
The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
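A minimal least-squares sketch of influence-function fitting (illustrative only; SigFit's actual formulation and the state-space extension are not reproduced here): columns of H hold actuator influence functions sampled on the wavefront grid, and the fitted command minimizes the residual WFE, with the residual RMS serving as the error estimate mentioned above.

```python
import numpy as np

# Synthetic linear optics model: wavefront samples respond linearly to
# actuator commands through the influence matrix H (all values illustrative).
rng = np.random.default_rng(3)
n_samples, n_actuators = 200, 8
H = rng.normal(size=(n_samples, n_actuators))          # influence functions
true_cmd = rng.normal(size=n_actuators)
wfe = H @ true_cmd + 0.01 * rng.normal(size=n_samples)  # disturbance + noise

# Least-squares actuator command that best cancels the disturbance
cmd, *_ = np.linalg.lstsq(H, wfe, rcond=None)

residual_rms = np.sqrt(np.mean((wfe - H @ cmd) ** 2))   # fitting error estimate
initial_rms = np.sqrt(np.mean(wfe ** 2))
print(initial_rms, residual_rms)
```

The residual RMS quantifies how much of the disturbance lies outside the actuators' correctable space, which is the kind of error estimate the abstract says accompanies the fit.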
Squids in the Study of Cerebral Magnetic Field
NASA Astrophysics Data System (ADS)
Romani, G. L.; Narici, L.
The following sections are included: * INTRODUCTION * HISTORICAL OVERVIEW * NEUROMAGNETIC FIELDS AND AMBIENT NOISE * DETECTORS * Room temperature sensors * SQUIDs * DETECTION COILS * Magnetometers * Gradiometers * Balancing * Planar gradiometers * Choice of the gradiometer parameters * MODELING * Current pattern due to neural excitations * Action potentials and postsynaptic currents * The current dipole model * Neural population and detected fields * Spherically bounded medium * SPATIAL CONFIGURATION OF THE SENSORS * SOURCE LOCALIZATION * Localization procedure * Experimental accuracy and reproducibility * SIGNAL PROCESSING * Analog Filtering * Bandpass filters * Line rejection filters * DATA ANALYSIS * Analysis of evoked/event-related responses * Simple average * Selected average * Recursive techniques * Similarity analysis * Analysis of spontaneous activity * Mapping and localization * EXAMPLES OF NEUROMAGNETIC STUDIES * Neuromagnetic measurements * Studies on the normal brain * Clinical applications * Epilepsy * Tinnitus * CONCLUSIONS * ACKNOWLEDGEMENTS * REFERENCES
Effects of Tropospheric Spatio-Temporal Correlated Noise on the Analysis of Space Geodetic Data
NASA Technical Reports Server (NTRS)
Romero-Wolf, A. F.; Jacobs, C. S.
2011-01-01
The standard VLBI analysis models measurement noise as purely thermal errors modeled according to uncorrelated Gaussian distributions. As the price of recording bits steadily decreases, thermal errors will soon no longer dominate. It is therefore expected that troposphere and instrumentation/clock errors will increasingly become more dominant. Given that both of these errors have correlated spectra, properly modeling the error distributions will become more relevant for optimal analysis. This paper will discuss the advantages of including the correlations between tropospheric delays using a Kolmogorov spectrum and the frozen flow model pioneered by Treuhaft and Lanyi. We will show examples of applying these correlated noise spectra to the weighting of VLBI data analysis.
Persistent Infrared Spectral Hole-Burning for Impurity Vibrational Modes in Solids.
1986-09-30
The phenomena observed consist of infrared vibrational transitions of impurity molecules in solids. Examples include 1,2-difluoroethane in rare gas matrices, perrhenate ions in alkali halide crystals, and most recently, cyanide and nitrite ions.
CAUSAL ANALYSIS AND PROBABILITY DATA: EXAMPLES FOR IMPAIRED AQUATIC CONDITION
Causal analysis is plausible reasoning applied to diagnosing observed effect(s), for example, diagnosing the cause of biological impairment in a stream. Sir Bradford Hill basically defined the application of causal analysis when he enumerated the elements of causality f...
Imaging System and Method for Biomedical Analysis
2013-03-11
biological particles and items of interest. Broadly, Padmanabhan et al. utilize the diffraction of a laser light source in flow cytometry to count ... spread of light from multiple LED devices over the entire sample surface. Preferably, light source 308 projects a full-spectrum white light. Light ... for example, red blood cells, white blood cells (which may include lymphocytes, which are relatively large and easily detectable), T-helper cells
Modal analysis of a nonuniform string with end mass and variable tension
NASA Technical Reports Server (NTRS)
Rheinfurth, M. H.; Galaboff, Z. J.
1983-01-01
Modal synthesis techniques for dynamic systems containing strings describe the lateral displacements of these strings by properly chosen shape functions. An iterative algorithm is provided to calculate the natural modes of a nonuniform string and variable tension for some typical boundary conditions including one end mass. Numerical examples are given for a string in a constant and a gravity gradient force field.
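A simple finite-difference eigenanalysis illustrates the underlying string eigenproblem (this is not the paper's iterative algorithm; constant tension and unit density are assumed so that the computed frequencies can be checked against the classical values ω_n = nπ√(T/ρ)/L):

```python
import numpy as np

# Discretize -(T(x) u')' = w**2 * rho * u with fixed ends.  A varying tension
# T(x) would simply be sampled at the element midpoints below.
N, L = 400, 1.0
h = L / N
T = np.ones(N)                    # tension at the N element midpoints

# Stiffness matrix for the N-1 interior nodes (both ends fixed)
K = np.zeros((N - 1, N - 1))
for i in range(N - 1):
    K[i, i] = (T[i] + T[i + 1]) / h**2
    if i > 0:
        K[i, i - 1] = -T[i] / h**2
    if i < N - 2:
        K[i, i + 1] = -T[i + 1] / h**2

# Natural frequencies: square roots of the stiffness eigenvalues
omega = np.sqrt(np.linalg.eigvalsh(K))
print(omega[:3])                  # should be close to pi, 2*pi, 3*pi
```

Variable tension, nonuniform density, or an end mass change the matrix entries and boundary rows but not the structure of the eigenproblem, which is why shape-function and iterative methods such as the paper's remain applicable.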
ERIC Educational Resources Information Center
Ehrmann, Stephen C.; Milam, John H., Jr.
2003-01-01
This volume describes for educators how to create simple models of the full costs of educational innovations, including the costs for time devoted to the activity, space needed for the activity, etc. Examples come from educational uses of technology in higher education in the United States and China. Real case studies illustrate the method in use:…
Knowledge, Skills, and Abilities for Military Leader Influence
2011-03-01
tactics. The attributes vary in breadth, encompassing broad traits, such as those represented in the five-factor model of personality (FFM; e.g. ...). Attributes related to the application of influence strategies, together with their definitions, are shown in Table 10. The FFM includes extroversion ... leadership. For example, Judge, Bono, Ilies, and Gerhardt (2002) conducted a meta-analysis investigating the relationship between FFM personality traits
The Measurement of Army Battalion Performance
1981-01-01
reflect the degree to which these goals are met. Familiar examples of this approach would include cost/benefit analysis and a management-by-objectives ... these goals or how they may change. Through this model, the assessment of organizational effectiveness does not proceed through a determination of ... Further, the unit must meet these demands in the face of shifting priorities and with changing resource levels. In this
TARA: Tool Assisted Requirements Analysis
1988-05-01
provided during the project and to aid tool integration. Chapter 6 provides a brief discussion of the experience of specifying the ASET case study in CORE ... set of Prolog clauses. This includes the context-free grammar rules depicted in Figure 2.1, integrity constraints such as those defining the binding ... Jeremaes (1986). This was developed originally for specifying database management semantics (for example, the preservation of integrity constraints
NASA Technical Reports Server (NTRS)
Taylor, Lawrence W., Jr.; Rajiyah, H.
1991-01-01
Partial differential equations for modeling the structural dynamics and control systems of flexible spacecraft are applied here in order to facilitate systems analysis and optimization of these spacecraft. Example applications are given, including the structural dynamics of SCOLE, the Solar Array Flight Experiment, the Mini-MAST truss, and the LACE satellite. The development of related software is briefly addressed.
Qualitative case study data analysis: an example from practice.
Houghton, Catherine; Murphy, Kathy; Shaw, David; Casey, Dympna
2015-05-01
To illustrate an approach to data analysis in qualitative case study methodology. There is often little detail in case study research about how data were analysed. However, it is important that comprehensive analysis procedures are used because there are often large sets of data from multiple sources of evidence. Furthermore, the ability to describe in detail how the analysis was conducted ensures rigour in reporting qualitative research. The research example used is a multiple case study that explored the role of the clinical skills laboratory in preparing students for the real world of practice. Data analysis was conducted using a framework guided by the four stages of analysis outlined by Morse (1994): comprehending, synthesising, theorising and recontextualising. The specific strategies for analysis in these stages centred on the work of Miles and Huberman (1994), which has been successfully used in case study research. The data were managed using NVivo software. Literature examining qualitative data analysis was reviewed and strategies illustrated by the case study example provided. Discussion: Each stage of the analysis framework is described with illustration from the research example for the purpose of highlighting the benefits of a systematic approach to handling large data sets from multiple sources. By providing an example of how each stage of the analysis was conducted, it is hoped that researchers will be able to consider the benefits of such an approach to their own case study analysis. This paper illustrates specific strategies that can be employed when conducting data analysis in case study research and other qualitative research designs.
ERIC Educational Resources Information Center
Huang, Xiaoxia; Cribbs, Jennifer
2017-01-01
This study examined mathematics and science teachers' perceptions and use of four types of examples, including typical textbook examples (standard worked examples) and erroneous worked examples in the written form as well as mastery modelling examples and peer modelling examples involving the verbalization of the problem-solving process. Data…
Bayesian Methods for the Physical Sciences. Learning from Examples in Astronomy and Physics.
NASA Astrophysics Data System (ADS)
Andreon, Stefano; Weaver, Brian
2015-05-01
Chapter 1: This chapter presents some basic steps for performing a good statistical analysis, all summarized in about one page. Chapter 2: This short chapter introduces the basics of probability theory in an intuitive fashion using simple examples. It also illustrates, again with examples, how to propagate errors and the difference between marginal and profile likelihoods. Chapter 3: This chapter introduces the computational tools and methods that we use for sampling from the posterior distribution. Since all numerical computations, and Bayesian ones are no exception, may end in errors, we also provide a few tips to check that the numerical computation is sampling from the posterior distribution. Chapter 4: Many of the concepts of building, running, and summarizing the results of a Bayesian analysis are described with this step-by-step guide using a basic (Gaussian) model. The chapter also introduces examples using Poisson and Binomial likelihoods, and how to combine repeated independent measurements. Chapter 5: All statistical analyses make assumptions, and Bayesian analyses are no exception. This chapter emphasizes that results depend on data and priors (assumptions). We illustrate this concept with examples where the prior plays greatly different roles, from major to negligible. We also provide some advice on how to look for information useful for sculpting the prior. Chapter 6: In this chapter we consider examples for which we want to estimate more than a single parameter. These common problems include estimating location and spread. We also consider examples that require the modeling of two populations (one we are interested in and a nuisance population) or averaging incompatible measurements. We also introduce quite complex examples dealing with upper limits and with a larger-than-expected scatter. Chapter 7: Rarely is a sample randomly selected from the population we wish to study.
Often, samples are affected by selection effects, e.g., easier-to-collect events or objects are over-represented in samples and difficult-to-collect are under-represented if not missing altogether. In this chapter we show how to account for non-random data collection to infer the properties of the population from the studied sample. Chapter 8: In this chapter we introduce regression models, i.e., how to fit (regress) one, or more quantities, against each other through a functional relationship and estimate any unknown parameters that dictate this relationship. Questions of interest include: how to deal with samples affected by selection effects? How does a rich data structure influence the fitted parameters? And what about non-linear multiple-predictor fits, upper/lower limits, measurement errors of different amplitudes and an intrinsic variety in the studied populations or an extra source of variability? A number of examples illustrate how to answer these questions and how to predict the value of an unavailable quantity by exploiting the existence of a trend with another, available, quantity. Chapter 9: This chapter provides some advice on how the careful scientist should perform model checking and sensitivity analysis, i.e., how to answer the following questions: is the considered model at odds with the current available data (the fitted data), for example because it is over-simplified compared to some specific complexity pointed out by the data? Furthermore, are the data informative about the quantity being measured or are results sensitively dependent on details of the fitted model? And, finally, what if assumptions are uncertain? A number of examples illustrate how to answer these questions.
Chapter 10: This chapter compares the performance of Bayesian methods against simple, non-Bayesian alternatives, such as maximum likelihood, minimal chi square, ordinary and weighted least square, bivariate correlated errors and intrinsic scatter, and robust estimates of location and scale. Performances are evaluated in terms of quality of the prediction, accuracy of the estimates, and fairness and noisiness of the quoted errors. We also focus on three failures of maximum likelihood methods occurring with small samples, with mixtures, and with regressions with errors in the predictor quantity.
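The basic Gaussian-mean problem of Chapter 4 can be sketched with a hand-rolled Metropolis sampler (illustrative only; the book's own code and samplers are not reproduced here, and the data below are synthetic with known sigma = 1 and a flat prior on the mean):

```python
import numpy as np

# Synthetic data: 200 draws from N(mu=2, sigma=1); we infer mu.
rng = np.random.default_rng(4)
data = rng.normal(loc=2.0, scale=1.0, size=200)

def log_post(mu):
    # flat prior on mu, Gaussian likelihood with known sigma = 1
    return -0.5 * np.sum((data - mu) ** 2)

# Random-walk Metropolis: accept the proposal with probability
# min(1, posterior_ratio)
mu, chain = 0.0, []
for _ in range(20000):
    prop = mu + rng.normal(scale=0.2)
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop
    chain.append(mu)

posterior = np.array(chain[5000:])   # discard burn-in
print(posterior.mean(), posterior.std())
```

The posterior mean tracks the sample mean and the posterior standard deviation approaches sigma/sqrt(n) ≈ 0.07, which is the kind of consistency check Chapter 3's convergence tips are about.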
A brief history of the most remarkable numbers e, i and γ in mathematical sciences with applications
NASA Astrophysics Data System (ADS)
Debnath, Lokenath
2015-08-01
This paper deals with a brief history of the most remarkable Euler numbers e, i and γ in mathematical sciences. Included are many properties of the constants e, i and γ and their applications in algebra, geometry, physics, chemistry, ecology, business and industry. Special attention is given to the growth and decay phenomena in many real-world problems including stability and instability of their solutions. Some specific and modern applications of logarithms, complex numbers and complex exponential functions to electrical circuits and mechanical systems are presented with examples. Also included is the use of complex numbers and complex functions in the description and analysis of chaos and fractals with the aid of modern computer technology. In addition, the phasor method is described with examples of applications in engineering science. The major focus of this paper is to provide basic information through a historical approach to mathematics teaching and learning of the fundamental knowledge and skills required for students and teachers at all levels so that they can understand the concepts of mathematics, and mathematics education in science and technology.
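The phasor method mentioned above can be illustrated with an RC voltage divider (component values are arbitrary): treating the capacitor as the complex impedance 1/(jωC) reduces the circuit's differential equation to complex algebra, and Euler's identity ties e, i and π together in one line.

```python
import cmath
import math

# RC low-pass divider driven at 50 Hz (illustrative component values)
R, C = 1000.0, 1e-6                      # 1 kOhm, 1 uF
omega = 2 * math.pi * 50.0
Zc = 1 / (1j * omega * C)                # capacitor's phasor impedance
gain = Zc / (R + Zc)                     # voltage-divider transfer function
magnitude = abs(gain)                    # attenuation at this frequency
phase_deg = math.degrees(cmath.phase(gain))  # phase lag in degrees

# Euler's identity: e**(i*pi) + 1 = 0
euler_residual = abs(cmath.exp(1j * math.pi) + 1)
print(magnitude, phase_deg, euler_residual)
```

The magnitude reproduces the textbook result 1/sqrt(1 + (ωRC)²), and the small negative phase shows the output lagging the input, both read directly off a single complex number.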
FLUT - A program for aeroelastic stability analysis. [of aircraft structures in subsonic flow
NASA Technical Reports Server (NTRS)
Johnson, E. H.
1977-01-01
A computer program (FLUT) that can be used to evaluate the aeroelastic stability of aircraft structures in subsonic flow is described. The algorithm synthesizes data from a structural vibration analysis with an unsteady aerodynamics analysis and then performs a complex eigenvalue analysis to assess the system stability. The theoretical basis of the program is discussed with special emphasis placed on some innovative techniques which improve the efficiency of the analysis. User information needed to efficiently and successfully utilize the program is provided. In addition to identifying the required input, the flow of the program execution and some possible sources of difficulty are included. The use of the program is demonstrated with a listing of the input and output for a simple example.
Tricco, Andrea C; Antony, Jesmin; Soobiah, Charlene; Hemmelgarn, Brenda; Moher, David; Hutton, Brian; Yu, Catherine H; Majumdar, Sumit R; Straus, Sharon E
2013-06-28
Type 2 diabetes mellitus (T2DM) results from insulin resistance and relative insulin deficiency. T2DM treatment is a step-wise approach beginning with lifestyle modifications (for example, diet, exercise), followed by the addition of oral hypoglycemic agents (for example, metformin). Patients who do not respond to first-line therapy are offered second-line therapy (for example, sulfonylureas). Third-line therapy may include insulin and/or dipeptidyl peptidase-4 (DPP-4) inhibitors. It is unclear whether DPP-4 inhibitors are safer and more effective than intermediate-acting insulin for third-line management of T2DM. As such, our objective is to evaluate the comparative effectiveness, safety and cost-effectiveness of DPP-4 inhibitors versus intermediate-acting insulin for T2DM patients who have failed both first- and second-line diabetes treatments. Electronic searches of MEDLINE, Cochrane Central Register of Controlled Trials, EMBASE, and grey literature (for example, trial registries, public health websites) will be conducted to identify studies examining DPP-4 inhibitors compared with each other, intermediate-acting insulin, no treatment, or placebo for adults with T2DM. The outcomes of interest include glycosylated hemoglobin (A1C) (primary outcome), as well as emergency department visits, physician visits, hospital admissions, weight gain, quality of life, microvascular complications, macrovascular complications, all-cause mortality, and cost (secondary outcomes). Randomized clinical trials (RCTs), quasi-RCTs, non-RCTs, controlled before-after, interrupted time series, cohort studies, and cost studies reporting data on these outcomes will be included. Eligibility will not be restricted by publication status, language of dissemination, duration of study follow-up, or time period of study conduct. Two reviewers will screen the titles and abstracts resulting from the literature search, as well as potentially relevant full-text articles, in duplicate.
Data will be abstracted and quality will be appraised by two team members independently. Conflicts at all levels of screening and abstraction will be resolved through team discussion. Our results will be described narratively. Random effects meta-analysis and network meta-analysis will be conducted, if feasible and appropriate. Our systematic review results can be used to determine the most effective, safe and cost-effective third-line strategies for managing T2DM. This information will be of great use to health policy-makers and clinicians, as well as patients living with T2DM and their families. PROSPERO registry number: CRD42013003624.
Several examples where turbulence models fail in inlet flow field analysis
NASA Technical Reports Server (NTRS)
Anderson, Bernhard H.
1993-01-01
Computational uncertainties in turbulence modeling for three dimensional inlet flow fields include flows approaching separation, strength of secondary flow field, three dimensional flow predictions of vortex liftoff, and influence of vortex-boundary layer interactions; computational uncertainties in vortex generator modeling include representation of generator vorticity field and the relationship between generator and vorticity field. The objectives of the inlet flow field studies presented in this document are to advance the understanding, prediction, and control of intake distortion and to study the basic interactions that influence this design problem.
Multi-fluid CFD analysis in Process Engineering
NASA Astrophysics Data System (ADS)
Hjertager, B. H.
2017-12-01
An overview of the modelling and simulation of flow processes in gas/particle and gas/liquid systems is presented. Particular emphasis is given to computational fluid dynamics (CFD) models that use the multi-dimensional multi-fluid techniques. Turbulence modelling strategies for gas/particle flows based on the kinetic theory for granular flows are given. Sub models for the interfacial transfer processes and chemical kinetics modelling are presented. Examples are shown for some gas/particle systems including flow and chemical reaction in risers as well as gas/liquid systems including bubble columns and stirred tanks.
NASA Astrophysics Data System (ADS)
Bachmann, M.; Besse, P. A.; Melchior, H.
1995-10-01
Overlapping-image multimode interference (MMI) couplers, a new class of devices, permit uniform and nonuniform power splitting. A theoretical description directly relates coupler geometry to image intensities, positions, and phases. Among many possibilities of nonuniform power splitting, examples of 1 × 2 couplers with ratios of 15:85 and 28:72 are given. An analysis of uniform power splitters includes the well-known 2 × N and 1 × N MMI couplers. Applications of MMI couplers include mode filters, mode splitters-combiners, and mode converters.
The ABC (in any D) of logarithmic CFT
NASA Astrophysics Data System (ADS)
Hogervorst, Matthijs; Paulos, Miguel; Vichi, Alessandro
2017-10-01
Logarithmic conformal field theories have a vast range of applications, from critical percolation to systems with quenched disorder. In this paper we thoroughly examine the structure of these theories based on their symmetry properties. Our analysis is model-independent and holds for any spacetime dimension. Our results include a determination of the general form of correlation functions and conformal block decompositions, clearing the path for future bootstrap applications. Several examples are discussed in detail, including logarithmic generalized free fields, holographic models, self-avoiding random walks and critical percolation.
A new JAVA interface implementation of THESIAS: testing haplotype effects in association studies.
Tregouet, D A; Garelle, V
2007-04-15
THESIAS (Testing Haplotype EffectS In Association Studies) is a popular software package for carrying out haplotype association analysis in unrelated individuals. In addition to the command line interface, a graphical JAVA interface is now proposed, allowing one to run THESIAS in a user-friendly manner. Furthermore, new functionalities have been added to THESIAS, including the possibility to analyze polychotomous phenotypes and X-linked polymorphisms. The software package, including documentation and example data files, is freely available at http://genecanvas.ecgene.net. The source codes are also available upon request.
Silylene-diethynyl-arylene polymers having liquid crystalline properties
Barton, Thomas J.; Ding, Yiwei
1993-09-07
The present invention provides linear organosilicon polymers including diethynyl-(substituted)arylene units, and a process for their preparation. These novel polymers possess useful properties including electrical conductivity, liquid crystallinity, and/or photoluminescence. These polymers possess good solubility in organic solvents. A preferred example is produced according to a reaction scheme given in the patent. These polymers can be solvent-cast to yield excellent films and can also be pulled into fibers from concentrated solutions. All possess substantial crystallinity as revealed by DSC analysis and observation through a polarizing microscope, and possess liquid crystalline properties.
System Safety and the Unintended Consequence
NASA Technical Reports Server (NTRS)
Watson, Clifford
2012-01-01
The analysis and identification of risks often result in design changes or modification of operational steps. This paper identifies the potential for unintended consequences as an overlooked result of these changes. Examples of societal changes such as prohibition, regulatory changes including mandating lifeboats on passenger ships, and engineering proposals or design changes to automobiles and spaceflight hardware are used to demonstrate that the System Safety Engineer must be cognizant of the potential for unintended consequences as a result of an analysis. Conclusions of the report indicate the need for additional foresight and consideration of the potential effects of analysis-driven design, processing changes, and/or operational modifications.
Alternative to Ritt's pseudodivision for finding the input-output equations of multi-output models.
Meshkat, Nicolette; Anderson, Chris; DiStefano, Joseph J
2012-09-01
Differential algebra approaches to structural identifiability analysis of a dynamic system model in many instances heavily depend upon Ritt's pseudodivision at an early step in analysis. The pseudodivision algorithm is used to find the characteristic set, of which a subset, the input-output equations, is used for identifiability analysis. A simpler algorithm is proposed for this step, using Gröbner Bases, along with a proof of the method that includes a reduced upper bound on derivative requirements. Efficacy of the new algorithm is illustrated with several biosystem model examples. Copyright © 2012 Elsevier Inc. All rights reserved.
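The Gröbner-basis route to input-output equations can be sketched with SymPy on a hypothetical two-compartment model with observed state x1 (this stands in for, and is not, the authors' implementation): derivatives are treated as algebraic indeterminates and the unobserved state is eliminated with a lexicographic ordering.

```python
import sympy as sp

# Hypothetical two-compartment model with observed output y = x1:
#   x1' = -(k01 + k21) x1 + k12 x2 + u
#   x2' =  k21 x1 - (k12 + k02) x2
# Derivatives are treated as algebraic indeterminates; a lexicographic
# Groebner basis eliminates the unobserved x2 and x2', leaving the
# input-output equation in y and its derivatives.
x1, dx1, ddx1, x2, dx2, u, du = sp.symbols('x1 dx1 ddx1 x2 dx2 u du')
k01, k21, k12, k02 = sp.symbols('k01 k21 k12 k02')

f1 = dx1 + (k01 + k21)*x1 - k12*x2 - u        # x1' equation
f2 = dx2 - k21*x1 + (k12 + k02)*x2            # x2' equation
f3 = ddx1 + (k01 + k21)*dx1 - k12*dx2 - du    # time derivative of f1

# Lex order with x2, dx2 first eliminates them from part of the basis.
G = sp.groebner([f1, f2, f3], x2, dx2, ddx1, dx1, x1, u, du, order='lex')
io_eqs = [g for g in G.exprs if not g.has(x2) and not g.has(dx2)]
```

The surviving polynomial is the input-output equation whose coefficients (here, combinations of the rate constants) are the identifiable parameter combinations examined in structural identifiability analysis.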
Takeda, Mitsuo
2013-01-01
The paper reviews a technique for fringe analysis referred to as Fourier fringe analysis (FFA) or the Fourier transform method, with a particular focus on its application to metrology of extreme physical phenomena. Examples include the measurement of extremely small magnetic fields with subfluxon sensitivity by electron wave interferometry, subnanometer wavefront evaluation of projection optics for extreme UV lithography, the detection of sub-Ångstrom distortion of a crystal lattice, and the measurement of ultrashort optical pulses in the femtosecond to attosecond range, which show how the advantages of FFA are exploited in these cutting edge applications.
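The carrier-fringe idea behind FFA can be sketched in one dimension with NumPy (the fringe pattern, carrier frequency, and filter width below are synthetic assumptions): isolate the positive-frequency sideband of the fringe spectrum and read the modulating phase from the inverse transform.

```python
import numpy as np

# 1-D sketch of Fourier fringe analysis (the carrier method): a carrier
# fringe pattern modulated by an "unknown" phase; keep one spectral
# sideband and recover the phase from the inverse FFT.
N = 1024
x = np.arange(N)
k0 = 64                      # assumed: carrier sits exactly on FFT bin 64
f0 = k0 / N                  # carrier frequency in cycles/sample
phi = 2.0 * np.sin(2 * np.pi * x / N)             # phase to be recovered
I = 1.0 + 0.8 * np.cos(2 * np.pi * f0 * x + phi)  # fringe intensity

F = np.fft.fft(I)
mask = np.zeros(N)
mask[k0 - 30:k0 + 30] = 1.0   # keep only the positive-frequency sideband
c = np.fft.ifft(F * mask)     # complex signal ~ 0.4*exp(i*(2*pi*f0*x + phi))
recovered = np.unwrap(np.angle(c)) - 2 * np.pi * f0 * x
```

Subtracting the known carrier after unwrapping leaves the modulating phase; in a real measurement the filter width must cover the phase signal's spectral spread without capturing the DC term or the conjugate sideband.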
CADDIS Volume 3. Examples and Applications: Analytical Examples
Examples illustrating the use of statistical analysis to support different types of evidence: stream temperature, temperature inferred from macroinvertebrates, macroinvertebrate responses, zinc concentrations, and observed trait characteristics.
NASA Astrophysics Data System (ADS)
Byrne, A. R.; Benedik, L.
1999-01-01
Neutron activation analysis (NAA), being essentially an isotopic rather than an elemental method of analysis, is capable of determining a number of important radionuclides of radioecological interest by transformation into another, more easily quantifiable radionuclide. The nuclear characteristics which favour this technique may be summarized in an advantage factor relative to radiometric analysis of the original radioanalyte. Well known or less familiar examples include 235U, 238U, 232Th, 230Th, 129I, 99Tc, 237Np and 231Pa; a number of these are discussed and illustrated in the analysis of real samples of environmental and biological origin. In particular, determination of 231Pa by RNAA was performed using both postirradiation and preseparation methods. Application of INAA to enable the use of 238U and 232Th as endogenous (internal) radiotracers in alpha spectrometric analyses of uranium and thorium radioisotopes in radioecological studies is described, also allowing independent data sets to be obtained for quality control.
Hyperspectral data analysis procedures with reduced sensitivity to noise
NASA Technical Reports Server (NTRS)
Landgrebe, David A.
1993-01-01
Multispectral sensor systems have steadily improved over the years in their ability to deliver increased spectral detail. With the advent of hyperspectral sensors, including imaging spectrometers, this technology is in the process of taking a large leap forward, thus enabling delivery of much more detailed information. However, this direction of development has drawn even more attention to the matter of noise and other deleterious effects in the data, because reducing the fundamental limitations of spectral detail on information collection raises the limitations presented by noise to even greater importance. Much current effort in remote sensing research is thus being devoted to adjusting the data to mitigate the effects of noise and other deleterious effects. A parallel approach to the problem is to look for analysis approaches and procedures which have reduced sensitivity to such effects. We discuss some of the fundamental principles which define analysis algorithm characteristics providing such reduced sensitivity. One such analysis procedure, including an example analysis of a data set, is described, illustrating this effect.
De Boer, Jan L M; Ritsema, Rob; Piso, Sjoerd; Van Staden, Hans; Van Den Beld, Wilbert
2004-07-01
Two screening methods were developed for rapid analysis of a great number of urine and blood samples within the framework of an exposure check of the population after a firework explosion. A total of 56 elements was measured, including major elements. Sample preparation consisted of simple dilution. Extensive quality controls were applied, including element addition and the use of certified reference materials. Relevant results at levels similar to those found in the literature were obtained for Co, Ni, Cu, Zn, Sr, Cd, Sn, Sb, Ba, Tl, and Pb in urine and for the same elements except Ni, Sn, Sb, and Ba in blood. However, quadrupole ICP-MS has limitations, mainly related to spectral interferences, for the analysis of urine and blood, and these cause higher detection limits. The general aspects discussed in the paper give it wider applicability than just the analysis of blood and urine; it can, for example, be used in environmental analysis.
NASA Astrophysics Data System (ADS)
Nickelsen, J.; Kück, U.
Chloroplasts are typical organelles of photoautotrophic eukaryotic cells which drive a variety of functions, including photosynthesis. For many years the unicellular green alga Chlamydomonas reinhardtii has served as an experimental organism for studying photosynthetic processes. The recent development of molecular tools for this organism together with efficient methods of genetic analysis and the availability of many photosynthesis mutants has now made this alga a powerful model system for the analysis of chloroplast biogenesis. For example, techniques have been developed to transfer recombinant DNA into both the nuclear and the chloroplast genome. This allows both complementation tests and analyses of gene functions in vivo. Moreover, site-specific DNA recombinations in the chloroplast allow targeted gene disruption experiments which enable a "reverse genetics" to be performed. The potential of the algal system for the study of chloroplast biogenesis is illustrated in this review by the description of regulatory systems of gene expression involved in organelle biogenesis. One example concerns the regulation of trans-splicing of chloroplast mRNAs, a process which is controlled by both multiple nuclear- and chloroplast-encoded factors. The second example involves the stabilization of chloroplast mRNAs. The available data lead us to predict distinct RNA elements, which interact with trans-acting factors to protect the RNA against nucleolytic attacks.
Bus, James S
2017-06-01
The International Agency for Research on Cancer (IARC) has formulated 10 key characteristics of human carcinogens to incorporate mechanistic data into cancer hazard classifications. The analysis used glyphosate as a case example to examine the robustness of IARC's determination of oxidative stress as "strong" evidence supporting a plausible cancer mechanism in humans. The IARC analysis primarily relied on 14 human/mammalian studies; 19 non-mammalian studies were uninformative of human cancer given the broad spectrum of test species and extensive use of formulations and aquatic testing. The mammalian studies had substantial experimental limitations for informing cancer mechanism including use of: single doses and time points; cytotoxic/toxic test doses; tissues not identified as potential cancer targets; glyphosate formulations or mixtures; technically limited oxidative stress biomarkers. The doses were many orders of magnitude higher than human exposures determined in human biomonitoring studies. The glyphosate case example reveals that the IARC evaluation fell substantially short of "strong" supporting evidence of oxidative stress as a plausible human cancer mechanism, and suggests that other IARC monographs relying on the 10 key characteristics approach should be similarly examined for a lack of robust data integration fundamental to reasonable mode of action evaluations. Copyright © 2017 Elsevier Inc. All rights reserved.
Characterization methods for liquid interfacial layers
NASA Astrophysics Data System (ADS)
Javadi, A.; Mucic, N.; Karbaschi, M.; Won, J. Y.; Lotfi, M.; Dan, A.; Ulaganathan, V.; Gochev, G.; Makievski, A. V.; Kovalchuk, V. I.; Kovalchuk, N. M.; Krägel, J.; Miller, R.
2013-05-01
Liquid interfaces are met everywhere in our daily life. The corresponding interfacial properties and their modification play an important role in many modern technologies. The most prominent examples are all processes involved in the formation of foams and emulsions, as they are based on a fast creation of new surfaces, often of an immense extension. During the formation of an emulsion, for example, all freshly created and already existing interfaces are permanently subject to all types of deformation. This clearly entails the need of a quantitative knowledge on relevant dynamic interfacial properties and their changes under conditions pertinent to the technological processes. We report on the state of the art of interfacial layer characterization, including the determination of thermodynamic quantities as a baseline for a further quantitative analysis of the more important dynamic interfacial characteristics. The main focus of the presented work is on the experimental possibilities available at present to gain dynamic interfacial parameters, such as interfacial tensions, adsorbed amounts, interfacial composition, and visco-elastic parameters, at the shortest available surface ages and fastest possible interfacial perturbations. The experimental opportunities are presented along with examples for selected systems and theoretical models for a best data analysis. We also report on simulation results and concepts of necessary refinements and developments in this important field of interfacial dynamics.
Preferential sampling and Bayesian geostatistics: Statistical modeling and examples.
Cecconi, Lorenzo; Grisotto, Laura; Catelan, Dolores; Lagazio, Corrado; Berrocal, Veronica; Biggeri, Annibale
2016-08-01
Preferential sampling refers to any situation in which the spatial process and the sampling locations are not stochastically independent. In this paper, we present two examples of geostatistical analysis in which the usual assumption of stochastic independence between the point process and the measurement process is violated. To account for preferential sampling, we specify a flexible and general Bayesian geostatistical model that includes a shared spatial random component. We apply the proposed model to two different case studies that allow us to highlight three different modeling and inferential aspects of geostatistical modeling under preferential sampling: (1) continuous or finite spatial sampling frame; (2) underlying causal model and relevant covariates; and (3) inferential goals related to mean prediction surface or prediction uncertainty. © The Author(s) 2016.
Causal Relation Analysis Tool of the Case Study in the Engineer Ethics Education
NASA Astrophysics Data System (ADS)
Suzuki, Yoshio; Morita, Keisuke; Yasui, Mitsukuni; Tanada, Ichirou; Fujiki, Hiroyuki; Aoyagi, Manabu
In engineering ethics education, the virtual experiencing of dilemmas is essential. Learning through the case study method is a particularly effective means. Many case studies are, however, difficult to deal with because they often include many complex causal relationships and social factors. It would thus be convenient if there were a tool that could analyze the factors of a case example and organize them into a hierarchical structure to get a better understanding of the whole picture. The tool that was developed applies a cause-and-effect matrix and simple graph theory. It analyzes the causal relationship between facts in a hierarchical structure and organizes complex phenomena. The effectiveness of this tool is shown by presenting an actual example.
Differential theory of learning for efficient neural network pattern recognition
NASA Astrophysics Data System (ADS)
Hampshire, John B., II; Vijaya Kumar, Bhagavatula
1993-09-01
We describe a new theory of differential learning by which a broad family of pattern classifiers (including many well-known neural network paradigms) can learn stochastic concepts efficiently. We describe the relationship between a classifier's ability to generalize well to unseen test examples and the efficiency of the strategy by which it learns. We list a series of proofs that differential learning is efficient in its information and computational resource requirements, whereas traditional probabilistic learning strategies are not. The proofs are illustrated by a simple example that lends itself to closed-form analysis. We conclude with an optical character recognition task for which three different types of differentially generated classifiers generalize significantly better than their probabilistically generated counterparts.
Lindsay, Sally; DePape, Anne-Marie
2015-01-01
Objective: Although people with disabilities have great potential to provide advantages to work environments, many encounter barriers in finding employment, especially youth who are looking for their first job. A job interview is an essential component of obtaining employment. The objective of this study is to explore the content of the answers given in job interviews among youth with disabilities compared to typically developing youth. Methods: A purposive sample of 31 youth (16 with typical development and 15 with disability) completed a mock job interview as part of an employment readiness study. The interview questions focused on skills and experiences, areas for improvement, and actions taken during problem-based scenarios. Transcribed interviews were analyzed using a content analysis of themes that emerged from the interviews. Results: We found several similarities and differences between youth with disabilities and typically developing youth. Similarities included giving examples from school, emphasizing their “soft skills” (i.e., people and communication skills) and giving examples of relevant experience for the position. Both groups of youth gave similar examples for something they were proud of, but fewer youth with disabilities provided examples. Differences in the content of job interview answers between the two groups included youth with disabilities: (1) disclosing their condition; (2) giving fewer examples related to customer service and teamwork skills; (3) experiencing greater challenges in providing feedback to team members and responding to scenario-based problem solving questions; and (4) drawing on examples from past work, volunteer and extracurricular activities. Conclusions: Clinicians and educators should help youth to understand what their marketable skills are and how to highlight them in an interview. Employers need to understand that the experiences of youth with disabilities may be different than typically developing youth.
Our findings also help to inform employment readiness programs by highlighting the areas where youth with disabilities may need extra help as compared to typically developing youth. PMID:25799198
Ilott, Irene; Gerrish, Kate; Booth, Andrew; Field, Becky
2013-10-01
There is an international imperative to implement research into clinical practice to improve health care. Understanding the dynamics of change requires knowledge from theoretical and empirical studies. This paper presents a novel approach to testing a new meta-theoretical framework: the Consolidated Framework for Implementation Research. The utility of the Framework was evaluated using a post hoc, deductive analysis of 11 narrative accounts of innovation in health care services and practice from England, collected in 2010. A matrix comprising the five domains and 39 constructs of the Framework was developed to examine the coherence of the terminology, to compare results across contexts and to identify new theoretical developments. The Framework captured the complexity of implementation across 11 diverse examples, offering theoretically informed, comprehensive coverage. The Framework drew attention to relevant points in individual cases together with patterns across cases; for example, all were internally developed innovations that brought direct or indirect patient advantage. In 10 cases, the change was led by clinicians. Most initiatives had been maintained for several years and there was evidence of spread in six examples. Areas for further development within the Framework include sustainability and patient/public engagement in implementation. Our analysis suggests that this conceptual framework has the potential to offer useful insights, whether as part of a situational analysis or by developing context-specific propositions for hypothesis testing. Such studies are vital now that innovation is being promoted as core business for health care. © 2012 John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Carlson, Harry W.; Darden, Christine M.
1987-01-01
Low-speed experimental force data on a series of thin swept wings with sharp leading edges and leading- and trailing-edge flaps are compared with predictions made using a linearized-theory method which includes estimates of vortex forces. These comparisons were made to assess the effectiveness of linearized-theory methods for use in the design and analysis of flap systems in subsonic flow. Results demonstrate that linearized-theory, attached-flow methods (with approximate representation of vortex forces) can form the basis of a rational system for flap design and analysis. Even attached-flow methods that do not take vortex forces into account can be used for the selection of optimized flap-system geometry, but design-point performance levels tend to be underestimated unless vortex forces are included. Illustrative examples of the use of these methods in the design of efficient low-speed flap systems are included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cushman, R.M.
2003-08-28
The Carbon Dioxide Information Analysis Center (CDIAC), which includes the World Data Center (WDC) for Atmospheric Trace Gases, is the primary global change data and information analysis center of the U.S. Department of Energy (DOE). More than just an archive of data sets and publications, CDIAC has, since its inception in 1982, enhanced the value of its holdings through intensive quality assurance, documentation, and integration. Whereas many traditional data centers are discipline-based (for example, meteorology or oceanography), CDIAC's scope includes potentially anything and everything that would be of value to users concerned with the greenhouse effect and global climate change, including atmospheric concentrations and atmospheric emissions of carbon dioxide (CO2) and other radiatively active gases; the role of the terrestrial biosphere and the oceans in the biogeochemical cycles of greenhouse gases; long-term climate trends; the effects of elevated CO2 on vegetation; and the vulnerability of coastal areas to rising sea levels.
Caldwell, Michael W; Nydam, Randall L; Palci, Alessandro; Apesteguía, Sebastián
2015-01-27
The oldest previously known fossil snakes date from ~100 million year old sediments (Upper Cretaceous) and are both morphologically and phylogenetically diverse, indicating that snakes underwent a much earlier origin and adaptive radiation. We report here on snake fossils that extend the record backwards in time by an additional ~70 million years (Middle Jurassic-Lower Cretaceous). These ancient snakes share features with fossil and modern snakes (for example, recurved teeth with labial and lingual carinae, long toothed suborbital ramus of maxillae) and with lizards (for example, pronounced subdental shelf/gutter). The paleobiogeography of these early snakes is diverse and complex, suggesting that snakes had undergone habitat differentiation and geographic radiation by the mid-Jurassic. Phylogenetic analysis of squamates recovers these early snakes in a basal polytomy with other fossil and modern snakes, where Najash rionegrina is sister to this clade. Ingroup analysis finds them in a basal position to all other snakes including Najash.
NASA Astrophysics Data System (ADS)
Li, Hui-Jia; Cheng, Qing; Mao, He-Jin; Wang, Huanian; Chen, Junhua
2017-03-01
The study of community structure is a primary focus of network analysis and has attracted a great deal of attention. In this paper, we focus on two well-known functions, the Hamiltonian function H and the modularity density measure D, and aim to uncover the effective thresholds of their corresponding resolution parameter γ that avoid the resolution limit problem. Two widely used example networks are employed: the ring network of lumps and the ad hoc network. In these two networks, we use discrete convex analysis to study the interval of the resolution parameter of H and D that will not cause misidentification. By comparison, we find that in both examples, for the Hamiltonian function H, the larger the value of the resolution parameter γ, the less the network suffers from the resolution limit; while for the modularity density D, the network suffers less from the resolution limit when we decrease the value of γ. Our framework is mathematically strict and efficient and can be applied in many scientific fields.
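The role of a resolution parameter can be probed with an off-the-shelf library (this sketch uses NetworkX's greedy modularity optimizer with a tunable resolution γ, not the authors' H or D formulations) on the classic ring-of-cliques test network.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Classic resolution-limit test case: a ring of 24 five-node cliques.
G = nx.ring_of_cliques(24, 5)

# Sweep the resolution parameter gamma of generalized modularity;
# larger gamma favours smaller (finer-grained) communities.
counts = {}
for gamma in (0.5, 1.0, 2.0):
    communities = greedy_modularity_communities(G, resolution=gamma)
    counts[gamma] = len(communities)
```

At low γ the optimizer tends to merge adjacent cliques (the resolution limit); raising γ pushes the detected partition toward one community per clique, which is the behaviour the paper's threshold analysis seeks to characterize exactly.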
Three-dimensional analysis of anisotropic spatially reinforced structures
NASA Technical Reports Server (NTRS)
Bogdanovich, Alexander E.
1993-01-01
The material-adaptive three-dimensional analysis of inhomogeneous structures based on the meso-volume concept and application of deficient spline functions for displacement approximations is proposed. The general methodology is demonstrated on the example of a brick-type mosaic parallelepiped arbitrarily composed of anisotropic meso-volumes. A partition of each meso-volume into sub-elements, application of deficient spline functions for a local approximation of displacements and, finally, the use of the variational principle allows one to obtain displacements, strains, and stresses at any point within the structural part. All of the necessary external and internal boundary conditions (including the conditions of continuity of transverse stresses at interfaces between adjacent meso-volumes) can be satisfied with requisite accuracy by increasing the density of the sub-element mesh. The application of the methodology to textile composite materials is described. Several numerical examples for woven and braided rectangular composite plates and stiffened panels under transverse bending are considered. Some typical effects of stress concentrations due to the material inhomogeneities are demonstrated.
On holographic Rényi entropy in some modified theories of gravity
NASA Astrophysics Data System (ADS)
Dey, Anshuman; Roy, Pratim; Sarkar, Tapobrata
2018-04-01
We perform a detailed analysis of holographic entanglement Rényi entropy in some modified theories of gravity with four dimensional conformal field theory duals. First, we construct perturbative black hole solutions in a recently proposed model of Einsteinian cubic gravity in five dimensions, and compute the Rényi entropy as well as the scaling dimension of the twist operators in the dual field theory. Consistency of these results is verified from the AdS/CFT correspondence, via a corresponding computation of the Weyl anomaly on the gravity side. Similar analyses are then carried out for three other examples of modified gravity in five dimensions that include a chemical potential, namely Born-Infeld gravity, charged quasi-topological gravity and a class of Weyl corrected gravity theories with a gauge field, with the last example being treated perturbatively. Some interesting bounds in the dual conformal field theory parameters in quasi-topological gravity are pointed out. We also provide arguments on the validity of our perturbative analysis, whenever applicable.
Liao, Z L; He, Y; Huang, F; Wang, S; Li, H Z
2013-01-01
Although low impact development (LID) is a commonly applied measure across the United States and Europe for alleviating the negative impacts of urbanization on the hydrological cycle, it has not been widely used in highly urbanized areas, especially in rapidly urbanizing cities in developing countries such as China. In this paper, considering five LID practices (Bio-Retention, Infiltration Trench, Porous Pavement, Rain Barrels, and Green Swale), an analysis of LID for waterlogging control in highly urbanized areas is demonstrated using the example of Caohejing in Shanghai, China. Design storm events and storm water management models are employed to simulate the total waterlogging volume reduction, peak flow rate reduction, and runoff coefficient reduction of different scenarios. Cost-effectiveness is calculated for the five practices. The results show that LID practices can have significant effects on storm water management in a highly urbanized area, and the comparative results reveal that Rain Barrels and Infiltration Trench are the two most cost-effective measures for the study area.
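The cost-effectiveness ranking step the abstract describes can be sketched in a few lines. The costs and reduction volumes below are invented placeholders, not the Caohejing simulation results; the point is only the per-unit-cost ranking mechanic:

```python
# Invented unit costs and simulated runoff-volume reductions for the five LID
# practices named in the abstract (illustrative numbers only).
practices = {
    "Bio-Retention":       {"cost": 120_000, "reduction_m3": 950},
    "Infiltration Trench": {"cost": 60_000,  "reduction_m3": 820},
    "Porous Pavement":     {"cost": 200_000, "reduction_m3": 1_100},
    "Rain Barrels":        {"cost": 30_000,  "reduction_m3": 450},
    "Green Swale":         {"cost": 90_000,  "reduction_m3": 600},
}

# Rank by runoff volume removed per unit cost (higher = more cost-effective)
ranked = sorted(practices.items(),
                key=lambda kv: kv[1]["reduction_m3"] / kv[1]["cost"],
                reverse=True)
for name, d in ranked:
    print(f"{name:20s} {d['reduction_m3'] / d['cost']:.5f} m^3 per unit cost")
```

With these placeholder figures the ranking happens to reproduce the abstract's conclusion (Rain Barrels and Infiltration Trench on top), but that is by construction, not a result.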
Štys, Dalibor; Urban, Jan; Vaněk, Jan; Císař, Petr
2011-06-01
We report objective analysis of information in the microscopic image of the cell monolayer. The process of transfer of information about the cell by the microscope is analyzed in terms of the classical Shannon information transfer scheme. The information source is the biological object; the information transfer channel is the whole microscope, including the camera chip. The destination is the model of the biological system. The information contribution is analyzed as the information carried by a point relative to the overall information in the image. Subsequently we obtain an information reflection of the biological object. This is transformed in the biological model which, in information terminology, is the destination. This, we propose, should be constructed as state transitions in individual cells modulated by information bonds between the cells. We show examples of detected cell states in multidimensional state space. This space is reflected as a colour channel intensity phenomenological state space. We have also observed information bonds and show examples of them.
Stys, Dalibor; Urban, Jan; Vanek, Jan; Císar, Petr
2010-07-01
We report objective analysis of information in the microscopic image of the cell monolayer. The process of transfer of information about the cell by the microscope is analyzed in terms of the classical Shannon information transfer scheme. The information source is the biological object; the information transfer channel is the whole microscope, including the camera chip. The destination is the model of the biological system. The information contribution is analyzed as the information carried by a point relative to the overall information in the image. Subsequently we obtain an information reflection of the biological object. This is transformed in the biological model which, in information terminology, is the destination. This, we propose, should be constructed as state transitions in individual cells modulated by information bonds between the cells. We show examples of detected cell states in multidimensional state space, reflected as a colour channel intensity phenomenological state space. We have also observed information bonds and show examples of them. Copyright 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Allain, Rhett
2016-05-01
We currently live in a world filled with videos. There are videos on YouTube, feature movies and even videos recorded with our own cameras and smartphones. These videos present an excellent opportunity to not only explore physical concepts, but also inspire others to investigate physics ideas. With video analysis, we can explore the fantasy world in science-fiction films. We can also look at online videos to determine if they are genuine or fake. Video analysis can be used in the introductory physics lab and it can even be used to explore the make-believe physics embedded in video games. This book covers the basic ideas behind video analysis along with the fundamental physics principles used in video analysis. The book also includes several examples of the unique situations in which video analysis can be used.
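A standard introductory-lab example of the kind the book covers is extracting the free-fall acceleration from frame-by-frame position data. The sketch below assumes idealized tracking data at 30 frames per second; the release height and g are assumptions used only to generate the data, and real measurements would also carry tracking noise:

```python
import numpy as np

# Idealized tracking data: vertical position (m) of a dropped ball read off
# successive frames of a 30 fps video. y0 and g_true are assumptions used to
# synthesize the data; a real analysis would start from measured pixel positions.
fps = 30.0
t = np.arange(15) / fps
y0, g_true = 2.0, 9.81
y = y0 - 0.5 * g_true * t**2     # real data would be noisy

# Fit y(t) = a t^2 + b t + c; the acceleration is 2a
a, b, c = np.polyfit(t, y, 2)
g_est = -2 * a
print(f"estimated g = {g_est:.2f} m/s^2")
```

The same quadratic fit applied to a fake or game-physics video would reveal an acceleration inconsistent with 9.8 m/s², which is exactly the authenticity test the book describes.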
Uses of software in digital image analysis: a forensic report
NASA Astrophysics Data System (ADS)
Sharma, Mukesh; Jha, Shailendra
2010-02-01
Forensic image analysis requires expertise to interpret the content of an image, or the image itself, in legal matters. Major sub-disciplines of forensic image analysis with law enforcement applications include photogrammetry, photographic comparison, content analysis and image authentication. It has wide applications in forensic science, ranging from documenting crime scenes to enhancing faint or indistinct patterns such as partial fingerprints. The process of forensic image analysis can involve several different tasks, regardless of the type of image analysis performed. Through this paper the authors have tried to explain these tasks, which are described in three categories: Image Compression, Image Enhancement & Restoration and Measurement Extraction, with the help of examples such as signature comparison, counterfeit currency comparison and foot-wear sole impressions using the software Canvas and Corel Draw.
A Bayesian approach to meta-analysis of plant pathology studies.
Mila, A L; Ngugi, H K
2011-01-01
Bayesian statistical methods are used for meta-analysis in many disciplines, including medicine, molecular biology, and engineering, but have not yet been applied for quantitative synthesis of plant pathology studies. In this paper, we illustrate the key concepts of Bayesian statistics and outline the differences between Bayesian and classical (frequentist) methods in the way parameters describing population attributes are considered. We then describe a Bayesian approach to meta-analysis and present a plant pathological example based on studies evaluating the efficacy of plant protection products that induce systemic acquired resistance for the management of fire blight of apple. In a simple random-effects model assuming a normal distribution of effect sizes and no prior information (i.e., a noninformative prior), the results of the Bayesian meta-analysis are similar to those obtained with classical methods. Implementing the same model with a Student's t distribution and a noninformative prior for the effect sizes, instead of a normal distribution, yields similar results for all but acibenzolar-S-methyl (Actigard), which was evaluated in only seven studies in this example. Whereas both the classical (P = 0.28) and the Bayesian analysis with a noninformative prior (95% credibility interval [CRI] for the log response ratio: -0.63 to 0.08) indicate a nonsignificant effect for Actigard, specifying a t distribution resulted in a significant, albeit variable, effect for this product (CRI: -0.73 to -0.10). These results confirm the sensitivity of the analytical outcome (i.e., the posterior distribution) to the choice of prior in Bayesian meta-analyses involving a limited number of studies. We review some pertinent literature on more advanced topics, including modeling of among-study heterogeneity, publication bias, analyses involving a limited number of studies, and methods for dealing with missing data, and show how these issues can be approached in a Bayesian framework.
Bayesian meta-analysis can readily include information not easily incorporated in classical methods, and allow for a full evaluation of competing models. Given the power and flexibility of Bayesian methods, we expect them to become widely adopted for meta-analysis of plant pathology studies.
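A minimal sketch of the simple random-effects Bayesian meta-analysis described above, using synthetic log response ratios (not the fire blight data from the paper), normal effect sizes, flat noninformative priors, and a random-walk Metropolis sampler:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic effect sizes (log response ratios) and within-study variances;
# invented for illustration, not the studies analyzed in the paper.
y = np.array([-0.42, -0.31, -0.55, -0.18, -0.37, -0.25, -0.46])
v = np.array([0.04, 0.06, 0.05, 0.08, 0.03, 0.07, 0.05])

def log_posterior(mu, tau):
    """Normal random-effects likelihood with flat (noninformative) priors on
    mu and on tau > 0, so the posterior is proportional to the likelihood."""
    if tau <= 0:
        return -np.inf
    var = v + tau**2
    return -0.5 * np.sum(np.log(var) + (y - mu) ** 2 / var)

# Random-walk Metropolis over (mu, tau)
mu, tau = y.mean(), 0.1
lp = log_posterior(mu, tau)
samples = []
for step in range(20_000):
    mu_p = mu + 0.05 * rng.standard_normal()
    tau_p = tau + 0.05 * rng.standard_normal()
    lp_p = log_posterior(mu_p, tau_p)
    if np.log(rng.uniform()) < lp_p - lp:
        mu, tau, lp = mu_p, tau_p, lp_p
    if step >= 5_000:                     # discard burn-in
        samples.append(mu)

samples = np.array(samples)
lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"posterior mean of mu = {samples.mean():.3f}, 95% CrI = ({lo:.3f}, {hi:.3f})")
```

Swapping the normal likelihood for a Student's t with few degrees of freedom, as the paper does for Actigard, changes only `log_posterior`, which is the sensitivity-to-model-choice point the abstract makes.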
APOD Data Release of Social Network Footprint for 2015
NASA Astrophysics Data System (ADS)
Nemiroff, Robert J.; Russell, David; Allen, Alice; Connelly, Paul; Lowe, Stuart R.; Petz, Sydney; Haring, Ralf; Bonnell, Jerry T.; APOD Team
2017-01-01
APOD data for 2015 are being made freely available for download and analysis. The data includes page view statistics for the main NASA APOD website at https://apod.nasa.gov, as well as for APOD's social media sites on Facebook, Instagram, Google Plus, and Twitter. General APOD-specific demographic information for each site is included. Archived popularity statistics include Page Views, Likes, Shares, Hearts, and Retweets. The downloadable Excel-type spreadsheet also includes the APOD title and (unlinked) explanation. This data is released not to highlight APOD's popularity but to encourage analyses, with potential examples involving which astronomy topics trend the best and whether popularity is social group dependent.
Jahn, Ingeborg; Börnhorst, Claudia; Günther, Frauke; Brand, Tilman
2017-02-15
During the last decades, sex and gender biases have been identified in various areas of biomedical and public health research, leading to compromised validity of research findings. As a response, methodological requirements were developed but these are rarely translated into research practice. The aim of this study is to provide good practice examples of sex/gender sensitive health research. We conducted a systematic search of research articles published in JECH between 2006 and 2014. An instrument was constructed to evaluate sex/gender sensitivity in four stages of the research process (background, study design, statistical analysis, discussion). In total, 37 articles covering diverse topics were included. Of these, 22 were evaluated as good practice examples in at least one stage; two articles achieved highest ratings across all stages. Good examples of the background referred to available knowledge on sex/gender differences and sex/gender informed theoretical frameworks. Related to the study design, good examples calculated sample sizes to be able to detect sex/gender differences, selected sex/gender sensitive outcome/exposure indicators, or chose different cut-off values for male and female participants. Good examples of statistical analyses used interaction terms with sex/gender or different shapes of the estimated relationship for men and women. Examples of good discussions interpreted their findings related to social and biological explanatory models or questioned the statistical methods used to detect sex/gender differences. The identified good practice examples may inspire researchers to critically reflect on the relevance of sex/gender issues of their studies and help them to translate methodological recommendations of sex/gender sensitivity into research practice.
Goal-oriented sensitivity analysis for lattice kinetic Monte Carlo simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arampatzis, Georgios, E-mail: garab@math.uoc.gr; Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003; Katsoulakis, Markos A., E-mail: markos@math.umass.edu
2014-03-28
In this paper we propose a new class of coupling methods for the sensitivity analysis of high dimensional stochastic systems and in particular for lattice Kinetic Monte Carlo (KMC). Sensitivity analysis for stochastic systems is typically based on approximating continuous derivatives with respect to model parameters by the mean value of samples from a finite difference scheme. Instead of using independent samples the proposed algorithm reduces the variance of the estimator by developing a strongly correlated ("coupled") stochastic process for both the perturbed and unperturbed stochastic processes, defined in a common state space. The novelty of our construction is that the new coupled process depends on the targeted observables, e.g., coverage, Hamiltonian, spatial correlations, surface roughness, etc., hence we refer to the proposed method as goal-oriented sensitivity analysis. In particular, the rates of the coupled Continuous Time Markov Chain are obtained as solutions to a goal-oriented optimization problem, depending on the observable of interest, by considering the minimization functional of the corresponding variance. We show that this functional can be used as a diagnostic tool for the design and evaluation of different classes of couplings. Furthermore, the resulting KMC sensitivity algorithm has an easy implementation that is based on the Bortz–Kalos–Lebowitz algorithm's philosophy, where events are divided in classes depending on level sets of the observable of interest. Finally, we demonstrate in several examples including adsorption, desorption, and diffusion Kinetic Monte Carlo that for the same confidence interval and observable, the proposed goal-oriented algorithm can be two orders of magnitude faster than existing coupling algorithms for spatial KMC such as the Common Random Number approach.
We also provide a complete implementation of the proposed sensitivity analysis algorithms, including various spatial KMC examples, in a supplementary MATLAB source code.
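The variance-reduction idea behind coupled finite-difference sensitivity estimators can be illustrated with the Common Random Number baseline the authors compare against (this is the baseline, not their goal-oriented coupling). The sketch below runs a single-species adsorption/desorption SSA, a toy stand-in for lattice KMC, and compares the variance of finite-difference sensitivity estimates computed with independent versus shared random streams; all rates and sizes are assumptions chosen for illustration:

```python
import random

def ssa_coverage(ka, kd, n_sites, t_end, rng):
    """Gillespie SSA for single-species adsorption/desorption;
    returns the coverage x/N at time t_end."""
    x, t = 0, 0.0
    while True:
        a_ads = ka * (n_sites - x)      # adsorption propensity
        a_des = kd * x                  # desorption propensity
        a_tot = a_ads + a_des
        t += rng.expovariate(a_tot)
        if t >= t_end:
            return x / n_sites
        if rng.random() * a_tot < a_ads:
            x += 1
        else:
            x -= 1

ka, kd, h, reps = 1.0, 0.5, 0.05, 200
indep, crn = [], []
for r in range(reps):
    # independent streams for nominal and perturbed runs
    d = (ssa_coverage(ka + h, kd, 50, 2.0, random.Random(10_000 + r))
         - ssa_coverage(ka, kd, 50, 2.0, random.Random(r))) / h
    indep.append(d)
    # shared seed = Common Random Numbers couples the two runs
    d = (ssa_coverage(ka + h, kd, 50, 2.0, random.Random(r))
         - ssa_coverage(ka, kd, 50, 2.0, random.Random(r))) / h
    crn.append(d)

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

print(f"FD estimator variance: independent={var(indep):.3f}  CRN={var(crn):.3f}")
```

The paper's contribution is to go further: instead of coupling through shared random numbers alone, the coupling rates are optimized for the specific observable, which is what yields the reported additional speedup over CRN.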
NASA Astrophysics Data System (ADS)
Förtsch, Christian; Dorfner, Tobias; Baumgartner, Julia; Werner, Sonja; von Kotzebue, Lena; Neuhaus, Birgit J.
2018-04-01
The German National Education Standards (NES) for biology were introduced in 2005. The content part of the NES emphasizes fostering conceptual knowledge. However, there are hardly any indications of what such an instructional implementation could look like. We introduce a theoretical framework of an instructional approach to foster students' conceptual knowledge as demanded in the NES (Fostering Conceptual Knowledge) including instructional practices derived from research on single core ideas, general psychological theories, and biology-specific features of instructional quality. First, we aimed to develop a rating manual, which is based on this theoretical framework. Second, we wanted to describe current German biology instruction according to this approach and to quantitatively analyze its effectiveness. And third, we aimed to provide qualitative examples of this approach to triangulate our findings. In a first step, we developed a theoretically devised rating manual to measure Fostering Conceptual Knowledge in videotaped lessons. Data for quantitative analysis included 81 videotaped biology lessons of 28 biology teachers from different German secondary schools. Six hundred forty students completed a questionnaire on their situational interest after each lesson and an achievement test. Results from multilevel modeling showed significant positive effects of Fostering Conceptual Knowledge on students' achievement and situational interest. For qualitative analysis, we contrasted instruction of four teachers, two with high and two with low student achievement and situational interest using the qualitative method of thematic analysis. Qualitative analysis revealed five main characteristics describing Fostering Conceptual Knowledge. Therefore, implementing Fostering Conceptual Knowledge in biology instruction seems promising. Examples of how to implement Fostering Conceptual Knowledge in instruction are shown and discussed.
Fragmentary and incidental behaviour of columns, slabs and crystals
Whiteley, Walter
2014-01-01
Between the study of small finite frameworks and infinite incidentally periodic frameworks, we find the real materials which are large, but finite, fragments that fit into the infinite periodic frameworks. To understand these materials, we seek insights from both (i) their analysis as large frameworks with associated geometric and combinatorial properties (including the geometric repetitions) and (ii) embedding them into appropriate infinite periodic structures with motions that may break the periodic structure. A review of real materials identifies a number of examples with a local appearance of ‘unit cells’ which repeat under isometries but perhaps in unusual forms. These examples also refocus attention on several new classes of infinite ‘periodic’ frameworks: (i) columns—three-dimensional structures generated with one repeating isometry and (ii) slabs—three-dimensional structures with two independent repeating translations. With this larger vision of structures to be studied, we find some patterns and partial results that suggest new conjectures as well as many additional open questions. These invite a search for new examples and additional theorems. PMID:24379423
Coastal and Marine Bird Data Base
Anderson, S.H.; Geissler, P.H.; Dawson, D.K.
1980-01-01
Summary: This report discusses the development of a coastal and marine bird data base at the Migratory Bird and Habitat Research Laboratory. The system is compared with other data bases, and suggestions for future development, such as possible adaptations for other taxonomic groups, are included. The data base is built on the Statistical Analysis System but includes extensions programmed in PL/I. The Appendix shows how the system evolved. Output examples are given for heron data and pelagic bird data, indicating the types of analyses that can be conducted, together with output figures. The Appendixes include a retrieval language user's guide, a description of the retrieval process, and a listing of the translator program.
NASA Technical Reports Server (NTRS)
Liu, Zhong; Ostrenga, Dana; Teng, William; Kempler, Steven; Milich, Lenard
2014-01-01
New online prototypes have been developed to extend and enhance the previous effort by facilitating investigation of product characteristics and intercomparison of precipitation products in different algorithms as well as in different versions at different spatial scales ranging from local to global without downloading data and software. Several popular Tropical Rainfall Measuring Mission (TRMM) products and the TRMM Composite Climatology are included. In addition, users can download customized data in several popular formats for further analysis. Examples show product quality problems and differences in several monthly precipitation products. It is seen that differences in daily and monthly precipitation products are distributed unevenly in space and it is necessary to have tools such as those presented here for customized and detailed investigations. A simple time series and two area maps allow the discovery of abnormal values of 3A25 in one of the months. An example shows a V-shaped valley issue in the Version 6 3B43 time series and another example shows a sudden drop in 3A25 monthly rain rate, all of which provide important information when the products are used for long-term trend studies. Future plans include adding more products and statistical functionality in the prototypes.
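The kind of screening that surfaces a sudden drop in a monthly rain-rate series, like the 3A25 anomaly mentioned above, can be sketched with a robust outlier rule. The rule (median absolute deviation z-score) and all numbers below are illustrative assumptions, not the actual TRMM processing:

```python
import statistics

def flag_anomalies(series, threshold=3.5):
    """Robust z-score (median absolute deviation) screen for outliers."""
    med = statistics.median(series)
    mad = statistics.median(abs(x - med) for x in series)
    return [i for i, x in enumerate(series)
            if mad > 0 and abs(0.6745 * (x - med) / mad) > threshold]

# Hypothetical monthly rain rates (mm/h) with a sudden drop at index 8
rates = [5.1, 4.9, 5.3, 5.0, 5.2, 4.8, 5.1, 5.0, 1.2, 5.2, 4.9, 5.1]
print(flag_anomalies(rates))   # the dropped month stands out
```

Using the median rather than the mean keeps the screen from being distorted by the very anomaly it is trying to detect, which matters for short climate records.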
AstroBlend: An astrophysical visualization package for Blender
NASA Astrophysics Data System (ADS)
Naiman, J. P.
2016-04-01
The rapid growth in scale and complexity of both computational and observational astrophysics over the past decade necessitates efficient and intuitive methods for examining and visualizing large datasets. Here, I present AstroBlend, an open-source Python library for use within the three dimensional modeling software, Blender. While Blender has been a popular open-source software among animators and visual effects artists, in recent years it has also become a tool for visualizing astrophysical datasets. AstroBlend combines the three dimensional capabilities of Blender with the analysis tools of the widely used astrophysical toolset, yt, to afford both computational and observational astrophysicists the ability to simultaneously analyze their data and create informative and appealing visualizations. The introduction of this package includes a description of features, work flow, and various example visualizations. A website - www.astroblend.com - has been developed which includes tutorials, and a gallery of example images and movies, along with links to downloadable data, three dimensional artistic models, and various other resources.
Toye, Francine; Williamson, Esther; Williams, Mark A; Fairbank, Jeremy; Lamb, Sarah E
2016-08-09
Using an example of qualitative research embedded in a non-surgical feasibility trial, we explore the benefits of including qualitative research in trial design and reflect on epistemological challenges. We interviewed 18 trial participants and used methods of Interpretive Phenomenological Analysis. Our findings demonstrate that qualitative research can make a valuable contribution by allowing trial stakeholders to see things from alternative perspectives. Specifically, it can help to make specific recommendations for improved trial design, generate questions which contextualize findings, and also explore disease experience beyond the trial. To make the most out of qualitative research embedded in quantitative design it would be useful to (a) agree specific qualitative study aims that underpin research design, (b) understand the impact of differences in epistemological truth claims, (c) provide clear thematic interpretations for trial researchers to utilize, and (d) include qualitative findings that explore experience beyond the trial setting within the impact plan. © The Author(s) 2016.
Dy, Sydney M; Purnell, Tanjala S
2012-02-01
High-quality provider-patient decision-making is key to quality care for complex conditions. We performed an analysis of key elements relevant to quality and complex, shared medical decision-making. Based on a search of electronic databases, including Medline and the Cochrane Library, as well as relevant articles' reference lists, reviews of tools, and annotated bibliographies, we developed a list of key concepts and applied them to a decision-making example. Key concepts identified included provider competence, trustworthiness, and cultural competence; communication with patients and families; information quality; patient/surrogate competence; and roles and involvement. We applied this concept list to a case example, shared decision-making for live donor kidney transplantation, and identified the likely most important concepts as provider and cultural competence, information quality, and communication with patients and families. This concept list may be useful for conceptualizing the quality of complex shared decision-making and in guiding research in this area. Copyright © 2011 Elsevier Ltd. All rights reserved.
Progressive Fracture of Composite Structures
NASA Technical Reports Server (NTRS)
Minnetyan, Levon
2001-01-01
This report includes the results of research in which the COmposite Durability STRuctural ANalysis (CODSTRAN) computational simulation capabilities were augmented and applied to various structures for demonstration of the new features and verification. The first chapter of this report provides an introduction to the computational simulation or virtual laboratory approach for the assessment of damage and fracture progression characteristics in composite structures. The second chapter outlines the details of the overall methodology used, including the failure criteria and the incremental/iterative loading procedure with the definitions of damage, fracture, and equilibrium states. The subsequent chapters each contain an augmented feature of the code and/or demonstration examples. All but one of the presented examples contains laminated composite structures with various fiber/matrix constituents. For each structure simulated, damage initiation and progression mechanisms are identified and the structural damage tolerance is quantified at various degradation stages. Many chapters contain the simulation of defective and defect free structures to evaluate the effects of existing defects on structural durability.
Error Propagation in a System Model
NASA Technical Reports Server (NTRS)
Schloegel, Kirk (Inventor); Bhatt, Devesh (Inventor); Oglesby, David V. (Inventor); Madl, Gabor (Inventor)
2015-01-01
Embodiments of the present subject matter can enable the analysis of signal value errors for system models. In an example, signal value errors can be propagated through the functional blocks of a system model to analyze possible effects as the signal value errors impact incident functional blocks. This propagation of the errors can be applicable to many models of computation including avionics models, synchronous data flow, and Kahn process networks.
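The propagation idea can be sketched with the simplest possible taint model: any block downstream of an erroneous signal may be affected. The block names and topology below are hypothetical, not taken from the patent, and real analyses would model how each block transforms or masks the error rather than just reachability:

```python
from collections import deque

def affected_blocks(edges, error_sources):
    """edges: (upstream, downstream) signal connections between blocks.
    Returns the set of blocks a signal value error could reach."""
    succ = {}
    for u, v in edges:
        succ.setdefault(u, []).append(v)
    seen = set(error_sources)
    queue = deque(error_sources)
    while queue:
        b = queue.popleft()
        for nxt in succ.get(b, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Hypothetical avionics-style dataflow: sensors feed a filter, which feeds
# a fusion block driving a display; GPS also feeds a moving map.
edges = [("sensor_a", "filter"), ("sensor_b", "filter"),
         ("filter", "fusion"), ("gps", "fusion"),
         ("fusion", "display"), ("gps", "map")]
print(sorted(affected_blocks(edges, {"sensor_a"})))
# -> ['display', 'filter', 'fusion', 'sensor_a']
```

Note the map block is untouched by a sensor_a fault but would be flagged for a GPS fault, which is exactly the per-source impact question such an analysis answers.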
Air Force Systems Engineering Assessment Model (AF SEAM) Management Guide, Version 2
2010-09-21
gleaned from experienced professionals who assisted with the model’s development. Examples of the references used include the following: • ISO/IEC...Defense Acquisition Guidebook, Chapter 4 • AFI 63-1201, Life Cycle Systems Engineering • IEEE/EIA 12207, Software Life Cycle Processes • Air...Selection criteria Reference Material: IEEE/EIA 12207, MIL-HDBK-514 Other Considerations: Modeling, simulation and analysis techniques can be
The Global Financial Crisis: Analysis and Policy Implications
2009-10-02
financial institutions, as well as government capital injections and loans to private corporations have become parts of rescue and stimulus packages and...postponement of corporate tax increases, government guarantees for loans to small and midsize businesses, spending on public works, including public...measures to assist specific industries or firms. For example, the government reduced the corporate tax rate from 24% to 20% and the tax rate on small
The Global Financial Crisis: Analysis and Policy Implications
2009-08-21
extent the U.S. government and Federal Reserve as “domestic lenders of last resort” should intervene in the day-to-day activities of corporations ...postponement of corporate tax increases, government guarantees for loans to small and midsize businesses, spending on public works, including public housing...rather than measures to assist specific industries or firms. For example, the government reduced the corporate tax rate from 24% to 20% and the tax
Maritime strategy and the nuclear age: Second edition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Till, G.
1984-01-01
This book offers an examination of the issues and theories that underlie contemporary maritime strategy. The author provides a description of the historical evolution of maritime strategy including an analysis of the works of Mahan, Colomb and Corbett; assesses the impact that current political, technological and legal developments will have on the world's navies; and discusses contemporary American and Soviet maritime theory citing practical examples from recent naval events world-wide.
Experimental and Computational Analysis of a Miniature Ramjet at Mach 4.0
2013-09-01
...intermittent after the Second World War, with the most well-known example being Lockheed Martin’s SR-71 Blackbird using the Pratt & Whitney J58 turbojet
Analysis of the Salvation Army World Service Offices Disaster Relief Capabilities
2017-03-01
AOR based primarily on their financial revenues, since revenue is a prerequisite enabling mechanism for the delivery of goods and services. The...are from government sources, whereas contributions include cash and dollar value of in-kind services and goods. Investment revenues largely consist of...taking action among those most in need of assistance offers a compelling and admirable example of the good a religious organization can accomplish
ERIC Educational Resources Information Center
Rook, Michael M.
2018-01-01
The author presents a three-step process for selecting participants for any study of a social phenomenon that occurs between people in locations and at times that are difficult to observe. The process is described with illustrative examples from a previous study of help giving in a community of learners. This paper includes a rationale for…
Approximation and Numerical Analysis of Nonlinear Equations of Evolution.
1980-01-31
dominant convective terms, or Stefan type problems such as the flow of fluids through porous media or the melting and freezing of ice. Such problems...means of formulating time-dependent Stefan problems was initiated. Classes of problems considered here include the one-phase and two-phase Stefan ...some new numerical methods were developed for two dimensional, two-phase Stefan problems with time dependent boundary conditions. A variety of example
Wireless Emergency Alerts (WEA) Cybersecurity Risk Management Strategy for Alert Originators
2014-03-01
formerly known as the Commercial Mobile Alert Service (CMAS) RDT&E program, is a collaborative partnership that includes the cellular industry, the...Examples illustrate a STRIDE analysis of the generic mission. The CMAS Alerting Pipeline Taxonomy describes in detail a hierarchical classification...The Wireless Emergency Alerts (WEA) service, formerly known as the Commercial Mobile Alert Service (CMAS), is a
How extractive industries affect health: Political economy underpinnings and pathways.
Schrecker, Ted; Birn, Anne-Emanuelle; Aguilera, Mariajosé
2018-06-07
A systematic and theoretically informed analysis of how extractive industries affect health outcomes and health inequities is overdue. Informed by the work of Saskia Sassen on "logics of extraction," we adopt an expansive definition of extractive industries to include (for example) large-scale foreign acquisitions of agricultural land for export production. To ground our analysis in concrete place-based evidence, we begin with a brief review of four case examples of major extractive activities. We then analyze the political economy of extractivism, focusing on the societal structures, processes, and relationships of power that drive and enable extraction. Next, we examine how this global order shapes and interacts with politics, institutions, and policies at the state/national level contextualizing extractive activity. Having provided necessary context, we posit a set of pathways that link the global political economy and national politics and institutional practices surrounding extraction to health outcomes and their distribution. These pathways involve both direct health effects, such as toxic work and environmental exposures and assassination of activists, and indirect effects, including sustained impoverishment, water insecurity, and stress-related ailments. We conclude with some reflections on the need for future research on the health and health equity implications of the global extractive order. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
Shelton, Rachel C; Colgrove, James; Lee, Grace; Truong, Michelle; Wingood, Gina M
2017-04-01
We conducted a content analysis of public comments to understand the key framing approaches used by private industry v. the public health sector, with the goal of informing future public health messaging, framing and advocacy in the context of policy making. Comments on the proposed menu-labelling policy were extracted from Regulations.gov and analysed. A framing matrix was used to organize and code key devices and themes. Documents were analysed using content analysis with Dedoose software. Recent national nutrition-labelling regulations in the USA provide a timely opportunity to understand message framing in relation to obesity prevention and policy. We examined a total of ninety-seven documents submitted on behalf of organizations (private industry, n = 64; public health, n = 33). Public health focused on positive health consequences of the policy, used a social justice frame and supported its arguments with academic data. Industry was more critical of the policy; it used a market justice frame that emphasized minimal regulation, depicted its members as small, family-run businesses, and illustrated points with humanizing examples. Public health framing should counter and consider engaging directly with non-health-related arguments made by industry. Public health should include more powerful framing devices to convey its messages, including metaphors and humanizing examples.
Structural bioinformatics of the human spliceosomal proteome
Korneta, Iga; Magnus, Marcin; Bujnicki, Janusz M.
2012-01-01
In this work, we describe the results of a comprehensive structural bioinformatics analysis of the spliceosomal proteome. We used fold recognition analysis to complement prior data on the ordered domains of 252 human splicing proteins. Examples of newly identified domains include a PWI domain in the U5 snRNP protein 200K (hBrr2, residues 258–338), while examples of previously known domains with a newly determined fold include the DUF1115 domain of the U4/U6 di-snRNP protein 90K (hPrp3, residues 540–683). We also established a non-redundant set of experimental models of spliceosomal proteins, as well as constructed in silico models for regions without an experimental structure. The combined set of structural models is available for download. Altogether, over 90% of the ordered regions of the spliceosomal proteome can be represented structurally with a high degree of confidence. We analyzed the reduced spliceosomal proteome of the intron-poor organism Giardia lamblia, and as a result, we proposed a candidate set of ordered structural regions necessary for a functional spliceosome. The results of this work will aid experimental and structural analyses of the spliceosomal proteins and complexes, and can serve as a starting point for multiscale modeling of the structure of the entire spliceosome. PMID:22573172
NASA Astrophysics Data System (ADS)
Abellan, A.; Carrea, D.; Jaboyedoff, M.; Riquelme, A.; Tomas, R.; Royan, M. J.; Vilaplana, J. M.; Gauvin, N.
2014-12-01
The acquisition of dense terrain information using well-established 3D techniques (e.g. LiDAR, photogrammetry) and the use of new mobile platforms (e.g. Unmanned Aerial Vehicles), together with increasingly efficient post-processing workflows for image treatment (e.g. Structure from Motion), are opening up new possibilities for analysing, modeling and predicting rock slope failures. Examples of applications range from the monitoring of small changes at an unprecedented level of detail (e.g. sub-millimeter-scale deformation under lab-scale conditions) to the detection of slope deformation at regional scale. In this communication we will show the main accomplishments of the Swiss National Foundation project "Characterizing and analysing 3D temporal slope evolution" carried out at the Risk Analysis group (Univ. of Lausanne) in close collaboration with the RISKNAT and INTERES groups (Univ. of Barcelona and Univ. of Alicante, respectively). We have recently developed a series of innovative approaches for rock slope analysis using 3D point clouds; some examples include the development of semi-automatic methodologies for the identification and extraction of rock-slope features such as discontinuities, type of material, rockfall occurrence and deformation. Moreover, we have been improving our knowledge of progressive rupture characterization thanks to several algorithms; some examples include the computation of 3D deformation, the use of filtering techniques on permanently installed TLS, the use of rock slope failure analogies at different scales (laboratory simulations, monitoring at glacier fronts, etc.), and the modelling of the influence of external forces such as precipitation on the acceleration of the deformation rate. We have also been interested in the analysis of rock slope deformation prior to the occurrence of fragmental rockfalls and the interaction of this deformation with the spatial location of future events.
In spite of these recent advances, a great challenge still remains in the development of new algorithms for more accurate 3D point cloud treatment (e.g. filtering, segmentation, etc.) aimed at improving rock slope characterization and monitoring; a series of exciting research findings is expected in the forthcoming years.
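As a concrete illustration of the point cloud treatment discussed above, the sketch below compares two synthetic point-cloud epochs with a brute-force nearest-neighbour distance, a drastic simplification of real change-detection workflows; the patch geometry, displacement, and threshold are invented for illustration:

```python
import math

def nn_change(reference, compared, threshold=0.05):
    """Brute-force nearest-neighbour comparison of two point-cloud epochs.
    Returns per-point distances and indices of points flagged as change."""
    dists = []
    for p in compared:
        d = min(math.dist(p, q) for q in reference)  # distance to closest reference point
        dists.append(d)
    flagged = [i for i, d in enumerate(dists) if d > threshold]
    return dists, flagged

# epoch 1: a flat 'slope' patch; epoch 2: same patch with half the points raised 0.2 m
ref = [(x * 0.1, y * 0.1, 0.0) for x in range(10) for y in range(10)]
new = [(x, y, z if x < 0.5 else z + 0.2) for x, y, z in ref]
dists, flagged = nn_change(ref, new)
print(f"{len(flagged)} of {len(new)} points flagged as change")
```

Production tools would replace the brute-force search with a spatial index (k-d tree or octree) and estimate distances along local surface normals rather than point-to-point.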
Meta-Analysis in Higher Education: An Illustrative Example Using Hierarchical Linear Modeling
ERIC Educational Resources Information Center
Denson, Nida; Seltzer, Michael H.
2011-01-01
The purpose of this article is to provide higher education researchers with an illustrative example of meta-analysis utilizing hierarchical linear modeling (HLM). This article demonstrates the step-by-step process of meta-analysis using a recently-published study examining the effects of curricular and co-curricular diversity activities on racial…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lutes, Robert G.; Neubauer, Casey C.; Haack, Jereme N.
2015-03-31
The Department of Energy’s (DOE’s) Building Technologies Office (BTO) is supporting the development of an open-source software tool for analyzing building energy and operational data: OpenEIS (open energy information system). This tool addresses the problems of both owners of building data and developers of tools to analyze these data. Building owners and managers have data but lack the tools to analyze them, while tool developers lack data in a common format to ease the development of reusable data analysis tools. This document is intended for developers of applications and explains the mechanisms for building analysis applications, accessing data, and displaying data using a visualization from the included library. A brief introduction to the visualizations can be used as a jumping-off point for developers familiar with JavaScript to produce their own. Several example applications are included, which can be used along with this document to implement algorithms for performing energy data analysis.
Anderson-Cook, Christine Michaela
2017-03-01
Here, one of the substantial improvements to the practice of data analysis in recent decades is the change from reporting just a point estimate for a parameter or characteristic, to now including a summary of uncertainty for that estimate. Understanding the precision of the estimate for the quantity of interest provides better understanding of what to expect and how well we are able to predict future behavior from the process. For example, when we report a sample average as an estimate of the population mean, it is good practice to also provide a confidence interval (or credible interval, if you are doing a Bayesian analysis) to accompany that summary. This helps to calibrate what ranges of values are reasonable given the variability observed in the sample and the amount of data that were included in producing the summary.
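The practice described above is easy to sketch; the following minimal example (the data are invented) reports a sample average together with an approximate normal-theory 95% confidence interval:

```python
import math
import statistics

def mean_with_ci(sample, z=1.96):
    """Return the sample mean and an approximate 95% normal-theory
    confidence interval (mean +/- z * standard error)."""
    n = len(sample)
    m = statistics.mean(sample)
    se = statistics.stdev(sample) / math.sqrt(n)  # standard error of the mean
    return m, (m - z * se, m + z * se)

data = [9.8, 10.2, 10.1, 9.9, 10.4, 9.7, 10.0, 10.3]
m, (lo, hi) = mean_with_ci(data)
print(f"mean = {m:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

For small samples one would use a t quantile in place of 1.96, and a Bayesian analysis would instead summarize the posterior with a credible interval.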
Analysis of continuous-time switching networks
NASA Astrophysics Data System (ADS)
Edwards, R.
2000-11-01
Models of a number of biological systems, including gene regulation and neural networks, can be formulated as switching networks, in which the interactions between the variables depend strongly on thresholds. An idealized class of such networks in which the switching takes the form of Heaviside step functions but variables still change continuously in time has been proposed as a useful simplification to gain analytic insight. These networks, called here Glass networks after their originator, are simple enough mathematically to allow significant analysis without restricting the range of dynamics found in analogous smooth systems. A number of results have been obtained before, particularly regarding existence and stability of periodic orbits in such networks, but important cases were not considered. Here we present a coherent method of analysis that summarizes previous work and fills in some of the gaps as well as including some new results. Furthermore, we apply this analysis to a number of examples, including surprising long and complex limit cycles involving sequences of hundreds of threshold transitions. Finally, we show how the above methods can be extended to investigate aperiodic behaviour in specific networks, though a complete analysis will have to await new results in matrix theory and symbolic dynamics.
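A minimal sketch can make the idea of a Glass network concrete: two hypothetical variables decay linearly while their production terms switch via Heaviside step functions at a threshold, here integrated with forward Euler. The network, thresholds, and initial conditions are invented for illustration, not taken from the paper:

```python
def heaviside(u, theta=0.5):
    """Heaviside step: 1 above the threshold, 0 below."""
    return 1.0 if u > theta else 0.0

def simulate(x0, y0, dt=0.01, steps=2000, theta=0.5):
    """Forward-Euler integration of a two-variable Glass network:
    production of x is switched off by y (repression), while
    production of y is switched on by x (activation)."""
    xs, ys = [x0], [y0]
    x, y = x0, y0
    for _ in range(steps):
        fx = 1.0 - heaviside(y, theta)  # step-function production target for x
        fy = heaviside(x, theta)        # step-function production target for y
        x += dt * (fx - x)              # linear decay toward the focal target
        y += dt * (fy - y)
        xs.append(x)
        ys.append(y)
    return xs, ys

xs, ys = simulate(0.2, 0.8)
print(f"final state: x={xs[-1]:.3f}, y={ys[-1]:.3f}")
```

Within each orthant of threshold space the dynamics are linear with a fixed focal point, which is what makes the analysis of transitions between orthants tractable.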
Advanced Software for Analysis of High-Speed Rolling-Element Bearings
NASA Technical Reports Server (NTRS)
Poplawski, J. V.; Rumbarger, J. H.; Peters, S. M.; Galatis, H.; Flower, R.
2003-01-01
COBRA-AHS is a package of advanced software for analysis of rigid or flexible shaft systems supported by rolling-element bearings operating at high speeds under complex mechanical and thermal loads. These loads can include centrifugal and thermal loads generated by motions of bearing components. COBRA-AHS offers several improvements over prior commercial bearing-analysis programs: It includes innovative probabilistic fatigue-life-estimating software that provides for computation of three-dimensional stress fields and incorporates stress-based (in contradistinction to prior load-based) mathematical models of fatigue life. It interacts automatically with the ANSYS finite-element code to generate finite-element models for estimating distributions of temperature and temperature-induced changes in dimensions in iterative thermal/dimensional analyses: thus, for example, it can be used to predict changes in clearances and thermal lockup. COBRA-AHS provides an improved graphical user interface that facilitates the iterative cycle of analysis and design by providing analysis results quickly in graphical form, enabling the user to control interactive runs without leaving the program environment, and facilitating transfer of plots and printed results for inclusion in design reports. Additional features include roller-edge stress prediction and influence of shaft and housing distortion on bearing performance.
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), and we hope it contributes to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved.
The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models.
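As an illustration of one of the sample-comparison tests mentioned above, here is a self-contained sketch of the Welch two-sample t statistic (the data are invented; SOCR's own implementation is Java-based):

```python
import math
import statistics

def welch_t(sample_a, sample_b):
    """Welch two-sample t statistic and Satterthwaite degrees of freedom."""
    na, nb = len(sample_a), len(sample_b)
    ma, mb = statistics.mean(sample_a), statistics.mean(sample_b)
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    se2 = va / na + vb / nb                    # squared standard error of the difference
    t = (ma - mb) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

a = [4.1, 3.9, 4.3, 4.0, 4.2]
b = [3.4, 3.6, 3.5, 3.3, 3.7]
t, df = welch_t(a, b)
print(f"t = {t:.2f}, df = {df:.1f}")
```

A p-value would then be obtained from the t distribution with df degrees of freedom, which is where a library such as SOCR takes over.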
Parameter-expanded data augmentation for Bayesian analysis of capture-recapture models
Royle, J. Andrew; Dorazio, Robert M.
2012-01-01
Data augmentation (DA) is a flexible tool for analyzing closed and open population models of capture-recapture data, especially models which include sources of heterogeneity among individuals. The essential concept underlying DA, as we use the term, is based on adding "observations" to create a dataset composed of a known number of individuals. This new (augmented) dataset, which includes the unknown number of individuals N in the population, is then analyzed using a new model that includes a reformulation of the parameter N in the conventional model of the observed (unaugmented) data. In the context of capture-recapture models, we add a set of "all zero" encounter histories which are not, in practice, observable. The model of the augmented dataset is a zero-inflated version of either a binomial or a multinomial base model. Thus, our use of DA provides a general approach for analyzing both closed and open population models of all types. In doing so, this approach provides a unified framework for the analysis of a huge range of models that are treated as unrelated "black boxes" and named procedures in the classical literature. As a practical matter, analysis of the augmented dataset by MCMC is greatly simplified compared to other methods that require specialized algorithms. For example, complex capture-recapture models of an augmented dataset can be fitted with popular MCMC software packages (WinBUGS or JAGS) by providing a concise statement of the model's assumptions that usually involves only a few lines of pseudocode. In this paper, we review the basic technical concepts of data augmentation, and we provide examples of analyses of closed-population models (M0, Mh, distance sampling, and spatial capture-recapture models) and open-population models (Jolly-Seber) with individual effects.
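To make the augmentation idea concrete, here is a small maximum-likelihood sketch (not the authors' Bayesian MCMC approach) of the closed-population model M0 under data augmentation: the observed detection frequencies are padded with "all zero" histories, and the zero-inflated binomial likelihood is maximized over a grid. All numbers are simulated for illustration:

```python
import math
import random
from collections import Counter

def m0_da_estimate(counts, T, M):
    """Grid-search ML for model M0 under data augmentation.
    counts: detection frequencies of the n observed individuals; the dataset
    is padded to M individuals with 'all zero' encounter histories.
    Marginal likelihood per individual: psi*Binom(y; T, p) + (1-psi)*[y == 0]."""
    n = len(counts)
    freq = Counter(counts + [0] * (M - n))
    best = (-math.inf, None, None)
    for psi in (i / 100 for i in range(1, 100)):
        for p in (i / 100 for i in range(1, 100)):
            ll = 0.0
            for y, f in freq.items():
                binom = math.comb(T, y) * p**y * (1 - p) ** (T - y)
                ll += f * math.log(psi * binom + (1 - psi) * (y == 0))
            if ll > best[0]:
                best = (ll, psi, p)
    _, psi_hat, p_hat = best
    # probability that an all-zero history belongs to a real, undetected individual
    q0 = psi_hat * (1 - p_hat) ** T
    return n + (M - n) * q0 / (q0 + 1 - psi_hat), p_hat

random.seed(1)
N_true, T, M = 50, 5, 150           # true abundance, occasions, augmented size
counts = [y for y in (sum(random.random() < 0.3 for _ in range(T))
                      for _ in range(N_true)) if y > 0]
N_hat, p_hat = m0_da_estimate(counts, T, M)
print(f"observed n = {len(counts)}, N_hat = {N_hat:.1f}, p_hat = {p_hat:.2f}")
```

In the Bayesian formulation described in the abstract, psi gets a prior and the all-zero inflation is handled by latent indicator variables inside the MCMC sampler rather than by this marginalized grid search.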
Blom, H; Gösch, M
2004-04-01
Over the past few years we have witnessed a tremendous surge of interest in so-called array-based miniaturised analytical systems due to their value as extremely powerful tools for high-throughput sequence analysis, drug discovery and development, and diagnostic tests in medicine (see articles in Issue 1). Terminologies that have been used to describe these array-based bioscience systems include (but are not limited to): DNA-chip, microarrays, microchip, biochip, DNA-microarrays and genome chip. Potential technological benefits of introducing these miniaturised analytical systems include improved accuracy, multiplexing, lower sample and reagent consumption, disposability, and decreased analysis times, to mention just a few examples. Among the many alternative principles of detection-analysis (e.g. chemiluminescence, electroluminescence and conductivity), fluorescence-based techniques are widely used, examples being fluorescence resonance energy transfer, fluorescence quenching, fluorescence polarisation, time-resolved fluorescence, and fluorescence fluctuation spectroscopy (see articles in Issue 11). Time-dependent fluctuations of fluorescent biomolecules with different molecular properties, such as molecular weight, translational and rotational diffusion time, colour and lifetime, potentially provide all the kinetic and thermodynamic information required in analysing complex interactions. In this mini-review article, we present recent extensions aimed at implementing parallel laser excitation and parallel fluorescence detection that can lead to an even further increase in throughput in miniaturised array-based analytical systems. We also report on the development and characterisation of a multiplexing extension that allows multifocal laser excitation together with matched parallel fluorescence detection for parallel confocal dynamical fluorescence fluctuation studies at the single-biomolecule level.
Combining historical and geomorphological information to investigate earthquake induced landslides
NASA Astrophysics Data System (ADS)
Cardinali, M.; Ferrari, G.; Galli, M.; Guidoboni, E.; Guzzetti, F.
2003-04-01
Landslides are caused by many different triggers, including earthquakes. In Italy, a detailed new-generation catalogue of information on historical earthquakes for the period 461 B.C. to 1997 is available (Catalogue of Strong Italian Earthquakes from 461 B.C. to 1997, ING-SGA 2000). The catalogue lists 548 earthquakes and provides information on a total of about 450 mass movements triggered by 118 seismic events. The information on earthquake-induced landslides listed in the catalogue was obtained through the careful scrutiny of historical documents and chronicles, but was rarely checked in the field. We report on an attempt to combine the available historical information on landslides caused by earthquakes with standard geomorphological techniques, including the interpretation of aerial photographs and field surveys, to better determine the location, type and distribution of seismically induced historical slope failures. We present four examples in the Central Apennines. The first example describes a rock slide triggered by the 1279 April 30 Umbria-Marche Apennines earthquake (Io = IX) at Serravalle, along the Chienti River (Central Italy). The landslide is the oldest known earthquake-induced slope failure in Italy. The second example describes the location of 2 large landslides triggered by the 1584 September 10 earthquake (Io = IX) at San Piero in Bagno, along the Savio River (Northern Italy). The landslides were subsequently largely modified by mass movements that occurred in 1855, making the recognition of the original seismically induced failures difficult, if not impossible. In the third example we present the geographical distribution of the available information on landslide events triggered by 8 earthquakes in Central Valnerina in the period 1703 to 1979. A comparison with the location of landslides triggered by the September-October 1997 Umbria-Marche earthquake sequence is presented.
The fourth example describes the geographical distribution of the available information on landslides triggered by the great 1915 January 13 Marsica (Central Italy) earthquake (Io = XI) mostly along the Liri River valley. Problems encountered in matching the recent historical information with the local geomorphological setting are discussed. A critical analysis of the four studied examples allows general considerations on the advantages and limitations of a combined historical and geomorphological approach to investigate past earthquake induced landslides. Lastly, a preliminary analysis of the relationship between the earthquake intensity and the distance of the known slope failures to the triggering earthquake epicentres is presented, for the four investigated areas and for the entire catalogue of historical earthquakes.
Representation of scientific methodology in secondary science textbooks
NASA Astrophysics Data System (ADS)
Binns, Ian C.
The purpose of this investigation was to assess the representation of scientific methodology in secondary science textbooks. More specifically, this study looked at how textbooks introduced scientific methodology and to what degree the examples from the rest of the textbook, the investigations, and the images were consistent with the text's description of scientific methodology, if at all. The sample included eight secondary science textbooks from two publishers, McGraw-Hill/Glencoe and Harcourt/Holt, Rinehart & Winston. Data consisted of all student text and teacher text that referred to scientific methodology. Second, all investigations in the textbooks were analyzed. Finally, any images that depicted scientists working were also collected and analyzed. The text analysis and activity analysis used the ethnographic content analysis approach developed by Altheide (1996). The rubrics used for the text analysis and activity analysis were initially guided by the Benchmarks (AAAS, 1993), the NSES (NRC, 1996), and the nature of science literature. Preliminary analyses helped to refine each of the rubrics and grounded them in the data. Image analysis used stereotypes identified in the DAST literature. Findings indicated that all eight textbooks presented mixed views of scientific methodology in their initial descriptions. Five textbooks placed more emphasis on the traditional view and three placed more emphasis on the broad view. Results also revealed that the initial descriptions, examples, investigations, and images all emphasized the broad view for Glencoe Biology and the traditional view for Chemistry: Matter and Change. The initial descriptions, examples, investigations, and images in the other six textbooks were not consistent. Overall, the textbook with the most appropriate depiction of scientific methodology was Glencoe Biology and the textbook with the least appropriate depiction of scientific methodology was Physics: Principles and Problems. 
These findings suggest that compared to earlier investigations, textbooks have begun to improve in how they represent scientific methodology. However, there is still much room for improvement. Future research needs to consider how textbooks impact teachers' and students' understandings of scientific methodology.
Bujkiewicz, Sylwia; Riley, Richard D
2016-01-01
Multivariate random-effects meta-analysis allows the joint synthesis of correlated results from multiple studies, for example, for multiple outcomes or multiple treatment groups. In a Bayesian univariate meta-analysis of one endpoint, the importance of specifying a sensible prior distribution for the between-study variance is well understood. However, in multivariate meta-analysis, there is little guidance about the choice of prior distributions for the variances or, crucially, the between-study correlation, ρB; for the latter, researchers often use a Uniform(−1,1) distribution assuming it is vague. In this paper, an extensive simulation study and a real illustrative example is used to examine the impact of various (realistically) vague prior distributions for ρB and the between-study variances within a Bayesian bivariate random-effects meta-analysis of two correlated treatment effects. A range of diverse scenarios are considered, including complete and missing data, to examine the impact of the prior distributions on posterior results (for treatment effect and between-study correlation), amount of borrowing of strength, and joint predictive distributions of treatment effectiveness in new studies. Two key recommendations are identified to improve the robustness of multivariate meta-analysis results. First, the routine use of a Uniform(−1,1) prior distribution for ρB should be avoided, if possible, as it is not necessarily vague. Instead, researchers should identify a sensible prior distribution, for example, by restricting values to be positive or negative as indicated by prior knowledge. Second, it remains critical to use sensible (e.g. empirically based) prior distributions for the between-study variances, as an inappropriate choice can adversely impact the posterior distribution for ρB, which may then adversely affect inferences such as joint predictive probabilities. These recommendations are especially important with a small number of studies and missing data. 
PMID:26988929
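The "borrowing of strength" mechanism examined in the abstract above can be illustrated with a toy calculation (all numbers invented, not from the paper): under a bivariate normal between-study model, the conditional mean of the second effect given the first shifts by rho*(tau2/tau1)*(theta1 - mu1), so the prior placed on the between-study correlation rho_B directly drives how much one outcome informs the other:

```python
import random

def conditional_effect(rho, tau1, tau2, theta1, mu1, mu2):
    """Conditional mean and variance of the second treatment effect given
    the first, under a bivariate normal between-study model."""
    mean = mu2 + rho * (tau2 / tau1) * (theta1 - mu1)
    var = tau2**2 * (1 - rho**2)
    return mean, var

random.seed(0)
tau1 = tau2 = 0.3
mu1 = mu2 = 0.0
theta1 = 0.3  # effect on outcome 1, one between-study SD above its mean

def shift(rho):
    """Prior-implied shift in the expected second effect."""
    return conditional_effect(rho, tau1, tau2, theta1, mu1, mu2)[0]

vague = [shift(random.uniform(-1, 1)) for _ in range(10000)]   # 'vague' U(-1,1)
signed = [shift(random.uniform(0, 1)) for _ in range(10000)]   # sign-restricted
print(f"mean shift, U(-1,1): {sum(vague)/len(vague):+.3f}")
print(f"mean shift, U(0,1):  {sum(signed)/len(signed):+.3f}")
```

Averaged over a Uniform(-1,1) prior the implied shift cancels to zero, whereas restricting rho_B to be positive, as the authors recommend when prior knowledge supports it, yields systematic borrowing of strength.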
Davidson, Natalie R; Godfrey, Keith R; Alquaddoomi, Faisal; Nola, David; DiStefano, Joseph J
2017-05-01
We describe and illustrate the use of DISTING, a novel web application for computing alternative structurally identifiable linear compartmental models that are input-output indistinguishable from a postulated linear compartmental model. Several computer packages are available for analysing the structural identifiability of such models, but DISTING is the first to be made available for assessing indistinguishability. The computational algorithms embedded in DISTING are based on advanced versions of established geometric and algebraic properties of linear compartmental models, embedded in a user-friendly graphical user interface. Novel computational tools greatly speed up the overall procedure. These include algorithms for Jacobian matrix reduction, submatrix rank reduction, and parallelization of candidate rank computations in symbolic matrix analysis. The application of DISTING to three postulated models with, respectively, two, three and four compartments is given. The two-compartment example is used to illustrate the indistinguishability problem; the original (unidentifiable) model is found to have two structurally identifiable models that are indistinguishable from it. The three-compartment example has three structurally identifiable indistinguishable models. It is found from DISTING that the four-compartment example has five structurally identifiable models indistinguishable from the original postulated model. This example shows that care is needed when dealing with models that have two or more compartments which are neither perturbed nor observed, because the numbering of these compartments may be arbitrary. DISTING is universally and freely available via the Internet. It is easy to use and circumvents tedious and complicated algebraic analysis previously done by hand. Copyright © 2017 Elsevier B.V. All rights reserved.
Sources of Infrasound events listed in IDC Reviewed Event Bulletin
NASA Astrophysics Data System (ADS)
Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif; Medinskaya, Tatiana; Mialle, Pierrick
2017-04-01
Until 2003, two waveform technologies, i.e. seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003; however, automatic processing required significant improvements to reduce the number of false events. In the beginning of 2010 the infrasound technology was reintroduced to IDC operations and has since contributed to both automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which significantly improves event location. Example sources of REB events detected by the International Monitoring System (IMS) infrasound network are fireballs (e.g. the Bangkok fireball, 2015), volcanic eruptions (e.g. Calbuco, Chile, 2015) and large surface explosions (e.g. Tianjin, China, 2015). Quarry blasts (e.g. Zheleznogorsk) and large earthquakes (e.g. Italy, 2016) belong to events primarily recorded at seismic stations of the IMS network but are often detected at the infrasound stations as well. In the case of earthquakes, analysis of infrasound signals may help to estimate the area affected by ground vibration. Infrasound associations to quarry blast events may help to obtain a better source location. The role of IDC analysts is to verify and improve the location of events detected by the automatic system and to add events that were missed in the automatic process. Open-source materials may help to identify the nature of some events. Well-recorded examples may be added to the Reference Infrasound Event Database to aid the analysis process. This presentation will provide examples of events generated by different sources which were included in the IDC bulletins.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Y. Q.; Shemon, E. R.; Mahadevan, Vijay S.
SHARP, developed under the NEAMS Reactor Product Line, is an advanced modeling and simulation toolkit for the analysis of advanced nuclear reactors. SHARP comprises three physics modules, currently including neutronics, thermal hydraulics, and structural mechanics. SHARP empowers designers to produce accurate results for modeling physical phenomena that have been identified as important for nuclear reactor analysis. SHARP can use existing physics codes and take advantage of existing infrastructure capabilities in the MOAB framework and the coupling driver/solver library, the Coupled Physics Environment (CouPE), which utilizes the widely used, scalable PETSc library. This report aims at identifying the coupled-physics simulation capability of SHARP by introducing the demonstration example called sahex in advance of the SHARP release expected by Mar 2016. sahex consists of 6 fuel pins with cladding, 1 control rod, sodium coolant and an outer duct wall that encloses all the other components. This example is carefully chosen to demonstrate the proof of concept for solving more complex demonstration examples such as an EBR-II assembly and an ABTR full core. The workflow of preparing the input files, running the case and analyzing the results is demonstrated in this report. Moreover, an extension of the sahex model called sahex_core, which adds six homogenized neighboring assemblies to the fully heterogeneous sahex model, is presented to test homogenization capabilities in both Nek5000 and PROTEUS. Some primary information on the configuration and build aspects of the SHARP toolkit, which includes the capability to auto-download dependencies and configure/install with optimal flags in an architecture-aware fashion, is also covered by this report. Step-by-step instructions are provided to help users create their own cases. Details on these processes will be provided in the SHARP user manual that will accompany the first release.
Klein, Johannes; Leupold, Stefan; Biegler, Ilona; Biedendieck, Rebekka; Münch, Richard; Jahn, Dieter
2012-09-01
Time-lapse imaging in combination with fluorescence microscopy techniques enables the investigation of gene regulatory circuits and has uncovered phenomena such as culture heterogeneity. In this context, computational image processing for the analysis of single-cell behaviour plays an increasing role in systems biology and mathematical modelling approaches. Consequently, we developed a software package with a graphical user interface for the analysis of single bacterial cell behaviour. The new software, called TLM-Tracker, allows for flexible and user-friendly segmentation, tracking and lineage analysis of microbial cells in time-lapse movies. The software package, including manual, tutorial video and examples, is available as Matlab code or executable binaries at http://www.tlmtracker.tu-bs.de.
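The frame-to-frame tracking step such a tool performs can be sketched with a greedy nearest-centroid linker; the coordinates below are invented, and TLM-Tracker itself uses more elaborate Matlab routines for segmentation and lineage reconstruction:

```python
import math

def link_frames(prev_cells, next_cells, max_dist=10.0):
    """Greedy nearest-centroid linking of segmented cells between two
    consecutive time-lapse frames; returns {prev_index: next_index}."""
    links, used = {}, set()
    for i, p in enumerate(prev_cells):
        best, best_d = None, max_dist
        for j, q in enumerate(next_cells):
            d = math.dist(p, q)
            if j not in used and d < best_d:   # closest unclaimed candidate
                best, best_d = j, d
        if best is not None:
            links[i] = best
            used.add(best)                     # one-to-one assignment
    return links

frame1 = [(10.0, 10.0), (30.0, 12.0), (50.0, 40.0)]
frame2 = [(11.0, 10.5), (31.5, 12.2), (52.0, 41.0), (70.0, 70.0)]
print(link_frames(frame1, frame2))
```

Centroids in the second frame left unassigned (here the one at (70, 70)) would be treated as newly appearing cells, e.g. candidates for a division event in the lineage analysis.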
Next-Generation Climate Modeling Science Challenges for Simulation, Workflow and Analysis Systems
NASA Astrophysics Data System (ADS)
Koch, D. M.; Anantharaj, V. G.; Bader, D. C.; Krishnan, H.; Leung, L. R.; Ringler, T.; Taylor, M.; Wehner, M. F.; Williams, D. N.
2016-12-01
We will present two examples of current and future high-resolution climate-modeling research that are challenging existing simulation run-time I/O, model-data movement, storage and publishing, and analysis. In each case, we will consider lessons learned as current workflow systems are broken by these large-data science challenges, as well as strategies to repair or rebuild the systems. First we consider the science and workflow challenges to be posed by the CMIP6 multi-model HighResMIP, involving around a dozen modeling groups performing quarter-degree simulations, in 3-member ensembles for 100 years, with high-frequency (1-6 hourly) diagnostics, which is expected to generate over 4PB of data. An example of science derived from these experiments will be to study how resolution affects the ability of models to capture extreme-events such as hurricanes or atmospheric rivers. Expected methods to transfer (using parallel Globus) and analyze (using parallel "TECA" software tools) HighResMIP data for such feature-tracking by the DOE CASCADE project will be presented. A second example will be from the Accelerated Climate Modeling for Energy (ACME) project, which is currently addressing challenges involving multiple century-scale coupled high resolution (quarter-degree) climate simulations on DOE Leadership Class computers. ACME is anticipating production of over 5PB of data during the next 2 years of simulations, in order to investigate the drivers of water cycle changes, sea-level-rise, and carbon cycle evolution. The ACME workflow, from simulation to data transfer, storage, analysis and publication will be presented. Current and planned methods to accelerate the workflow, including implementing run-time diagnostics, and implementing server-side analysis to avoid moving large datasets will be presented.
NASA Technical Reports Server (NTRS)
Krueger, Ronald
2012-01-01
The development of benchmark examples for quasi-static delamination propagation and cyclic delamination onset and growth prediction is presented and demonstrated for Abaqus/Standard. The example is based on a finite element model of a Double-Cantilever Beam specimen. The example is independent of the analysis software used and allows the assessment of the automated delamination propagation, onset and growth prediction capabilities in commercial finite element codes based on the virtual crack closure technique (VCCT). First, a quasi-static benchmark example was created for the specimen. Second, based on the static results, benchmark examples for cyclic delamination growth were created. Third, the load-displacement relationship from a propagation analysis and the benchmark results were compared, and good agreement could be achieved by selecting the appropriate input parameters. Fourth, starting from an initially straight front, the delamination was allowed to grow under cyclic loading. The number of cycles to delamination onset and the number of cycles during delamination growth for each growth increment were obtained from the automated analysis and compared to the benchmark examples. Again, good agreement between the results obtained from the growth analysis and the benchmark results could be achieved by selecting the appropriate input parameters. The benchmarking procedure proved valuable by highlighting the issues associated with choosing the input parameters of the particular implementation. Selecting the appropriate input parameters, however, was not straightforward and often required an iterative procedure. Overall the results are encouraging, but further assessment for mixed-mode delamination is required.
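For a Double Cantilever Beam, the static benchmark solution can be approximated from simple beam theory: the delamination propagates when the mode I energy release rate G = P^2 a^2 / (b E I) reaches the toughness GIc. The sketch below uses assumed, illustrative specimen properties (not values from the paper) to trace the critical load and opening displacement as the delamination grows:

```python
import numpy as np

# Illustrative DCB specimen properties (assumed here, not taken from the paper).
E = 1.3e11    # Pa, longitudinal modulus
b = 0.025     # m, specimen width
h = 0.0015    # m, thickness of one arm
GIc = 170.0   # J/m^2, mode I fracture toughness

I = b * h**3 / 12.0   # second moment of area of one arm

def critical_load(a):
    # Simple beam theory: G = P^2 a^2 / (b E I); propagation when G = GIc.
    return np.sqrt(GIc * b * E * I) / a

def opening_displacement(P, a):
    # Each arm deflects P a^3 / (3 E I); the total opening is twice that.
    return 2.0 * P * a**3 / (3.0 * E * I)

a = np.linspace(0.03, 0.06, 7)   # m, delamination lengths
P = critical_load(a)
d = opening_displacement(P, a)
for ai, Pi, di in zip(a, P, d):
    print(f"a={ai*1000:5.1f} mm  P={Pi:6.1f} N  delta={di*1000:5.2f} mm")
```

A table like this, with the critical load falling as the delamination lengthens, is the kind of benchmark curve against which an automated VCCT propagation analysis can be compared.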
Development of Benchmark Examples for Static Delamination Propagation and Fatigue Growth Predictions
NASA Technical Reports Server (NTRS)
Krueger, Ronald
2011-01-01
The development of benchmark examples for static delamination propagation and cyclic delamination onset and growth prediction is presented and demonstrated for a commercial code. The example is based on a finite element model of an End-Notched Flexure (ENF) specimen. The example is independent of the analysis software used and allows the assessment of the automated delamination propagation, onset and growth prediction capabilities in commercial finite element codes based on the virtual crack closure technique (VCCT). First, static benchmark examples were created for the specimen. Second, based on the static results, benchmark examples for cyclic delamination growth were created. Third, the load-displacement relationship from a propagation analysis and the benchmark results were compared, and good agreement could be achieved by selecting the appropriate input parameters. Fourth, starting from an initially straight front, the delamination was allowed to grow under cyclic loading. The number of cycles to delamination onset and the number of cycles during stable delamination growth for each growth increment were obtained from the automated analysis and compared to the benchmark examples. Again, good agreement between the results obtained from the growth analysis and the benchmark results could be achieved by selecting the appropriate input parameters. The benchmarking procedure proved valuable by highlighting the issues associated with the input parameters of the particular implementation. Selecting the appropriate input parameters, however, was not straightforward and often required an iterative procedure. Overall, the results are encouraging but further assessment for mixed-mode delamination is required.
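For the ENF specimen, the static benchmark can likewise be approximated from classical beam theory: the mode II energy release rate G_II = 9 P^2 a^2 / (16 E b^2 h^3) drives propagation when it reaches GIIc. The properties below are assumed for illustration and are not the paper's benchmark values:

```python
import numpy as np

# Illustrative ENF specimen properties (assumed here, not taken from the paper).
E = 1.3e11     # Pa, longitudinal modulus
b = 0.025      # m, specimen width
h = 0.0015     # m, thickness of one arm (half the specimen thickness)
L = 0.05       # m, half-span of the three-point bend fixture
GIIc = 1000.0  # J/m^2, mode II fracture toughness

def critical_load(a):
    # Classical beam theory: G_II = 9 P^2 a^2 / (16 E b^2 h^3);
    # the delamination advances when G_II reaches GIIc.
    return np.sqrt(16.0 * GIIc * E * b**2 * h**3 / (9.0 * a**2))

def center_displacement(P, a):
    # Beam-theory compliance: C = (2 L^3 + 3 a^3) / (8 E b h^3).
    return P * (2.0 * L**3 + 3.0 * a**3) / (8.0 * E * b * h**3)

a = np.linspace(0.02, 0.035, 6)   # m, delamination lengths
P = critical_load(a)
d = center_displacement(P, a)
for ai, Pi, di in zip(a, P, d):
    print(f"a={ai*1000:5.1f} mm  P={Pi:7.1f} N  delta={di*1000:5.2f} mm")
```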
Apel, William A.; Thompson, Vicki S; Lacey, Jeffrey A.; Gentillon, Cynthia A.
2016-08-09
A method for determining a plurality of proteins for discriminating and positively identifying an individual based on a biological sample. The method may include profiling a biological sample from a plurality of individuals against a protein array including a plurality of proteins. The protein array may include proteins attached to a support in a preselected pattern such that locations of the proteins are known. The biological sample may be contacted with the protein array such that a portion of antibodies in the biological sample reacts with and binds to the proteins, forming immune complexes. A statistical analysis method, such as discriminant analysis, may be performed to determine discriminating proteins for distinguishing individuals. Proteins of interest may be used to form a protein array. Such a protein array may be used, for example, to compare a forensic sample from an unknown source with a sample from a known source.
Thompson, Vicki S; Lacey, Jeffrey A; Gentillon, Cynthia A; Apel, William A
2015-03-03
A method for determining a plurality of proteins for discriminating and positively identifying an individual based on a biological sample. The method may include profiling a biological sample from a plurality of individuals against a protein array including a plurality of proteins. The protein array may include proteins attached to a support in a preselected pattern such that locations of the proteins are known. The biological sample may be contacted with the protein array such that a portion of antibodies in the biological sample reacts with and binds to the proteins, forming immune complexes. A statistical analysis method, such as discriminant analysis, may be performed to determine discriminating proteins for distinguishing individuals. Proteins of interest may be used to form a protein array. Such a protein array may be used, for example, to compare a forensic sample from an unknown source with a sample from a known source.
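The discriminant-analysis step can be sketched with a simple Fisher score, which ranks each protein on the array by between-person versus within-person variation in reactivity. The data, labels, and informative protein indices below are synthetic, and the Fisher score stands in for whatever discriminant procedure the patent actually applies:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic reactivity profiles: rows = repeat samples, cols = array proteins.
n_per_person, n_proteins = 10, 50
person_a = rng.normal(0.0, 1.0, (n_per_person, n_proteins))
person_b = rng.normal(0.0, 1.0, (n_per_person, n_proteins))
# Make a handful of proteins genuinely discriminating for person B.
informative = [3, 17, 42]
person_b[:, informative] += 3.0

def fisher_score(x, y):
    """Between-class over within-class variance, computed per protein."""
    mx, my = x.mean(0), y.mean(0)
    vx, vy = x.var(0, ddof=1), y.var(0, ddof=1)
    return (mx - my) ** 2 / (vx + vy + 1e-12)

scores = fisher_score(person_a, person_b)
top = np.argsort(scores)[::-1][:3]
print("top discriminating proteins:", sorted(int(i) for i in top))
```

The highest-scoring proteins would then be the candidates carried forward onto a smaller, purpose-built array for comparing forensic samples.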
Probability of Failure Analysis Standards and Guidelines for Expendable Launch Vehicles
NASA Astrophysics Data System (ADS)
Wilde, Paul D.; Morse, Elisabeth L.; Rosati, Paul; Cather, Corey
2013-09-01
Recognizing the central importance of probability of failure estimates to ensuring public safety for launches, the Federal Aviation Administration (FAA), Office of Commercial Space Transportation (AST), the National Aeronautics and Space Administration (NASA), and U.S. Air Force (USAF), through the Common Standards Working Group (CSWG), developed a guide for conducting valid probability of failure (POF) analyses for expendable launch vehicles (ELV), with an emphasis on POF analysis for new ELVs. A probability of failure analysis for an ELV produces estimates of the likelihood of occurrence of potentially hazardous events, which are critical inputs to launch risk analysis of debris, toxic, or explosive hazards. This guide is intended to document a framework for POF analyses commonly accepted in the US, and should be useful to anyone who performs or evaluates launch risk analyses for new ELVs. The CSWG guidelines provide performance standards and definitions of key terms, and are being revised to address allocation to flight times and vehicle response modes. The POF performance standard allows a launch operator to employ alternative, potentially innovative methodologies so long as the results satisfy the performance standard. Current POF analysis practice at US ranges includes multiple methodologies described in the guidelines as accepted methods, but not necessarily the only methods available to demonstrate compliance with the performance standard. The guidelines include illustrative examples for each POF analysis method, which are intended to illustrate an acceptable level of fidelity for ELV POF analyses used to ensure public safety. The focus is on providing guiding principles rather than "recipe lists." Independent reviews of these guidelines were performed to assess their logic, completeness, accuracy, self-consistency, consistency with risk analysis practices, use of available information, and ease of applicability.
The independent reviews confirmed the general validity of the performance standard approach and suggested potential updates to improve the accuracy of each of the example methods, especially to address reliability growth.
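One generic family of POF methods for vehicles with little flight history is Bayesian updating of a prior failure-probability estimate with observed launch outcomes. The beta-binomial sketch below is an illustration with hypothetical numbers, not a CSWG-prescribed recipe:

```python
# Beta-binomial update: an illustrative POF estimate with made-up numbers.
# Prior belief about per-flight failure probability for a new ELV, encoded as
# Beta(alpha, beta); the values below are a hypothetical, pessimistic prior
# of the kind often assigned to a maiden flight.
alpha, beta = 1.0, 2.0          # prior mean failure probability of 1/3

flights = [1, 1, 0, 1, 1, 1]    # 1 = success, 0 = failure (hypothetical record)
failures = flights.count(0)
successes = flights.count(1)

# Conjugate update: failures add to alpha, successes add to beta.
alpha_post = alpha + failures
beta_post = beta + successes

pof_mean = alpha_post / (alpha_post + beta_post)
print(f"posterior mean probability of failure: {pof_mean:.3f}")
```

The appeal of such an approach for new ELVs is that it degrades gracefully: with no flight data it returns the engineering prior, and as outcomes accumulate the estimate is pulled toward the demonstrated failure rate, which is one way to represent reliability growth.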
Quantitative Medical Image Analysis for Clinical Development of Therapeutics
NASA Astrophysics Data System (ADS)
Analoui, Mostafa
There has been significant progress in the development of therapeutics for the prevention and management of several disease areas in recent years, leading to increased average life expectancy, as well as quality of life, globally. However, due to the complexity of addressing a number of medical needs and the financial burden of developing new classes of therapeutics, there is a need for better tools for decision making and for validating the efficacy and safety of new compounds. Numerous biological markers (biomarkers) have been proposed either as adjuncts to current clinical endpoints or as surrogates. Imaging biomarkers, among the most rapidly growing categories, are being examined to expedite effective and rational drug development. Clinical imaging often involves a complex set of multi-modality data sets that require rapid and objective analysis, independent of the reviewer's bias and training. In this chapter, an overview of imaging biomarkers for drug development is offered, along with the challenges that necessitate quantitative and objective image analysis. Examples of automated and semi-automated analysis approaches are provided, along with a technical review of such methods. These examples include the use of 3D MRI for osteoarthritis, ultrasound vascular imaging, and dynamic contrast-enhanced MRI for oncology. Additionally, a brief overview of regulatory requirements is discussed. In conclusion, this chapter highlights key challenges and future directions in this area.
Interactive Visualization of Healthcare Data Using Tableau.
Ko, Inseok; Chang, Hyejung
2017-10-01
Big data analysis is receiving increasing attention in many industries, including healthcare. Visualization plays an important role not only in intuitively showing the results of data analysis but also in the whole process of collecting, cleaning, analyzing, and sharing data. This paper presents a procedure for the interactive visualization and analysis of healthcare data using Tableau as a business intelligence tool. Starting with installation of the Tableau Desktop Personal version 10.3, this paper describes the process of understanding and visualizing healthcare data using an example. The example data of colon cancer patients were obtained from health insurance claims in years 2012 and 2013, provided by the Health Insurance Review and Assessment Service. To introduce beginners to the visualization of healthcare data using Tableau, this paper describes the creation of a simple view for the average length of stay of colon cancer patients. Since Tableau provides various visualizations and customizations, the level of analysis can be increased with small multiples, view filtering, marks cards, and Tableau charts. Tableau is software that helps users explore and understand their data by creating interactive visualizations. Its advantages are that it can be used in conjunction with almost any database, and that it is easy to use: interactive visualizations in the desired format can be created by dragging and dropping.
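The aggregation behind the example view, the average length of stay per claim year, is a plain group-and-average and can be reproduced outside Tableau in a few lines. The records below are fabricated stand-ins for the insurance claims data:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical claim records mimicking the paper's colon cancer example.
claims = [
    {"year": 2012, "length_of_stay": 12},
    {"year": 2012, "length_of_stay": 9},
    {"year": 2012, "length_of_stay": 15},
    {"year": 2013, "length_of_stay": 11},
    {"year": 2013, "length_of_stay": 8},
]

# Group lengths of stay by claim year, then average each group,
# which is what dragging Year to Columns and AVG(Length of Stay)
# to Rows does in a Tableau view.
by_year = defaultdict(list)
for row in claims:
    by_year[row["year"]].append(row["length_of_stay"])

for year in sorted(by_year):
    print(year, f"avg LOS = {mean(by_year[year]):.1f} days")
```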
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
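The core PFA idea, propagating parameter and modeling uncertainty through an engineering failure model to obtain a failure-probability estimate for a specific failure mode, can be sketched with a Monte Carlo over an assumed Basquin-type fatigue law. This is illustrative only; the models, priors, and software documented for PFA are far more detailed:

```python
import numpy as np

rng = np.random.default_rng(2)

# Basquin-type life model with uncertain parameters (assumed for illustration;
# not the fatigue models documented for the PFA methodology itself).
n = 100_000
log_A = rng.normal(12.0, 0.5, n)   # uncertainty in material fatigue capability
m = 3.0                            # fatigue exponent, taken as known here
S = rng.normal(100.0, 10.0, n)     # MPa, uncertain operating stress amplitude

# Cycles to failure per Monte Carlo sample: N_f = A * S^(-m).
N_f = 10.0**log_A * S**(-m)

service_cycles = 1.0e5
pof = float((N_f < service_cycles).mean())
print(f"probability of failure within {service_cycles:.0f} cycles: {pof:.4f}")
```

In the full methodology, a sampled distribution like this would then be updated with test or flight experience rather than used directly.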