Multidisciplinary Optimization Methods for Aircraft Preliminary Design
NASA Technical Reports Server (NTRS)
Kroo, Ilan; Altus, Steve; Braun, Robert; Gage, Peter; Sobieski, Ian
1994-01-01
This paper describes a research program aimed at improved methods for multidisciplinary design and optimization of large-scale aeronautical systems. The research involves new approaches to system decomposition, interdisciplinary communication, and methods of exploiting coarse-grained parallelism for analysis and optimization. A new architecture involving a tight coupling between optimization and analysis is intended to improve efficiency while simplifying the structure of multidisciplinary, computation-intensive design problems involving many analysis disciplines and perhaps hundreds of design variables. Work in two areas is described here: system decomposition using compatibility constraints to simplify the analysis structure and take advantage of coarse-grained parallelism; and collaborative optimization, a decomposition of the optimization process to permit parallel design and to simplify interdisciplinary communication requirements.
Baxter, Susan; Muir, Delia; Brereton, Louise; Allmark, Christine; Barber, Rosemary; Harris, Lydia; Hodges, Brian; Khan, Samaira; Baird, Wendy
2016-01-01
The National Institute for Health Research (NIHR) Research Design Service (RDS) for Yorkshire and Humber has been running a public involvement funding scheme since 2008. This scheme awards researchers a small amount of money to help them get involvement from patients and/or the public. Involvement activities take place at the time when researchers are planning studies, and when they are completing application forms to request funding for a proposed research project. After the public involvement activities researchers are asked to write a report for the RDS describing what they did with the public involvement funding. This study analysed those reports using an approach which included members of a public involvement panel in the data analysis process. The aim of the work was to see what the views and experiences of researchers who received funding were, and what might be learned for the future of the scheme. Twenty-five reports were analysed. Four main themes were identified; these described: the added value of public involvement; aspects to consider when planning and designing public involvement; different roles of public contributors; and aspects of valuing public member contributions. The group approach to analysis was successful in enabling involvement of a variety of individuals in the process. The findings of the study provide evidence of the value of public involvement during the development of applications for research funding. The results also indicate that researchers recognise the variety in potential roles for the public in research, and acknowledge how involvement adds value to studies. Background A regional Research Design Service, funded by the National Institute for Health Research, introduced a small grant in 2008, to support public involvement (often known as patient and public involvement [PPI]) activities during the development of applications for research funding. Successful applicants are requested to submit a report detailing how the grant money was used, including a description of the aims and outcomes of the public involvement activities. The purpose of this study was to analyse the content of these reports. We aimed to find out what researcher views and experiences of public involvement activities were, and what lessons might be learned. Methods We used an innovative method of data analysis, drawing on group participatory approaches, qualitative content analysis, and Framework Analysis to sort and label the content of the reports. We developed a framework of categories and sub-categories (or themes and sub-themes) from this process. Results Twenty-five documents were analysed. Four main themes were identified in the data: the added value of public involvement; planning and designing involvement; the role of public members; and valuing public member contributions. Within these themes, sub-themes related to the timing of involvement (prior to the research study/intended during the research study), and also specific benefits of public involvement such as: validating ideas; ensuring appropriate outcomes; ensuring the acceptability of data collection methods/tools and advice regarding research processes. Other sub-themes related to: finding and approaching public members; timing of events; training/support; the format of sessions; setting up public involvement panels; use of public contributors in analysis and interpretation of data; and using public members to assist with dissemination and translation into practice.
Conclusions The analysis of reports submitted by researchers following involvement events provides evidence of the value of public involvement during the development of applications for research funding, and details a method for involving members of the public in data analysis which could be of value to other researchers. The findings of the analysis indicate recognition amongst researchers of the variety in potential roles for public members in research, and also an acknowledgement of how involvement adds value to studies.
Methods for Conducting Cognitive Task Analysis for a Decision Making Task.
1996-01-01
Cognitive task analysis (CTA) improves traditional task analysis procedures by analyzing the thought processes of performers while they complete a...for using these methods to conduct a CTA for domains which involve critical decision making tasks in naturalistic settings. The cognitive task analysis methods
A single-loop optimization method for reliability analysis with second order uncertainty
NASA Astrophysics Data System (ADS)
Xie, Shaojun; Pan, Baisong; Du, Xiaoping
2015-08-01
Reliability analysis may involve random variables and interval variables. In addition, some of the random variables may have interval distribution parameters owing to limited information. This kind of uncertainty is called second order uncertainty. This article develops an efficient reliability method for problems involving the three aforementioned types of uncertain input variables. The analysis produces the maximum and minimum reliability and is computationally demanding because two loops are needed: a reliability analysis loop with respect to random variables and an interval analysis loop for extreme responses with respect to interval variables. The first order reliability method and nonlinear optimization are used for the two loops, respectively. For computational efficiency, the two loops are combined into a single loop by treating the Karush-Kuhn-Tucker (KKT) optimal conditions of the interval analysis as constraints. Three examples are presented to demonstrate the proposed method.
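The single-loop idea can be illustrated compactly: because the interval variables enter the first-order reliability (FORM) search only through their bounds, the worst-case most-probable-point (MPP) search can treat them as additional bound-constrained unknowns, which is the effect of imposing the interval loop's KKT conditions as constraints. A minimal sketch under simplifying assumptions, with a hypothetical limit-state function and interval bounds that are not from the article:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical limit-state g(u1, u2, y): u1, u2 are standard-normal variables
# (after transformation); y is an interval variable in [0.8, 1.2].
def g(z):
    u1, u2, y = z
    return y * (u1 + 4.0) - (u2 - 1.0) ** 2 - 2.0

# Minimum (worst-case) reliability index: search over u AND y in one loop,
# letting the optimizer's bound handling on y stand in for the interval-loop
# KKT conditions of the paper's formulation.
res = minimize(lambda z: z[0] ** 2 + z[1] ** 2,        # beta^2 in standard space
               x0=[0.1, 0.1, 1.0],
               constraints=[{"type": "eq", "fun": g}],  # MPP lies on g = 0
               bounds=[(None, None), (None, None), (0.8, 1.2)],
               method="SLSQP")
beta_min = np.sqrt(res.fun)                             # worst-case reliability index
print(beta_min, res.x)   # y is driven to its interval bound in this toy problem
```

Maximizing the reliability index works the same way with the inner extreme reversed.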
Discovering and Analyzing Deviant Communities: Methods and Experiments
2014-10-01
analysis. Sinkholing. Sinkholing is the current method of choice for botnet analysis and defense [3]. In this approach, the analyst deceives bots into...from the bots to the botnet. There are several drawbacks to sinkholing and shutting down botnets. The biggest issue is the complexity and time...involved in conducting a sinkholing campaign. Normally, sinkholing involves a coordinated effort from the analyst, ISPs, and law enforcement officials
ERIC Educational Resources Information Center
Coad, Jane; Evans, Ruth
2008-01-01
This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of…
Blasco, H; Błaszczyński, J; Billaut, J C; Nadal-Desbarats, L; Pradat, P F; Devos, D; Moreau, C; Andres, C R; Emond, P; Corcia, P; Słowiński, R
2015-02-01
Metabolomics is an emerging field that includes ascertaining a metabolic profile from a combination of small molecules, and which has health applications. Metabolomic methods are currently applied to discover diagnostic biomarkers and to identify pathophysiological pathways involved in pathology. However, metabolomic data are complex and are usually analyzed by statistical methods. Although the methods have been widely described, most have not been either standardized or validated. Data analysis is the foundation of a robust methodology, so new mathematical methods need to be developed to assess and complement current methods. We therefore applied, for the first time, the dominance-based rough set approach (DRSA) to metabolomics data; we also assessed the complementarity of this method with standard statistical methods. Some attributes were transformed in a way allowing us to discover global and local monotonic relationships between condition and decision attributes. We used previously published metabolomics data (18 variables) for amyotrophic lateral sclerosis (ALS) and non-ALS patients. Principal Component Analysis (PCA) and Orthogonal Partial Least Square-Discriminant Analysis (OPLS-DA) allowed satisfactory discrimination (72.7%) between ALS and non-ALS patients. Some discriminant metabolites were identified: acetate, acetone, pyruvate and glutamine. The concentrations of acetate and pyruvate were also identified by univariate analysis as significantly different between ALS and non-ALS patients. DRSA correctly classified 68.7% of the cases and established rules involving some of the metabolites highlighted by OPLS-DA (acetate and acetone). Some rules identified potential biomarkers not revealed by OPLS-DA (beta-hydroxybutyrate). We also found a large number of common discriminating metabolites after Bayesian confirmation measures, particularly acetate, pyruvate, acetone and ascorbate, consistent with the pathophysiological pathways involved in ALS. DRSA provides a complementary method for improving the predictive performance of the multivariate data analysis usually used in metabolomics. This method could help in the identification of metabolites involved in disease pathogenesis. Interestingly, these different strategies mostly identified the same metabolites as being discriminant. The selection of strong decision rules with high value of Bayesian confirmation provides useful information about relevant condition-decision relationships not otherwise revealed in metabolomics data. Copyright © 2014 Elsevier Inc. All rights reserved.
Stability analysis of flexible wind turbine blades using finite element method
NASA Technical Reports Server (NTRS)
Kamoulakos, A.
1982-01-01
Static vibration and flutter analysis of a straight elastic axis blade was performed based on a finite element method solution. The total potential energy functional was formulated according to linear beam theory. The inertia and aerodynamic loads were formulated according to the blade absolute acceleration and absolute velocity vectors. In vibration analysis, the direction of motion of the blade during the first out-of-plane and first in-plane modes was examined; numerical results involve NASA/DOE Mod-0, McCauley propeller, north wind turbine and flat plate behavior. In flutter analysis, comparison cases were examined involving several references. Vibration analysis of a nonstraight elastic axis blade based on a finite element method solution was performed in a similar manner to the straight elastic axis blade, since it was recognized that a curved blade can be approximated by an assembly of a sufficient number of straight blade elements at different inclinations with respect to a common system of axes. Numerical results involve comparison between the behavior of a straight and a curved cantilever beam during the lowest two in-plane and out-of-plane modes.
Non-destructive ultrasonic measurements of case depth. [in steel
NASA Technical Reports Server (NTRS)
Flambard, C.; Lambert, A.
1978-01-01
Two ultrasonic methods for nondestructive measurements of the depth of a case-hardened layer in steel are described. One method involves analysis of ultrasonic waves diffused back from the bulk of the workpiece. The other method involves finding the speed of propagation of ultrasonic waves launched on the surface of the work. Procedures followed in the two methods for measuring case depth are described.
Diffraction as a Method of Critical Policy Analysis
ERIC Educational Resources Information Center
Ulmer, Jasmine B.
2016-01-01
Recent developments in critical policy analysis have occurred alongside the new materialisms in qualitative research. These lines of scholarship have unfolded along two separate, but related, tracks. In particular, the new materialist method of "diffraction" aligns with many elements of critical policy analysis. Both involve critical…
ERIC Educational Resources Information Center
Velastegui, Pamela J.
2013-01-01
This hypothesis-generating case study investigates the naturally emerging roles of technology brokers and technology leaders in three independent schools in New York involving 92 school educators. A multiple and mixed method design utilizing Social Network Analysis (SNA) and fuzzy set Qualitative Comparative Analysis (FSQCA) involved gathering…
Methods for assessing the stability of slopes during earthquakes-A retrospective
Jibson, R.W.
2011-01-01
During the twentieth century, several methods to assess the stability of slopes during earthquakes were developed. Pseudostatic analysis was the earliest method; it involved simply adding a permanent body force representing the earthquake shaking to a static limit-equilibrium analysis. Stress-deformation analysis, a later development, involved much more complex modeling of slopes using a mesh in which the internal stresses and strains within elements are computed based on the applied external loads, including gravity and seismic loads. Stress-deformation analysis provided the most realistic model of slope behavior, but it is very complex and requires a high density of high-quality soil-property data as well as an accurate model of soil behavior. In 1965, Newmark developed a method that effectively bridges the gap between these two types of analysis. His sliding-block model is easy to apply and provides a useful index of co-seismic slope performance. Subsequent modifications to sliding-block analysis have made it applicable to a wider range of landslide types. Sliding-block analysis provides perhaps the greatest utility of all the types of analysis. It is far easier to apply than stress-deformation analysis, and it yields much more useful information than does pseudostatic analysis. © 2010.
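The Newmark calculation itself fits in a few lines: the block accumulates relative velocity whenever ground acceleration exceeds the critical (yield) acceleration, and permanent displacement is the integral of that velocity. A minimal sketch for the classic case (rigid block, downslope sliding only, constant critical acceleration, synthetic input motion):

```python
import numpy as np

def newmark_displacement(acc, dt, a_crit):
    """Rigid-block sliding displacement: accumulate relative velocity while
    ground acceleration exceeds the critical acceleration, re-lock at v = 0."""
    v, d = 0.0, 0.0
    for a in acc:
        if v > 0.0 or a > a_crit:                 # block is sliding or decelerating
            v = max(v + (a - a_crit) * dt, 0.0)   # excess acceleration drives v
            d += v * dt
    return d

t = np.arange(0.0, 10.0, 0.01)
acc = 3.0 * np.sin(2 * np.pi * t) * np.exp(-0.3 * t)   # synthetic shaking, m/s^2
print(newmark_displacement(acc, dt=0.01, a_crit=1.0))  # displacement, m
```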
Two Strategies for Qualitative Content Analysis: An Intramethod Approach to Triangulation.
Renz, Susan M; Carrington, Jane M; Badger, Terry A
2018-04-01
The overarching aim of qualitative research is to gain an understanding of certain social phenomena. Qualitative research involves the studied use and collection of empirical materials, all to describe moments and meanings in individuals' lives. Data derived from these various materials require a form of analysis of the content, focusing on written or spoken language as communication, to provide context and understanding of the message. Qualitative research often involves the collection of data through extensive interviews, note taking, and tape recording. These methods are time- and labor-intensive. With the advances in computerized text analysis software, the practice of combining methods to analyze qualitative data can assist the researcher in making large data sets more manageable and enhance the trustworthiness of the results. This article will describe a novel process of combining two methods of qualitative data analysis, or Intramethod triangulation, as a means to provide a deeper analysis of text.
Quad-Tree Visual-Calculus Analysis of Satellite Coverage
NASA Technical Reports Server (NTRS)
Lo, Martin W.; Hockney, George; Kwan, Bruce
2003-01-01
An improved method of analysis of coverage of areas of the Earth by a constellation of radio-communication or scientific-observation satellites has been developed. This method is intended to supplant an older method in which the global-coverage-analysis problem is solved from a ground-to-satellite perspective. The present method provides for rapid and efficient analysis. This method is derived from a satellite-to-ground perspective and involves a unique combination of two techniques for multiresolution representation of map features on the surface of a sphere.
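A toy version of the satellite-to-ground, multiresolution idea: subdivide the sphere recursively, accept cells whose samples are all covered, reject clearly uncovered cells, and refine only the boundary. This sketch assumes circular coverage caps of fixed angular radius and tests only cell corners, a loose stand-in for the quad-tree visual-calculus machinery; all names and numbers are illustrative, not from the NASA tool:

```python
import math

def covered(lat, lon, sats, cap):
    """Point is covered if within angular radius `cap` of a sub-satellite point."""
    return any(
        math.acos(max(-1.0, min(1.0,
            math.sin(lat) * math.sin(sl) +
            math.cos(lat) * math.cos(sl) * math.cos(lon - so)))) <= cap
        for sl, so in sats)

def quad_area(lat0, lat1, lon0, lon1, sats, cap, depth=0, max_depth=6):
    """Covered area (steradians) of a lat/lon cell by quad-tree refinement."""
    corners = [covered(la, lo, sats, cap)
               for la in (lat0, lat1) for lo in (lon0, lon1)]
    area = (math.sin(lat1) - math.sin(lat0)) * (lon1 - lon0)
    if depth == max_depth or all(corners) or not any(corners):
        return area * sum(corners) / 4.0          # leaf: corner-based estimate
    lam, lom = 0.5 * (lat0 + lat1), 0.5 * (lon0 + lon1)
    return sum(quad_area(a, b, c, d, sats, cap, depth + 1, max_depth)
               for a, b, c, d in [(lat0, lam, lon0, lom), (lat0, lam, lom, lon1),
                                  (lam, lat1, lon0, lom), (lam, lat1, lom, lon1)])

sats = [(0.0, 0.0), (0.4, 1.0)]                    # sub-satellite points, radians
print(quad_area(-1.2, 1.2, -2.0, 2.0, sats, cap=0.5))
```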
Airbreathing hypersonic vehicle design and analysis methods
NASA Technical Reports Server (NTRS)
Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.
1996-01-01
The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.
USDA-ARS?s Scientific Manuscript database
Traditionally, regulatory monitoring of veterinary drug residues in food animal tissues involves the use of several single-class methods to cover a wide analytical scope. Multiclass, multiresidue methods of analysis tend to provide greater overall laboratory efficiency than the use of multiple meth...
Investigating Convergence Patterns for Numerical Methods Using Data Analysis
ERIC Educational Resources Information Center
Gordon, Sheldon P.
2013-01-01
The article investigates the patterns that arise in the convergence of numerical methods, particularly those in the errors involved in successive iterations, using data analysis and curve fitting methods. In particular, the results obtained are used to convey a deeper level of understanding of the concepts of linear, quadratic, and cubic…
Low-Resolution Raman-Spectroscopy Combustion Thermometry
NASA Technical Reports Server (NTRS)
Nguyen, Quang-Viet; Kojima, Jun
2008-01-01
A method of optical thermometry, now undergoing development, involves low-resolution measurement of the spectrum of spontaneous Raman scattering (SRS) from N2 and O2 molecules. The method is especially suitable for measuring temperatures in high pressure combustion environments that contain N2, O2, or N2/O2 mixtures (including air). Methods based on SRS (in which scattered light is shifted in wavelength by amounts that depend on vibrational and rotational energy levels of laser-illuminated molecules) have been popular means of probing flames because they are almost the only methods that provide spatially and temporally resolved concentrations and temperatures of multiple molecular species in turbulent combustion. The present SRS-based method differs from prior SRS-based methods that have various drawbacks, a description of which would exceed the scope of this article. Two main differences between this and prior SRS-based methods are that it involves analysis in the frequency (equivalently, wavelength) domain, in contradistinction to analysis in the intensity domain in prior methods; and it involves low-resolution measurement of what amounts to predominantly the rotational Raman spectra of N2 and O2, in contradistinction to higher-resolution measurement of the vibrational Raman spectrum of N2 only in prior methods.
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; D'Costa, Joseph F.
1991-01-01
This paper describes the evaluation of mixed implicit-explicit finite element formulations for hyperbolic heat conduction problems involving non-Fourier effects. In particular, mixed implicit-explicit formulations employing the alpha method proposed by Hughes et al. (1987, 1990) are described for the numerical simulation of hyperbolic heat conduction models, which involve time-dependent relaxation effects. Existing analytical approaches for modeling/analysis of such models involve complex mathematical formulations for obtaining closed-form solutions, while in certain numerical formulations the difficulties include severe oscillatory solution behavior (which often disguises the true response) in the vicinity of the thermal disturbances, which propagate with finite velocities. In view of these factors, the alpha method is evaluated to assess the control of the amount of numerical dissipation for predicting the transient propagating thermal disturbances. Numerical test models are presented, and pertinent conclusions are drawn for the mixed-time integration simulation of hyperbolic heat conduction models involving non-Fourier effects.
Analysis of Social Cohesion in Health Data by Factor Analysis Method: The Ghanaian Perspective
ERIC Educational Resources Information Center
Saeed, Bashiru I. I.; Xicang, Zhao; Musah, A. A. I.; Abdul-Aziz, A. R.; Yawson, Alfred; Karim, Azumah
2013-01-01
We investigated the overall social cohesion of Ghanaians. In this study, we considered the paramount interest of the involvement of Ghanaians in their communities, their views of other people and institutions, and their level of interest in both local and national politics. The factor analysis method was employed for analysis using R…
Chemical properties and methods of analysis of refractory compounds
NASA Technical Reports Server (NTRS)
Samsonov, G. V. (Editor); Frantsevich, I. N. (Editor); Yeremenko, V. N. (Editor); Nazarchuk, T. N. (Editor); Popova, O. I. (Editor)
1978-01-01
Reactions involving refractory metals and the alloys based on them are discussed. Chemical, electrochemical, photometric, spectrophotometric, and X-ray analysis are among the methods described for analyzing the results of the reactions and for determining the chemical properties of these materials.
Chen, Ning; Yu, Dejie; Xia, Baizhan; Liu, Jian; Ma, Zhengdong
2017-04-01
This paper presents a homogenization-based interval analysis method for the prediction of coupled structural-acoustic systems involving periodical composites and multi-scale uncertain-but-bounded parameters. In the structural-acoustic system, the macro plate structure is assumed to be composed of a periodically uniform microstructure. The equivalent macro material properties of the microstructure are computed using the homogenization method. By integrating the first-order Taylor expansion interval analysis method with the homogenization-based finite element method, a homogenization-based interval finite element method (HIFEM) is developed to solve a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters. The corresponding formulations of the HIFEM are deduced. A subinterval technique is also introduced into the HIFEM for higher accuracy. Numerical examples of a hexahedral box and an automobile passenger compartment are given to demonstrate the efficiency of the presented method for a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters.
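The uncertainty-propagation step at the heart of such interval methods, a first-order Taylor expansion about the interval midpoints, is compact: the response interval is bounded by the absolute Jacobian times the parameter half-widths. A generic sketch in which any response function stands in for the homogenized finite element solve (all names and values illustrative):

```python
import numpy as np

def taylor_interval(f, p_mid, p_rad, h=1e-6):
    """First-order Taylor bounds y in [y0 - |J| r, y0 + |J| r] for scalar f
    with parameters p in [p_mid - p_rad, p_mid + p_rad]."""
    p_mid, p_rad = np.asarray(p_mid, float), np.asarray(p_rad, float)
    y0 = f(p_mid)
    J = np.array([(f(p_mid + h * e) - y0) / h       # forward-difference Jacobian
                  for e in np.eye(p_mid.size)])
    half = np.abs(J) @ p_rad
    return y0 - half, y0 + half

# Toy response standing in for a structural-acoustic pressure quantity.
resp = lambda p: p[0] * p[1] ** 2 / (1.0 + p[2])
print(taylor_interval(resp, p_mid=[2.0, 0.5, 0.1], p_rad=[0.1, 0.05, 0.02]))
```

The subinterval technique mentioned in the abstract amounts to applying such a bound over subdivisions of each interval and taking the union, which tightens the estimate for strongly nonlinear responses.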
ERIC Educational Resources Information Center
Schoonenboom, Judith
2016-01-01
Educational innovations often involve intact subgroups, such as school classes or university departments. In small-scale educational evaluation research, typically involving 1 to 20 subgroups, differences among these subgroups are often neglected. This article presents a mixed method from a qualitative perspective, in which differences among…
Selection of remedial alternatives for mine sites: a multicriteria decision analysis approach.
Betrie, Getnet D; Sadiq, Rehan; Morin, Kevin A; Tesfamariam, Solomon
2013-04-15
The selection of remedial alternatives for mine sites is a complex task because it involves multiple criteria, often with conflicting objectives. However, an existing framework used to select remedial alternatives lacks multicriteria decision analysis (MCDA) aids and does not consider uncertainty in the selection of alternatives. The objective of this paper is to improve the existing framework by introducing deterministic and probabilistic MCDA methods. The Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) methods have been implemented in this study. The MCDA analysis involves processing inputs to the PROMETHEE methods, namely identifying the alternatives, defining the criteria, defining the criteria weights using the analytic hierarchy process (AHP), defining the probability distribution of criteria weights, and conducting Monte Carlo simulation (MCS); running the PROMETHEE methods using these inputs; and conducting a sensitivity analysis. A case study was presented to demonstrate the improved framework at a mine site. The results showed that the improved framework provides a reliable way of selecting remedial alternatives as well as quantifying the impact of different criteria on selecting alternatives. Copyright © 2013 Elsevier Ltd. All rights reserved.
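For readers unfamiliar with PROMETHEE, the deterministic core fits in one short function: pairwise score differences on each criterion are mapped through a preference function, weighted, and aggregated into net outranking flows. A minimal PROMETHEE II sketch with a linear preference function; the weights would come from AHP, and the probabilistic variant simply re-runs this under Monte Carlo-sampled weights (all numbers illustrative):

```python
import numpy as np

def promethee_ii(scores, weights, p):
    """Net outranking flows. scores: (n_alts, n_crit), higher is better;
    p: preference threshold per criterion (linear preference function)."""
    n = scores.shape[0]
    phi = np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a != b:
                pref = np.clip((scores[a] - scores[b]) / p, 0.0, 1.0)
                pi_ab = weights @ pref            # aggregated preference of a over b
                phi[a] += pi_ab / (n - 1)         # leaving-flow contribution
                phi[b] -= pi_ab / (n - 1)         # entering-flow contribution
    return phi                                    # rank alternatives by descending phi

scores = np.array([[0.7, 0.2, 0.9],   # hypothetical remedial alternatives
                   [0.5, 0.8, 0.4],   # scored on three criteria
                   [0.9, 0.5, 0.3]])
weights = np.array([0.5, 0.3, 0.2])   # e.g. derived from AHP pairwise comparisons
print(promethee_ii(scores, weights, p=np.array([0.5, 0.5, 0.5])))
```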
It's Deja Vu All over Again: Using Multiple-Spell Discrete-Time Survival Analysis.
ERIC Educational Resources Information Center
Willett, John B.; Singer, Judith D.
1995-01-01
The multiple-spell discrete-time survival analysis method is introduced and illustrated using longitudinal data on exit from and reentry into the teaching profession. The method is applicable to many educational problems involving the sequential occurrence of disparate events or episodes. (SLD)
ERIC Educational Resources Information Center
Kiste, Alan L.
2009-01-01
Analyzing and comparing student-generated inscriptions in chemistry is crucial to gaining insight into students' understanding about chemistry concepts. Thus, we developed two methods of analyzing student-generated inscriptions: features analysis and thematic analysis. We have also demonstrated how these methods are able to discern differences…
Design and Analysis of a Subcritical Airfoil for High Altitude, Long Endurance Missions.
1982-12-01
Airfoil Design and Analysis Method; Appendix D: Boundary Layer Analysis Method; Appendix E: Detailed Results of...attack. Computer codes designed by Richard Eppler were used for this study. The airfoil was analyzed by using a viscous effects analysis program...inverse program designed by Eppler (Ref 5) was used in this study to accomplish this part. The second step involved the analysis of the airfoil under
A Comparison of Missing-Data Procedures for Arima Time-Series Analysis
ERIC Educational Resources Information Center
Velicer, Wayne F.; Colby, Suzanne M.
2005-01-01
Missing data are a common practical problem for longitudinal designs. Time-series analysis is a longitudinal method that involves a large number of observations on a single unit. Four different missing-data methods (deletion, mean substitution, mean of adjacent observations, and maximum likelihood estimation) were evaluated. Computer-generated…
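Three of the four procedures compared are one-liners, which makes the design easy to replicate: delete missing points, replace them with the series mean, or replace them with the mean of the adjacent observations (maximum likelihood estimation requires a fitted ARIMA model and is omitted here). A sketch assuming isolated interior gaps coded as NaN:

```python
import numpy as np

def fill_missing(series, method):
    """Handle NaN gaps by deletion, mean substitution, or adjacent-mean."""
    x = np.asarray(series, dtype=float).copy()
    if method == "deletion":
        return x[~np.isnan(x)]
    for i in np.where(np.isnan(x))[0]:
        if method == "mean":
            x[i] = np.nanmean(series)
        elif method == "adjacent":                # assumes isolated interior gaps
            x[i] = 0.5 * (x[i - 1] + x[i + 1])
    return x

y = np.array([1.0, 1.2, np.nan, 1.5, 1.4, np.nan, 1.8])
print(fill_missing(y, "adjacent"))
```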
On the Spectrum of Periodic Signals
ERIC Educational Resources Information Center
Al-Smadi, Adnan
2004-01-01
In theory, there are many methods for the representation of signals. In practice, however, Fourier analysis involving the resolution of signals into sinusoidal components is used widely. There are several methods for Fourier analysis available for representation of signals. If the signal is periodic, then the Fourier series is used to represent…
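As a concrete instance of the periodic case, the Fourier series coefficients of a square wave can be computed numerically and checked against the known result b_n = 4/(pi*n) for odd n (and zero otherwise). A short sketch using uniform sampling over one period:

```python
import numpy as np

T = 2 * np.pi
t = np.linspace(0.0, T, 4096, endpoint=False)
x = np.sign(np.sin(t))                      # odd square wave with period T

for n in range(1, 8):
    a_n = 2.0 * np.mean(x * np.cos(n * t))  # cosine coefficient (rectangle rule)
    b_n = 2.0 * np.mean(x * np.sin(n * t))  # sine coefficient
    print(n, round(a_n, 4), round(b_n, 4))  # expect a_n ~ 0, b_n ~ 4/(pi*n), odd n
```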
USDA-ARS?s Scientific Manuscript database
A high resolution GC/MS with Selected Ion Monitor (SIM) method focusing on the characterization and quantitative analysis of ginkgolic acids (GAs) in Ginkgo biloba L. plant materials, extracts and commercial products was developed and validated. The method involved sample extraction with (1:1) meth...
An advanced probabilistic structural analysis method for implicit performance functions
NASA Technical Reports Server (NTRS)
Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.
1989-01-01
In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
The Determination of Metals in Welding Fume by X-RaySpectrometry
NASA Astrophysics Data System (ADS)
Kuznetsova, O. V.; Begunova, L. A.; Romanenko, S. V.; Solodsky, S. A.
2018-01-01
Analysis of the current hygienic situation in welding production showed that the intensification of welding processes involves the deterioration of air quality, which negatively affects welders' health. Respiratory effects seen in full-time welders have included bronchitis, airway irritation, lung function changes, and a possible increase in the incidence of lung cancer. The metal concentrations in the air of the working area have been determined using the photometric method of analysis, which involves a stage of decomposition of the sample material before analysis. However, losses of the analyzed elements are possible when the sample is decomposed. The X-ray fluorescence method of analysis has the advantage of being nondestructive. The investigations showed that the photometric determinations of metals in welding aerosols are 1.5–2 times lower than the results of X-ray fluorescence analysis.
The economics of project analysis: Optimal investment criteria and methods of study
NASA Technical Reports Server (NTRS)
Scriven, M. C.
1979-01-01
Insight is provided toward the development of an optimal program for investment analysis of project proposals offering commercial potential, and of their components. This involves a critique of economic investment criteria viewed in relation to requirements of engineering economy analysis. An outline for a systems approach to project analysis is given. Application of the Leontief input-output methodology to the analysis of projects involving multiple processes and products is investigated. Effective application of elements of neoclassical economic theory to investment analysis of project components is demonstrated. Patterns of both static and dynamic activity levels are incorporated.
Methods of analyzing crude oil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooks, Robert Graham; Jjunju, Fred Paul Mark; Li, Anyin
The invention generally relates to methods of analyzing crude oil. In certain embodiments, methods of the invention involve obtaining a crude oil sample, and subjecting the crude oil sample to mass spectrometry analysis. In certain embodiments, the method is performed without any sample pre-purification steps.
Multicriteria decision analysis: Overview and implications for environmental decision making
Hermans, Caroline M.; Erickson, Jon D.; Erickson, Jon D.; Messner, Frank; Ring, Irene
2007-01-01
Environmental decision making involving multiple stakeholders can benefit from the use of a formal process to structure stakeholder interactions, leading to more successful outcomes than traditional discursive decision processes. There are many tools available to handle complex decision making. Here we illustrate the use of a multicriteria decision analysis (MCDA) outranking tool (PROMETHEE) to facilitate decision making at the watershed scale, involving multiple stakeholders, multiple criteria, and multiple objectives. We compare various MCDA methods and their theoretical underpinnings, examining methods that most realistically model complex decision problems in ways that are understandable and transparent to stakeholders.
Second ventilatory threshold from heart-rate variability: valid when the upper body is involved?
Mourot, Laurent; Fabre, Nicolas; Savoldelli, Aldo; Schena, Federico
2014-07-01
To determine the most accurate method based on spectral analysis of heart-rate variability (SA-HRV) during an incremental and continuous maximal test involving the upper body, the authors tested 4 different methods to obtain the heart rate (HR) at the second ventilatory threshold (VT2). Sixteen ski mountaineers (mean ± SD; age 25 ± 3 y, height 177 ± 8 cm, mass 69 ± 10 kg) performed a roller-ski test on a treadmill. Respiratory variables and HR were continuously recorded, and the 4 SA-HRV methods were compared with the gas-exchange method through Bland and Altman analyses. The best method was the one based on a time-varying spectral analysis with high frequency ranging from 0.15 Hz to a cutoff point relative to the individual's respiratory sinus arrhythmia. The HR values were significantly correlated (r² = .903), with a mean HR difference from the respiratory method of 0.1 ± 3.0 beats/min and narrow limits of agreement (around -6/+6 beats/min). The 3 other methods led to larger errors and poorer agreement (up to 5 beats/min and around -23/+20 beats/min). It is possible to accurately determine VT2 with an HR monitor during an incremental test involving the upper body if the appropriate HRV method is used.
Enhanced Decision Analysis Support System.
1981-03-01
autorrares "i., the method for determining preferences when multiple and competing attributes are involved. Worth assessment is used as the model which...1967 as a method for determining preferenoe when multiple and competing attributes are involved (Rf 10). The tern worth can be - equated to other... competing objectives. After some discussion, the group decided that the problem could best be decided using the worth assessment procedure. They
Sensitivity analysis and approximation methods for general eigenvalue problems
NASA Technical Reports Server (NTRS)
Murthy, D. V.; Haftka, R. T.
1986-01-01
Optimization of dynamic systems involving complex non-hermitian matrices is often computationally expensive. Major contributors to the computational expense are the sensitivity analysis and reanalysis of a modified design. The present work seeks to alleviate this computational burden by identifying efficient sensitivity analysis and approximate reanalysis methods. For the algebraic eigenvalue problem involving non-hermitian matrices, algorithms for sensitivity analysis and approximate reanalysis are classified, compared and evaluated for efficiency and accuracy. Proper eigenvector normalization is discussed. An improved method for calculating derivatives of eigenvectors is proposed based on a more rational normalization condition and taking advantage of matrix sparsity. Important numerical aspects of this method are also discussed. To alleviate the problem of reanalysis, various approximation methods for eigenvalues are proposed and evaluated. Linear and quadratic approximations are based directly on the Taylor series. Several approximation methods are developed based on the generalized Rayleigh quotient for the eigenvalue problem. Approximation methods based on trace theorem give high accuracy without needing any derivatives. Operation counts for the computation of the approximations are given. General recommendations are made for the selection of an appropriate approximation technique as a function of the matrix size, number of design variables, number of eigenvalues of interest and the number of design points at which approximation is sought.
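The first-order eigenvalue sensitivity underlying both the analysis and the Taylor-based approximations has a compact closed form: d(lambda_k) = y_k^H (dA) x_k / (y_k^H x_k), with x_k and y_k the right and left eigenvectors. A numpy sketch for a general non-Hermitian matrix with distinct eigenvalues (the repeated-eigenvalue case needs the treatment in the Hou and Kenny entry below), using the rows of X^-1 as pre-normalized left eigenvectors:

```python
import numpy as np

def eigenvalue_sensitivities(A, dA):
    """First-order change of each eigenvalue of A under a perturbation dA."""
    lam, X = np.linalg.eig(A)
    Y = np.linalg.inv(X)                     # row k of Y is y_k^H, with y_k^H x_k = 1
    dlam = np.array([Y[k] @ dA @ X[:, k]     # y_k^H dA x_k
                     for k in range(A.shape[0])])
    return lam, dlam

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
dA = 1e-6 * rng.standard_normal((5, 5))
lam, dlam = eigenvalue_sensitivities(A, dA)
# Check: eigenvalues of A + dA should match lam + dlam to first order
# (compare after sorting, since eig() does not order eigenvalues consistently).
print(np.sort_complex(lam + dlam))
print(np.sort_complex(np.linalg.eig(A + dA)[0]))
```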
ERIC Educational Resources Information Center
Friman, Margareta; Nyberg, Claes; Norlander, Torsten
2004-01-01
A descriptive qualitative analysis of in-depth interviews involving seven provincial Soccer Association referees was carried out in order to find out how referees experience threats and aggression directed to soccer referees. The Empirical Phenomenological Psychological method (EPP-method) was used. The analysis resulted in thirty categories which…
Johnson, R.G.; Wandless, G.A.
1984-01-01
A new method is described for determining carrier yield in the radiochemical neutron activation analysis of rare-earth elements in silicate rocks by group separation. The method involves the determination of the rare-earth elements present in the carrier by means of energy-dispersive X-ray fluorescence analysis, eliminating the need to re-irradiate samples in a nuclear reactor after the gamma ray analysis is complete. Results from the analysis of USGS standards AGV-1 and BCR-1 compare favorably with those obtained using the conventional method. ?? 1984 Akade??miai Kiado??.
Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu; Jablonowski, Christopher; Lake, Larry
Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.
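The Monte Carlo side of the comparison reduces to evaluating each candidate design across the same sampled scenarios and ranking by expected value, which also yields the full outcome distribution the abstract credits to the MC method. A toy sketch with hypothetical one-period economics (all numbers illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def npv(capacity, price, reserves):
    """Toy economics: revenue from produced volume minus capacity-driven capex."""
    produced = np.minimum(capacity, reserves)
    return price * produced - 3.0 * capacity

designs = [50.0, 100.0, 150.0, 200.0]                  # candidate plateau capacities
price = rng.lognormal(1.0, 0.3, 10_000)                # uncertain unit price
reserves = rng.normal(120.0, 30.0, 10_000).clip(0.0)   # uncertain reserves

dists = {c: npv(c, price, reserves) for c in designs}  # full NPV distribution per design
best = max(dists, key=lambda c: dists[c].mean())       # optimize expected NPV
print(best, dists[best].mean(), np.percentile(dists[best], 10))  # plus downside risk
```

A stochastic program would instead encode the scenarios and decisions as one optimization model, typically reaching the same optimum with far fewer function evaluations.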
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edgue, E.
The point kinetics approach is a classical useful method for reactor transient analysis. It is helpful to know, however, when a more elaborate transient analysis, involving the space-dependent change of the flux through a given transient, should be considered. In this paper, the authors present a rather elegant and quick method to check the need for a space-dependent flux analysis through a control rod transient in a given nuclear reactor. The method is applied to a series of rod ejection experiments in the TRIGA MARK-II reactor of Istanbul Technical University (ITU).
Quantum computation in the analysis of hyperspectral data
NASA Astrophysics Data System (ADS)
Gomez, Richard B.; Ghoshal, Debabrata; Jayanna, Anil
2004-08-01
Recent research on the topic of quantum computation provides us with some quantum algorithms with higher efficiency and speedup compared to their classical counterparts. In this paper, it is our intent to provide the results of our investigation of several applications of such quantum algorithms, especially Grover's search algorithm, in the analysis of hyperspectral data. We found many parallels with Grover's method in existing data processing work that makes use of classical spectral matching algorithms. Our efforts also included the study of several methods dealing with hyperspectral image analysis work where classical computation methods involving large data sets could be replaced with quantum computation methods. The crux of the problem in computation involving a hyperspectral image data cube is to convert the large amount of data in high dimensional space to real information. Currently, using the classical model, different time-consuming methods and steps are necessary to analyze these data, including animation, the minimum noise fraction transform, the pixel purity index algorithm, N-dimensional scatter plots, and identification of endmember spectra. If a quantum model of computation involving hyperspectral image data can be developed and formalized, it is highly likely that information retrieval from hyperspectral image data cubes would be a much easier process and the final information content would be much more meaningful and timely. In this case, dimensionality would not be a curse, but a blessing.
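Grover's algorithm is easy to simulate classically at small scale, which is a useful sanity check when reasoning about its use for spectral matching: an oracle phase-flips the state matching the target, and a diffusion step amplifies it, reaching near-unit probability after about (pi/4)*sqrt(N) iterations. A state-vector simulation sketch, with the marked index standing in for a matching library spectrum:

```python
import numpy as np

def grover_success_probability(n_qubits, marked):
    N = 2 ** n_qubits
    psi = np.full(N, 1.0 / np.sqrt(N))       # uniform superposition over N items
    for _ in range(int(np.pi / 4 * np.sqrt(N))):
        psi[marked] *= -1.0                   # oracle: phase-flip the matching item
        psi = 2.0 * psi.mean() - psi          # diffusion: inversion about the mean
    return abs(psi[marked]) ** 2

print(grover_success_probability(10, marked=321))   # ~0.999 after 25 iterations
```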
Protecting Privacy of Shared Epidemiologic Data without Compromising Analysis Potential
Cologne, John; Grant, Eric J.; Nakashima, Eiji; Chen, Yun; Funamoto, Sachiyo; Katayama, Hiroaki
2012-01-01
Objective. Ensuring privacy of research subjects when epidemiologic data are shared with outside collaborators involves masking (modifying) the data, but overmasking can compromise utility (analysis potential). Methods of statistical disclosure control for protecting privacy may be impractical for individual researchers involved in small-scale collaborations. Methods. We investigated a simple approach based on measures of disclosure risk and analytical utility that are straightforward for epidemiologic researchers to derive. The method is illustrated using data from the Japanese Atomic-bomb Survivor population. Results. Masking by modest rounding did not adequately enhance security but rounding to remove several digits of relative accuracy effectively reduced the risk of identification without substantially reducing utility. Grouping or adding random noise led to noticeable bias. Conclusions. When sharing epidemiologic data, it is recommended that masking be performed using rounding. Specific treatment should be determined separately in individual situations after consideration of the disclosure risks and analysis needs. PMID:22505949
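The recommended masking, rounding away digits of relative accuracy, is significant-digit rounding, and simple proxies for disclosure risk and utility are easy to compute alongside it. A sketch with synthetic positive data; the risk and bias measures here are illustrative stand-ins, not the paper's exact metrics:

```python
import numpy as np

def round_sig(x, digits):
    """Round positive values to a fixed number of significant digits."""
    x = np.asarray(x, dtype=float)
    scale = 10.0 ** (digits - 1 - np.floor(np.log10(x)))
    return np.round(x * scale) / scale

doses = np.random.default_rng(1).gamma(2.0, 0.5, 1000)   # synthetic exposure data
for digits in (4, 3, 2, 1):
    masked = round_sig(doses, digits)
    counts = np.unique(masked, return_counts=True)[1]
    risk = np.mean(counts == 1)               # fraction of singleton (unique) values
    bias = masked.mean() - doses.mean()       # crude utility check
    print(digits, round(risk, 3), round(bias, 6))
```

Coarser rounding drives the singleton fraction down (lower re-identification risk) while keeping the mean nearly unbiased, which is the trade-off the paper exploits.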
Application of a High-Fidelity Icing Analysis Method to a Model-Scale Rotor in Forward Flight
NASA Technical Reports Server (NTRS)
Narducci, Robert; Orr, Stanley; Kreeger, Richard E.
2012-01-01
An icing analysis process involving the loose coupling of OVERFLOW-RCAS for rotor performance prediction with LEWICE3D for thermal analysis and ice accretion is applied to a model-scale rotor for validation. The process offers high-fidelity rotor analysis for non-iced and iced rotor performance evaluation that accounts for the interaction of nonlinear aerodynamics with blade elastic deformations. Ice accumulation prediction also involves loosely coupled data exchanges between OVERFLOW and LEWICE3D to produce accurate ice shapes. Validation of the process uses data collected in the 1993 icing test involving Sikorsky's Powered Force Model. Non-iced and iced rotor performance predictions are compared to experimental measurements, as are predicted ice shapes.
Thokala, Praveen; Devlin, Nancy; Marsh, Kevin; Baltussen, Rob; Boysen, Meindert; Kalo, Zoltan; Longrenn, Thomas; Mussen, Filip; Peacock, Stuart; Watkins, John; Ijzerman, Maarten
2016-01-01
Health care decisions are complex and involve confronting trade-offs between multiple, often conflicting, objectives. Using structured, explicit approaches to decisions involving multiple criteria can improve the quality of decision making, and a set of techniques, known under the collective heading multiple criteria decision analysis (MCDA), are useful for this purpose. MCDA methods are widely used in other sectors, and recently there has been an increase in health care applications. In 2014, ISPOR established an MCDA Emerging Good Practices Task Force. It was charged with establishing a common definition for MCDA in health care decision making and developing good practice guidelines for conducting MCDA to aid health care decision making. This initial ISPOR MCDA task force report provides an introduction to MCDA: it defines MCDA; provides examples of its use in different kinds of decision making in health care (including benefit-risk analysis, health technology assessment, resource allocation, portfolio decision analysis, shared patient-clinician decision making and prioritizing patients' access to services); provides an overview of the principal methods of MCDA; and describes the key steps involved. Upon reviewing this report, readers should have a solid overview of MCDA methods and their potential for supporting health care decision making. Copyright © 2016. Published by Elsevier Inc.
Brereton, Louise; Ingleton, Christine; Gardiner, Clare; Goyder, Elizabeth; Mozygemba, Kati; Lysdahl, Kristin Bakke; Tummers, Marcia; Sacchini, Dario; Leppert, Wojciech; Blaževičienė, Aurelija; van der Wilt, Gert Jan; Refolo, Pietro; De Nicola, Martina; Chilcott, James; Oortwijn, Wija
2017-02-01
Stakeholders are people with an interest in a topic. Internationally, stakeholder involvement in palliative care research and health technology assessment requires development. Stakeholder involvement adds value throughout research (from prioritising topics to disseminating findings). Philosophies and understandings about the best ways to involve stakeholders in research differ internationally. Stakeholder involvement took place in seven countries (England, Germany, Italy, Lithuania, the Netherlands, Norway and Poland). Findings informed a project that developed concepts and methods for health technology assessment and applied these to evaluate models of palliative care service delivery. To report on stakeholder involvement in the INTEGRATE-HTA project and how issues identified informed project development. Using stakeholder consultation or a qualitative research design, as appropriate locally, stakeholders in seven countries acted as 'advisors' to aid researchers' decision making. Thematic analysis was used to identify key issues across countries. A total of 132 stakeholders (82 professionals and 50 'lay' people) aged ⩾18 participated in individual face-to-face or telephone interviews, consultation meetings or focus groups. Different stakeholder involvement methods were used successfully to identify key issues in palliative care. A total of 23 issues common to three or more countries informed decisions about the intervention and comparator of interest, sub questions and specific assessments within the health technology assessment. Stakeholders, including patients and families undergoing palliative care, can inform project decision making using various involvement methods according to the local context. Researchers should consider local understandings about stakeholder involvement as views of appropriate and feasible methods vary. Methods for stakeholder involvement, especially consultation, need further development.
Improved nonlinear prediction method
NASA Astrophysics Data System (ADS)
Adenan, Nur Hamiza; Md Noorani, Mohd Salmi
2014-06-01
The analysis and prediction of time series data have been addressed by researchers. Many techniques have been developed to be applied in various areas, such as weather forecasting, financial markets and hydrological phenomena involving data that are contaminated by noise. Therefore, various techniques to improve the method have been introduced to analyze and predict time series data. Given the importance of the analysis and the accuracy of the prediction, a study was undertaken to test the effectiveness of the improved nonlinear prediction method for data that contain noise. The improved nonlinear prediction method involves the formation of composite serial data based on the successive differences of the time series. Then, phase space reconstruction was performed on the composite (one-dimensional) data to reconstruct a number of space dimensions. Finally, the local linear approximation method was employed to make a prediction based on the phase space. This improved method was tested with logistic map data series containing 0%, 5%, 10%, 20% and 30% noise. The results show that by using the improved method, the predictions were found to be in close agreement with the observed ones. The correlation coefficient was close to one when the improved method was applied to data with up to 10% noise. Thus, an improved way to analyze and predict noisy time series data, without involving any noise-reduction method, was introduced.
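Under stated assumptions (unit delay, embedding dimension 3, toy noisy data rather than the logistic map runs used in the paper), the three steps in the abstract, differencing, phase-space reconstruction, and local linear approximation, look like this:

```python
import numpy as np

def delay_embed(x, dim):
    """Unit-delay phase-space reconstruction of a scalar series."""
    return np.array([x[i:i + dim] for i in range(len(x) - dim + 1)])

def local_linear_predict(x, dim=3, k=10):
    """Predict the next value of x from its k nearest delay vectors."""
    X = delay_embed(x, dim)
    query, history, targets = X[-1], X[:-1], x[dim:]
    nn = np.argsort(np.linalg.norm(history - query, axis=1))[:k]
    A = np.hstack([history[nn], np.ones((k, 1))])
    coef, *_ = np.linalg.lstsq(A, targets[nn], rcond=None)   # local linear map
    return np.append(query, 1.0) @ coef

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0.0, 40.0, 400)) + 0.05 * rng.standard_normal(400)
diffs = np.diff(series)                # composite series of successive differences
print(series[-1] + local_linear_predict(diffs))   # map predicted difference back
```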
Identification of genes and gene clusters involved in mycotoxin synthesis
USDA-ARS?s Scientific Manuscript database
Research methods to identify and characterize genes involved in mycotoxin biosynthetic pathways have evolved considerably over the years. Before whole genome sequences were available (e.g. pre-genomics), work focused primarily on chemistry, biosynthetic mutant strains and molecular analysis of sing...
Simplified half-life methods for the analysis of kinetic data
NASA Technical Reports Server (NTRS)
Eberhart, J. G.; Levin, E.
1988-01-01
The analysis of reaction rate data has as its goal the determination of the order and rate constant that characterize the data. Chemical reactions with one reactant are considered, and simplified methods for accomplishing this goal are presented. The approaches involve the use of half-lives or other fractional lives. These methods are particularly useful for the more elementary discussions of kinetics found in general and physical chemistry courses.
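The half-life approach reduces to one regression: for an nth-order reaction in a single reactant, t_half is proportional to C0**(1 - n), so the slope of log t_half against log C0 gives 1 - n. A sketch with toy second-order data (values invented for illustration):

```python
import numpy as np

# For an nth-order reaction in one reactant, t_half = const * C0**(1 - n), so a
# log-log fit of half-life against initial concentration yields the order.
C0 = np.array([0.25, 0.5, 1.0, 2.0])          # initial concentrations, mol/L
t_half = np.array([80.0, 40.0, 20.0, 10.0])   # toy half-lives, s (second-order data)

slope, intercept = np.polyfit(np.log(C0), np.log(t_half), 1)
order = 1.0 - slope                           # -> 2.0 for this data
k = np.exp(-intercept)                        # for n = 2: t_half = 1/(k * C0)
print(order, k)
```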
Method for factor analysis of GC/MS data
Van Benthem, Mark H; Kotula, Paul G; Keenan, Michael R
2012-09-11
The method of the present invention provides a fast, robust, and automated multivariate statistical analysis of gas chromatography/mass spectroscopy (GC/MS) data sets. The method can involve systematic elimination of undesired, saturated peak masses to yield data that follow a linear, additive model. The cleaned data can then be subjected to a combination of PCA and orthogonal factor rotation followed by refinement with MCR-ALS to yield highly interpretable results.
ERIC Educational Resources Information Center
Kim, Kyung Hi
2014-01-01
This research, based on a case study of vulnerable children in Korea, used a mixed methods transformative approach to explore strategies to support and help disadvantaged children. The methodological approach includes three phases: a mixed methods contextual analysis, a qualitative dominant analysis based on Sen's capability approach and critical…
Global Study of the Simple Pendulum by the Homotopy Analysis Method
ERIC Educational Resources Information Center
Bel, A.; Reartes, W.; Torresi, A.
2012-01-01
Techniques are developed to find all periodic solutions in the simple pendulum by means of the homotopy analysis method (HAM). This involves the solution of the equations of motion in two different coordinate representations. Expressions are obtained for the cycles and periods of oscillations with a high degree of accuracy in the whole range of…
Garfield, S; Jheeta, S; Husson, F; Jacklin, A; Bischler, A; Norton, C; Franklin, B D
2016-01-01
There is a consensus that patients and the public should be involved in research in a meaningful way. However, to date, lay people have been mostly involved in developing research ideas and commenting on patient information. We previously published a paper describing our experience with lay partners conducting observations in a study of how patients in hospital are involved with their medicines. In a later part of the same study, lay partners were also involved in analysing interviews that a researcher had conducted with patients, carers and healthcare professionals about patient and carer involvement with medicines in hospital. We therefore wanted to build on our previous paper and report on our experiences with lay partners helping to conduct data analysis. We therefore interviewed the lay members and researchers involved in the analysis to find out their views. Both lay members and researchers reported that lay partners added value to the study by bringing their own perspectives and identifying further areas for the researcher to look for in the interviews. In this way researchers and lay partners were able to work together to produce a richer analysis than would have been possible from either alone. Background It is recognised that involving lay people in research in a meaningful rather than tokenistic way is both important and challenging. In this paper, we contribute to this debate by describing our experiences of lay involvement in data analysis. Methods We conducted semi-structured interviews with the lay partners and researchers involved in qualitative data analysis in a wider study of inpatient involvement in medication safety. The interviews were transcribed verbatim and coded using open thematic analysis. Results We interviewed three lay partners and the three researchers involved. These interviews demonstrated that the lay members added value to the analysis by bringing their own perspectives; these were systematically integrated into the analysis by the lead researcher to create a synergistic output. Some challenges arose, including difficulties in recruiting a diverse range of members of the public to carry out the role; however there were generally fewer challenges in data analysis than there had been with our previous experience of lay partners' involvement in data collection. Conclusions Lay members can add value to health services research by being involved in qualitative data analysis.
Advanced analysis technique for the evaluation of linear alternators and linear motors
NASA Technical Reports Server (NTRS)
Holliday, Jeffrey C.
1995-01-01
A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.
Dynamic analysis for shuttle design verification
NASA Technical Reports Server (NTRS)
Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.
1972-01-01
Two approaches that are used for determining the modes and frequencies of space shuttle structures are discussed. The first method, direct numerical analysis, involves finite element mathematical modeling of the space shuttle structure in order to use computer programs for dynamic structural analysis. The second method utilizes modal-coupling techniques of experimental verification made by vibrating only spacecraft components and by deducing modes and frequencies of the complete vehicle from results obtained in the component tests.
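The numerical core of the first approach is a generalized eigenproblem K x = lambda M x assembled from the finite element model; modes and natural frequencies follow directly. A minimal sketch with a 3-DOF spring-mass chain standing in for a structural component model (values illustrative):

```python
import numpy as np
from scipy.linalg import eigh

k = 1.0e4
K = k * np.array([[ 2, -1,  0],
                  [-1,  2, -1],
                  [ 0, -1,  1]], dtype=float)   # stiffness matrix, N/m
M = np.diag([2.0, 2.0, 1.0])                    # lumped mass matrix, kg

lam, modes = eigh(K, M)            # generalized eigenproblem K x = lam M x
freqs_hz = np.sqrt(lam) / (2 * np.pi)
print(freqs_hz)                    # natural frequencies; columns of `modes` are shapes
```

The second, modal-coupling approach combines component modes like these, measured or computed separately, into estimates for the assembled vehicle.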
Pradana Pérez, Juan A; Durand Alegría, Jesús S; Hernando, Pilar Fernández; Sierra, Adolfo Narros
2012-01-01
A rapid, economic and sensitive chemiluminescent method involving flow-injection analysis was developed for the determination of dipyrone in pharmaceutical preparations. The method is based on the chemiluminescent reaction between quinolinic hydrazide and hydrogen peroxide in a strongly alkaline medium, in which vanadium(IV) acts as a catalyst. Principal chemical and physical variables involved in the flow-injection system were optimized using a modified simplex method. The variations in the quantum yield observed when dipyrone was present in the reaction medium were used to determine the concentration of this compound. The proposed method requires no preconcentration steps and reliably quantifies dipyrone over the linear range 1-50 µg/mL. In addition, a sample throughput of 85 samples/h is possible. Copyright © 2011 John Wiley & Sons, Ltd.
Eigenvalue and eigenvector sensitivity and approximate analysis for repeated eigenvalue problems
NASA Technical Reports Server (NTRS)
Hou, Gene J. W.; Kenny, Sean P.
1991-01-01
A set of computationally efficient equations for eigenvalue and eigenvector sensitivity analysis is derived, and a method for eigenvalue and eigenvector approximate analysis in the presence of repeated eigenvalues is presented. The method developed for approximate analysis involves a reparameterization of the multivariable structural eigenvalue problem in terms of a single positive-valued parameter. The resulting equations yield first-order approximations of changes in both the eigenvalues and eigenvectors associated with the repeated eigenvalue problem. Examples are given to demonstrate the application of such equations for sensitivity and approximate analysis.
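For readers who want to experiment with the underlying idea, the sketch below shows the standard first-order sensitivity formula for a distinct eigenvalue of a symmetric matrix, checked against a finite difference. The repeated-eigenvalue reparameterization that is the paper's actual contribution requires a subspace treatment not attempted here; the example matrix and names are purely illustrative.

```python
import numpy as np

# First-order eigenvalue sensitivity for a *distinct* eigenvalue of a symmetric
# matrix A(p): d(lambda_i)/dp = v_i^T (dA/dp) v_i, with v_i unit-normalized.
def eig_sensitivity(A, dA):
    w, V = np.linalg.eigh(A)
    grads = np.array([V[:, i] @ dA @ V[:, i] for i in range(len(w))])
    return grads, w

p = 0.3
A = lambda p: np.array([[2.0 + p, 0.5], [0.5, 1.0 - 2 * p]])
dA = np.array([[1.0, 0.0], [0.0, -2.0]])   # exact dA/dp for this A(p)

grads, w = eig_sensitivity(A(p), dA)

# check against a central finite difference
h = 1e-6
fd = (np.linalg.eigvalsh(A(p + h)) - np.linalg.eigvalsh(A(p - h))) / (2 * h)
print(grads, fd)   # the two estimates should agree closely
```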
System of Systems Analytic Workbench - 2017
2017-08-31
and transitional activities with key collaborators. The tools include: System Operational Dependency Analysis/System Developmental Dependency Analysis...in the methods of the SoS-AWB involve the following: 1. System Operability Dependency Analysis (SODA)/System Development Dependency Analysis...available f. Development of standard dependencies with combinations of low-medium-high parameters Report No. SERC-2017-TR-111
EPIBLASTER-fast exhaustive two-locus epistasis detection strategy using graphical processing units
Kam-Thong, Tony; Czamara, Darina; Tsuda, Koji; Borgwardt, Karsten; Lewis, Cathryn M; Erhardt-Lehmann, Angelika; Hemmer, Bernhard; Rieckmann, Peter; Daake, Markus; Weber, Frank; Wolf, Christiane; Ziegler, Andreas; Pütz, Benno; Holsboer, Florian; Schölkopf, Bernhard; Müller-Myhsok, Bertram
2011-01-01
Detection of epistatic interaction between loci has been postulated to provide a more in-depth understanding of the complex biological and biochemical pathways underlying human diseases. Studying the interaction between two loci is the natural progression following traditional and well-established single locus analysis. However, the added costs and time duration required for the computation involved have thus far deterred researchers from pursuing a genome-wide analysis of epistasis. In this paper, we propose a method allowing such analysis to be conducted very rapidly. The method, dubbed EPIBLASTER, is applicable to case–control studies and consists of a two-step process in which the difference in Pearson's correlation coefficients is computed between controls and cases across all possible SNP pairs as an indication of significant interaction warranting further analysis. For the subset of interactions deemed potentially significant, a second-stage analysis is performed using the likelihood ratio test from the logistic regression to obtain the P-value for the estimated coefficients of the individual effects and the interaction term. The algorithm is implemented using the parallel computational capability of commercially available graphical processing units to greatly reduce the computation time involved. In the current setup and example data sets (211 cases, 222 controls, 299468 SNPs; and 601 cases, 825 controls, 291095 SNPs), this coefficient evaluation stage can be completed in roughly 1 day. Our method allows for exhaustive and rapid detection of significant SNP pair interactions without imposing significant marginal effects of the single loci involved in the pair. PMID:21150885
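The first-stage screen lends itself to a compact CPU illustration. The sketch below (plain NumPy on synthetic genotypes; the paper's implementation runs on GPUs) computes the difference of SNP-SNP Pearson correlation matrices between cases and controls and keeps the top-scoring pairs for a second-stage regression. All sizes and names are illustrative.

```python
import numpy as np

# Stage 1 of an EPIBLASTER-style screen: difference of SNP-SNP Pearson
# correlations between cases and controls, on synthetic genotype data.
rng = np.random.default_rng(0)
n_cases, n_controls, n_snps = 200, 220, 500
cases = rng.integers(0, 3, size=(n_cases, n_snps)).astype(float)      # 0/1/2 genotypes
controls = rng.integers(0, 3, size=(n_controls, n_snps)).astype(float)

def corr(X):
    Z = (X - X.mean(0)) / X.std(0)      # assumes no SNP is constant in the sample
    return (Z.T @ Z) / X.shape[0]

delta = corr(cases) - corr(controls)    # n_snps x n_snps correlation difference
iu = np.triu_indices(n_snps, k=1)       # unique SNP pairs
scores = np.abs(delta[iu])

# keep the top pairs for the stage-2 logistic-regression likelihood-ratio test
top = np.argsort(scores)[::-1][:100]
pairs = list(zip(iu[0][top], iu[1][top]))
print(pairs[:5])
```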
A two-stage linear discriminant analysis via QR-decomposition.
Ye, Jieping; Li, Qi
2005-06-01
Linear Discriminant Analysis (LDA) is a well-known method for feature extraction and dimension reduction. It has been used widely in many applications involving high-dimensional data, such as image and text classification. An intrinsic limitation of classical LDA is the so-called singularity problems; that is, it fails when all scatter matrices are singular. Many LDA extensions were proposed in the past to overcome the singularity problems. Among these extensions, PCA+LDA, a two-stage method, received relatively more attention. In PCA+LDA, the LDA stage is preceded by an intermediate dimension reduction stage using Principal Component Analysis (PCA). Most previous LDA extensions are computationally expensive, and not scalable, due to the use of Singular Value Decomposition or Generalized Singular Value Decomposition. In this paper, we propose a two-stage LDA method, namely LDA/QR, which aims to overcome the singularity problems of classical LDA, while achieving efficiency and scalability simultaneously. The key difference between LDA/QR and PCA+LDA lies in the first stage, where LDA/QR applies QR decomposition to a small matrix involving the class centroids, while PCA+LDA applies PCA to the total scatter matrix involving all training data points. We further justify the proposed algorithm by showing the relationship among LDA/QR and previous LDA methods. Extensive experiments on face images and text documents are presented to show the effectiveness of the proposed algorithm.
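The two-stage structure is easy to prototype. Below is a minimal NumPy sketch of an LDA/QR-style method under the simplest possible assumptions (one thin QR of the class-centroid matrix, then a small eigenproblem in the reduced space); it follows the description above rather than the authors' exact algorithm.

```python
import numpy as np

# Two-stage LDA/QR sketch: stage 1 projects onto the range of the class-centroid
# matrix via thin QR; stage 2 solves a small k x k discriminant eigenproblem.
def lda_qr(X, y):
    classes = np.unique(y)
    centroids = np.column_stack([X[y == c].mean(0) for c in classes])  # d x k
    Q, _ = np.linalg.qr(centroids)          # d x k, orthonormal columns
    Z = X @ Q                               # reduced representation, n x k
    mean = Z.mean(0)
    Sw = sum(np.cov(Z[y == c].T, bias=True) * (y == c).sum() for c in classes)
    Sb = sum((y == c).sum() * np.outer(Z[y == c].mean(0) - mean,
                                       Z[y == c].mean(0) - mean) for c in classes)
    w, V = np.linalg.eig(np.linalg.solve(Sw + 1e-8 * np.eye(len(mean)), Sb))
    order = np.argsort(w.real)[::-1]
    return Q @ V.real[:, order]             # final d x k transformation

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 1, (30, 50)) for m in (0, 2, 4)])  # high-dim toy data
y = np.repeat([0, 1, 2], 30)
print(lda_qr(X, y).shape)   # (50, 3)
```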
Multilevel Analysis Methods for Partially Nested Cluster Randomized Trials
ERIC Educational Resources Information Center
Sanders, Elizabeth A.
2011-01-01
This paper explores multilevel modeling approaches for 2-group randomized experiments in which a treatment condition involving clusters of individuals is compared to a control condition involving only ungrouped individuals, otherwise known as partially nested cluster randomized designs (PNCRTs). Strategies for comparing groups from a PNCRT in the…
Śliwińska, Anna; Burchart-Korol, Dorota; Smoliński, Adam
2017-01-01
This paper presents a life cycle assessment (LCA) of greenhouse gas emissions generated through a methanol and electricity co-production system based on coal gasification technology. The analysis focuses on polygeneration technologies from which two products are produced, and thus issues related to an allocation procedure for LCA are addressed in this paper. In the LCA, two methods were used: a 'system expansion' method, based on the 'avoided burdens' and 'direct system enlargement' approaches, and an 'allocation' method involving proportional partitioning based on physical relationships in the technological process. Cause-effect relationships in the analysed production process were identified, allowing for the identification of allocation factors. The 'system expansion' method involved expanding the analysis to include five additional variants of electricity production technologies in Poland (alternative technologies). This method revealed the environmental consequences of implementation for the analysed technologies. It was found that the LCA of polygeneration technologies based on the 'system expansion' method generated a more complete source of information on environmental consequences than the 'allocation' method. The analysis shows that the choice of alternative technologies for generating LCA results is crucial. Life cycle assessment was performed for the analysed, reference and variant alternative technologies. Comparative analysis was performed between the analysed technologies of methanol and electricity co-production from coal gasification and a reference technology of methanol production from the natural gas reforming process. Copyright © 2016 Elsevier B.V. All rights reserved.
Patton, Sandra; Hutton, Eve
2016-08-01
The active involvement of parents and children in goal setting and intervention is integral to contemporary occupational therapy process models. However, parental perspectives on collaborative handwriting intervention are limited. This paper presents parental perspectives on a three-way collaboration involving teachers, parents and an occupational therapist in the application of Handwriting Without Tears® (HWT®) with children with Down syndrome. Within a larger mixed methods study, 44 parents completed purpose-designed questionnaires and six parents participated in a focus group after 8 months of programme implementation. Both methods gathered parents' perspectives on the usefulness and limitations of applying HWT®. The focus group explored collaboration in depth. Analysis involved triangulation of data from descriptive analysis of numerical data with content analysis of open-ended questions and focus group data. Enablers of parent-child engagement in HWT® were identified as: the parent- and child-friendly aspects of HWT®; the teacher involvement, which ensured continuity and eased demands on parents; the ongoing support/guidance of the occupational therapist; and the child's involvement in HWT® group intervention. The occupational therapist's involvement was reported as essential to encouraging teacher/parent involvement. Barriers to child-parent engagement included fluctuations in child health, mood and attention span, and time limitations including the child's involvement in other therapy programmes. Parents perceived HWT® and the three-way collaborative approach as enabling active parent-child engagement in handwriting intervention. This approach warrants further investigation. Findings have the potential to inform practice guidelines and pre- and post-graduation education related to collaborative handwriting intervention with children with Down syndrome and their families. © 2016 Occupational Therapy Australia.
The Problem With the Placement Study.
ERIC Educational Resources Information Center
Miner, Norris
This study compared the effectiveness and efficiency of two alternative methods for determining the status of graduates of Seminole Community College. The first method involved the identification of graduates, design and mailing of a questionnaire, and analysis of response data, as mandated by the state. The second method compared computer data…
A multiclass multiresidue LC-MS/MS method for analysis of veterinary drugs in bovine kidney
USDA-ARS?s Scientific Manuscript database
The increased efficiency permitted by multiclass, multiresidue methods has made such approaches very attractive to laboratories involved in monitoring veterinary drug residues in animal tissues. In this current work, evaluation of a multiclass multiresidue LC-MS/MS method in bovine kidney is describ...
ERIC Educational Resources Information Center
Grimm, Kevin J.
2007-01-01
Recent advances in methods and computer software for longitudinal data analysis have pushed researchers to more critically examine developmental theories. In turn, researchers have also begun to push longitudinal methods by asking more complex developmental questions. One such question involves the relationships between two developmental…
Supporting Space Systems Design via Systems Dependency Analysis Methodology
NASA Astrophysics Data System (ADS)
Guariniello, Cesare
The increasing size and complexity of space systems and space missions pose severe challenges to space systems engineers. When complex systems and Systems-of-Systems are involved, the behavior of the whole entity is not only due to that of the individual systems involved but also to the interactions and dependencies between the systems. Dependencies can be varied and complex, and designers usually do not perform analysis of the impact of dependencies at the level of complex systems, or this analysis involves excessive computational cost, or occurs at a later stage of the design process, after designers have already set detailed requirements, following a bottom-up approach. While classical systems engineering attempts to integrate the perspectives involved across the variety of engineering disciplines and the objectives of multiple stakeholders, there is still a need for more effective tools and methods capable of identifying, analyzing, and quantifying properties of the complex system as a whole and of explicitly modeling the effect of some of the features that characterize complex systems. This research describes the development and usage of Systems Operational Dependency Analysis and Systems Developmental Dependency Analysis, two methods based on parametric models of the behavior of complex systems, one in the operational domain and one in the developmental domain. The parameters of the developed models have intuitive meaning, are usable with subjective and quantitative data alike, and give direct insight into the causes of observed, and possibly emergent, behavior. The approach proposed in this dissertation combines models of one-to-one dependencies among systems and between systems and capabilities, to analyze and evaluate the impact of failures or delays on the outcome of the whole complex system. The analysis accounts for cascading effects, partial operational failures, multiple failures or delays, and partial developmental dependencies. The user of these methods can assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities.
A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.
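As a toy illustration of the operational-dependency idea (not the SODA formulation itself), the sketch below propagates a degraded self-status through a small dependency graph; the systems, weights, and degradation rule are all invented for the example.

```python
# Toy cascade in the spirit of operational dependency analysis: each system's
# operability is its own internal status, degraded by the weighted loss of
# operability of the systems it depends on (graph must be acyclic).
deps = {                      # system -> list of (feeder, dependency strength 0..1)
    "comms":   [("power", 0.8)],
    "nav":     [("power", 0.6), ("comms", 0.4)],
    "science": [("power", 0.9), ("nav", 0.5)],
    "power":   [],
}
self_status = {"power": 0.5, "comms": 1.0, "nav": 1.0, "science": 1.0}  # power degraded

def operability(system, memo=None):
    if memo is None:
        memo = {}
    if system not in memo:
        penalty = sum(s * (1.0 - operability(f, memo)) for f, s in deps[system])
        memo[system] = max(0.0, self_status[system] - penalty)
    return memo[system]

for s in deps:
    print(s, round(operability(s), 3))   # failure in "power" cascades downstream
```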
Kasote, Deepak M; Ghosh, Ritesh; Chung, Jun Young; Kim, Jonggeun; Bae, Inhwan; Bae, Hanhong
2016-01-01
Plant hormones are the key regulators of adaptive stress response. Abiotic stresses such as drought and salt are known to affect the growth and productivity of plants. It is well known that the levels of plant hormones such as zeatin (ZA), abscisic acid (ABA), salicylic acid (SA), jasmonic acid (JA), and brassinolide (BR) fluctuate upon abiotic stress exposure. At present, there is not any single suitable liquid chromatography-mass spectrometry (LC-MS) method for simultaneous analysis of BR and other plant hormones involved in abiotic stresses. In the present study, we developed a simple, sensitive, and rapid method for simultaneous analysis of five major plant hormones, ZA, ABA, JA, SA, and BR, which are directly or indirectly involved in drought and salt stresses. The optimized extraction procedure was simple and easy to use for simultaneous measurement of these plant hormones in Arabidopsis thaliana. The developed method is highly reproducible and can be adapted for simultaneous measurement of changes in plant hormones (ZA, ABA, JA, SA, and BR) in response to abiotic stresses in plants like A. thaliana and tomato.
A collocation-shooting method for solving fractional boundary value problems
NASA Astrophysics Data System (ADS)
Al-Mdallal, Qasem M.; Syam, Muhammed I.; Anwar, M. N.
2010-12-01
In this paper, we discuss the numerical solution of a special class of fractional boundary value problems of order 2. The method of solution combines collocation and spline analysis with a shooting method. The existence and uniqueness of the exact solution for this class of problems is proven. Two examples involving the Bagley-Torvik equation subject to boundary conditions are also presented; numerical results illustrate the accuracy of the present scheme.
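Since the fractional (Caputo) machinery cannot be reproduced in a few lines, the sketch below illustrates only the shooting component, on an integer-order analogue: the initial slope is adjusted until the integrated trajectory meets the right-hand boundary condition. The equation and tolerances are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Shooting-method sketch on an integer-order analogue (the paper's fractional
# Bagley-Torvik operator is omitted): solve u'' + u = 0 with u(0)=0, u(1)=1
# by finding the initial slope s such that u(1; s) hits the right boundary value.
def endpoint(s):
    sol = solve_ivp(lambda t, y: [y[1], -y[0]], (0.0, 1.0), [0.0, s],
                    rtol=1e-10, atol=1e-12)
    return sol.y[0, -1] - 1.0            # residual at the right boundary

s_star = brentq(endpoint, 0.0, 5.0)      # bracket and solve for the slope
print(s_star, 1.0 / np.sin(1.0))         # exact slope is 1/sin(1) ~ 1.1884
```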
Belaz, Kátia Roberta A; Pereira-Filho, Edenir Rodrigues; Oliveira, Regina V
2013-08-01
In this work, the development of two multidimensional liquid chromatography methods coupled to a fluorescence detector is described for direct analysis of microsomal fractions obtained from rat livers. The chiral multidimensional method was then applied for the optimization of the in vitro metabolism of albendazole by experimental design. Albendazole was selected as a model drug because of its anthelmintic properties and recent potential for cancer treatment. The development of two fully automated achiral-chiral and chiral-chiral high performance liquid chromatography (HPLC) methods for the determination of albendazole (ABZ) and its metabolites albendazole sulphoxide (ABZ-SO), albendazole sulphone (ABZ-SO2) and albendazole 2-aminosulphone (ABZ-SO2NH2) in microsomal fractions is described. These methods involve the use of a phenyl (RAM-phenyl-BSA) or octyl (RAM-C8-BSA) restricted access media bovine serum albumin column for the sample clean-up, followed by an achiral phenyl column (15.0 × 0.46 cm I.D.) or a chiral amylose tris(3,5-dimethylphenylcarbamate) column (15.0 × 0.46 cm I.D.). The chiral 2D HPLC method was applied to the development of a compromise condition for the in vitro metabolism of ABZ by means of experimental design involving multivariate analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
An Isotopic Dilution Experiment Using Liquid Scintillation: A Simple Two-System, Two-Phase Analysis.
ERIC Educational Resources Information Center
Moehs, Peter J.; Levine, Samuel
1982-01-01
A simple isotopic dilution analysis, whose principles apply to methods of more complex radioanalyses, is described. The experiment is suitable for clinical and instrumental analysis chemistry students; experimental manipulations are kept to a minimum, involving only aqueous extraction before counting. Background information, procedures, and results are discussed.…
Counseling Workers over 40: GULHEMP, a New Approach.
ERIC Educational Resources Information Center
Meredith, Jack
This series of presentations describes a method of job counseling and placement for the middle-aged that combines pre-employment physical worker analysis with job analysis for effective matching of job requirements with worker capacities. The matching process involves these steps: (1) job analysis by an industrial engineer; (2) worker examination…
ERIC Educational Resources Information Center
Hwang, Heungsun; Montreal, Hec; Dillon, William R.; Takane, Yoshio
2006-01-01
An extension of multiple correspondence analysis is proposed that takes into account cluster-level heterogeneity in respondents' preferences/choices. The method involves combining multiple correspondence analysis and k-means in a unified framework. The former is used for uncovering a low-dimensional space of multivariate categorical variables…
Experimental and Computational Analysis of Modes in a Partially Constrained Plate
2004-03-01
way to quantify a structure. One technique utilizing an energy method is the Statistical Energy Analysis (SEA). The SEA process involves regarding...B.R. Mace. “Statistical Energy Analysis of Two Edge-Coupled Rectangular Plates: Ensemble Averages,” Journal of Sound and Vibration, 193(4): 793-822
Georgsson, Mattias; Kushniruk, Andre
2016-01-01
The cognitive walkthrough (CW) is a task-based, expert inspection usability evaluation method with benefits such as cost effectiveness and efficiency. A drawback of the method is that it does not involve the perspective of real users but instead relies on experts' predictions about the usability of the system and how users interact with it. In this paper, we propose a way of involving the user in an expert evaluation method by modifying the CW with patient groups as mediators. This, along with other modifications, includes a dual-domain session facilitator, specific patient groups and three different phases: 1) a preparation phase, where suitable tasks are developed by a panel of experts and patients and validated through the content validity index; 2) a patient user evaluation phase, including an individual and a collaborative process part; and 3) an analysis and coding phase, where all data are digitalized and synthesized using Qualitative Data Analysis Software (QDAS) to determine usability deficiencies. We predict that this way of evaluating will retain the benefits of the expert methods while also providing a way of including the patient user of these self-management systems. Results from this prospective study should provide evidence of the usefulness of this method modification.
Transient loads analysis for space flight applications
NASA Technical Reports Server (NTRS)
Thampi, S. K.; Vidyasagar, N. S.; Ganesan, N.
1992-01-01
A significant part of the flight readiness verification process involves transient analysis of the coupled Shuttle-payload system to determine the low frequency transient loads. This paper describes a methodology for transient loads analysis and its implementation for the Spacelab Life Sciences Mission. The analysis is carried out using two major software tools - NASTRAN and an external FORTRAN code called EZTRAN. This approach is adopted to overcome some of the limitations of NASTRAN's standard transient analysis capabilities. The method uses Data Recovery Matrices (DRM) to improve computational efficiency. The mode acceleration method is fully implemented in the DRM formulation to recover accurate displacements, stresses, and forces. The advantages of the method are demonstrated through a numerical example.
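The modal-transient core of such an analysis is straightforward to sketch outside NASTRAN. The example below (generic Python, not the EZTRAN formulation) integrates a truncated set of decoupled modal equations for a small spring-mass chain and recovers displacements with both the mode-displacement and the mode-acceleration method; the latter adds the static contribution of the omitted modes, which is why it recovers more accurate displacements and loads.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Modal transient sketch: decouple M u'' + K u = f(t) with mass-normalized
# modes, integrate the retained modal equations, and recover displacements.
n = 4
K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)    # spring chain stiffness
w2, Phi = np.linalg.eigh(K)                             # mass matrix M = I here
zeta = 0.02                                             # modal damping ratio
f = lambda t: np.array([0, 0, 0, np.sin(5 * t)])        # tip load

r = 2                                                   # number of retained modes
def rhs(t, y):
    q, qd = y[:r], y[r:]
    p = Phi[:, :r].T @ f(t)
    return np.concatenate([qd, p - 2 * zeta * np.sqrt(w2[:r]) * qd - w2[:r] * q])

sol = solve_ivp(rhs, (0, 10), np.zeros(2 * r), t_eval=[10.0], rtol=1e-8)
q = sol.y[:r, -1]
p = Phi[:, :r].T @ f(10.0)

u_md = Phi[:, :r] @ q                                   # mode displacement
u_ma = np.linalg.solve(K, f(10.0)) + Phi[:, :r] @ (q - p / w2[:r])  # mode acceleration
print(u_md, u_ma)   # mode acceleration adds the static content of omitted modes
```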
Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette
2015-01-01
Background It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders to make technology reflect their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step while developing eHealth, comprehensive and applicable approaches geared toward business modeling in eHealth are scarce. Objective This paper demonstrates the potential of several stakeholder-oriented analysis methods, illustrating their practical application with Infectionmanager as an example case. We aim to demonstrate how business modeling, with the focus on stakeholder involvement, is used to co-create an eHealth implementation. Methods We divided business modeling into 4 main research steps. As part of stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analyses, we performed “basic stakeholder analysis,” stakeholder salience, and ranking/analytic hierarchy process (Step 2). For value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Results Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth. Conclusions The step-by-step guide for business modeling with stakeholder involvement enables eHealth researchers to apply a systematic and multidisciplinary, co-creative approach to implementing eHealth. Business modeling becomes an active part of the entire development process of eHealth and establishes an early focus on implementation, in which stakeholders help to co-create the basis necessary for the success and uptake of the eHealth technology. PMID:26272510
Traditional and Cognitive Job Analyses as Tools for Understanding the Skills Gap.
ERIC Educational Resources Information Center
Hanser, Lawrence M.
Traditional methods of job and task analysis may be categorized as worker-oriented methods focusing on general human behaviors performed by workers in jobs or as job-oriented methods focusing on the technologies involved in jobs. The ability of both types of traditional methods to identify, understand, and communicate the skills needed in high…
Ito, Jun; Herter, Thomas; Baidoo, Edward E K; Lao, Jeemeng; Vega-Sánchez, Miguel E; Michelle Smith-Moritz, A; Adams, Paul D; Keasling, Jay D; Usadel, Björn; Petzold, Christopher J; Heazlewood, Joshua L
2014-03-01
Understanding the intricate metabolic processes involved in plant cell wall biosynthesis is limited by difficulties in performing sensitive quantification of many involved compounds. Hydrophilic interaction liquid chromatography is a useful technique for the analysis of hydrophilic metabolites from complex biological extracts and forms the basis of this method to quantify plant cell wall precursors. A zwitterionic silica-based stationary phase has been used to separate hydrophilic nucleotide sugars involved in cell wall biosynthesis from milligram amounts of leaf tissue. A tandem mass spectrometer operating in selected reaction monitoring mode was used to quantify nucleotide sugars. This method was highly repeatable and quantified 12 nucleotide sugars at low femtomole quantities, with linear responses over four orders of magnitude, up to several hundred picomoles. The method was also successfully applied to the analysis of purified leaf extracts from two model plant species with variations in their cell wall sugar compositions and indicated significant differences in the levels of 6 out of 12 nucleotide sugars. The plant nucleotide sugar extraction procedure was demonstrated to have good recovery rates with minimal matrix effects. The approach results in a significant improvement in sensitivity when applied to plant samples over currently employed techniques. Copyright © 2013 Elsevier Inc. All rights reserved.
Torey, Angeline; Sasidharan, Sreenivasan; Yeng, Chen; Latha, Lachimanan Yoga
2010-05-10
Quality control standardization of the various medicinal plants used in traditional medicine is becoming more important today in view of the commercialization of formulations based on these plants. An attempt at standardization of Cassia spectabilis leaf has been carried out with respect to authenticity, assay and chemical constituent analysis. The authentication involved many parameters, including gross morphology, microscopy of the leaves and functional group analysis by Fourier Transform Infrared (FTIR) spectroscopy. The assay part of standardization involved determination of the minimum inhibitory concentration (MIC) of the extract, which could help assess the chemical effects and establish curative values. The MIC of the C. spectabilis leaf extracts was investigated using the broth dilution method. The extracts showed a MIC value of 6.25 mg/mL, independent of the extraction time. The chemical constituent aspect of standardization involves quantification of the main chemical components in C. spectabilis. The GC-MS method used for quantification of 2,4-(1H,3H)-pyrimidinedione in the extract was rapid, accurate, precise, linear (R² = 0.8685), rugged and robust. Hence this method is suitable for quantification of this component in C. spectabilis. The standardization of C. spectabilis is needed to facilitate marketing of medicinal plants, with a view to promoting the export of valuable Malaysian traditional medicinal plants such as C. spectabilis.
Enrollment Projection within a Decision-Making Framework.
ERIC Educational Resources Information Center
Armstrong, David F.; Nunley, Charlene Wenckowski
1981-01-01
Two methods used to predict enrollment at Montgomery College in Maryland are compared and evaluated, and the administrative context in which they are used is considered. The two methods involve time series analysis (curve fitting) and indicator techniques (yield from components). (MSE)
Developments in Sampling and Analysis Instrumentation for Stationary Sources
ERIC Educational Resources Information Center
Nader, John S.
1973-01-01
Instrumentation for the measurement of pollutant emissions is considered including sample-site selection, sample transport, sample treatment, sample analysis, and data reduction, display, and interpretation. Measurement approaches discussed involve sample extraction from within the stack and electro-optical methods. (BL)
ERIC Educational Resources Information Center
Sarkis, Vahak D.
1974-01-01
Describes a method (involving a Hach Colorimeter and simplified procedures) that can be used for the analysis of up to 56 different chemical constituents of water. Presents the results of student analysis of waters of Fulton and Montgomery counties in New York. (GS)
van Limburg, Maarten; Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette
2015-08-13
It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders to make technology reflect their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step while developing eHealth, comprehensive and applicable approaches geared toward business modeling in eHealth are scarce. This paper demonstrates the potential of several stakeholder-oriented analysis methods, illustrating their practical application with Infectionmanager as an example case. We aim to demonstrate how business modeling, with the focus on stakeholder involvement, is used to co-create an eHealth implementation. We divided business modeling into 4 main research steps. As part of stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analyses, we performed "basic stakeholder analysis," stakeholder salience, and ranking/analytic hierarchy process (Step 2). For value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth. The step-by-step guide for business modeling with stakeholder involvement enables eHealth researchers to apply a systematic and multidisciplinary, co-creative approach to implementing eHealth. Business modeling becomes an active part of the entire development process of eHealth and establishes an early focus on implementation, in which stakeholders help to co-create the basis necessary for the success and uptake of the eHealth technology.
Lizier, Joseph T; Heinzle, Jakob; Horstmann, Annette; Haynes, John-Dylan; Prokopenko, Mikhail
2011-02-01
The human brain undertakes highly sophisticated information processing facilitated by the interaction between its sub-regions. We present a novel method for interregional connectivity analysis, using multivariate extensions to the mutual information and transfer entropy. The method allows us to identify the underlying directed information structure between brain regions, and how that structure changes according to behavioral conditions. This method is distinguished in using asymmetric, multivariate, information-theoretical analysis, which captures not only directional and non-linear relationships, but also collective interactions. Importantly, the method is able to estimate multivariate information measures with only relatively little data. We demonstrate the method to analyze functional magnetic resonance imaging time series to establish the directed information structure between brain regions involved in a visuo-motor tracking task. Importantly, this results in a tiered structure, with known movement planning regions driving visual and motor control regions. Also, we examine the changes in this structure as the difficulty of the tracking task is increased. We find that task difficulty modulates the coupling strength between regions of a cortical network involved in movement planning and between motor cortex and the cerebellum which is involved in the fine-tuning of motor control. It is likely these methods will find utility in identifying interregional structure (and experimentally induced changes in this structure) in other cognitive tasks and data modalities.
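Under Gaussian assumptions, transfer entropy reduces to a ratio of conditional residual variances, which makes a compact illustration possible. The sketch below implements that linear-Gaussian special case with one lag (the paper's multivariate estimator is more general); the coupled test signals are synthetic.

```python
import numpy as np

# Linear-Gaussian transfer entropy sketch with one lag:
# TE(X->Y) = 0.5 * ln( var(Y_t | Y_past) / var(Y_t | Y_past, X_past) ).
def resid_var(target, preds):
    X = np.column_stack([np.ones(len(target))] + preds)
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.var(target - X @ beta)

def transfer_entropy(x, y):
    yt, ypast, xpast = y[1:], y[:-1], x[:-1]
    return 0.5 * np.log(resid_var(yt, [ypast]) / resid_var(yt, [ypast, xpast]))

rng = np.random.default_rng(2)
x = rng.normal(size=5000)
y = np.zeros(5000)
for t in range(1, 5000):                 # y is driven by x with a one-step lag
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

print(transfer_entropy(x, y), transfer_entropy(y, x))   # X->Y should dominate Y->X
```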
Marraccini, Marisa E; Brick, Leslie Ann D; Weyandt, Lisa L
2018-03-22
Although bullying is traditionally considered within the context of primary and secondary school, recent evidence suggests that bullying continues into college and workplace settings. Participants/Method: Latent class analysis (LCA) was employed to classify college bullying involvement typologies among 325 college students attending a northeastern university. Four classes concerning bullying involvement were revealed: Non-involved (36%); Instructor victim (30%); Peer bully-victim (22%); and Peer bully-victim/Instructor victim (12%). Findings from this study, which classified college bullying experiences by incorporating both peer and instructor (teacher and professor) bullying, add substantially to the literature by providing insight into patterns of relatively unexplored bullying behaviors.
Proteomic Analysis of Cytoskeleton Proteins in Fish.
Gotesman, Michael; Menanteau-Ledouble, Simon; El-Matbouli, Mansour
2016-01-01
In this chapter, we describe laboratory protocols for rearing fish and a simple and efficient method of extracting and identifying pathogen and host proteins that may be involved in entry and replication of commercially important fish viruses. We have used the common carp (Cyprinus carpio L.) and goldfish (Carassius auratus) as a model system for studies of proteins involved in viral entry and replication. The chapter describes detailed protocols for maintenance of carp, cell culture, antibody purification of proteins, and use of electrospray-ionization mass spectrometry analysis to screen and identify cytoskeleton and other proteins that may be involved in viral infection and propagation in fish.
Guidance for Organisational Strategy on Knowledge to Action from Conceptual Frameworks and Practice
ERIC Educational Resources Information Center
Willis, Cameron; Riley, Barbara; Lewis, Mary; Stockton, Lisa; Yessis, Jennifer
2017-01-01
This paper aims to provide public health organisations involved in chronic disease prevention with conceptual and practical guidance for developing contextually sensitive knowledge-to-action (KTA) strategies. Methods involve an analysis of 13 relevant conceptual KTA frameworks, and a review of three case examples of organisations with active KTA…
Family Involvement in Creative Teaching Practices for All in Small Rural Schools
ERIC Educational Resources Information Center
Vigo Arrazola, Begoña; Soriano Bozalongo, Juana
2015-01-01
Parental involvement is interpreted as a key form of support that can contribute to the establishment of inclusive practices in schools, but this can be difficult in sparsely populated areas. Using ethnographic methods of participant observation, informal conversations and document analysis, this article therefore focuses on family involvement…
Cognitive Process Modeling of Spatial Ability: The Assembling Objects Task
ERIC Educational Resources Information Center
Ivie, Jennifer L.; Embretson, Susan E.
2010-01-01
Spatial ability tasks appear on many intelligence and aptitude tests. Although the construct validity of spatial ability tests has often been studied through traditional correlational methods, such as factor analysis, less is known about the cognitive processes involved in solving test items. This study examines the cognitive processes involved in…
Microbial ecology laboratory procedures manual NASA/MSFC
NASA Technical Reports Server (NTRS)
Huff, Timothy L.
1990-01-01
An essential part of the efficient operation of any microbiology laboratory involved in sample analysis is a standard procedures manual. The purpose of this manual is to provide concise and well defined instructions on routine technical procedures involving sample analysis and methods for monitoring and maintaining quality control within the laboratory. Of equal importance is the safe operation of the laboratory. This manual outlines detailed procedures to be followed in the microbial ecology laboratory to assure safety, analytical control, and validity of results.
Keller, T; Schneider, A; Regenscheit, P; Dirnhofer, R; Rücker, T; Jaspers, J; Kisser, W
1999-01-11
A new method has been developed for the rapid analysis of psilocybin and/or psilocin in fungus material using ion mobility spectrometry. Quantitative analysis was performed by gas chromatography-mass spectrometry after a simple one-step extraction involving homogenization of the dried fruit bodies of fungi in chloroform and derivatization with MSTFA. The proposed methods resulted in rapid procedures useful in analyzing psychotropic fungi for psilocybin and psilocin.
ERIC Educational Resources Information Center
McLaren, Ingrid Ann Marie
2012-01-01
This paper describes a study which uses quantitative and qualitative methods in determining the relationship between academic, institutional and psychological variables and degree performance for a sample of Jamaican undergraduate students. Quantitative methods, traditionally associated with the positivist paradigm, and involving the counting and…
Conformal mapping for multiple terminals
Wang, Weimin; Ma, Wenying; Wang, Qiang; Ren, Hao
2016-01-01
Conformal mapping is an important mathematical tool that can be used to solve various physical and engineering problems in many fields, including electrostatics, fluid mechanics, classical mechanics, and transformation optics. It is an accurate and convenient way to solve problems involving two terminals. However, when faced with problems involving three or more terminals, which are more common in practical applications, existing conformal mapping methods apply assumptions or approximations. A general exact method does not exist for a structure with an arbitrary number of terminals. This study presents a conformal mapping method for multiple terminals. Through an accurate analysis of boundary conditions, additional terminals or boundaries are folded into the inner part of a mapped region. The method is applied to several typical situations, and the calculation process is described for two examples of an electrostatic actuator with three electrodes and of a light beam splitter with three ports. Compared with previously reported results, the solutions for the two examples based on our method are more precise and general. The proposed method is helpful in promoting the application of conformal mapping in analysis of practical problems. PMID:27830746
The Impact of Intervention Methods on Emotional Intelligence
ERIC Educational Resources Information Center
Davis, Christopher M.
2013-01-01
This experimental study continued the exploration surrounding emotional intelligence (EI). Emotional intelligence was examined through past and present literature, instrumentation, didactic teaching methods employing EI concepts, and data analysis. The experiment involved participants from two sections of an undergraduate economics class at a…
Q-Type Factor Analysis of Healthy Aged Men.
ERIC Educational Resources Information Center
Kleban, Morton H.
Q-type factor analysis was used to re-analyze baseline data collected in 1957 on 47 men aged 65-91. Q-type analysis is the use of factor methods to study persons rather than tests. Although 550 variables were originally studied involving psychiatry, medicine, cerebral metabolism and chemistry, personality, audiometry, dichotic and diotic memory,…
The multiple imputation method: a case study involving secondary data analysis.
Walani, Salimah R; Cleland, Charles M
2015-05-01
To illustrate with the example of a secondary data analysis study the use of the multiple imputation method to replace missing data. Most large public datasets have missing data, which need to be handled by researchers conducting secondary data analysis studies. Multiple imputation is a technique widely used to replace missing values while preserving the sample size and sampling variability of the data. The 2004 National Sample Survey of Registered Nurses. The authors created a model to impute missing values using the chained equation method. They used imputation diagnostics procedures and conducted regression analysis of imputed data to determine the differences between the log hourly wages of internationally educated and US-educated registered nurses. The authors used multiple imputation procedures to replace missing values in a large dataset with 29,059 observations. Five multiple imputed datasets were created. Imputation diagnostics using time series and density plots showed that imputation was successful. The authors also present an example of the use of multiple imputed datasets to conduct regression analysis to answer a substantive research question. Multiple imputation is a powerful technique for imputing missing values in large datasets while preserving the sample size and variance of the data. Even though the chained equation method involves complex statistical computations, recent innovations in software and computation have made it possible for researchers to conduct this technique on large datasets. The authors recommend nurse researchers use multiple imputation methods for handling missing data to improve the statistical power and external validity of their studies.
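A minimal end-to-end illustration of the workflow (chained-equation imputation of m datasets, per-dataset analysis, pooling of estimates) can be put together with scikit-learn's IterativeImputer, shown below on synthetic data. The survey data, model, and settings of the study are not reproduced, and only part of Rubin's rules (the pooled point estimate and between-imputation variance) is shown.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Chained-equation multiple imputation sketch on synthetic regression data.
rng = np.random.default_rng(3)
n = 1000
X = rng.normal(size=(n, 3))
y = X @ np.array([1.0, -0.5, 0.3]) + rng.normal(scale=0.5, size=n)
data = np.column_stack([y, X])
mask = rng.random(data.shape) < 0.15        # 15% missing completely at random
data[mask] = np.nan

m, betas = 5, []
for seed in range(m):                       # m imputed datasets, m analyses
    imp = IterativeImputer(sample_posterior=True, random_state=seed, max_iter=10)
    d = imp.fit_transform(data)
    yi, Xi = d[:, 0], np.column_stack([np.ones(n), d[:, 1:]])
    betas.append(np.linalg.lstsq(Xi, yi, rcond=None)[0])

betas = np.array(betas)
pooled = betas.mean(0)                      # pooled point estimate
between = betas.var(0, ddof=1)              # between-imputation variance component
print(pooled, between)   # full Rubin's rules would also add within-imputation variance
```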
Translations on Eastern Europe, Scientific Affairs, No. 562
1977-10-28
remodeling and modernization of the institute's facilities resulted in an increase in the reactor's neutron flux and power output capacity and...research technique involving the use of the experimental reactor is neutron activation analysis. Using this method it is possible to produce...artificial radioactivity through the bombardment of non-active substances with neutrons. This is one of the most sensitive methods of chemical analysis
Yi, Zhou; Manil-Ségalen, Marion; Sago, Laila; Glatigny, Annie; Redeker, Virginie; Legouis, Renaud; Mucchielli-Giorgi, Marie-Hélène
2016-05-06
Affinity purifications followed by mass spectrometric analysis are used to identify protein-protein interactions. Because quantitative proteomic data are noisy, it is necessary to develop statistical methods to eliminate false-positives and identify true partners. We present here a novel approach for filtering false interactors, named "SAFER" for mass Spectrometry data Analysis by Filtering of Experimental Replicates, which is based on the reproducibility of the replicates and the fold-change of the protein intensities between bait and control. To identify regulators or targets of autophagy, we characterized the interactors of LGG-1, a ubiquitin-like protein involved in autophagosome formation in C. elegans. LGG-1 partners were purified by affinity, analyzed by nanoLC-MS/MS mass spectrometry, and quantified by a label-free proteomic approach based on the mass spectrometric signal intensity of peptide precursor ions. Because the selection of confident interactions depends on the method used for statistical analysis, we compared SAFER with several statistical tests and different scoring algorithms on this set of data. We show that SAFER recovers high-confidence interactors that have been ignored by the other methods and identified new candidates involved in the autophagy process. We further validated our method on a public data set and conclude that SAFER notably improves the identification of protein interactors.
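The filtering idea can be illustrated independently of the full SAFER statistics. The sketch below applies two toy criteria to a small synthetic bait/control intensity table: reproducibility across bait replicates (coefficient of variation) and a mean fold-change cutoff. The thresholds and data are invented, and the published method is more elaborate.

```python
import numpy as np
import pandas as pd

# SAFER-style filtering sketch: keep a protein as a candidate interactor only
# if it is reproducible across bait replicates AND enriched over the control.
rng = np.random.default_rng(4)
proteins = [f"P{i:03d}" for i in range(8)]
base = rng.lognormal(10, 1, 8)                      # per-protein base abundance
bait = pd.DataFrame(base[:, None] * rng.lognormal(0, 0.2, (8, 3)),
                    index=proteins, columns=["bait_1", "bait_2", "bait_3"])
ctrl = pd.DataFrame(base[:, None] * rng.lognormal(0, 0.2, (8, 3)),
                    index=proteins, columns=["ctrl_1", "ctrl_2", "ctrl_3"])
bait.iloc[0] *= 20          # one true interactor, enriched in the bait pull-down

reproducible = bait.std(axis=1) / bait.mean(axis=1) < 0.5   # toy CV threshold
fold_change = bait.mean(axis=1) / ctrl.mean(axis=1)
candidates = bait.index[reproducible & (fold_change >= 4)]  # toy cutoff
print(list(candidates))     # should recover the spiked protein P000
```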
Development of an SPE/CE method for analyzing HAAs
Zhang, L.; Capel, P.D.; Hozalski, R.M.
2007-01-01
The haloacetic acid (HAA) analysis methods approved by the US Environmental Protection Agency involve extraction and derivatization of HAAs (typically to their methyl ester form) and analysis by gas chromatography (GC) with electron capture detection (ECD). Concerns associated with these methods include the time and effort of the derivatization process, use of potentially hazardous chemicals or conditions during methylation, poor recoveries because of low extraction efficiencies for some HAAs or matrix effects from sulfate, and loss of tribromoacetic acid because of decarboxylation. The HAA analysis method introduced here uses solid-phase extraction (SPE) followed by capillary electrophoresis (CE) analysis. The method is accurate, reproducible, sensitive, relatively safe, and easy to perform, and avoids the use of large amounts of solvent for liquid-liquid extraction and the potential hazards and hassles of derivatization. The cost of analyzing HAAs using this method should be lower than the currently approved methods, and utilities with a GC/ECD can perform the analysis in-house.
A framework for performing workplace hazard and risk analysis: a participative ergonomics approach.
Morag, Ido; Luria, Gil
2013-01-01
Despite the unanimity among researchers about the centrality of workplace analysis based on participatory ergonomics (PE) as a basis for preventive interventions, there is still little agreement about the necessity of a theoretical framework for providing practical guidance. In an effort to develop a conceptual PE framework, the authors, focusing on 20 studies, found five primary dimensions for characterising an analytical structure: (1) extent of workforce involvement; (2) analysis duration; (3) diversity of reporter role types; (4) scope of analysis and (5) supportive information system for analysis management. An ergonomics analysis carried out in a chemical manufacturing plant serves as a case study for evaluating the proposed framework. The study simultaneously demonstrates the five dimensions and evaluates their feasibility. The study showed that managerial leadership was fundamental to the successful implementation of the analysis, that all job holders should participate in analysing their own workplace, and that simplified reporting methods contributed to a desirable outcome. This paper seeks to clarify the scope of workplace ergonomics analysis by offering a theoretical and structured framework for providing practical advice and guidance. Essential to successfully implementing the analytical framework are managerial involvement, participation of all job holders and simplified reporting methods.
Peckner, Ryan; Myers, Samuel A; Jacome, Alvaro Sebastian Vaca; Egertson, Jarrett D; Abelin, Jennifer G; MacCoss, Michael J; Carr, Steven A; Jaffe, Jacob D
2018-05-01
Mass spectrometry with data-independent acquisition (DIA) is a promising method to improve the comprehensiveness and reproducibility of targeted and discovery proteomics, in theory by systematically measuring all peptide precursors in a biological sample. However, the analytical challenges involved in discriminating between peptides with similar sequences in convoluted spectra have limited its applicability in important cases, such as the detection of single-nucleotide polymorphisms (SNPs) and alternative site localizations in phosphoproteomics data. We report Specter (https://github.com/rpeckner-broad/Specter), an open-source software tool that uses linear algebra to deconvolute DIA mixture spectra directly through comparison to a spectral library, thus circumventing the problems associated with typical fragment-correlation-based approaches. We validate the sensitivity of Specter and its performance relative to that of other methods, and show that Specter is able to successfully analyze cases involving highly similar peptides that are typically challenging for DIA analysis methods.
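The linear-algebra core of library-based deconvolution is simple to demonstrate. In the sketch below, a mixture spectrum is modeled as a nonnegative combination of library spectra and recovered with nonnegative least squares; this is a drastic simplification of Specter, with all dimensions and data synthetic.

```python
import numpy as np
from scipy.optimize import nnls

# Library-based deconvolution sketch: a DIA-like mixture spectrum is expressed
# as a nonnegative combination of library spectra and recovered with NNLS.
n_bins, n_peptides = 200, 5
rng = np.random.default_rng(5)
library = rng.random((n_bins, n_peptides))          # columns: library spectra
library /= library.sum(axis=0)                      # normalize each spectrum

true_amounts = np.array([3.0, 0.0, 1.5, 0.0, 0.7])  # only 3 peptides present
mixture = library @ true_amounts + 0.001 * rng.random(n_bins)  # noisy mixture

amounts, residual = nnls(library, mixture)
print(np.round(amounts, 2))   # should be close to [3.0, 0.0, 1.5, 0.0, 0.7]
```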
Effects of surface preparation on quality of aluminum alloy weldments
NASA Technical Reports Server (NTRS)
Kizer, D.; Saperstein, Z.
1968-01-01
Study of surface preparations and surface contamination effects on the welding of 2014 aluminum involves several methods of surface analysis to identify surface properties conducive to weld defects. These methods are radioactive evaporation, spectral reflectance, mass spectroscopy, gas chromatography, and spark emission spectroscopy.
Steam Hydrocarbon Cracking and Reforming
ERIC Educational Resources Information Center
Golombok, Michael
2004-01-01
The interactive methods of steam hydrocarbon reforming and cracking of the oil and chemical industries are scrutinized, with special focus on their resemblance and variations. The two methods are illustrations of equilibrium-controlled and kinetically-controlled processes, the analysis of which involves theories, which overlap and balance each…
Emerging and recurrent issues in drug development.
Anello, C
This paper reviews several emerging and recurrent issues relating to the drug development process. These emerging issues include changes to the FDA regulatory environment, internationalization of drug development, advances in computer technology and visualization tools, and efforts to incorporate meta-analysis methodology. Recurrent issues include: renewed interest in statistical methods for handling subgroups in the design and analysis of clinical trials; renewed interest in alternatives to the 'intention-to-treat' analysis in the presence of non-compliance in randomized clinical trials; renewed interest in methodology to address the multiplicities resulting from a variety of sources inherent in the drug development process, and renewed interest in methods to assure data integrity. These emerging and recurrent issues provide a continuing challenge to the international community of statisticians involved in drug development. Moreover, the involvement of statisticians with different perspectives continues to enrich the field and contributes to improvement in the public health.
Protecting privacy of shared epidemiologic data without compromising analysis potential.
Cologne, John; Grant, Eric J; Nakashima, Eiji; Chen, Yun; Funamoto, Sachiyo; Katayama, Hiroaki
2012-01-01
Ensuring privacy of research subjects when epidemiologic data are shared with outside collaborators involves masking (modifying) the data, but overmasking can compromise utility (analysis potential). Methods of statistical disclosure control for protecting privacy may be impractical for individual researchers involved in small-scale collaborations. We investigated a simple approach based on measures of disclosure risk and analytical utility that are straightforward for epidemiologic researchers to derive. The method is illustrated using data from the Japanese Atomic-bomb Survivor population. Masking by modest rounding did not adequately enhance security but rounding to remove several digits of relative accuracy effectively reduced the risk of identification without substantially reducing utility. Grouping or adding random noise led to noticeable bias. When sharing epidemiologic data, it is recommended that masking be performed using rounding. Specific treatment should be determined separately in individual situations after consideration of the disclosure risks and analysis needs.
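The recommended masking operation is easy to state concretely. The sketch below rounds a quantitative variable to k significant digits and tracks a crude disclosure proxy (the fraction of values still unique) against a crude utility proxy (relative bias of the mean). The paper's actual risk and utility measures are more careful, and the data here are synthetic.

```python
import numpy as np

# Masking-by-rounding sketch: keep only k significant digits of a variable.
def round_sig(x, k):
    x = np.asarray(x, dtype=float)
    mag = np.floor(np.log10(np.abs(x) + 1e-300))        # order of magnitude
    return np.round(x / 10 ** (mag - k + 1)) * 10 ** (mag - k + 1)

rng = np.random.default_rng(6)
dose = rng.lognormal(0, 1, 2000)                 # e.g., individual dose estimates
for k in (6, 3, 2, 1):
    masked = round_sig(dose, k)
    uniq = len(np.unique(masked)) / len(masked)  # fraction of still-unique values
    bias = abs(masked.mean() - dose.mean()) / dose.mean()
    print(k, round(uniq, 3), round(bias, 5))     # risk proxy falls, bias stays small
```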
Automatic Error Analysis Using Intervals
ERIC Educational Resources Information Center
Rothwell, E. J.; Cloud, M. J.
2012-01-01
A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
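A minimal self-contained version of the idea: represent each measured quantity as an interval and let arithmetic carry the worst-case bounds through the formula. The class below covers only the four basic operations (INTLAB, used in the paper, is far more complete), and the voltage/current example is invented.

```python
from dataclasses import dataclass

# Minimal interval arithmetic for automatic error propagation: each quantity
# carries [lo, hi] bounds, and every operation keeps the worst-case bounds.
@dataclass
class Interval:
    lo: float
    hi: float
    def __add__(self, o): return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o): return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(ps), max(ps))
    def __truediv__(self, o):     # assumes the divisor interval excludes zero
        return self * Interval(1 / o.hi, 1 / o.lo)

# resistance from measured voltage and current, each with instrument tolerance
V = Interval(11.9, 12.1)          # volts
I = Interval(0.98, 1.02)          # amperes
R = V / I
print(R.lo, R.hi)                 # bounds on R; the width is the propagated error
```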
Flow Analysis Tool White Paper
NASA Technical Reports Server (NTRS)
Boscia, Nichole K.
2012-01-01
Faster networks are continually being built to accommodate larger data transfers. While it is intuitive to think that implementing faster networks will result in higher throughput rates, this is often not the case. There are many elements involved in data transfer, many of which are beyond the scope of the network itself. Although networks may get bigger and support faster technologies, the presence of other legacy components, such as older application software or kernel parameters, can often cause bottlenecks. Engineers must be able to identify when data flows are reaching a bottleneck that is not imposed by the network and then troubleshoot it using the tools available to them. The current best practice is to collect as much information as possible on the network traffic flows so that analysis is quick and easy. Unfortunately, no single method of collecting this information can sufficiently capture the whole end-to-end picture. This becomes even more of a hurdle when large, multi-user systems are involved. In order to capture all the necessary information, multiple data sources are required. This paper presents a method for developing a flow analysis tool to effectively collect network flow data from multiple sources and provide that information to engineers in a clear, concise way for analysis. The purpose of this method is to collect enough information to quickly (and automatically) identify poorly performing flows along with the cause of the problem. The method involves the development of a set of database tables that can be populated with flow data from multiple sources, along with an easy-to-use, web-based front-end interface to help network engineers access, organize, analyze, and manage all the information.
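As a sketch of what such a multi-source schema might look like (table and column names are invented, not taken from the paper), the snippet below creates two SQLite tables, one for flow records and one for per-host parameters, and joins them to flag flows whose throughput falls below a nominal rate.

```python
import sqlite3

# Illustrative multi-source flow schema: flow records from one collector,
# per-host kernel/TCP parameters from another, joined for troubleshooting.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE flows (
    flow_id  INTEGER PRIMARY KEY,
    src_ip   TEXT NOT NULL,
    dst_ip   TEXT NOT NULL,
    start_ts REAL,
    end_ts   REAL,
    bytes    INTEGER
);
CREATE TABLE host_params (
    ip          TEXT PRIMARY KEY,
    tcp_window  INTEGER
);
""")
con.execute("INSERT INTO flows VALUES (1, '10.0.0.1', '10.0.0.2', 0, 60, 3000000000)")
con.execute("INSERT INTO host_params VALUES ('10.0.0.1', 65535)")

# flag flows whose throughput is far below a nominal 1 Gb/s link (illustrative)
rows = con.execute("""
SELECT f.flow_id,
       f.bytes * 8.0 / (f.end_ts - f.start_ts) AS bps,
       h.tcp_window
FROM flows f JOIN host_params h ON h.ip = f.src_ip
WHERE f.bytes * 8.0 / (f.end_ts - f.start_ts) < 1e9
""").fetchall()
print(rows)
```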
ERIC Educational Resources Information Center
Eckes, Suzanne E.
2017-01-01
This article examines an education policy matter that involves homophobic speech in public schools. Using legal research methods, two federal circuit court opinions that have examined the tension surrounding anti-LGBTQ student expression are analyzed. This legal analysis provides non-lawyers some insight into the current realities of student…
Multicriteria analysis of ontologically represented information
NASA Astrophysics Data System (ADS)
Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Bǎdicǎ, C.; Ivanovic, M.; Lirkov, I.
2014-11-01
Our current work concerns the development of a decision support system for the software selection problem. The main idea is to utilize expert knowledge to help the user in selecting the best software / method / computational resource to solve a computational problem. Obviously, this involves multicriterial decision making, and the key open question is: which method to choose. The context of the work is provided by the Agents in Grid (AiG) project, where the software selection (and thus multicriterial analysis) is to be realized when all information concerning the problem, the hardware and the software is ontologically represented. Initially, we considered the Analytical Hierarchy Process (AHP), which is well suited for hierarchical data structures (e.g., such as have been formulated in terms of ontologies). However, due to its well-known shortcomings, we decided to extend our search for the multicriterial analysis method best suited to the problem in question. In this paper we report the results of our search, which involved: (i) TOPSIS (Technique for Order Preference by Similarity to Ideal Solution); (ii) PROMETHEE; and (iii) GRIP (Generalized Regression with Intensities of Preference). We also briefly argue why other methods were not considered as valuable candidates.
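Of the candidates examined, TOPSIS is the most compact to illustrate. The sketch below is a standard textbook implementation applied to an invented software-selection matrix (three candidates, three criteria with weights and benefit/cost directions); it is not tied to the AiG ontologies.

```python
import numpy as np

# Minimal TOPSIS: rank alternatives by relative closeness to the ideal solution.
def topsis(X, weights, benefit):
    R = X / np.linalg.norm(X, axis=0)            # vector-normalize each criterion
    V = R * weights
    ideal = np.where(benefit, V.max(0), V.min(0))
    anti  = np.where(benefit, V.min(0), V.max(0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti,  axis=1)
    return d_neg / (d_pos + d_neg)               # higher score = better alternative

# 3 software candidates x 3 criteria: speed (benefit), cost (cost), accuracy (benefit)
X = np.array([[90.0, 10.0, 0.95],
              [70.0,  5.0, 0.90],
              [95.0, 20.0, 0.97]])
scores = topsis(X, weights=np.array([0.4, 0.3, 0.3]),
                benefit=np.array([True, False, True]))
print(scores, scores.argmax())
```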
Brandfass, Christoph; Karlovsky, Petr
2006-01-23
Fusarium head blight (FHB) is a disease of cereal crops, which has a severe impact on wheat and barley production worldwide. Apart from reducing the yield and impairing grain quality, FHB leads to contamination of grain with toxic secondary metabolites (mycotoxins), which pose a health risk to humans and livestock. The Fusarium species primarily involved in FHB are F. graminearum and F. culmorum. A key prerequisite for a reduction in the incidence of FHB is an understanding of its epidemiology. We describe a duplex-PCR-based method for the simultaneous detection of F. culmorum and F. graminearum in plant material. Species-specific PCR products are identified by melting curve analysis performed in a real-time thermocycler in the presence of the fluorescent dye SYBR Green I. In contrast to multiplex real-time PCR assays, the method does not use doubly labeled hybridization probes. PCR with product differentiation by melting curve analysis offers a cost-effective means of qualitative analysis for the presence of F. culmorum and F. graminearum in plant material. This method is particularly suitable for epidemiological studies involving a large number of samples.
Zhang, Jiang; Liu, Qi; Chen, Huafu; Yuan, Zhen; Huang, Jin; Deng, Lihua; Lu, Fengmei; Zhang, Junpeng; Wang, Yuqing; Wang, Mingwen; Chen, Liangyin
2015-01-01
Clustering analysis methods have been widely applied to identifying the functional brain networks of multitask paradigms. However, previously used clustering techniques are computationally expensive and thus impractical for clinical applications. In this study a novel method, called SOM-SAPC, which combines self-organizing mapping (SOM) and supervised affinity propagation clustering (SAPC), is proposed and implemented to identify the motor execution (ME) and motor imagery (MI) networks. In SOM-SAPC, SOM is first performed to process the fMRI data, and SAPC is then utilized to cluster the patterns of functional networks. As a result, SOM-SAPC significantly reduces the computational cost of brain network analysis. Simulation and clinical tests involving ME and MI were conducted with SOM-SAPC, and the results indicated that functional brain networks were clearly identified with different response patterns at reduced computational cost. In particular, three activation clusters were clearly revealed, comprising parts of the visual, ME, and MI functional networks. These findings validate SOM-SAPC as an effective and robust method for analyzing fMRI data from multitask paradigms.
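A rough two-stage sketch of the compress-then-cluster idea follows, with k-means standing in for the SOM stage and scikit-learn's AffinityPropagation for the supervised SAPC stage; the data are random placeholders, so this only illustrates the pipeline shape, not the paper's algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans, AffinityPropagation

rng = np.random.default_rng(0)
voxels = rng.normal(size=(5000, 120))    # 5000 voxel time courses, 120 volumes

# Stage 1: compress the voxels onto a small set of prototypes
# (k-means stands in here for the SOM stage of the paper).
prototypes = KMeans(n_clusters=64, n_init=5, random_state=0).fit(voxels).cluster_centers_

# Stage 2: cluster the prototypes; affinity propagation chooses
# the number of clusters itself, as in the SAPC stage.
labels = AffinityPropagation(random_state=0).fit_predict(prototypes)
print(len(set(labels)), "candidate functional networks")
```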
Fourier Descriptor Analysis and Unification of Voice Range Profile Contours: Method and Applications
ERIC Educational Resources Information Center
Pabon, Peter; Ternstrom, Sten; Lamarche, Anick
2011-01-01
Purpose: To describe a method for unified description, statistical modeling, and comparison of voice range profile (VRP) contours, even from diverse sources. Method: A morphologic modeling technique, which is based on Fourier descriptors (FDs), is applied to the VRP contour. The technique, which essentially involves resampling of the curve of the…
Feng, Jingwen; Lin, Jie; Zhang, Pengquan; Yang, Songnan; Sa, Yu; Feng, Yuanming
2017-08-29
High-content screening is commonly used in studies of the DNA damage response. The double-strand break (DSB) is one of the most harmful types of DNA damage lesions. The conventional method used to quantify DSBs is γH2AX foci counting, which requires manual adjustment and preset parameters and is usually regarded as imprecise, time-consuming, poorly reproducible, and inaccurate. Therefore, a robust automatic alternative method is highly desired. In this manuscript, we present a new method for quantifying DSBs which involves automatic image cropping, automatic foci-segmentation and fluorescent intensity measurement. Furthermore, an additional function was added for standardizing the measurement of DSB response inhibition based on co-localization analysis. We tested the method with a well-known inhibitor of DSB response. The new method requires only one preset parameter, which effectively minimizes operator-dependent variations. Compared with conventional methods, the new method detected a higher percentage difference of foci formation between different cells, which can improve measurement accuracy. The effects of the inhibitor on DSB response were successfully quantified with the new method (p = 0.000). The advantages of this method in terms of reliability, automation and simplicity show its potential in quantitative fluorescence imaging studies and high-content screening for compounds and factors involved in DSB response.
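A minimal sketch of the foci-segmentation and intensity-measurement step, assuming scikit-image and a single Otsu threshold; the function name, the toy image, and the use of min_area as the single preset parameter are illustrative, not the authors' actual implementation.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def count_foci(image, min_area=4):
    """Segment bright foci and return (count, summed focus intensity);
    min_area plays the role of the single preset parameter."""
    mask = image > threshold_otsu(image)               # data-driven threshold
    regions = [r for r in regionprops(label(mask), intensity_image=image)
               if r.area >= min_area]                  # drop single-pixel noise
    total = sum(r.intensity_image[r.image].sum() for r in regions)
    return len(regions), total

# Toy nucleus: dim background with three bright foci.
img = np.random.default_rng(0).normal(10.0, 1.0, (64, 64))
for y, x in [(10, 12), (30, 40), (50, 20)]:
    img[y:y+3, x:x+3] += 40.0
print(count_foci(img))   # -> (3, ...)
```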
Gauvin, Francois-Pierre; Abelson, Julia; Giacomini, Mita; Eyles, John; Lavis, John N
2010-05-01
There have been calls in recent years for greater public involvement in health technology assessment (HTA). Yet the concept of public involvement is poorly articulated and little attention has been paid to the context of HTA agencies. This article investigates how public involvement is conceptualized in the HTA agency environment. Using qualitative concept analysis methods, we reviewed the HTA literature and the websites of HTA agencies and conducted semi-structured interviews with informants in Canada, Denmark, and the United Kingdom. Our analysis reveals that HTA agencies' role as bridges or boundary organizations situated at the frontier of research and policymaking causes the agencies to struggle with the idea of public involvement. The HTA community is concerned with conceptualizing public involvement in such a way as to meet scientific and methodological standards without neglecting its responsibilities to healthcare policymakers. We offer a conceptual tool for analyzing the nature of public involvement across agencies, characterizing different domains, levels of involvement, and types of publics. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
Risk-benefit analysis and public policy: a bibliography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, E.M.; Van Horn, A.J.
1976-11-01
Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these has been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.
Family Early Literacy Practices Questionnaire: A Validation Study for a Spanish-Speaking Population
ERIC Educational Resources Information Center
Lewis, Kandia
2012-01-01
The purpose of the current study was to evaluate the psychometric validity of a Spanish translated version of a family involvement questionnaire (the FELP) using a mixed-methods design. Thus, statistical analyses (i.e., factor analysis, reliability analysis, and item analysis) and qualitative analyses (i.e., focus group data) were assessed.…
The purpose of this SOP is to describe the methods used for detection and quantification by gas chromatography/mass spectrometry (GC/MS) of pesticides in a variety of matrices, including air, house dust, soil, and handwipes. This analysis involves automated gas GC/MS analysis us...
Combining Heterogeneous Correlation Matrices: Simulation Analysis of Fixed-Effects Methods
ERIC Educational Resources Information Center
Hafdahl, Adam R.
2008-01-01
Monte Carlo studies of several fixed-effects methods for combining and comparing correlation matrices have shown that two refinements improve estimation and inference substantially. With rare exception, however, these simulations have involved homogeneous data analyzed using conditional meta-analytic procedures. The present study builds on…
Mixed Methods Research Designs in Counseling Psychology
ERIC Educational Resources Information Center
Hanson, William E.; Creswell, John W.; Clark, Vicki L. Plano; Petska, Kelly S.; Creswell, David J.
2005-01-01
With the increased popularity of qualitative research, researchers in counseling psychology are expanding their methodologies to include mixed methods designs. These designs involve the collection, analysis, and integration of quantitative and qualitative data in a single or multiphase study. This article presents an overview of mixed methods…
Measurement of Air Pollutants in the Troposphere
ERIC Educational Resources Information Center
Clemitshaw, Kevin C.
2011-01-01
This article describes the principles, applications and performances of methods to measure gas-phase air pollutants that either utilise passive or active sampling with subsequent laboratory analysis or involve automated "in situ" sampling and analysis. It focuses on air pollutants that have adverse impacts on human health (nitrogen…
Mean Comparison: Manifest Variable versus Latent Variable
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Bentler, Peter M.
2006-01-01
An extension of multiple correspondence analysis is proposed that takes into account cluster-level heterogeneity in respondents' preferences/choices. The method involves combining multiple correspondence analysis and k-means in a unified framework. The former is used for uncovering a low-dimensional space of multivariate categorical variables…
Patient Involvement in Safe Delivery: A Qualitative Study.
Olfati, Forozun; Asefzadeh, Saeid; Changizi, Nasrin; Keramat, Afsaneh; Yunesian, Masud
2015-09-28
Patient involvement in safe delivery planning is considered important yet not widely practiced. The present study aimed at identifying the factors that affect patient involvement in safe delivery, as recommended by parturient women. This study was part of a qualitative research project conducted by the content analysis method with purposive sampling in 2013. The data were collected through 63 semi-structured interviews in 4 hospitals and analyzed using thematic content analysis. The participants in this research were women before discharge and after delivery. Findings were analyzed using Colaizzi's method. Four categories of factors that could affect patient involvement in safe delivery emerged from our analysis: patient-related (true and false beliefs, literacy, privacy, respect for the patient); illness-related (pain, type of delivery, patient safety incidents); health care professional- and task-related (behavior, monitoring and training); and health care setting-related (financial aspects, facilities). More research is needed to explore the factors affecting the participation of mothers. It is therefore recommended to: 1) take notice of the education of mothers, their husbands, midwives, and specialists; 2) provide pregnant women with insurance coverage from the outset of pregnancy, especially during the prenatal period; 3) form a labor pain committee consisting of midwives, obstetricians, and anesthesiologists in order to identify the preferred painless labor methods based on the existing facilities and conditions; 4) carry out research on observing patients' privacy and dignity; and 5) pay more attention to the factors affecting cesarean delivery.
Analysis of beryllium and depleted uranium: An overview of detection methods in aerosols and soils
DOE Office of Scientific and Technical Information (OSTI.GOV)
Camins, I.; Shinn, J.H.
We conducted a survey of commercially available methods for analysis of beryllium and depleted uranium in aerosols and soils to find a reliable, cost-effective, and sufficiently precise method for researchers involved in environmental testing at the Yuma Proving Ground, Yuma, Arizona. Criteria used for evaluation include cost, method of analysis, specificity, sensitivity, reproducibility, applicability, and commercial availability. We found that atomic absorption spectrometry with graphite furnace meets these criteria for testing samples for beryllium. We found that this method can also be used to test samples for depleted uranium. However, atomic absorption with graphite furnace is not as sensitive amore » measurement method for depleted uranium as it is for beryllium, so we recommend that quality control of depleted uranium analysis be maintained by testing 10 of every 1000 samples by neutron activation analysis. We also evaluated 45 companies and institutions that provide analyses of beryllium and depleted uranium. 5 refs., 1 tab.« less
TUBEs-Mass Spectrometry for Identification and Analysis of the Ubiquitin-Proteome.
Azkargorta, Mikel; Escobes, Iraide; Elortza, Felix; Matthiesen, Rune; Rodríguez, Manuel S
2016-01-01
Mass spectrometry (MS) has become the method of choice for the large-scale analysis of protein ubiquitylation. A number of methods have been proposed for mapping ubiquitin sites, each with different pros and cons. We present here a protocol for the MS analysis of the ubiquitin-proteome captured by TUBEs and the subsequent data analysis. Using dedicated software and algorithms, specific information on the presence of ubiquitylated peptides can be obtained from the MS search results. In addition, a quantitative and functional analysis of the ubiquitylated proteins and their interacting partners helps to unravel the biological and molecular processes they are involved in.
NASA Technical Reports Server (NTRS)
Adams, Gaynor J.; Dugan, Duane W.
1952-01-01
A method of analysis based on slender-wing theory is developed to investigate the characteristics in roll of slender cruciform wings and wing-body combinations. The method makes use of the conformal mapping processes of classical hydrodynamics which transform the region outside a circle into the region outside an arbitrary arrangement of line segments intersecting at the origin. The method of analysis may be utilized to solve other slender cruciform wing-body problems involving arbitrarily assigned boundary conditions. (author)
NASA Technical Reports Server (NTRS)
Bratanow, T.; Ecer, A.
1973-01-01
A general computational method for analyzing unsteady flow around pitching and plunging airfoils was developed. The finite element method was applied in developing an efficient numerical procedure for the solution of equations describing the flow around airfoils. The numerical results were employed in conjunction with computer graphics techniques to produce visualization of the flow. The investigation involved mathematical model studies of flow in two phases: (1) analysis of a potential flow formulation and (2) analysis of an incompressible, unsteady, viscous flow from Navier-Stokes equations.
Estimating Transmissivity from the Water Level Fluctuations of a Sinusoidally Forced Well
Mehnert, E.; Valocchi, A.J.; Heidari, M.; Kapoor, S.G.; Kumar, P.
1999-01-01
The water levels in wells are known to fluctuate in response to earth tides and changes in atmospheric pressure. These water level fluctuations can be analyzed to estimate transmissivity (T). A new method to estimate transmissivity, which assumes that the atmospheric pressure varies in a sinusoidal fashion, is presented. Data analysis for this simplified method involves using a set of type curves and estimating the ratio of the amplitudes of the well response over the atmospheric pressure. Type curves for this new method were generated based on a model for ground water flow between the well and aquifer developed by Cooper et al. (1965). Data analysis with this method confirmed these published results: (1) the amplitude ratio is a function of transmissivity, the well radius, and the frequency of the sinusoidal oscillation; and (2) the amplitude ratio is a weak function of storativity. Compared to other methods, the developed method involves simpler, more intuitive data analysis and allows shorter data sets to be analyzed. The effect of noise on estimating the amplitude ratio was evaluated and found to be more significant at lower T. For aquifers with low T, noise was shown to mask the water level fluctuations induced by atmospheric pressure changes. In addition, reducing the length of the data series did not affect the estimate of T, but the variance of the estimate was higher for the shorter series of noisy data.
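The amplitude-ratio estimation can be sketched as reading the FFT magnitude of both series at the known forcing frequency; the function names and toy diurnal signal below are illustrative, and the resulting ratio would then be matched against the paper's type curves to read off T.

```python
import numpy as np

def amplitude_ratio(well_level, baro_pressure, dt, forcing_period):
    """Amplitude of the well response relative to the barometric forcing,
    read from the FFT bin nearest the known forcing frequency."""
    n = len(well_level)
    freqs = np.fft.rfftfreq(n, d=dt)
    k = np.argmin(np.abs(freqs - 1.0 / forcing_period))
    w = np.abs(np.fft.rfft(well_level - np.mean(well_level))[k])
    b = np.abs(np.fft.rfft(baro_pressure - np.mean(baro_pressure))[k])
    return w / b

# Toy check: a well that responds with 40% of the barometric amplitude.
t = np.arange(0, 10 * 86400, 3600.0)                  # 10 days, hourly samples
baro = 50.0 * np.sin(2 * np.pi * t / 86400)           # diurnal pressure wave
well = 20.0 * np.sin(2 * np.pi * t / 86400 + 0.3)     # damped, phase-shifted
print(amplitude_ratio(well, baro, 3600.0, 86400.0))   # ~0.4
```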
USDA-ARS?s Scientific Manuscript database
A high-throughput qualitative screening and identification method for 9 aminoglycosides of regulatory interest has been developed, validated, and implemented for bovine kidney, liver, and muscle tissues. The method involves extraction at previously validated conditions, cleanup using disposable pip...
Scientist Honored by DOE for Outstanding Research Accomplishments,
Dr. Balcomb's involvement in passive design tools gave the solar energy design community a direct, verifiable method of design for passive solar and mixed systems. The American Society of Heating, Refrigerating and Air-Conditioning Engineers' design manual, Passive Solar Heating Analysis, is an outgrowth of this method.
The Tracer Method of Curriculum Analysis in Cancer Education
ERIC Educational Resources Information Center
Mahan, J. Maurice; And Others
1976-01-01
To assist faculty involved in cancer education in various courses in the curriculum, rather than instituting a new course in oncology, a method was developed for identifying and assessing cancer-related content (a clinical clerk attended lectures, interviewed instructors, reviewed syllabi, etc.) and a comprehensive description was produced and…
An Analysis of the Algebraic Method for Balancing Chemical Reactions.
ERIC Educational Resources Information Center
Olson, John A.
1997-01-01
Analyzes the algebraic method for balancing chemical reactions. Introduces a third general condition that involves a balance between the total amount of oxidation and reduction. Requires the specification of oxidation states for all elements throughout the reaction. Describes the general conditions, the mathematical treatment, redox reactions, and…
Analysis of Five Instructional Methods for Teaching Sketchpad to Junior High Students
ERIC Educational Resources Information Center
Wright, Geoffrey; Shumway, Steve; Terry, Ronald; Bartholomew, Scott
2012-01-01
This manuscript addresses a problem teachers of computer software applications face today: What is an effective method for teaching new computer software? Technology and engineering teachers, specifically those with communications and other related courses that involve computer software applications, face this problem when teaching computer…
Jantzi, Sarah C; Almirall, José R
2014-01-01
Elemental analysis of soil is a useful application of both laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) and laser-induced breakdown spectroscopy (LIBS) in geological, agricultural, environmental, archeological, planetary, and forensic sciences. In forensic science, the question to be answered is often whether soil specimens found on objects (e.g., shoes, tires, or tools) originated from the crime scene or other location of interest. Elemental analysis of the soil from the object and the locations of interest results in a characteristic elemental profile of each specimen, consisting of the amount of each element present. Because multiple elements are measured, multivariate statistics can be used to compare the elemental profiles in order to determine whether the specimen from the object is similar to one of the locations of interest. Previous work involved milling and pressing 0.5 g of soil into pellets before analysis using LA-ICP-MS and LIBS. However, forensic examiners prefer techniques that require smaller samples, are less time consuming, and are less destructive, allowing for future analysis by other techniques. An alternative sample introduction method was developed to meet these needs while still providing quantitative results suitable for multivariate comparisons. The tape-mounting method involved deposition of a thin layer of soil onto double-sided adhesive tape. A comparison of tape-mounting and pellet method performance is reported for both LA-ICP-MS and LIBS. Calibration standards and reference materials, prepared using the tape method, were analyzed by LA-ICP-MS and LIBS. As with the pellet method, linear calibration curves were achieved with the tape method, as well as good precision and low bias. Soil specimens from Miami-Dade County were prepared by both the pellet and tape methods and analyzed by LA-ICP-MS and LIBS. Principal components analysis and linear discriminant analysis were applied to the multivariate data. Results from both the tape method and the pellet method were nearly identical, with clear groupings and correct classification rates of >94%.
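The multivariate comparison step (PCA followed by LDA with cross-validated classification rates) can be sketched with scikit-learn; the synthetic three-site data below stand in for the Miami-Dade elemental profiles.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Synthetic elemental profiles: 20 specimens per site, 10 elements each.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=m, size=(20, 10)) for m in (0.0, 1.5, 3.0)])
y = np.repeat(["site_A", "site_B", "site_C"], 20)

clf = make_pipeline(StandardScaler(), PCA(n_components=5),
                    LinearDiscriminantAnalysis())
print(cross_val_score(clf, X, y, cv=5).mean())  # correct-classification rate
```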
An Interpretative Phenomenological Analysis of Sense-Making by Department of Defense Employees
ERIC Educational Resources Information Center
Harrison, John L., Sr.
2011-01-01
The purpose of this qualitative, phenomenological study was to explore the perceptions and lived experiences of Department of Defense (DOD) civilian employees to identify how their personal sense-making affects their coaching of adult students. The author used an interpretative phenomenological analysis (IPA) method involving personal interviews…
75 FR 53968 - Reverb Communications, Inc.; Analysis of Proposed Consent Order To Aid Public Comment
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-02
... final the agreement's proposed order. This matter involves the public relations, marketing, and sales... Consent Order To Aid Public Comment AGENCY: Federal Trade Commission. ACTION: Proposed Consent Agreement... or deceptive acts or practices or unfair methods of competition. The attached Analysis to Aid Public...
Variation and Commonality in Phenomenographic Research Methods
ERIC Educational Resources Information Center
Akerlind, Gerlese S.
2012-01-01
This paper focuses on the data analysis stage of phenomenographic research, elucidating what is involved in terms of both commonality and variation in accepted practice. The analysis stage of phenomenographic research is often not well understood. This paper helps to clarify the process, initially by collecting together in one location the more…
Lift-Shape Construction, An EFL Project Report.
ERIC Educational Resources Information Center
Evans, Ben H.
Research development of a construction system is detailed in terms of--(1) design and analysis, (2) construction methods, (3) testing, (4) cost analysis, and (5) architectural potentials. The system described permits construction of unusual shapes without the use of conventional concrete formwork. The concrete involves development of a structural…
A Guide to Job Analysis for the Preparation of Job Training Programmes.
ERIC Educational Resources Information Center
Ceramics, Glass, and Mineral Products Industry Training Board, Harrow (England).
The paper deals with job analysis for the preparation of job training programs. The analytical approach involves five steps: enlisting support, examining the job, describing the job, analyzing training requirements, and planning the programs. Appendixes include methods of producing training schemes--the simple job breakdown, straightforward…
Applied Behavior Analysis Is a Science And, Therefore, Progressive
ERIC Educational Resources Information Center
Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane
2016-01-01
Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…
Robustness of Type I Error and Power in Set Correlation Analysis of Contingency Tables.
ERIC Educational Resources Information Center
Cohen, Jacob; Nee, John C. M.
1990-01-01
The analysis of contingency tables via set correlation allows the assessment of subhypotheses involving contrast functions of the categories of the nominal scales. The robustness of such methods with regard to Type I error and statistical power was studied via a Monte Carlo experiment. (TJH)
Durability reliability analysis for corroding concrete structures under uncertainty
NASA Astrophysics Data System (ADS)
Zhang, Hao
2018-02-01
This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
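A toy contrast of the two formulations, assuming a simple R - S limit state and made-up distribution parameters (all names and numbers are illustrative, not the paper's corrosion model):

```python
import numpy as np

rng = np.random.default_rng(0)

def failure_prob(mean_resistance, n=100_000):
    """Toy limit state g = R - S; failure when g < 0."""
    R = rng.normal(mean_resistance, 0.15, n)   # resistance, aleatory scatter
    S = rng.normal(1.0, 0.2, n)                # chloride-driven load effect
    return np.mean(R - S < 0.0)

# Purely probabilistic treatment: the epistemic mean resistance is
# itself modelled as a random variable and averaged over.
pf_point = np.mean([failure_prob(m) for m in rng.normal(1.5, 0.1, 50)])

# Imprecise-probability treatment: the mean is only known to lie in an
# interval, so the analysis reports bounds rather than a point value.
pf_lower, pf_upper = failure_prob(1.7), failure_prob(1.3)
print(pf_point, (pf_lower, pf_upper))
```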
Detecting disease-predisposing variants: the haplotype method.
Valdes, A M; Thomson, G
1997-01-01
For many HLA-associated diseases, multiple alleles-- and, in some cases, multiple loci--have been suggested as the causative agents. The haplotype method for identifying disease-predisposing amino acids in a genetic region is a stratification analysis. We show that, for each haplotype combination containing all the amino acid sites involved in the disease process, the relative frequencies of amino acid variants at sites not involved in disease but in linkage disequilibrium with the disease-predisposing sites are expected to be the same in patients and controls. The haplotype method is robust to mode of inheritance and penetrance of the disease and can be used to determine unequivocally whether all amino acid sites involved in the disease have not been identified. Using a resampling technique, we developed a statistical test that takes account of the nonindependence of the sites sampled. Further, when multiple sites in the genetic region are involved in disease, the test statistic gives a closer fit to the null expectation when some--compared with none--of the true predisposing factors are included in the haplotype analysis. Although the haplotype method cannot distinguish between very highly correlated sites in one population, ethnic comparisons may help identify the true predisposing factors. The haplotype method was applied to insulin-dependent diabetes mellitus (IDDM) HLA class II DQA1-DQB1 data from Caucasian, African, and Japanese populations. Our results indicate that the combination DQA1#52 (Arg predisposing) DQB1#57 (Asp protective), which has been proposed as an important IDDM agent, does not include all the predisposing elements. With rheumatoid arthritis HLA class II DRB1 data, the results were consistent with the shared-epitope hypothesis. PMID:9042931
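A generic permutation test in the spirit of the resampling technique described (the actual statistic and stratification in the paper differ; the allele codes and counts here are made up):

```python
import numpy as np

def haplotype_test(patients, controls, n_resamples=5000, seed=0):
    """Permutation test of whether allele frequencies at a 'neutral' site
    differ between patients and controls; under the haplotype method's
    null hypothesis (all predisposing sites already conditioned on),
    they should not."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([patients, controls])
    alleles = np.unique(pooled)

    def freq_gap(a, b):
        # Sum of squared allele-frequency differences between groups.
        fa = np.array([(a == al).mean() for al in alleles])
        fb = np.array([(b == al).mean() for al in alleles])
        return np.sum((fa - fb) ** 2)

    observed = freq_gap(patients, controls)
    hits = 0
    for _ in range(n_resamples):
        rng.shuffle(pooled)
        if freq_gap(pooled[:len(patients)], pooled[len(patients):]) >= observed:
            hits += 1
    return hits / n_resamples            # permutation p-value

patients = np.array(list("AAABABBA"))
controls = np.array(list("ABABBABB"))
print(haplotype_test(patients, controls))
```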
ERIC Educational Resources Information Center
Kilcommons, Aoiffe M.; Withers, Paul; Moreno-Lopez, Agueda
2012-01-01
Background: Involving ID service users in risk decision making necessitates consideration of an individual's ability to assess the implications and associated risks and thus make an informed choice. This calls for research on service users' awareness and understanding of risk management (RM). Method: Thirteen people in a residential ID service who…
Velayutham, Selva Ganapathy; Chandra, Sadanandavalli Retnaswami; Bharath, Srikala; Shankar, Ravi Girikamatha
2017-01-01
Introduction: Alzheimer's disease (AD) and frontotemporal dementia (FTD) are common neurodegenerative dementias with a wide prevalence. Falls are a common cause of morbidity in these patients. Identifying subclinical involvement of gait and balance parameters might serve as a tool in the differential analysis of these conditions and also help in planning strategies to prevent falls. Patients and Methods: Eight age- and gender-matched patients in each group were compared with normal controls. Standardized methods of gait and balance assessment were applied to all participants. Results: Results revealed subclinical involvement of gait and balance in all groups, especially during divided attention, with the parameters significantly more affected in patients. Patients with AD and FTD showed abnormalities of the overall ambulation index; balance was more affected in AD patients, while FTD patients showed step-cycle and stride-length abnormalities. Discussion: There is balance and gait involvement in normal ageing as well as in patients with AD and FTD. The pattern of involvement in AD correlates with WHERE pathway involvement, and in FTD with frontal subcortical circuit involvement. Conclusion: Identification of the differential patterns of involvement in the subclinical stage might help to differentiate normal ageing and the different types of cortical dementias. This could serve as an additional biomarker and also assist in initiating appropriate training methods to prevent future falls. PMID:28515555
ERIC Educational Resources Information Center
Anderson, Richard C.; Freebody, Peter
The "yes/no" method of vocabulary assessment requires students to indicate words they know from among a list of words and nonwords. Preliminary evidence gained from a study involving fifth grade students indicates that the method is superior in many ways to the multiple choice method of assessment. Analysis of "false alarms," cases in which…
Neutron activation analysis of certified samples by the absolute method
NASA Astrophysics Data System (ADS)
Kadem, F.; Belouadah, N.; Idiri, Z.
2015-07-01
The nuclear reaction analysis technique is mainly based on the relative method or on the use of activation cross sections. In order to validate nuclear data for cross sections evaluated from systematic studies, we used the neutron activation analysis (NAA) technique to determine the concentrations of the various constituents of certified samples of animal blood, milk, and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental equation of activation connects several physical parameters, including the cross section, which is essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample. Called the absolute method, it allows measurements as accurate as the relative method. The results obtained by the absolute method showed values as precise as those of the relative method, which requires a standard sample for each element to be quantified.
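For reference, the fundamental activation equation mentioned above takes, in its simplest single-irradiation form (neglecting detector efficiency and gamma-emission probability, which enter the full expression as multiplicative factors), the standard form

$$ A = \frac{m\,N_A\,\theta}{M}\,\sigma\,\Phi\,\bigl(1 - e^{-\lambda t_i}\bigr)\,e^{-\lambda t_d}, $$

where $A$ is the measured activity, $m$ the mass of the element sought, $N_A$ Avogadro's number, $\theta$ the isotopic abundance of the target nuclide, $M$ its atomic mass, $\sigma$ the activation cross section, $\Phi$ the neutron flux, $\lambda$ the decay constant of the activation product, and $t_i$, $t_d$ the irradiation and decay times. Solving for $m$ yields the concentration directly from the measured activity without a standard sample, which is why the accuracy of the absolute method hinges on the evaluated cross section.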
Shah, Kumar A; Peoples, Michael C; Halquist, Matthew S; Rutan, Sarah C; Karnes, H Thomas
2011-01-25
The work described in this paper involves development of a high-throughput on-line microfluidic sample extraction method using capillary micro-columns packed with MIP beads coupled with tandem mass spectrometry for the analysis of urinary NNAL. The method was optimized and matrix effects were evaluated and resolved. The method enabled low sample volume (200 μL) and rapid analysis of urinary NNAL by direct injection onto the microfluidic column packed with molecularly imprinted beads engineered to NNAL. The method was validated according to the FDA bioanalytical method validation guidance. The dynamic range extended from 20.0 to 2500.0 pg/mL with a percent relative error of ±5.9% and a run time of 7.00 min. The lower limit of quantitation was 20.0 pg/mL. The method was used for the analysis of NNAL and NNAL-Gluc concentrations in smokers' urine. Copyright © 2010 Elsevier B.V. All rights reserved.
Karim, Syed Mustafa; Zekri, Jamal; Abdelghany, Ehab; Dada, Reyad; Munsoor, Husna; Ahmad, Imran
2015-01-01
Background: A substantial number of cancer patients receive chemotherapy until the end of life (EoL). Various factors have been shown to be associated with receipt of chemotherapy until near death. In this study, we determine our average time from last chemotherapy to death (TLCD) and explore different factors that may be associated with decreased TLCD. Materials and Methods: A retrospective review of medical records of adult cancer patients who received chemotherapy during their illness and died in our hospital between January 2010 and January 2012 was conducted. The chi-square test and t-test were used to examine the correlation between selected factors and use of chemotherapy within 60 days of death. Multivariate analysis was used to test the independent significance of factors testing positive in univariate analysis. The Kaplan-Meier method was used to perform survival analysis. Results: Of the 115 cancer patients who died in the hospital, 41 (35.6%) had TLCD of 60 days or less. Patients with better performance status and those dying under the medical oncology service were more likely to be in this group of patients. Univariate analysis showed that these patients were less likely to have palliative care involvement, more likely to die of treatment-related causes, and more likely to have died in the Intensive Care Unit. Multivariate analysis confirmed lack of palliative care involvement and better performance status as independent factors for TLCD less than 60 days. Survival analyses showed that patients with palliative care involvement and those dying under the palliative care service were likely to have significantly longer TLCD. Conclusions: Cancer patients who have no involvement of a palliative care team in their management tend to receive chemotherapy near the EoL, have more aggressive EoL care, and have a higher risk of dying from treatment-related complications. Palliative care should be involved early in the care of cancer patients. PMID:25810576
NASA Astrophysics Data System (ADS)
Bialas, A.
2004-02-01
It is shown that the method of eliminating the statistical fluctuations from event-by-event analysis proposed recently by Fu and Liu can be rewritten in a compact form involving the generalized factorial moments.
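For context, the scaled factorial moments involved are defined in the standard way (the compact Fu-Liu rewriting itself is in the paper):

$$ F_q = \frac{\langle n(n-1)\cdots(n-q+1)\rangle}{\langle n\rangle^{q}} . $$

Their value for event-by-event analysis stems from the fact that if the multiplicity $n$ fluctuates around a fixed mean according to a Poisson distribution, then $\langle n(n-1)\cdots(n-q+1)\rangle = \langle n\rangle^{q}$, so purely statistical fluctuations give $F_q = 1$ and any deviation from unity isolates dynamical fluctuations.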
The number of studies involving the analysis of perfluorooctanoic acid (PFOA) has increased recently because PFOA is routinely detected in human blood samples from around the world. Recent studies with mice have shown that dosing pregnant dams with PFOA during gestation gives ...
Metagram Software - A New Perspective on the Art of Computation.
1981-10-01
Keywords: computer programming, information and analysis, metagramming, philosophy, intelligence, information systems, abstraction and metasystems. Metagramming...control would also serve well in the analysis of military and political intelligence, and in other areas where highly abstract methods of thought serve...needed in intelligence because several levels of abstraction are involved in a political or military system, because analysis entails a complex interplay
Strategic Analysis and Plan for Implementing Telemedicine at Fort Greely
2003-03-01
Analysis: The Situational Analysis tool assessed the environmental, market, and organizational factors involved in a Fort Greely telemedicine... Factors): Medicaid reimbursement is now approved for Alaska regardless of the method of healthcare delivery. Market Factors (Customers): The influx of...arrive are Active National Guardsmen and their families. Market Factors (Services): Fairbanks Memorial Hospital (FMH) can
Efficient genotype compression and analysis of large genetic variation datasets
Layer, Ryan M.; Kindlon, Neil; Karczewski, Konrad J.; Quinlan, Aaron R.
2015-01-01
Genotype Query Tools (GQT) is a new indexing strategy that expedites analyses of genome variation datasets in VCF format based on sample genotypes, phenotypes and relationships. GQT’s compressed genotype index minimizes decompression for analysis, and performance relative to existing methods improves with cohort size. We show substantial (up to 443 fold) performance gains over existing methods and demonstrate GQT’s utility for exploring massive datasets involving thousands to millions of genomes. PMID:26550772
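A toy illustration of why a genotype-class bit index speeds such queries (GQT's real index adds word-aligned compression and VCF handling on top; the data here are made up):

```python
import numpy as np

# Toy genotype matrix: rows are variants, columns are samples,
# entries are genotype classes (0 = hom-ref, 1 = het, 2 = hom-alt).
genotypes = np.array([[0, 1, 2, 1],
                      [1, 1, 0, 2],
                      [2, 0, 1, 1]])

# One bit-row per variant per genotype class: a query such as
# "samples heterozygous at variants 0 AND 2" becomes a bitwise AND.
index = {g: (genotypes == g).astype(np.uint8) for g in (0, 1, 2)}
hets = index[1][0] & index[1][2]
print(np.flatnonzero(hets))   # -> [3]
```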
Pimentel, Lígia; Fontes, Ana Luiza; Salsinha, Sofia; Machado, Manuela; Correia, Inês; Gomes, Ana Maria; Pintado, Manuela; Rodríguez-Alcalá, Luís Miguel
2018-03-08
Lipids are gaining relevance over the last 20 years, as our knowledge about their role has changed from merely energy/structural molecules to compounds also involved in several biological processes. This led to the creation in 2003 of a new emerging research field: lipidomics. In particular the phospholipids have pharmacological/food applications, participate in cell signalling/homeostatic pathways while their analysis faces some challenges. Their fractionation/purification is, in fact, especially difficult, as they are amphiphilic compounds. Moreover, it usually involves SPE or TLC procedures requiring specific materials hampering their suitableness for routine analysis. Finally, they can interfere with the ionization of other molecules during mass spectrometry analysis. Thus, simple high-throughput reliable methods to selectively isolate these compounds based on the difference between chemical characteristics of lipids would represent valuable tools for their study besides that of other compounds. The current review work aims to describe the state-of-the-art related to the extraction of phospholipids using liquid-liquid methods for their targeted isolation. The technological and biological importance of these compounds and ion suppression phenomena are also reviewed. Methods by precipitation with acetone or isolation using methanol seem to be suitable for selective isolation of phospholipids in both biological and food samples. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Sampling and analysis of hexavalent chromium during exposure to chromic acid mist and welding fumes.
Blomquist, G; Nilsson, C A; Nygren, O
1983-12-01
In view of the serious health effects of hexavalent chromium, the problems involved in its sampling and analysis in workroom air have been the subject of much concern. In this paper, the stability problems arising from the reduction of hexavalent to trivalent chromium during sampling, sample storage, and analysis are discussed. Replacement of sulfuric acid by a sodium acetate buffer (pH 4) as a leaching solution prior to analysis with the diphenylcarbazide (DPC) method is suggested and is demonstrated to be necessary in order to avoid reduction. Field samples were taken from two different industrial processes: manual metal arc welding on stainless steel without shield gas, and chromium plating. A comparison was made of the DPC method, acidic dissolution with atomic absorption spectrophotometric (AAS) analysis, and the carbonate method. For chromic acid mist, the DPC method and AAS analysis were shown to give the same results. In the analysis of welding fumes, the modified DPC method gave the same results as the laborious and less sensitive carbonate method.
Ding, Dewu; Sun, Xiao
2018-01-16
Shewanella oneidensis MR-1 can transfer electrons from the intracellular environment to the extracellular space of the cell to reduce extracellular insoluble electron acceptors (Extracellular Electron Transfer, EET). Benefiting from this EET capability, Shewanella has been widely used in different areas, such as energy production, wastewater treatment, and bioremediation. Genome-wide proteomics data were used to determine the active proteins involved in activating the EET process. We identified 1012 proteins with decreased expression and 811 proteins with increased expression when the EET process changed from inactivation to activation. We then networked these proteins to construct the active protein networks, and identified the top 20 key active proteins by network centralization analysis, including metabolism- and energy-related proteins, signal and transcriptional regulatory proteins, translation-related proteins, and the EET-related proteins. We also constructed the integrated protein interaction and transcriptional regulatory networks for the active proteins, and found three exclusive active network motifs involved in activating the EET process: Bi-feedforward Loop, Regulatory Cascade with a Feedback, and Feedback with a Protein-Protein Interaction (PPI); we also identified the active proteins involved in these motifs. Both enrichment analysis and comparative analysis with the whole-genome data implicated the multiheme c-type cytochromes and multiple signal processing proteins in the process. Furthermore, the interactions of these motif-guided active proteins and the involved functional modules are discussed. Collectively, by using network-based methods, this work reports a proteome-wide search for the key active proteins that potentially activate the EET process.
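The centralization step can be sketched with networkx; the edges below are a made-up fragment loosely echoing known Mtr-pathway cytochromes, not the paper's reconstructed network, and degree centrality stands in for whichever centrality measure the authors used.

```python
import networkx as nx

# Made-up PPI fragment; real analysis would use the reconstructed
# genome-scale interaction and regulatory networks.
G = nx.Graph([("CymA", "MtrA"), ("MtrA", "MtrB"), ("MtrB", "MtrC"),
              ("MtrC", "OmcA"), ("CymA", "CctA")])
ranking = sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1])
print(ranking[:3])   # most central candidate active proteins
```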
Dual ant colony operational modal analysis parameter estimation method
NASA Astrophysics Data System (ADS)
Sitarz, Piotr; Powałka, Bartosz
2018-01-01
Operational Modal Analysis (OMA) is a common technique used to examine the dynamic properties of a system. Contrary to experimental modal analysis, the input signal is generated in the object's ambient environment. Operational modal analysis mainly aims at determining the number of pole pairs and at estimating modal parameters. Many methods are used for parameter identification. Some methods operate in the time domain, others in the frequency domain. The former use correlation functions, the latter spectral density functions. However, while some methods require the user to select poles from a stabilisation diagram, others try to automate the selection process. The dual ant colony operational modal analysis parameter estimation method (DAC-OMA) presents a new approach to the problem, avoiding issues involved in the stabilisation diagram. The presented algorithm is fully automated. It uses deterministic methods to define the intervals of the estimated parameters, thus reducing the problem to an optimisation task which is conducted with dedicated software based on an ant colony optimisation algorithm. The combination of deterministic methods restricting parameter intervals and artificial intelligence yields very good results, also for closely spaced modes and significantly varied mode shapes within one measurement point.
Berget, Ellen; Helgeland, Lars; Liseth, Knut; Løkeland, Turid; Molven, Anders; Vintermyr, Olav Karsten
2014-01-01
Aims: We aimed to evaluate the prognostic value of routine use of PCR amplification of immunoglobulin gene rearrangements in bone marrow (BM) staging in patients with follicular lymphoma (FL). Methods: Clonal rearrangements were assessed by immunoglobulin heavy- and light-chain gene rearrangement analysis in BM aspirates from 96 patients diagnosed with FL and related to morphological detection of BM involvement in biopsies. In 71 patients, results were also compared with concurrent flow cytometry analysis. Results: BM involvement was detected by PCR in 34.4% (33/96) of patients. The presence of clonal rearrangements by PCR was associated with advanced clinical stage (I–III vs IV; p<0.001), high FL International Prognostic Index (FLIPI) score (0–1, 2 vs ≥3; p=0.003), and detection of BM involvement by morphology and flow cytometry analysis (p<0.001 for both). PCR-positive patients had a significantly poorer survival than PCR-negative patients (p=0.001, log-rank test). Thirteen patients positive by PCR but without morphologically detectable BM involvement had significantly poorer survival than patients with negative morphology and a negative PCR result (p=0.002). The poor survival associated with BM involvement by PCR was independent of the FLIPI score (p=0.007, Cox regression). BM involvement by morphology or flow cytometry did not show a significant impact on survival. Conclusions: Our results showed that routine use of PCR-based clonality analysis significantly improved the prognostic impact of BM staging in patients with FL. BM involvement by PCR was also an independent adverse prognostic factor. PMID:25233852
NASA Astrophysics Data System (ADS)
Viswanathan, V. K.
1980-11-01
The optical design and analysis of the LASL carbon dioxide laser fusion systems required the use of techniques that are quite different from the methods currently used in conventional optical design problems. The necessity for this is explored, and the method that has been successfully used at Los Alamos to understand these systems is discussed with examples. This method involves characterizing the various optical components in their mounts by a Zernike polynomial set and using fast Fourier transform techniques to propagate the beam, taking into account diffraction and other nonlinear effects that occur in these types of systems. The various programs used for analysis are briefly discussed.
Statistical methods for astronomical data with upper limits. I - Univariate distributions
NASA Technical Reports Server (NTRS)
Feigelson, E. D.; Nelson, P. I.
1985-01-01
The statistical treatment of univariate censored data is discussed. A heuristic derivation of the Kaplan-Meier maximum-likelihood estimator from first principles is presented which results in an expression amenable to analytic error analysis. Methods for comparing two or more censored samples are given along with simple computational examples, stressing the fact that most astronomical problems involve upper limits while the standard mathematical methods require lower limits. The application of univariate survival analysis to six data sets in the recent astrophysical literature is described, and various aspects of the use of survival analysis in astronomy, such as the limitations of various two-sample tests and the role of parametric modelling, are discussed.
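For reference, the Kaplan-Meier estimator the paper derives can be sketched in a few lines; the flux values and limit flags below are invented, and ties between detections and limits are resolved only by sort order in this simplified version.

```python
import numpy as np

def kaplan_meier(values, censored):
    """Kaplan-Meier estimator for right-censored data: returns the
    stepwise survival curve as (value, S) pairs at each detection.
    Upper limits (left-censoring, the usual astronomical case) are
    handled, as the paper notes, by negating the values first."""
    order = np.argsort(values)
    v = np.asarray(values, float)[order]
    c = np.asarray(censored, bool)[order]
    at_risk, surv, steps = len(v), 1.0, []
    for vi, ci in zip(v, c):
        if not ci:                   # a detection: survival steps down
            surv *= 1.0 - 1.0 / at_risk
            steps.append((vi, surv))
        at_risk -= 1                 # detections and limits both leave the risk set
    return steps

fluxes = [2.0, 3.5, 4.0, 5.1, 6.3, 7.0]
is_limit = [False, True, False, False, True, False]
print(kaplan_meier(fluxes, is_limit))
```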
STATISTICAL ANALYSIS OF SNAP 10A THERMOELECTRIC CONVERTER ELEMENT PROCESS DEVELOPMENT VARIABLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitch, S.H.; Morris, J.W.
1962-12-15
Statistical analysis, primarily analysis of variance, was applied to evaluate several factors involved in the development of suitable fabrication and processing techniques for the production of lead telluride thermoelectric elements for the SNAP 10A energy conversion system. The analysis methods are described as to their application for determining the effects of various processing steps, establishing the value of individual operations, and evaluating the significance of test results. The elimination of unnecessary or detrimental processing steps was accomplished and the number of required tests was substantially reduced by application of these statistical methods to the SNAP 10A production development effort. (auth)
On Manpower Forecasting. Methods for Manpower Analysis, No.2.
ERIC Educational Resources Information Center
Morton, J.E.
Some of the problems and techniques involved in manpower forecasting are discussed. This non-technical introduction to the field aims at reducing fears of data manipulation methods and at increasing respect for conceptual, logical, and analytical issues. The major approaches to manpower forecasting are explicated and evaluated under the headings:…
An Educational Model for Disruption of Bacteria for Protein Studies.
ERIC Educational Resources Information Center
Bhaduri, Saumya; Demchick, Paul H.
1984-01-01
A simple, rapid, and safe method has been developed for disrupting bacterial cells for protein studies. The method involved stepwise treatment of cells with acetone and with sodium dodecyl sulfate solution to allow extraction of cellular proteins for analysis by polyacrylamide gel electrophoresis. Applications for instructional purposes are noted.…
1980-10-01
reported using the method of Gentzkow (1942), which involves conversion of urea to ammonia with urease and measurement of the ammonia by...Nesslerization. Methods employing urease are not well suited for automated analysis since an incubation time of about 20 minutes is required for the conversion of
The present study investigates primary and secondary sources of organic carbon for Bakersfield, CA, USA as part of the 2010 CalNex study. The method used here involves integrated sampling that is designed to allow for detailed and specific chemical analysis of particulate matter ...
Monogamy on the Street: A Mixed Methods Study of Homeless Men
ERIC Educational Resources Information Center
Brown, Ryan A.; Kennedy, David P.; Tucker, Joan S.; Golinelli, Daniela; Wenzel, Suzanne L.
2013-01-01
In this study, we used a mixed methods approach to explore the determinants of relationship patterns and risky sex among homeless men living in downtown Los Angeles. This involved analysis of qualitative interviews focused on gender ideology and sexual events ("n" = 30) as well as structured interviews ("n" = 305) focused on…
ERIC Educational Resources Information Center
Morozov, Andrew; Kilgore, Deborah; Atman, Cynthia
2007-01-01
In this study, the authors used two methods for analyzing expert data: verbal protocol analysis (VPA) and narrative analysis. VPA has been effectively used to describe the design processes employed by engineering students, expert designers, and expert-novice comparative research. VPA involves asking participants to "think aloud" while…
Deriving a Typology of Web 2.0 Learning Technologies
ERIC Educational Resources Information Center
Bower, Matt
2016-01-01
This paper presents the methods and outcomes of a typological analysis of Web 2.0 technologies. A comprehensive review incorporating over 2000 links led to identification of over 200 Web 2.0 technologies that were suitable for learning and teaching purposes. The typological analysis involved development of relevant Web 2.0 dimensions, grouping…
Computer analysis of arteriograms
NASA Technical Reports Server (NTRS)
Selzer, R. H.; Armstrong, J. H.; Beckenbach, E. B.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.; Sanmarco, M. E.
1977-01-01
A computer system has been developed to quantify the degree of atherosclerosis in the human femoral artery. The analysis involves first scanning and digitizing angiographic film, then tracking the outline of the arterial image and finally computing the relative amount of roughness or irregularity in the vessel wall. The image processing system and method are described.
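The roughness computation can be sketched as the rms deviation of the tracked edge from a smoothed version of itself; the window size and toy profiles are illustrative, not the system's actual definition of irregularity.

```python
import numpy as np

def roughness(edge_profile, window=15):
    """RMS deviation of the tracked vessel edge from its running
    average; larger values indicate a more irregular wall."""
    kernel = np.ones(window) / window
    smooth = np.convolve(edge_profile, kernel, mode="same")
    return np.sqrt(np.mean((edge_profile - smooth) ** 2))

y = np.linspace(0, 1, 500)                      # position along the vessel
healthy = 5.0 + 0.2 * np.sin(2 * np.pi * y)     # gently varying radius
diseased = healthy + np.random.default_rng(0).normal(0, 0.3, 500)
print(roughness(healthy), roughness(diseased))  # diseased scores higher
```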
A Cost-Savings Analysis of a Statewide Parenting Education Program in Child Welfare
ERIC Educational Resources Information Center
Maher, Erin J.; Corwin, Tyler W.; Hodnett, Rhenda; Faulk, Karen
2012-01-01
Objectives: This article presents a cost-savings analysis of the statewide implementation of an evidence-informed parenting education program. Methods: Between the years 2005 and 2008, the state of Louisiana used the Nurturing Parenting Program (NPP) to impart parenting skills to child welfare-involved families. Following these families' outcomes…
Analysis of environmental regulatory proposals: Its your chance to influence policy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veil, J.A.
1994-03-02
As part of the regulatory development process, the US Environmental Protection Agency (EPA) collects data, makes various assumptions about the data, and analyzes the data. Although EPA acts in good faith, the agency cannot always be aware of all relevant data, make only appropriate assumptions, and use applicable analytical methods. Regulated industries must carefully review every component of the regulatory decision-making process to identify misunderstandings and errors and to supply additional data relevant to the regulatory action. This paper examines three examples of how EPA's data, assumptions, and analytical methods have been critiqued. The first two examples involve EPA's cost-effectiveness (CE) analyses prepared for the offshore oil and gas effluent limitations guidelines and as part of EPA Region 6's general permit for coastal waters of Texas and Louisiana. A CE analysis relates the cost of regulations to the incremental amount of pollutants that would be removed by the recommended treatment processes. The third example, although not involving a CE analysis, demonstrates how the use of non-representative data can influence the outcome of an analysis.
Analysis of the Space Shuttle main engine simulation
NASA Technical Reports Server (NTRS)
Deabreu-Garcia, J. Alex; Welch, John T.
1993-01-01
This is a final report on an analysis of the Space Shuttle Main Engine Program, a digital simulator code written in Fortran. The research was undertaken in ultimate support of future design studies of a shuttle life-extending Intelligent Control System (ICS). These studies are to be conducted by NASA Lewis Space Research Center. The primary purpose of the analysis was to define the means to achieve a faster running simulation, and to determine if additional hardware would be necessary for speeding up simulations for the ICS project. In particular, the analysis was to consider the use of custom integrators based on the Matrix Stability Region Placement (MSRP) method. In addition to speed of execution, other qualities of the software were to be examined. Among these are the accuracy of computations, the useability of the simulation system, and the maintainability of the program and data files. Accuracy involves control of truncation error of the methods, and roundoff error induced by floating point operations. It also involves the requirement that the user be fully aware of the model that the simulator is implementing.
Rivera, José; Carrillo, Mariano; Chacón, Mario; Herrera, Gilberto; Bojorquez, Gilberto
2007-01-01
The development of smart sensors involves the design of reconfigurable systems capable of working with different input sensors. Reconfigurable systems ideally should spend the least possible amount of time on their calibration. An autocalibration algorithm for intelligent sensors should be able to fix major problems such as offset, variation of gain, and lack of linearity as accurately as possible. This paper describes a new autocalibration methodology for nonlinear intelligent sensors based on artificial neural networks (ANN). The methodology involves analysis of several network topologies and training algorithms. The proposed method was compared against the piecewise and polynomial linearization methods. Method comparison was carried out using different numbers of calibration points and several nonlinearity levels of the input signal. The paper shows that the proposed method turned out to have better overall accuracy than the other two methods. Besides the experimental results and analysis of the complete study, the paper describes the implementation of the ANN in a microcontroller unit (MCU). In order to illustrate the method's capability to build autocalibration and reconfigurable systems, a temperature measurement system was designed and tested. The proposed method is an improvement over classic autocalibration methodologies because it impacts the design process of intelligent sensors, autocalibration methodologies, and their associated factors, like time and cost.
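A minimal sketch of ANN-based autocalibration under an assumed distortion (offset, gain error, cubic nonlinearity); the network topology and the nine calibration points are illustrative, and scikit-learn's MLPRegressor stands in for the paper's ANN.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic nonlinear sensor: offset, gain error, and a cubic term.
true = np.linspace(0.0, 100.0, 201)                   # true temperature, deg C
raw = 0.8 * true + 5.0 + 0.0005 * true**3             # distorted sensor output
raw_n = (raw / raw.max()).reshape(-1, 1)              # scale inputs to [0, 1]

# Train a small network to invert the characteristic from 9 calibration points.
cal = np.linspace(0, 200, 9, dtype=int)
net = MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                   max_iter=5000, random_state=0)
net.fit(raw_n[cal], true[cal])

corrected = net.predict(raw_n)
print(np.max(np.abs(corrected - true)))               # worst-case residual
```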
NASA Astrophysics Data System (ADS)
Deco, Gustavo; Martí, Daniel
2007-03-01
The analysis of transitions in stochastic neurodynamical systems is essential to understand the computational principles that underlie those perceptual and cognitive processes involving multistable phenomena, like decision making and bistable perception. To investigate the role of noise in a multistable neurodynamical system described by coupled differential equations, one usually considers numerical simulations, which are time consuming because of the need for sufficiently many trials to capture the statistics of the influence of the fluctuations on that system. An alternative analytical approach involves the derivation of deterministic differential equations for the moments of the distribution of the activity of the neuronal populations. However, the application of the method of moments is restricted by the assumption that the distribution of the state variables of the system takes on a unimodal Gaussian shape. We extend in this paper the classical moments method to the case of bimodal distribution of the state variables, such that a reduced system of deterministic coupled differential equations can be derived for the desired regime of multistability.
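As background to the extension described above, the classical unimodal closure for a one-dimensional rate equation $\dot{x} = f(x) + \xi(t)$, with white noise of intensity $\sigma^2$, replaces the stochastic equation by deterministic equations for the mean $\mu = \langle x\rangle$ and variance $s$ (a second-order Gaussian closure; multi-population systems carry one such set per population plus covariances):

$$ \frac{d\mu}{dt} \approx f(\mu) + \tfrac{1}{2}\,f''(\mu)\,s, \qquad \frac{ds}{dt} \approx 2 f'(\mu)\,s + \sigma^{2} . $$

The paper's contribution is to relax the single-Gaussian assumption behind these equations so that bimodal distributions of the state variables, the relevant regime for multistability, can also be tracked deterministically.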
Marcon, Tamara Davidson; Girz, Laura; Stillar, Amanda; Tessier, Carole; Lafrance, Adele
2017-01-01
Objectives Best practice guidelines encourage the involvement of parents in the assessment and treatment of child/adolescent eating disorders (ED). This study investigated medical residents’ perspectives regarding parental involvement as well as their expectations for future practice in the assessment and treatment of ED. Method Five hundred and eighty-four medical residents from 17 Canadian residency programs specializing in family medicine, pediatrics, and psychiatry completed a web-based survey. Questions pertained to assessment and treatment practices for child/adolescent ED. Analyses included ANOVAs, paired t-tests, and, for residents who endorsed family involvement (N = 444), qualitative content analysis. Results Overall, residents reported that they “mostly” agreed with the involvement of family in the assessment and treatment of ED. Residents’ endorsement of family involvement in both domains increased according to the extent of ED training received. Four major themes emerged from the content analysis of family involvement and included recommendations in line with evidence-based models and unspecified, passive involvement in the assessment and recovery process. Conclusions Many residents endorse family involvement in both assessment and treatment; however, understanding of the nature of such involvement is often vague. Training in evidence-based protocols is necessary for residents planning to engage in multi-disciplinary assessment, referral, and/or treatment in their future practice. PMID:28747930
Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diprete, D.; McCabe, D.
2016-09-28
The objective of this task was to develop a non-pertechnetate analysis method that the 222-S laboratory could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectrometry. Several additional items were partially investigated, including the impact of a 137Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful for investigating the compatibility, separation efficiency, interference-removal efficacy, and sensitivity of the method.
Laboratory analytical methods for the determination of the hydrocarbon status of soils (a review)
NASA Astrophysics Data System (ADS)
Pikovskii, Yu. I.; Korotkov, L. A.; Smirnova, M. A.; Kovach, R. G.
2017-10-01
Laboratory analytical methods suitable for determining the hydrocarbon status of soils (a specific soil characteristic comprising information on the total content and qualitative features of soluble (bitumoid) carbonaceous substances, the individual hydrocarbons in the bitumoid (polycyclic aromatic hydrocarbons, alkanes, etc.), and the composition and content of hydrocarbon gases) are considered. Among the various physicochemical methods of study, attention is focused on those suitable for wide use. Luminescence-bituminological analysis, low-temperature spectrofluorimetry (Shpolskii spectroscopy), infrared (IR) spectroscopy, gas chromatography, chromatography-mass spectrometry, and some other methods are characterized, along with sample preparation features. The advantages and limitations of each method are described, and their efficiency, instrumental complexity, analysis duration, and accuracy are assessed.
NASA Astrophysics Data System (ADS)
Zhang, X. C.; Zhang, X. Z.; Li, W. H.; Liu, B.; Gong, X. L.; Zhang, P. Q.
The aim of this article is to investigate, through simulation, the use of a dynamic vibration absorber to control engine vibration. Traditional means of vibration control have involved passive and, more recently, active methods. This study differs in that it involves an adaptive component in the design of the vibration absorber, using magnetorheological elastomers (MREs) as the adaptive spring. MREs are a kind of novel smart material whose shear modulus can be controlled by an applied magnetic field. In this paper, the vibration mode of a simple model of an automobile engine is simulated by finite element method (FEM) analysis. Based on this analysis, an MRE adaptive tuned dynamic vibration absorber (ATDVA) is presented to reduce the vibration of the engine. Simulation results indicate that the control frequency of the ATDVA can be changed by modifying the shear modulus of the MREs; the vibration-reduction efficiency of the ATDVA is also evaluated by FEM analysis.
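To make the tuning mechanism concrete, here is a minimal sketch with invented numbers; the mass, MRE geometry, and moduli are assumptions, not the paper's values:

```python
# Illustrative numbers only (not from the paper): an adaptive tuned vibration
# absorber is retuned by changing the MRE spring stiffness. For a shear-type
# MRE spring, k = G*A/h, and the tuned frequency is f = sqrt(k/m)/(2*pi),
# so raising the field-dependent shear modulus G shifts the absorbed frequency.
import numpy as np

m = 0.5                                # absorber mass, kg (assumed)
A, h = 4e-4, 5e-3                      # MRE shear area (m^2) and thickness (m), assumed
G = np.array([0.4e6, 0.8e6, 1.6e6])    # shear modulus at increasing field, Pa (assumed)

k = G * A / h                          # effective shear stiffness, N/m
f = np.sqrt(k / m) / (2 * np.pi)       # tuned frequency, Hz
for Gi, fi in zip(G, f):
    print(f"G = {Gi/1e6:.1f} MPa  ->  tuned frequency {fi:.1f} Hz")
```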
Multi-scale modelling of elastic moduli of trabecular bone
Hamed, Elham; Jasiuk, Iwona; Yoo, Andrew; Lee, YikHan; Liszka, Tadeusz
2012-01-01
We model trabecular bone as a nanocomposite material with hierarchical structure and predict its elastic properties at different structural scales. The analysis involves a bottom-up multi-scale approach, starting with nanoscale (mineralized collagen fibril) and moving up the scales to sub-microscale (single lamella), microscale (single trabecula) and mesoscale (trabecular bone) levels. Continuum micromechanics methods, composite materials laminate theory and finite-element methods are used in the analysis. Good agreement is found between theoretical and experimental results. PMID:22279160
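As a flavor of the micromechanics involved at the lowest scale, the following sketch computes the elementary Voigt and Reuss bounds for a two-phase mineral/collagen composite; the moduli and volume fraction are assumed literature-order values, not the paper's inputs:

```python
# Minimal micromechanics sketch (illustrative values, not the paper's inputs):
# Voigt (iso-strain) and Reuss (iso-stress) bounds for the elastic modulus of a
# mineralized collagen fibril treated as a two-phase mineral/collagen composite.
E_mineral, E_collagen = 100.0, 1.5   # GPa, typical literature-order values (assumed)
vf = 0.42                            # mineral volume fraction (assumed)

E_voigt = vf * E_mineral + (1 - vf) * E_collagen          # upper bound
E_reuss = 1.0 / (vf / E_mineral + (1 - vf) / E_collagen)  # lower bound
print(f"Voigt bound: {E_voigt:.1f} GPa, Reuss bound: {E_reuss:.2f} GPa")
```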
1992-10-01
APPENDIX D: Drawing Navigator Field Test ... methods replace manual methods, the automation will handle the data for the designer, thus reducing error and increasing throughput. However, the two ... actively move data from one automation tool (CADD) to the other (the analysis program). This intervention involves a manual rekeying of data already in
Lores, E M; Bradway, D E; Moseman, R F
1978-01-01
The analyses of four organophosphorus pesticide poisoning cases, three of which resulted in death, are reported. The case histories of the subjects, along with the analysis of tissues, urine, and blood for the levels of pesticides and metabolites are given. The pesticides involved include dicrotophos, chlorpyrifos, malathion, and parathion. The methods of analysis were adapted from previously published methods that provide a very rapid means of identification of organophosphorus pesticides in the tissues or in the blood of poisoned patients.
Analyzing Multiple Outcomes in Clinical Research Using Multivariate Multilevel Models
Baldwin, Scott A.; Imel, Zac E.; Braithwaite, Scott R.; Atkins, David C.
2014-01-01
Objective Multilevel models have become a standard data analysis approach in intervention research. Although the vast majority of intervention studies involve multiple outcome measures, few studies use multivariate analysis methods. The authors discuss multivariate extensions to the multilevel model that can be used by psychotherapy researchers. Method and Results Using simulated longitudinal treatment data, the authors show how multivariate models extend common univariate growth models and how the multivariate model can be used to examine multivariate hypotheses involving fixed effects (e.g., does the size of the treatment effect differ across outcomes?) and random effects (e.g., is change in one outcome related to change in the other?). An online supplemental appendix provides annotated computer code and simulated example data for implementing a multivariate model. Conclusions Multivariate multilevel models are flexible, powerful models that can enhance clinical research. PMID:24491071
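A hedged sketch of one common formulation, stacking the outcomes with correlated random intercepts; this is not the authors' supplemental code, and statsmodels constrains the residual variance to be equal across outcomes, a simplification noted in the comments:

```python
# Hedged sketch (not the authors' code): fit a multivariate multilevel growth
# model by stacking outcomes in long format with an outcome indicator, giving
# each outcome its own fixed growth curve and its own correlated random
# intercept. MixedLM assumes a common residual variance across outcomes.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n, waves = 120, 5
b = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=n)  # correlated person effects

rows = []
for i in range(n):
    for t in range(waves):
        rows.append((i, t, "dep", 8 - 0.8 * t + b[i, 0] + rng.normal(0, 1)))
        rows.append((i, t, "anx", 6 - 0.4 * t + b[i, 1] + rng.normal(0, 1)))
df = pd.DataFrame(rows, columns=["pid", "time", "outcome", "y"])

model = smf.mixedlm("y ~ 0 + C(outcome) + C(outcome):time", df,
                    groups="pid", re_formula="0 + C(outcome)")
fit = model.fit()
print(fit.summary())   # per-outcome slopes; random-effect covariance links the outcomes
```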
Diers, Anne R.; Keszler, Agnes; Hogg, Neil
2015-01-01
BACKGROUND: S-Nitrosothiols have been recognized as biologically relevant products of nitric oxide that are involved in many of the diverse activities of this free radical. SCOPE OF REVIEW: This review discusses current methods for the detection and analysis of protein S-nitrosothiols. The major methods of S-nitrosothiol detection include chemiluminescence-based methods and switch-based methods, each of which comes in various flavors with advantages and caveats. MAJOR CONCLUSIONS: The detection of S-nitrosothiols is challenging and prone to many artifacts. Accurate measurements require an understanding of the underlying chemistry of the methods involved and the use of appropriate controls. GENERAL SIGNIFICANCE: Nothing is more important to a field of research than robust methodology that is generally trusted. The field of S-nitrosation has developed such methods but, as S-nitrosothiols are easy to introduce as artifacts, it is vital that current users learn from the lessons of the past. PMID:23988402
Technology Overview for Advanced Aircraft Armament System Program.
1981-05-01
availability of methods or systems for improving stores and armament safety. Of particular importance are aspects of safety involving hazards analysis ... flutter virtually insensitive to inertia and center-of-gravity location of the store - simplifies and reduces the analysis and testing required to flutter-clear ... status. Nearly every existing reliability analysis and discipline that promised a positive return on reliability performance was drawn out, dusted
Ice Growth Measurements from Image Data to Support Ice Crystal and Mixed-Phase Accretion Testing
NASA Technical Reports Server (NTRS)
Struk, Peter M.; Lynch, Christopher J.
2012-01-01
This paper describes the imaging techniques and analysis methods used to measure ice thickness and growth rate in support of ice-crystal icing tests performed at the National Research Council of Canada (NRC) Research Altitude Test Facility (RATFac). A detailed description of the camera setup, which involves both still and video cameras, as well as the analysis methods using the NASA Spotlight software, is presented. Two cases, one from each of two different test entries, showing significant ice growth are analyzed in detail, describing the ice thickness and growth rate, which is generally linear. Estimates of the bias uncertainty are presented for all measurements. Finally, some of the challenges related to the imaging and analysis methods are discussed, as well as methods used to overcome them.
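Since the reported growth is generally linear, the rate estimation reduces to a slope fit; a minimal sketch with invented thickness readings (not NRC data) follows:

```python
# Hypothetical sketch (values invented, not NRC data): since the reported ice
# growth is generally linear, the growth rate can be estimated as the slope of
# a least-squares line through thickness-vs-time measurements taken from images.
import numpy as np

t = np.array([0, 30, 60, 90, 120])        # s, assumed sample times
h = np.array([0.0, 0.9, 2.1, 3.0, 4.1])   # mm, assumed measured thickness

coeffs, cov = np.polyfit(t, h, 1, cov=True)
rate, intercept = coeffs
rate_sigma = np.sqrt(cov[0, 0])           # 1-sigma fit uncertainty on the slope
print(f"growth rate = {rate*60:.2f} +/- {rate_sigma*60:.2f} mm/min")
```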
Chen, Wei-Qiang; Obermayr, Philipp; Černigoj, Urh; Vidič, Jana; Panić-Janković, Tanta; Mitulović, Goran
2017-11-01
Classical proteomics approaches involve enzymatic hydrolysis of proteins (either separated on polyacrylamide gels or in solution) followed by peptide identification using LC-MS/MS analysis. This method normally requires more than 16 h to complete. In the case of clinical analysis, it is of the utmost importance to provide fast and reproducible analysis with minimal manual sample handling. Herein we report method development for online protein digestion on immobilized monolithic enzymatic reactors (IMER) to accelerate protein digestion, reduce manual sample handling, and bring reproducibility to the digestion process in the clinical laboratory. An integrated online digestion and separation method using a monolithic immobilized enzymatic reactor was developed and applied to the digestion and separation of in-vitro-fertilization media. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Closed-loop transfer recovery with observer-based controllers. I - Analysis. II - Design
NASA Technical Reports Server (NTRS)
Chen, Ben M.; Saberi, Ali; Ly, Uy-Loi
1992-01-01
A detailed study is presented of three fundamental issues related to the problem of closed-loop transfer (CLT) recovery. The first issue concerns what can and cannot be achieved for a given system and for an arbitrary target CLT function (TCLTF). The second issue involves developing necessary and/or sufficient conditions for a TCLTF to be recoverable, either exactly or approximately. The third issue involves the necessary and/or sufficient conditions on a given system such that it has at least one recoverable TCLTF. The results of the analysis identify fundamental limitations of the given system that follow from its structural properties, enabling designers to appreciate at the outset the design limitations incurred in the synthesis of output-feedback controllers. The actual design of full-order or reduced-order observer-based controllers that achieve the desired TCLTF as closely as possible is then addressed. Three design methods are considered: (1) the ATEA method, (2) a method that minimizes the H2-norm of a recovery matrix, and (3) a method that minimizes the corresponding H-infinity norm. The relative merits of the methods are discussed.
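For orientation, the following sketch shows the standard observer-based output-feedback construction that underlies such designs, with gains chosen by simple pole placement rather than the paper's ATEA or H2/H-infinity methods:

```python
# Generic observer-based output-feedback sketch (standard textbook construction,
# not the paper's ATEA or H2/H-infinity designs): state feedback u = -K*xhat
# with a full-order Luenberger observer, gains chosen by pole placement.
import numpy as np
from scipy.signal import place_poles

A = np.array([[0.0, 1.0], [-2.0, -0.5]])   # example plant (assumed)
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

K = place_poles(A, B, [-3.0, -4.0]).gain_matrix          # state-feedback gain
L = place_poles(A.T, C.T, [-8.0, -9.0]).gain_matrix.T    # observer gain (dual problem)

# Combined dynamics of [x; xhat] with xhat' = A xhat + B u + L (y - C xhat), u = -K xhat
Acl = np.block([[A, -B @ K], [L @ C, A - B @ K - L @ C]])
print(np.linalg.eigvals(Acl))   # separation principle: feedback poles + observer poles
```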
Elucidation of reaction mechanism involved in the formation of LaNiO3 from XRD and TG analysis
NASA Astrophysics Data System (ADS)
Dharmadhikari, Dipti V.; Athawale, Anjali A.
2013-06-01
The present work focuses on the synthesis of LaNiO3 and the elucidation of the reaction mechanism involved in its formation with the help of X-ray diffraction (XRD) and thermogravimetric (TG) analysis. LaNiO3 was synthesized by a hydrothermal method, heating at 160°C under autogenous pressure for 6 h. A pure-phase product was obtained after calcining the hydrothermally activated product for 6 h at 700°C. The various phases of the product obtained after hydrothermal treatment and calcination, followed by the formation of pure-phase nanocrystalline lanthanum nickel oxide, could be determined from XRD analysis of the samples. The reaction mechanism and phase-formation temperature have been interpreted from thermogravimetric analysis of the hydrothermally synthesized product and XRD analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Benthem, Mark Hilary; Mowry, Curtis Dale; Kotula, Paul Gabriel
Thermal decomposition of the poly(dimethylsiloxane) compounds Sylgard® 184 and 186 was examined using thermal desorption coupled gas chromatography-mass spectrometry (TD/GC-MS) and multivariate analysis. This work describes a method of producing multiway data using a stepped thermal desorption. The technique involves sequentially heating a sample of the material of interest with subsequent analysis in a commercial GC/MS system. The decomposition chromatograms were analyzed using multivariate analysis tools including principal component analysis (PCA), factor rotation employing the varimax criterion, and multivariate curve resolution. The results of the analysis show seven components related to offgassing of various fractions of siloxanes that vary as a function of temperature. Thermal desorption coupled with gas chromatography-mass spectrometry is a powerful analytical technique for analyzing chemical mixtures. It has great potential in numerous analytic areas, including materials analysis, sports medicine, the detection of designer drugs, and biological research for metabolomics. Data analysis is complicated, far from automated, and can result in high false positive or false negative rates. We have demonstrated a step-wise TD/GC-MS technique that removes more volatile compounds from a sample before extracting the less volatile compounds. This creates an additional dimension of separation before the GC column, while simultaneously generating three-way data. Sandia's proven multivariate analysis methods, when applied to these data, have several advantages over current commercial options. The approach has also demonstrated potential for success in finding and enabling identification of trace compounds. Several challenges remain, however, including understanding the sources of noise in the data, outlier detection, improving the data pretreatment and analysis methods, developing a software tool for ease of use by the chemist, and demonstrating our belief that this multivariate analysis will enable superior differentiation capabilities. In addition, noise and system artifacts challenge the analysis of GC-MS data collected on lower cost equipment, ubiquitous in commercial laboratories. This research has the potential to affect many areas of analytical chemistry, including materials analysis, medical testing, and environmental surveillance. It could also provide a method to measure adsorption parameters for chemical interactions on various surfaces by measuring desorption as a function of temperature for mixtures. We have presented results of a novel method for examining offgas products of a common PDMS material. Our method involves a stepped TD/GC-MS data acquisition scheme that may be almost totally automated, coupled with multivariate analysis schemes. This method of data generation and analysis can be applied to a number of materials aging and thermal degradation studies.
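A condensed sketch of the PCA step on a mock desorption-step-by-mass-channel matrix (synthetic numbers, not Sandia's data):

```python
# Minimal PCA sketch on a mock desorption-step x mass-channel matrix: each row
# is one temperature step's summed mass spectrum; PCA then groups channels
# that offgas together as the temperature is raised. Synthetic data only.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
steps, channels = 8, 50
low_T = np.exp(-np.arange(steps))       # component abundant at low temperature
high_T = np.arange(steps) / steps       # component released at high temperature
X = (np.outer(low_T, rng.random(channels)) +
     np.outer(high_T, rng.random(channels)) +
     rng.normal(0, 0.01, (steps, channels)))

pca = PCA(n_components=2)
scores = pca.fit_transform(X)
print(pca.explained_variance_ratio_)    # two factors dominate by construction
print(scores[:, 0])                     # trend of component 1 across the steps
```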
NASA Technical Reports Server (NTRS)
1995-01-01
This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The first volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life-cycle application of formal methods.
Best, Paul; Badham, Jennifer; Corepal, Rekesh; O'Neill, Roisin F; Tully, Mark A; Kee, Frank; Hunter, Ruth F
2017-11-23
While Patient and Public Involvement (PPI) is encouraged throughout the research process, engagement is typically limited to intervention design and post-analysis stages. There are few approaches to participatory data analysis within complex health interventions. Using qualitative data from a feasibility randomised controlled trial (RCT), this proof-of-concept study tests the value of a new approach to participatory data analysis called Participatory Theme Elicitation (PTE). Forty excerpts were given to eight members of a youth advisory PPI panel to sort into piles based on their perception of related thematic content. Using algorithms to detect communities in networks, excerpts were then assigned to a thematic cluster that combined the panel members' perspectives. Network analysis techniques were also used to identify key excerpts in each grouping, which were then further explored qualitatively. While the PTE analysis was, for the most part, consistent with the researcher-led analysis, young people also identified new emerging thematic content. PTE appears promising for encouraging user-led identification of themes arising from qualitative data collected during complex interventions. Further work is required to validate and extend this method. ClinicalTrials.gov, ID: NCT02455986. Retrospectively registered on 21 May 2015.
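A hedged sketch of the pooling step that PTE-style approaches use (illustrative sorting data; the trial's exact algorithm and parameters are not specified here):

```python
# Hedged sketch of a PTE-style pooling step: build a graph whose edge weights
# count how many panel members sorted two excerpts into the same pile, then
# detect communities to obtain candidate themes. Illustrative data only.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# piles[m] = one member's sorting of excerpt ids into piles
piles = [
    [[0, 1, 2], [3, 4], [5]],
    [[0, 1], [2, 3, 4], [5]],
    [[0, 1, 2], [3, 4, 5]],
]

G = nx.Graph()
G.add_nodes_from(range(6))
for member in piles:
    for pile in member:
        for i in pile:
            for j in pile:
                if i < j:
                    w = G.get_edge_data(i, j, {"weight": 0})["weight"]
                    G.add_edge(i, j, weight=w + 1)

themes = greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in themes])   # excerpt clusters = candidate themes
```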
Proteome Profile of Starch Granules Purified from Rice (Oryza sativa) Endosperm.
Xing, Shihai; Meng, Xiaoxi; Zhou, Lihui; Mujahid, Hana; Zhao, Chunfang; Zhang, Yadong; Wang, Cailin; Peng, Zhaohua
2016-01-01
Starch is the most important food energy source in cereals. Many of the known enzymes involved in starch biosynthesis are partially or entirely granule-associated in the endosperm. Studying the proteome of rice starch granules is critical for us to further understand the mechanisms underlying starch biosynthesis and packaging of starch granules in rice amyloplasts, consequently for the improvement of rice grain quality. In this article, we developed a protocol to purify starch granules from mature rice endosperm and verified the quality of purified starch granules by microscopy observations, I2 staining, and Western blot analyses. In addition, we found the phenol extraction method was superior to Tris-HCl buffer extraction method with respect to the efficiency in recovery of starch granule associated proteins. LC-MS/MS analysis showed identification of already known starch granule associated proteins with high confidence. Several proteins reported to be involved in starch synthesis in prior genetic studies in plants were also shown to be enriched with starch granules, either directly or indirectly, in our studies. In addition, our results suggested that a few additional candidate proteins may also be involved in starch synthesis. Furthermore, our results indicated that some starch synthesis pathway proteins are subject to protein acetylation modification. GO analysis and KEGG pathway enrichment analysis showed that the identified proteins were mainly located in plastids and involved in carbohydrate metabolism. This study substantially advances the understanding of the starch granule associated proteome in rice and post translational regulation of some starch granule associated proteins.
NASA Astrophysics Data System (ADS)
Marpaung, B. O. Y.; Waginah
2018-03-01
Every community settlement that forms is related to the social, cultural, and economic conditions of its society. Participation is a process involving human interaction; from these interactions arise activities that can potentially form new space (Hendriksen, et al., 2012). The problems addressed in this research relate to community involvement in building dwellings and in determining land use, roads, and utilities in the Kampung Nelayan Belawan Medan settlement. The aim of this research is to identify that community involvement. In the data collection process, the researchers conducted field observations and interviews, and then connected theory and the interpretation of data in determining the method of data analysis. The finding of this research is that the formation of settlement spaces in the fishing village is inseparable from community participation in the Kampung Nelayan Belawan Medan settlement.
van der Ham, Alida J; van Erp, Nicole; Broerse, Jacqueline E W
2016-04-01
The aim of this study was to gain better insight into the quality of patient participation in the development of clinical practice guidelines and to contribute to approaches for the monitoring and evaluation of such initiatives. In addition, we explore the potential of a dialogue-based approach for reconciling the preferences of patients and professionals in guideline development processes. The development of the Multidisciplinary Guideline for Employment and Severe Mental Illness in the Netherlands served as a case study. Methods for patient involvement in guideline development included the following: four patient representatives in the development group and advisory committee, two focus group discussions with patients, a dialogue session, and eight case studies. To evaluate the quality of patient involvement, we developed a monitoring and evaluation framework including both process and outcome criteria. Data collection included observations, document analysis, and semi-structured interviews (n = 26). The quality of patient involvement was enhanced by the use of different methods, the reflection of patient input in the guideline text, a supportive attitude among professionals, and attention to patient involvement throughout the process. The quality was lower with respect to representing the diversity of the target group, articulating the patient perspective in the guideline development group (GDG), and clarity and transparency concerning methods of involvement. The monitoring and evaluation framework was useful in providing detailed insights into patient involvement in guideline development. Patient involvement was evaluated as being of good quality. The dialogue-based approach appears to be a promising method for obtaining integrated stakeholder input in a multidisciplinary setting. © 2015 John Wiley & Sons Ltd.
Multiple testing and power calculations in genetic association studies.
So, Hon-Cheong; Sham, Pak C
2011-01-01
Modern genetic association studies typically involve multiple single-nucleotide polymorphisms (SNPs) and/or multiple genes. With the development of high-throughput genotyping technologies and the reduction in genotyping cost, investigators can now assay up to a million SNPs for direct or indirect association with disease phenotypes. In addition, some studies involve multiple disease or related phenotypes and use multiple methods of statistical analysis. The combination of multiple genetic loci, multiple phenotypes, and multiple methods of evaluating associations between genotype and phenotype means that modern genetic studies often involve the testing of an enormous number of hypotheses. When multiple hypothesis tests are performed in a study, there is a risk of inflating the type I error rate (i.e., the chance of falsely claiming an association when there is none). Several methods for multiple-testing correction are in popular use, and they all have strengths and weaknesses. Because no single method is universally adopted or always appropriate, it is important to understand the principles, strengths, and weaknesses of the methods so that they can be applied appropriately in practice. In this article, we review the three principal methods for multiple-testing correction and provide guidance for calculating statistical power.
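A short illustration of such corrections in generic use (example p-values, not the article's data):

```python
# Standard multiple-testing corrections applied to the same p-values via
# statsmodels: Bonferroni, Holm, and Benjamini-Hochberg FDR. Generic usage,
# not the article's own analysis.
import numpy as np
from statsmodels.stats.multitest import multipletests

pvals = np.array([0.001, 0.008, 0.039, 0.041, 0.27, 0.61])
for method in ["bonferroni", "holm", "fdr_bh"]:
    reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method=method)
    print(method, reject, np.round(p_adj, 3))
```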
Magnusson, R; Nordlander, T; Östin, A
2016-01-15
Sampling teams performing work at sea in areas where chemical munitions may have been dumped require rapid and reliable analytical methods for verifying sulfur mustard leakage from suspected objects. Here we present such an on-site analysis method based on dynamic headspace GC-MS for the analysis of five cyclic sulfur mustard degradation products that have previously been detected in sediments from chemical weapon dumping sites: 1,4-oxathiane, 1,3-dithiolane, 1,4-dithiane, 1,4,5-oxadithiephane, and 1,2,5-trithiephane. An experimental design involving authentic Baltic Sea sediments spiked with the target analytes was used to develop an optimized protocol for sample preparation, headspace extraction, and analysis that afforded recoveries of up to 60-90%. The optimized method needs no organic solvents, uses only two grams of sediment on a dry weight basis, and involves a unique sample presentation whereby the sediment is spread uniformly as a thin layer inside the walls of a glass headspace vial. The method showed good linearity for analyte concentrations of 5-200 ng/g dw, good repeatability, and acceptable carry-over. The method's limits of detection for spiked sediment samples ranged from 2.5 to 11 μg/kg dw, with matrix interference being the main limiting factor. The instrumental detection limits were one to two orders of magnitude lower. Full-scan GC-MS analysis enabled the use of automated mass spectral deconvolution for rapid identification of target analytes. Using this approach, analytes could be identified in spiked sediment samples at concentrations down to 13-65 μg/kg dw. On-site validation experiments conducted aboard the research vessel R/V Oceania demonstrated the method's practical applicability, enabling the successful identification of four cyclic sulfur mustard degradation products at concentrations of 15-308 μg/kg in sediments immediately after collection near a wreck at the Bornholm Deep dumpsite in the Baltic Sea. Copyright © 2015 Elsevier B.V. All rights reserved.
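For context, a generic calibration/LOD sketch over the linear range the method reports (synthetic peak areas, not the paper's data):

```python
# Generic calibration/LOD sketch: fit the reported linear range (5-200 ng/g)
# and estimate a limit of detection as LOD = 3.3 * s / slope, with s the
# residual standard deviation of the fit. Synthetic numbers only.
import numpy as np

conc = np.array([5, 10, 25, 50, 100, 200], dtype=float)   # ng/g dw, spiked levels
area = np.array([0.9, 2.1, 5.2, 10.4, 20.5, 41.0])        # assumed peak areas

slope, intercept = np.polyfit(conc, area, 1)
resid = area - (slope * conc + intercept)
s = resid.std(ddof=2)                                     # residual std dev
print(f"slope {slope:.4f}, LOD ~ {3.3 * s / slope:.1f} ng/g")
```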
2010-01-01
Background: Cluster analysis, and in particular hierarchical clustering, is widely used to extract information from gene expression data. The aim is to discover new classes, or sub-classes, of either individuals or genes. Performing a cluster analysis commonly involves decisions on how to handle missing values, standardize the data, and select genes. In addition, pre-processing, involving various types of filtration and normalization procedures, can affect the ability to discover biologically relevant classes. Here we consider cluster analysis in a broad sense and perform a comprehensive evaluation that covers several aspects of cluster analysis, including normalization. Results: We evaluated 2780 cluster analysis methods on seven publicly available 2-channel microarray data sets with common reference designs. The cluster analysis methods differed in data normalization (5 normalizations were considered), missing value imputation (2), standardization of data (2), gene selection (19), or clustering method (11). The cluster analyses were evaluated using known classes, such as cancer types, and the adjusted Rand index. The performance of the different analyses varies between the data sets, and it is difficult to give general recommendations. However, normalization, gene selection, and clustering method are all variables that have a significant impact on performance. In particular, gene selection is important, and it is generally necessary to include a relatively large number of genes to obtain good performance. Selecting genes with high standard deviation or using principal component analysis are shown to be the preferred gene selection methods. Hierarchical clustering using Ward's method, k-means clustering, and Mclust are the clustering methods considered in this paper that achieve the highest adjusted Rand index. Normalization can have a significant positive impact on the ability to cluster individuals, and there are indications that background correction is preferable, in particular if the gene selection is successful. However, this is an area that needs to be studied further in order to draw general conclusions. Conclusions: The choice of cluster analysis, and in particular gene selection, has a large impact on the ability to cluster individuals correctly based on expression profiles. Normalization has a positive effect, but the relative performance of different normalizations is an area that needs more research. In summary, although clustering, gene selection, and normalization are considered standard methods in bioinformatics, our comprehensive analysis shows that selecting the right methods, and the right combinations of methods, is far from trivial and that much is still unexplored in what is considered the most basic analysis of genomic data. PMID:20937082
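A condensed sketch of one arm of such an evaluation loop (generic scikit-learn usage on synthetic data, not the study's microarray sets):

```python
# One arm of the evaluation loop described above: select high-variance genes,
# cluster individuals with Ward's method, and score against the known classes
# with the adjusted Rand index. Synthetic data standing in for microarrays.
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(0)
labels = np.repeat([0, 1], 20)                     # two known "cancer types"
X = rng.normal(0, 1, (40, 500))
X[labels == 1, :30] += 1.5                         # 30 informative genes

top = np.argsort(X.std(axis=0))[::-1][:50]         # gene selection: highest SD
pred = AgglomerativeClustering(n_clusters=2, linkage="ward").fit_predict(X[:, top])
print("adjusted Rand:", adjusted_rand_score(labels, pred))
```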
High resolution melting (HRM) analysis of DNA--its role and potential in food analysis.
Druml, Barbara; Cichna-Markl, Margit
2014-09-01
DNA-based methods play an increasing role in food safety control and food adulteration detection. Recent papers show that high resolution melting (HRM) analysis is an interesting approach. It involves amplification of the target of interest in the presence of a saturation dye by the polymerase chain reaction (PCR) and subsequent melting of the amplicons by gradually increasing the temperature. Since the melting profile depends on the GC content, length, sequence, and strand complementarity of the product, HRM analysis is highly suitable for the detection of single-base variants and small insertions or deletions. The review gives an introduction to HRM analysis, covers important aspects in the development of an HRM analysis method, and describes how HRM data are analysed and interpreted. We then discuss the potential of methods based on HRM analysis in food analysis, i.e., for the identification of closely related species and cultivars and the identification of pathogenic microorganisms. Copyright © 2014 Elsevier Ltd. All rights reserved.
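A toy sketch of the core HRM readout, locating the melting temperature as the peak of -dF/dT (synthetic fluorescence curve, not real HRM data):

```python
# Toy melt-curve sketch: amplicons are distinguished by the melting temperature
# Tm, read off as the peak of the negative derivative -dF/dT of the
# fluorescence-vs-temperature curve. Synthetic fluorescence only.
import numpy as np

T = np.linspace(70, 95, 251)                      # deg C
Tm_true = 84.0
F = 1.0 / (1.0 + np.exp((T - Tm_true) / 0.6))     # sigmoidal melt of a duplex

dF = -np.gradient(F, T)                           # melt peak
print(f"estimated Tm = {T[np.argmax(dF)]:.1f} C") # ~84.0; shifts with GC content/variants
```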
Power flow as a complement to statistical energy analysis and finite element analysis
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1987-01-01
Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is appropriate. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly, while SEA methods can only predict average response levels. In this mid-frequency range, a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. The power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies to provide a verification of the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.
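As a minimal illustration of the mobility formulation (a single-degree-of-freedom example with assumed values, not the paper's beam case):

```python
# Small mobility/power-flow sketch: input mobility Y(w) = v/F =
# i*w / (k - m*w**2 + i*w*c), and the time-averaged input power for a force
# of amplitude F is P = 0.5 * |F|**2 * Re(Y). Illustrative SDOF values.
import numpy as np

m, k, c = 1.0, 1.0e4, 2.0                   # kg, N/m, N*s/m (assumed)
w = np.linspace(10, 300, 5)                 # rad/s
Y = 1j * w / (k - m * w**2 + 1j * w * c)    # input mobility
P = 0.5 * 1.0**2 * Y.real                   # input power for |F| = 1 N
for wi, Pi in zip(w, P):
    print(f"w = {wi:6.1f} rad/s  P_in = {Pi:.3e} W")
```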
Application of the Covalent Bond Classification Method for the Teaching of Inorganic Chemistry
ERIC Educational Resources Information Center
Green, Malcolm L. H.; Parkin, Gerard
2014-01-01
The Covalent Bond Classification (CBC) method provides a means to classify covalent molecules according to the number and types of bonds that surround an atom of interest. This approach is based on an elementary molecular orbital analysis of the bonding involving the central atom (M), with the various interactions being classified according to the…
Farash, Katherine; Hanson, Erin K; Ballantyne, Jack
2015-03-09
DNA profiles can be obtained from 'touch DNA' evidence, which comprises microscopic traces of human biological material. Current methods for the recovery of trace DNA employ cotton swabs or adhesive tape to sample an area of interest. However, such a 'blind-swabbing' approach will co-sample cellular material from the different individuals, even if the individuals' cells are located in geographically distinct locations on the item. Thus, some of the DNA mixtures encountered in touch DNA samples are artificially created by the swabbing itself. In some instances, a victim's DNA may be found in significant excess thus masking any potential perpetrator's DNA. In order to circumvent the challenges with standard recovery and analysis methods, we have developed a lower cost, 'smart analysis' method that results in enhanced genetic analysis of touch DNA evidence. We describe an optimized and efficient micromanipulation recovery strategy for the collection of bio-particles present in touch DNA samples, as well as an enhanced amplification strategy involving a one-step 5 µl microvolume lysis/STR amplification to permit the recovery of STR profiles from the bio-particle donor(s). The use of individual or few (i.e., "clumps") bioparticles results in the ability to obtain single source profiles. These procedures represent alternative enhanced techniques for the isolation and analysis of single bioparticles from forensic touch DNA evidence. While not necessary in every forensic investigation, the method could be highly beneficial for the recovery of a single source perpetrator DNA profile in cases involving physical assault (e.g., strangulation) that may not be possible using standard analysis techniques. Additionally, the strategies developed here offer an opportunity to obtain genetic information at the single cell level from a variety of other non-forensic trace biological material.
Ma, Jing; Hou, Xiaofang; Zhang, Bing; Wang, Yunan; He, Langchong
2014-03-01
In this study, a new "heart-cutting" two-dimensional liquid chromatography method for the simultaneous determination of carbohydrate contents in milk powder is presented. In this two-dimensional liquid chromatography system, a Venusil XBP-C4 analysis column was used in the first dimension (1D) as a pre-separation column, and a ZORBAX carbohydrate analysis column was used in the second dimension (2D) as a final-analysis column. The whole process was completed in less than 35 min without a particular sample preparation procedure. The capability of the new two-dimensional HPLC method was demonstrated in the determination of carbohydrates in various brands of milk powder samples. A conventional one-dimensional chromatography method was also proposed. The two proposed methods were both validated in terms of linearity, limits of detection, accuracy, and precision. The comparison between the results obtained with the two methods showed that the new and completely automated two-dimensional liquid chromatography method is more suitable for milk powder samples because of the online cleanup effect involved. Crown Copyright © 2013. Published by Elsevier B.V. All rights reserved.
Kenyan Nurses Involvement in National Policy Development Processes
Juma, Pamela Atieno
2014-01-01
The aim of this study was to critically examine how nurses have been involved in national policy processes in the Kenyan health sector. The paper reports qualitative results from a larger mixed-methods study. National non-nursing decision-makers and nurse leaders, provincial managers, and frontline nurse managers from two Kenyan districts were purposefully selected for interviews. Interviews dealt with nurses' involvement in national policy processes, factors hindering nurses' engagement in policy processes, and ways to enhance nurses' involvement in policy processes. Critical theory and feminist perspectives guided the study process. Content analysis of the data was conducted. Findings revealed that nurses' involvement in policy processes in Kenya was limited. Only a few nurse leaders were involved in national policy committees, as a result of their positions in the sector. Critical analysis of the findings revealed that hierarchies and structural factors, as well as nursing professional issues, were the primary barriers constraining nurses' involvement in policy processes. Thus, there is a need to address these factors, both by nurses themselves and by non-nursing decision-makers, in order to enhance nurses' engagement in policymaking and thereby improve the contribution to the quality of services provided to communities. PMID:25349731
Methods for determination of inorganic substances in water and fluvial sediments
Fishman, Marvin J.; Friedman, Linda C.
1985-01-01
Chapter A1 of the laboratory manual contains methods used by the Geological Survey to analyze samples of water, suspended sediments, and bottom material for their content of inorganic constituents. Included are methods for determining the concentration of dissolved constituents in water, total recoverable and total concentrations of constituents in water-suspended sediment samples, and recoverable and total concentrations of constituents in samples of bottom material. Essential definitions are included in the introduction to the manual, along with a brief discussion of the use of significant figures in calculating and reporting analytical results. Quality control in the water-analysis laboratory is discussed, including accuracy and precision of analyses, the use of standard reference water samples, and the operation of an effective quality assurance program. Methods for sample preparation and pretreatment are also given. A brief discussion of the principles of the analytical techniques involved and their particular application to water and sediment analysis is presented. The analytical methods involving these techniques are arranged alphabetically according to constituent. For each method, the general topics covered are application, principle of the method, interferences, apparatus and reagents required, a detailed description of the analytical procedure, reporting of results, units and significant figures, and analytical precision data, when available. More than 125 methods are given for the determination of 70 different inorganic constituents and physical properties of water, suspended sediment, and bottom material.
Gale, Nicola K; Heath, Gemma; Cameron, Elaine; Rashid, Sabina; Redwood, Sabi
2013-09-18
The Framework Method is becoming an increasingly popular approach to the management and analysis of qualitative data in health research. However, there is confusion about its potential application and limitations. The article discusses when it is appropriate to adopt the Framework Method and explains the procedure for using it in multi-disciplinary health research teams, or those that involve clinicians, patients and lay people. The stages of the method are illustrated using examples from a published study. Used effectively, with the leadership of an experienced qualitative researcher, the Framework Method is a systematic and flexible approach to analysing qualitative data and is appropriate for use in research teams even where not all members have previous experience of conducting qualitative research.
Acoustics based assessment of respiratory diseases using GMM classification.
Mayorga, P; Druzgalski, C; Morelos, R L; Gonzalez, O H; Vidales, J
2010-01-01
The focus of this paper is to present a method utilizing lung sounds for a quantitative assessment of patient health as it relates to respiratory disorders. To accomplish this, applicable traditional techniques from the speech processing domain were used to evaluate lung sounds obtained with a digital stethoscope. Traditional methods for the evaluation of asthma involve auscultation and spirometry, but the use of more sensitive electronic stethoscopes, which are currently available, and the application of quantitative signal analysis methods offer opportunities for improved diagnosis. In particular, we propose an acoustic evaluation methodology based on Gaussian Mixture Models (GMM), which should assist in broader analysis, identification, and diagnosis of asthma based on frequency-domain analysis of wheezing and crackles.
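A schematic of GMM-based classification in this spirit (synthetic stand-in features rather than real lung-sound cepstra; not the paper's parameters):

```python
# Schematic GMM classifier: fit one Gaussian mixture per class on per-frame
# feature vectors, then label a recording by the higher total log-likelihood.
# Synthetic features stand in for lung-sound frames.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
normal_frames = rng.normal(0.0, 1.0, (500, 12))   # stand-in feature vectors
wheeze_frames = rng.normal(0.8, 1.2, (500, 12))

gmm_normal = GaussianMixture(n_components=4, random_state=0).fit(normal_frames)
gmm_wheeze = GaussianMixture(n_components=4, random_state=0).fit(wheeze_frames)

test = rng.normal(0.8, 1.2, (200, 12))            # frames from an unknown recording
ll_n = gmm_normal.score_samples(test).sum()
ll_w = gmm_wheeze.score_samples(test).sum()
print("wheeze" if ll_w > ll_n else "normal")
```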
The Limits of Functional Analysis in the Study of Mass Communication.
ERIC Educational Resources Information Center
Anderson, James A.; Meyer, Timothy P.
The fundamental limits of the functional approach to the study of mass communication are embodied in two of its criticisms. The first weakness is in its logical structure and the second involves the limits that are set by known methods. Functional analysis has difficulties as a meaningful research perspective because the process of mass…
Sampling coarse woody debris along spoked transects
Paul C. Van Deusen; Jeffery H. Gove
2011-01-01
Line transects are commonly used for sampling coarse woody debris (CWD). The USDA Forest Service Forest Inventory and Analysis programme uses a variant of this method that involves sampling for CWD along transects that radiate from the centre of a circular plot, like spokes on a wheel. A new approach for the analysis of data collected with spoked transects is developed.
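For context, a sketch of the standard line-intersect volume estimator often used with such transects (Van Wagner's formula; the programme's spoked-transect estimator itself is not reproduced here, and the numbers are assumptions):

```python
# Standard line-intersect estimator: CWD volume per unit area from piece
# diameters d_i measured where they cross a transect of total length L is
# V = pi**2 * sum(d_i**2) / (8 * L). Illustrative numbers only.
import numpy as np

d = np.array([0.12, 0.25, 0.08, 0.30])   # m, diameters at intersection (assumed)
L = 3 * 58.9                             # m, e.g. three radial transects (assumed)

V = np.pi**2 * np.sum(d**2) / (8 * L)    # m^3 of CWD per m^2 of ground
print(f"{V * 1e4:.1f} m^3/ha")
```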
ERIC Educational Resources Information Center
Kirby, Nicola; Dempster, Edith
2015-01-01
Quantitative methods of data analysis usually involve inferential statistics, and are not well known for their ability to reflect the intricacies of a diverse student population. The South African tertiary education sector is characterised by extreme inequality and diversity. Foundation programmes address issues of inequality of access by…
ERIC Educational Resources Information Center
Chen, Zhe; Honomichl, Ryan; Kennedy, Diane; Tan, Enda
2016-01-01
The present study examines 5- to 8-year-old children's relation reasoning in solving matrix completion tasks. This study incorporates a componential analysis, an eye-tracking method, and a microgenetic approach, which together allow an investigation of the cognitive processing strategies involved in the development and learning of children's…
Foreign Object Damage to Tires Operating in a Wartime Environment
1991-11-01
barriers were successfully overcome, and the method of testing employed can now be confidently used for future test needs of this type. Data analysis ... combined variable effects. Analysis considerations involved cut types, cut depths, number of cuts, cut/hit probabilities, tire failures, and aircraft ... November 1988, with data reduction and analysis continuing into October 1989. All of the cutting tests reported here were conducted at the
A simplified analysis of the multigrid V-cycle as a fast elliptic solver
NASA Technical Reports Server (NTRS)
Decker, Naomi H.; Taasan, Shlomo
1988-01-01
For special model problems, Fourier analysis gives exact convergence rates for the two-grid multigrid cycle and, for more general problems, provides estimates of the two-grid convergence rates via local mode analysis. A method is presented for obtaining multigrid convergence rate estimates for cycles involving more than two grids (using essentially the same analysis as for the two-grid cycle). For the simple case of the V-cycle used as a fast Laplace solver on the unit square, the k-grid convergence rate bounds obtained by this method are sharper than the bounds predicted by the variational theory. Both theoretical justification and experimental evidence are presented.
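To ground the discussion, here is a minimal two-grid correction scheme for the 1-D Poisson problem whose measured error contraction can be compared against such Fourier-analysis estimates (a toy stand-in, not the paper's 2-D V-cycle):

```python
# Minimal two-grid correction scheme for -u'' = f on (0,1): weighted-Jacobi
# smoothing, full-weighting restriction, linear interpolation, exact coarse
# solve. With b = 0 the iterate is pure error, so the norm ratio per cycle is
# an empirical convergence factor.
import numpy as np

n = 63                                    # fine-grid interior points, h = 1/(n+1)
h = 1.0 / (n + 1)
A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

def jacobi(u, b, nu, w=2.0 / 3.0):
    D = 2.0 / h**2                        # diagonal of A
    for _ in range(nu):
        u = u + w * (b - A @ u) / D
    return u

def two_grid(u, b):
    u = jacobi(u, b, 2)                                    # pre-smooth
    r = b - A @ u
    rc = 0.25 * (r[0:-2:2] + 2 * r[1:-1:2] + r[2::2])      # full-weighting restriction
    nc = rc.size
    Ac = (np.diag(2.0 * np.ones(nc)) - np.diag(np.ones(nc - 1), 1)
          - np.diag(np.ones(nc - 1), -1)) / (2 * h)**2
    ec = np.linalg.solve(Ac, rc)                           # exact coarse-grid solve
    e = np.zeros(n)                                        # linear interpolation back
    e[1::2] = ec
    e[0::2] += 0.5 * np.append(0.0, ec)
    e[0::2] += 0.5 * np.append(ec, 0.0)
    return jacobi(u + e, b, 2)                             # post-smooth

u = np.random.default_rng(0).random(n)                     # error equals u for b = 0
for _ in range(5):
    unew = two_grid(u, np.zeros(n))
    print("contraction factor:", np.linalg.norm(unew) / np.linalg.norm(u))
    u = unew
```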
SEU System Analysis: Not Just the Sum of All Parts
NASA Technical Reports Server (NTRS)
Berg, Melanie D.; Label, Kenneth
2014-01-01
Single event upset (SEU) analysis of complex systems is challenging. Currently, system SEU analysis is performed by component-level partitioning, after which either the most dominant SEU cross-sections are used in system error rate calculations, or the partition cross-sections are summed to obtain a system error rate. In many cases, system error rates are overestimated because these methods generally overlook system-level derating factors. The problem with overestimating is that it can cause overdesign and consequently negatively affect cost, schedule, functionality, and validation/verification. The scope of this presentation is to discuss the risks involved in the current scheme of SEU analysis for complex systems and to provide alternative methods for improvement.
Revealing representational content with pattern-information fMRI--an introductory guide.
Mur, Marieke; Bandettini, Peter A; Kriegeskorte, Nikolaus
2009-03-01
Conventional statistical analysis methods for functional magnetic resonance imaging (fMRI) data are very successful at detecting brain regions that are activated as a whole during specific mental activities. The overall activation of a region is usually taken to indicate involvement of the region in the task. However, such activation analysis does not consider the multivoxel patterns of activity within a brain region. These patterns of activity, which are thought to reflect neuronal population codes, can be investigated by pattern-information analysis. In this framework, a region's multivariate pattern information is taken to indicate representational content. This tutorial introduction motivates pattern-information analysis, explains its underlying assumptions, introduces the most widespread methods in an intuitive way, and outlines the basic sequence of analysis steps.
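A bare-bones pattern-information sketch (synthetic data and generic scikit-learn usage, not the tutorial's own pipeline):

```python
# Bare-bones MVPA sketch: instead of averaging a region's activation, train a
# classifier on the multivoxel pattern and test whether it carries information
# about the experimental condition. Synthetic data only.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
trials, voxels = 80, 120
y = np.repeat([0, 1], trials // 2)              # two stimulus conditions
X = rng.normal(0, 1, (trials, voxels))
X[y == 1, :10] += 0.7                           # weak pattern difference across conditions

acc = cross_val_score(LinearSVC(max_iter=10000), X, y, cv=5)
print("decoding accuracy:", acc.mean())         # above chance => pattern information
```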
Analysis of a spacecraft instrument ball bearing assembly lubricated by a perfluoroalkylether
NASA Technical Reports Server (NTRS)
Morales, W.; Jones, W. R., Jr.; Buckley, D. H.
1986-01-01
An analysis of a spacecraft instrument ball bearing assembly, subjected to a scanning life test, was performed to determine the possible cause of rotational problems involving these units aboard several satellites. The analysis indicated an ineffective transfer of a fluorinated liquid lubricant from a phenolic retainer to the bearing balls. Part of the analysis led to a novel HPLC separation method employing a fluorinated mobile phase in conjunction with silica-based size exclusion columns.
Sunada, Keijiro; Yamamoto, Hironori; Kita, Hiroto; Yano, Tomonori; Sato, Hiroyuki; Hayashi, Yoshikazu; Miyata, Tomohiko; Sekine, Yutaka; Kuno, Akiko; Iwamoto, Michiko; Ohnishi, Hirohide; Ido, Kenichi; Sugano, Kentaro
2005-01-01
AIM: To evaluate the clinical outcome of enteroscopy, using the double-balloon method, focusing on the involvement of neoplasms in strictures of the small intestine. METHODS: Enteroscopy, using the double-balloon method, was performed between December 1999 and December 2002 at Jichi Medical School Hospital, Japan and strictures of the small intestine were found in 17 out of 62 patients. These 17 consecutive patients were subjected to analysis. RESULTS: The double-balloon enteroscopy contributed to the diagnosis of small intestinal neoplasms found in 3 out of 17 patients by direct observation of the strictures as well as biopsy sampling. Surgical procedures were chosen for these three patients, while balloon dilation was chosen for the strictures in four patients diagnosed with inflammation without involvement of neoplasm. CONCLUSION: Double-balloon enteroscopy is a useful method for the diagnosis and treatment of strictures in the small bowel. PMID:15742422
Trends in highway construction costs in Louisiana.
DOT National Transportation Integrated Search
1999-09-01
The objective of this research was to identify and quantify the factors that influence the price of highway construction in Louisiana. The method of investigation involved a literature review and an analysis of construction price records in Louisiana...
A Qualitative Study on Organizational Factors Affecting Occupational Accidents
ESKANDARI, Davood; JAFARI, Mohammad Javad; MEHRABI, Yadollah; KIAN, Mostafa Pouya; CHARKHAND, Hossein; MIRGHOTBI, Mostafa
2017-01-01
Background: Technical, human, operational, and organizational factors influence the sequence of events in occupational accidents. Among them, organizational factors play a major role in causing occupational accidents. The aim of this research was to understand Iranian safety experts' experiences and perceptions of organizational factors. Methods: This qualitative study was conducted in 2015 using the content analysis technique. Data were collected through semi-structured interviews with 17 safety experts working in Iranian universities and industries and analyzed with a conventional qualitative content analysis method using the MAXQDA software. Results: Eleven organizational-factor sub-themes were identified: management commitment, management participation, employee involvement, communication, blame culture, education and training, job satisfaction, interpersonal relationships, supervision, continuous improvement, and reward system. The participants considered these factors to influence occupational accidents. Conclusion: These 11 organizational factors are probably involved in occupational accidents in Iran. Naturally, improving organizational factors can increase safety performance and reduce occupational accidents. PMID:28435824
Phung, Viet‐Hai; Essam, Nadya; Asghar, Zahid; Spaight, Anne
2015-01-01
Abstract Rationale, aims and objectives Clinical leadership and organizational culture are important contextual factors for quality improvement (QI) but the relationship between these and with organizational change is complex and poorly understood. We aimed to explore the relationship between clinical leadership, culture of innovation and clinical engagement in QI within a national ambulance QI Collaborative (QIC). Methods We used a self‐administered online questionnaire survey sent to front‐line clinicians in all 12 English ambulance services. We conducted a cross‐sectional analysis of quantitative data and qualitative analysis of free‐text responses. Results There were 2743 (12% of 22 117) responses from 11 of the 12 participating ambulance services. In the 3% of responders that were directly involved with the QIC, leadership behaviour was significantly higher than for those not directly involved. QIC involvement made no significant difference to responders' perceptions of the culture of innovation in their organization, which was generally considered poor. Although uptake of QI methods was low overall, QIC members were significantly more likely to use QI methods, which were also significantly associated with leadership behaviour. Conclusions Despite a limited organizational culture of innovation, clinical leadership and use of QI methods in ambulance services generally, the QIC achieved its aims to significantly improve pre‐hospital care for acute myocardial infarction and stroke. We postulate that this was mediated through an improvement subculture, linked to the QIC, which facilitated large‐scale improvement by stimulating leadership and QI methods. Further research is needed to understand success factors for QI in complex health care environments. PMID:26303398
Velayutham, Selva Ganapathy; Chandra, Sadanandavalli Retnaswami; Bharath, Srikala; Shankar, Ravi Girikamatha
2017-01-01
Alzheimer's disease (AD) and frontotemporal dementia (FTD) are common neurodegenerative dementias with a wide prevalence. Falls are a common cause of morbidity in these patients. Identifying subclinical involvement of gait and balance parameters might serve as a tool in the differential analysis of these conditions and also help in planning strategies to prevent falls. Eight age- and gender-matched patients in each group were compared with normal controls. Standardized methods of gait and balance assessment were applied to all participants. Results revealed subclinical involvement of gait and balance in all groups, especially during divided attention, with the parameters significantly more affected in patients. Patients with AD and FTD both showed involvement of the overall ambulation index; balance was more affected in AD patients, while FTD patients showed step-cycle and stride-length abnormalities. There is balance and gait involvement in normal ageing as well as in patients with AD and FTD. The pattern of involvement in AD correlates with involvement of the WHERE pathway, and in FTD with involvement of the frontal subcortical circuits. Identifying these differential patterns at a subclinical stage might help to distinguish normal ageing from the different types of cortical dementias, serve as an additional biomarker, and assist in initiating appropriate training methods to prevent future falls.
Structural landscape of base pairs containing post-transcriptional modifications in RNA
Seelam, Preethi P.; Sharma, Purshotam
2017-01-01
Base pairs involving post-transcriptionally modified nucleobases are believed to play important roles in a wide variety of functional RNAs. Here we present our attempts toward understanding the structural and functional role of naturally occurring modified base pairs using a combination of X-ray crystal structure database analysis, sequence analysis, and advanced quantum chemical methods. Our bioinformatics analysis reveals that despite their presence in all major secondary structural elements, modified base pairs are most prevalent in tRNA crystal structures and most commonly involve guanine or uridine modifications. Further, analysis of tRNA sequences reveals additional examples of modified base pairs at structurally conserved tRNA regions and highlights the conservation patterns of these base pairs in three domains of life. Comparison of structures and binding energies of modified base pairs with their unmodified counterparts, using quantum chemical methods, allowed us to classify the base modifications in terms of the nature of their electronic structure effects on base-pairing. Analysis of specific structural contexts of modified base pairs in RNA crystal structures revealed several interesting scenarios, including those at the tRNA:rRNA interface, antibiotic-binding sites on the ribosome, and the three-way junctions within tRNA. These scenarios, when analyzed in the context of available experimental data, allowed us to correlate the occurrence and strength of modified base pairs with their specific functional roles. Overall, our study highlights the structural importance of modified base pairs in RNA and points toward the need for greater appreciation of the role of modified bases and their interactions, in the context of many biological processes involving RNA. PMID:28341704
Schwarz, Betje; Specht, Timo; Bethge, Matthias
2017-12-01
Purpose To explore the patient's perspective on the involvement of employers in rehabilitation. Methods Eight participants of a work-related medical rehabilitation programme were interviewed by telephone 4 weeks after discharge. Qualitative content analysis was used to analyze the generated data. Results Besides poor employer involvement, the interviews revealed that the process of returning to work was characterized and hampered by unused measures for supporting vocational reintegration during rehabilitation, interface problems in the health care and social security system, and a waiting strategy among all involved actors. Conclusion Besides improved employer involvement, systematic interface management and full use of existing measures are needed to support vocational reintegration. © Georg Thieme Verlag KG Stuttgart · New York.
Methods of space radiation dose analysis with applications to manned space systems
NASA Technical Reports Server (NTRS)
Langley, R. W.; Billings, M. P.
1972-01-01
The full potential of state-of-the-art space radiation dose analysis for manned missions has not been exploited. Point doses have been overemphasized, and the critical dose to the bone marrow has been only crudely approximated, despite the existence of detailed man models and computer codes for dose integration in complex geometries. The method presented makes it practical to account for the geometrical detail of the astronaut as well as the vehicle. Discussed are the major assumptions involved and the concept of applying the results of detailed proton dose analysis to the real-time interpretation of on-board dosimetric measurements.
Spectral Analysis of B Stars: An Application of Bayesian Statistics
NASA Astrophysics Data System (ADS)
Mugnes, J.-M.; Robert, C.
2012-12-01
To better understand the processes involved in stellar physics, it is necessary to obtain accurate stellar parameters (effective temperature, surface gravity, abundances…). Spectral analysis is a powerful tool for investigating stars, but it is also vital to reduce uncertainties at a decent computational cost. Here we present a spectral analysis method based on a combination of Bayesian statistics and grids of synthetic spectra obtained with TLUSTY. This method simultaneously constrains the stellar parameters by using all the lines accessible in observed spectra and thus greatly reduces uncertainties and improves the overall spectrum fitting. Preliminary results are shown using spectra from the Observatoire du Mont-Mégantic.
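As a rough illustration of the approach described (not the authors' code), the Python sketch below evaluates a Gaussian likelihood of an observed spectrum against a grid of synthetic spectra and marginalizes the posterior over a two-parameter (Teff, log g) grid; all names, grid values, and the flat prior are assumptions.

import numpy as np

def log_posterior(obs_flux, obs_sigma, model_flux):
    """Gaussian log-likelihood over all spectral pixels; flat prior assumed."""
    return -0.5 * np.sum(((obs_flux - model_flux) / obs_sigma) ** 2)

teff_grid = np.arange(15000, 30001, 1000)   # K (illustrative values)
logg_grid = np.arange(3.0, 4.51, 0.25)      # cgs dex (illustrative values)

def fit_star(obs_flux, obs_sigma, model_grid):
    # `model_grid` maps hypothetical (Teff, logg) grid points to synthetic
    # spectra sampled on the same wavelength axis as the observation.
    logp = np.array([[log_posterior(obs_flux, obs_sigma, model_grid[(t, g)])
                      for g in logg_grid] for t in teff_grid])
    post = np.exp(logp - logp.max())        # shift for numerical stability
    post /= post.sum()
    # Marginal posterior means give the parameter estimates; the marginal
    # spreads give the (reduced) uncertainties.
    t_mean = np.sum(post.sum(axis=1) * teff_grid)
    g_mean = np.sum(post.sum(axis=0) * logg_grid)
    return t_mean, g_mean, post

Using all accessible lines simply means the pixel sum in the likelihood runs over the full observed spectrum rather than a single line profile, which is what tightens the posterior.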
COLD-SAT feasibility study safety analysis
NASA Technical Reports Server (NTRS)
Mchenry, Steven T.; Yost, James M.
1991-01-01
The Cryogenic On-orbit Liquid Depot-Storage, Acquisition, and Transfer (COLD-SAT) satellite presents some unique safety issues. The feasibility study conducted at NASA-Lewis called for a system safety program involved from the initial design in order to eliminate and/or control the inherent hazards. Because of this, a hazards analysis method was needed that: (1) identified issues that needed to be addressed for a feasibility assessment; and (2) identified all potential hazards that would need to be controlled and/or eliminated during the detailed design phases. The developed analysis method is presented, as well as the results generated for the COLD-SAT system.
NASA Technical Reports Server (NTRS)
Junkin, B. G. (Principal Investigator)
1979-01-01
A method is presented for the processing and analysis of digital topography data that can subsequently be entered in an interactive data base in the form of slope, slope length, elevation, and aspect angle. A discussion of the data source and specific descriptions of the data processing software programs are included. In addition, the mathematical considerations involved in the registration of raw digitized coordinate points to the UTM coordinate system are presented. Scale factor considerations are also included. Results of the processing and analysis are illustrated using the Shiprock and Gallup Quadrangle test data.
Texture analysis of pulmonary parenchyma in normal and emphysematous lung
NASA Astrophysics Data System (ADS)
Uppaluri, Renuka; Mitsa, Theophano; Hoffman, Eric A.; McLennan, Geoffrey; Sonka, Milan
1996-04-01
Tissue characterization using texture analysis is gaining increasing importance in medical imaging. We present a completely automated method for discriminating between normal and emphysematous regions in CT images. The method involves extracting seventeen features based on statistical, hybrid, and fractal texture models. The best subset of features is derived from the training set using the divergence technique. A minimum distance classifier is used to assign each sample to one of the two classes, normal or emphysema. Sensitivity, specificity, and accuracy values of 80% or greater were achieved in most cases, suggesting that texture analysis holds great promise for identifying emphysema.
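A minimal sketch of the classification stage described above, assuming the seventeen statistical, hybrid, and fractal features have already been extracted per region; all names are hypothetical, not the authors' code.

import numpy as np

def train_centroids(X_train, y_train):
    """Mean feature vector (centroid) per class, with features z-scored first."""
    mu, sd = X_train.mean(axis=0), X_train.std(axis=0)
    Xz = (X_train - mu) / sd
    centroids = {c: Xz[y_train == c].mean(axis=0) for c in np.unique(y_train)}
    return centroids, mu, sd

def classify(x, centroids, mu, sd):
    """Minimum distance rule: assign to the nearest class centroid."""
    xz = (x - mu) / sd
    return min(centroids, key=lambda c: np.linalg.norm(xz - centroids[c]))

The divergence-based subset selection would run before training, keeping only the columns of X_train that best separate the two classes.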
NASA Astrophysics Data System (ADS)
Kistenev, Yury V.; Karapuzikov, Alexander I.; Kostyukova, Nadezhda Yu.; Starikova, Marina K.; Boyko, Andrey A.; Bukreeva, Ekaterina B.; Bulanova, Anna A.; Kolker, Dmitry B.; Kuzmin, Dmitry A.; Zenov, Konstantin G.; Karapuzikov, Alexey A.
2015-06-01
A human exhaled air analysis by means of infrared (IR) laser photoacoustic spectroscopy is presented. Eleven healthy nonsmoking volunteers (control group) and seven patients with chronic obstructive pulmonary disease (COPD, target group) were involved in the study. The principal component analysis method was used to select the most informative ranges of the absorption spectra of patients' exhaled air in terms of the separation of the studied groups. It is shown that the data of the profiles of exhaled air absorption spectrum in the informative ranges allow identifying COPD patients in comparison to the control group.
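A hedged sketch of the range-selection step, assuming a samples-by-wavelengths matrix of absorption spectra; scikit-learn's PCA stands in for whatever implementation the authors used, and the choice of the first component and top-20 loadings is illustrative.

import numpy as np
from sklearn.decomposition import PCA

def informative_ranges(spectra, n_components=3):
    # `spectra` is samples x wavelengths; rows are subjects (COPD or control).
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(spectra)     # per-subject coordinates
    # Loadings with large magnitude flag the wavelengths driving the
    # between-group separation visible in the score plot.
    loadings = pca.components_
    top = np.argsort(np.abs(loadings[0]))[::-1][:20]
    return scores, np.sort(top)             # indices of informative channels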
Anastassiades, M; Schwack, W
1998-10-30
Simple methods for the analysis of carbendazim, benomyl and thiophanate methyl in fruits and vegetables and of 2,4-D in citrus fruits are presented. Sample preparation involves supercritical fluid extraction with carbon dioxide, and further analysis is performed without any additional clean-up by GC-MS after derivatisation or directly by HPLC-diode array detection. The SFE methods presented are clearly faster and more cost-effective than traditional solvent-based approaches. The recoveries, detection limits and repeatabilities achieved meet the needs of tolerance-level monitoring of these compounds in fruits and vegetables.
A Multi-Methods Approach to HRA and Human Performance Modeling: A Field Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques Hugo; David I Gertman
2012-06-01
The Advanced Test Reactor (ATR), a research reactor at the Idaho National Laboratory, is primarily designed and used to test materials to be used in other, larger-scale and prototype reactors. The reactor offers various specialized systems and allows certain experiments to be run at their own temperature and pressure. The ATR Canal temporarily stores completed experiments and used fuel. It also has facilities to conduct underwater operations such as experiment examination or removal. In reviewing the ATR safety basis, a number of concerns were identified involving the ATR canal. A brief study identified ergonomic issues involving the manual handling of fuel elements in the canal that may increase the probability of human error and possible unwanted acute physical outcomes to the operator. In response to this concern, a study was conducted that refined the previous HRA scoping analysis by determining the probability of the inadvertent exposure of a fuel element to the air during fuel movement and inspection. The HRA analysis employed the SPAR-H method and was supplemented by information gained from a detailed analysis of the fuel inspection and transfer tasks. This latter analysis included ergonomics, work cycles, task duration, and workload imposed by tool and workplace characteristics, personal protective clothing, and operational practices that have the potential to increase physical and mental workload. Part of this analysis consisted of NASA-TLX analyses, combined with operational sequence analysis, computational human performance analysis (CHPA), and 3D graphical modeling to determine task failures and precursors to such failures that have safety implications. Experience in applying multiple analysis techniques in support of HRA methods is discussed.
A Bayesian approach to meta-analysis of plant pathology studies.
Mila, A L; Ngugi, H K
2011-01-01
Bayesian statistical methods are used for meta-analysis in many disciplines, including medicine, molecular biology, and engineering, but have not yet been applied for quantitative synthesis of plant pathology studies. In this paper, we illustrate the key concepts of Bayesian statistics and outline the differences between Bayesian and classical (frequentist) methods in the way parameters describing population attributes are considered. We then describe a Bayesian approach to meta-analysis and present a plant pathological example based on studies evaluating the efficacy of plant protection products that induce systemic acquired resistance for the management of fire blight of apple. In a simple random-effects model assuming a normal distribution of effect sizes and no prior information (i.e., a noninformative prior), the results of the Bayesian meta-analysis are similar to those obtained with classical methods. Implementing the same model with a Student's t distribution and a noninformative prior for the effect sizes, instead of a normal distribution, yields similar results for all but acibenzolar-S-methyl (Actigard) which was evaluated only in seven studies in this example. Whereas both the classical (P = 0.28) and the Bayesian analysis with a noninformative prior (95% credibility interval [CRI] for the log response ratio: -0.63 to 0.08) indicate a nonsignificant effect for Actigard, specifying a t distribution resulted in a significant, albeit variable, effect for this product (CRI: -0.73 to -0.10). These results confirm the sensitivity of the analytical outcome (i.e., the posterior distribution) to the choice of prior in Bayesian meta-analyses involving a limited number of studies. We review some pertinent literature on more advanced topics, including modeling of among-study heterogeneity, publication bias, analyses involving a limited number of studies, and methods for dealing with missing data, and show how these issues can be approached in a Bayesian framework. Bayesian meta-analysis can readily include information not easily incorporated in classical methods, and allow for a full evaluation of competing models. Given the power and flexibility of Bayesian methods, we expect them to become widely adopted for meta-analysis of plant pathology studies.
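To make the random-effects setup concrete, here is a minimal sketch (not the authors' code) of a Bayesian normal random-effects meta-analysis on log response ratios, using a grid approximation in place of MCMC; flat priors on the overall effect mu and the between-study SD tau are assumed.

import numpy as np

def posterior_mu(y, v, mu_grid, tau_grid):
    """y: study log response ratios; v: their within-study variances."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    logp = np.zeros((len(mu_grid), len(tau_grid)))
    for i, mu in enumerate(mu_grid):
        for j, tau in enumerate(tau_grid):
            s2 = v + tau ** 2               # marginal variance per study
            logp[i, j] = -0.5 * np.sum((y - mu) ** 2 / s2 + np.log(s2))
    p = np.exp(logp - logp.max())
    p /= p.sum()
    marg = p.sum(axis=1)                    # marginalize over tau
    cdf = np.cumsum(marg)
    lo = mu_grid[np.searchsorted(cdf, 0.025)]
    hi = mu_grid[np.searchsorted(cdf, 0.975)]
    return marg, (lo, hi)                   # 95% credibility interval for mu

If the interval for mu excludes zero, the effect is credibly nonzero; swapping the normal likelihood for a Student's t, as the abstract describes for Actigard, changes only the log-likelihood line.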
Probing Cosmic Infrared Sources: A Computer Modeling Approach
1992-06-01
A computer modeling approach was developed to study various physical phenomena involving dust grains, e.g., molecule formation on grains and grain formation in expanding circumstellar envelopes. [Recoverable report contents: evaluation of methods of analysis in infrared astronomy; theoretical studies involving dust grains; theory of molecule formation on dust grains; modeling grain formation in stellar outflows; infrared emission from fractal grains; photochemistry in circumstellar envelopes.]
Investigation of High-Angle-of-Attack Maneuver-Limiting Factors. Part 1. Analysis and Simulation
1980-12-01
useful, are not so satisfying or instructive as the more positive identification of causal factors offered by the methods developed in Reference 5 ... same methods be applied to additional high-performance fighter aircraft having widely differing high-AOA handling characteristics to see if further ... predictions and the nonlinear model results were resolved. The second task involved development of methods, criteria, and an associated pilot rating scale, for
Biotinyl endothelin-1 binding to endothelin receptor and its applications.
Saravanan, K; Paramasivam, M; Dey, S; Singh, T P; Srinivasan, A
2004-09-01
The endothelin (ET) system consists of two membrane receptor types, A and B, and three 21-mer isopeptides, endothelin-1, endothelin-2, and endothelin-3, as ligands. This system is involved in many physiological processes such as vasomodulation, neurotransmission, embryonic development, renal function, and regulation of cell proliferation. In many pathophysiological conditions involving the endothelin system, endothelin antagonism could be a possible clinical treatment. Design of an antagonist involves characterization of the binding of test compounds to the endothelin receptors, which is conventionally carried out using radioactive ligands; a simpler and quicker method would be of great advantage. This study reports a non-radioactive method for establishing the IC50 concentrations of the ligand. The method uses biotinylated endothelin-1 and streptavidin conjugated with horseradish peroxidase. Hydroxyapatite gel is used for separating the bound and unbound biotin-tagged endothelin-1. The method is applicable to detergent-solubilized receptors and purified recombinant receptors. The endothelin receptor type A expressed in the Pichia pastoris system has been used in this study. We show that this method is applicable in Western blot analysis of endothelin-1 and its receptor complex, and it can be used to localize the receptor molecules as well.
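For illustration, IC50 values from a competition binding experiment of this kind are commonly obtained by fitting a four-parameter logistic curve to normalized signal versus log concentration; the sketch below uses SciPy, and the data points are invented, not taken from the paper.

import numpy as np
from scipy.optimize import curve_fit

def logistic4(logc, top, bottom, logic50, slope):
    """Four-parameter logistic: signal falls from `top` to `bottom`."""
    return bottom + (top - bottom) / (1 + 10 ** ((logc - logic50) * slope))

logc = np.log10([1e-10, 1e-9, 1e-8, 1e-7, 1e-6])    # competitor conc. (M)
signal = np.array([1.00, 0.95, 0.60, 0.20, 0.05])   # normalized HRP signal

popt, _ = curve_fit(logistic4, logc, signal, p0=[1.0, 0.0, -8.0, 1.0])
ic50 = 10 ** popt[2]                                 # molar IC50 estimate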
NASA Astrophysics Data System (ADS)
Wang, Qianren; Chen, Xing; Yin, Yuehong; Lu, Jian
2017-08-01
With the increasing complexity of mechatronic products, traditional empirical or step-by-step design methods face great challenges, as various factors and different stages have inevitably become coupled during the design process. Management of massive information or big data, as well as the efficient operation of information flow, is deeply involved in the process of coupled design, and designers face increasingly sophisticated situations when coupled optimisation is also engaged. Aiming at overcoming these difficulties in the design of the spindle box system of an ultra-precision optical grinding machine, this paper proposes a coupled optimisation design method based on state-space analysis, with the design knowledge represented by ontologies and their semantic networks. An electromechanical coupled model integrating the mechanical structure, control system and driving system of the motor is established, mainly concerning the stiffness matrices of the hydrostatic bearings, ball screw nut and rolling guide sliders. The effectiveness and precision of the method are validated by simulation results for the natural frequency and deformation of the spindle box when an impact force is applied to the grinding wheel.
DOT National Transportation Integrated Search
2014-07-01
Pavement Condition surveys are carried out periodically to gather information on pavement distresses that will guide decision-making for maintenance and preservation. Traditional methods involve manual pavement inspections which are time-consuming: ...
ERIC Educational Resources Information Center
Sandefur, James T.
1991-01-01
Discussed is the process of translating situations involving changing quantities into mathematical relationships. This process, called dynamical modeling, allows students to learn new mathematics while sharpening their algebraic skills. A description of dynamical systems, problem-solving methods, a graphical analysis, and available classroom…
Using Spider-Web Patterns To Determine Toxicity
NASA Technical Reports Server (NTRS)
Noever, David A.; Cronise, Raymond J.; Relwani, Rachna A.
1995-01-01
Method of determining toxicities of chemicals involves recording and analysis of spider-web patterns. Based on observation that spiders exposed to various chemicals spin webs that differ, in various ways, from normal webs. Potential alternative to toxicity testing on higher animals.
ERIC Educational Resources Information Center
Blake, Anthony; Francis, David
1973-01-01
Approaches to developing management ability include systematic techniques, mental enlargement, self-analysis, and job-related counseling. A method is proposed to integrate them into a responsive program involving depth understanding, vision of the future, specialization, commitment to change, and self-monitoring control. (MS)
The promises of big data and small data for travel behavior (aka human mobility) analysis.
Chen, Cynthia; Ma, Jingtao; Susilo, Yusak; Liu, Yu; Wang, Menglin
2016-07-01
The last decade has witnessed very active development in two broad but separate fields, both involving understanding and modeling of how individuals move in time and space (hereafter called "travel behavior analysis" or "human mobility analysis"). One field comprises transportation researchers who have been working in the field for decades; the other involves newcomers from a wide range of disciplines, primarily computer scientists and physicists. Researchers in these two fields work with different datasets, apply different methodologies, and answer different but overlapping questions. It is our view that there is much hidden synergy between the two fields that needs to be brought out. It is thus the purpose of this paper to introduce the datasets, concepts, knowledge and methods used in these two fields, and most importantly to raise cross-discipline ideas for conversations and collaborations between the two. It is our hope that this paper will stimulate many future cross-cutting studies that involve researchers from both fields.
Ostrow, Laysha; Penney, Darby; Stuart, Elizabeth; Leaf, Phillip J
2017-01-01
The 2012 National Survey of Peer-Run Organizations is one of the first to survey a nationally representative sample of mental health peer-run organizations, nonprofit venues for support and advocacy which are defined by people with psychiatric histories being in positions of authority and control. This paper describes the data collection methods and demonstrates how participatory strategies to involve people with psychiatric histories intersected with Internet research to achieve the study aims. People with psychiatric histories were involved in designing and implementing a web-based survey to collect data on peer-run organizations' operations and views on national policy. Participatory approaches were used throughout design, data collection, analysis, and dissemination. The extensive involvement of people with psychiatric histories in project design and implementation was an important strategy that contributed to this study's success.
NASA Astrophysics Data System (ADS)
Britvin, Sergey N.; Rumyantsev, Andrey M.; Zobnina, Anastasia E.; Padkina, Marina V.
2017-02-01
Molecular structure of 1,4-diazabicyclo[3.2.1]octane, a parent ring of TAN1251 family of alkaloids, is herein characterized for the first time in comparison with the structure of nortropane (8-azabicyclo[3.2.1]octane), the parent framework of tropane ring system. The methods of study involve X-ray structural analysis, DFT geometry optimizations with infrared frequency calculations followed by natural bond orbital (NBO) analysis, and vibrational analysis of infrared spectrum.
ERIC Educational Resources Information Center
Perrotta, Carlo; Williamson, Ben
2018-01-01
This paper argues that methods used for the classification and measurement of online education are not neutral and objective, but involved in the creation of the educational realities they claim to measure. In particular, the paper draws on material semiotics to examine cluster analysis as a 'performative device' that, to a significant extent,…
Landscape maps as an aid to management of scenic mountain areas
Roland Baumgartner
1979-01-01
Before any question about wise management decisions concerning the visual resource of our environment can be answered, it is necessary to conduct a detailed analysis to determine the integral visual inventory of landscape, as it impresses any involved person. With this method of landscape analysis researchers and planners can specify the potential of any region with an...
Exponential approximations in optimal design
NASA Technical Reports Server (NTRS)
Belegundu, A. D.; Rajan, S. D.; Rajgopal, J.
1990-01-01
One-point and two-point exponential functions have been developed and proved to be very effective approximations of structural response. The exponential has been compared to the linear, reciprocal, and quadratic fit methods. Four test problems in structural analysis have been selected. The use of such approximations is attractive in structural optimization to reduce the number of exact analyses, which involve computationally expensive finite element analysis.
ERIC Educational Resources Information Center
Firdausiah Mansur, Andi Besse; Yusof, Norazah
2013-01-01
Clustering on social learning networks is still not widely explored, especially when the network focuses on an e-learning system. Conventional methods are not really suitable for e-learning data. SNA requires content analysis, which involves human intervention and needs to be carried out manually. Some of the previous clustering techniques need…
ERIC Educational Resources Information Center
Yalçinkaya, Begüm
2015-01-01
The aim of this study is to determine which values are included in education songs in elementary school textbooks and the level of these values. This study, conducted using the document analysis method, involved primary education music class textbooks. Education songs in the textbooks were analyzed within the frame of 29 values determined based on…
Index in Alexandre Dumas' Novel the Man in the Iron Mask: A Semiotic Analysis
ERIC Educational Resources Information Center
Syarifuddin, Salmia; Yahya, Andi Rukayah Alim; Jusoff, Kamaruzaman; Makhsud, Abdul
2013-01-01
Novel as a literary work can be analyzed by using semiotic analysis. This article aims to analyze the meaning of index found in characterizations in the novel "The Man in the Iron Mask" by Alexandre Dumas. This article involved the descriptive qualitative method. The results revealed that there are many causal relations between the index…
USDA-ARS?s Scientific Manuscript database
Cytochrome P450s (CYPs) encode one of the most diverse enzyme superfamilies in nature. They catalyze oxidative reactions of endogenous molecules and exogenous chemicals. Methods: We identified CYP genes through in silico analysis using EST, RNA-Seq and genome databases of channel catfish. Phylogenetic ...
Jeffrey D. Kline; Alissa Moses; Theresa Burcsu
2010-01-01
Forest policymakers, public lands managers, and scientists in the Pacific Northwest (USA) seek ways to evaluate the landscape-level effects of policies and management through the multidisciplinary development and application of spatially explicit methods and models. The Interagency Mapping and Analysis Project (IMAP) is an ongoing effort to generate landscape-wide...
Bullying and Suicidal Ideation and Behaviors: A Meta-Analysis
Holt, Melissa K.; Vivolo-Kantor, Alana M.; Polanin, Joshua R.; Holland, Kristin M.; DeGue, Sarah; Matjasko, Jennifer L.; Wolfe, Misty; Reid, Gerald
2015-01-01
BACKGROUND AND OBJECTIVES Over the last decade there has been increased attention to the association between bullying involvement (as a victim, perpetrator, or bully-victim) and suicidal ideation/behaviors. We conducted a meta-analysis to estimate the association between bullying involvement and suicidal ideation and behaviors. METHODS We searched multiple online databases and reviewed reference sections of articles derived from searches to identify cross-sectional studies published through July 2013. Using search terms associated with bullying, suicide, and youth, 47 studies (38.3% from the United States, 61.7% in non-US samples) met inclusion criteria. Seven observers independently coded studies and met in pairs to reach consensus. RESULTS Six different meta-analyses were conducted by using 3 predictors (bullying victimization, bullying perpetration, and bully/victim status) and 2 outcomes (suicidal ideation and suicidal behaviors). A total of 280 effect sizes were extracted and multilevel, random effects meta-analyses were performed. Results indicated that each of the predictors were associated with risk for suicidal ideation and behavior (range, 2.12 [95% confidence interval (CI), 1.67–2.69] to 4.02 [95% CI, 2.39–6.76]). Significant heterogeneity remained across each analysis. The bullying perpetration and suicidal behavior effect sizes were moderated by the study’s country of origin; the bully/victim status and suicidal ideation results were moderated by bullying assessment method. CONCLUSIONS Findings demonstrated that involvement in bullying in any capacity is associated with suicidal ideation and behavior. Future research should address mental health implications of bullying involvement to prevent suicidal ideation/behavior. PMID:25560447
Proteome Profile of Starch Granules Purified from Rice (Oryza sativa) Endosperm
Xing, Shihai; Meng, Xiaoxi; Zhou, Lihui; Mujahid, Hana; Zhao, Chunfang; Zhang, Yadong; Wang, Cailin; Peng, Zhaohua
2016-01-01
Starch is the most important food energy source in cereals. Many of the known enzymes involved in starch biosynthesis are partially or entirely granule-associated in the endosperm. Studying the proteome of rice starch granules is critical for us to further understand the mechanisms underlying starch biosynthesis and packaging of starch granules in rice amyloplasts, consequently for the improvement of rice grain quality. In this article, we developed a protocol to purify starch granules from mature rice endosperm and verified the quality of purified starch granules by microscopy observations, I2 staining, and Western blot analyses. In addition, we found the phenol extraction method was superior to Tris-HCl buffer extraction method with respect to the efficiency in recovery of starch granule associated proteins. LC-MS/MS analysis showed identification of already known starch granule associated proteins with high confidence. Several proteins reported to be involved in starch synthesis in prior genetic studies in plants were also shown to be enriched with starch granules, either directly or indirectly, in our studies. In addition, our results suggested that a few additional candidate proteins may also be involved in starch synthesis. Furthermore, our results indicated that some starch synthesis pathway proteins are subject to protein acetylation modification. GO analysis and KEGG pathway enrichment analysis showed that the identified proteins were mainly located in plastids and involved in carbohydrate metabolism. This study substantially advances the understanding of the starch granule associated proteome in rice and post translational regulation of some starch granule associated proteins. PMID:27992503
A wavelet-based technique to predict treatment outcome for Major Depressive Disorder.
Mumtaz, Wajid; Xia, Likun; Mohd Yasin, Mohd Azhar; Azhar Ali, Syed Saad; Malik, Aamir Saeed
2017-01-01
Treatment management for Major Depressive Disorder (MDD) has been challenging. However, electroencephalogram (EEG)-based predictions of antidepressant treatment outcome may help during antidepressant selection and ultimately improve the quality of life for MDD patients. In this study, a machine learning (ML) method involving pretreatment EEG data was proposed to perform such predictions for Selective Serotonin Reuptake Inhibitors (SSRIs). For this purpose, the acquisition of experimental data involved 34 MDD patients and 30 healthy controls. Consequently, a feature matrix was constructed involving time-frequency decomposition of EEG data based on wavelet transform (WT) analysis, termed the EEG data matrix. However, the resultant EEG data matrix had high dimensionality. Therefore, dimension reduction was performed based on a rank-based feature selection method according to a criterion, i.e., the receiver operating characteristic (ROC). As a result, the most significant features were identified and further utilized during the training and testing of a classification model, i.e., the logistic regression (LR) classifier. Finally, the LR model was validated with 100 iterations of 10-fold cross-validation (10-CV). The classification results were compared with short-time Fourier transform (STFT) analysis and empirical mode decomposition (EMD). The wavelet features extracted from frontal and temporal EEG data were found statistically significant. In comparison with other time-frequency approaches such as STFT and EMD, the WT analysis showed the highest classification accuracy, i.e., accuracy = 87.5%, sensitivity = 95%, and specificity = 80%. In conclusion, significant wavelet coefficients extracted from frontal and temporal pre-treatment EEG data involving the delta and theta frequency bands may predict antidepressant treatment outcome for MDD patients.
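A hedged sketch of the pipeline, assuming per-epoch EEG arrays: PyWavelets supplies the wavelet decomposition and scikit-learn the classifier and cross-validation. The db4 wavelet, sub-band energy features, and the univariate F-test ranking (standing in here for the paper's ROC-based ranking) are all assumptions.

import numpy as np
import pywt
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.feature_selection import SelectKBest, f_classif

def wavelet_features(epoch, wavelet="db4", level=5):
    """Energy per wavelet sub-band; at typical EEG sampling rates the
    deepest levels approximate the delta/theta bands."""
    coeffs = pywt.wavedec(epoch, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

def evaluate(epochs, labels, k=10):
    X = np.array([wavelet_features(e) for e in epochs])
    X = SelectKBest(f_classif, k=min(k, X.shape[1])).fit_transform(X, labels)
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X, labels, cv=10, scoring="accuracy")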
Perturbation solutions of combustion instability problems
NASA Technical Reports Server (NTRS)
Googerdy, A.; Peddieson, J., Jr.; Ventrice, M.
1979-01-01
A method involving approximate modal analysis using the Galerkin method followed by an approximate solution of the resulting modal-amplitude equations by the two-variable perturbation method (method of multiple scales) is applied to two problems of pressure-sensitive nonlinear combustion instability in liquid-fuel rocket motors. One problem exhibits self-coupled instability while the other exhibits mode-coupled instability. In both cases it is possible to carry out the entire linear stability analysis and significant portions of the nonlinear stability analysis in closed form. In the problem of self-coupled instability the nonlinear stability boundary and approximate forms of the limit-cycle amplitudes and growth and decay rates are determined in closed form while the exact limit-cycle amplitudes and growth and decay rates are found numerically. In the problem of mode-coupled instability the limit-cycle amplitudes are found in closed form while the growth and decay rates are found numerically. The behavior of the solutions found by the perturbation method are in agreement with solutions obtained using complex numerical methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lloyd, S. A. M.; Ansbacher, W.; Department of Physics and Astronomy, University of Victoria, Victoria, British Columbia V8W 3P6
2013-01-15
Purpose: Acuros external beam (Acuros XB) is a novel dose calculation algorithm implemented through the ECLIPSE treatment planning system. The algorithm finds a deterministic solution to the linear Boltzmann transport equation, the same equation commonly solved stochastically by Monte Carlo methods. This work is an evaluation of Acuros XB, by comparison with Monte Carlo, for dose calculation applications involving high-density materials. Existing non-Monte Carlo clinical dose calculation algorithms, such as the analytic anisotropic algorithm (AAA), do not accurately model dose perturbations due to increased electron scatter within high-density volumes. Methods: Acuros XB, AAA, and EGSnrc based Monte Carlo are used to calculate dose distributions from 18 MV and 6 MV photon beams delivered to a cubic water phantom containing a rectangular high density (4.0-8.0 g/cm³) volume at its center. The algorithms are also used to recalculate a clinical prostate treatment plan involving a unilateral hip prosthesis, originally evaluated using AAA. These results are compared graphically and numerically using gamma-index analysis. Radio-chromic film measurements are presented to augment Monte Carlo and Acuros XB dose perturbation data. Results: Using a 2% and 1 mm gamma-analysis, between 91.3% and 96.8% of Acuros XB dose voxels containing greater than 50% the normalized dose were in agreement with Monte Carlo data for virtual phantoms involving 18 MV and 6 MV photons, stainless steel and titanium alloy implants and for on-axis and oblique field delivery. A similar gamma-analysis of AAA against Monte Carlo data showed between 80.8% and 87.3% agreement. Comparing Acuros XB and AAA evaluations of a clinical prostate patient plan involving a unilateral hip prosthesis, Acuros XB showed good overall agreement with Monte Carlo while AAA underestimated dose on the upstream medial surface of the prosthesis due to electron scatter from the high-density material. Film measurements support the dose perturbations demonstrated by Monte Carlo and Acuros XB data. Conclusions: Acuros XB is shown to perform as well as Monte Carlo methods and better than existing clinical algorithms for dose calculations involving high-density volumes.
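For reference, the gamma-index combines a dose-difference criterion and a distance-to-agreement criterion into one pass/fail score per point. The one-dimensional sketch below is illustrative, not the clinical implementation, and uses the study's 2%/1 mm criteria.

import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.02, dta=1.0):
    """Gamma value at each reference point; a point passes if gamma <= 1.
    dd: dose-difference criterion (fraction of max); dta: distance (mm)."""
    d_tol = dd * d_ref.max()                 # global dose-difference tolerance
    gammas = np.empty_like(d_ref, dtype=float)
    for i, (x0, d0) in enumerate(zip(x_ref, d_ref)):
        g = np.sqrt(((x_eval - x0) / dta) ** 2 + ((d_eval - d0) / d_tol) ** 2)
        gammas[i] = g.min()                  # best agreement over eval points
    return gammas

# Pass rate restricted to voxels above 50% of the maximum dose, as in the study:
# mask = d_ref > 0.5 * d_ref.max()
# pass_rate = np.mean(gamma_1d(x, d_ref, x, d_eval)[mask] <= 1)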
A Comparative Study of Involvement and Motivation among Casino Gamblers
Lee, Choong-Ki; Lee, BongKoo; Bernhard, Bo Jason
2009-01-01
Objective The purpose of this paper is to investigate three different types of gamblers (which we label "non-problem", "some problem", and "probable pathological gamblers") to determine differences in involvement and motivation, as well as differences in demographic and behavioral variables. Methods The analysis takes advantage of a unique opportunity to sample on-site at a major casino in South Korea, and the resulting purposive sample yielded 180 completed questionnaires in each of the three groups, for a total number of 540. Factor analysis, analysis of variance (ANOVA) and Duncan tests, and Chi-square tests are employed to analyze the data collected from the survey. Results Findings from ANOVA tests indicate that involvement factors of importance/self-expression, pleasure/interest, and centrality derived from the factor analysis were significantly different among these three types of gamblers. The "probable pathological" and "some problem" gamblers were found to have similar degrees of involvement, and higher degrees of involvement than the non-problem gamblers. The tests also reveal that motivational factors of escape, socialization, winning, and exploring scenery were significantly different among these three types of gamblers. When looking at motivations to visit the casino, "probable pathological" gamblers were more likely to seek winning, the "some problem" group appeared to be more likely to seek escape, and the "non-problem" gamblers indicate that their motivations to visit centered around explorations of scenery and culture in the surrounding casino area. Conclusion The tools for exploring motivations and involvements of gambling provide valuable and discerning information about the entire spectrum of gamblers. PMID:20046388
Inversion of the strain-life and strain-stress relationships for use in metal fatigue analysis
NASA Technical Reports Server (NTRS)
Manson, S. S.
1979-01-01
The paper presents closed-form solutions (collocation method and spline-function method) for the constants of the cyclic fatigue life equation so that they can be easily incorporated into cumulative damage analysis. The collocation method involves conformity with the experimental curve at specific life values. The spline-function method is such that the basic life relation is expressed as a two-part function, one applicable at strains above the transition strain (strain at intersection of elastic and plastic lines), the other below. An illustrative example is treated by both methods. It is shown that while the collocation representation has the advantage of simplicity of form, the spline-function representation can be made more accurate over a wider life range, and is simpler to use.
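To make the collocation method concrete: the strain-life relation has the elastic-plus-plastic form eps_a = A(2N)^b + B(2N)^c, and collocation fixes the four constants by forcing exact conformity at four chosen life values. The sketch below uses SciPy; the data points and initial guess are illustrative, not the paper's worked example.

import numpy as np
from scipy.optimize import fsolve

N_pts = np.array([1e2, 1e3, 1e4, 1e6])                # collocation lives (cycles)
eps_pts = np.array([2.0e-2, 7.0e-3, 3.5e-3, 1.8e-3])  # strain amplitudes

def residuals(p):
    A, b, B, c = p
    return A * (2 * N_pts) ** b + B * (2 * N_pts) ** c - eps_pts

# Initial guess: shallow elastic slope (~ -0.1), steep plastic slope (~ -0.6);
# may need adjusting for other data sets.
A, b, B, c = fsolve(residuals, x0=[5e-3, -0.1, 1.0, -0.6])

# Transition life: where the elastic and plastic terms are equal,
# A*(2N)^b = B*(2N)^c  =>  N_t = 0.5 * (B/A)**(1/(b-c))
N_t = 0.5 * (B / A) ** (1.0 / (b - c))

The spline-function alternative would instead fit separate relations above and below N_t, trading the simplicity of a single closed form for accuracy over a wider life range.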
Wagner, Rebecca; Wetzel, Stephanie J; Kern, John; Kingston, H M Skip
2012-02-01
The employment of chemical weapons by rogue states and/or terrorist organizations is an ongoing concern in the United States. The quantitative analysis of nerve agents must be rapid and reliable for use in the private and public sectors. Current methods describe a tedious and time-consuming derivatization for gas chromatography-mass spectrometry and liquid chromatography in tandem with mass spectrometry. Two solid-phase extraction (SPE) techniques for the analysis of glyphosate and methylphosphonic acid are described with the utilization of isotopically enriched analytes for quantitation via atmospheric pressure chemical ionization-quadrupole time-of-flight mass spectrometry (APCI-Q-TOF-MS) that does not require derivatization. Solid-phase extraction-isotope dilution mass spectrometry (SPE-IDMS) involves pre-equilibration of a naturally occurring sample with an isotopically enriched standard. The second extraction method, i-Spike, involves loading an isotopically enriched standard onto the SPE column before the naturally occurring sample. The sample and the spike are then co-eluted from the column enabling precise and accurate quantitation via IDMS. The SPE methods in conjunction with IDMS eliminate concerns of incomplete elution, matrix and sorbent effects, and MS drift. For accurate quantitation with IDMS, the isotopic contribution of all atoms in the target molecule must be statistically taken into account. This paper describes two newly developed sample preparation techniques for the analysis of nerve agent surrogates in drinking water as well as statistical probability analysis for proper molecular IDMS. The methods described in this paper demonstrate accurate molecular IDMS using APCI-Q-TOF-MS with limits of quantitation as low as 0.400 mg/kg for glyphosate and 0.031 mg/kg for methylphosphonic acid. Copyright © 2012 John Wiley & Sons, Ltd.
More About the Phase-Synchronized Enhancement Method
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
2004-01-01
A report presents further details regarding the subject matter of "Phase-Synchronized Enhancement Method for Engine Diagnostics" (MFS-26435), NASA Tech Briefs, Vol. 22, No. 1 (January 1998), page 54. To recapitulate: The phase-synchronized enhancement method (PSEM) involves the digital resampling of a quasi-periodic signal in synchronism with the instantaneous phase of one of its spectral components. This resampling transforms the quasi-periodic signal into a periodic one more amenable to analysis. It is particularly useful for diagnosis of a rotating machine through analysis of vibration spectra that include components at the fundamental and harmonics of a slightly fluctuating rotation frequency. The report discusses the machinery-signal-analysis problem, outlines the PSEM algorithms, presents the mathematical basis of the PSEM, and presents examples of application of the PSEM in some computational simulations.
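A minimal sketch of the core PSEM step under stated assumptions: the instantaneous phase of the tracked spectral component is taken from the analytic signal of a band-passed reference channel, and the raw signal is then resampled at uniform phase increments, turning the quasi-periodic signal into a periodic one.

import numpy as np
from scipy.signal import hilbert

def psem_resample(signal, reference, samples_per_cycle=64):
    """Resample `signal` at equal phase steps of `reference`.
    Assumes the unwrapped reference phase increases monotonically."""
    phase = np.unwrap(np.angle(hilbert(reference)))   # instantaneous phase
    n_cycles = int((phase[-1] - phase[0]) // (2 * np.pi))
    # Uniform phase grid: samples_per_cycle points per 2*pi of the reference.
    phase_grid = (phase[0]
                  + np.arange(n_cycles * samples_per_cycle)
                  * (2 * np.pi / samples_per_cycle))
    t = np.arange(len(signal))
    t_of_phase = np.interp(phase_grid, phase, t)      # invert phase(t)
    return np.interp(t_of_phase, t, signal)           # phase-synchronous samples

After resampling, harmonics of the fluctuating rotation frequency fall on fixed bins of the spectrum, which is what makes the transformed signal easier to analyze.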
Nelson, Sarah E.; LaBrie, Richard A.; Shaffer, Howard J.
2011-01-01
Background: The purpose of this study was to examine the relationships between types of gambling and disordered gambling, with and without controlling for gambling involvement (i.e. the number of types of games with which respondents were involved during the past 12 months). Methods: We completed a secondary data analysis of the 2007 British Gambling Prevalence Survey (BGPS), which collected data in England, Scotland and Wales between September 2006 and March 2007. The sample included 9003 residents, aged 16 or older, recruited from 10 144 randomly selected addresses. 5832 households contributed at least one participant. Post-facto weighting to produce a nationally representative sample yielded 8968 observations. The BGPS included four primary types of measures: participation in gambling (during the past 12 months and during the past 7 days), disordered gambling assessments, attitudes toward gambling and descriptive information. Results: Statistically controlling for gambling involvement substantially reduced or eliminated all statistically significant relationships between types of gambling and disordered gambling. Conclusions: Gambling involvement is an important predictor of disordered gambling status. Our analysis indicates that greater gambling involvement better characterizes disordered gambling than does any specific type of gambling. PMID:19892851
Design and analysis of group-randomized trials in cancer: A review of current practices.
Murray, David M; Pals, Sherri L; George, Stephanie M; Kuzmichev, Andrey; Lai, Gabriel Y; Lee, Jocelyn A; Myles, Ranell L; Nelson, Shakira M
2018-06-01
The purpose of this paper is to summarize current practices for the design and analysis of group-randomized trials involving cancer-related risk factors or outcomes and to offer recommendations to improve future trials. We searched for group-randomized trials involving cancer-related risk factors or outcomes that were published or online in peer-reviewed journals in 2011-15. During 2016-17, in Bethesda MD, we reviewed 123 articles from 76 journals to characterize their design and their methods for sample size estimation and data analysis. Only 66 (53.7%) of the articles reported appropriate methods for sample size estimation. Only 63 (51.2%) reported exclusively appropriate methods for analysis. These findings suggest that many investigators do not adequately attend to the methodological challenges inherent in group-randomized trials. These practices can lead to underpowered studies, to an inflated type 1 error rate, and to inferences that mislead readers. Investigators should work with biostatisticians or other methodologists familiar with these issues. Funders and editors should ensure careful methodological review of applications and manuscripts. Reviewers should ensure that studies are properly planned and analyzed. These steps are needed to improve the rigor and reproducibility of group-randomized trials. The Office of Disease Prevention (ODP) at the National Institutes of Health (NIH) has taken several steps to address these issues. ODP offers an online course on the design and analysis of group-randomized trials. ODP is working to increase the number of methodologists who serve on grant review panels. ODP has developed standard language for the Application Guide and the Review Criteria to draw investigators' attention to these issues. Finally, ODP has created a new Research Methods Resources website to help investigators, reviewers, and NIH staff better understand these issues. Published by Elsevier Inc.
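One concrete reason ordinary sample-size formulas fail for group-randomized trials is the design effect, DEFF = 1 + (m - 1) * ICC, which inflates the required sample size for m members per group with intraclass correlation ICC. The numbers below are illustrative, not from the reviewed trials.

def design_effect(m, icc):
    """Variance inflation for group randomization relative to individual."""
    return 1 + (m - 1) * icc

n_individual = 400            # n needed if individually randomized
m, icc = 50, 0.02             # group size and intraclass correlation
n_grt = n_individual * design_effect(m, icc)
print(n_grt)                  # 400 * (1 + 49*0.02) = 400 * 1.98 = 792

Ignoring this inflation, or analyzing members as if independent, is exactly what produces the underpowered studies and inflated type 1 error rates the review describes.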
Xiang, Zheng; Sun, Hao; Cai, Xiaojun; Chen, Dahui
2016-04-01
Transmission of biological information is a biochemical process of multistep cascade from genes/proteins to metabolites. However, because most metabolites reflect the terminal information of the biochemical process, it is difficult to describe the transmission process of disease information in terms of the metabolomics strategy. In this paper, by incorporating network and metabolomics methods, an integrated approach was proposed to systematically investigate and explain the molecular mechanism of renal interstitial fibrosis. Through analysis of the network, the cascade transmission process of disease information starting from genes/proteins to metabolites was putatively identified and uncovered. The results indicated that renal fibrosis was involved in metabolic pathways of glycerophospholipid metabolism, biosynthesis of unsaturated fatty acids and arachidonic acid metabolism, riboflavin metabolism, tyrosine metabolism, and sphingolipid metabolism. These pathways involve kidney disease genes such as TGF-β1 and P2RX7. Our results showed that combining metabolomics and network analysis can provide new strategies and ideas for the interpretation of pathogenesis of disease with full consideration of "gene-protein-metabolite."
Marini, Federico; de Beer, Dalene; Walters, Nico A; de Villiers, André; Joubert, Elizabeth; Walczak, Beata
2017-03-17
An ultimate goal of investigations of rooibos plant material subjected to different stages of fermentation is to identify the chemical changes taking place in the phenolic composition, using an untargeted approach and chromatographic fingerprints. Realization of this goal requires, among other things, identification of the main components of the plant material involved in chemical reactions during the fermentation process. Quantitative chromatographic data for extracts of green, semi-fermented and fermented rooibos form the basis of this preliminary study following a targeted approach. The aim is to estimate whether treatment has a significant effect based on all quantified compounds and to identify the compounds which contribute significantly to it. Analysis of variance is performed using modern multivariate methods such as ANOVA-Simultaneous Component Analysis, ANOVA-Target Projection and regularized MANOVA. This study is the first in which all three approaches are compared and evaluated. For the data studied, all three methods reveal the same significance of the fermentation effect on the extract compositions, but they lead to different interpretations of it. Copyright © 2017 Elsevier B.V. All rights reserved.
Castor Oil: Properties, Uses, and Optimization of Processing Parameters in Commercial Production
Patel, Vinay R.; Dumancas, Gerard G.; Kasi Viswanath, Lakshmi C.; Maples, Randall; Subong, Bryan John J.
2016-01-01
Castor oil, produced from castor beans, has long been considered to be of important commercial value primarily for the manufacturing of soaps, lubricants, and coatings, among others. Global castor oil production is concentrated primarily in a small geographic region of Gujarat in Western India. This region is favorable due to its labor-intensive cultivation method and subtropical climate conditions. Entrepreneurs and castor processors in the United States and South America also cultivate castor beans but are faced with the challenge of achieving high castor oil production efficiency, as well as obtaining the desired oil quality. In this manuscript, we provide a detailed analysis of novel processing methods involved in castor oil production. We discuss novel processing methods by explaining specific processing parameters involved in castor oil production. PMID:27656091
Wannet, W J; Hermans, J H; van Der Drift, C; Op Den Camp, H J
2000-02-01
A convenient and sensitive method was developed to separate and detect various types of carbohydrates (polyols, mono- and disaccharides, and phosphorylated sugars) simultaneously using high-performance liquid chromatography (HPLC). The method consists of a chromatographic separation on a CarboPac PA1 anion-exchange analytical column followed by pulsed amperometric detection. In a single run (43 min) 13 carbohydrates were readily resolved. Calibration plots were linear over the ranges of 5-25 microM to 1.0-1.5 mM. The reliable and fast analysis technique, avoiding derivatization steps and long run times, was used to determine the levels of carbohydrates involved in mannitol and trehalose metabolism in the edible mushroom Agaricus bisporus. Moreover, the method was used to study the trehalose phosphorylase reaction.
Preprocessing film-copied MRI for studying morphological brain changes.
Pham, Tuan D; Eisenblätter, Uwe; Baune, Bernhard T; Berger, Klaus
2009-06-15
The magnetic resonance imaging (MRI) of the brain is one of the important data items for studying memory and morbidity in the elderly, as these images can provide useful information through quantitative measures of various regions of interest of the brain. As an effort to fully automate the biomedical analysis of the brain that can be combined with the genetic data of the same human population, and where the records of the original MRI data are missing, this paper presents two effective methods for addressing this imaging problem. The first method handles the restoration of the film-copied MRI. The second method involves the segmentation of the image data. Experimental results and comparisons with other methods suggest the usefulness of the proposed image analysis methodology.
Size and shape measurement in contemporary cephalometrics.
McIntyre, Grant T; Mossey, Peter A
2003-06-01
The traditional method of analysing cephalograms--conventional cephalometric analysis (CCA)--involves the calculation of linear distance measurements, angular measurements, area measurements, and ratios. Because shape information cannot be determined from these 'size-based' measurements, an increasing number of studies employ geometric morphometric tools in the cephalometric analysis of craniofacial morphology. Most of the discussions surrounding the appropriateness of CCA, Procrustes superimposition, Euclidean distance matrix analysis (EDMA), thin-plate spline analysis (TPS), finite element morphometry (FEM), elliptical Fourier functions (EFF), and medial axis analysis (MAA) have centred upon mathematical and statistical arguments. Surprisingly, little information is available to assist the orthodontist in the clinical relevance of each technique. This article evaluates the advantages and limitations of the above methods currently used to analyse the craniofacial morphology on cephalograms and investigates their clinical relevance and possible applications.
NASA Technical Reports Server (NTRS)
Kenny, Sean P.; Hou, Gene J. W.
1994-01-01
A method for eigenvalue and eigenvector approximate analysis for the case of repeated eigenvalues with distinct first derivatives is presented. The approximate analysis method developed involves a reparameterization of the multivariable structural eigenvalue problem in terms of a single positive-valued parameter. The resulting equations yield first-order approximations to changes in the eigenvalues and the eigenvectors associated with the repeated eigenvalue problem. This work also presents a numerical technique that facilitates the definition of an eigenvector derivative for the case of repeated eigenvalues with repeated eigenvalue derivatives (of all orders). Examples are given which demonstrate the application of such equations for sensitivity and approximate analysis. Emphasis is placed on the application of sensitivity analysis to large-scale structural and controls-structures optimization problems.
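For orientation, the standard first-order result for a repeated eigenvalue of the symmetric generalized problem K v = lambda M v reduces to an r x r subproblem over the repeated-mode subspace; the sketch below implements that textbook reduction, not the paper's single-parameter reparameterization.

import numpy as np

def eig_sensitivity(K, M, dK, dM, lam, V):
    """First-order derivatives of a repeated eigenvalue `lam`.
    V: n x r matrix of M-orthonormal eigenvectors sharing `lam`;
    dK, dM: derivatives of the system matrices w.r.t. a design variable."""
    A = V.T @ (dK - lam * dM) @ V      # r x r reduced sensitivity matrix
    dlams, W = np.linalg.eigh(A)       # eigenvalue derivatives (distinct case)
    return dlams, V @ W                # and the adapted eigenvector basis

The distinctness of the returned derivatives is exactly the assumption the abstract states; when the derivatives themselves repeat, a higher-order subproblem is needed to define the eigenvector derivatives.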
A rapid method for estimation of Pu-isotopes in urine samples using high volume centrifuge.
Kumar, Ranjeet; Rao, D D; Dubla, Rupali; Yadav, J R
2017-07-01
The conventional radio-analytical technique used for estimation of Pu-isotopes in urine samples involves anion exchange/TEVA column separation followed by alpha spectrometry. This sequence of analysis requires nearly 3-4 days for completion. Often, excreta analysis results are required urgently, particularly under repeat and incidental/emergency situations. Therefore, there is a need to reduce the analysis time for the estimation of Pu-isotopes in bioassay samples. This paper gives the details of standardization of a rapid method for estimation of Pu-isotopes in urine samples using a multi-purpose centrifuge and TEVA resin followed by alpha spectrometry. The rapid method involves oxidation of urine samples and co-precipitation of plutonium along with calcium phosphate, followed by sample preparation using a high volume centrifuge and separation of Pu using TEVA resin. The Pu fraction was electrodeposited and activity estimated by alpha spectrometry using 236Pu tracer recovery. Ten routine urine samples of radiation workers were analyzed, and consistent radiochemical tracer recovery was obtained in the range 47-88%, with a mean and standard deviation of 64.4% and 11.3%, respectively. With this newly standardized technique, the whole analytical procedure is completed within 9 h (one working day). Copyright © 2017 Elsevier Ltd. All rights reserved.
Analysis of low levels of rare earths by radiochemical neutron activation analysis
Wandless, G.A.; Morgan, J.W.
1985-01-01
A procedure for the radiochemical neutron-activation analysis for the rare earth elements (REE) involves the separation of the REE as a group by rapid ion-exchange methods and determination of yields by reactivation or by energy dispersive X-ray fluorescence (EDXRF) spectrometry. The U.S. Geological Survey (USGS) standard rocks, BCR-1 and AGV-1, were analyzed to determine the precision and accuracy of the method. We found that the precision was ±5-10% on the basis of replicate analysis and that, in general, the accuracy was within ±5% of accepted values for most REE. Data for USGS standard rocks BIR-1 (Icelandic basalt) and DNC-1 (North Carolina diabase) are also presented. © 1985 Akadémiai Kiadó.
2014-01-01
Background Meta-regression is becoming increasingly used to model study level covariate effects. However this type of statistical analysis presents many difficulties and challenges. Here two methods for calculating confidence intervals for the magnitude of the residual between-study variance in random effects meta-regression models are developed. A further suggestion for calculating credible intervals using informative prior distributions for the residual between-study variance is presented. Methods Two recently proposed and, under the assumptions of the random effects model, exact methods for constructing confidence intervals for the between-study variance in random effects meta-analyses are extended to the meta-regression setting. The use of Generalised Cochran heterogeneity statistics is extended to the meta-regression setting and a Newton-Raphson procedure is developed to implement the Q profile method for meta-analysis and meta-regression. WinBUGS is used to implement informative priors for the residual between-study variance in the context of Bayesian meta-regressions. Results Results are obtained for two contrasting examples, where the first example involves a binary covariate and the second involves a continuous covariate. Intervals for the residual between-study variance are wide for both examples. Conclusions Statistical methods, and R computer software, are available to compute exact confidence intervals for the residual between-study variance under the random effects model for meta-regression. These frequentist methods are almost as easily implemented as their established counterparts for meta-analysis. Bayesian meta-regressions are also easily performed by analysts who are comfortable using WinBUGS. Estimates of the residual between-study variance in random effects meta-regressions should be routinely reported and accompanied by some measure of their uncertainty. Confidence and/or credible intervals are well-suited to this purpose. PMID:25196829
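A minimal sketch of the Q-profile interval for the between-study variance in a plain random-effects meta-analysis (the meta-regression version additionally adjusts Q and the degrees of freedom for the covariate design matrix, omitted here); the root-finding uses SciPy's brentq rather than the Newton-Raphson procedure the authors develop.

import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def q_stat(tau2, y, v):
    """Generalised Cochran Q at a candidate between-study variance tau2."""
    w = 1.0 / (v + tau2)
    mu = np.sum(w * y) / np.sum(w)
    return np.sum(w * (y - mu) ** 2)

def q_profile_ci(y, v, alpha=0.05, upper=100.0):
    """Q is decreasing in tau2, so each CI bound solves Q(tau2) = quantile.
    Enlarge `upper` if brentq reports no sign change in the bracket."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    df = len(y) - 1
    f_lo = lambda t2: q_stat(t2, y, v) - chi2.ppf(1 - alpha / 2, df)
    f_hi = lambda t2: q_stat(t2, y, v) - chi2.ppf(alpha / 2, df)
    lo = brentq(f_lo, 0.0, upper) if f_lo(0.0) > 0 else 0.0
    hi = brentq(f_hi, 0.0, upper) if f_hi(0.0) > 0 else 0.0
    return lo, hi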
Daveson, Barbara A; de Wolf-Linder, Susanne; Witt, Jana; Newson, Kirstie; Morris, Carolyn; Higginson, Irene J; Evans, Catherine J
2015-12-01
Support and evidence for patient, unpaid caregiver and public involvement in research (user involvement) are growing. Consensus on how best to involve users in palliative care research is lacking. To determine an optimal user-involvement model for palliative care research. We hosted a consultation workshop using expert presentations, discussion and nominal group technique to generate recommendations and consensus on agreement of importance. A total of 35 users and 32 researchers were approached to attend the workshop, which included break-out groups and a ranking exercise. Descriptive statistical analysis to establish consensus and highlight divergence was applied. Qualitative analysis of discussions was completed to aid interpretation of findings. Participants involved in palliative care research were invited to a global research institute, UK. A total of 12 users and 5 researchers participated. Users wanted their involvement to be more visible, including during dissemination, with a greater emphasis on the difference their involvement makes. Researchers wanted to improve productivity, relevance and quality through involvement. Users and researchers agreed that an optimal model should consist of (a) early involvement to ensure meaningful involvement and impact and (b) diverse virtual and face-to-face involvement methods to ensure flexibility. For involvement in palliative care research to succeed, early and flexible involvement is required. Researchers should advertise opportunities for involvement and promote impact of involvement via dissemination plans. Users should prioritise adding value to research through enhancing productivity, quality and relevance. More research is needed not only to inform implementation and ensure effectiveness but also to investigate the cost-effectiveness of involvement in palliative care research. © The Author(s) 2015.
Daveson, Barbara A; de Wolf-Linder, Susanne; Witt, Jana; Newson, Kirstie; Morris, Carolyn; Higginson, Irene J; Evans, Catherine J
2015-01-01
Background: Support and evidence for patient, unpaid caregiver and public involvement in research (user involvement) are growing. Consensus on how best to involve users in palliative care research is lacking. Aim: To determine an optimal user-involvement model for palliative care research. Design: We hosted a consultation workshop using expert presentations, discussion and nominal group technique to generate recommendations and consensus on agreement of importance. A total of 35 users and 32 researchers were approached to attend the workshop, which included break-out groups and a ranking exercise. Descriptive statistical analysis was applied to establish consensus and highlight divergence. Qualitative analysis of discussions was completed to aid interpretation of findings. Setting/participants: Participants involved in palliative care research were invited to a global research institute, UK. Results: A total of 12 users and 5 researchers participated. Users wanted their involvement to be more visible, including during dissemination, with a greater emphasis on the difference their involvement makes. Researchers wanted to improve productivity, relevance and quality through involvement. Users and researchers agreed that an optimal model should consist of (a) early involvement to ensure meaningful involvement and impact and (b) diverse virtual and face-to-face involvement methods to ensure flexibility. Conclusion: For involvement in palliative care research to succeed, early and flexible involvement is required. Researchers should advertise opportunities for involvement and promote the impact of involvement via dissemination plans. Users should prioritise adding value to research through enhancing productivity, quality and relevance. More research is needed not only to inform implementation and ensure effectiveness but also to investigate the cost-effectiveness of involvement in palliative care research. PMID:25931336
Measuring Road Network Vulnerability with Sensitivity Analysis
Jun-qiang, Leng; Long-hai, Yang; Liu, Wei-yi; Zhao, Lin
2017-01-01
This paper focuses on the development of a method for road network vulnerability analysis, from the perspective of capacity degradation, which seeks to identify the critical infrastructures in the road network and the operational performance of the whole traffic system. This research involves defining a traffic utility index and modeling the vulnerability of road segments, routes, OD (Origin-Destination) pairs and the road network. Meanwhile, a sensitivity analysis method is utilized to calculate the change in the traffic utility index due to capacity degradation. This method, compared to traditional traffic assignment, can improve calculation efficiency and makes the application of vulnerability analysis to large real-world road networks possible. Finally, all of the above models and the calculation method are applied to the evaluation of an actual road network to verify their efficiency and utility. This approach can be used as a decision-supporting tool for evaluating the performance of road networks and identifying critical infrastructures in transportation planning and management, especially in resource allocation for mitigation and recovery. PMID:28125706
NASA Astrophysics Data System (ADS)
Zhukotsky, Alexander V.; Kogan, Emmanuil M.; Kopylov, Victor F.; Marchenko, Oleg V.; Lomakin, O. A.
1994-07-01
A new method for morphodensitometric analysis of blood cells was applied to medical screening for ecological-exposure and infection pathologies. A complex computational image-processing algorithm was created for research on supramolecular restructuring of interphase chromatin in lymphocytes. It includes specific staining methods and unifies different quantitative analysis methods. Our experience with the use of a television image analyzer in cytological and immunological studies made it possible to carry out research in morphometric analysis of chromatin structure in interphase lymphocyte nuclei in genetic and viral pathologies. To characterize lymphocytes as an image-forming system by a rigorous mathematical description, we used an approach involving concomitant evaluation of the topography of the chromatin network in intact and affected lymphocytes. It is also possible to digitize the data, which revealed significant distinctions between control and experiment. The method allows us to observe minute structural changes in chromatin, especially in eu- and heterochromatin, which were previously studied by geneticists only in chromosomes.
The fuel tax compliance unit : an evaluation and analysis of results.
DOT National Transportation Integrated Search
2004-01-01
Kentucky utilized TEA-21 federal funds to create an innovative pilot program to identify the best practices and methods for auditing taxpayers of transportation related taxes. This program involved a four-year experimental program called the Fuel Tax...
FINITE-ELEMENT ANALYSIS OF MULTIPHASE IMMISCIBLE FLOW THROUGH SOILS
A finite-element model is developed for multiphase flow through soil involving three immiscible fluids: namely, air, water, and a nonaqueous phase liquid (NAPL). A variational method is employed for the finite-element formulation corresponding to the coupled differential equation...
Monitoring changes in stream bottom sediments and benthic invertebrates.
DOT National Transportation Integrated Search
1981-01-01
The study was conducted to determine whether the analysis of stream bottom sediments could be used to assess sediment pollution generated by highway construction. Most of the work completed to date has involved testing and refining methods for the co...
USDA-ARS?s Scientific Manuscript database
Western blotting is a technique that involves the separation of proteins by gel electrophoresis, their blotting or transfer to a membrane, and selective immunodetection of an immobilized antigen. This is an important and routine method for protein analysis that depends on the specificity of antibod...
Injection Locking Techniques for Spectrum Analysis
NASA Astrophysics Data System (ADS)
Gathma, Timothy D.; Buckwalter, James F.
2011-04-01
Wideband spectrum analysis supports future communication systems that reconfigure and adapt to the capacity of the spectral environment. While test equipment manufacturers offer wideband spectrum analyzers with excellent sensitivity and resolution, these spectrum analyzers typically cannot offer acceptable size, weight, and power (SWAP). CMOS integrated circuits offer the potential to fully integrate spectrum analysis capability with analog front-end circuitry and digital signal processing on a single chip. Unfortunately, CMOS lacks the high-Q passives and wideband resonator tunability that are necessary for heterodyne implementations of spectrum analyzers. As an alternative to heterodyne receiver architectures, two nonlinear methods for performing wideband, low-power spectrum analysis are presented. The first method involves injecting the spectrum of interest into an array of injection-locked oscillators. The second method employs the closed loop dynamics of both injection locking and phase locking to independently estimate the injected frequency and power.
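As a rough illustration of the physics behind the first method (not code from the paper), the phase of an injection-locked oscillator obeys Adler's equation; a Python sketch with assumed parameter values shows locked versus unlocked behaviour:

```python
# Illustrative sketch: Adler's equation for an injection-locked oscillator.
# omega_L is the (assumed) one-sided locking range; the phase either
# settles to a constant offset (locked) or keeps rotating (unlocked).
import numpy as np
from scipy.integrate import solve_ivp

omega_L = 2 * np.pi * 1e6      # assumed locking range, rad/s
d_omega = 2 * np.pi * 0.6e6    # injection offset from free-running frequency

def adler(t, phi):
    return d_omega - omega_L * np.sin(phi)

sol = solve_ivp(adler, (0, 2e-5), [0.0], max_step=1e-8)
phi = sol.y[0]
locked = abs(d_omega) < omega_L          # analytic lock condition
print("locked:", locked, "final sin(phi):", np.sin(phi[-1]))
# When locked, sin(phi) settles to d_omega / omega_L, so the steady phase
# offset encodes the injected frequency: the basis of frequency estimation.
```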
A mixed-mode crack analysis of isotropic solids using conservation laws of elasticity
NASA Technical Reports Server (NTRS)
Yau, J. F.; Wang, S. S.; Corten, H. T.
1980-01-01
A simple and convenient method of analysis for studying two-dimensional mixed-mode crack problems is presented. The analysis is formulated on the basis of conservation laws of elasticity and of fundamental relationships in fracture mechanics. The problem is reduced to the determination of mixed-mode stress-intensity factor solutions in terms of conservation integrals involving known auxiliary solutions. One of the salient features of the present analysis is that the stress-intensity solutions can be determined directly by using information extracted in the far field. Several examples with solutions available in the literature are solved to examine the accuracy and other characteristics of the current approach. This method is demonstrated to be superior in its numerical simplicity and computational efficiency to other approaches. Solutions of more complicated and practical engineering fracture problems dealing with a crack emanating from a circular hole are also presented to illustrate the capability of this method.
Nookaew, Intawat; Papini, Marta; Pornputtapong, Natapol; Scalcinati, Gionata; Fagerberg, Linn; Uhlén, Matthias; Nielsen, Jens
2012-01-01
RNA-seq has recently become an attractive method of choice in the study of transcriptomes, promising several advantages compared with microarrays. In this study, we sought to assess the contribution of the different analytical steps involved in the analysis of RNA-seq data generated with the Illumina platform, and to perform a cross-platform comparison based on the results obtained through Affymetrix microarrays. As a case study for our work, we used the Saccharomyces cerevisiae strain CEN.PK 113-7D, grown under two different conditions (batch and chemostat). Here, we assess the influence of genetic variation on the estimation of gene expression levels using three different aligners for read-mapping (Gsnap, Stampy and TopHat) on the S288c genome, and the capabilities of five different statistical methods to detect differential gene expression (baySeq, Cuffdiff, DESeq, edgeR and NOISeq), and we explore the consistency between RNA-seq analysis using a reference genome and a de novo assembly approach. High reproducibility among biological replicates (correlation ≥0.99) and high consistency between the two platforms for analysis of gene expression levels (correlation ≥0.91) are reported. The results of differential gene expression identification derived from the different statistical methods, as well as their integrated analysis results based on gene ontology annotation, are in good agreement. Overall, our study provides a useful and comprehensive comparison between the two platforms (RNA-seq and microarrays) for gene expression analysis and addresses the contribution of the different steps involved in the analysis of RNA-seq data. PMID:22965124
Funari, Mariana F A; Jorge, Alexander A L; Pinto, Emilia M; Arnhold, Ivo J P; Mendonca, Berenice B; Nishi, Mirian Y
2008-11-01
LWD is associated with SHOX haploinsufficiency, in most cases due to gene deletion. Generally, FISH and microsatellite analysis are used to identify SHOX deletions. MLPA is a new method for detecting gene copy-number variation, allowing simultaneous analysis of several regions. Here we describe the presence of a SHOX intragenic deletion in a family with LWD, analyzed through different methodologies. Genomic DNA of 11 subjects from one family was studied by microsatellite analysis, direct sequencing and MLPA. FISH was performed in two affected individuals. Microsatellite analysis showed that all affected members shared the same haplotype, suggesting the involvement of SHOX. MLPA detected an intragenic deletion involving exons IV-VIa, which was not detected by FISH or microsatellite analysis. In conclusion, the MLPA technique proved to be the best method for detecting this small deletion; it has the advantage of being less laborious and also allows the analysis of several regions simultaneously.
NASA Technical Reports Server (NTRS)
1995-01-01
This report summarizes past corrosion issues experienced by the NASA space shuttle orbiter fleet. Design considerations for corrosion prevention and inspection methods are reviewed. Significant corrosion issues involving structures and subsystems are analyzed, including corrective actions taken. Notable successes and failures of corrosion mitigation systems and procedures are discussed. The projected operating environment used for design is contrasted with current conditions in flight and conditions during ground processing.
A novel model for DNA sequence similarity analysis based on graph theory.
Qi, Xingqin; Wu, Qin; Zhang, Yusen; Fuller, Eddie; Zhang, Cun-Quan
2011-01-01
Determination of sequence similarity is one of the major steps in computational phylogenetic studies. As we know, during evolutionary history, not only DNA mutations of individual nucleotides but also subsequent rearrangements occurred. It has been one of the major tasks of computational biologists to develop novel mathematical descriptors for similarity analysis such that information on various mutation phenomena is incorporated simultaneously. In this paper, instead of the traditional bases for constructing mathematical descriptors (e.g., nucleotide frequency, geometric representations), we construct novel mathematical descriptors based on graph theory. In particular, for each DNA sequence we set up a weighted directed graph. The adjacency matrix of the directed graph is used to induce a representative vector for the DNA sequence. This new approach measures similarity based on both the ordering and the frequency of nucleotides, so that much more information is involved. As an application, the method is tested on a set of 0.9-kb mtDNA sequences of twelve different primate species. All output phylogenetic trees with various distance estimations have the same topology, and are generally consistent with the reported results from early studies, which demonstrates the new method's efficiency; we also test the new method on a simulated data set, which shows that our new method performs better than the traditional global alignment method when subsequent rearrangements happen frequently during evolutionary history.
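The paper's exact weighting scheme is not reproduced here, but a toy Python construction in the same spirit (the position-dependent weighting below is our assumption, not the authors') shows how an adjacency matrix can encode both nucleotide ordering and frequency:

```python
# Toy graph-based DNA descriptor: each sequence induces a 4x4 weighted
# directed graph over {A,C,G,T}; the flattened adjacency matrix is the
# representative vector, and Euclidean distance serves as dissimilarity.
import numpy as np

BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

def descriptor(seq):
    adj = np.zeros((4, 4))
    n = len(seq)
    for i in range(n - 1):
        a, b = BASES[seq[i]], BASES[seq[i + 1]]
        # position-weighted transition: earlier transitions weigh more,
        # so both ordering and frequency of nucleotides are captured
        adj[a, b] += 1.0 - i / n
    return adj.flatten() / (n - 1)

def distance(s1, s2):
    return np.linalg.norm(descriptor(s1) - descriptor(s2))

print(distance("ACGTACGT", "ACGTTGCA"))
```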
Liu, Chao; Abu-Jamous, Basel; Brattico, Elvira; Nandi, Asoke K
2017-03-01
In the past decades, neuroimaging of humans has gained a position of status within neuroscience, and data-driven approaches and functional connectivity analyses of functional magnetic resonance imaging (fMRI) data are increasingly favored to depict the complex architecture of human brains. However, the reliability of these findings is jeopardized by too many analysis methods and sometimes too few samples used, which leads to discord among researchers. We propose a tunable consensus clustering paradigm that aims at overcoming the clustering methods selection problem as well as reliability issues in neuroimaging by means of first applying several analysis methods (three in this study) on multiple datasets and then integrating the clustering results. To validate the method, we applied it to a complex fMRI experiment involving affective processing of hundreds of music clips. We found that brain structures related to visual, reward, and auditory processing have intrinsic spatial patterns of coherent neuroactivity during affective processing. The comparisons between the results obtained from our method and those from each individual clustering algorithm demonstrate that our paradigm has notable advantages over traditional single clustering algorithms in being able to evidence robust connectivity patterns even with complex neuroimaging data involving a variety of stimuli and affective evaluations of them. The consensus clustering method is implemented in the R package "UNCLES" available on http://cran.r-project.org/web/packages/UNCLES/index.html .
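A generic consensus-clustering sketch in Python conveys the core idea of integrating multiple clustering results through a co-association matrix (the UNCLES paradigm itself is more elaborate; this sketch assumes a recent scikit-learn):

```python
# Generic consensus clustering: run several clusterings, build a
# co-association matrix, then cluster that matrix for consensus labels.
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))          # stand-in for voxel time courses
runs = [KMeans(n_clusters=3, n_init=10, random_state=s).fit_predict(X)
        for s in range(10)]

n = X.shape[0]
coassoc = np.zeros((n, n))
for labels in runs:
    coassoc += (labels[:, None] == labels[None, :])
coassoc /= len(runs)                   # fraction of runs agreeing per pair

# consensus partition from the co-association (1 - similarity = distance)
consensus = AgglomerativeClustering(
    n_clusters=3, metric="precomputed", linkage="average"
).fit_predict(1.0 - coassoc)
print(consensus)
```

In practice the individual runs would come from different algorithms and datasets, as described above, rather than from repeated k-means alone.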
Teaching concept analysis to graduate nursing students.
Schiller, Catharine J
2018-04-01
Aim: To provide guidance to educators who use the Wilson (1963) concept analysis method, as modified by Walker and Avant (2011), in their graduate nursing curriculum. Background: While graduate nursing curricula often include a concept analysis assignment, there is a paucity of literature to assist educators in guiding students through this challenging process. This article details one way for educators to assist graduate nursing students in learning how to undertake each step of the Wilson (1963) concept analysis method, as modified by Walker and Avant (2011). Methods: Using examples, this article walks the reader through the Walker and Avant (2011) concept analysis process and addresses those issues commonly encountered by educators during this process. Conclusions: This article presented one way of walking students through a Walker and Avant (2011) concept analysis. Having clear information about the steps involved in developing a concept analysis will make it easier for educators to incorporate it into their graduate nursing curriculum and to effectively guide students on their journey through this process. © 2018 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Solimun; Fernandes, Adji Achmad Rinaldo; Arisoesilaningsih, Endang
2017-12-01
Research in various fields generally investigates systems and involves latent variables. One method to analyze a model representing such a system is path analysis. Latent variables are measured using questionnaires based on an attitude-scale model, which yields data in the form of scores; before analysis, these scores should be transformed into scale data. Path coefficients, the parameter estimators, are calculated from scale data obtained using the method of successive intervals (MSI) or the summated rating scale (SRS). This research identifies which data transformation method is better. Path coefficients with smaller variances are said to be more efficient, so the transformation method whose scaled data yield path coefficients with smaller variances in path analysis is said to be better. The results of an analysis using real data show that, for the effect of the Attitude variable on Entrepreneurship Intention, the relative efficiency is ER = 1, indicating that analyses using MSI and SRS transformations are equally efficient. On the other hand, for simulated data with high inter-item correlations (0.7-0.9), the MSI method is about 1.3 times more efficient than the SRS method.
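A Python sketch of the MSI transformation itself may be useful (category proportions mapped through standard-normal boundary densities; toy scores, and our implementation rather than the paper's):

```python
# Method of successive intervals (MSI): ordinal item scores (e.g. 1-5
# Likert) are mapped to interval-scale values via the standard-normal
# density evaluated at the category boundaries.
import numpy as np
from scipy.stats import norm

def msi(scores, categories=(1, 2, 3, 4, 5)):
    scores = np.asarray(scores)
    freq = np.array([(scores == c).sum() for c in categories], float)
    p = freq / freq.sum()                          # category proportions
    cum = np.cumsum(p)
    z = norm.ppf(np.clip(cum, 1e-12, 1 - 1e-12))   # boundary z-scores
    dens = norm.pdf(z)
    dens_lo = np.concatenate(([0.0], dens[:-1]))   # density at lower bound
    # scale value: (phi(lower) - phi(upper)) / proportion in the interval
    sv = np.where(p > 0, (dens_lo - dens) / np.where(p > 0, p, 1), 0.0)
    sv = sv - sv.min() + 1.0                       # shift so scale starts at 1
    lookup = dict(zip(categories, sv))
    return np.array([lookup[s] for s in scores])

print(msi([1, 2, 2, 3, 4, 4, 4, 5, 5]))
```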
Application of the Probabilistic Dynamic Synthesis Method to the Analysis of a Realistic Structure
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Ferri, Aldo A.
1998-01-01
The Probabilistic Dynamic Synthesis method is a new technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. A previous work verified the feasibility of the PDS method on a simple seven degree-of-freedom spring-mass system. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.
Contact Stress Analysis of Spiral Bevel Gears Using Finite Element Analysis
NASA Technical Reports Server (NTRS)
Bibel, G. D.; Kumar, A; Reddy, S.; Handschuh, R.
1995-01-01
A procedure is presented for performing three-dimensional stress analysis of spiral bevel gears in mesh using the finite element method. The procedure involves generating a finite element model by solving equations that identify tooth surface coordinates. Coordinate transformations are used to orientate the gear and pinion for gear meshing. Contact boundary conditions are simulated with gap elements. A solution technique for correct orientation of the gap elements is given. Example models and results are presented.
Pathway and network analysis of cancer genomes.
Creixell, Pau; Reimand, Jüri; Haider, Syed; Wu, Guanming; Shibata, Tatsuhiro; Vazquez, Miguel; Mustonen, Ville; Gonzalez-Perez, Abel; Pearson, John; Sander, Chris; Raphael, Benjamin J; Marks, Debora S; Ouellette, B F Francis; Valencia, Alfonso; Bader, Gary D; Boutros, Paul C; Stuart, Joshua M; Linding, Rune; Lopez-Bigas, Nuria; Stein, Lincoln D
2015-07-01
Genomic information on tumors from 50 cancer types cataloged by the International Cancer Genome Consortium (ICGC) shows that only a few well-studied driver genes are frequently mutated, in contrast to many infrequently mutated genes that may also contribute to tumor biology. Hence there has been great interest in developing pathway and network analysis methods that group genes and illuminate the processes involved. We provide an overview of these analysis techniques and show where they guide mechanistic and translational investigations.
A Constrained-Clustering Approach to the Analysis of Remote Sensing Data.
1983-01-01
One old and two new clustering methods were applied to the constrained-clustering problem of separating different agricultural fields based on multispectral remote sensing satellite data. (Constrained clustering here involves clustering jointly on multispectral measurement similarity and geographical location.) The results of applying the three methods are provided along with a discussion of their relative strengths and weaknesses and a detailed description of their algorithms.
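The report's algorithms are not reproduced here, but one simple illustration of the double-clustering idea is to append scaled pixel coordinates to the spectral features before clustering; a hypothetical Python sketch:

```python
# Spatially constrained clustering sketch: cluster on spectral bands
# concatenated with scaled pixel coordinates, so clusters are forced
# to be both spectrally similar and geographically compact.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
h, w, bands = 50, 50, 4
img = rng.normal(size=(h, w, bands))           # stand-in multispectral image

ys, xs = np.mgrid[0:h, 0:w]
beta = 0.5                                     # weight of the spatial constraint
features = np.column_stack([
    img.reshape(-1, bands),
    beta * np.column_stack([ys.ravel(), xs.ravel()]) / max(h, w),
])
labels = KMeans(n_clusters=5, n_init=10).fit_predict(features)
print(labels[:10])
# beta = 0 reduces to purely spectral clustering; larger beta yields
# spatially contiguous, field-like regions.
```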
Dolan, James G.
2010-01-01
Current models of healthcare quality recommend that patient management decisions be evidence-based and patient-centered. Evidence-based decisions require a thorough understanding of current information regarding the natural history of disease and the anticipated outcomes of different management options. Patient-centered decisions incorporate patient preferences, values, and unique personal circumstances into the decision making process and actively involve both patients along with health care providers as much as possible. Fundamentally, therefore, evidence-based, patient-centered decisions are multi-dimensional and typically involve multiple decision makers. Advances in the decision sciences have led to the development of a number of multiple criteria decision making methods. These multi-criteria methods are designed to help people make better choices when faced with complex decisions involving several dimensions. They are especially helpful when there is a need to combine “hard data” with subjective preferences, to make trade-offs between desired outcomes, and to involve multiple decision makers. Evidence-based, patient-centered clinical decision making has all of these characteristics. This close match suggests that clinical decision support systems based on multi-criteria decision making techniques have the potential to enable patients and providers to carry out the tasks required to implement evidence-based, patient-centered care effectively and efficiently in clinical settings. The goal of this paper is to give readers a general introduction to the range of multi-criteria methods available and show how they could be used to support clinical decision-making. Methods discussed include the balance sheet, the even swap method, ordinal ranking methods, direct weighting methods, multi-attribute decision analysis, and the analytic hierarchy process (AHP). PMID:21394218
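As a concrete illustration of the last of these, a minimal Python sketch of the AHP weight derivation (the pairwise judgments below are hypothetical):

```python
# AHP sketch: derive criterion weights from a pairwise-comparison matrix
# via its principal eigenvector, and check consistency (CR < 0.1 is the
# usual acceptance threshold).
import numpy as np

# Saaty-scale judgments for three hypothetical decision criteria
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()                                # priority weights

n = A.shape[0]
ci = (vals[k].real - n) / (n - 1)           # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
print("weights:", w, "CR:", ci / ri)
```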
Dynamic test/analysis correlation using reduced analytical models
NASA Technical Reports Server (NTRS)
Mcgowan, Paul E.; Angelucci, A. Filippo; Javeed, Mehzad
1992-01-01
Test/analysis correlation is an important aspect of the verification of analysis models which are used to predict on-orbit response characteristics of large space structures. This paper presents results of a study using reduced analysis models for performing dynamic test/analysis correlation. The reduced test-analysis model (TAM) has the same number and orientation of degrees of freedom (DOF) as the test measurements. Two reduction methods, static (Guyan) reduction and the Improved Reduced System (IRS) reduction, are applied to the test/analysis correlation of a laboratory truss structure. Simulated test results and modal test data are used to examine the performance of each method. It is shown that the selection of DOF to be retained in the TAM is critical when large structural masses are involved. In addition, the use of modal test results may present difficulties for TAM accuracy even if a large number of DOF are retained in the TAM.
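A minimal Python sketch of static (Guyan) reduction on a toy spring-mass model (the IRS method adds an inertia correction to the same transformation, not shown here):

```python
# Static (Guyan) reduction: condense stiffness and mass matrices to the
# measured ("a"-set) DOF, statically eliminating the omitted ("o"-set).
import numpy as np

def guyan(K, M, a):
    """K, M: full symmetric matrices; a: indices of DOF retained in the TAM."""
    n = K.shape[0]
    o = np.setdiff1d(np.arange(n), a)          # omitted DOF
    Koa, Koo = K[np.ix_(o, a)], K[np.ix_(o, o)]
    # static condensation: x_o = -Koo^{-1} Koa x_a
    T = np.vstack([np.eye(len(a)), -np.linalg.solve(Koo, Koa)])
    # reorder rows/columns to [a; o] before transforming
    idx = np.concatenate([a, o])
    Kr = T.T @ K[np.ix_(idx, idx)] @ T
    Mr = T.T @ M[np.ix_(idx, idx)] @ T
    return Kr, Mr

# 3-DOF spring-mass toy: retain DOF 0 and 2 as the "measured" set
K = np.array([[2., -1., 0.], [-1., 2., -1.], [0., -1., 1.]])
M = np.diag([1., 1., 1.])
Kr, Mr = guyan(K, M, np.array([0, 2]))
print(Kr, Mr, sep="\n")
```

The sensitivity to retained-DOF selection noted above shows up here directly: mass associated with omitted DOF is redistributed through a purely static transformation, which degrades the reduced mass matrix when large masses sit at omitted DOF.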
Is probabilistic bias analysis approximately Bayesian?
MacLehose, Richard F.; Gustafson, Paul
2011-01-01
Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
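A simple Monte Carlo sketch of probabilistic bias analysis for exposure misclassification (hypothetical counts and priors; a full analysis would also propagate conventional random error):

```python
# Probabilistic bias analysis for non-differential exposure
# misclassification in a case-control 2x2 table (all values hypothetical).
import numpy as np

rng = np.random.default_rng(42)
a, b = 215, 1449      # exposed / unexposed cases
c, d = 668, 4296      # exposed / unexposed controls

ors = []
for _ in range(20000):
    se = rng.uniform(0.75, 0.95)       # prior on sensitivity
    sp = rng.uniform(0.90, 0.99)       # prior on specificity
    # back-correct expected true counts: A = (a - (1-sp)*N1) / (se+sp-1)
    A = (a - (1 - sp) * (a + b)) / (se + sp - 1)
    C = (c - (1 - sp) * (c + d)) / (se + sp - 1)
    B, D = (a + b) - A, (c + d) - C
    if min(A, B, C, D) > 0:            # discard impossible corrections
        ors.append((A * D) / (B * C))

ors = np.array(ors)
print(np.percentile(ors, [2.5, 50, 97.5]))   # adjusted OR and interval
```

The resemblance to a Bayesian adjustment discussed above comes from treating sensitivity and specificity as draws from prior distributions; the difference is that the observed data never update those priors.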
Kagel, John H.; Winkler, Robin C.
1972-01-01
The current research methods of behavioral economics are characterized by inadequate empirical foundations. Psychologists involved in the experimental analysis of behavior with their research strategies and their experimental technology, particularly that of the Token Economy, can assist in providing empirical foundations for behavioral economics. Cooperative research between economists and psychologists to this end should be immediately fruitful and mutually beneficial. PMID:16795356
Cost and Time Analysis of Monograph Cataloging in Hospital Libraries: A Preliminary Study.
ERIC Educational Resources Information Center
Angold, Linda
The purpose of this paper is: (1) to propose models to be used in evaluating relative time and cost factors involved in monograph cataloging within a hospital library, and (2) to test the models by performing a cost and time analysis of each cataloging method studied. To establish as complete a list of cataloging work units as possible, several…
Cross-National Analysis of Islamic Fundamentalism
2016-01-20
attitudes, and was fully involved in activities concerning questionnaire design including a new experimental design in the survey, pilot testing, and...possible collaboration with the research design of the panel survey in Tunisia. • Data analysis: Analyses of religious fundamentalism, women’s dress, trust...the Event History Calendar and the best methods to ask about knowledge and experience of past events. The group designed a series of cognitive
Analysis of Defense Industry Consolidation Effects on Program Acquisition Costs
2007-12-01
overhead costs. Also in 1993, Norman R. Augustine, then CEO of Lockheed Martin, headed an effort involving other major defense industry executives...name programs, Lockheed Chairman Norman Augustine could only name one (Pearlstein, 14 July 1997). A GAO study looked into one method that...latest technology could, essentially, resort to monopolistic practices of market and cost control. Kovacic and Smallwood, in an analysis of defense
Validated flow-injection method for rapid aluminium determination in anti-perspirants.
López-Gonzálvez, A; Ruiz, M A; Barbas, C
2008-09-29
A flow-injection (FI) method for the rapid determination of aluminium in anti-perspirants has been developed. The method is based on the spectrophotometric detection at 535 nm of the complex formed between Al ions and the chromogenic reagent eriochrome cyanine R. Both the batch and FI methods were validated by checking the parameters included in the ISO-3543-1 regulation. Variables involved in the FI method were optimized by using appropriate statistical tools. The method does not exhibit interference from other substances present in anti-perspirants and it shows high precision, with an R.S.D. value (n=6) of 0.9%. Moreover, the accuracy of the method was evaluated by comparison with a back-complexometric titration method, which is currently used for routine analysis in pharmaceutical laboratories. The Student's t-test showed that the results obtained by both methods were not significantly different at a significance level of 95%. A response time of 12 s and a sample analysis time, performing triplicate injections, of 60 s were achieved. The analytical figures of merit make the method highly appropriate to replace the time-consuming complexometric method for this kind of analysis.
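The method-comparison step can be illustrated with a short Python sketch (hypothetical paired determinations; the abstract does not state whether the t-test was paired):

```python
# Comparing two analytical methods on the same samples with a paired
# Student's t-test; all values are hypothetical.
import numpy as np
from scipy.stats import ttest_rel

fi_method = np.array([19.8, 20.1, 20.3, 19.9, 20.0, 20.2])   # % Al, FI
titration = np.array([19.9, 20.2, 20.1, 20.0, 20.1, 20.3])   # % Al, titration

t, p = ttest_rel(fi_method, titration)    # paired comparison
print(f"t = {t:.3f}, p = {p:.3f}")        # p > 0.05: no significant difference
```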
DOE Office of Scientific and Technical Information (OSTI.GOV)
Townsend, D.W.; Linnhoff, B.
In Part I, criteria for heat engine and heat pump placement in chemical process networks were derived, based on the "temperature interval" (T.I.) analysis of the heat exchanger network problem. Using these criteria, this paper gives a method for identifying the best outline design for any combined system of chemical process, heat engines, and heat pumps. The method eliminates inferior alternatives early, and positively leads on to the most appropriate solution. A graphical procedure based on the T.I. analysis forms the heart of the approach, and the calculations involved are simple enough to be carried out on, say, a programmable calculator. Application to a case study is demonstrated. Optimization methods based on this procedure are currently under research.
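The paper's graphical procedure is not reproduced here, but a problem-table heat cascade in the same spirit shows how simple the arithmetic is (hypothetical stream data and dTmin):

```python
# Temperature-interval heat cascade: shift stream temperatures by
# dTmin/2, then cascade heat through the intervals to find the minimum
# hot utility and the pinch (all stream data hypothetical).
import numpy as np

dTmin = 10.0
# (supply T, target T, heat-capacity flowrate CP); hot streams cool down
streams = [(180., 60., 3.0), (150., 30., 1.5),    # hot
           (30., 140., 2.0), (80., 160., 4.0)]    # cold

shifted = [(ts - dTmin/2, tt - dTmin/2, cp) if ts > tt
           else (ts + dTmin/2, tt + dTmin/2, cp) for ts, tt, cp in streams]
bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)

surplus = []
for hi, lo in zip(bounds[:-1], bounds[1:]):
    net = 0.0
    for ts, tt, cp in shifted:
        if min(ts, tt) <= lo and max(ts, tt) >= hi:   # stream spans interval
            net += cp * (hi - lo) * (1 if ts > tt else -1)
    surplus.append(net)

cascade = np.concatenate([[0.0], np.cumsum(surplus)])
qh_min = max(0.0, -cascade.min())          # minimum hot utility
feasible = cascade + qh_min
pinch = bounds[int(np.argmin(feasible))]
print(f"Qh,min = {qh_min:.0f} kW, pinch at shifted T = {pinch:.0f} C")
```

The placement criteria referenced above then amount to siting heat engines entirely above or below the pinch, never across it.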
Development of a security vulnerability assessment process for the RAMCAP chemical sector.
Moore, David A; Fuller, Brad; Hazzan, Michael; Jones, J William
2007-04-11
The Department of Homeland Security (DHS), Directorate of Information Analysis & Infrastructure Protection (IAIP), Protective Services Division (PSD), contracted the American Society of Mechanical Engineers Innovative Technologies Institute, LLC (ASME ITI, LLC) to develop guidance on Risk Analysis and Management for Critical Asset Protection (RAMCAP). AcuTech Consulting Group (AcuTech) has been contracted by ASME ITI, LLC, to provide assistance by facilitating the development of sector-specific guidance on vulnerability analysis and management for critical asset protection for the chemical manufacturing, petroleum refining, and liquefied natural gas (LNG) sectors. This activity involves two key tasks for these three sectors: development of a screening process to supplement DHS understanding of the assets that are important to protect against terrorist attack and to prioritize protective activities, and development of a standard security vulnerability analysis (SVA) framework for the analysis of consequences, vulnerabilities, and threats. This project involves the cooperative effort of numerous leading industrial companies, industry trade associations, professional societies, and security and safety consultants representative of those sectors. Since RAMCAP is a voluntary program for ongoing risk management for homeland security, sector coordinating councils are being asked to assist in communicating the goals of the program and in encouraging participation. The RAMCAP project will have a profound and positive impact on all sectors as it is fully developed, rolled out and implemented. It will help define the facilities and operations of national and regional interest for the threat of terrorism, define standardized methods for analyzing consequences, vulnerabilities, and threats, and describe best security practices of the industry. This paper will describe the results of the security vulnerability analysis process that was developed and field tested for the chemical manufacturing sector. This method was developed through the cooperation of the many organizations and individuals involved in the chemical sector RAMCAP development activities. The RAMCAP SVA method is intended to provide a common basis for making vulnerability assessments and risk-based decisions for homeland security. Mr. Moore serves as the coordinator for the chemical manufacturing, petroleum refining, and LNG sectors for the RAMCAP project and Dr. Jones is the chief technology officer for ASME-ITI, LLC for RAMCAP.
Ito, Shinya; Tsukada, Katsuo
2002-01-11
An evaluation of the feasibility of liquid chromatography-mass spectrometry (LC-MS) with atmospheric pressure ionization was made for the quantitation of four diarrhetic shellfish poisoning toxins, okadaic acid, dinophysistoxin-1, pectenotoxin-6 and yessotoxin, in scallops. When LC-MS was applied to the analysis of scallop extracts, large signal suppressions were observed due to coeluting substances from the column. To compensate for these matrix signal suppressions, the standard addition method was applied. First, the sample was analyzed, and then the same sample spiked with calibration standards was analyzed. Although this method requires two LC-MS runs per analysis, effective correction of quantitative errors was found.
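A short Python sketch of standard-addition quantitation as described (numbers hypothetical):

```python
# Standard addition: regress response on added analyte and extrapolate
# to the x-intercept to recover the unspiked concentration.
import numpy as np

added = np.array([0.0, 10.0, 20.0, 30.0])      # ng/g toxin added
signal = np.array([1.20, 2.15, 3.08, 4.05])    # LC-MS peak area (arb.)

slope, intercept = np.polyfit(added, signal, 1)
c0 = intercept / slope        # concentration in the unspiked sample
print(f"estimated concentration: {c0:.1f} ng/g")
# Because the slope is measured in the sample matrix itself, the matrix
# signal suppression cancels out of the estimate.
```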
Chaney, Rufus L; Green, Carrie E; Lehotay, Steven J
2018-05-04
With the establishment by CODEX of a 200 ng/g limit of inorganic arsenic (iAs) in polished rice grain, more analyses of iAs will be necessary to ensure compliance in regulatory and trade applications, to assess quality control in commercial rice production, and to conduct research involving iAs in rice crops. Although analytical methods using high-performance liquid chromatography-inductively coupled plasma-mass spectrometry (HPLC-ICP-MS) have been demonstrated for full speciation of As, this expensive and time-consuming approach is excessive when regulations are based only on iAs. We report a streamlined sample preparation and analysis of iAs in powdered rice based on heated extraction with 0.28 M HNO3 followed by hydride generation (HG) under control of acidity and other simple conditions. Analysis of iAs is then conducted using flow-injection HG and inexpensive ICP-atomic emission spectroscopy (AES) or other detection means. A key innovation compared with previous methods was to increase the acidity of the reagent solution with 4 M HCl (prior to reduction of As5+ to As3+), which minimized interferences from dimethylarsinic acid. An inter-laboratory method validation was conducted among 12 laboratories worldwide in the analysis of six shared blind duplicates and a NIST Standard Reference Material involving different types of rice and iAs levels. Also, four laboratories used the standard HPLC-ICP-MS method to analyze the samples. The results between the methods were not significantly different, and the Horwitz ratio averaged 0.52 for the new method, which meets official method validation criteria. Thus, the simpler, more versatile, and less expensive method may be used by laboratories for several purposes to accurately determine iAs in rice grain. Graphical abstract: Comparison of iAs results from new and FDA methods.
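The Horwitz ratio quoted above is straightforward to reproduce; a small Python sketch with an assumed reproducibility RSD:

```python
# Horwitz ratio (HorRat): observed reproducibility RSD divided by the
# Horwitz-predicted RSD, with concentration c as a mass fraction.
def horrat(rsd_r_percent, c_mass_fraction):
    prsd = 2.0 * c_mass_fraction ** -0.1505   # Horwitz predicted RSD, %
    return rsd_r_percent / prsd

# e.g. 100 ng/g iAs = 1e-7 mass fraction; an assumed observed RSD_R of 12%
print(horrat(12.0, 1e-7))   # values near 0.5-1 indicate acceptable precision
```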
A Review: Principles of Design and Analysis of Learning Systems
ERIC Educational Resources Information Center
Durney, Carl H.
1973-01-01
Analyzes a traditional and an innovative course in terms of learning principles, involving Erickson's management methods and Gagne's learning activities. Suggests that learning systems should be designed by applying all principles rather than emphasizing one or two of them. (CC)
Used Solvent Testing and Reclamation. Volume 1. Cold-Cleaning Solvents
1988-12-01
spectrometer, and specific gravity meter involve buying routine cleaning supplies, and should not exceed $50. Consequently, these methods were...in addition to routine cleaning supplies. The KBV measurement requires periodic supplies of Kauri-butanol solution. TLC analysis requires glass
ESR Analysis of Polymer Photo-Oxidation
NASA Technical Reports Server (NTRS)
Kim, Soon Sam; Liang, Ranty Hing; Tsay, Fun-Dow; Gupta, Amitave
1987-01-01
Electron-spin resonance identifies polymer-degradation reactions and their kinetics. New technique enables derivation of kinetic model of specific chemical reactions involved in degradation of particular polymer. Detailed information provided by new method enables prediction of aging characteristics long before manifestation of macroscopic mechanical properties.
Journal Writing in Health Education.
ERIC Educational Resources Information Center
Gillis, Angela J.
2001-01-01
Notes the growing use of journals in nursing education and health professions continuing education. Describes a three-step method involving critical analysis of clinical practice, peer group discussion, and self-evaluation. Presents practical guidelines for journal writing and ways to use journals to develop competence. (SK)
Applied statistics in agricultural, biological, and environmental sciences.
USDA-ARS?s Scientific Manuscript database
Agronomic research often involves measurement and collection of multiple response variables in an effort to understand the more complex nature of the system being studied. Multivariate statistical methods encompass the simultaneous analysis of all random variables measured on each experimental or s...
Report to DHS on Summer Internship 2006
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckwith, R H
2006-07-26
This summer I worked at Lawrence Livermore National Laboratory in a bioforensics collection and extraction research group under David Camp. The group is involved with researching efficiencies of various methods for collecting bioforensic evidence from crime scenes. The different methods under examination are a wipe, swab, HVAC filter and a vacuum. The vacuum is something that has particularly gone uncharacterized. My time was spent mostly on modeling and calculations work, but at the end of the summer I completed my internship with a few experiments to supplement my calculations. I had two major projects this summer. My first major project this summer involved fluid mechanics modeling of collection and extraction situations. This work examines different fluid dynamic models for the case of a micron spore attached to a fiber. The second project I was involved with was a statistical analysis of the different sampling techniques.
Reducing physical size limits for low-frequency horn loudspeaker systems
NASA Astrophysics Data System (ADS)
Honeycutt, Richard Allison
From 1881 until the present day, many excellent scholars have studied acoustic horns. This dissertation begins by discussing over eighty results of such study. Next, the methods of modeling horn behavior are examined with an emphasis on the prediction of throat impedance. Because of the time constraints in a product-design environment, in which the results of this study may be used, boundary-element and cascaded-section types of analysis were not considered due to their time intensiveness. Of the methods studied, an analytical process based upon Olson's adaptation of Webster's analysis is selected as the most accurate of the rapid methods, although other good methods exist. Reasons and extent of inaccuracy are discussed. The concept of interleaved horn loading is introduced: it involves using two horns of different parameters, fed by a single driver, with a view toward interleaving and thus smoothing the impedance peaks of the separate horns to produce a smoother response. The validity of the technique is demonstrated both theoretically and practically. Then the reactance annulling technique is explained and tested experimentally. It is found to work well, but the exact parameter values involved are not found to be critical. Finally, the considerations involved in building a practical working system are discussed, and a preliminary working model reviewed. Future work could be directed toward finding the optimum parameter values for the two "parallel horns" whose impedances are to be interleaved, as well as the system parameters that determine these optimum values. Also, further experimental investigation of ported loading of the back air chamber would be useful.
Leakage detection in galvanized iron pipelines using ensemble empirical mode decomposition analysis
NASA Astrophysics Data System (ADS)
Amin, Makeen; Ghazali, M. Fairusham
2015-05-01
There are many possible approaches to detecting leaks. Some leaks are simply noticeable when the liquid or water appears on the surface. However, many leaks do not find their way to the surface, and their existence has to be checked by analysis of fluid flow in the pipeline. The first step is to determine the approximate position of the leak. This can be done by isolating the sections of the mains in turn and noting which section causes a drop in the flow. The next approach is to use sensors to locate leaks; this involves strain-gauge pressure transducers and piezoelectric sensors to detect the occurrence of leaks and pinpoint their exact location in the pipeline, using specific methods, namely the acoustic leak detection method and the transient method. The objective is to utilize signal-processing techniques to analyse leakage in the pipeline. To this end, an ensemble empirical mode decomposition (EEMD) method is applied to analyse the collected data.
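A sketch of the decomposition step, assuming the third-party Python package PyEMD (pip install EMD-signal) and a synthetic stand-in signal; the leak-feature extraction itself is not shown:

```python
# EEMD decomposition of a pressure-like signal into intrinsic mode
# functions (IMFs); the signal below is a synthetic stand-in.
import numpy as np
from PyEMD import EEMD

t = np.linspace(0, 1, 2000)
# stand-in pressure signal: slow oscillation + late burst + noise
s = np.sin(2*np.pi*5*t) + 0.3*np.sin(2*np.pi*120*t)*(t > 0.5) \
    + 0.1*np.random.default_rng(0).normal(size=t.size)

eemd = EEMD(trials=100, noise_width=0.05)   # ensemble of noise-added EMDs
imfs = eemd.eemd(s, t)                      # intrinsic mode functions
print(imfs.shape)                           # (n_imfs, n_samples)
# A leak-induced transient concentrates in particular IMFs, whose arrival
# times at two sensors can then be used to localise the leak.
```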
Sensitivity of control-augmented structure obtained by a system decomposition method
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw; Bloebaum, Christina L.; Hajela, Prabhat
1988-01-01
The verification of a method for computing sensitivity derivatives of a coupled system is presented. The method deals with a system whose analysis can be partitioned into subsets that correspond to disciplines and/or physical subsystems that exchange input-output data with each other. The method uses the partial sensitivity derivatives of the output with respect to input obtained for each subset separately to assemble a set of linear, simultaneous, algebraic equations that are solved for the derivatives of the coupled system response. This sensitivity analysis is verified using an example of a cantilever beam augmented with an active control system to limit the beam's dynamic displacements under an excitation force. The verification shows good agreement of the method with reference data obtained by a finite difference technique involving entire system analysis. The usefulness of a system sensitivity method in optimization applications by employing a piecewise-linear approach to the same numerical example is demonstrated. The method's principal merits are its intrinsically superior accuracy in comparison with the finite difference technique, and its compatibility with the traditional division of work in complex engineering tasks among specialty groups.
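A minimal Python sketch of the assembled linear equations for a two-subset system (hypothetical partial derivatives) illustrates the method's core step:

```python
# Coupled-system sensitivity sketch: two subsystems y1 = f1(x, y2) and
# y2 = f2(x, y1); their separately computed partials assemble into
# linear equations for the total derivatives dy/dx.
import numpy as np

# partial derivatives evaluated at the converged coupled solution
df1_dx, df1_dy2 = 0.8, 0.3     # from subset-1 analysis alone
df2_dx, df2_dy1 = -0.5, 0.2    # from subset-2 analysis alone

# chain rule gives (I - A) dy/dx = b, with A holding cross-coupling partials
A = np.array([[0.0, df1_dy2],
              [df2_dy1, 0.0]])
b = np.array([df1_dx, df2_dx])
dy_dx = np.linalg.solve(np.eye(2) - A, b)
print(dy_dx)   # total sensitivities of the coupled system response
```

The accuracy advantage over finite differencing noted above comes from each partial being computed within a single discipline, with no repeated analysis of the full coupled system.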
Comparative Numerical Analysis of Different Strengthening Systems of Historical Brick Arches
NASA Astrophysics Data System (ADS)
Zielińska, M.
2017-05-01
The article presents a comparative numerical analysis of various ways to strengthen historical brick arches. Five ways of strengthening brick arches with steel tie-rods have been proposed. Two of these involve braces wrapped around the pillars supporting the arch, connected with a tie-rod; another two involve tie-rods with welded metal sheets of different sizes; the fifth involves a tie-rod glued in place with an epoxy adhesive. The collected data were compared with a reference model of the arch left without any intervention. The results make it possible to evaluate the effectiveness of the methods by comparing vertical and horizontal displacements and stresses. The article indicates the direction of proper planning and design of arch strengthening in brick structures in historical buildings.
Materialistic Desires or Childhood Adversities as Explanations for Girls' Trading Sex for Benefits.
Song, Juyoung; Morash, Merry
2016-01-01
This study investigates whether high school and younger South Korean girls trade sex with middle-aged men for benefits due to cultural emphasis on materialism/consumerism, childhood adversities, or both. This form of prostitution, referred to as "compensated dating," is common in economically developed East Asian Countries, where there is debate about its causes. Purposeful sampling was used to select a diverse group of 25 girls who described involvement in compensated dating, and a life calendar method was used to guide the interview. The rich data were subjected to thematic analysis to show the nature of prostitution involvement, precursors, and motivations. Data analysis revealed that sole reliance on materialistic desire as an explanation of prostitution obscures the influence of peer pressure and family dysfunction. Findings suggest the need for social services rather than punitive responses to girls involved in compensated dating. © The Author(s) 2014.
Analysis of longitudinal data from animals where some data are missing in SPSS
Duricki, DA; Soleman, S; Moon, LDF
2017-01-01
Testing of therapies for disease or injury often involves analysis of longitudinal data from animals. Modern analytical methods have advantages over conventional methods (particularly where some data are missing) yet are not used widely by pre-clinical researchers. We provide here an easy to use protocol for analysing longitudinal data from animals and present a click-by-click guide for performing suitable analyses using the statistical package SPSS. We guide readers through analysis of a real-life data set obtained when testing a therapy for brain injury (stroke) in elderly rats. We show that repeated measures analysis of covariance failed to detect a treatment effect when a few data points were missing (due to animal drop-out) whereas analysis using an alternative method detected a beneficial effect of treatment; specifically, we demonstrate the superiority of linear models (with various covariance structures) analysed using Restricted Maximum Likelihood estimation (to include all available data). This protocol takes two hours to follow. PMID:27196723
NASA Astrophysics Data System (ADS)
Randle, K.; Al-Jundi, J.; Mamas, C. J. V.; Sokhi, R. S.; Earwaker, L. G.
1993-06-01
Our work on heavy metals in the estuarine environment has involved the use of two multielement techniques: neutron activation analysis (NAA) and proton-induced X-ray emission (PIXE) analysis. As PIXE is essentially a surface analytical technique, problems may arise due to sample inhomogeneity and surface roughness. In order to assess the contribution of these effects we have compared the results from PIXE analysis with those from a technique which analyzes a larger bulk sample rather than just the surface. An obvious method was NAA. A series of sediment samples containing particles of variable diameter were compared. Pellets containing a few mg of sediment were prepared from each sample and analyzed by the PIXE technique using both an absolute and a comparative method. For INAA, the rest of the sample was then irradiated with thermal neutrons and element concentrations were determined from analyses of the subsequent gamma-ray spectrum. Results from the two methods are discussed.
Dusseldorp, Elise; Doove, Lisa; Mechelen, Iven van
2016-06-01
In the analysis of randomized controlled trials (RCTs), treatment effect heterogeneity often occurs, implying differences across (subgroups of) clients in treatment efficacy. This phenomenon is typically referred to as treatment-subgroup interactions. The identification of subgroups of clients, defined in terms of pretreatment characteristics that are involved in a treatment-subgroup interaction, is a methodologically challenging task, especially when many characteristics are available that may interact with treatment and when no comprehensive a priori hypotheses on relevant subgroups are available. A special type of treatment-subgroup interaction occurs if the ranking of treatment alternatives in terms of efficacy differs across subgroups of clients (e.g., for one subgroup treatment A is better than B and for another subgroup treatment B is better than A). These are called qualitative treatment-subgroup interactions and are most important for optimal treatment assignment. The method QUINT (Qualitative INteraction Trees) was recently proposed to induce subgroups involved in such interactions from RCT data. The result of an analysis with QUINT is a binary tree from which treatment assignment criteria can be derived. The implementation of this method, the R package quint, is the topic of this paper. The analysis process is described step-by-step using data from the Breast Cancer Recovery Project, showing the reader all functions included in the package. The output is explained and given a substantive interpretation. Furthermore, an overview is given of the tuning parameters involved in the analysis, along with possible motivational concerns associated with choice alternatives that are available to the user.
van Smeden, Jeroen; Boiten, Walter A; Hankemeier, Thomas; Rissmann, Robert; Bouwstra, Joke A; Vreeken, Rob J
2014-01-01
Ceramides (CERs), cholesterol, and free fatty acids (FFAs) are the main lipid classes in human stratum corneum (SC, outermost skin layer), but no studies report on the detailed analysis of these classes in a single platform. The primary aims of this study were to 1) develop an LC/MS method for (semi-)quantitative analysis of all main lipid classes present in human SC; and 2) use this method to study in detail the lipid profiles of human skin substitutes and compare them to human SC lipids. By applying two injections of 10 μl, the developed method detects all major SC lipids using RPLC and negative ion mode APCI-MS for detection of FFAs, and NPLC using positive ion mode APCI-MS to analyze CERs and cholesterol. Validation showed this lipid platform to be robust, reproducible, sensitive, and fast. The method was successfully applied on ex vivo human SC, human SC obtained from tape strips and human skin substitutes (porcine SC and human skin equivalents). In conjunction with FFA profiles, clear differences in CER profiles were observed between these different SC sources. Human skin equivalents more closely mimic the lipid composition of human stratum corneum than porcine skin does, although noticeable differences are still present. These differences gave biologically relevant information on some of the enzymes that are probably involved in SC lipid processing. For future research, this provides an excellent method for (semi-)quantitative, 'high-throughput' profiling of SC lipids and can be used to advance the understanding of skin lipids and the biological processes involved. © 2013.
Parametrically excited non-linear multidegree-of-freedom systems with repeated natural frequencies
NASA Astrophysics Data System (ADS)
Tezak, E. G.; Nayfeh, A. H.; Mook, D. T.
1982-12-01
A method for analyzing multidegree-of-freedom systems having a repeated natural frequency subjected to a parametric excitation is presented. Attention is given to the ordering of the various terms (linear and non-linear) in the governing equations. The analysis is based on the method of multiple scales. As a numerical example involving a parametric resonance, panel flutter is discussed in detail in order to illustrate the type of results one can expect to obtain with this analysis. Some of the analytical results are verified by a numerical integration of the governing equations.
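An illustrative numerical check (not the paper's multiple-scales analysis): a damped Mathieu-type equation shows the principal parametric resonance near twice the natural frequency:

```python
# Damped Mathieu equation u'' + 2*mu*u' + (w0^2 + eps*cos(W*t))*u = 0:
# parametric resonance appears when W is near 2*w0 (parameters assumed).
import numpy as np
from scipy.integrate import solve_ivp

w0, mu, eps = 1.0, 0.02, 0.3

def mathieu(t, s, W):
    u, v = s
    return [v, -2*mu*v - (w0**2 + eps*np.cos(W*t))*u]

for W in (2.0*w0, 1.5*w0):                       # near and off resonance
    sol = solve_ivp(mathieu, (0, 200), [1e-3, 0.0], args=(W,), max_step=0.05)
    print(f"W = {W:.1f}: |u| grows to {np.abs(sol.y[0]).max():.2e}")
# Near W = 2*w0 the response grows despite damping (principal parametric
# resonance); the detuned excitation stays bounded.
```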
Development of Scatterometer-Derived Surface Pressures
NASA Astrophysics Data System (ADS)
Hilburn, K. A.; Bourassa, M. A.; O'Brien, J. J.
2001-12-01
SeaWinds scatterometer-derived wind fields can be used to estimate surface pressure fields. The method to be used has been developed and tested with Seasat-A and NSCAT wind measurements. The method involves blending two dynamically consistent values of vorticity. Geostrophic relative vorticity is calculated from an initial guess surface pressure field (the AVN analysis in this case). Relative vorticity is calculated from SeaWinds winds, adjusted to a geostrophic value, and then blended with the initial guess. An objective method is applied that minimizes the differences between the initial guess field and the scatterometer field, subject to regularization. The long-term goal of this project is to derive research-quality pressure fields from the SeaWinds winds for the Southern Ocean from the Antarctic ice sheet to 30 deg S. The intermediate goal of this report involves generation of pressure fields over the northern hemisphere for testing purposes. Specifically, two issues need to be addressed. First, the most appropriate initial guess field will be determined: the pure AVN analysis or the previously assimilated pressure field. The independent comparison data to be used in answering this question will involve data near land, ship data, and ice data that were not included in the AVN analysis. Second, the smallest number of pressure observations required to anchor the assimilated field will be determined. This study will use Neumann (derivative) boundary conditions on the region of interest. Such boundary conditions only determine the solution to within a constant that must be determined by a number of anchoring points. The smallness of the number of anchoring points will demonstrate the viability of the general use of the scatterometer as a barometer over the oceans.
Analysis of longitudinal data from animals with missing values using SPSS.
Duricki, Denise A; Soleman, Sara; Moon, Lawrence D F
2016-06-01
Testing of therapies for disease or injury often involves the analysis of longitudinal data from animals. Modern analytical methods have advantages over conventional methods (particularly when some data are missing), yet they are not used widely by preclinical researchers. Here we provide an easy-to-use protocol for the analysis of longitudinal data from animals, and we present a click-by-click guide for performing suitable analyses using the statistical package IBM SPSS Statistics software (SPSS). We guide readers through the analysis of a real-life data set obtained when testing a therapy for brain injury (stroke) in elderly rats. If a few data points are missing, as in this example data set (for example, because of animal dropout), repeated-measures analysis of covariance may fail to detect a treatment effect. An alternative analysis method, such as the use of linear models (with various covariance structures), and analysis using restricted maximum likelihood estimation (to include all available data) can be used to better detect treatment effects. This protocol takes 2 h to carry out.
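For analysts working outside SPSS, an analogous model can be fitted in Python with statsmodels (a sketch assuming a hypothetical long-format file; statsmodels offers fewer covariance structures than the SPSS MIXED procedure):

```python
# Linear mixed model for longitudinal animal data, fitted by REML so
# animals with missing sessions still contribute all available points.
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical columns: animal id, treatment group, test week, outcome score
df = pd.read_csv("stroke_scores_long.csv")      # hypothetical file

model = smf.mixedlm("score ~ group * week", data=df,
                    groups=df["animal"],        # random intercept per animal
                    re_formula="~week")         # random slope over time
fit = model.fit(reml=True)                      # REML uses all available data
print(fit.summary())
```

The group-by-week interaction term is what carries the treatment effect over time, mirroring the contrast that repeated-measures ANCOVA missed in the example above.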
Mach Reflection, Mach Disc, and the Associated Nozzle Free Jet Flows. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chang, I.
1973-01-01
Numerical methods involving both the method of integral relations and the method of characteristics have been applied to investigate the steady flow phenomena associated with the occurrence of Mach reflection and Mach discs in nozzle flows. The solutions of triple-shock intersection are presented. The regime where the Mach configuration appears is defined for the inviscid analysis. The method of integral relations developed for the blunt body problem is modified and extended to the attached shock wave and to internal nozzle flow problems.
Determination of benzylpenicillin in pharmaceuticals by capillary zone electrophoresis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoyt, A.M. Jr.; Sepaniak, M.J.
A rapid and direct method is described for the determination of benzylpenicillin (penicillin G) in pharmaceutical preparations. The method involves very little sample preparation, and the total analysis time for duplicate results is less than 30 minutes per sample. The method takes advantage of the speed and separating power of capillary zone electrophoresis (CZE). Detection of penicillin is by absorption at 228 nm. An internal standard is employed to reduce sample injection error. The method was applied successfully to both tablets and injectable preparations. 14 refs., 5 figs., 3 tabs.
Statistical deprojection of galaxy pairs
NASA Astrophysics Data System (ADS)
Nottale, Laurent; Chamaraux, Pierre
2018-06-01
Aims: The purpose of the present paper is to provide methods of statistical analysis of the physical properties of galaxy pairs. We perform this study to apply it later to catalogs of isolated pairs of galaxies, especially two new catalogs we recently constructed that contain ≈1000 and ≈13 000 pairs, respectively. We are particularly interested in the dynamics of those pairs, including the determination of their masses. Methods: We could not compute the dynamical parameters directly since the necessary data are incomplete. Indeed, we only have at our disposal one component of the intervelocity between the members, namely along the line of sight, and two components of their interdistance, i.e., the projection on the sky-plane. Moreover, we know only one point of each galaxy orbit. Hence we need statistical methods to find the probability distribution of 3D interdistances and 3D intervelocities from their projections; we designed those methods under the term deprojection. Results: We proceed in two steps to determine and use the deprojection methods. First we derive the probability distributions expected for the various relevant projected quantities, namely the intervelocity v_z, the interdistance r_p, their ratio, and the product r_p v_z^2, which is involved in mass determination. In a second step, we propose various methods of deprojection of those parameters based on the previous analysis. We start from a histogram of the projected data and we apply inversion formulae to obtain the deprojected distributions; lastly, we test the methods by numerical simulations, which also allow us to determine the uncertainties involved.
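The forward (projection) step that these inversion formulae undo is easy to simulate. The toy sketch below, assuming isotropic orientations and hypothetical unit values, projects a fixed 3D interdistance onto the sky plane and a fixed 3D velocity onto the line of sight; a histogram of the simulated r_p can then be checked against the analytic conditional density. It illustrates the geometry only and is not the authors' deprojection code.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
r, v = 1.0, 1.0          # hypothetical true 3D interdistance and intervelocity

# Isotropic orientation: the cosine of the angle between a vector and the
# line of sight is uniform on [-1, 1].
cos_sep = rng.uniform(-1.0, 1.0, n)   # orientation of the separation vector
cos_vel = rng.uniform(-1.0, 1.0, n)   # independent orientation of the velocity

r_p = r * np.sqrt(1.0 - cos_sep**2)   # projected (sky-plane) interdistance
v_z = v * np.abs(cos_vel)             # line-of-sight intervelocity magnitude

# The r_p histogram should follow f(r_p | r) = r_p / (r * sqrt(r**2 - r_p**2)).
hist, edges = np.histogram(r_p, bins=50, range=(0, r), density=True)
```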
Irving, Benjamin J; Goussard, Pierre; Andronikou, Savvas; Gie, Robert; Douglas, Tania S; Todd-Pokropek, Andrew; Taylor, Paul
2014-10-01
Airway deformation and stenosis can be key signs of pathology such as lymphadenopathy. This study presents a local airway point distribution model (LA-PDM) to automatically analyse regions of the airway tree in CT scans and identify abnormal airway deformation. In our method, the airway tree is segmented and the centreline identified from each chest CT scan. Thin-plate splines, along with a local mesh alignment method for tubular meshes, are used to register the airways and develop point distribution models (PDM). Each PDM is then used to analyse and classify local regions of the airway. This LA-PDM method was developed using 89 training cases and evaluated on a 90-case CT test set, where each set includes paediatric tuberculosis (TB) cases (with airway involvement) and non-TB cases (without airway involvement). The LA-PDM was able to accurately distinguish cases with airway involvement, with an AUC of the ROC classification (and 95% confidence interval) of 0.87 (0.77-0.94) for the Trachea-LMB-RMB region and 0.81 (0.68-0.90) for the RMB-RUL-BI region, outperforming a comparison method based on airway cross-sectional features. This has the potential to assist and improve airway analysis from CT scans by detecting involved airways and visualising affected airway regions. Copyright © 2014 Elsevier B.V. All rights reserved.
Wang, Jie; Zeng, Hao-Long; Du, Hongying; Liu, Zeyuan; Cheng, Ji; Liu, Taotao; Hu, Ting; Kamal, Ghulam Mustafa; Li, Xihai; Liu, Huili; Xu, Fuqiang
2018-03-01
Metabolomics generates a profile of the small molecules produced by cellular/tissue metabolism, which can directly reflect the mechanisms of complex networks of biochemical reactions. Traditional metabolomics methods, such as OPLS-DA and PLS-DA, are mainly used for binary class discrimination. However, multiple groups are often involved in biological systems, especially in brain research: multiple brain regions are involved in the study of brain metabolic dysfunctions such as alcoholism, Alzheimer's disease, etc. In the current study, 10 different brain regions were utilized for comparative studies between alcohol-preferring and non-preferring rats, and between male and female rats. As many classes are involved (ten different regions and four types of animals), traditional metabolomics methods are no longer efficient for showing differentiation. Here, a novel strategy based on the decision tree algorithm was employed to construct classification models that screen out the major characteristics of the ten brain regions at the same time. Subsequently, this method was also utilized to select the major effective brain regions related to alcohol preference and gender difference. Compared with traditional multivariate statistical methods, the decision tree could construct acceptable and understandable classification models for multi-class data analysis. Therefore, the current technology could also be applied to other general metabolomics studies involving multi-class data. Copyright © 2017 Elsevier B.V. All rights reserved.
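As a hedged illustration of the strategy, the sketch below fits a single interpretable decision tree to a multi-class metabolite table; the file names are hypothetical and the hyperparameters are placeholders, not the settings used in the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.model_selection import cross_val_score

# Hypothetical inputs: rows of X are tissue samples, columns are metabolite
# levels; y holds one of the ten brain-region labels per sample.
X = np.load("metabolite_matrix.npy")
y = np.load("region_labels.npy")

# A shallow tree keeps the model interpretable: the top splits expose the
# metabolites that best separate the regions, all classes at once.
tree = DecisionTreeClassifier(max_depth=4, random_state=0)
print("CV accuracy:", cross_val_score(tree, X, y, cv=5).mean())
tree.fit(X, y)
print(export_text(tree))    # human-readable split rules
```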
Verification and Validation in a Rapid Software Development Process
NASA Technical Reports Server (NTRS)
Callahan, John R.; Easterbrook, Steve M.
1997-01-01
The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in their use in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing of the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.
Woldegebriel, Michael; Vivó-Truyols, Gabriel
2016-10-04
A novel method for compound identification in liquid chromatography-high resolution mass spectrometry (LC-HRMS) is proposed. The method, based on Bayesian statistics, accommodates all possible uncertainties involved, from instrumentation up to data analysis, in a single model yielding the probability that the compound of interest is present or absent in the sample. This approach differs from the classical methods in two ways. First, it is probabilistic (instead of deterministic); hence, it computes the probability that the compound is (or is not) present in a sample. Second, it answers the hypothesis "the compound is present", as opposed to answering the question "the compound feature is present". This second difference implies a shift in the way data analysis is tackled, since the probability of interfering compounds (i.e., isomers and isobaric compounds) is also taken into account.
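A minimal toy version of the probabilistic idea, assuming Gaussian measurement models and invented numbers throughout, is sketched below: the m/z and retention-time evidence are combined in one model, and the output is the posterior probability that the compound is present rather than a yes/no feature detection. The real method models many more uncertainty sources than this.

```python
from scipy.stats import norm

def p_present(mz_obs, rt_obs, mz_true, rt_true,
              sigma_mz=0.002, sigma_rt=0.1, prior=0.5):
    """Toy posterior that a target compound is present. Under H1 (present)
    the observed m/z and retention time scatter tightly around the expected
    values; under H0 (absent) they come from much broader background
    distributions. All widths and the prior are invented placeholders."""
    like_h1 = norm.pdf(mz_obs, mz_true, sigma_mz) * norm.pdf(rt_obs, rt_true, sigma_rt)
    like_h0 = norm.pdf(mz_obs, mz_true, 50 * sigma_mz) * norm.pdf(rt_obs, rt_true, 50 * sigma_rt)
    return prior * like_h1 / (prior * like_h1 + (1 - prior) * like_h0)

print(p_present(mz_obs=301.141, rt_obs=5.02, mz_true=301.140, rt_true=5.00))
```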
Minty, B; Ramsey, E D; Davies, I
2000-12-01
A direct aqueous supercritical fluid extraction (SFE) system was developed which can be directly interfaced to an infrared spectrometer for the determination of oil in water. The technique is designed to provide an environmentally clean, automated alternative to established IR methods for oil in water analysis which require the use of restricted organic solvents. The SFE-FTIR method involves minimum sample handling stages, with on-line analysis of a 500 ml water sample being complete within 15 min. Method accuracy for determining water samples spiked with gasoline, white spirit, kerosene, diesel or engine oil was 81-100% with precision (RSD) ranging from 3 to 17%. An independent evaluation determined a 2 ppm limit of quantification for diesel in industrial effluents. The results of a comparative study involving an established IR method and the SFE-FTIR method indicate that oil levels calculated using an accepted equation which includes coefficients derived from reference hydrocarbon standards may result in significant errors. A new approach permitted the derivation of quantification coefficients for the SFE-FTIR analyses which provided improved results. In situations where the identity of the oil to be analysed is known, a rapid off-line SFE-FTIR system calibration procedure was developed and successfully applied to various oils. An optional in-line silica gel clean-up procedure incorporated within the SFE-FTIR system enables the same water sample to be analysed for total oil content including vegetable oils and selectively for petroleum oil content within a total of 20 min. At the end of an analysis the SFE system is cleaned using an in situ 3 min clean cycle.
What does 'race' have to do with medical education research?
Muzzin, Linda; Mickleborough, Tim
2013-08-01
We live in a world of ethnoracial conflict. This is confirmed every day by opening and reading the newspaper. This everyday world seems far away in the pages of a medical education journal, but is it? The goal of this paper is to suggest that one need not look very far in medical education to encounter ethnoracial issues, and further, that research methods that are not ethnoracially biased must be employed to study these topics. We will draw attention to the relevance of employing an ethical conceptual approach to research involving 'race' by demonstrating how one author researching internationally educated health professionals has put 'race' front and centre in his analysis. He does this by using a postcolonial method of analysis termed a 'doubled-research' technique that sets up categories such as 'race' but then decolonizes them to avoid essentialism or stereotyping. We compare this method to another mainstream method employed for the same topic of inquiry which has sidelined 'race' in the analysis, potentially hiding findings about ethnoracial relations involving health professionals in our 'multicultural' society. This demonstration leads to the important question of whether research methods can be epistemologically racist-a question that has been raised about conventional research on education in general. Our argument is not meant to be the last word on this topic, but the first in this journal. We conclude that there is an internal ethics or axiology within research perspectives and methodologies that needs to be examined where ethnoracial issues are prominent. The use of mainstream approaches to undertake research can unintentionally 'leave unsaid' central aspects of what is researched while antiracist methods such as the one described in this article can open up the data to allow for a richer and deeper understanding of the problem. © 2013 John Wiley & Sons Ltd.
Rodriguez-Sabate, Clara; Morales, Ingrid; Sanchez, Alberto; Rodriguez, Manuel
2017-01-01
The complexity of basal ganglia (BG) interactions is often condensed into simple models, mainly based on animal data, that present the BG as closed-loop cortico-subcortical circuits of excitatory/inhibitory pathways which analyze the incoming cortical data and return the processed information to the cortex. This study aimed to identify functional relationships in the BG motor loop of 24 healthy subjects who provided written, informed consent and whose BOLD activity was recorded by MRI methods. The analysis of the functional interaction between these centers by correlation techniques and multiple linear regression showed non-linear relationships which cannot be suitably addressed with these methods. Multiple correspondence analysis (MCA), an unsupervised multivariable procedure which can identify non-linear interactions, was therefore used to study the functional connectivity of the BG while subjects were at rest. Linear methods showed the functional interactions expected according to current BG models. MCA showed additional functional interactions which were not evident when using linear methods. Seven functional configurations of the BG were identified with MCA: two involving the primary motor and somatosensory cortex, one involving the deepest BG (external-internal globus pallidus, subthalamic nucleus and substantia nigra), one with the input-output BG centers (putamen and motor thalamus), two linking the input-output centers with other BG (external pallidum and subthalamic nucleus), and one linking the external pallidum and the substantia nigra. The results provide evidence that non-linear MCA and linear methods are complementary and are best used in conjunction to more fully understand the nature of the functional connectivity of brain centers.
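For readers unfamiliar with MCA, the sketch below is a minimal numpy/pandas implementation of its core computation: correspondence analysis of the one-hot indicator matrix via SVD. It assumes the continuous BOLD signals have already been discretized into categorical levels, and it does not reproduce the paper's full pipeline.

```python
import numpy as np
import pandas as pd

def mca(df, n_comp=2):
    """Minimal multiple correspondence analysis: one-hot encode a table of
    categorical variables, then run correspondence analysis via SVD."""
    Z = pd.get_dummies(df).to_numpy(float)              # indicator matrix
    P = Z / Z.sum()                                     # correspondence matrix
    r, c = P.sum(axis=1), P.sum(axis=0)                 # row / column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    rows = (U[:, :n_comp] * s[:n_comp]) / np.sqrt(r)[:, None]  # row coordinates
    cols = (Vt[:n_comp].T * s[:n_comp]) / np.sqrt(c)[:, None]  # category coordinates
    return rows, cols, s[:n_comp] ** 2                  # plus principal inertias
```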
Ortiz-Boyer, F; Tena, M T; Luque de Castro, M D; Valcárcel, M
1995-10-01
Methods are reported for the determination of tyrothricin and benzocaine by HPLC and menthol by GC in the analysis of throat lozenges (tablets) containing all three compounds. After optimization of the variables involved in both HPLC and GC the methods have been characterized and validated according to the guidelines of the Spanish Pharmacopoeia, and applied to both the monitoring of the manufacturing process and the quality control of the final product.
Reducing maintenance costs in agreement with CNC machine tools reliability
NASA Astrophysics Data System (ADS)
Ungureanu, A. L.; Stan, G.; Butunoi, P. A.
2016-08-01
Aligning maintenance strategy with reliability is a challenge due to the need to find an optimal balance between them. Because the various methods described in the relevant literature involve laborious calculations or use of software that can be costly, this paper proposes a method that is easier to implement on CNC machine tools. The new method, called the Consequence of Failure Analysis (CFA) is based on technical and economic optimization, aimed at obtaining a level of required performance with minimum investment and maintenance costs.
Computational fluid dynamics combustion analysis evaluation
NASA Technical Reports Server (NTRS)
Kim, Y. M.; Shang, H. M.; Chen, C. P.; Ziebarth, J. P.
1992-01-01
This study involves the development of numerical modelling in spray combustion. These modelling efforts are mainly motivated by the need to improve the computational efficiency of the stochastic particle tracking method and to incorporate physical submodels of turbulence, combustion, vaporization, and dense spray effects. The present mathematical formulation and numerical methodologies can be cast in any time-marching pressure correction methodology (PCM), such as the FDNS code and the MAST code. A sequence of validation cases involving steady burning sprays and transient evaporating sprays will be included.
The processing and transmission of EEG data
NASA Technical Reports Server (NTRS)
Schulze, A. E.
1974-01-01
Interest in sleep research was stimulated by the discovery of a number of physiological changes that occur during sleep and by the observed effects of sleep on physical and mental performance and status. The use of the relatively new methods of EEG measurement, transmission, and automatic scoring makes sleep analysis and categorization feasible. Sleep research involving the use of the EEG as a fundamental input has the potential of answering many unanswered questions involving physical and mental behavior, drug effects, circadian rhythm, and anesthesia.
An Analysis of Motivation Factors for Students’ Pursuit of Leadership Positions
McLaughlin, Milena M.; Gettig, Jacob P.; Fajiculay, Jay R.; Advincula, M. Renee
2015-01-01
Objective. To identify factors that influence student involvement and leadership within organizations and to assess the impact of involvement in organizations on professional skill development. Methods. A printed survey was administered to fourth-year pharmacy students at one college of pharmacy (N=202). Results. Most students (82%) indicated they were involved in at least one organization during pharmacy school and 58% reported holding a leadership position at some point. Factors with the largest impact on involvement in organizations were desire to present a well-rounded image to employers, ability to network, and interest in the activities sponsored by the organization. Involvement in professional organizations had a strong influence on their leadership, teamwork, confidence, and time-management skills. Conclusion. That presenting a well-rounded image to employers and having the ability to network with mentors and peers drove student involvement in professional organizations may be reflective of increasing competition for residencies and jobs. PMID:25741024
Cleanliness evaluation of rough surfaces with diffuse IR reflectance
NASA Technical Reports Server (NTRS)
Pearson, L. H.
1995-01-01
Contamination on bonding surfaces has been determined to be a primary cause of degraded bond strength in certain solid rocket motor bondlines. Hydrocarbon- and silicone-based organic contaminants that are airborne or directly introduced to a surface are a significant source of contamination. Diffuse infrared (IR) reflectance has historically been an effective technique for the detection of organic contaminants; however, common laboratory methods involving the use of a Fourier transform IR spectrometer (FTIR) are impractical for inspecting the large bonding surface areas found on solid rocket motors. Optical methods involving the use of acousto-optic tunable filters and fixed bandpass optical filters are recommended for increased data acquisition speed. Testing and signal analysis methods are presented which provide for simultaneous measurement of contamination concentration and roughness level on rough metal surfaces contaminated with hydrocarbons.
Zhang, Shangjian; Wang, Heng; Zou, Xinhai; Zhang, Yali; Lu, Rongguo; Liu, Yong
2015-06-15
An extinction-ratio-independent electrical method is proposed for measuring the chirp parameters of Mach-Zehnder electro-optic intensity modulators, based on frequency-shifted optical heterodyne detection. The method utilizes electrical spectrum analysis of the heterodyne products between the intensity-modulated optical signal and the frequency-shifted optical carrier, and achieves intrinsic chirp-parameter measurement in the microwave region with high frequency resolution and a wide frequency range for Mach-Zehnder modulators with a finite extinction ratio. Moreover, the proposed method avoids calibrating the responsivity fluctuation of the photodiode in spite of the photodetection involved. Chirp parameters as a function of modulation frequency are experimentally measured and compared to those obtained with the conventional optical spectrum analysis method. Our method enables an extinction-ratio-independent and calibration-free electrical measurement of Mach-Zehnder intensity modulators by using the high-resolution frequency-shifted heterodyne technique.
Van Cutsem, Emmanuel; Simonart, Géraldine; Degand, Hervé; Faber, Anne-Marie; Morsomme, Pierre; Boutry, Marc
2011-02-01
Nicotiana tabacum leaves are covered by trichomes involved in the secretion of large amounts of secondary metabolites, some of which play a major role in plant defense. However, little is known about the metabolic pathways that operate in these structures. We undertook a proteomic analysis of N. tabacum trichomes in order to identify their protein complement. Efficient trichome isolation was obtained by abrading frozen leaves. After homogenization, soluble proteins and a microsomal fraction were prepared by centrifugation. Gel-based and gel-free proteomic analyses were then performed. 2-DE analysis of soluble proteins led to the identification of 1373 protein spots, which were digested and analyzed by MS/MS, leading to 680 unique identifications. Both soluble proteins and microsomal fraction were analyzed by LC MALDI-MS/MS after trypsin digestion, leading to 858 identifications, many of which had not been identified after 2-DE, indicating that the two methods complement each other. Many enzymes putatively involved in secondary metabolism were identified, including enzymes involved in the synthesis of terpenoid precursors and in acyl sugar production. Several transporters were also identified, some of which might be involved in secondary metabolite transport. Various (a)biotic stress response proteins were also detected, supporting the role of trichomes in plant defense. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A wavelet-based technique to predict treatment outcome for Major Depressive Disorder
Xia, Likun; Mohd Yasin, Mohd Azhar; Azhar Ali, Syed Saad
2017-01-01
Treatment management for Major Depressive Disorder (MDD) has been challenging. However, electroencephalogram (EEG)-based predictions of antidepressant treatment outcome may help during antidepressant selection and ultimately improve the quality of life for MDD patients. In this study, a machine learning (ML) method involving pretreatment EEG data was proposed to perform such predictions for Selective Serotonin Reuptake Inhibitors (SSRIs). For this purpose, the acquisition of experimental data involved 34 MDD patients and 30 healthy controls. A feature matrix, termed the EEG data matrix, was constructed from the time-frequency decomposition of the EEG data based on wavelet transform (WT) analysis. Because the resultant EEG data matrix had high dimensionality, dimension reduction was performed with a rank-based feature selection method according to a criterion, i.e., the receiver operating characteristic (ROC). As a result, the most significant features were identified and further utilized during the training and testing of a classification model, i.e., a logistic regression (LR) classifier. Finally, the LR model was validated with 100 iterations of 10-fold cross-validation (10-CV). The classification results were compared with short-time Fourier transform (STFT) analysis and empirical mode decomposition (EMD). The wavelet features extracted from frontal and temporal EEG data were found statistically significant. In comparison with other time-frequency approaches such as STFT and EMD, the WT analysis showed the highest classification performance, i.e., accuracy = 87.5%, sensitivity = 95%, and specificity = 80%. In conclusion, significant wavelet coefficients extracted from frontal and temporal pre-treatment EEG data involving the delta and theta frequency bands may predict antidepressant treatment outcome for MDD patients. PMID:28152063
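A hedged sketch of this kind of pipeline is shown below: wavelet-band energies as features, a rank-based selection step, and a cross-validated logistic regression. The file names are hypothetical, and the ANOVA F-score stands in here for the ROC-based ranking criterion used in the paper.

```python
import numpy as np
import pywt
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical inputs: eeg is (n_subjects, n_samples) pre-treatment EEG from
# one frontal channel; y marks treatment responders (1) vs non-responders (0).
eeg = np.load("pretreatment_eeg.npy")
y = np.load("response_labels.npy")

def wavelet_energies(sig, wavelet="db4", level=5):
    """Energy of each wavelet band as a compact time-frequency feature."""
    return np.array([np.sum(c ** 2) for c in pywt.wavedec(sig, wavelet, level=level)])

X = np.vstack([wavelet_energies(s) for s in eeg])

# Rank-based feature selection, then logistic regression, scored by 10-fold CV.
clf = make_pipeline(SelectKBest(f_classif, k=4), LogisticRegression(max_iter=1000))
print("10-CV accuracy:", cross_val_score(clf, X, y, cv=10).mean())
```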
On the use of haplotype phylogeny to detect disease susceptibility loci
Bardel, Claire; Danjean, Vincent; Hugot, Jean-Pierre; Darlu, Pierre; Génin, Emmanuelle
2005-01-01
Background The cladistic approach proposed by Templeton has been presented as promising for the study of the genetic factors involved in common diseases. This approach allows the joint study of multiple markers within a gene by considering haplotypes and grouping them in nested clades. The idea is to search for clades with an excess of cases as compared to the whole sample and to identify the mutations defining these clades as potential candidate disease susceptibility sites. However, the performance of this approach for the study of the genetic factors involved in complex diseases has never been studied. Results In this paper, we propose a new method to perform such a cladistic analysis and we estimate its power through simulations. We show that under models where the susceptibility to the disease is caused by a single genetic variant, the cladistic test is neither substantially more powerful in detecting an association nor more efficient in localizing the susceptibility site than individual SNP testing. However, when two interacting sites are responsible for the disease, the cladistic analysis greatly improves the probability of finding the two susceptibility sites. The impact of linkage disequilibrium and of the tree characteristics on the efficiency of the cladistic analysis is also discussed. An application to a real data set concerning the CARD15 gene and Crohn disease shows that the method can successfully identify the three variant sites that are involved in the disease susceptibility. Conclusion The use of phylogenies to group haplotypes is especially interesting to pinpoint the sites that are likely to be involved in disease susceptibility among the different markers identified within a gene. PMID:15904492
NASA Astrophysics Data System (ADS)
Yarmohammadi, M.; Javadi, S.; Babolian, E.
2018-04-01
In this study a new spectral iterative method (SIM) based on fractional interpolation is presented for solving nonlinear fractional differential equations (FDEs) involving the Caputo derivative. This method is equipped with a pre-algorithm to find the singularity index of the solution of the problem. This pre-algorithm gives us a real parameter as the index of the fractional interpolation basis, for which the SIM achieves the highest order of convergence. In comparison with some recent results about the error estimates for fractional approximations, a more accurate convergence rate has been attained. We have also proposed the order of convergence for the fractional interpolation error under the L^2-norm. Finally, a general error analysis of the SIM has been considered. The numerical results clearly demonstrate the capability of the proposed method.
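For reference, the Caputo derivative on which such FDEs are based has the standard textbook definition (a known formula, not a result of this paper):

```latex
D^{\alpha} y(t) = \frac{1}{\Gamma(n-\alpha)} \int_{0}^{t} (t-s)^{\,n-\alpha-1}\, y^{(n)}(s)\, \mathrm{d}s,
\qquad n-1 < \alpha < n, \quad n \in \mathbb{N}
```

For 0 < alpha < 1 this reduces to a weighted integral of the first derivative, which is why the singularity of the solution near t = 0 matters for the choice of interpolation basis.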
Structural determination of intact proteins using mass spectrometry
Kruppa, Gary [San Francisco, CA; Schoeniger, Joseph S [Oakland, CA; Young, Malin M [Livermore, CA
2008-05-06
The present invention relates to novel methods of determining the sequence and structure of proteins. Specifically, the present invention allows for the analysis of intact proteins within a mass spectrometer. Therefore, preparatory separations need not be performed prior to introducing a protein sample into the mass spectrometer. Also disclosed herein are new instrumental developments for enhancing the signal from the desired modified proteins, methods for producing controlled protein fragments in the mass spectrometer, eliminating complex microseparations, and protein preparatory chemical steps necessary for cross-linking based protein structure determination. Additionally, the preferred method of the present invention involves the determination of protein structures utilizing a top-down analysis of protein structures to search for covalent modifications. In the preferred method, intact proteins are ionized and fragmented within the mass spectrometer.
A Meta-analysis of Cerebellar Contributions to Higher Cognition from PET and fMRI studies
Keren-Happuch, E; Chen, Shen-Hsing Annabel; Ho, Moon-Ho Ringo; Desmond, John E.
2013-01-01
A growing interest in cerebellar function and its involvement in higher cognition has prompted much research in recent years. A cerebellar presence has been observed in a wide range of cognitive functions examined within an increasing body of neuroimaging literature. We applied a meta-analytic approach, which employed the activation likelihood estimate method, to consolidate results of cerebellar involvement accumulated in different cognitive tasks of interest and systematically identified similarities among the studies. The current analysis included 88 neuroimaging studies demonstrating cerebellar activations in higher cognitive domains involving emotion, executive function, language, music, timing and working memory. While largely consistent with a prior meta-analysis by Stoodley and Schmahmann (2009), our results extended their findings to include the music and timing domains to provide further insights into cerebellar involvement and elucidate its role in higher cognition. In addition, we conducted inter- and intra-domain comparisons for the cognitive domains of emotion, language and working memory. We also considered task differences within the domain of verbal working memory by conducting a comparison of the Sternberg with the n-back task, as well as an analysis of the differential components within the Sternberg task. Results showed a consistent cerebellar presence in the timing domain, providing evidence for a role in time keeping. Unique clusters identified within the domain further refine the topographic organization of the cerebellum. PMID:23125108
Contact stress analysis of spiral bevel gears using nonlinear finite element static analysis
NASA Technical Reports Server (NTRS)
Bibel, G. D.; Kumar, A.; Reddy, S.; Handschuh, R.
1993-01-01
A procedure is presented for performing three-dimensional stress analysis of spiral bevel gears in mesh using the finite element method. The procedure involves generating a finite element model by solving equations that identify tooth surface coordinates. Coordinate transformations are used to orientate the gear and pinion for gear meshing. Contact boundary conditions are simulated with gap elements. A solution technique for correct orientation of the gap elements is given. Example models and results are presented.
Conceptual designs for in situ analysis of Mars soil
NASA Technical Reports Server (NTRS)
Mckay, C. P.; Zent, A. P.; Hartman, H.
1991-01-01
A goal of this research is to develop conceptual designs for instrumentation to perform in situ measurements of the Martian soil in order to determine the existence and nature of any reactive chemicals. Our approach involves assessment and critical review of the Viking biology results which indicated the presence of a soil oxidant, an investigation of the possible application of standard soil science techniques to the analysis of Martian soil, and a preliminary consideration of non-standard methods that may be necessary for use in the highly oxidizing Martian soil. Based on our preliminary analysis, we have developed strawman concepts for standard soil analysis on Mars, including pH, suitable for use on a Mars rover mission. In addition, we have devised a method for the determination of the possible strong oxidants on Mars.
Statistical Discourse Analysis: A Method for Modelling Online Discussion Processes
ERIC Educational Resources Information Center
Chiu, Ming Ming; Fujita, Nobuko
2014-01-01
Online forums (synchronous and asynchronous) offer exciting data opportunities to analyze how people influence one another through their interactions. However, researchers must address several analytic difficulties involving the data (missing values, nested structure [messages within topics], non-sequential messages), outcome variables (discrete…
Remote Sensing as a Demonstration of Applied Physics.
ERIC Educational Resources Information Center
Colwell, Robert N.
1980-01-01
Provides information about the field of remote sensing, including discussions of geo-synchronous and sun-synchronous remote-sensing platforms, the actual physical processes and equipment involved in sensing, the analysis of images by humans and machines, and inexpensive, small scale methods, including aerial photography. (CS)
Characterizing Preservice Teachers' Mathematical Understanding of Algebraic Relationships
ERIC Educational Resources Information Center
Nillas, Leah A.
2010-01-01
Qualitative research methods were employed to investigate the characterization of preservice teachers' mathematical understanding. Responses on test items involving algebraic relationships were analyzed using within-case analysis (Miles and Huberman, 1994) and Pirie and Kieren's (1994) model of growth of mathematical understanding. Five elementary…
GPA, GMAT, and Scale: A Method of Quantification of Admissions Criteria.
ERIC Educational Resources Information Center
Sobol, Marion G.
1984-01-01
Multiple regression analysis was used to establish a scale, measuring college student involvement in campus activities, work experience, technical background, references, and goals. This scale was tested to see whether it improved the prediction of success in graduate school. (Author/MLW)
Bioinformatics and the Undergraduate Curriculum
ERIC Educational Resources Information Center
Maloney, Mark; Parker, Jeffrey; LeBlanc, Mark; Woodard, Craig T.; Glackin, Mary; Hanrahan, Michael
2010-01-01
Recent advances involving high-throughput techniques for data generation and analysis have made familiarity with basic bioinformatics concepts and programs a necessity in the biological sciences. Undergraduate students increasingly need training in methods related to finding and retrieving information stored in vast databases. The rapid rise of…
Kishimoto, Toru; Wanikawa, Akira; Kagami, Noboru; Kawatsura, Katsuyuki
2005-06-15
Hop aroma components, which mainly comprise terpenoids, contribute to the character of beers. However, pretreatments are necessary before analyzing these components because of their trace levels and complicated matrices. Here, the stir bar-sorptive extraction (SBSE) method was used to detect and quantify many terpenoids simultaneously from small samples. This simple technique showed low coefficients of variation, high accuracy, and low detection limits. An investigation of the behavior of terpenoids identified two distinct patterns of decreasing concentration during wort boiling. The first, seen in myrcene and linalool, involved a rapid decrease that was best fitted by a quadratic curve. The second, observed in beta-eudesmol, humulene, humulene epoxide I, beta-farnesene, caryophyllene, and geraniol, involved a gentle linear decrease. Conversely, the concentration of beta-damascenone increased after boiling. As the aroma composition depended on the hop variety, we also examined the relationship between terpenoid content and sensory analysis in beer.
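The two decay patterns can be distinguished with a simple curve-fitting comparison. The sketch below, using invented concentration values purely for illustration, fits both a linear and a quadratic model to one boiling time series and compares residuals; it is not the paper's data or analysis script.

```python
import numpy as np

# Invented boiling-time series: minutes of wort boiling vs concentration
# (arbitrary units) for one terpenoid.
t = np.array([0, 15, 30, 45, 60, 75, 90], dtype=float)
c = np.array([100, 55, 28, 14, 8, 5, 4], dtype=float)

lin = np.polyfit(t, c, 1)     # gentle linear decrease (humulene-like pattern)
quad = np.polyfit(t, c, 2)    # rapid decrease fitted by a quadratic (myrcene-like)

# Compare residual sums of squares to see which pattern fits this series better.
for name, coef in (("linear", lin), ("quadratic", quad)):
    rss = np.sum((c - np.polyval(coef, t)) ** 2)
    print(name, round(rss, 2))
```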
Measuring and Estimating Normalized Contrast in Infrared Flash Thermography
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2013-01-01
Infrared flash thermography (IRFT) is used to detect void-like flaws in a test object. The IRFT technique involves heating the part surface with a flash from flash lamps. The post-flash evolution of the part surface temperature is sensed by an IR camera in terms of the pixel intensity of image pixels. The technique involves recording the IR video image data and analyzing the data using the normalized pixel intensity and temperature contrast method to characterize void-like flaws for depth and width. This work introduces a new definition of the normalized IR pixel intensity contrast and the normalized surface temperature contrast. A procedure is provided to compute the pixel intensity contrast from the camera pixel intensity evolution data. The pixel intensity contrast and the corresponding surface temperature contrast differ but are related. This work provides a method to estimate the temperature evolution and the normalized temperature contrast from the measured pixel intensity evolution data and some additional measurements during data acquisition.
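The paper's new contrast definition is not reproduced here, but a common textbook normalization conveys the idea: the defect-pixel intensity evolution is referenced to a sound (defect-free) region of the same sequence. The sketch below assumes a hypothetical post-flash image stack and pixel locations.

```python
import numpy as np

def normalized_contrast(stack, defect_ij, ref_ij):
    """One common flash-thermography normalization (not necessarily the
    paper's new definition): the defect-pixel intensity evolution relative
    to a sound reference pixel of the same post-flash sequence.
    stack: (n_frames, height, width) IR image sequence."""
    i_def = stack[:, defect_ij[0], defect_ij[1]].astype(float)
    i_ref = stack[:, ref_ij[0], ref_ij[1]].astype(float)
    return (i_def - i_ref) / i_ref    # contrast evolution over time
```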
ERIC Educational Resources Information Center
Benítez, Isabel; Padilla, José-Luis
2014-01-01
Differential item functioning (DIF) can undermine the validity of cross-lingual comparisons. While a lot of efficient statistics for detecting DIF are available, few general findings have been found to explain DIF results. The objective of the article was to study DIF sources by using a mixed method design. The design involves a quantitative phase…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nose, Y.
Methods were developed for generating an integrated, statistical model of the anatomical structures within the human thorax relevant to radioisotope powered artificial heart implantation. These methods involve measurement and analysis of anatomy in four areas: chest wall, pericardium, vascular connections, and great vessels. A model for the prediction of thorax outline from radiograms was finalized. These models were combined with 100 radiograms to arrive at a size distribution representing the adult male and female populations. (CH)
Olokundun, Maxwell; Moses, Chinonye Love; Iyiola, Oluwole; Ibidunni, Stephen; Ogbari, Mercy; Peter, Fred; Borishade, Taiye
2018-08-01
Traditional methods of teaching entrepreneurship in universities involve mostly theoretical approaches, which are less effective in motivating students to consider an entrepreneurship career. This is because such techniques tend to foster a passive attitude in students rather than active participation. Expert views suggest that experiential entrepreneurship teaching methods in universities, which involve practical activities and active participation, are salient to students' development of entrepreneurial interest and business startup potential. The present study presents data on the extent to which the experiential teaching methods in entrepreneurship adopted by Nigerian universities stimulate students' entrepreneurial interest and business startups. Data were gathered in a descriptive cross-sectional quantitative survey conducted among university students (N = 600) of four selected institutions in Nigeria offering a degree programme in entrepreneurship. Hierarchical multiple regression analysis was used to test the hypothesis proposed in the study, using the Statistical Package for the Social Sciences (SPSS) version 22. The findings from the analysis showed that the adoption of experiential practical activities, considered best practices in entrepreneurship teaching in Nigerian universities, can stimulate students' interest and drive to engage in business start-up activities even as undergraduates. The field data set is made extensively available to allow for critical investigation.
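Hierarchical multiple regression means entering predictor blocks in stages and testing the change in explained variance. A hedged Python sketch of that procedure (with hypothetical variable names, not the study's actual instrument scales) is shown below using statsmodels.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey data: startup_drive is the outcome; block 1 enters
# control variables, block 2 adds the experiential-teaching score.
df = pd.read_csv("survey.csv")

m1 = smf.ols("startup_drive ~ age + gender", data=df).fit()
m2 = smf.ols("startup_drive ~ age + gender + experiential_score", data=df).fit()

# The hierarchical step: does the added block raise the explained variance?
print("R-squared change:", m2.rsquared - m1.rsquared)
print(m2.compare_f_test(m1))    # F statistic, p-value, df difference
```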
1980-01-01
standard procedure for analysis of all types of civil engineering structures. Early in its development, it became apparent that this method had unique potentialities in the evaluation of stress in dams, and many of its earliest civil engineering applications concerned special problems associated with such structures [3,4]. The earliest dynamic finite element analyses of civil engineering structures involved the earthquake response analysis of
The Effect of Multispectral Image Fusion Enhancement on Human Efficiency
2017-03-20
human visual system by applying a technique commonly used in visual perception research: ideal observer analysis. Using this approach, we establish ... applications, analytic techniques, and procedural methods used across studies. This paper uses ideal observer analysis to establish a framework that allows ... augmented similarly to incorporate research involving more complex stimulus content. Additionally, the ideal observer can be adapted for a number of
Rhebergen, Martijn D F; Visser, Maaike J; Verberk, Maarten M; Lenderink, Annet F; van Dijk, Frank J H; Kezic, Sanja; Hulshof, Carel T J
2012-10-01
We compared three common user involvement methods in revealing barriers and facilitators from intended users that might influence their use of a new genetic test. The study was part of the development of a new genetic test on the susceptibility to hand eczema for nurses. Eighty student nurses participated in five focus groups (n = 33), 15 interviews (n = 15) or questionnaires (n = 32). For each method, data were collected until saturation. We compared the mean number of items and relevant remarks that could influence the use of the genetic test obtained per method, divided by the number of participants in that method. Thematic content analysis was performed using MAXQDA software. The focus groups revealed 30 unique items compared to 29 in the interviews and 21 in the questionnaires. The interviews produced more items and relevant remarks per participant (1.9 and 8.4 pp) than focus groups (0.9 and 4.8 pp) or questionnaires (0.7 and 2.3 pp). All three involvement methods revealed relevant barriers and facilitators to use a new genetic test. Focus groups and interviews revealed substantially more items than questionnaires. Furthermore, this study suggests a preference for the use of interviews because the number of items per participant was higher than for focus groups and questionnaires. This conclusion may be valid for other genetic tests as well.
Groene, Oliver; Klazinga, Niek; Wagner, Cordula; Arah, Onyebuchi A; Thompson, Andrew; Bruneau, Charles; Suñol, Rosa
2010-09-24
Hospitals in European countries apply a wide range of quality improvement strategies. Knowledge of the effectiveness of these strategies, implemented as part of an overall hospital quality improvement system, is limited. We propose to study the relationships among organisational quality improvement systems, patient empowerment, organisational culture, professionals' involvement and the quality of hospital care, including clinical effectiveness, patient safety and patient involvement. We will employ a cross-sectional, multi-level study design in which patient-level measurements are nested in hospital departments, which are in turn nested in hospitals in different EU countries. Mixed methods will be used for data collection, measurement and analysis. Hospital/care pathway level constructs that will be assessed include external pressure, hospital governance, quality improvement system, patient empowerment in quality improvement, organisational culture and professional involvement. These constructs will be assessed using questionnaires. Patient-level constructs include clinical effectiveness, patient safety and patient involvement, and will be assessed using audit of patient records, routine data and patient surveys. For the assessment of hospital and pathway level constructs we will collect data from randomly selected hospitals in eight countries. For a sample of hospitals in each country we will carry out additional data collection at the patient level related to four conditions (stroke, acute myocardial infarction, hip fracture and delivery). In addition, structural components of quality improvement systems will be assessed using visits by experienced external assessors. Data analysis will include descriptive statistics and graphical representations, and methods for data reduction, classification techniques and psychometric analysis, before moving to bi-variate and multivariate analysis. The latter will be conducted at the hospital level and at multiple levels. In addition, we will apply sophisticated methodological elements such as the use of causal diagrams, outcome modelling, doubly robust estimation and detailed sensitivity analysis or multiple bias analyses to assess the impact of the various sources of bias. Products of the project will include a catalogue of instruments and tools that can be used to build a departmental or hospital quality and safety programme, and an appraisal scheme to assess the maturity of the quality improvement system, for use by hospitals and by purchasers to contract hospitals.
Santos, Eliane Macedo Sobrinho; Santos, Hércules Otacílio; Dos Santos Dias, Ivoneth; Santos, Sérgio Henrique; Batista de Paula, Alfredo Maurício; Feltenberger, John David; Sena Guimarães, André Luiz; Farias, Lucyana Conceição
2016-01-01
The pathogenesis of odontogenic tumors is not well known, so it is important to identify the genetic deregulations and molecular alterations involved. This study aimed to investigate, through bioinformatic analysis, the possible genes involved in the pathogenesis of ameloblastoma (AM) and keratocystic odontogenic tumor (KCOT). Genes involved in the pathogenesis of AM and KCOT were identified in GeneCards. The gene list was expanded, and the gene interaction network was mapped using the STRING software. The "weighted number of links" (WNL) was calculated to identify "leader genes" (those with the highest WNL). Genes were ranked by the K-means method, and the Kruskal-Wallis test was used (P<0.001). A total interactions score (TIS) was also calculated using all interaction data generated by the STRING database, in order to achieve global connectivity for each gene. Topological and ontological analyses were performed using Cytoscape software and the BinGO plugin. Literature review data were used to corroborate the bioinformatics data. CDK1 was identified as the leader gene for AM. In the KCOT group, the results point to PCNA and TP53. Both tumors exhibit power-law behavior. Our topological analysis suggested leader genes possibly important in the pathogenesis of AM and KCOT, based on the clustering coefficients calculated for both odontogenic tumors (0.028 for AM, zero for KCOT). The results obtained in the scatter diagram suggest an important relationship of these genes with the molecular processes involved in AM and KCOT. Ontological analysis demonstrated different mechanisms for AM and KCOT. The bioinformatics analyses were confirmed through literature review. These results suggest the involvement of promising genes for a better understanding of the pathogenesis of AM and KCOT.
Goulart Coelho, Lineker M; Lange, Liséte C; Coelho, Hosmanny Mg
2017-01-01
Solid waste management is a complex domain involving the interaction of several dimensions; thus, its analysis and control impose continuous challenges for decision makers. In this context, multi-criteria decision-making models have become important and convenient supporting tools for solid waste management because they can handle problems involving multiple dimensions and conflicting criteria. However, the selection of the multi-criteria decision-making method is a hard task since there are several multi-criteria decision-making approaches, each one with a large number of variants whose applicability depends on information availability and the aim of the study. Therefore, to support researchers and decision makers, the objectives of this article are to present a literature review of multi-criteria decision-making applications used in solid waste management, offer a critical assessment of the current practices, and provide suggestions for future works. A brief review of fundamental concepts on this topic is first provided, followed by the analysis of 260 articles related to the application of multi-criteria decision making in solid waste management. These studies were investigated in terms of the methodology, including specific steps such as normalisation, weighting, and sensitivity analysis. In addition, information related to waste type, the study objective, and aspects considered was recorded. From the articles analysed it is noted that studies using multi-criteria decision making in solid waste management are predominantly addressed to problems related to municipal solid waste involving facility location or management strategy.
NASA Astrophysics Data System (ADS)
Szafranko, E.
2017-08-01
When planning a building structure, dilemmas arise as to which construction and material solutions are feasible, and the decisions are not always obvious. A procedure for selecting the variant that will best satisfy the expectations of the investor and future users of a structure must be founded on mathematical methods. The following deserve special attention: MCE methods, hierarchical analysis methods and weighting methods. Another interesting solution, particularly useful when dealing with evaluations which take into account negative values, is the Indicator Method. MCE methods are relatively popular owing to the simplicity of the calculations and the ease of interpretation of the results. Once the input data have been properly prepared, these methods enable the user to compare the variants on the same level. In a situation where an analysis involves a large amount of data, it is more convenient to divide the data into groups according to main criteria and subcriteria. This option is provided by hierarchical analysis methods, which are based on ordered sets of criteria evaluated in groups. In some cases, this approach yields results that are superior and easier to read. If an analysis encompasses direct and indirect effects, an Indicator Method is a justified choice for selecting the right solution. The Indicator Method is different in character and relies on weights and assessments of effects; it allows the user to evaluate the analyzed variants effectively. This article explains the methodology of conducting a multi-criteria analysis, showing its advantages and disadvantages. An example of calculations contained in the article shows what problems can be encountered when assessing various solutions regarding building materials and structures. For comparison, an analysis based on graphical methods developed by the author is presented.
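The core of a weighting (weighted-sum) MCE evaluation is small enough to sketch. The example below, with invented scores and weights purely for illustration, normalizes a decision matrix column-wise and ranks the variants by their weighted aggregate.

```python
import numpy as np

# Invented decision matrix: rows = design variants, columns = criteria
# (e.g. cost, durability, build time), oriented so that higher is better.
scores = np.array([[0.70, 0.90, 0.40],
                   [0.85, 0.60, 0.75],
                   [0.55, 0.80, 0.90]])
weights = np.array([0.5, 0.3, 0.2])    # importance assigned by the investor

norm = scores / scores.max(axis=0)     # simple column-wise normalization
ranking = norm @ weights               # weighted-sum aggregate per variant
print("aggregate scores:", ranking, "best variant:", int(ranking.argmax()))
```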
Can the impact of public involvement on research be evaluated? A mixed methods study.
Barber, Rosemary; Boote, Jonathan D; Parry, Glenys D; Cooper, Cindy L; Yeeles, Philippa; Cook, Sarah
2012-09-01
Public involvement is central to health and social research policies, yet few systematic evaluations of its impact have been carried out, raising questions about the feasibility of evaluating the impact of public involvement. To investigate whether it is feasible to evaluate the impact of public involvement on health and social research. Mixed methods including a two-round Delphi study with pre-specified 80% consensus criterion, with follow-up interviews. UK and international panellists came from different settings, including universities, health and social care institutions and charitable organizations. They comprised researchers, members of the public, research managers, commissioners and policy makers, self-selected as having knowledge and/or experience of public involvement in health and/or social research; 124 completed both rounds of the Delphi process. A purposive sample of 14 panellists was interviewed. Consensus was reached that it is feasible to evaluate the impact of public involvement on 5 of 16 impact issues: identifying and prioritizing research topics, disseminating research findings and on key stakeholders. Qualitative analysis revealed the complexities of evaluating a process that is subjective and socially constructed. While many panellists believed that it is morally right to involve the public in research, they also considered that it is appropriate to evaluate the impact of public involvement. This study found consensus among panellists that it is feasible to evaluate the impact of public involvement on some research processes, outcomes and on key stakeholders. The value of public involvement and the importance of evaluating its impact were endorsed. © 2011 Blackwell Publishing Ltd.
Detrended fluctuation analysis for major depressive disorder.
Mumtaz, Wajid; Malik, Aamir Saeed; Ali, Syed Saad Azhar; Yasin, Mohd Azhar Mohd; Amin, Hafeezullah
2015-01-01
The clinical utility of electroencephalography (EEG)-based diagnostic studies is less clear for major depressive disorder (MDD). In this paper, a novel machine learning (ML) scheme is presented to discriminate between MDD patients and healthy controls. The proposed method inherently involves feature extraction, selection, classification and validation. The EEG data acquisition involved eyes-closed (EC) and eyes-open (EO) conditions. At the feature extraction stage, detrended fluctuation analysis (DFA) was performed on the EEG data to obtain scaling exponents. The DFA was performed to analyze the presence or absence of long-range temporal correlations (LRTC) in the recorded EEG data. The scaling exponents were used as input features to our proposed system. At the feature selection stage, 3 different techniques were used for comparison purposes. A logistic regression (LR) classifier was employed, and the method was validated by 10-fold cross-validation. As a result, we observed the effect of 3 different reference montages on the computed features. The results show that the DFA analysis performed better on the LE data compared with the IR and AR data, whereas in the Wilcoxon ranking, the AR performed better than the LE and IR. Based on the results, it was concluded that DFA provides useful information to discriminate MDD patients and, with further validation, can be employed in clinics for the diagnosis of MDD.
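A minimal DFA implementation is sketched below to make the feature-extraction step concrete: integrate the mean-removed signal, detrend it within windows of increasing size, and take the log-log slope of the fluctuation function as the scaling exponent. It assumes a reasonably long 1-D signal and is an illustration, not the study's exact code.

```python
import numpy as np

def dfa_exponent(x, scales=None):
    """Detrended fluctuation analysis of a 1-D signal. Returns the scaling
    exponent alpha (~0.5 for white noise; >0.5 indicates long-range
    temporal correlations). Assumes x has at least a few thousand samples."""
    x = np.asarray(x, float)
    y = np.cumsum(x - x.mean())                       # integrated profile
    if scales is None:
        scales = np.unique(np.geomspace(16, len(x) // 4, 12).astype(int))
    flucts = []
    for s in scales:
        n_win = len(y) // s
        segs = y[:n_win * s].reshape(n_win, s)
        t = np.arange(s)
        # Remove a linear trend in each window, then average the squared residuals.
        ms = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
              for seg in segs]
        flucts.append(np.sqrt(np.mean(ms)))
    # Slope of log F(s) versus log s is the DFA exponent.
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]
```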
NASA Astrophysics Data System (ADS)
Nasution, A. H.; Rachmawan, Y. A.
2018-04-01
Fashion trends in the world change extremely fast, and fashion has become part of people's lifestyle worldwide. Fashion week events in several areas can serve as a measurement of current fashion trends. In Indonesia, a fashion week event called Jakarta Fashion Week (JFW) aims to show fashion trends to people who want to improve their fashion style. People will join an event if it creates involvement for them, and they will then come to that event again and again. An annual, continuous event is very important for creating loyalty among the people involved in it, in order to foster positive development for the organizer in staging the next event, save a large share of the marketing budget, and create a higher-quality event. This study aims to determine the effect of the five brand personality dimensions on event involvement and loyalty at Jakarta Fashion Week (JFW). The study uses a quantitative confirmatory method with the Structural Equation Model (SEM) analysis technique. The sample is 150 respondents who participated in Jakarta Fashion Week 2017. Results show a significant effect of the five brand personality dimensions on three dimensions of event involvement and on loyalty. Meanwhile, one dimension of event involvement, personal self-expression, had no effect on loyalty.
DOT National Transportation Integrated Search
2011-02-01
An understanding of traffic flow in time and space is fundamental to the development of strategies for the efficient use of the existing transportation infrastructure in large metropolitan areas. Thus, this project involved developing the methods...
Institute for Defense Analysis. Annual Report 1995.
1995-01-01
staff have been involved in the community-wide development of MPI as well as in its application to specific NSA problems. Parallel Groebner Basis Code (Symbolic Computing on Parallel Machines): the Groebner basis method is a set of algorithms for reformulating very complex algebraic expressions...
Using Case Studies: An International Approach
ERIC Educational Resources Information Center
McClam, Tricia; Woodside, Marianne
2005-01-01
Case studies as an instructional strategy have been used in many disciplines, including law, teacher education, science, medicine, and business. Among the benefits of this method of instruction are involving students in learning, developing their critical thinking skills, promoting communication, and engaging in critical analysis. Case studies are…
Experiential Education: Enhancing the Liberal Arts Curriculum
ERIC Educational Resources Information Center
Graff, Elissa R.
2013-01-01
This mixed-methods study combined a survey instrument, the Learning Style Inventory (LSI), with a selected group of follow-up interviews for the purpose of determining how experiential practices affected student engagement and learning. Quantitative data analysis established students' preferences for more active involvement in learning practices…
A New Heterogeneous Multidimensional Unfolding Procedure
ERIC Educational Resources Information Center
Park, Joonwook; Rajagopal, Priyali; DeSarbo, Wayne S.
2012-01-01
A variety of joint space multidimensional scaling (MDS) methods have been utilized for the spatial analysis of two- or three-way dominance data involving subjects' preferences, choices, considerations, intentions, etc. so as to provide a parsimonious spatial depiction of the underlying relevant dimensions, attributes, stimuli, and/or subjects'…
Optical Fourier diffractometry applied to degraded bone structure recognition
NASA Astrophysics Data System (ADS)
Galas, Jacek; Godwod, Krzysztof; Szawdyn, Jacek; Sawicki, Andrzej
1993-09-01
Image processing and recognition methods are useful in many fields. This paper presents a hybrid optical and digital method applied to the recognition of pathological changes in bones affected by metabolic bone diseases. The trabecular bone structure, registered by x ray on photographic film, is analyzed in a new type of computer-controlled diffractometer. The set of image parameters extracted from the diffractogram is evaluated by statistical analysis. Synthetic image descriptors in discriminant space, constructed by discriminant analysis on the basis of three training groups of images (control, osteoporosis, and osteomalacia groups), allow us to recognize bone samples with degraded bone structure and to identify the disease. About 89% of the images were classified correctly. After optimization, the method will be verified in medical investigations.
Cerasa, Antonio; Castiglioni, Isabella; Salvatore, Christian; Funaro, Angela; Martino, Iolanda; Alfano, Stefania; Donzuso, Giulia; Perrotta, Paolo; Gioia, Maria Cecilia; Gilardi, Maria Carla; Quattrone, Aldo
2015-01-01
Presently, there are no valid biomarkers to identify individuals with eating disorders (ED). The aim of this work was to assess the feasibility of a machine learning method for extracting reliable neuroimaging features allowing individual categorization of patients with ED. The Support Vector Machine (SVM) technique, combined with a pattern recognition method, was employed utilizing structural magnetic resonance images. Seventeen females with ED (six with a diagnosis of anorexia nervosa and 11 with bulimia nervosa) were compared against 17 body mass index-matched healthy controls (HC). Machine learning allowed individual diagnosis of ED versus HC with an Accuracy ≥ 0.80. Voxel-based pattern recognition analysis demonstrated that the voxels influencing classification Accuracy involved the occipital cortex, the posterior cerebellar lobule, precuneus, sensorimotor/premotor cortices, and the medial prefrontal cortex, all critical regions known to be strongly involved in the pathophysiological mechanisms of ED. Although these findings should be considered preliminary given the small sample size investigated, SVM analysis highlights the role of well-known brain regions as possible biomarkers to distinguish ED from HC at an individual level, thus encouraging the translational implementation of this new multivariate approach in clinical practice. PMID:26648660
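For readers unfamiliar with the workflow, here is a minimal sketch of individual-level SVM classification with cross-validation in scikit-learn. The feature matrix, subject counts and random data are placeholders; the paper's actual pipeline (structural MRI preprocessing and voxel-based feature extraction) is not reproduced, so the printed accuracy on this random input is meaningless.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Hypothetical inputs: one row of precomputed morphometric features per
# subject, with labels 1 = eating disorder, 0 = healthy control.
rng = np.random.default_rng(0)
X = rng.normal(size=(34, 500))        # 17 ED + 17 HC subjects (illustrative)
y = np.array([1] * 17 + [0] * 17)

# Linear SVM in a pipeline so scaling is fit only on the training folds.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"Leave-one-out accuracy: {acc:.2f}")
```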
Leung, Doris G
2017-07-01
A growing body of the literature supports the use of magnetic resonance imaging as a potential biomarker for disease severity in the hereditary myopathies. We performed a systematic review of the medical literature to evaluate patterns of fat infiltration observed in magnetic resonance imaging studies of muscular dystrophy and congenital myopathy. Searches were performed using MEDLINE, EMBASE, and grey literature databases. Studies that described fat infiltration of muscles in patients with muscular dystrophy or congenital myopathy were selected for full-length review. Data on preferentially involved or spared muscles were extracted for analysis. A total of 2172 titles and abstracts were screened, and 70 publications met our criteria for inclusion in the systematic review. There were 23 distinct genetic disorders represented in this analysis. In most studies, preferential involvement and sparing of specific muscles were reported. We conclude that magnetic resonance imaging studies can be used to identify distinct patterns of muscle involvement in the hereditary myopathies. However, larger studies and standardized methods of reporting are needed to develop imaging as a diagnostic tool in these diseases.
Waveguides for performing enzymatic reactions
Levene, Michael J.; Korlach, Jonas; Turner, Stephen W.; Craighead, Harold G.; Webb, Watt W.
2007-11-06
The present invention is directed to a method and an apparatus for analysis of an analyte. The method involves providing a zero-mode waveguide which includes a cladding surrounding a core where the cladding is configured to preclude propagation of electromagnetic energy of a frequency less than a cutoff frequency longitudinally through the core of the zero-mode waveguide. The analyte is positioned in the core of the zero-mode waveguide and is then subjected, in the core of the zero-mode wave guide, to activating electromagnetic radiation of a frequency less than the cut-off frequency under conditions effective to permit analysis of the analyte in an effective observation volume which is more compact than if the analysis were carried out in the absence of the zero-mode waveguide.
NASA Technical Reports Server (NTRS)
Iliff, K. W.; Maine, R. E.
1976-01-01
A maximum likelihood estimation method was applied to flight data, and procedures to facilitate the routine analysis of large amounts of flight data are described. Techniques that can be used to obtain stability and control derivatives from aircraft maneuvers that are less than ideal for this purpose are described. The techniques involve detecting and correcting the effects of dependent or nearly dependent variables, structural vibration, data drift, inadequate instrumentation, and difficulties with the data acquisition system and the mathematical model. The use of uncertainty levels and multiple maneuver analysis also proved useful in improving the quality of the estimated coefficients. The procedures used for editing the data and for overall analysis are also discussed.
NASA Astrophysics Data System (ADS)
Pizzini, Edward L.; Treagust, David F.; Cody, John
The purpose of this study was to determine whether or not formative evaluation could facilitate goal attainment in a biochemistry course and produce desired learning outcomes consistently by altering course materials and/or instruction. Formative evaluation procedures included the administration of the Inorganic-Organic-Biological Chemistry Test Form 1974 and the Methods and Procedures of Science test to course participants over three consecutive years. A one-group pretest-posttest design was used. The statistical analysis involved the use of the Wilcoxon matched-pairs signed-ranks test. The study involved 64 participants. The findings indicate that the use of formative evaluation can be effective in producing desired learning outcomes to facilitate goal attainment.
Comparison of landmark-based and automatic methods for cortical surface registration
Pantazis, Dimitrios; Joshi, Anand; Jiang, Jintao; Shattuck, David; Bernstein, Lynne E.; Damasio, Hanna; Leahy, Richard M.
2009-01-01
Group analysis of structure or function in cerebral cortex typically involves as a first step the alignment of the cortices. A surface based approach to this problem treats the cortex as a convoluted surface and coregisters across subjects so that cortical landmarks or features are aligned. This registration can be performed using curves representing sulcal fundi and gyral crowns to constrain the mapping. Alternatively, registration can be based on the alignment of curvature metrics computed over the entire cortical surface. The former approach typically involves some degree of user interaction in defining the sulcal and gyral landmarks while the latter methods can be completely automated. Here we introduce a cortical delineation protocol consisting of 26 consistent landmarks spanning the entire cortical surface. We then compare the performance of a landmark-based registration method that uses this protocol with that of two automatic methods implemented in the software packages FreeSurfer and BrainVoyager. We compare performance in terms of discrepancy maps between the different methods, the accuracy with which regions of interest are aligned, and the ability of the automated methods to correctly align standard cortical landmarks. Our results show similar performance for ROIs in the perisylvian region for the landmark based method and FreeSurfer. However, the discrepancy maps showed larger variability between methods in occipital and frontal cortex and also that automated methods often produce misalignment of standard cortical landmarks. Consequently, selection of the registration approach should consider the importance of accurate sulcal alignment for the specific task for which coregistration is being performed. When automatic methods are used, the users should ensure that sulci in regions of interest in their studies are adequately aligned before proceeding with subsequent analysis. PMID:19796696
Monitoring and evaluation of strategic change programme implementation-Lessons from a case analysis.
Neumann, Jan; Robson, Andrew; Sloan, Diane
2018-02-01
This study considered the monitoring and evaluation of a large-scale, domestic and global strategic change programme implementation. It considers the prerequisites necessary to overcome the challenges and barriers that prevent systematic and effective monitoring and evaluation from taking place alongside operationalisation. The work involves a case study based on a major industrial company from the energy sector. The change programme makes particular reference to changes in business models, business processes and organisation structures, as well as Enterprise Resource Planning infrastructure. The case study focussed on the summative evaluation of the programme post-implementation. This assessment involved 25 semi-structured interviews with employees across a range of managerial strata, capturing more than 65 roles within the change programme at both local and global levels. Data relating to their perceptions of evaluation effectiveness and shortcomings were analysed by means of template analysis. The study identifies responsibilities for executing an evaluation alongside appropriate methods and tools, thereby focussing on the "Who" (roles, responsibility for particular activities) and "How" (methods and tools) rather than the "What" to monitor and evaluate. The findings are presented generically so that they offer new insights and transferability for practitioners involved in managing strategic change and its associated evaluation. Copyright © 2017 Elsevier Ltd. All rights reserved.
Random-effects meta-analysis: the number of studies matters.
Guolo, Annamaria; Varin, Cristiano
2017-06-01
This paper investigates the impact of the number of studies on meta-analysis and meta-regression within the random-effects model framework. It is frequently neglected that inference in random-effects models requires a substantial number of studies included in meta-analysis to guarantee reliable conclusions. Several authors warn about the risk of inaccurate results of the traditional DerSimonian and Laird approach especially in the common case of meta-analysis involving a limited number of studies. This paper presents a selection of likelihood and non-likelihood methods for inference in meta-analysis proposed to overcome the limitations of the DerSimonian and Laird procedure, with a focus on the effect of the number of studies. The applicability and the performance of the methods are investigated in terms of Type I error rates and empirical power to detect effects, according to scenarios of practical interest. Simulation studies and applications to real meta-analyses highlight that it is not possible to identify an approach uniformly superior to alternatives. The overall recommendation is to avoid the DerSimonian and Laird method when the number of meta-analysis studies is modest and prefer a more comprehensive procedure that compares alternative inferential approaches. R code for meta-analysis according to all of the inferential methods examined in the paper is provided.
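To make the abstract's central object concrete, here is a minimal sketch of the DerSimonian and Laird estimator that the paper cautions against when the number of studies is small. The example effect sizes and variances are invented for illustration.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects meta-analysis via the DerSimonian-Laird estimator.
    Returns the pooled effect, its standard error, and the between-study
    variance tau^2. With few studies this estimator can be unreliable,
    which is exactly the point made in the abstract above."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                   # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q
    k = len(y)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)            # moment estimate, floored at 0
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    mu = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return mu, se, tau2

# Five hypothetical study estimates and their within-study variances:
print(dersimonian_laird([0.2, 0.35, 0.1, 0.5, 0.28],
                        [0.04, 0.06, 0.05, 0.09, 0.03]))
```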
Fast Image Texture Classification Using Decision Trees
NASA Technical Reports Server (NTRS)
Thompson, David R.
2011-01-01
Texture analysis would permit improved autonomous, onboard science data interpretation for adaptive navigation, sampling, and downlink decisions. These analyses would assist with terrain analysis and instrument placement in both macroscopic and microscopic image data products. Unfortunately, most state-of-the-art texture analysis demands computationally expensive convolutions of filters involving many floating-point operations. This makes them infeasible for radiation-hardened computers and spaceflight hardware. A new method approximates traditional texture classification of each image pixel with a fast decision-tree classifier. The classifier uses image features derived from simple filtering operations involving integer arithmetic. The texture analysis method is therefore amenable to implementation on FPGA (field-programmable gate array) hardware. Image features based on the "integral image" transform produce descriptive and efficient texture descriptors. Training the decision tree on a set of training data yields a classification scheme that produces reasonable approximations of optimal "texton" analysis at a fraction of the computational cost. A decision-tree learning algorithm employing the traditional k-means criterion of inter-cluster variance is used to learn tree structure from training data. The result is an efficient and accurate summary of surface morphology in images. This work is an evolutionary advance that unites several previous algorithms (k-means clustering, integral images, decision trees) and applies them to a new problem domain (morphology analysis for autonomous science during remote exploration). Advantages include order-of-magnitude improvements in runtime, feasibility for FPGA hardware, and significant improvements in texture classification accuracy.
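The key trick, summing any box of pixels in constant time from an integral image so that a decision tree can classify texture without floating-point convolutions, can be sketched briefly. This is an illustrative toy in Python/scikit-learn, not the flight-software implementation; the feature scales, image and labels are assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def integral_image(img):
    # Summed-area table: any box sum afterwards costs four lookups.
    return img.cumsum(axis=0).cumsum(axis=1)

def box_sum(ii, r0, c0, r1, c1):
    # Sum over img[r0:r1, c0:c1) in O(1) from the integral image.
    s = ii[r1 - 1, c1 - 1]
    if r0 > 0: s -= ii[r0 - 1, c1 - 1]
    if c0 > 0: s -= ii[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0: s += ii[r0 - 1, c0 - 1]
    return s

def pixel_features(img, radii=(2, 4, 8)):
    # Per-pixel multi-scale box means: cheap stand-ins for filter-bank
    # responses, computable with integer arithmetic only.
    h, w = img.shape
    ii = integral_image(img.astype(np.int64))
    feats = np.empty((h, w, len(radii)))
    for k, r in enumerate(radii):
        for i in range(h):
            for j in range(w):
                r0, c0 = max(0, i - r), max(0, j - r)
                r1, c1 = min(h, i + r + 1), min(w, j + r + 1)
                feats[i, j, k] = box_sum(ii, r0, c0, r1, c1) / ((r1 - r0) * (c1 - c0))
    return feats.reshape(-1, len(radii))

# Hypothetical training pair: an image and a per-pixel texture label map.
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64))
labels = (img > 127).astype(int).ravel()      # toy ground truth
tree = DecisionTreeClassifier(max_depth=8).fit(pixel_features(img), labels)
```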
Brown, Ryan M; Meah, Christopher J; Heath, Victoria L; Styles, Iain B; Bicknell, Roy
2016-01-01
Angiogenesis involves the generation of new blood vessels from the existing vasculature and is dependent on many growth factors and signaling events. In vivo angiogenesis is dynamic and complex, meaning assays are commonly utilized to explore specific targets for research into this area. Tube-forming assays offer an excellent overview of the molecular processes in angiogenesis. The Matrigel tube forming assay is a simple-to-implement but powerful tool for identifying biomolecules involved in angiogenesis. A detailed experimental protocol on the implementation of the assay is described in conjunction with an in-depth review of methods that can be applied to the analysis of the tube formation. In addition, an ImageJ plug-in is presented which allows automatic quantification of tube images reducing analysis times while removing user bias and subjectivity.
Selective Catalytic Combustion Sensors for Reactive Organic Analysis
NASA Technical Reports Server (NTRS)
Innes, W. B.
1971-01-01
Sensors involving a vanadia-alumina catalyst bed-thermocouple assembly satisfy requirements for simple, reproducible and rapid continuous analysis of reactive organics. Responses generally increase with temperature up to 400 C and increase to a maximum with flow rate/catalyst volume. Selectivity decreases with temperature. Response time decreases with flow rate and increases with catalyst volume. At the chosen optimum conditions, the calculated response, which is additive and linear, agrees better with photochemical reactivity than other methods for various automotive sources, and the response to vehicle exhaust is insensitive to flow rate. Applications to the measurement of total reactive organics in vehicle exhaust, as well as to gas chromatography detection, illustrate its utility. The approach appears generally applicable to high-thermal-effect reactions involving first-order kinetics.
Verstraeten, B.; Sermeus, J.; Salenbien, R.; Fivez, J.; Shkerdin, G.; Glorieux, C.
2015-01-01
The underlying working principle of detecting impulsive stimulated scattering signals in a differential configuration of heterodyne diffraction detection is unraveled by involving optical scattering theory. The feasibility of the method for the thermoelastic characterization of coating-substrate systems is demonstrated on the basis of simulated data containing typical levels of noise. Besides the classical analysis of the photoacoustic part of the signals, which involves fitting surface acoustic wave dispersion curves, the photothermal part of the signals is analyzed by introducing thermal wave dispersion curves to represent and interpret their grating wavelength dependence. The intrinsic possibilities and limitations of both inverse problems are quantified by making use of least and most squares analysis. PMID:26236643
A survey of automated methods for sensemaking support
NASA Astrophysics Data System (ADS)
Llinas, James
2014-05-01
Complex, dynamic problems present a challenge for the design of analysis support systems and tools, largely because there is limited reliable a priori procedural knowledge describing the dynamic processes in the environment. Problem domains that are non-cooperative or adversarial introduce added difficulties involving suboptimal observational data and/or data containing the effects of deception or covertness. The fundamental nature of analysis in these environments is based on composite approaches involving mining or foraging over the evidence, discovery and learning processes, and the synthesis of fragmented hypotheses; together, these can be labeled sensemaking procedures. This paper reviews and analyzes the features, benefits, and limitations of a variety of automated techniques that offer possible support to sensemaking processes in these problem domains.
Ren, Jimin; Sherry, A. Dean; Malloy, Craig R.
2015-01-01
Inversion transfer (IT) is a well-established technique with multiple attractive features for analysis of kinetics. However, its application in measurement of ATP synthesis rate in vivo has lagged behind the more common ST techniques. One well-recognized issue with IT is the complexity of data analysis in comparison to much simpler analysis by ST. This complexity arises, in part, because the γ-ATP spin is involved in multiple chemical reactions and magnetization exchanges, whereas Pi is involved in a single reaction, Pi → γ-ATP. By considering the reactions involving γ-ATP only as a lumped constant, the rate constant for the reaction of physiological interest, kPi→γATP, can be determined. Here, we present a new IT data analysis method to evaluate kPi→γATP using data collected from resting human skeletal muscle at 7T. The method is based on the basic Bloch-McConnell equation, which relates kPi→γATP with ṁPi, the rate of Pi magnetization change. The kPi→γATP value is accessed from ṁPi data by more familiar linear correlation approaches. For a group of human subjects (n = 15), the kPi→γATP value derived for resting calf muscle was 0.066 ± 0.017 s−1, in agreement with literature reported values. In this study we also explored possible time-saving strategies to speed up data acquisition for kPi→γATP evaluation using simulations. The analysis indicates that it is feasible to carry out a 31P inversion transfer experiment in ~10 minutes or shorter at 7T with reasonable outcome in kPi→γATP variance for measurement of ATP synthesis in resting human skeletal muscle. We believe that this new IT data analysis approach will facilitate the wide acceptance of IT to evaluate ATP synthesis rate in vivo. PMID:25943328
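The linear relation this analysis exploits follows from the Bloch-McConnell equation for the longitudinal Pi magnetization. In standard notation (our rendering, with M0 the equilibrium magnetization, T1 the longitudinal relaxation time and k the pseudo-first-order rate constants; not necessarily the authors' exact symbols):

```latex
\dot{M}_z^{P_i}(t) \;=\; \frac{M_0^{P_i} - M_z^{P_i}(t)}{T_1^{P_i}}
\;-\; k_{P_i \to \gamma\mathrm{ATP}}\, M_z^{P_i}(t)
\;+\; k_{\gamma\mathrm{ATP} \to P_i}\, M_z^{\gamma\mathrm{ATP}}(t)
```

Because the rate of Pi magnetization change depends linearly on the measured magnetizations, sampling it at several inversion delays lets kPi→γATP be recovered by ordinary linear regression, which is the familiar linear correlation approach the abstract refers to.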
NASA Astrophysics Data System (ADS)
Szafranko, Elżbieta
2017-10-01
Assessment of the variant solutions developed for a building investment project needs to be made at the planning stage. While considering alternative solutions, the investor defines various criteria, but directly evaluating the degree to which the developed variants fulfil those criteria can be very difficult. In practice, there are different methods that enable the user to include a large number of parameters in an analysis, but their implementation can be challenging: some methods require advanced mathematical computations preceded by complicated input data processing, and the generated results may not lend themselves easily to interpretation. Hence, during her research, the author has developed a systemic approach that involves several methods and aims to compare their outcomes. The final stage of the proposed method consists of a graphic interpretation of the results. The method has been tested on a variety of building and development projects.
Newmark-Beta-FDTD method for super-resolution analysis of time reversal waves
NASA Astrophysics Data System (ADS)
Shi, Sheng-Bing; Shao, Wei; Ma, Jing; Jin, Congjun; Wang, Xiao-Hua
2017-09-01
In this work, a new unconditionally stable finite-difference time-domain (FDTD) method with the split-field perfectly matched layer (PML) is proposed for the analysis of time reversal (TR) waves. The proposed method is very suitable for multiscale problems involving microstructures. The spatial and temporal derivatives in this method are discretized by the central difference technique and Newmark-Beta algorithm, respectively, and the derivation results in the calculation of a banded-sparse matrix equation. Since the coefficient matrix keeps unchanged during the whole simulation process, the lower-upper (LU) decomposition of the matrix needs to be performed only once at the beginning of the calculation. Moreover, the reverse Cuthill-Mckee (RCM) technique, an effective preprocessing technique in bandwidth compression of sparse matrices, is used to improve computational efficiency. The super-resolution focusing of TR wave propagation in two- and three-dimensional spaces is included to validate the accuracy and efficiency of the proposed method.
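For reference, the Newmark-Beta recurrence behind the temporal discretization can be written for a generic second-order semi-discrete system with unknown u as (standard textbook form; the paper applies the same scheme to the FDTD field equations):

```latex
u_{n+1} = u_n + \Delta t\,\dot{u}_n
        + \Delta t^{2}\left[\left(\tfrac{1}{2}-\beta\right)\ddot{u}_n + \beta\,\ddot{u}_{n+1}\right],
\qquad
\dot{u}_{n+1} = \dot{u}_n + \Delta t\left[(1-\gamma)\,\ddot{u}_n + \gamma\,\ddot{u}_{n+1}\right]
```

With the common choice γ = 1/2 and β = 1/4 (average acceleration), the update is implicit and unconditionally stable, so the time step can be chosen independently of the finest spatial cell; solving the resulting implicit system is where the banded-sparse matrix equation and its one-time LU decomposition enter.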
Developing techniques for cause-responsibility analysis of occupational accidents.
Jabbari, Mousa; Ghorbani, Roghayeh
2016-11-01
The aim of this study was to specify the causes of occupational accidents and to determine the social responsibility and role of the groups involved in work-related accidents. The study develops an occupational accidents causes tree, an occupational accidents responsibility tree, and an occupational accidents component-responsibility analysis worksheet; based on these, it develops cause-responsibility analysis (CRA) techniques and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting that were randomly selected from all the work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study is two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel to determine the responsible groups and their rates of responsibility. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for determining a detailed list of tasks, responsibilities, and their rates, and are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties. Copyright © 2016 Elsevier Ltd. All rights reserved.
On the convergence of nanotechnology and Big Data analysis for computer-aided diagnosis.
Rodrigues, Jose F; Paulovich, Fernando V; de Oliveira, Maria Cf; de Oliveira, Osvaldo N
2016-04-01
An overview is provided of the challenges involved in building computer-aided diagnosis systems capable of precise medical diagnostics based on integration and interpretation of data from different sources and formats. The availability of massive amounts of data and computational methods associated with the Big Data paradigm has brought hope that such systems may soon be available in routine clinical practices, which is not the case today. We focus on visual and machine learning analysis of medical data acquired with varied nanotech-based techniques and on methods for Big Data infrastructure. Because diagnosis is essentially a classification task, we address the machine learning techniques with supervised and unsupervised classification, making a critical assessment of the progress already made in the medical field and the prospects for the near future. We also advocate that successful computer-aided diagnosis requires a merge of methods and concepts from nanotechnology and Big Data analysis.
Comparative analysis of methods for real-time analytical control of chemotherapies preparations.
Bazin, Christophe; Cassard, Bruno; Caudron, Eric; Prognon, Patrice; Havard, Laurent
2015-10-15
Control of chemotherapy preparations is now an obligation in France, although analytical control is not compulsory. Several methods are available and none of them can be presumed ideal, so we wanted to compare them to determine which could be the best choice. We compared non-analytical (visual and video-assisted, gravimetric) and analytical (HPLC/FIA, UV/FT-IR, UV/Raman, Raman) methods on the basis of our experience and a SWOT analysis. The results of the analysis show great differences between the techniques but, as expected, none of them is without defects; however, they can probably be used in synergy. Overall, for the pharmacist willing to get involved, the implementation of control for chemotherapy preparations must be anticipated well in advance, with every parameter listed, and remains, in our view, an analyst's job. Copyright © 2015 Elsevier B.V. All rights reserved.
Applications of non-parametric statistics and analysis of variance on sample variances
NASA Technical Reports Server (NTRS)
Myers, R. H.
1981-01-01
Nonparametric methods that are available for NASA-type applications are discussed. An attempt is made here to survey what can be used, to recommend when each method would be applicable, and to compare the methods, when possible, with the usual normal-theory procedures that are available for the Gaussian analog. It is important to point out the hypotheses being tested, the assumptions being made, and the limitations of the nonparametric procedures. The appropriateness of performing analysis of variance on sample variances is also discussed and studied; this procedure is followed in several NASA simulation projects. On the surface it would appear to be a reasonably sound procedure; however, the difficulties involved center around the normality problem and the basic homogeneous-variance assumption that is made in usual analysis of variance problems. These difficulties are discussed and guidelines are given for using the methods.
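As one concrete alternative to running a normal-theory ANOVA directly on sample variances, rank-based and robust tests compare groups without the normality and homogeneous-variance assumptions the survey flags. A minimal sketch using SciPy; the specific tests and the simulated groups are our illustrative choices, not the survey's recommendations.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Three hypothetical simulation-output groups with unequal spread.
groups = [rng.normal(0, s, size=30) for s in (1.0, 1.5, 3.0)]

# Kruskal-Wallis: nonparametric one-way comparison of group locations.
print(stats.kruskal(*groups))

# Levene's test: compares variability across groups without assuming
# normality, a common alternative to ANOVA on sample variances.
print(stats.levene(*groups))
```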
Validation of quantitative method for azoxystrobin residues in green beans and peas.
Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G
2015-09-01
This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with the results confirmed by GC-MS. The employed method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness and repeatability precision were assessed. The spiking levels for the trueness and precision experiments were 0.1, 0.5 and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has been proven efficient for the extraction and determination of azoxystrobin residues in green beans and peas. Copyright © 2015 Elsevier Ltd. All rights reserved.
Spectral analysis method and sample generation for real time visualization of speech
NASA Astrophysics Data System (ADS)
Hobohm, Klaus
A method for translating speech signals into optical models, characterized by high sound discrimination and learnability and designed to give deaf persons feedback for controlling their way of speaking, is presented. Important properties of the speech production and perception processes, and of the organs involved in these mechanisms, are recalled in order to define requirements for speech visualization. It is established that the spectral representation must fairly reflect the time, frequency and amplitude resolution of hearing, and that continuous variations of the acoustic parameters of the speech signal must be depicted by continuous variations of the images. A color table was developed for dynamic illustration, and sonograms were generated with five spectral analysis methods, including Fourier transformation and linear predictive coding. To evaluate sonogram quality, test persons had to recognize consonant-vowel-consonant words; an optimized analysis method was achieved with a fast Fourier transformation and a postprocessor. A hardware concept for a real-time speech visualization system, based on multiprocessor technology in a personal computer, is presented.
Fusing modeling techniques to support domain analysis for reuse opportunities identification
NASA Technical Reports Server (NTRS)
Hall, Susan Main; Mcguire, Eileen
1993-01-01
Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of the paper focuses on the modeling method developed and utilized during this analysis effort.
NASA Astrophysics Data System (ADS)
García-Moreno, Angel-Iván; González-Barbosa, José-Joel; Ramírez-Pedraza, Alfonso; Hurtado-Ramos, Juan B.; Ornelas-Rodriguez, Francisco-Javier
2016-04-01
Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and on the use of different sensors, which implies dependency on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor in the calibration of a panoramic camera-LiDAR system; both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests, we analyze and compare the calibration parameters using the Monte Carlo and Latin hypercube sampling techniques. The sensitivity of each variable involved in the calibration was computed by the Sobol method, which is based on the analysis of the variance breakdown, and by the Fourier amplitude sensitivity test method, which is based on Fourier analysis. Sensitivity analysis is an essential tool in simulation modeling and for performing error propagation assessments.
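A minimal sketch of how first-order Sobol indices are estimated by Monte Carlo, using a generic pick-and-freeze scheme in plain NumPy; the model, bounds and sample size here are illustrative, not the paper's calibration setup.

```python
import numpy as np

def sobol_first_order(model, bounds, n=100_000, seed=0):
    """First-order Sobol indices via a Saltelli-style pick-and-freeze scheme.
    model: vectorized function taking an (n, d) array and returning (n,)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    d = len(lo)
    A = rng.uniform(lo, hi, size=(n, d))
    B = rng.uniform(lo, hi, size=(n, d))
    yA, yB = model(A), model(B)
    var = np.concatenate([yA, yB]).var()
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]            # freeze every input except x_i
        # Share of output variance explained by x_i alone.
        S[i] = np.mean(yB * (model(ABi) - yA)) / var
    return S

# Example: f(x) = x0 + 2*x1 on [0,1]^2 has exact indices 0.2 and 0.8.
f = lambda X: X[:, 0] + 2.0 * X[:, 1]
print(sobol_first_order(f, bounds=[(0, 1), (0, 1)]))
```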
Non-Gradient Blue Native Polyacrylamide Gel Electrophoresis.
Luo, Xiaoting; Wu, Jinzi; Jin, Zhen; Yan, Liang-Jun
2017-02-02
Gradient blue native polyacrylamide gel electrophoresis (BN-PAGE) is a well established and widely used technique for activity analysis of high-molecular-weight proteins, protein complexes, and protein-protein interactions. Since its inception in the early 1990s, a variety of minor modifications have been made to this gradient gel analytical method. Here we provide a major modification of the method, which we call non-gradient BN-PAGE. The procedure, similar to that of non-gradient SDS-PAGE, is simple because there is no expensive gradient maker involved. The non-gradient BN-PAGE protocols presented herein provide guidelines on the analysis of mitochondrial protein complexes, in particular, dihydrolipoamide dehydrogenase (DLDH) and those in the electron transport chain. Protocols for the analysis of blood esterases or mitochondrial esterases are also presented. The non-gradient BN-PAGE method may be tailored for analysis of specific proteins according to their molecular weight regardless of whether the target proteins are hydrophobic or hydrophilic. © 2017 by John Wiley & Sons, Inc. Copyright © 2017 John Wiley & Sons, Inc.
Detection of incipient defects in cables by partial discharge signal analysis
NASA Astrophysics Data System (ADS)
Martzloff, F. D.; Simmon, E.; Steiner, J. P.; Vanbrunt, R. J.
1992-07-01
As one of the objectives of a program aimed at assessing test methods for in-situ detection of incipient defects in cables due to aging, a laboratory test system was implemented to demonstrate that the partial discharge analysis method can be successfully applied to low-voltage cables. Previous investigations generally involved cables rated 5 kV or higher, while the objective of the program focused on the lower voltages associated with the safety systems of nuclear power plants. The defect detection system implemented for the project was based on commercially available signal analysis hardware and software packages, customized for the specific purposes of the project. The test specimens included several cables of the type found in nuclear power plants, including artificial defects introduced at various points of the cable. The results indicate that indeed, partial discharge analysis is capable of detecting incipient defects in low-voltage cables. There are, however, some limitations of technical and non-technical nature that need further exploration before this method can be accepted in the industry.
Research priorities in occupational safety and health: a review.
Iavicoli, Sergio; Rondinone, Bruna; Marinaccio, Alessandro; Fingerhut, Marilyn
2006-01-01
Changes in the world of work in the last few decades have markedly affected questions regarding occupational safety and health (OSH). Jobs in our economy continue to shift from manufacturing to services. Longer hours, shift work, reduced job security and temporary work are realities in the modern workplace, and new chemicals, materials and processes are developed at an ever-accelerating pace. The workforce is also changing: it is becoming older and more racially diverse, and the proportion of women is increasing. These changes present new challenges for protecting worker safety and health, and it has become indispensable to redefine priorities by consulting all those involved in OSH. The present study therefore made a critical comparative analysis of the main published projects to identify research priorities in the OSH field, comparing methods, approaches and results. Comparison of the priority areas established in each of these studies is inherently difficult due to differences in socio-cultural backgrounds, in the methods employed to identify priority topics, and in the many factors involved. However, it is clear that the Delphi technique is widely used as a reliable method, in that it covers a broad range of qualified witnesses from a variety of backgrounds, such as trade union representatives and researchers, providing different viewpoints. It also takes account of the intrinsic features of OSH, which, compared to other disciplines, involves multidisciplinary factors calling into play a range of specialists, such as toxicologists, molecular biologists, epidemiologists, occupational hygienists and occupational physicians. This analysis showed how important it is to reach consensus among all those operating in the OSH sector in order to establish standard methods that can be applied in different contexts and give results that can be validly compared.
Phung, Viet-Hai; Essam, Nadya; Asghar, Zahid; Spaight, Anne; Siriwardena, Aloysius N
2016-02-01
Clinical leadership and organizational culture are important contextual factors for quality improvement (QI) but the relationship between these and with organizational change is complex and poorly understood. We aimed to explore the relationship between clinical leadership, culture of innovation and clinical engagement in QI within a national ambulance QI Collaborative (QIC). We used a self-administered online questionnaire survey sent to front-line clinicians in all 12 English ambulance services. We conducted a cross-sectional analysis of quantitative data and qualitative analysis of free-text responses. There were 2743 (12% of 22 117) responses from 11 of the 12 participating ambulance services. In the 3% of responders that were directly involved with the QIC, leadership behaviour was significantly higher than for those not directly involved. QIC involvement made no significant difference to responders' perceptions of the culture of innovation in their organization, which was generally considered poor. Although uptake of QI methods was low overall, QIC members were significantly more likely to use QI methods, which were also significantly associated with leadership behaviour. Despite a limited organizational culture of innovation, clinical leadership and use of QI methods in ambulance services generally, the QIC achieved its aims to significantly improve pre-hospital care for acute myocardial infarction and stroke. We postulate that this was mediated through an improvement subculture, linked to the QIC, which facilitated large-scale improvement by stimulating leadership and QI methods. Further research is needed to understand success factors for QI in complex health care environments. © 2016 The Authors. Journal of Evaluation in Clinical Practice published by John Wiley & Sons, Ltd.
ARBAN-A new method for analysis of ergonomic effort.
Holzmann, P
1982-06-01
ARBAN is a method for the ergonomic analysis of work, including work situations that involve widely differing body postures and loads. The idea of the method is that all phases of the analysis process that require specific knowledge of ergonomics are taken over by filming equipment and a computer routine. All tasks that must be carried out by the investigator are designed so that they can be performed using systematic common sense. The ARBAN analysis method contains four steps: 1. Recording the workplace situation on video or film. 2. Coding the posture and load situation at a number of closely spaced 'frozen' situations. 3. Computerisation. 4. Evaluation of the results. The computer calculates figures for the total ergonomic stress on the whole body, as well as on different parts of the body separately. These are presented as 'ergonomic stress/time curves', in which heavy-load situations appear as peaks. The work cycle may also be divided into different tasks, whose stress and duration patterns can be compared. The integrals of the curves are calculated for single-figure comparison of different tasks as well as different work situations.
Expansion of Microbial Forensics
Schmedes, Sarah E.; Sajantila, Antti
2016-01-01
Microbial forensics has been defined as the discipline of applying scientific methods to the analysis of evidence related to bioterrorism, biocrimes, hoaxes, or the accidental release of a biological agent or toxin for attribution purposes. Over the past 15 years, technology, particularly massively parallel sequencing, and bioinformatics advances now allow the characterization of microorganisms for a variety of human forensic applications, such as human identification, body fluid characterization, postmortem interval estimation, and biocrimes involving tracking of infectious agents. Thus, microbial forensics should be more broadly described as the discipline of applying scientific methods to the analysis of microbial evidence in criminal and civil cases for investigative purposes. PMID:26912746
Development of solution techniques for nonlinear structural analysis
NASA Technical Reports Server (NTRS)
Vos, R. G.; Andrews, J. S.
1974-01-01
Nonlinear structural solution methods in the current research literature are classified according to the order of the solution scheme, and it is shown that the analytical tools for these methods are uniformly derivable by perturbation techniques. A new perturbation formulation is developed for treating an arbitrary nonlinear material, in terms of a finite-difference generated stress-strain expansion. Nonlinear geometric effects are included in an explicit manner by appropriate definition of an applicable strain tensor. A new finite-element pilot computer program PANES (Program for Analysis of Nonlinear Equilibrium and Stability) is presented for the treatment of problems involving material and geometric nonlinearities, as well as certain forms of nonconservative loading.
Replications and Extensions in Arousal Assessment for Sex Offenders with Developmental Disabilities
ERIC Educational Resources Information Center
Reyes, Jorge R.; Vollmer, Timothy R.; Hall, Astrid
2011-01-01
Three adult male sex offenders with developmental disabilities participated in phallometric assessments that involved repeated measures of arousal when exposed to various stimuli. Arousal assessment outcomes were similar to those obtained by Reyes et al. (2006). Additional data-analysis methods provided further information about sexual…
Program Evaluation of a School District's Multisensory Reading Initiative
ERIC Educational Resources Information Center
Asip, Michael Patrick
2012-01-01
The purpose of this study was to conduct a formative program evaluation of a school district's multisensory reading initiative. The mixed methods study involved semi-structured interviews, online survey, focus groups, document review, and analysis of extant special education student reading achievement data. Participants included elementary…
Infantilism: Theoretical Construct and Operationalization
ERIC Educational Resources Information Center
Sabelnikova, Y. V.; Khmeleva, N. L.
2018-01-01
The aim of this article is to define and operationalize the construct of infantilism. The methods of theoretical research involve analysis and synthesis. Age and content criteria are analyzed for childhood and adulthood. Infantile traits in an adult are described. Results: The characteristics of adult infantilism in the modern world are defined,…
An Informatics Approach to Establishing a Sustainable Public Health Community
ERIC Educational Resources Information Center
Kriseman, Jeffrey Michael
2012-01-01
This work involved the analysis of a public health system, and the design, development and deployment of enterprise informatics architecture, and sustainable community methods to address problems with the current public health system. Specifically, assessment of the Nationally Notifiable Disease Surveillance System (NNDSS) was instrumental in…
Cognitive Mapping Tobacco Control Advice for Dentistry: A Dental PBRN Study
ERIC Educational Resources Information Center
Qu, Haiyan; Houston, Thomas K.; Williams, Jessica H.; Gilbert, Gregg H.; Shewchuk, Richard M.
2011-01-01
Objective: To identify facilitative strategies that could be used in developing a tobacco cessation program for community dental practices. Methods: Nominal group technique (NGT) meetings and a card-sort task were used to obtain formative data. A cognitive mapping approach involving multidimensional scaling and hierarchical cluster analysis was…
The ability of infectious oocyst forms of Toxoplasma gondii and Cryptosporidium spp. to resist disinfection treatments and cause disease may have significant public health implications. Currently, little is known about oocyst-specific factors involved during host cell invasion p...
Psychosocial and Cognitive Functioning of Children with Specific Profiles of Maltreatment
ERIC Educational Resources Information Center
Pears, Katherine C.; Kim, Hyoun K.; Fisher, Philip A.
2008-01-01
Objective: Up to 90% of child welfare system cases involve multiple types of maltreatment; however, studies have rarely incorporated multiple dimensions of maltreatment. The present study employed a latent profile analysis to identify naturally occurring subgroups of children who had experienced maltreatment. Methods: Reports of maltreatment…
ERIC Educational Resources Information Center
Cameron, David Lansing
2014-01-01
Teacher-student interactions in 17 inclusive classrooms were examined using a mixed-methods approach that involved quantitative analysis of interactions recorded during classroom observations and follow-up interviews with seven general educators. Observational findings suggest that classrooms were organised along traditional lines with the vast…
Merging Quality Processes & Tools with DACUM.
ERIC Educational Resources Information Center
McLennan, Krystyna S.
This paper explains how merging DACUM (Developing a Curriculum) analysis with quality initiatives can reduce waste, increase job efficiency, assist in development of standard operating procedures, and involve employees in positive job improvement methods. In the first half of the paper, the following principles of total quality management (TQM)…
The generic method described here involves typical capillary electrophoresis (CE) techniques, with the addition of cyclodextrin chiral selectors to the electrolyte for enantiomer separation and also, in the case of neutral analytes, the further addition of a micelle forming comp...
Determination of Acidity Constants by Gradient Flow-Injection Titration
ERIC Educational Resources Information Center
Conceicao, Antonio C. L.; Minas da Piedade, Manuel E.
2006-01-01
A three-hour laboratory experiment, designed for an advanced undergraduate course in instrumental analysis that illustrates the application of the gradient chamber flow-injection titration (GCFIT) method with spectrophotometric detection to determine acidity constants is presented. The procedure involves the use of an acid-base indicator to obtain…
Quantified Academic Selves: The Gamification of Research through Social Networking Services
ERIC Educational Resources Information Center
Hammarfelt, Björn; de Rijcke, Sarah; Rushforth, Alexander D.
2016-01-01
Introduction: Our study critically engages with techniques of self-quantification in contemporary academia, by demonstrating how social networking services enact research and scholarly communication as a "game". Method: The empirical part of the study involves an analysis of two leading platforms: Impactstory and ResearchGate. Observed…
ERIC Educational Resources Information Center
Tucker, Stephen I.; Lommatsch, Christina W.; Moyer-Packenham, Patricia S.; Anderson-Pence, Katie L.; Symanzik, Jürgen
2017-01-01
The purpose of this study was to examine patterns of mathematical practices evident during children's interactions with touchscreen mathematics virtual manipulatives. Researchers analyzed 33 Kindergarten children's interactions during activities involving apps featuring mathematical content of early number sense or quantity in base ten, recorded…
Effects of Individual Development Accounts (IDAs) on Household Wealth and Saving Taste
ERIC Educational Resources Information Center
Huang, Jin
2010-01-01
This study examines effects of individual development accounts (IDAs) on household wealth of low-income participants. Methods: This study uses longitudinal survey data from the American Dream Demonstration (ADD) involving experimental design (treatment group = 537, control group = 566). Results: Results from quantile regression analysis indicate…
Using Visualization and Computation in the Analysis of Separation Processes
ERIC Educational Resources Information Center
Joo, Yong Lak; Choudhary, Devashish
2006-01-01
For decades, every chemical engineer has been asked to have a background in separations. The required separations course can, however, be uninspiring and superficial because understanding many separation processes involves conventional graphical methods and commercial process simulators. We utilize simple, user-friendly mathematical software,…
End-of-Life Caregiver's Perspectives on Their Role: Generative Caregiving
ERIC Educational Resources Information Center
Phillips, Linda R.; Reed, Pamela G.
2010-01-01
Purpose: To describe caregivers' constructions of their caregiving role in providing care to elders they knew were dying from life-limiting illnesses. Design and Methods: Study involved in-depth interviews with 27 family caregivers. Data were analyzed using constant comparative analysis. Results: Four categories were identified: centering life on…
The Value of Accuracy in Information for Planning and Control
ERIC Educational Resources Information Center
Higgins, J. C.
1974-01-01
The author discusses some approaches to assessing the impact of inaccurate information when the planning system involves formulae of the management accounting type or models of the operational research variety. The most appropriate method for quantifying information value in management information systems is through Bayesian analysis and decision…
Analysis of Learning Conceptions Based on Three Modules.
ERIC Educational Resources Information Center
Haygood, E. Langston; Iran-Nejad, Asghar
Three learning modules are described and investigated as they reflect different students' conceptions of and approaches to learning. The Schoolwork Module (SWM) focuses on task performance and involves a passive, incremental, piecemeal, and rote memory method of learning, parallel to what might be implied by the Information Processing model of…
Change in Classroom Relations: An Attempt that Signals Some Difficulties.
ERIC Educational Resources Information Center
Gutierrez, Roberto
2002-01-01
The instructor of a human resource class proposed a different division of labor between teacher and students. Analysis of four critical class incidents (essay sharing, class discussion, prejudices involved in a student presentation, student objections to course methods) showed that students preferred to preserve their identity as consumers and…
Teacher Acquisition of Functional Analysis Methods Using Pyramidal Training
ERIC Educational Resources Information Center
Pence, Sacha T.; St. Peter, Claire C.; Giles, Aimee F.
2014-01-01
Pyramidal training involves an experienced professional training a subset of individuals who, in turn, train additional individuals. Pyramidal training is effective for training a variety of behavior-analytic skills with direct-care staff, parents, and teachers. As teachers' roles in behavioral assessment increase, pyramidal training may be…
Research in remote sensing of agriculture, earth resources, and man's environment
NASA Technical Reports Server (NTRS)
Landgrebe, D. A.
1975-01-01
Progress is reported for several projects involving the utilization of LANDSAT remote sensing capabilities. Areas under study include crop inventory, crop identification, crop yield prediction, forest resources evaluation, land resources evaluation and soil classification. Numerical methods for image processing are discussed, particularly those for image enhancement and analysis.
Microrheology with optical tweezers: measuring the relative viscosity of solutions 'at a glance'.
Tassieri, Manlio; Del Giudice, Francesco; Robertson, Emma J; Jain, Neena; Fries, Bettina; Wilson, Rab; Glidle, Andrew; Greco, Francesco; Netti, Paolo Antonio; Maffettone, Pier Luca; Bicanic, Tihana; Cooper, Jonathan M
2015-03-06
We present a straightforward method for measuring the relative viscosity of fluids via a simple graphical analysis of the normalised position autocorrelation function of an optically trapped bead, without the need of embarking on laborious calculations. The advantages of the proposed microrheology method are evident when it is adopted for measurements of materials whose availability is limited, such as those involved in biological studies. The method has been validated by direct comparison with conventional bulk rheology methods, and has been applied both to characterise synthetic linear polyelectrolytes solutions and to study biomedical samples.
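A minimal sketch of one way to read a relative viscosity 'at a glance' from normalised position autocorrelation functions, assuming the same bead and trap stiffness across the two measurements. This is our loose rendering of the idea, not the authors' exact graphical procedure; the Ornstein-Uhlenbeck traces stand in for real bead-position data.

```python
import numpy as np

def npaf(x, max_lag):
    """Normalised position autocorrelation function A(tau) of a trapped
    bead's coordinate trace, evaluated up to max_lag samples."""
    x = x - x.mean()
    n = len(x)
    acf = np.array([np.dot(x[:n - k], x[k:]) / (n - k) for k in range(max_lag)])
    return acf / acf[0]

def decay_time(x, dt, max_lag=2000):
    # First lag at which A(tau) drops below 1/e; the trace must be long
    # enough for the autocorrelation to decay within max_lag samples.
    a = npaf(x, max_lag)
    return np.flatnonzero(a < np.exp(-1.0))[0] * dt

def relative_viscosity(x_sample, x_ref, dt):
    """For an overdamped bead in a harmonic trap, A(tau) decays as
    exp(-kappa tau / (6 pi eta a)); with the same bead and trap, the
    ratio of decay times therefore equals the ratio of viscosities."""
    return decay_time(x_sample, dt) / decay_time(x_ref, dt)

# Toy check with Ornstein-Uhlenbeck traces whose relaxation times differ
# by a factor of three: the estimate should come out near 3.
rng = np.random.default_rng(3)
def ou_trace(tau_c, n=20_000, dt=1e-3):
    x = np.zeros(n)
    for k in range(1, n):
        x[k] = x[k - 1] * (1 - dt / tau_c) + np.sqrt(dt) * rng.standard_normal()
    return x
print(relative_viscosity(ou_trace(0.03), ou_trace(0.01), dt=1e-3))
```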
Laser Pencil Beam Based Techniques for Visualization and Analysis of Interfaces Between Media
NASA Technical Reports Server (NTRS)
Adamovsky, Grigory; Giles, Sammie, Jr.
1998-01-01
Traditional optical methods, including interferometry, Schlieren, and shadowgraphy, have been used successfully for the visualization and evaluation of various media; aerodynamics and hydrodynamics are major fields where these methods have been applied. However, these methods have major drawbacks, such as a relatively low power density and the suppression of second-order phenomena. A novel method introduced at NASA Lewis Research Center minimizes the disadvantages of the 'classical' methods. The method involves a narrow pencil-like beam that penetrates a medium of interest. The paper describes the laser pencil beam flow visualization methods in detail, and various system configurations are presented. The paper also discusses interfaces between media in general terms and provides examples of such interfaces.
REGIONAL-SCALE WIND FIELD CLASSIFICATION EMPLOYING CLUSTER ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glascoe, L G; Glaser, R E; Chin, H S
2004-06-17
The classification of time-varying multivariate regional-scale wind fields at a specific location can assist event planning as well as consequence and risk analysis. Further, wind field classification involves data transformation and inference techniques that effectively characterize stochastic wind field variation. Such a classification scheme is potentially useful for addressing overall atmospheric transport uncertainty and meteorological parameter sensitivity issues. Different methods to classify wind fields over a location include the principal component analysis of wind data (e.g., Hardy and Walton, 1978) and the use of cluster analysis for wind data (e.g., Green et al., 1992; Kaufmann and Weber, 1996). The goal of this study is to use a clustering method to classify the winds of a gridded data set, i.e., from meteorological simulations generated by a forecast model.
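As a rough illustration of the clustering step (not the authors' implementation), each gridded wind field can be flattened into one feature vector and fed to k-means; cluster centroids then act as representative wind regimes. Grid size, component layout, and the synthetic data are assumptions:

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    # 365 daily wind fields on a 10x10 grid with (u, v) components per node,
    # flattened to one feature vector per field (synthetic stand-in data).
    fields = rng.normal(size=(365, 10, 10, 2)).reshape(365, -1)

    km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(fields)
    labels = km.labels_                                   # wind-field class per day
    regimes = km.cluster_centers_.reshape(4, 10, 10, 2)   # mean field per class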
Model for spectral and chromatographic data
Jarman, Kristin [Richland, WA; Willse, Alan [Richland, WA; Wahl, Karen [Richland, WA; Wahl, Jon [Richland, WA
2002-11-26
A method and apparatus using a spectral analysis technique are disclosed. In one form of the invention, probabilities are selected to characterize the presence (and in another form, also a quantification of a characteristic) of peaks in an indexed data set for samples that match a reference species, and other probabilities are selected for samples that do not match the reference species. An indexed data set is acquired for a sample, and a determination is made according to techniques exemplified herein as to whether the sample matches or does not match the reference species. When quantification of peak characteristics is undertaken, the model is appropriately expanded, and the analysis accounts for the characteristic model and data. Further techniques are provided to apply the methods and apparatuses to process control, cluster analysis, hypothesis testing, analysis of variance, and other procedures involving multiple comparisons of indexed data.
Zou, Ling; Guo, Qian; Xu, Yi; Yang, Biao; Jiao, Zhuqing; Xiang, Jianbo
2016-04-29
Functional magnetic resonance imaging (fMRI) is an important tool in neuroscience for assessing connectivity and interactions between distant areas of the brain. To find and characterize the coherent patterns of brain activity as a means of identifying brain systems for the cognitive reappraisal of the emotion task, both density-based k-means clustering and independent component analysis (ICA) methods can be applied to characterize the interactions between brain regions involved in cognitive reappraisal of emotion. Our results reveal that, compared with the ICA method, the density-based k-means clustering method provides a higher sensitivity of polymerization. In addition, it is more sensitive to relatively weak functional connection regions. Thus, the study concludes that in the process of receiving emotional stimuli, the most clearly activated areas are mainly distributed in the frontal lobe, the cingulum and near the hypothalamus. Furthermore, the density-based k-means clustering method provides a more reliable approach for follow-up studies of brain functional connectivity.
Mclean, Scott; Salmon, Paul M; Gorman, Adam D; Stevens, Nicholas J; Solomon, Colin
2018-02-01
In the current study, social network analysis (SNA) and notational analysis (NA) methods were applied to examine the goal scoring passing networks (GSPN) for all goals scored at the 2016 European Football Championships. The aim of the study was to determine the GSPN characteristics for the overall tournament, between the group and knockout stages, and for the successful and unsuccessful teams. The study also used degree centrality (DC) metrics as a novel method to determine the relative contributions of the pitch locations involved in the GSPN. To determine changes in GSPN characteristics as a function of changing score line, the analysis considered the match status of the game when goals were scored. There were significant differences for SNA metrics as a function of match status, and for the DC metrics in the comparison of the different pitch locations. There were no differences in the SNA metrics for the GSPN between teams in the group and knockout stages, or between the successful and unsuccessful teams. The results indicate that the GSPN had low values for network density, cohesion, connections, and duration. The networks were direct in terms of pitch zones utilised, where 85% of the GSPN included passes that were played within zones or progressed through the zones towards the goal. SNA and NA metrics were significantly different as a function of changing match status. The current study adds to the previous research on goal scoring in football, and demonstrates a novel method to determine the prominent pitch zones involved in the GSPN. These results have implications for match analysis and the coaching process.
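A toy version of the network construction (hypothetical zones and pass counts, not the study's data) shows how network density and a weighted degree measure, in the spirit of the paper's DC metrics, can be computed with networkx:

    import networkx as nx

    # Hypothetical goal-scoring passing network: nodes are pitch zones
    # (D = defensive, M = midfield, A = attacking), edge weights are pass counts.
    G = nx.DiGraph()
    G.add_weighted_edges_from([("D", "M", 3), ("M", "M", 5), ("M", "A", 4), ("A", "A", 2)])

    density = nx.density(G)
    # Weighted degree as a simple stand-in for degree centrality: how involved
    # each pitch zone is in the goal-scoring passing sequences.
    involvement = dict(G.degree(weight="weight"))
    print(density, involvement)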
Takeda, Kayoko; Takahashi, Kiyoshi; Masukawa, Hiroyuki; Shimamori, Yoshimitsu
2017-01-01
Recently, the practice of active learning has spread, increasingly recognized as an essential component of academic studies. Classes incorporating small group discussion (SGD) are conducted at many universities. At present, assessments of the effectiveness of SGD have mostly involved evaluation by questionnaires conducted by teachers, by peer assessment, and by self-evaluation of students. However, qualitative data, such as open-ended descriptions by students, have not been widely evaluated. As a result, we have been unable to analyze the processes and methods involved in how students acquire knowledge in SGD. In recent years, due to advances in information and communication technology (ICT), text mining has enabled the analysis of qualitative data. We therefore investigated whether the introduction of a learning system comprising the jigsaw method and problem-based learning (PBL) would improve student attitudes toward learning; we did this by text mining analysis of the content of student reports. We found that by applying the jigsaw method before PBL, we were able to improve student attitudes toward learning and increase the depth of their understanding of the area of study as a result of working with others. The use of text mining to analyze qualitative data also allowed us to understand the processes and methods by which students acquired knowledge in SGD and also changes in students' understanding and performance based on improvements to the class. This finding suggests that the use of text mining to analyze qualitative data could enable teachers to evaluate the effectiveness of various methods employed to improve learning.
"Heroes" and "villains" of world history across cultures.
Hanke, Katja; Liu, James H; Sibley, Chris G; Paez, Dario; Gaines, Stanley O; Moloney, Gail; Leong, Chan-Hoong; Wagner, Wolfgang; Licata, Laurent; Klein, Olivier; Garber, Ilya; Böhm, Gisela; Hilton, Denis J; Valchev, Velichko; Khan, Sammyh S; Cabecinhas, Rosa
2015-01-01
Emergent properties of global political culture were examined using data from the World History Survey (WHS) involving 6,902 university students in 37 countries evaluating 40 figures from world history. Multidimensional scaling and factor analysis techniques found only limited forms of universality in evaluations across Western, Catholic/Orthodox, Muslim, and Asian country clusters. The highest consensus across cultures involved scientific innovators, with Einstein having the most positive evaluation overall. Peaceful humanitarians like Mother Theresa and Gandhi followed. There was much less cross-cultural consistency in the evaluation of negative figures, led by Hitler, Osama bin Laden, and Saddam Hussein. After more traditional empirical methods (e.g., factor analysis) failed to identify meaningful cross-cultural patterns, Latent Profile Analysis (LPA) was used to identify four global representational profiles: Secular and Religious Idealists were overwhelmingly prevalent in Christian countries, and Political Realists were common in Muslim and Asian countries. We discuss possible consequences and interpretations of these different representational profiles.
Phipps, Denham L; Tam, W Vanessa; Ashcroft, Darren M
2017-03-01
To explore the combined use of a critical incident database and work domain analysis to understand patient safety issues in a health-care setting. A retrospective review was conducted of incidents reported to the UK National Reporting and Learning System (NRLS) that involved community pharmacy between April 2005 and August 2010. A work domain analysis of community pharmacy was constructed using observational data from 5 community pharmacies, technical documentation, and a focus group with 6 pharmacists. Reports from the NRLS were mapped onto the model generated by the work domain analysis. A total of 14,709 incident reports meeting the selection criteria were retrieved from the NRLS. Descriptive statistical analysis of these reports found that almost all of the incidents involved medication and that the most frequently occurring error types were dose/strength errors, incorrect medication, and incorrect formulation. The work domain analysis identified 4 overall purposes for community pharmacy: business viability, health promotion and clinical services, provision of medication, and use of medication. These purposes were served by lower-order characteristics of the work system (such as the functions, processes and objects). The tasks most frequently implicated in the incident reports were those involving medication storage, assembly, or patient medication records. Combining the insights from different analytical methods improves understanding of patient safety problems. Incident reporting data can be used to identify general patterns, whereas the work domain analysis can generate information about the contextual factors that surround a critical task.
Aschbrenner, Kelly A; Pepin, Renee; Mueser, Kim T; Naslund, John A; Rolin, Stephanie A; Faber, Marjan J; Bartels, Stephen J
2014-01-01
Many older persons with serious mental illness (SMI) suffer from high rates of comorbid medical conditions. Although families play a critical role in psychiatric illness management among adults with SMI, their contributions to improving health outcomes in this population have received little attention. This study explored family involvement in medical care for older adults with SMI. This mixed methods study involved analysis of quantitative data collected from older adults with SMI and cardiovascular risk (n = 28) participating in a pilot study of an intervention designed to improve patient-centered primary care, augmented by qualitative interviews with their relatives (n = 13) to explore family involvement in medical care. Approximately 89% of older adults with SMI reported family involvement in at least one aspect of their medical care (e.g., medication reminders, medical decision making). However, many family members reported that they were rarely involved in their relative's medical visits, and most did not perceive a need to be involved during routine care. Family members identified obesity as their relative's primary health concern and many wanted guidance from providers on effective strategies for supporting weight loss. Although many family members did not perceive a need to be involved in their relative's routine medical visits, they expressed interest in talking with providers about how to help their relative change unhealthy behaviors. Educating patients, families, and providers about the potential benefits of family involvement in medical care, including routine medical visits for persons with SMI and cardiovascular health risk, may promote patient- and family-centered collaboration in this high-risk population.
A strategy to apply quantitative epistasis analysis on developmental traits.
Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei
2017-05-15
Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performances comparable to those of methods used in single-cell growth studies. Compared with qualitative observations, this method of quantitative epistasis enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
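The core quantity in such a screen is an epistasis score comparing the measured double-perturbation phenotype with the value expected if the two genes acted independently. A minimal sketch under the standard multiplicative null model (the paper's statistical treatment is more elaborate; all inputs here are assumed replicate phenotype measurements such as body lengths):

    import numpy as np

    def epistasis_score(wt, mut_a, mut_b, double):
        # Phenotypes relative to wild type; under the multiplicative null model
        # the expected double-mutant phenotype is the product of the singles.
        wa = np.mean(mut_a) / np.mean(wt)
        wb = np.mean(mut_b) / np.mean(wt)
        wab = np.mean(double) / np.mean(wt)
        return wab - wa * wb   # 0 => no detectable genetic interaction

    # e.g. epistasis_score(wt_lengths, brc1_lengths, him3_lengths, double_lengths)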
Suzuki, K; Barbiellini, B; Orikasa, Y; Go, N; Sakurai, H; Kaprzyk, S; Itou, M; Yamamoto, K; Uchimoto, Y; Wang, Yung Jui; Hafiz, H; Bansil, A; Sakurai, Y
2015-02-27
We present an incisive spectroscopic technique for directly probing redox orbitals based on bulk electron momentum density measurements via high-resolution x-ray Compton scattering. Application of our method to spinel Li_{x}Mn_{2}O_{4}, a lithium ion battery cathode material, is discussed. The orbital involved in the lithium insertion and extraction process is shown to mainly be the oxygen 2p orbital. Moreover, the manganese 3d states are shown to experience spatial delocalization involving 0.16±0.05 electrons per Mn site during the battery operation. Our analysis provides a clear understanding of the fundamental redox process involved in the working of a lithium ion battery.
Advanced superposition methods for high speed turbopump vibration analysis
NASA Technical Reports Server (NTRS)
Nielson, C. E.; Campany, A. D.
1981-01-01
The small, high pressure Mark 48 liquid hydrogen turbopump was analyzed and dynamically tested to determine the cause of high speed vibration at an operating speed of 92,400 rpm. This approaches the design point operating speed of 95,000 rpm. The initial dynamic analysis in the design stage and subsequent further analysis of the rotor only dynamics failed to predict the vibration characteristics found during testing. An advanced procedure for dynamics analysis was used in this investigation. The procedure involves developing accurate dynamic models of the rotor assembly and casing assembly by finite element analysis. The dynamically instrumented assemblies are independently rap tested to verify the analytical models. The verified models are then combined by modal superposition techniques to develop a completed turbopump model where dynamic characteristics are determined. The results of the dynamic testing and analysis obtained are presented and methods of moving the high speed vibration characteristics to speeds above the operating range are recommended. Recommendations for use of these advanced dynamic analysis procedures during initial design phases are given.
A two-step FEM-SEM approach for wave propagation analysis in cable structures
NASA Astrophysics Data System (ADS)
Zhang, Songhan; Shen, Ruili; Wang, Tao; De Roeck, Guido; Lombaert, Geert
2018-02-01
Vibration-based methods are among the most widely studied in structural health monitoring (SHM). It is well known, however, that the low-order modes, characterizing the global dynamic behaviour of structures, are relatively insensitive to local damage. Such local damage may be easier to detect by methods based on wave propagation which involve local high frequency behaviour. The present work considers the numerical analysis of wave propagation in cables. A two-step approach is proposed which allows taking into account the cable sag and the distribution of the axial forces in the wave propagation analysis. In the first step, the static deformation and internal forces are obtained by the finite element method (FEM), taking into account geometric nonlinear effects. In the second step, the results from the static analysis are used to define the initial state of the dynamic analysis which is performed by means of the spectral element method (SEM). The use of the SEM in the second step of the analysis allows for a significant reduction in computational costs as compared to a FE analysis. This methodology is first verified by means of a full FE analysis for a single stretched cable. Next, simulations are made to study the effects of damage in a single stretched cable and a cable-supported truss. The results of the simulations show how damage significantly affects the high frequency response, confirming the potential of wave propagation based methods for SHM.
NASA Technical Reports Server (NTRS)
Stolc, Viktor; Samanta, Manoj Pratim; Tongprasit, Waraporn; Marshall, Wallace F.
2005-01-01
The important role that cilia and flagella play in human disease creates an urgent need to identify genes involved in ciliary assembly and function. The strong and specific induction of flagellar-coding genes during flagellar regeneration in Chlamydomonas reinhardtii suggests that transcriptional profiling of such cells would reveal new flagella-related genes. We have conducted a genome-wide analysis of RNA transcript levels during flagellar regeneration in Chlamydomonas by using maskless photolithography method-produced DNA oligonucleotide microarrays with unique probe sequences for all exons of the 19,803 predicted genes. This analysis represents a previously uncharacterized whole-genome transcriptional activity profiling study in this important model organism. Analysis of strongly induced genes reveals a large set of known flagellar components and also identifies a number of important disease-related proteins as being involved with cilia and flagella, including the zebrafish polycystic kidney genes Qilin, Reptin, and Pontin, as well as the testis-expressed tubby-like protein TULP2.
Busch, Hauke; Boerries, Melanie; Bao, Jie; Hanke, Sebastian T; Hiss, Manuel; Tiko, Theodhor; Rensing, Stefan A
2013-01-01
Transcription factors (TFs) often trigger developmental decisions, yet their transcripts are often only moderately regulated and thus not easily detected by conventional statistics on expression data. Here we present a method that allows such genes to be determined based on trajectory analysis of time-resolved transcriptome data. As a proof of principle, we have analysed apical stem cells of filamentous moss (P. patens) protonemata that develop from leaflets upon their detachment from the plant. Using our novel correlation analysis of the post-detachment transcriptome kinetics, we predict five out of 1,058 TFs to be involved in the signaling leading to the establishment of pluripotency. Among the predicted regulators is the basic helix-loop-helix TF PpRSL1, which we show to be involved in the establishment of apical stem cells in P. patens. Our methodology is expected to aid analysis of key players of developmental decisions in complex plant and animal systems.
NASA Astrophysics Data System (ADS)
Zhang, Rui; Jiang, Shuai; Liu, Yi-Rong; Wen, Hui; Feng, Ya-Juan; Huang, Teng; Huang, Wei
2018-05-01
Despite the very important role of atmospheric aerosol nucleation in climate change and air quality, the detailed aerosol nucleation mechanism is still unclear. Here we investigated formic acid (FA)-involved multicomponent nucleation in molecular clusters containing sulfuric acid (SA), dimethylamine (DMA) and water (W) through a quantum chemical method. The thermodynamics and kinetics analysis was based on the global minima given by the Basin-Hopping (BH) algorithm coupled with Density Functional Theory (DFT) and subsequent benchmarked calculations. Interaction analyses based on ElectroStatic Potential (ESP), topological and atomic charges were then made to characterize the binding features of the clusters. The results show that FA binds weakly with the other molecules in the cluster, while W binds more weakly still. Further kinetic analysis of the time evolution of the clusters shows that, despite formic acid's weak interaction with the other nucleation precursors, its effect on the steady-state sulfuric acid dimer concentration cannot be neglected owing to its high concentration in the atmosphere.
Acoustical Applications of the HHT Method
NASA Technical Reports Server (NTRS)
Huang, Norden E.
2003-01-01
A document discusses applications of a method based on the Huang-Hilbert transform (HHT). The method was described, without the HHT name, in Analyzing Time Series Using EMD and Hilbert Spectra (GSC-13817), NASA Tech Briefs, Vol. 24, No. 10 (October 2000), page 63. To recapitulate: The method is especially suitable for analyzing time-series data that represent nonstationary and nonlinear physical phenomena. The method involves the empirical mode decomposition (EMD), in which a complicated signal is decomposed into a finite number of functions, called intrinsic mode functions (IMFs), that admit well-behaved Hilbert transforms. The HHT consists of the combination of EMD and Hilbert spectral analysis.
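The second half of the HHT, Hilbert spectral analysis, is easy to sketch: given one intrinsic mode function from a prior EMD step, the analytic signal yields instantaneous amplitude and frequency. A synthetic chirp stands in for a real IMF here (illustrative assumption only; the EMD sifting step itself is omitted):

    import numpy as np
    from scipy.signal import hilbert

    fs = 1000.0
    t = np.arange(0.0, 1.0, 1.0 / fs)
    imf = np.cos(2 * np.pi * (5 * t + 10 * t**2))   # instantaneous frequency 5 -> 25 Hz

    z = hilbert(imf)                                # analytic signal
    amplitude = np.abs(z)
    phase = np.unwrap(np.angle(z))
    inst_freq = np.diff(phase) / (2 * np.pi) * fs   # Hz, one value per sample interval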
A Sequential Optimization Sampling Method for Metamodels with Radial Basis Functions
Pan, Guang; Ye, Pengcheng; Yang, Zhidong
2014-01-01
Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is strongly affected by the sampling methods. In this paper, a new sequential optimization sampling method is proposed. Based on the new sampling method, metamodels can be constructed repeatedly through the addition of sampling points, namely, extrema points of metamodels and minimum points of the density function. This procedure yields progressively more accurate metamodels. The validity and effectiveness of the proposed sampling method are examined by studying typical numerical examples.
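One ingredient of such a scheme, infill at extrema of the current metamodel, can be sketched with SciPy's radial basis function interpolator. This is a sketch under assumptions (the paper's full method also adds minimum points of a density function; the test function and settings are placeholders):

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    def expensive(x):                        # stand-in for a costly simulation
        return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

    X = np.linspace(0.0, 1.0, 5)[:, None]    # initial design
    y = expensive(X[:, 0])
    grid = np.linspace(0.0, 1.0, 1001)[:, None]

    for _ in range(10):                      # sequential infill loop
        model = RBFInterpolator(X, y)        # rebuild the RBF metamodel
        x_new = grid[np.argmin(model(grid))] # infill at the metamodel minimum
        if np.min(np.abs(X[:, 0] - x_new[0])) < 1e-6:
            break                            # landed on an existing sample: stop
        X = np.vstack([X, [x_new]])
        y = np.append(y, expensive(x_new[0]))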
A Selective Review of Group Selection in High-Dimensional Models
Huang, Jian; Breheny, Patrick; Ma, Shuangge
2013-01-01
Grouping structures arise naturally in many statistical modeling problems. Several methods have been proposed for variable selection that respect grouping structure in variables. Examples include the group LASSO and several concave group selection methods. In this article, we give a selective review of group selection concerning methodological developments, theoretical properties and computational algorithms. We pay particular attention to group selection methods involving concave penalties. We address both group selection and bi-level selection methods. We describe several applications of these methods in nonparametric additive models, semiparametric regression, seemingly unrelated regressions, genomic data analysis and genome wide association studies. We also highlight some issues that require further study.
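At the heart of many group-selection algorithms is a blockwise update that either shrinks or entirely zeroes a whole group of coefficients. A sketch of the group-LASSO proximal operator (illustrative; the concave penalties the review emphasizes, such as group SCAD or MCP, use different thresholding rules):

    import numpy as np

    def group_soft_threshold(beta_g, lam):
        # Proximal operator of lam * ||beta_g||_2: the whole group is dropped
        # when its Euclidean norm falls below lam, otherwise shrunk toward zero.
        norm = np.linalg.norm(beta_g)
        if norm <= lam:
            return np.zeros_like(beta_g)
        return (1.0 - lam / norm) * beta_g

    # Within blockwise coordinate descent, each group's least-squares update is
    # passed through this operator, giving all-in or all-out group selection.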
Farash, Katherine; Hanson, Erin K.; Ballantyne, Jack
2015-01-01
DNA profiles can be obtained from ‘touch DNA’ evidence, which comprises microscopic traces of human biological material. Current methods for the recovery of trace DNA employ cotton swabs or adhesive tape to sample an area of interest. However, such a ‘blind-swabbing’ approach will co-sample cellular material from the different individuals, even if the individuals’ cells are located in geographically distinct locations on the item. Thus, some of the DNA mixtures encountered in touch DNA samples are artificially created by the swabbing itself. In some instances, a victim’s DNA may be found in significant excess thus masking any potential perpetrator’s DNA. In order to circumvent the challenges with standard recovery and analysis methods, we have developed a lower cost, ‘smart analysis’ method that results in enhanced genetic analysis of touch DNA evidence. We describe an optimized and efficient micromanipulation recovery strategy for the collection of bio-particles present in touch DNA samples, as well as an enhanced amplification strategy involving a one-step 5 µl microvolume lysis/STR amplification to permit the recovery of STR profiles from the bio-particle donor(s). The use of individual or few (i.e., “clumps”) bioparticles results in the ability to obtain single source profiles. These procedures represent alternative enhanced techniques for the isolation and analysis of single bioparticles from forensic touch DNA evidence. While not necessary in every forensic investigation, the method could be highly beneficial for the recovery of a single source perpetrator DNA profile in cases involving physical assault (e.g., strangulation) that may not be possible using standard analysis techniques. Additionally, the strategies developed here offer an opportunity to obtain genetic information at the single cell level from a variety of other non-forensic trace biological material.
Lee, Bai Qin; Wan Mohamed Radzi, Che Wan Jasimah Bt; Khor, Sook Mei
2016-02-05
This paper reports the application of hexamethyldisilazane-trimethylsilyl trifluoromethanesulfonate (HMDS-TMSOTf) for the simultaneous silylation of 3-monochloro-1,2-propanediol (3-MCPD) and 1,3-dichloropropanol (1,3-DCP) in solid and liquid food samples. 3-MCPD and 1,3-DCP are chloropropanols that have been established as Group 2B carcinogens in clinical testing. They can be found in heat-processed food, especially when an extended high-temperature treatment is required. However, the current AOAC detection method is time-consuming and expensive. Thus, HMDS-TMSOTf was used in this study to provide a safer and more cost-effective alternative to the HFBI method. Three important steps are involved in the quantification of 3-MCPD and 1,3-DCP: extraction, derivatization and quantification. The optimization of the derivatization process, which focused on the catalyst volume, derivatization temperature, and derivatization time, was performed based on the findings obtained from both Box-Behnken modeling and a real experimental set-up. With the optimized conditions, the newly developed method was used for actual food sample quantification and the results were compared with those obtained via the standard AOAC method. The developed method required fewer samples and reagents but achieved lower limits of quantification (0.0043 mg/L for 1,3-DCP and 0.0011 mg/L for 3-MCPD) and detection (0.0028 mg/L for 1,3-DCP and 0.0008 mg/L for 3-MCPD). All the detected concentrations were below the maximum tolerable limit of 0.02 mg/L. The percentage of recovery obtained from food sample analysis was between 83% and 96%. The new procedure was validated against the AOAC method and showed comparable performance. The HMDS-TMSOTf derivatization strategy is capable of simultaneously derivatizing 1,3-DCP and 3-MCPD at room temperature, and it serves as a rapid, sensitive, and accurate analytical method for food sample analysis.
Pataky, Todd C; Robinson, Mark A; Vanrenterghem, Jos
2018-01-03
Statistical power assessment is an important component of hypothesis-driven research, but until relatively recently (mid-1990s) no methods were available for assessing power in experiments involving continuum data and in particular those involving one-dimensional (1D) time series. The purpose of this study was to describe how continuum-level power analyses can be used to plan hypothesis-driven biomechanics experiments involving 1D data. In particular, we demonstrate how theory- and pilot-driven 1D effect modeling can be used for sample-size calculations for both single- and multi-subject experiments. For theory-driven power analysis we use the minimum jerk hypothesis and single-subject experiments involving straight-line, planar reaching. For pilot-driven power analysis we use a previously published knee kinematics dataset. Results show that powers on the order of 0.8 can be achieved with relatively small sample sizes: five and ten for within-subject minimum jerk analysis and between-subject knee kinematics, respectively. However, the appropriate sample size depends on a priori justifications of biomechanical meaning and effect size. The main advantage of the proposed technique is that it encourages a priori justification regarding the clinical and/or scientific meaning of particular 1D effects, thereby robustly structuring subsequent experimental inquiry. In short, it shifts focus from a search for significance to a search for non-rejectable hypotheses.
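A simulation-based sketch of a continuum-level power calculation. The paper's analyses rest on random field theory; here a conservative Bonferroni threshold across nodes stands in for it, and the smooth noise model, effect shape, and all parameters are assumptions:

    import numpy as np
    from scipy import stats
    from scipy.ndimage import gaussian_filter1d

    def power_1d(effect, n, sd=1.0, fwhm=20.0, nsim=1000, alpha=0.05):
        # Monte Carlo power: one-sample t-test at each of Q nodes of a 1D
        # continuum; detection = any node exceeding a Bonferroni threshold.
        rng = np.random.default_rng(0)
        Q = effect.size
        sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))   # FWHM to Gaussian sigma
        t_crit = stats.t.ppf(1.0 - alpha / (2.0 * Q), df=n - 1)
        hits = 0
        for _ in range(nsim):
            noise = gaussian_filter1d(rng.normal(size=(n, Q)), sigma, axis=1)
            noise *= sd / noise.std()                        # restore unit variance
            y = effect + noise
            t = y.mean(0) / (y.std(0, ddof=1) / np.sqrt(n))
            hits += np.abs(t).max() > t_crit
        return hits / nsim

    effect = 0.8 * np.exp(-np.linspace(-3, 3, 101) ** 2)     # hypothetical 1D effect
    print(power_1d(effect, n=10))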
Focused ion beam source method and apparatus
Pellin, Michael J.; Lykke, Keith R.; Lill, Thorsten B.
2000-01-01
A focused ion beam having a cross section of submicron diameter, a high ion current, and a narrow energy range is generated from a target comprised of particle source material by laser ablation. The method involves directing a laser beam having a cross section of critical diameter onto the target, producing a cloud of laser ablated particles having unique characteristics, and extracting and focusing a charged particle beam from the laser ablated cloud. The method is especially suited for producing focused ion beams for semiconductor device analysis and modification.
[A quick methodology for drug intelligence using profiling of illicit heroin samples].
Zhang, Jianxin; Chen, Cunyi
2012-07-01
The aim of the paper was to evaluate a link between two heroin seizures using a descriptive method. The system involved the derivatization and gas chromatographic separation of samples followed by a fully automatic data analysis and transfer to a database. Comparisons used the square cosine function between two chromatograms assimilated to vectors. The method showed good discriminatory capabilities. The probability of false positives was extremely low. In conclusion, this method proved to be efficient and reliable, and appears suitable for estimating the links between illicit heroin samples.
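The comparison metric itself is one line of linear algebra: each chromatogram is treated as a vector (e.g., of peak areas) and two seizures are scored by the squared cosine of the angle between them. A sketch (the feature choice and the link threshold in the comment are illustrative assumptions):

    import numpy as np

    def squared_cosine(u, v):
        # Square cosine between two chromatograms treated as vectors;
        # values near 1 suggest a link between the two heroin seizures.
        u, v = np.asarray(u, float), np.asarray(v, float)
        c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return c * c

    # e.g. squared_cosine(peak_areas_seizure_1, peak_areas_seizure_2) > 0.99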
Stress analysis of circular semimonocoque cylinders with cutouts
NASA Technical Reports Server (NTRS)
Mccomb, Harvey G., Jr.
1955-01-01
A method is presented for analyzing the stresses about cutouts in circular semimonocoque cylinders with flexible rings. The method involves the use of so-called perturbation stress distributions which are superposed on the stress distribution that would exist in the structure with no cutout in such a way as to give the effects of a cutout. The method can be used for any loading case for which the structure without the cutout can be analyzed and is sufficiently versatile to account for stringer and shear reinforcement about the cutout.
NASA Technical Reports Server (NTRS)
Young, J. W.; Schy, A. A.; Johnson, K. G.
1977-01-01
An analytical method has been developed for predicting critical control inputs for which nonlinear rotational coupling may cause sudden jumps in aircraft response. The analysis includes the effect of aerodynamics which are nonlinear in angle of attack. The method involves the simultaneous solution of two polynomials in roll rate, whose coefficients are functions of angle of attack and the control inputs. Results obtained using this procedure are compared with calculated time histories to verify the validity of the method for predicting jump-like instabilities.
Methods Used to Support a Life Cycle of Complex Engineering Products
NASA Astrophysics Data System (ADS)
Zakharova, Alexandra A.; Kolegova, Olga A.; Nekrasova, Maria E.; Eremenko, Andrey O.
2016-08-01
The management of companies involved in the design, development and operation of complex engineering products recognizes the relevance of creating systems for product lifecycle management. A system of methods is proposed to support the life cycles of complex engineering products, based on fuzzy set theory and hierarchical analysis. The system of methods demonstrates the grounds for making strategic decisions in an environment of uncertainty, allows the use of expert knowledge, and provides interconnection of decisions at all phases of strategic management and all stages of a complex engineering product lifecycle.
Purves, Randy W; Khazaei, Hamid; Vandenberg, Albert
2018-08-01
Although faba bean provides environmental and health benefits, vicine and convicine (v-c) limit its use as a source of vegetable protein. Crop improvement efforts to minimize v-c concentration require low-cost, rapid screening methods to distinguish between high and low v-c genotypes to accelerate development of new cultivars and to detect out-crossing events. To assist crop breeders, we developed a unique and rapid screening method that uses a 60 s instrumental analysis step to accurately distinguish between high and low v-c genotypes. The method involves flow injection analysis (FIA) coupled with tandem mass spectrometry (i.e., selective reaction monitoring, SRM). Using seeds with known v-c levels as calibrants, measured v-c levels were comparable with liquid chromatography (LC)-SRM results and the method was used to screen 370 faba bean genotypes. Widespread use of FIA-SRM will accelerate breeding of low v-c faba bean, thereby alleviating concerns about anti-nutritional effects of v-c in this crop.
General methods for sensitivity analysis of equilibrium dynamics in patch occupancy models
Miller, David A.W.
2012-01-01
Sensitivity analysis is a useful tool for the study of ecological models that has many potential applications for patch occupancy modeling. Drawing from the rich foundation of existing methods for Markov chain models, I demonstrate new methods for sensitivity analysis of the equilibrium state dynamics of occupancy models. Estimates from three previous studies are used to illustrate the utility of the sensitivity calculations: a joint occupancy model for a prey species, its predators, and habitat used by both; occurrence dynamics from a well-known metapopulation study of three butterfly species; and Golden Eagle occupancy and reproductive dynamics. I show how to deal efficiently with multistate models and how to calculate sensitivities involving derived state variables and lower-level parameters. In addition, I extend methods to incorporate environmental variation by allowing for spatial and temporal variability in transition probabilities. The approach used here is concise and general and can fully account for environmental variability in transition parameters. The methods can be used to improve inferences in occupancy studies by quantifying the effects of underlying parameters, aiding prediction of future system states, and identifying priorities for sampling effort.
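For the simplest single-species case the idea reduces to differentiating the equilibrium occupancy of a two-state Markov chain with respect to its transition parameters. A numerical sketch (colonisation gamma, extinction eps; the paper's machinery generalises this to multistate models, derived variables, and environmental variability):

    import numpy as np

    def equilibrium_occupancy(gamma, eps):
        # Stationary occupancy of the two-state patch model: psi* = gamma / (gamma + eps).
        return gamma / (gamma + eps)

    def sensitivities(f, params, h=1e-6):
        # Central-difference sensitivity of f to each parameter.
        p = np.asarray(params, float)
        grads = []
        for i in range(p.size):
            dp = np.zeros_like(p)
            dp[i] = h
            grads.append((f(*(p + dp)) - f(*(p - dp))) / (2.0 * h))
        return grads

    # Analytically eps/(gamma+eps)^2 and -gamma/(gamma+eps)^2:
    print(sensitivities(equilibrium_occupancy, [0.2, 0.1]))   # ~[1.11, -2.22]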
Multiple directed graph large-class multi-spectral processor
NASA Technical Reports Server (NTRS)
Casasent, David; Liu, Shiaw-Dong; Yoneyama, Hideyuki
1988-01-01
Numerical analysis techniques for the interpretation of high-resolution imaging-spectrometer data are described and demonstrated. The method proposed involves the use of (1) a hierarchical classifier with a tree structure generated automatically by a Fisher linear-discriminant-function algorithm and (2) a novel multiple-directed-graph scheme which reduces the local maxima and the number of perturbations required. Results for a 500-class test problem involving simulated imaging-spectrometer data are presented in tables and graphs; 100-percent-correct classification is achieved with an improvement factor of 5.
Meta-analysis of Odds Ratios: Current Good Practices
Chang, Bei-Hung; Hoaglin, David C.
2016-01-01
Background Many systematic reviews of randomized clinical trials lead to meta-analyses of odds ratios. The customary methods of estimating an overall odds ratio involve weighted averages of the individual trials’ estimates of the logarithm of the odds ratio. That approach, however, has several shortcomings, arising from assumptions and approximations, that render the results unreliable. Although the problems have been documented in the literature for many years, the conventional methods persist in software and applications. A well-developed alternative approach avoids the approximations by working directly with the numbers of subjects and events in the arms of the individual trials. Objective We aim to raise awareness of methods that avoid the conventional approximations, can be applied with widely available software, and produce more-reliable results. Methods We summarize the fixed-effect and random-effects approaches to meta-analysis; describe conventional, approximate methods and alternative methods; apply the methods in a meta-analysis of 19 randomized trials of endoscopic sclerotherapy in patients with cirrhosis and esophagogastric varices; and compare the results. We demonstrate the use of SAS, Stata, and R software for the analysis. Results In the example, point estimates and confidence intervals for the overall log-odds-ratio differ between the conventional and alternative methods, in ways that can affect inferences. Programming is straightforward in the three software packages; an appendix gives the details. Conclusions The modest additional programming required should not be an obstacle to adoption of the alternative methods. Because their results are unreliable, use of the conventional methods for meta-analysis of odds ratios should be discontinued.
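For contrast, the conventional approach the authors argue against is compact enough to sketch: per-trial log odds ratios with a 0.5 continuity correction, pooled by an inverse-variance weighted average. This fixed-effect sketch is illustrative only, and is exactly the kind of approximation the paper recommends replacing:

    import numpy as np

    def fixed_effect_log_or(tables):
        # tables: iterable of 2x2 counts (events_T, nonevents_T, events_C, nonevents_C).
        log_or, w = [], []
        for a, b, c, d in tables:
            a, b, c, d = (x + 0.5 for x in (a, b, c, d))     # continuity correction
            log_or.append(np.log(a * d / (b * c)))
            w.append(1.0 / (1 / a + 1 / b + 1 / c + 1 / d))  # inverse variance
        log_or, w = np.array(log_or), np.array(w)
        est = np.sum(w * log_or) / np.sum(w)
        se = 1.0 / np.sqrt(np.sum(w))
        return est, (est - 1.96 * se, est + 1.96 * se)       # pooled log OR, 95% CI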
The most common technologies and tools for functional genome analysis.
Gasperskaja, Evelina; Kučinskas, Vaidutis
2017-01-01
Since the sequence of the human genome is complete, the main issue is how to understand the information written in the DNA sequence. Despite the numerous genome-wide studies that have already been performed, the challenge of determining the function of genes, gene products, and their interactions is still open. As changes in the human genome are highly likely to cause pathological conditions, functional analysis is vitally important for human health. For many years there have been a variety of technologies and tools used in functional genome analysis. However, only in the past decade has there been rapid, revolutionary progress and improvement in high-throughput methods, which range from traditional real-time polymerase chain reaction to more complex systems, such as next-generation sequencing or mass spectrometry. Furthermore, not only laboratory investigation but also accurate bioinformatic analysis is required for reliable scientific results. These methods give an opportunity for accurate and comprehensive functional analysis that involves various fields of study: genomics, epigenomics, proteomics, and interactomics. This is essential for filling the gaps in the knowledge about dynamic biological processes at both the cellular and organismal level. However, each method has both advantages and limitations that should be taken into account before choosing the right method for particular research, in order to ensure a successful study. For this reason, the present review paper aims to describe the most frequent and widely-used methods for comprehensive functional analysis.
Futamure, Sumire; Bonnet, Vincent; Dumas, Raphael; Venture, Gentiane
2017-11-07
This paper presents a method allowing a simple and efficient sensitivity analysis of the dynamic parameters of a complex whole-body human model. The proposed method is based on the ground reaction and joint moment regressor matrices, developed initially in robotics system identification theory, and involved in the equations of motion of the human body. The regressor matrices are linear in the segment inertial parameters, allowing the use of simple sensitivity analysis methods. The sensitivity analysis method was applied to gait dynamics and kinematics data of nine subjects, with a 15-segment 3D model of the locomotor apparatus. According to the proposed sensitivity indices, 76 of the 150 segment inertial parameters of the mechanical model were considered not influential for gait. The main findings were that the segment masses were influential and that, with the exception of the trunk, moments of inertia were not influential for the computation of the ground reaction forces and moments and the joint moments. The same method also shows numerically that at least 90% of the lower-limb joint moments during the stance phase can be estimated from force-plate and kinematics data alone, without knowing any of the segment inertial parameters.
Van Driel, Robin; Trask, Catherine; Johnson, Peter W; Callaghan, Jack P; Koehoorn, Mieke; Teschke, Kay
2013-01-01
Measuring trunk posture in the workplace commonly involves subjective observation or self-report methods or the use of costly and time-consuming motion analysis systems (current gold standard). This work compared trunk inclination measurements using a simple data-logging inclinometer with trunk flexion measurements using a motion analysis system, and evaluated adding measures of subject anthropometry to exposure prediction models to improve the agreement between the two methods. Simulated lifting tasks (n=36) were performed by eight participants, and trunk postures were simultaneously measured with each method. There were significant differences between the two methods, with the inclinometer initially explaining 47% of the variance in the motion analysis measurements. However, adding one key anthropometric parameter (lower arm length) to the inclinometer-based trunk flexion prediction model reduced the differences between the two systems and accounted for 79% of the motion analysis method's variance. Although caution must be applied when generalizing lower-arm length as a correction factor, the overall strategy of anthropometric modeling is a novel contribution. In this lifting-based study, by accounting for subject anthropometry, a single, simple data-logging inclinometer shows promise for trunk posture measurement and may have utility in larger-scale field studies where similar types of tasks are performed.
Costello, Tracy J; Falk, Catherine T; Ye, Kenny Q
2003-01-01
The Framingham Heart Study data, as well as a related simulated data set, were generously provided to the participants of the Genetic Analysis Workshop 13 in order that newly developed and emerging statistical methodologies could be tested on that well-characterized data set. The impetus driving the development of novel methods is to elucidate the contributions of genes, environment, and interactions between and among them, as well as to allow comparison between and validation of methods. The seven papers that comprise this group used data-mining methodologies (tree-based methods, neural networks, discriminant analysis, and Bayesian variable selection) in an attempt to identify the underlying genetics of cardiovascular disease and related traits in the presence of environmental and genetic covariates. Data-mining strategies are gaining popularity because they are extremely flexible and may have greater efficiency and potential in identifying the factors involved in complex disorders. While the methods grouped together here constitute a diverse collection, some papers asked similar questions with very different methods, while others used the same underlying methodology to ask very different questions. This paper briefly describes the data-mining methodologies applied to the Genetic Analysis Workshop 13 data sets and the results of those investigations.
Cloyes, Kristin Gates
2006-01-01
Nursing literature is replete with discussions about the ethics of research interviews. These largely involve questions of method, and how careful study design and data collection technique can render studies more ethical. Analysis, the perennial black box of the research process, is rarely discussed as an ethical practice. In this paper, I introduce the idea that analysis itself is an ethical practice. Specifically, I argue that political discourse analysis of research interviews is an ethical practice. I use examples from my own research in a prison control unit to illustrate what this might look like, and what is at stake.
Implementation of a computer database testing and analysis program.
Rouse, Deborah P
2007-01-01
The author is the coordinator of a computer software database testing and analysis program implemented in an associate degree nursing program. Computer software database programs help support the testing development and analysis process. Critical thinking is measurable and promoted with their use. The reader of this article will learn what is involved in procuring and implementing a computer database testing and analysis program in an academic nursing program. The use of the computerized database for testing and analysis will be approached as a method to promote and evaluate the nursing student's critical thinking skills and to prepare the nursing student for the National Council Licensure Examination.
A Data Driven Model for Predicting RNA-Protein Interactions based on Gradient Boosting Machine.
Jain, Dharm Skandh; Gupte, Sanket Rajan; Aduri, Raviprasad
2018-06-22
RNA protein interactions (RPI) play a pivotal role in the regulation of various biological processes. Experimental validation of RPI has been time-consuming, paving the way for computational prediction methods. The major limiting factor of these methods has been the accuracy and confidence of the predictions, and our in-house experiments show that they fail to accurately predict RPI involving short RNA sequences such as TERRA RNA. Here, we present a data-driven model for RPI prediction using a gradient boosting classifier. Amino acids and nucleotides are classified based on the high-resolution structural data of RNA protein complexes. The minimum structural unit consisting of five residues is used as the descriptor. Comparative analysis of existing methods shows the consistently higher performance of our method irrespective of the length of RNA present in the RPI. The method has been successfully applied to map RPI networks involving both long noncoding RNA as well as TERRA RNA. The method is also shown to successfully predict RNA and protein hubs present in RPI networks of four different organisms. The robustness of this method will provide a way for predicting RPI networks of yet unknown interactions for both long noncoding RNA and microRNA.
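A schematic of the classification step only, not the paper's pipeline: the real descriptors are derived from five-residue structural units of RNA-protein complexes, so the random features and all settings below are placeholders:

    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 40))       # 500 candidate RNA-protein pairs, 40 features
    y = rng.integers(0, 2, size=500)     # 1 = interacting, 0 = non-interacting

    clf = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05,
                                     max_depth=3, random_state=0)
    print(cross_val_score(clf, X, y, cv=5).mean())   # chance-level on random data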
Visual Aggregate Analysis of Eligibility Features of Clinical Trials
He, Zhe; Carini, Simona; Sim, Ida; Weng, Chunhua
2015-01-01
Objective To develop a method for profiling the collective populations targeted for recruitment by multiple clinical studies addressing the same medical condition using one eligibility feature each time. Methods Using a previously published database COMPACT as the backend, we designed a scalable method for visual aggregate analysis of clinical trial eligibility features. This method consists of four modules for eligibility feature frequency analysis, query builder, distribution analysis, and visualization, respectively. This method is capable of analyzing (1) frequently used qualitative and quantitative features for recruiting subjects for a selected medical condition, (2) distribution of study enrollment on consecutive value points or value intervals of each quantitative feature, and (3) distribution of studies on the boundary values, permissible value ranges, and value range widths of each feature. All analysis results were visualized using Google Charts API. Five recruited potential users assessed the usefulness of this method for identifying common patterns in any selected eligibility feature for clinical trial participant selection. Results We implemented this method as a Web-based analytical system called VITTA (Visual Analysis Tool of Clinical Study Target Populations). We illustrated the functionality of VITTA using two sample queries involving quantitative features BMI and HbA1c for conditions “hypertension” and “Type 2 diabetes”, respectively. The recruited potential users rated the user-perceived usefulness of VITTA with an average score of 86.4/100. Conclusions We contributed a novel aggregate analysis method to enable the interrogation of common patterns in quantitative eligibility criteria and the collective target populations of multiple related clinical studies. A larger-scale study is warranted to formally assess the usefulness of VITTA among clinical investigators and sponsors in various therapeutic areas.
The Significance of Scalp Involvement in Pemphigus: A Literature Review
Sar-Pomian, Marta; Rudnicka, Lidia
2018-01-01
Scalp is a unique location for pemphigus because of the abundance of desmogleins localized in hair follicles. Scalp involvement is observed in up to 60% of patients in the course of pemphigus. The lesions may occasionally lead to alopecia. Unforced removal of anagen hairs in a pull test is a sign of high disease activity. Direct immunofluorescence of plucked hair bulbs is considered a reliable diagnostic method in patients with pemphigus. Follicular acantholysis is a characteristic histopathological feature of pemphigus lesions localized on the scalp. Trichoscopy may serve as a supplementary method in the diagnosis of pemphigus. This review summarizes the most recent data concerning scalp involvement in pemphigus vulgaris and pemphigus foliaceus. A systematic literature search was conducted in three medical databases: PubMed, Embase, and Web of Science. The analysis included literature data about desmoglein distribution in hair follicles, as well as information about clinical manifestations, histopathology, immunopathology, and trichoscopy of scalp lesions in pemphigus and their response to treatment.
Mitrofanenko, Tamara; Snajdr, Julia; Muhar, Andreas; Penker, Marianne; Schauppenlehner-Kloyber, Elisabeth
2018-05-22
Stakeholder participation is of high importance in UNESCO biosphere reserves as model regions for sustainable development; however, certain groups remain underrepresented. The paper proposes Intergenerational Practice (IP) as a means of involving youth and elderly women and explores its options and barriers, using the example of the Salzburger Lungau and Kärntner Nockberge Biosphere Reserve in Austria. Case study analysis is used involving mixed methods. The results reveal obstacles and motivations to participating in biosphere reserve implementation and intergenerational activities for the youth and the elderly women and imply that much potential for IP exists in the biosphere reserve region. The authors propose suitable solutions from the intergenerational field to overcome identified participation obstacles and suggest benefits of incorporating IP as a management tool into biosphere reserve activities. Suggestions for future research include evaluating applications of IP in the context of protected areas, testing of methods used in other contexts, and contribution to theory development.
Fathiazar, Elham; Anemuller, Jorn; Kretzberg, Jutta
2016-08-01
Voltage-Sensitive Dye (VSD) imaging is an optical imaging method that allows measuring the graded voltage changes of multiple neurons simultaneously. In neuroscience, this method is used to reveal networks of neurons involved in certain tasks. However, the recorded relative dye fluorescence changes are usually low and signals are superimposed by noise and artifacts. Therefore, establishing a reliable method to identify which cells are activated by specific stimulus conditions is the first step to identify functional networks. In this paper, we present a statistical method to identify stimulus-activated network nodes as cells, whose activities during sensory network stimulation differ significantly from the un-stimulated control condition. This method is demonstrated based on voltage-sensitive dye recordings from up to 100 neurons in a ganglion of the medicinal leech responding to tactile skin stimulation. Without relying on any prior physiological knowledge, the network nodes identified by our statistical analysis were found to match well with published cell types involved in tactile stimulus processing and to be consistent across stimulus conditions and preparations.
Yi, Ming; Stephens, Robert M.
2008-01-01
Analysis of microarray and other high throughput data often involves identification of genes consistently up- or down-regulated across samples as the first step in extraction of biological meaning. This gene-level paradigm can be limited as a result of valid sample fluctuations and biological complexities. In this report, we describe a novel method, SLEPR, which eliminates this limitation by relying on pathway-level consistencies. Our method first selects the sample-level differentiated genes from each individual sample, capturing genes missed by other analysis methods, ascertains the enrichment levels of associated pathways from each of those lists, and then ranks annotated pathways based on the consistency of enrichment levels of individual samples from both sample classes. As a proof of concept, we have used this method to analyze three public microarray datasets, with a direct comparison with the GSEA method, one of the most popular pathway-level analysis methods in the field. We found that our method was able to reproduce the earlier observations with significant improvements in depth of coverage for validated or expected biological themes, but also produced additional insights that make biological sense. This new method extends existing analysis approaches and facilitates integration of different types of HTP data.
Vincent, Christopher James; Blandford, Ann
2017-03-01
We present findings of a UK study into how those involved in purchasing interactive medical devices go about evaluating usability, the challenges that arise, and opportunities for improvement. The study focused on procurement of infusion devices because these are used by various professionals across healthcare. A semi-structured interview study was carried out involving a range of stakeholders (20 in total) involved in or impacted by medical device procurement. Data was analysed using thematic analysis, a qualitative method designed to support the identification, analysis and reporting of patterns. In principle, health service purchasing was found to accommodate consideration of equipment usability. In practice, the evaluation process was driven primarily by engineering standards; assessment of local needs did not accommodate substantive assessment of usability; and choice was limited by the availability of equipment on the marketplace. We discuss ways in which purchasing could be improved through techniques that account for social circumstances.
New spectrophotometric assay for pilocarpine.
El-Masry, S; Soliman, R
1980-07-01
A quick method for the determination of pilocarpine in eye drops in the presence of decomposition products is described. The method involves complexation of the alkaloid with bromocresol purple at pH 6. After treatment with 0.1N NaOH, the liberated dye is measured at 580 nm. The method has a relative standard deviation of 1.99%, and has been successfully applied to the analysis of 2 batches of pilocarpine eye drops. The recommended method was also used to monitor the stability of a pilocarpine nitrate solution in 0.05N NaOH at 65 degrees C. The BPC method failed to detect any significant decomposition after 2 h incubation, but the recommended method revealed 87.5% decomposition.
To BECCS or Not To BECCS: A Question of Method
NASA Astrophysics Data System (ADS)
DeCicco, J. M.
2017-12-01
Bioenergy with carbon capture and storage (BECCS) is seen as an important option in many climate stabilization scenarios. Limited demonstrations are underway, including a system that captures and sequesters the fermentation CO2 from ethanol production. However, its net CO2 emissions are uncertain for reasons related to both system characteristics and methodological issues. As for bioenergy in general, evaluations draw on both ecological and engineering methods. It is informative to apply different methods using available data for demonstration systems in comparison to related bioenergy systems. To do so, this paper examines a case study BECCS system and addresses questions regarding the utilization of terrestrial carbon, biomass sustainability and the implications for scalability. The analysis examines four systems, all utilizing the same land area, using two methods. The cases are: A) a crop system without either biofuel production or CCS; B) a biofuel production system without CCS; C) biofuel system with CCS, i.e., the BECCS case, and D) a crop system without biofuel production or CCS but with crop residue removal and conversion to a stable char. In cases A and D, the delivered fuel is fossil-based; in cases B and C the fuel is biomass-based. The first method is LCA, involving steady-flow modeling of systems over a defined lifecycle, following current practice as seen in the attributional LCA component of California's Low-Carbon Fuel Standard (LCFS). The second method involves spatially and temporally explicit analysis, reflecting the dynamics of carbon exchanges with the atmosphere. Although parameters are calibrated to the California LCFS LCA model, simplified spreadsheet modeling is used to maximize transparency while highlighting assumptions that most influence the results. The analysis reveals distinctly different pictures of net CO2 emissions for the cases examined, with the dynamic method painting a less optimistic picture of the BECCS system than the LCA method. Differences in results are traced to differing representations of terrestrial carbon exchanges and associated modeling assumptions. We conclude with suggestions for future work on project- and program-scale carbon accounting methods and the need for caution in advancing BECCS before such methods are better validated.
The construction of high-accuracy schemes for acoustic equations
NASA Technical Reports Server (NTRS)
Tang, Lei; Baeder, James D.
1995-01-01
An accuracy analysis of various high order schemes is performed from an interpolation point of view. The analysis indicates that classical high order finite difference schemes, which use polynomial interpolation, hold high accuracy only at nodes and are therefore not suitable for time-dependent problems. Some schemes improve their numerical accuracy within grid cells through near-minimax approximation, but their practical significance is degraded because they retain the same stencil as the classical schemes. One-step methods in space discretization, which use piecewise polynomial interpolation and involve data at only two points, can generate uniform accuracy over the whole grid cell and avoid spurious roots. As a result, they are more accurate and efficient than multistep methods. In particular, the Cubic-Interpolated Pseudoparticle (CIP) scheme is recommended for computational acoustics.
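As a hedged illustration of the one-step, two-point idea behind the CIP scheme (this is not the authors' code; the grid size, pulse shape, and time step below are arbitrary choices), the following sketch advects a profile and its nodal derivative with a cubic interpolant built inside each upwind cell:

```python
import numpy as np

def cip_advect(f, g, c, dx, dt):
    """One CIP step for u_t + c u_x = 0 (c > 0), periodic boundaries.
    f holds nodal values, g holds nodal derivatives; both are advected
    using a cubic interpolant constructed inside each upwind cell."""
    xi = -c * dt                              # departure-point offset in (-dx, 0]
    D = -dx
    fup, gup = np.roll(f, 1), np.roll(g, 1)   # upwind neighbors (i - 1)
    a = (g + gup) / D**2 + 2.0 * (f - fup) / D**3
    b = 3.0 * (fup - f) / D**2 - (2.0 * g + gup) / D
    f_new = ((a * xi + b) * xi + g) * xi + f  # cubic evaluated at xi
    g_new = (3.0 * a * xi + 2.0 * b) * xi + g # its derivative at xi
    return f_new, g_new

# advect a Gaussian pulse once around a periodic unit domain
n, c = 200, 1.0
dx = 1.0 / n
x = np.arange(n) * dx
f = np.exp(-300.0 * (x - 0.5) ** 2)
g = -600.0 * (x - 0.5) * f                    # analytic initial derivative
dt = 0.5 * dx / c
for _ in range(int(1.0 / (c * dt))):
    f, g = cip_advect(f, g, c, dx, dt)
print("peak after one revolution:", f.max())  # stays close to 1 for CIP
```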
Linguistic methodology for the analysis of aviation accidents
NASA Technical Reports Server (NTRS)
Goguen, J. A.; Linde, C.
1983-01-01
A linguistic method for the analysis of small-group discourse was developed, and its use on transcripts of commercial air transport accidents is demonstrated. The method identifies the discourse types that occur and determines their linguistic structure; it identifies significant linguistic variables based upon these structures or other linguistic concepts such as speech act and topic; it tests hypotheses that support the significance and reliability of these variables; and it indicates the implications of the validated hypotheses. These implications fall into three categories: (1) to train crews to use more nearly optimal communication patterns; (2) to use linguistic variables as indices for aspects of crew performance such as attention; and (3) to provide guidelines for the design of aviation procedures and equipment, especially those that involve speech.
Thermodynamic free energy methods to investigate shape transitions in bilayer membranes.
Ramakrishnan, N; Tourdot, Richard W; Radhakrishnan, Ravi
2016-06-01
The conformational free energy landscape of a system is a fundamental thermodynamic quantity of particular importance in the study of soft matter and biological systems, in which entropic contributions play a dominant role. While computational methods to delineate the free energy landscape are routinely used to analyze the relative stability of conformational states, to determine phase boundaries, and to compute ligand-receptor binding energies, their use in problems involving the cell membrane is limited. Here, we present an overview of four different free energy methods to study morphological transitions in bilayer membranes, induced either by the action of curvature-remodeling proteins or by the application of external forces. Using a triangulated surface as a model for the cell membrane and using the framework of dynamical triangulation Monte Carlo, we have focused on the methods of Widom insertion, thermodynamic integration, the Bennett acceptance scheme, and umbrella sampling with weighted histogram analysis. We have demonstrated how these methods can be employed in a variety of problems involving the cell membrane. Specifically, we have shown that the chemical potential, computed using Widom insertion, and the relative free energies, computed using thermodynamic integration and the Bennett acceptance method, are excellent measures for studying the transition from curvature-sensing to curvature-inducing behavior of membrane-associated proteins. Umbrella sampling and WHAM analysis have been used to study the thermodynamics of tether formation in cell membranes, and the quantitative predictions of the computational model are in excellent agreement with experimental measurements. Furthermore, we also present a method based on WHAM and thermodynamic integration to handle problems related to the end-point catastrophe that are common in most free energy methods.
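A minimal sketch of the Widom test-particle estimator the authors use for the chemical potential (the insertion-energy sample below is synthetic; a real calculation would draw ΔU from trial insertions into the triangulated-membrane simulation):

```python
import numpy as np

def widom_excess_mu(dU, kT=1.0):
    """Widom test-particle estimate of the excess chemical potential:
    mu_ex = -kT * ln < exp(-dU / kT) >, averaged over ghost insertions."""
    boltz = np.exp(-np.asarray(dU) / kT)
    return -kT * np.log(boltz.mean())

# synthetic insertion energies, standing in for trial protein-field
# placements on a membrane (illustration only)
rng = np.random.default_rng(0)
dU = rng.normal(loc=2.0, scale=1.0, size=100_000)
print("mu_ex ≈", widom_excess_mu(dU))   # analytic value here is 1.5 kT
```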
[Experience feedback committee: a method for patient safety improvement].
François, P; Sellier, E; Imburchia, F; Mallaret, M-R
2013-04-01
An experience feedback committee (CREX, Comité de Retour d'EXpérience) is a method which contributes to the management of safety of care in a medical unit. Originally used in civil aviation safety systems, the method has been adapted to health care facilities and successfully implemented in radiotherapy units and in other specialties. We performed a brief review of the literature for studies reporting data on CREX established in hospitals. The review was performed using the main bibliographic databases and Google search results. The CREX is designed to analyse incidents reported by professionals. The method includes monthly meetings of a multi-professional committee that reviews the reported incidents, chooses a priority incident and designates a "pilot" responsible for investigating the incident. The investigation of the incident involves a systemic analysis method and a written synthesis presented at the next meeting of the committee. The committee agrees on actions for improvement that are suggested by the analysis and follows their implementation. Systems for the management of health care, including reporting systems, are organized into three levels, the medical unit, the hospital and the country, as a triple-loop learning process. The CREX operates at the base level, the short loop of risk management, and allows direct involvement of care professionals in patient safety. Safety of care has become a priority of health systems. In this context, the CREX can be a useful vehicle for the implementation of a safety culture in medical units. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
Zero-mode clad waveguides for performing spectroscopy with confined effective observation volumes
Levene, Michael J.; Korlach, Jonas; Turner, Stephen W.; Craighead, Harold G.; Webb, Watt W.
2005-07-12
The present invention is directed to a method and an apparatus for analysis of an analyte. The method involves providing a zero-mode waveguide which includes a cladding surrounding a core where the cladding is configured to preclude propagation of electromagnetic energy of a frequency less than a cutoff frequency longitudinally through the core of the zero-mode waveguide. The analyte is positioned in the core of the zero-mode waveguide and is then subjected, in the core of the zero-mode waveguide, to activating electromagnetic radiation of a frequency less than the cut-off frequency under conditions effective to permit analysis of the analyte in an effective observation volume which is more compact than if the analysis were carried out in the absence of the zero-mode waveguide.
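To make the cutoff condition concrete, here is a small hedged sketch (the 70 nm diameter and 488 nm wavelength are hypothetical example values, not taken from the patent) using the textbook TE11 cutoff of an ideal circular metallic waveguide:

```python
import math

def circular_guide_cutoff(diameter_nm, n_medium=1.33):
    """Cutoff wavelength (nm) of the TE11 mode of an ideal circular metallic
    waveguide: lambda_c = pi * d * n / x'_11, with x'_11 = 1.8412 the first
    zero of J1'. Real zero-mode waveguides deviate from this ideal."""
    return math.pi * diameter_nm * n_medium / 1.8412

d_nm = 70.0          # hypothetical aperture diameter
lam_nm = 488.0       # hypothetical excitation wavelength
lam_c = circular_guide_cutoff(d_nm)
state = "below cutoff -> evanescent, confined volume" if lam_nm > lam_c \
        else "propagating"
print(f"cutoff wavelength ≈ {lam_c:.0f} nm; {lam_nm} nm light is {state}")
```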
Waveguides for performing spectroscopy with confined effective observation volumes
Levene, Michael J.; Korlach, Jonas; Turner, Stephen W.; Craighead, Harold G.; Webb, Watt W.
2006-03-14
The present invention is directed to a method and an apparatus for analysis of an analyte. The method involves providing a zero-mode waveguide which includes a cladding surrounding a core where the cladding is configured to preclude propagation of electromagnetic energy of a frequency less than a cutoff frequency longitudinally through the core of the zero-mode waveguide. The analyte is positioned in the core of the zero-mode waveguide and is then subjected, in the core of the zero-mode waveguide, to activating electromagnetic radiation of a frequency less than the cut-off frequency under conditions effective to permit analysis of the analyte in an effective observation volume which is more compact than if the analysis were carried out in the absence of the zero-mode waveguide.
Recent developments in imaging system assessment methodology, FROC analysis and the search model.
Chakraborty, Dev P
2011-08-21
A frequent problem in imaging is assessing whether a new imaging system is an improvement over an existing standard. Observer performance methods, in particular the receiver operating characteristic (ROC) paradigm, are widely used in this context. In ROC analysis lesion location information is not used and consequently scoring ambiguities can arise in tasks, such as nodule detection, involving finding localized lesions. This paper reviews progress in the free-response ROC (FROC) paradigm in which the observer marks and rates suspicious regions and the location information is used to determine whether lesions were correctly localized. Reviewed are FROC data analysis, a search-model for simulating FROC data, predictions of the model and a method for estimating the parameters. The search model parameters are physically meaningful quantities that can guide system optimization.
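A minimal sketch of the kind of two-parameter search-model simulation the paper reviews (parameter names and values here are illustrative, not fitted estimates): a Poisson number of noise sites rated from a standard normal, and lesions found with some probability and rated from a shifted normal:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_froc_case(n_lesions, lam=1.0, nu=0.8, mu=2.0):
    """Sketch of a search-model FROC case: Poisson(lam) noise sites rated
    N(0,1) yield non-lesion (NL) marks; each lesion is found with
    probability nu and rated N(mu,1), yielding lesion (LL) marks."""
    nl = rng.normal(0.0, 1.0, rng.poisson(lam))
    found = rng.random(n_lesions) < nu
    ll = rng.normal(mu, 1.0, found.sum())
    return nl, ll

nl, ll = simulate_froc_case(n_lesions=2)
print("NL marks:", np.round(nl, 2), "| LL marks:", np.round(ll, 2))
```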
NASA Astrophysics Data System (ADS)
Syed Mazlan, S. M. S.; Abdullah, S. R.; Shahidan, S.; Noor, S. R. Mohd
2017-11-01
Concrete durability may be affected by many factors, such as chemical attack and weathering action, that reduce the performance and service life of concrete structures. The durability of reinforced concrete (RC) can be greatly improved by using fibre-reinforced polymer (FRP), a composite material commonly used for repairing and strengthening RC structures. This paper reviews and discusses the application of acoustic emission (AE) techniques for real-time monitoring of various mechanical tests on RC strengthened with FRP, involving four-point bending, three-point bending and cyclic loading. Correlations between the AE analyses, namely b-value, sentry and intensity analysis, and damage characterization are also critically reviewed. The review shows that AE monitoring of RC strengthened with FRP using b-value, sentry and intensity analysis is a successful and efficient method for determining damage characterization. However, the application of sentry analysis is still limited compared with b-value and intensity analysis in characterizing damage, especially for RC specimens strengthened with FRP.
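For concreteness, a hedged sketch of the maximum-likelihood AE b-value computation (the amplitude data and 40 dB threshold below are synthetic; the dB/20 magnitude convention is the one commonly used in AE work):

```python
import numpy as np

def ae_b_value(amp_db, threshold_db):
    """Maximum-likelihood (Aki-type) b-value from AE hit amplitudes in dB.
    AE 'magnitude' is conventionally taken as amplitude_dB / 20."""
    m = np.asarray(amp_db) / 20.0
    mc = threshold_db / 20.0
    return np.log10(np.e) / (m[m >= mc].mean() - mc)

# synthetic amplitudes standing in for hits from an FRP-strengthened beam test
rng = np.random.default_rng(2)
amps = rng.exponential(scale=8.0, size=5000) + 40.0  # dB, above 40 dB threshold
print("b-value ≈", round(ae_b_value(amps, 40.0), 2))
# a falling b-value over a loading test is commonly read as macro-cracking
```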
Knowledge Discovery from Posts in Online Health Communities Using Unified Medical Language System.
Chen, Donghua; Zhang, Runtong; Liu, Kecheng; Hou, Lei
2018-06-19
Patient-reported posts in Online Health Communities (OHCs) contain a wealth of valuable information that can help establish knowledge-based online support for online patients. However, utilizing these reports to improve online patient services in the absence of appropriate medical and healthcare expert knowledge is difficult. Thus, we propose a comprehensive knowledge discovery method based on the Unified Medical Language System for the analysis of narrative posts in OHCs. First, we propose a domain-knowledge support framework for OHCs to provide a basis for post analysis. Second, we develop a Knowledge-Involved Topic Modeling (KI-TM) method to extract and expand explicit knowledge within the text. We propose four metrics, namely explicit knowledge rate, latent knowledge rate, knowledge correlation rate, and perplexity, for the evaluation of the KI-TM method. Our experimental results indicate that our proposed method outperforms existing methods in terms of providing knowledge support. Our method enhances knowledge support for online patients and can help develop intelligent OHCs in the future.
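The abstract does not formally define its four metrics; perplexity, at least, has a standard form, sketched below on made-up token probabilities:

```python
import numpy as np

def perplexity(log_probs_per_token):
    """Held-out perplexity: exp of the negative mean per-token log-likelihood.
    Lower perplexity means the topic model explains unseen posts better."""
    lp = np.asarray(log_probs_per_token)
    return float(np.exp(-lp.mean()))

# hypothetical per-token log p(w | model) for a held-out OHC post
log_p = np.log([0.02, 0.013, 0.045, 0.008, 0.031])
print("perplexity ≈", round(perplexity(log_p), 1))
```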
[Research progress on mechanical performance evaluation of artificial intervertebral disc].
Li, Rui; Wang, Song; Liao, Zhenhua; Liu, Weiqiang
2018-03-01
The mechanical properties of an artificial intervertebral disc (AID) are related to the long-term reliability of the prosthesis. Three testing methods, based on different tools, are involved in the mechanical performance evaluation of AID: testing with a mechanical simulator, in vitro specimen testing, and finite element analysis. In this study, the testing standards, testing equipment and materials of AID are first introduced. The study then focuses on the present status of AID static mechanical property tests (static axial compression, static axial compression-shear), dynamic mechanical property tests (dynamic axial compression, dynamic axial compression-shear), creep and stress relaxation tests, device push-out tests, core push-out tests, subsidence tests, etc. The experimental techniques of the in vitro specimen testing method and the testing results of available artificial discs are summarized, as are the experimental methods and research status of finite element analysis. Finally, research trends in AID mechanical performance evaluation are forecast: the simulator, load, dynamic cycle, motion mode, specimen and test standard will be important research fields in the future.
Christopher M. Oswalt; Ted R. Ridley
2015-01-01
Benjamin Franklin once said "Tell me and I forget. Teach me and I remember. Involve me and I learn." It is with that in mind that the Southern Research Station (SRS) Forest Inventory and Analysis (FIA) Program jumps feet first into exploring alternative methods of communicating the knowledge that is discovered through broad-scale data collection of the forest resources...
Process modelling for space station experiments
NASA Technical Reports Server (NTRS)
Rosenberger, Franz; Alexander, J. Iwan D.
1988-01-01
The work performed during the first year (1 Oct. 1987 to 30 Sept. 1988) involved analyses of crystal growth from the melt and from solution. The particular melt growth technique under investigation is directional solidification by the Bridgman-Stockbarger method. Two types of solution growth systems are also being studied: one involves growth from solution in a closed container, the other concerns growth of protein crystals by the hanging-drop method. Following discussions with Dr. R. J. Naumann of the Low Gravity Science Division at MSFC, it was decided to tackle the analysis of crystal growth from the melt earlier than originally proposed. Rapid progress was made in this area: work is on schedule, and full calculations have been underway for some time. Progress was also made in the formulation of the two solution growth models.
Modeling Multibody Stage Separation Dynamics Using Constraint Force Equation Methodology
NASA Technical Reports Server (NTRS)
Tartabini, Paul V.; Roithmayr, Carlos M.; Toniolo, Matthew D.; Karlgaard, Christopher D.; Pamadi, Bandu N.
2011-01-01
This paper discusses the application of the constraint force equation methodology and its implementation for multibody separation problems using three specially designed test cases. The first test case involves two rigid bodies connected by a fixed joint, the second case involves two rigid bodies connected with a universal joint, and the third test case is that of Mach 7 separation of the X-43A vehicle. For the first two cases, the solutions obtained using the constraint force equation method compare well with those obtained using industry-standard benchmark codes. For the X-43A case, the constraint force equation solutions show reasonable agreement with the flight-test data. Use of the constraint force equation method facilitates the analysis of stage separation in end-to-end simulations of launch vehicle trajectories.
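The abstract does not give the constraint force equation formulation in detail; as a generic, hedged sketch of the same idea, the standard Lagrange-multiplier construction below computes joint constraint forces alongside accelerations (the masses, states, and rigid-link "joint" are chosen purely for illustration):

```python
import numpy as np

def constrained_accel(M, F, J, Jdot_qdot):
    """Solve the augmented system  [M  J^T; J  0][qdd; lam] = [F; -Jdot_qdot],
    i.e. M qdd = F - J^T lam with the joint constraint J qdd + Jdot qdot = 0;
    -J^T lam is the force the joint exerts to keep the bodies compatible."""
    n, m = M.shape[0], J.shape[0]
    A = np.block([[M, J.T], [J, np.zeros((m, m))]])
    sol = np.linalg.solve(A, np.concatenate([F, -Jdot_qdot]))
    return sol[:n], sol[n:]

# Two unit point masses joined by a rigid link (a 1-DOF "joint"), spinning
# about their midpoint; constraint phi = |r1 - r2|^2 - L^2 = 0.
r1, r2 = np.array([0.0, 0.0]), np.array([1.0, 0.0])
v1, v2 = np.array([0.0, 1.0]), np.array([0.0, -1.0])
d, dv = r1 - r2, v1 - v2
J = np.hstack([2 * d, -2 * d])[None, :]   # d(phi)/dq with q = (r1, r2)
Jdot_qdot = np.array([2 * dv @ dv])       # from differentiating phi twice
M, F = np.eye(4), np.zeros(4)             # unit masses, no external forces
qdd, lam = constrained_accel(M, F, J, Jdot_qdot)
print("accelerations:", qdd)              # centripetal, toward the midpoint
print("joint force:", (-J.T @ lam).ravel())
```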
Surface photovoltage measurements and finite element modeling of SAW devices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donnelly, Christine
2012-03-01
Over the course of a Summer 2011 internship with the MEMS department of Sandia National Laboratories, work was completed on two major projects. The first and main project of the summer involved taking surface photovoltage measurements for silicon samples, and using these measurements to determine surface recombination velocities and minority carrier diffusion lengths of the materials. The SPV method was used to fill gaps in the knowledge of material parameters that had not been determined successfully by other characterization methods. The second project involved creating a 2D finite element model of a surface acoustic wave device. A basic form of the model with the expected impedance response curve was completed, and the model is ready to be further developed for analysis of MEMS photonic resonator devices.
Pharmacists' perspectives on monitoring adherence to treatment in Cystic Fibrosis.
Mooney, Karen; Ryan, Cristín; Downey, Damian G
2016-04-01
Cystic Fibrosis (CF) management requires complex treatment regimens, but adherence to treatment is poor and has negative health implications. There are various methods of measuring adherence, but little is known regarding the extent of adherence measurement in CF centres throughout the UK and Ireland. This study aimed to determine the adherence monitoring practices in CF centres throughout the UK and Ireland, and to establish CF pharmacists' views on these practices. A questionnaire was designed, piloted and distributed to pharmacists attending the UK and Ireland Cystic Fibrosis Pharmacists' Group's annual meeting (2014). The main outcome measures were the methods of inhaled/nebulised antibiotic supply and the methods used to measure treatment adherence in CF centres. The questionnaire also ascertained the demographic information of participating pharmacists. Closed question responses were analysed using descriptive statistics; open questions were analysed using content analysis. Twenty-one respondents (84% response rate) were included in the analysis, mostly from English centres (66.7%). Detailed records of patients receiving their inhaled/nebulised antibiotics were lacking. Adherence was most commonly described as being measured at 'every clinic visit' (28.6%) and 'occasionally' (28.6%). Patient self-reported adherence was the most commonly used method of measuring adherence in practice (90.5%). The availability of electronic adherence monitoring in CF centres did not guarantee its use. Pharmacists attributed an equal professional responsibility for adherence monitoring in CF to consultants, nurses and pharmacists. Seventy-six percent of pharmacists felt that the current adherence monitoring practices within their own unit were inadequate, which they associated with the absence of sufficient specialist CF pharmacist involvement. Many suggested that greater specialist pharmacist involvement could facilitate improved adherence monitoring. Current adherence knowledge is largely based on self-report. Further work is required to establish the most appropriate method of adherence monitoring in CF centres, to improve the recording of adherence, and to understand the impact of increased specialist pharmacist involvement on that adherence.
Guthrie, Elspeth A; McMeekin, Aaron T; Khan, Sylvia; Makin, Sally; Shaw, Ben; Longson, Damien
2017-06-01
Aims and method: This article presents a 12-month case series to determine the fraction of ward referrals of adults of working age who needed a liaison psychiatrist in a busy tertiary referral teaching hospital. Results: The service received 344 referrals resulting in 1259 face-to-face contacts. Depression accounted for the most face-to-face contacts. We deemed the involvement of a liaison psychiatrist necessary in 241 (70.1%) referrals, with medication management as the most common reason. Clinical implications: A substantial amount of liaison ward work involves the treatment and management of severe and complex mental health problems. Our analysis suggests that in the majority of cases the input of a liaison psychiatrist is required.
NASA Technical Reports Server (NTRS)
Maule, J.; Wainwright, N.; Steele, A.; Gunter, D.; Flores, G.; Effinger, M.; Danibm, N.; Wells, M.; Williams, S.; Morris, H.
2008-01-01
Microorganisms within the space stations Salyut, Mir and the International Space Station (ISS) have traditionally been monitored with culture-based techniques. These techniques involve growing environmental samples (cabin water, air or surfaces) on agar-type media for several days, followed by visualization of the resulting colonies, and return of samples to Earth for ground-based analysis. This approach has provided a wealth of useful data and enhanced our understanding of the microbial ecology within space stations. However, the approach is also limited by the following: i) more than 95% of microorganisms in the environment cannot grow on conventional growth media; ii) significant time lags occur between onboard sampling and colony visualization (3-5 days) and ground-based analysis (as long as several months); iii) colonies are often difficult to visualize due to condensation within contact slide media plates; and iv) the techniques involve growth of potentially harmful microorganisms, which must then be disposed of safely. This report describes the operation of a new culture-independent technique onboard the ISS for rapid analysis (within minutes) of endotoxin and β-1,3-glucan, found in the cell walls of gram-negative bacteria and fungi, respectively. This technique involves analysis of environmental samples with the Limulus Amebocyte Lysate (LAL) assay in a handheld device. This handheld device and sampling system is known as the Lab-On-a-Chip Application Development Portable Test System (LOCAD-PTS). A poster will be presented that describes a comparative study between LOCAD-PTS analysis and existing culture-based methods onboard the ISS, together with an exploratory survey of surface endotoxin throughout the ISS. It is concluded that, while a general correlation between LOCAD-PTS and traditional culture-based methods should not necessarily be expected, a combinatorial approach can be adopted where both sets of data are used together to generate a more complete picture of the microbial ecology on the ISS.
A Fourier method for the analysis of exponential decay curves.
Provencher, S W
1976-01-01
A method based on the Fourier convolution theorem is developed for the analysis of data composed of random noise, plus an unknown constant "base line," plus a sum of (or an integral over a continuous spectrum of) exponential decay functions. The Fourier method's usual serious practical limitation of needing high accuracy data over a very wide range is eliminated by the introduction of convergence parameters and a Gaussian taper window. A computer program is described for the analysis of discrete spectra, where the data involves only a sum of exponentials. The program is completely automatic in that the only necessary inputs are the raw data (not necessarily in equal intervals of time); no potentially biased initial guesses concerning either the number or the values of the components are needed. The outputs include the number of components, the amplitudes and time constants together with their estimated errors, and a spectral plot of the solution. The limiting resolving power of the method is studied by analyzing a wide range of simulated two-, three-, and four-component data. The results seem to indicate that the method is applicable over a considerably wider range of conditions than nonlinear least squares or the method of moments.
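The paper's algorithm is more elaborate (convergence parameters, automatic error estimates), but the core Fourier-convolution idea can be sketched as follows, assuming noise-free data on a uniform log-time grid; the rates, grid, and taper setting below are illustrative choices:

```python
import numpy as np

def decay_spectrum(t, y, n_pad=4096, taper=1.0):
    """Sketch of Fourier analysis of y(t) = sum_j a_j exp(-lam_j t): with
    x = ln(t), the function t*y(x) is the rate spectrum convolved with the
    fixed kernel K(u) = exp(u - exp(u)), so dividing FFTs, with a Gaussian
    taper window to suppress noise amplification, deconvolves the spectrum."""
    x = np.log(t)                       # assumes a uniform grid in ln(t)
    dx = x[1] - x[0]
    H = np.fft.rfft(t * y, n_pad)
    G = np.fft.rfft(np.exp(x - np.exp(x)), n_pad)
    w = np.fft.rfftfreq(n_pad, dx)
    spec = np.fft.irfft(H / G * np.exp(-(w / taper) ** 2), n_pad)[:len(x)]
    s = np.arange(len(x)) * dx          # peaks appear near s_j = -ln(lam_j)
    return s, spec

# noise-free demo: two components with decay rates 0.5 and 0.05
t = np.exp(np.linspace(-6.0, 6.0, 512))
y = 1.0 * np.exp(-0.5 * t) + 0.5 * np.exp(-0.05 * t)
s, spec = decay_spectrum(t, y)
print("dominant rate ≈", round(float(np.exp(-s[np.argmax(spec)])), 3))
```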
Sensory Isolation in Flotation Tanks: Altered States of Consciousness and Effects on Well-Being
ERIC Educational Resources Information Center
Kjellgren, Anette; Lyden, Francisca; Norlander, Torsten
2008-01-01
A qualitative analysis (The Empirical Phenomenological Psychological method) of interviews involving eight patients (depression, burn-out syndrome, and chronic pain) was carried out in order to obtain knowledge regarding the effects of flotation tank therapy. This knowledge might be helpful for both professionals and potential floaters. The…