Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method
NASA Technical Reports Server (NTRS)
Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.
2016-01-01
Reliable noise prediction capabilities are essential to enable novel fuel-efficient open rotor designs that can meet community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity such that they are frequently employed for specific real-world applications within NASA. This paper demonstrates that our higher-order immersed boundary method enables aeroacoustic analysis of wake-dominated flow fields generated by highly complex geometries. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the peculiarities of applying the immersed boundary method to this moving boundary problem, we provide a detailed aeroacoustic analysis of the noise generation mechanisms encountered in the open rotor flow. The simulation data are compared to available experimental data and to other computational results employing more conventional CFD methods. The noise generation mechanisms are analyzed employing spectral analysis, proper orthogonal decomposition, and the causality method.
Employing Conjoint Analysis in Making Compensation Decisions.
ERIC Educational Resources Information Center
Kienast, Philip; And Others
1983-01-01
Describes a method employing conjoint analysis that generates utility/cost ratios for various elements of the compensation package. Its superiority to simple preference surveys is examined. Results of a study of the use of this method in fringe benefit planning in a large financial institution are reported. (Author/JAC)
Azevedo de Brito, Wanessa; Gomes Dantas, Monique; Andrade Nogueira, Fernando Henrique; Ferreira da Silva-Júnior, Edeildo; Xavier de Araújo-Júnior, João; Aquino, Thiago Mendonça de; Adélia Nogueira Ribeiro, Êurica; da Silva Solon, Lilian Grace; Soares Aragão, Cícero Flávio; Barreto Gomes, Ana Paula
2017-08-30
Guanylhydrazones are molecules with great pharmacological potential in various therapeutic areas, including antitumoral activity. Factorial design is an excellent tool in the optimization of a chromatographic method, because it makes it possible to quickly change factors such as temperature, mobile phase composition, mobile phase pH, and column length, among others, to establish the optimal conditions of analysis. The aim of the present work was to develop and validate HPLC and UHPLC methods for the simultaneous determination of guanylhydrazones with anticancer activity employing experimental design. Precise, accurate, linear, and robust HPLC and UHPLC methods were developed and validated for the simultaneous quantification of the guanylhydrazones LQM10, LQM14, and LQM17. The UHPLC method was more economical, consuming four times less solvent and using a 20 times smaller injection volume, which allowed better column performance. Comparing the empirical approach employed in the HPLC method development to the DoE approach employed in the UHPLC method development, we conclude that the factorial design made the method development faster, more practical, and more rational. This resulted in methods that can be employed in the analysis, evaluation, and quality control of these new synthetic guanylhydrazones.
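A full factorial design of the kind used in DoE-driven method development simply enumerates every combination of factor levels. A minimal sketch with hypothetical chromatographic factors and levels (not the study's actual design):

```python
from itertools import product

# Hypothetical two-level factors for a chromatographic method screen;
# the factors and levels actually used in the study are not specified here.
factors = {
    "temperature_C": [25, 35],
    "organic_fraction_pct": [30, 40],
    "mobile_phase_pH": [3.0, 4.5],
}

def full_factorial(factors):
    """Enumerate every combination of factor levels (a 2^3 design here)."""
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

runs = full_factorial(factors)
print(len(runs))  # 8 runs for a two-level, three-factor design
```

Each run dictionary then maps directly onto one experiment, and responses (retention, resolution, peak symmetry) can be fitted against the coded factor levels.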
Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method
NASA Technical Reports Server (NTRS)
Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.
2016-01-01
Reliable noise prediction capabilities are essential to enable novel fuel-efficient open rotor designs that can meet community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity where increasingly complex flow problems can be tackled with this approach. This paper demonstrates that our higher-order immersed boundary method enables aeroacoustic analysis of wake-dominated flow fields generated by a contra-rotating open rotor. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing how to apply the immersed boundary method to this moving boundary problem, we provide a detailed validation of the aeroacoustic analysis approach employing the Launch Ascent and Vehicle Aerodynamics (LAVA) solver. Two free-stream Mach numbers, M=0.2 and M=0.78, are considered, based on the nominal take-off and cruise flow conditions. The simulation data are compared to available experimental data and to other computational results employing more conventional CFD methods. Spectral analysis is used to determine the dominant wave propagation pattern in the acoustic near-field.
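The spectral analysis mentioned above reduces, at its core, to locating dominant tones in the discrete Fourier transform of a pressure time series. A stdlib-only sketch, with a synthetic single-tone signal standing in for real blade-passing-frequency data (illustrative assumption only):

```python
import cmath
import math

def dft_magnitudes(signal):
    """Naive O(N^2) DFT; returns the magnitude at each frequency bin k."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) for k in range(n)]

# Synthetic pressure trace: a single tone at 5 cycles per record,
# standing in for a blade-passing-frequency signal.
n = 64
signal = [math.sin(2 * math.pi * 5 * t / n) for t in range(n)]

mags = dft_magnitudes(signal)
dominant_bin = max(range(1, n // 2), key=lambda k: mags[k])
print(dominant_bin)  # 5
```

In practice an FFT over long probe-signal records would be used, but the peak-picking logic is the same.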
On the use of attachment modes in substructure coupling for dynamic analysis
NASA Technical Reports Server (NTRS)
Craig, R. R., Jr.; Chang, C.-J.
1977-01-01
Substructure coupling or component-mode synthesis may be employed in the solution of dynamics problems for complex structures. Although numerous substructure-coupling methods have been devised, little attention has been devoted to methods employing attachment modes. In the present paper the various mode sets (normal modes, constraint modes, attachment modes) are defined. A generalized substructure-coupling procedure is described. Those substructure-coupling methods which employ attachment modes are described in detail. One of these methods is shown to lead to results (e.g., system natural frequencies) comparable to or better than those obtained by the Hurty (1965) method.
Bismuth-based electrochemical stripping analysis
Wang, Joseph
2004-01-27
Method and apparatus for trace metal detection and analysis using bismuth-coated electrodes and electrochemical stripping analysis. Both anodic stripping voltammetry and adsorptive stripping analysis may be employed.
The phase interrogation method for optical fiber sensor by analyzing the fork interference pattern
NASA Astrophysics Data System (ADS)
Lv, Riqing; Qiu, Liqiang; Hu, Haifeng; Meng, Lu; Zhang, Yong
2018-02-01
A phase interrogation method for optical fiber sensors is proposed based on the fork interference pattern between an orbital angular momentum beam and a plane wave. The variation of the interference pattern with the phase difference between the two light beams is investigated to realize the phase interrogation. By employing the principal component analysis method, the features of the interference pattern can be extracted. Moreover, an experimental system is designed to verify the theoretical analysis, as well as the feasibility of phase interrogation. In this work, a Mach-Zehnder interferometer was employed to convert the strain applied on the sensing fiber to the phase difference between the reference and measuring paths. This interrogation method is also applicable to the measurement of other physical parameters that can produce a phase delay in optical fiber. The performance of the system can be further improved by employing highly sensitive materials and fiber structures.
REPRESENTATIVE SAMPLING AND ANALYSIS OF HETEROGENEOUS SOILS
Standard sampling and analysis methods for hazardous substances in contaminated soils currently are available and routinely employed. Standard methods inherently assume a homogeneous soil matrix and contaminant distribution; therefore only small sample quantities typically are p...
Generalized Structured Component Analysis
ERIC Educational Resources Information Center
Hwang, Heungsun; Takane, Yoshio
2004-01-01
We propose an alternative method to partial least squares for path analysis with components, called generalized structured component analysis. The proposed method replaces factors by exact linear combinations of observed variables. It employs a well-defined least squares criterion to estimate model parameters. As a result, the proposed method…
NASA Astrophysics Data System (ADS)
Gray, Bonnie L.
2012-04-01
Microfluidics is revolutionizing laboratory methods and biomedical devices, offering new capabilities and instrumentation in multiple areas such as DNA analysis, proteomics, enzymatic analysis, single cell analysis, immunology, point-of-care medicine, personalized medicine, drug delivery, and environmental toxin and pathogen detection. For many applications (e.g., wearable and implantable health monitors, drug delivery devices, and prosthetics), mechanically flexible polymer devices and systems that can conform to the body offer benefits that cannot be achieved using systems based on conventional rigid substrate materials. However, difficulties in implementing active devices and reliable packaging technologies have limited the success of flexible microfluidics. Using highly compliant materials such as PDMS, which are typically employed for prototyping, we review mechanically flexible polymer microfluidic technologies based on free-standing polymer substrates and novel electronic and microfluidic interconnection schemes. Central to these new technologies are hybrid microfabrication methods employing novel nanocomposite polymer materials and devices. We review microfabrication methods using these materials, along with demonstrations of example devices and packaging schemes that employ them. We place these recent developments in the context of the fields of flexible microfluidics and conformable systems, and discuss cross-over applications to conventional rigid-substrate microfluidics.
EPA Method 537 was developed for the analysis of perfluoroalkyl acids (PFAAs) in drinking water to address the occurrence monitoring needs under EPA’s Unregulated Contaminant Monitoring Regulation (UCMR). The method employs solid-phase extraction with analysis by liquid chr...
FDDO and DSMC analyses of rarefied gas flow through 2D nozzles
NASA Technical Reports Server (NTRS)
Chung, Chan-Hong; De Witt, Kenneth J.; Jeng, Duen-Ren; Penko, Paul F.
1992-01-01
Two different approaches, the finite-difference method coupled with the discrete-ordinate method (FDDO), and the direct-simulation Monte Carlo (DSMC) method, are used in the analysis of the flow of a rarefied gas expanding through a two-dimensional nozzle and into a surrounding low-density environment. In the FDDO analysis, by employing the discrete-ordinate method, the Boltzmann equation simplified by a model collision integral is transformed to a set of partial differential equations which are continuous in physical space but are point functions in molecular velocity space. The set of partial differential equations are solved by means of a finite-difference approximation. In the DSMC analysis, the variable hard sphere model is used as a molecular model and the no time counter method is employed as a collision sampling technique. The results of both the FDDO and the DSMC methods show good agreement. The FDDO method requires less computational effort than the DSMC method by factors of 10 to 40 in CPU time, depending on the degree of rarefaction.
Finite element modeling and analysis of reinforced-concrete bridge.
DOT National Transportation Integrated Search
2000-09-01
Despite its long history, the finite element method continues to be the predominant strategy employed by engineers to conduct structural analysis. A reliable method is needed for analyzing structures made of reinforced concrete, a complex but common ...
Asif, Muhammad Khan; Nambiar, Phrabhakaran; Mani, Shani Ann; Ibrahim, Norliza Binti; Khan, Iqra Muhammad; Sukumaran, Prema
2018-02-01
The methods of dental age estimation and identification of unknown deceased individuals are evolving with the introduction of advanced innovative imaging technologies in forensic investigations. However, assessing small structures like root canal volumes can be challenging in spite of using highly advanced technology. The aim of the study was to investigate which of two methods of volumetric analysis of maxillary central incisors displayed the higher strength of correlation between chronological age and pulp/tooth volume ratio for Malaysian adults. Volumetric analysis of the pulp cavity/tooth ratio was employed in Method 1, and the pulp chamber/crown ratio (up to the cemento-enamel junction) was analysed in Method 2. The images were acquired employing CBCT scans and enhanced by manipulating them with the Mimics software. These scans belonged to 56 males and 54 females, and their ages ranged from 16 to 65 years. Pearson correlation and regression analysis indicated that both methods used for volumetric measurements had a strong correlation between chronological age and pulp/tooth volume ratio. However, Method 2 gave a higher coefficient of determination (R2 = 0.78) than Method 1 (R2 = 0.64). Moreover, manipulation in Method 2 was less time-consuming and revealed higher inter-examiner reliability (0.982), as no manual intervention during the 'multiple slice editing phase' of the software was required. In conclusion, this study showed that volumetric analysis of the pulp cavity/tooth ratio is a valuable gender-independent technique and that the Method 2 regression equation should be recommended for dental age estimation. Copyright © 2018 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
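The R2 values reported above come from simple least-squares regression of age on a volume ratio. A minimal sketch of the computation (the age/ratio pairs below are invented for illustration, not the study's data):

```python
def r_squared(xs, ys):
    """Coefficient of determination for a simple least-squares line y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

# Hypothetical pulp/tooth volume ratios vs. chronological age (illustrative only).
ratios = [0.11, 0.09, 0.08, 0.06, 0.05]
ages = [20, 30, 40, 50, 60]
print(r_squared(ratios, ages))
```

The study's regression equations would be obtained the same way, with age as the response and the volume ratio as the predictor.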
Applied Graph-Mining Algorithms to Study Biomolecular Interaction Networks
2014-01-01
Protein-protein interaction (PPI) networks carry vital information on the organization of molecular interactions in cellular systems. The identification of functionally relevant modules in PPI networks is one of the most important applications of biological network analysis. Computational analysis is becoming an indispensable tool to understand large-scale biomolecular interaction networks. Several types of computational methods have been developed and employed for the analysis of PPI networks. Of these computational methods, graph comparison and module detection are the two most commonly used strategies. This review summarizes current literature on graph kernel and graph alignment methods for graph comparison strategies, as well as module detection approaches including seed-and-extend, hierarchical clustering, optimization-based, probabilistic, and frequent subgraph methods. Herein, we provide a comprehensive review of the major algorithms employed under each theme, including our recently published frequent subgraph method, for detecting functional modules commonly shared across multiple cancer PPI networks. PMID:24800226
Changing the Latitudes and Attitudes about Content Analysis Research
ERIC Educational Resources Information Center
Brank, Eve M.; Fox, Kathleen A.; Youstin, Tasha J.; Boeppler, Lee C.
2008-01-01
The current research employs the use of content analysis to teach research methods concepts among students enrolled in an upper division research methods course. Students coded and analyzed Jimmy Buffett song lyrics rather than using a downloadable database or collecting survey data. Students' knowledge of content analysis concepts increased after…
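Content-analysis coding of lyrics, as described above, amounts to counting keyword hits per coding category. A toy sketch with a hypothetical two-category codebook (the course's actual coding scheme is not specified in the abstract):

```python
from collections import Counter
import re

def code_lyrics(text, codebook):
    """Count occurrences of each coding category's keywords in a text.
    `codebook` maps category name -> set of lowercase keywords."""
    words = Counter(re.findall(r"[a-z']+", text.lower()))
    return {cat: sum(words[w] for w in kws) for cat, kws in codebook.items()}

# Hypothetical mini-codebook for illustration only.
codebook = {"place": {"island", "beach", "harbor"}, "leisure": {"boat", "song"}}
sample = "Down on the island, by the beach, a song from a boat near the beach."
print(code_lyrics(sample, codebook))  # {'place': 3, 'leisure': 2}
```

Real content analysis adds a coding manual and inter-rater reliability checks, but the tallying step looks like this.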
Spectroscopic chemical analysis methods and apparatus
NASA Technical Reports Server (NTRS)
Hug, William F. (Inventor); Reid, Ray D. (Inventor)
2009-01-01
Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.
Spectroscopic chemical analysis methods and apparatus
NASA Technical Reports Server (NTRS)
Reid, Ray D. (Inventor); Hug, William F. (Inventor)
2010-01-01
Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.
Microfluidic systems and methods of transport and lysis of cells and analysis of cell lysate
Culbertson, Christopher T.; Jacobson, Stephen C.; McClain, Maxine A.; Ramsey, J. Michael
2004-08-31
Microfluidic systems and methods are disclosed which are adapted to transport and lyse cellular components of a test sample for analysis. The disclosed microfluidic systems and methods, which employ an electric field to rupture the cell membrane, cause unusually rapid lysis, thereby minimizing continued cellular activity and resulting in greater accuracy of analysis of cell processes.
Microfluidic systems and methods for transport and lysis of cells and analysis of cell lysate
Culbertson, Christopher T [Oak Ridge, TN; Jacobson, Stephen C [Knoxville, TN; McClain, Maxine A [Knoxville, TN; Ramsey, J Michael [Knoxville, TN
2008-09-02
Microfluidic systems and methods are disclosed which are adapted to transport and lyse cellular components of a test sample for analysis. The disclosed microfluidic systems and methods, which employ an electric field to rupture the cell membrane, cause unusually rapid lysis, thereby minimizing continued cellular activity and resulting in greater accuracy of analysis of cell processes.
ERIC Educational Resources Information Center
Koenig, Lane; Fields, Errol L.; Dall, Timothy M.; Ameen, Ansari Z.; Harwood, Henrick J.
This report demonstrates three applications of case-mix methods using regression analysis. The results are used to assess the relative effectiveness of substance abuse treatment providers. The report also examines the ability of providers to improve client employment outcomes, an outcome domain relatively unexamined in the assessment of provider…
Slotnick, Scott D
2017-07-01
Analysis of functional magnetic resonance imaging (fMRI) data typically involves over one hundred thousand independent statistical tests; therefore, it is necessary to correct for multiple comparisons to control familywise error. In a recent paper, Eklund, Nichols, and Knutsson used resting-state fMRI data to evaluate commonly employed methods to correct for multiple comparisons and reported unacceptable rates of familywise error. Eklund et al.'s analysis was based on the assumption that resting-state fMRI data reflect null data; however, their 'null data' actually reflected default network activity that inflated familywise error. As such, Eklund et al.'s results provide no basis to question the validity of the thousands of published fMRI studies that have corrected for multiple comparisons or the commonly employed methods to correct for multiple comparisons.
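Familywise-error control can be illustrated with the simplest correction, Bonferroni; this is a generic sketch only, not the cluster-based corrections Eklund et al. evaluated:

```python
def bonferroni(p_values, alpha=0.05):
    """Return indices of tests still significant after Bonferroni correction,
    which controls familywise error at level alpha across m tests."""
    m = len(p_values)
    return [i for i, p in enumerate(p_values) if p <= alpha / m]

# Four hypothetical voxel-level p-values; threshold becomes 0.05 / 4 = 0.0125.
pvals = [0.0001, 0.01, 0.03, 0.2]
print(bonferroni(pvals))  # [0, 1]
```

With the ~10^5 tests of a whole-brain analysis the per-test threshold becomes extremely strict, which is why spatially informed corrections are commonly employed instead.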
Regional frequency analysis of extreme rainfalls using partial L moments method
NASA Astrophysics Data System (ADS)
Zakaria, Zahrahtul Amani; Shabri, Ani
2013-07-01
An approach based on regional frequency analysis using L moments and LH moments is revisited in this study. Subsequently, an alternative regional frequency analysis using the partial L moments (PL moments) method is employed, and a new relationship for homogeneity analysis is developed. The results were then compared with those obtained using the methods of L moments and LH moments of order two. The Selangor catchment, consisting of 37 sites and located on the west coast of Peninsular Malaysia, is chosen as a case study. PL moments for the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto distributions were derived and used to develop the regional frequency analysis procedure. A PL moment ratio diagram and the Z test were employed in determining the best-fit distribution. Comparison between the three approaches showed that the GLO and GEV distributions were identified as suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation used for performance evaluation shows that the method of PL moments outperforms the L and LH moments methods for the estimation of large-return-period events.
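L-moment methods build on sample probability-weighted moments. A minimal sketch of the first three sample L-moments using the standard unbiased estimators (the rainfall values shown are hypothetical, not the Selangor data):

```python
def sample_l_moments(data):
    """First three sample L-moments via probability-weighted moments
    b0, b1, b2 (standard unbiased estimators, 0-based ranks)."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i * x[i] for i in range(n)) / (n * (n - 1))
    b2 = sum(i * (i - 1) * x[i] for i in range(n)) / (n * (n - 1) * (n - 2))
    l1 = b0                      # L-location (mean)
    l2 = 2 * b1 - b0             # L-scale
    l3 = 6 * b2 - 6 * b1 + b0    # third L-moment; t3 = l3/l2 is L-skewness
    return l1, l2, l3

# Hypothetical annual rainfall maxima (mm), for illustration only.
rain_maxima = [45.2, 60.1, 38.7, 72.4, 55.0, 49.3, 66.8]
l1, l2, l3 = sample_l_moments(rain_maxima)
print(l1, l2, l3 / l2)  # mean, L-scale, L-skewness
```

The L-moment ratios (t3, t4) computed this way are what populate a moment ratio diagram when selecting among GEV, GLO, and generalized Pareto candidates; PL moments modify these estimators by censoring small observations.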
Modified multidimensional scaling approach to analyze financial markets.
Yin, Yi; Shang, Pengjian
2014-06-01
Detrended cross-correlation coefficient (σDCCA) and dynamic time warping (DTW) are introduced as dissimilarity measures, while multidimensional scaling (MDS) is employed to translate the dissimilarities between daily price returns of 24 stock markets. We first propose MDS based on σDCCA dissimilarity and MDS based on DTW dissimilarity, while MDS based on Euclidean dissimilarity is also employed to provide a reference for comparison. We apply these methods in order to further visualize the clustering between stock markets. Moreover, we compare MDS with an alternative visualization method, the "Unweighted Average" clustering method, with both applied to the same dissimilarity. Through the results, we find that MDS gives a more intuitive mapping for observing stable or emerging clusters of stock markets with similar behavior, and that the MDS analysis based on σDCCA dissimilarity provides clearer, more detailed, and more accurate information on the classification of the stock markets than the MDS analysis based on Euclidean dissimilarity. The MDS analysis based on DTW dissimilarity reveals particularly interesting detail about the correlations between stock markets; it reflects richer results on the clustering of stock markets than the MDS analysis based on Euclidean dissimilarity. In addition, the graphs originating from applying MDS methods based on σDCCA dissimilarity and DTW dissimilarity may also guide the construction of multivariate econometric models.
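DTW, one of the two dissimilarity measures above, can be sketched in a few lines of plain Python; the return series here are hypothetical, not the 24 markets studied:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two sequences,
    using absolute difference as the local cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# Hypothetical daily returns for two markets (illustrative only).
r1 = [0.01, -0.02, 0.005, 0.03]
r2 = [0.012, -0.018, 0.004, 0.028]
print(dtw_distance(r1, r2))
print(dtw_distance(r1, r1))  # identical series -> 0.0
```

Pairwise DTW distances over all markets yield the 24x24 dissimilarity matrix that MDS then embeds in low-dimensional space.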
ERIC Educational Resources Information Center
Cmar, Jennifer L.
2015-01-01
Introduction: Youths with visual impairments attend postsecondary school at high rates, yet these individuals have low rates of employment. In this study, factors associated with post-school employment were investigated in a nationally representative sample of youths with visual impairments. Methods: In a secondary analysis of data from the…
Improving the quality of parameter estimates obtained from slug tests
Butler, J.J.; McElwee, C.D.; Liu, W.
1996-01-01
The slug test is one of the most commonly used field methods for obtaining in situ estimates of hydraulic conductivity. Despite its prevalence, this method has received criticism from many quarters in the ground-water community. This criticism emphasizes the poor quality of the estimated parameters, a condition that is primarily a product of the somewhat casual approach that is often employed in slug tests. Recently, the Kansas Geological Survey (KGS) has pursued research directed at improving methods for the performance and analysis of slug tests. Based on extensive theoretical and field research, a series of guidelines has been proposed that should enable the quality of parameter estimates to be improved. The most significant of these guidelines are: (1) three or more slug tests should be performed at each well during a given test period; (2) two or more different initial displacements (Ho) should be used at each well during a test period; (3) the method used to initiate a test should enable the slug to be introduced in a near-instantaneous manner and should allow a good estimate of Ho to be obtained; (4) data-acquisition equipment that enables a large quantity of high-quality data to be collected should be employed; (5) if an estimate of the storage parameter is needed, an observation well other than the test well should be employed; (6) the method chosen for analysis of the slug-test data should be appropriate for site conditions; (7) use of pre- and post-analysis plots should be an integral component of the analysis procedure; and (8) appropriate well construction parameters should be employed. Data from slug tests performed at a number of KGS field sites demonstrate the importance of these guidelines.
Interpreting Significant Discrete-Time Periods in Survival Analysis.
ERIC Educational Resources Information Center
Schumacker, Randall E.; Denson, Kathleen B.
Discrete-time survival analysis is a new method for educational researchers to employ when looking at the timing of certain educational events. Previous continuous-time methods do not allow for the flexibility inherent in a discrete-time method. Because both time-invariant and time-varying predictor variables can now be used, the interaction of…
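Discrete-time survival analysis starts by expanding each subject into one record per period at risk (the "person-period" data set), after which time-varying predictors can be attached to each row. A minimal sketch with invented records:

```python
def person_period(records):
    """Expand (id, last_period, censored) rows into one row per discrete
    period at risk, with event=1 only in the terminal period if uncensored."""
    rows = []
    for pid, last, censored in records:
        for t in range(1, last + 1):
            event = int(t == last and not censored)
            rows.append({"id": pid, "period": t, "event": event})
    return rows

# Hypothetical data: student 1 experienced the event in period 3;
# student 2 was censored (still at risk) after period 2.
data = [(1, 3, False), (2, 2, True)]
rows = person_period(data)
print(len(rows))  # 3 + 2 = 5 person-period records
```

Fitting a logistic regression of `event` on period dummies and predictors over these rows gives the discrete-time hazard model, and interactions with `period` capture time-varying effects.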
Methods for the Measurement of a Bacterial Enzyme Activity in Cell Lysates and Extracts
Mendz, George; Hazell, Stuart
1998-01-01
The kinetic characteristics and regulation of aspartate carbamoyltransferase activity were studied in lysates and cell extracts of Helicobacter pylori by three different methods. Nuclear magnetic resonance spectroscopy, radioactive tracer analysis, and spectrophotometry were employed in conjunction to identify the properties of the enzyme activity and to validate the results obtained with each assay. NMR spectroscopy was the most direct method to provide proof of ACTase activity; radioactive tracer analysis was the most sensitive technique; and a microtitre-based colorimetric assay was the most cost- and time-efficient for large-scale analyses. Freeze-thawing was adopted as the preferred method for cell lysis in studying enzyme activity in situ. This study showed the benefits of employing several different complementary methods to investigate bacterial enzyme activity. PMID:12734591
Labeled ALPHA4BETA2 ligands and methods therefor
Mukherjee, Jogeshwar; Pichika, Ramaiah; Potkin, Steven; Leslie, Frances; Chattopadhyay, Sankha
2013-02-19
Contemplated compositions and methods are employed to bind in vitro and in vivo to an .alpha.4.beta.2 nicotinic acetylcholine receptor in a highly selective manner. Where such compounds are labeled, compositions and methods employing such compounds can be used for PET and SPECT analysis. Alternatively and/or additionally, contemplated compounds can be used as antagonists, partial agonists, or agonists in the treatment of diseases or conditions associated with .alpha.4.beta.2 dysfunction.
Gis-Based Spatial Statistical Analysis of College Graduates Employment
NASA Astrophysics Data System (ADS)
Tang, R.
2012-07-01
Awareness of the distribution and employment status of college graduates is urgently needed for the proper allocation of human resources and the overall arrangement of strategic industry. This study provides empirical evidence regarding the use of geocoding and spatial analysis for the distribution and employment status of college graduates, based on data from the 2004-2008 Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed with geocoding using ArcGIS software, and the stepwise multiple linear regression method via SPSS software was used to predict employment and to identify spatially associated enterprise and professional demand in the future. The results show that the enterprises in the Wuhan East Lake high and new technology development zone increased dramatically from 2004 to 2008 and tended to be distributed southeastward. Furthermore, the models built by statistical analysis suggest that graduates' specialty has an important impact on the number employed and on the number of graduates engaging in pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of employment status, is a potential tool for human resource development research.
Analysis of Social Cohesion in Health Data by Factor Analysis Method: The Ghanaian Perspective
ERIC Educational Resources Information Center
Saeed, Bashiru I. I.; Xicang, Zhao; Musah, A. A. I.; Abdul-Aziz, A. R.; Yawson, Alfred; Karim, Azumah
2013-01-01
We investigated the overall social cohesion of Ghanaians. In this study, we considered the paramount interest of the involvement of Ghanaians in their communities, their views of other people and institutions, and their level of interest in both local and national politics. The factor analysis method was employed for analysis using R…
ERIC Educational Resources Information Center
Smedema, Susan Miller; Kesselmayer, Rachel Friefeld; Peterson, Lauren
2018-01-01
Purpose: To test a mediation model of the relationship between core self-evaluations (CSE) and job satisfaction in employed individuals with disabilities. Method: A quantitative descriptive design using Hayes's (2012) PROCESS macro for SPSS and multiple regression analysis. Two-hundred fifty-nine employed persons with disabilities were recruited…
Employing the Training Program Enrollee: An Analysis of Employer Personnel Records.
ERIC Educational Resources Information Center
Greenberg, David H.
The study is an attempt to assess the effects of the attributes of the firm in which a trainee is initially placed on his subsequent success. While traditional methods used information gathered from the trainee, this paper employs an alternative approach--the collection of follow-up data from the personnel records of the hiring companies. Sixteen of…
ERIC Educational Resources Information Center
Karadag, Engin
2010-01-01
To assess the research methods and statistical analysis techniques employed by educational researchers, this study surveyed unpublished doctoral dissertations from 2003 to 2007. Frequently used research methods consisted of experimental research; a survey; a correlational study; and a case study. Descriptive statistics, t-test, ANOVA, factor…
Rapid qualitative and quantitative analysis of proanthocyanidin oligomers and polymers by UPLC-MS/MS
USDA-ARS?s Scientific Manuscript database
Proanthocyanidins (PAs) are a structurally complex and bioactive group of tannins. Detailed analysis of PA concentration, composition, and structure typically requires the use of one or more time-consuming analytical methods. For example, the commonly employed thiolysis and phloroglucinolysis method...
NASA Astrophysics Data System (ADS)
Okumura, Hiroshi; Suezaki, Masashi; Sueyasu, Hideki; Arai, Kohei
2003-03-01
An automated method that can select corresponding point candidates is developed. This method has the following three features: 1) employment of the RIN-net for corresponding point candidate selection; 2) employment of multi-resolution analysis with the Haar wavelet transform to improve selection accuracy and noise tolerance; and 3) employment of context information about corresponding point candidates for screening of selected candidates. Here, 'RIN-net' denotes a back-propagation-trained feed-forward three-layer artificial neural network that takes rotation invariants as input data. In our system, pseudo-Zernike moments are employed as the rotation invariants. The RIN-net has an N x N pixel field of view (FOV). Experiments were conducted to evaluate the corresponding point candidate selection capability of the proposed method using various kinds of remotely sensed images. The experimental results show that the proposed method achieves fewer training patterns, less training time, and higher selection accuracy than conventional methods.
Spectroscopic chemical analysis methods and apparatus
NASA Technical Reports Server (NTRS)
Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)
2013-01-01
Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.
Event time analysis of longitudinal neuroimage data.
Sabuncu, Mert R; Bernal-Rusiel, Jorge L; Reuter, Martin; Greve, Douglas N; Fischl, Bruce
2014-08-15
This paper presents a method for the statistical analysis of the associations between longitudinal neuroimaging measurements, e.g., of cortical thickness, and the timing of a clinical event of interest, e.g., disease onset. The proposed approach consists of two steps, the first of which employs a linear mixed effects (LME) model to capture temporal variation in serial imaging data. The second step utilizes the extended Cox regression model to examine the relationship between time-dependent imaging measurements and the timing of the event of interest. We demonstrate the proposed method both for the univariate analysis of image-derived biomarkers, e.g., the volume of a structure of interest, and the exploratory mass-univariate analysis of measurements contained in maps, such as cortical thickness and gray matter density. The mass-univariate method employs a recently developed spatial extension of the LME model. We applied our method to analyze structural measurements computed using FreeSurfer, a widely used brain Magnetic Resonance Image (MRI) analysis software package. We provide a quantitative and objective empirical evaluation of the statistical performance of the proposed method on longitudinal data from subjects suffering from Mild Cognitive Impairment (MCI) at baseline. Copyright © 2014 Elsevier Inc. All rights reserved.
Research Methods and Data Analysis Procedures Used by Educational Researchers
ERIC Educational Resources Information Center
Hsu, Tse-chi
2005-01-01
To assess the status and the trends of subject matters investigated and research methods/designs and data analysis procedures employed by educational researchers, this study surveyed articles published by the "American Educational Research Journal (AERJ)," "Journal of Experimental Education (JEE)" and "Journal of Educational Research (JER)" from…
The creation of chiral chromatography techniques significantly advanced the development of methods for the analysis of individual enantiomers of chiral compounds. These techniques are being employed at the US EPA for human exposure and ecological research studies with indoor samp...
NASA Astrophysics Data System (ADS)
Crawford, I.; Ruske, S.; Topping, D. O.; Gallagher, M. W.
2015-07-01
In this paper we present improved methods for discriminating and quantifying Primary Biological Aerosol Particles (PBAP) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1×10^6 points on a desktop computer, allowing each fluorescent particle in a dataset to be explicitly clustered. This reduces the potential for misattribution found in the subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient dataset. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4), and the optical size, asymmetry factor and fluorescence measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98% and 98.1% of the data points, respectively. The best performing methods were applied to the BEACHON-RoMBAS ambient dataset, where the z-score and range normalisation methods yielded similar results, each producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. The z-score result was compared to clusters generated with previous approaches (WIBS AnalysiS Program, WASP), where we observe that the subsampling and comparative attribution method employed by WASP results in the overestimation of the fungal spore concentration by a factor of 1.5 and the underestimation of the bacterial aerosol concentration by a factor of 5.
We suggest that this is likely due to errors arising from misattribution caused by poor centroid definition and the failure to assign particles to a cluster, both consequences of the subsampling and comparative attribution method employed by WASP. The methods used here allow the entire fluorescent particle population to be analysed, yielding an explicit cluster attribution for each particle, improving cluster centroid definition and our capacity to discriminate and quantify PBAP meta-classes compared to previous approaches.
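The core of the pipeline described above, z-score normalisation followed by Ward-linkage hierarchical agglomerative clustering, can be sketched in a few lines of Python. This is a minimal illustration on synthetic two-population data, not the WIBS-4 or BEACHON-RoMBAS datasets; the feature values and cluster count are assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def zscore(X):
    # z-score normalisation: zero mean, unit variance per feature column
    return (X - X.mean(axis=0)) / X.std(axis=0)

def ward_cluster(X, n_clusters):
    # Ward-linkage hierarchical agglomerative clustering on normalised data
    Z = linkage(zscore(X), method="ward")
    return fcluster(Z, t=n_clusters, criterion="maxclust")

rng = np.random.default_rng(0)
# two synthetic "particle" populations with distinct size/fluorescence signatures
a = rng.normal([1.0, 5.0, 0.2], 0.1, size=(50, 3))
b = rng.normal([3.0, 1.0, 0.9], 0.1, size=(50, 3))
labels = ward_cluster(np.vstack([a, b]), 2)
```

Because every observation is fed to `linkage`, each particle receives an explicit cluster label, mirroring the full-population attribution argued for in the abstract.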
Recent statistical methods for orientation data
NASA Technical Reports Server (NTRS)
Batschelet, E.
1972-01-01
The application of statistical methods to animal orientation and navigation is discussed. The method employed is limited to the two-dimensional case. Various tests for determining the validity of the statistical analysis are presented. Mathematical models are included to support the theoretical considerations, and tables of data are developed to show the value of the information obtained by statistical analysis.
Statistical methods in personality assessment research.
Schinka, J A; LaLone, L; Broeckel, J A
1997-06-01
Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.
Danhelova, Hana; Hradecky, Jaromir; Prinosilova, Sarka; Cajka, Tomas; Riddellova, Katerina; Vaclavik, Lukas; Hajslova, Jana
2012-07-01
The development and use of a fast method employing a direct analysis in real time (DART) ion source coupled to high-resolution time-of-flight mass spectrometry (TOFMS) for the quantitative analysis of caffeine in various coffee samples has been demonstrated in this study. A simple sample extraction procedure employing hot water was followed by direct, high-throughput (<1 min per run) examination of the extracts spread on a glass rod under optimized conditions of ambient mass spectrometry, without any prior chromatographic separation. For quantification of caffeine using DART-TOFMS, an external calibration was used. Isotopically labeled caffeine was used to compensate for variations in the ion intensities of the caffeine signal. Recoveries of the DART-TOFMS method were 97% for instant coffee at spiking levels of 20 and 60 mg/g, while for roasted ground coffee the obtained values were 106% and 107% at spiking levels of 10 and 30 mg/g, respectively. The repeatability of the whole analytical procedure (expressed as relative standard deviation, RSD, %) was <5% for all tested spiking levels and matrices. Since the linearity range of the method was relatively narrow (two orders of magnitude), optimization of sample dilution prior to the DART-TOFMS measurement was needed to avoid saturation of the detector.
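The external-calibration step described above can be sketched as a least-squares line through standard concentrations versus measured signal, inverted to quantify an unknown. All numbers here are illustrative assumptions, not the paper's DART-TOFMS data.

```python
import numpy as np

# Hypothetical external calibration: fit signal = slope*conc + intercept
# through calibration standards, then invert the line for an unknown sample.
conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])   # standard levels, mg/g (assumed)
signal = 2.5 * conc + 0.8                        # simulated, noise-free response
slope, intercept = np.polyfit(conc, signal, 1)   # least-squares calibration line

def quantify(s):
    # invert the calibration line to recover a concentration from a signal
    return (s - intercept) / slope

# a sample "spiked" at 20 mg/g should be recovered at ~100%
recovery = 100.0 * quantify(2.5 * 20.0 + 0.8) / 20.0
```

With real, noisy intensities the same `polyfit`/inversion pattern applies; the narrow linear range noted in the abstract is why dilution must keep the signal inside the fitted calibration interval.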
Military applications and examples of near-surface seismic surface wave methods (Invited)
NASA Astrophysics Data System (ADS)
Sloan, S.; Stevens, R.
2013-12-01
Although not always widely known or publicized, the military uses a variety of geophysical methods for a wide range of applications--some that are already common practice in the industry while others are truly novel. Some of those applications include unexploded ordnance detection, general site characterization, anomaly detection, countering improvised explosive devices (IEDs), and security monitoring, to name a few. Techniques used may include, but are not limited to, ground penetrating radar, seismic, electrical, gravity, and electromagnetic methods. Seismic methods employed include surface wave analysis, refraction tomography, and high-resolution reflection methods. Although the military employs geophysical methods, that does not necessarily mean that those methods enable or support combat operations--often times they are being used for humanitarian applications within the military's area of operations to support local populations. The work presented here will focus on the applied use of seismic surface wave methods, including multichannel analysis of surface waves (MASW) and backscattered surface waves, often in conjunction with other methods such as refraction tomography or body-wave diffraction analysis. Multiple field examples will be shown, including explosives testing, tunnel detection, pre-construction site characterization, and cavity detection.
Who Benefits From Supported Employment: A Meta-analytic Study
Campbell, Kikuko; Drake, Robert E.
2011-01-01
Aims: This meta-analysis sought to identify which subgroups of clients with severe mental illness (SMI) benefited from evidence-based supported employment. Methods: We used meta-analysis to pool the samples from 4 randomized controlled trials comparing the Individual Placement and Support (IPS) model of supported employment to well-regarded vocational approaches using stepwise models and brokered services. Meta-analysis was used to determine the magnitude of effects for IPS/control group differences within specific client subgroups (defined by 2 work history, 7 sociodemographic, and 8 clinical variables) on 3 competitive employment outcomes (obtaining a job, total weeks worked, and job tenure). Results: The findings strongly favored IPS, with large effect sizes across all outcomes: 0.96 for job acquisition, 0.79 for total weeks worked, and 0.74 for job tenure. Overall, 90 (77%) of the 117 effect sizes calculated for the 39 subgroups exceeded 0.70, and all 117 favored IPS. Conclusions: IPS produces better competitive employment outcomes for persons with SMI than alternative vocational programs regardless of background demographic, clinical, and employment characteristics. PMID:19661196
Analysis of spreadable cheese by Raman spectroscopy and chemometric tools.
Oliveira, Kamila de Sá; Callegaro, Layce de Souza; Stephani, Rodrigo; Almeida, Mariana Ramos; de Oliveira, Luiz Fernando Cappa
2016-03-01
In this work, FT-Raman spectroscopy was explored to evaluate spreadable cheese samples. A partial least squares discriminant analysis was employed to identify the spreadable cheese samples containing starch. To build the models, two types of samples were used: commercial samples and samples manufactured in local industries. The method of supervised classification PLS-DA was employed to classify the samples as adulterated or without starch. Multivariate regression was performed using the partial least squares method to quantify the starch in the spreadable cheese. The limit of detection obtained for the model was 0.34% (w/w) and the limit of quantification was 1.14% (w/w). The reliability of the models was evaluated by determining the confidence interval, which was calculated using the bootstrap re-sampling technique. The results show that the classification models can be used to complement classical analysis and as screening methods. Copyright © 2015 Elsevier Ltd. All rights reserved.
Preliminary design methods for fiber reinforced composite structures employing a personal computer
NASA Technical Reports Server (NTRS)
Eastlake, C. N.
1986-01-01
The objective of this project was to develop a user-friendly interactive computer program to be used as an analytical tool by structural designers. Its intent was to do preliminary, approximate stress analysis to help select or verify sizing choices for composite structural members. The approach to the project was to provide a subroutine which uses classical lamination theory to predict an effective elastic modulus for a laminate of arbitrary material and ply orientation. This effective elastic modulus can then be used in a family of other subroutines which employ the familiar basic structural analysis methods for isotropic materials. This method is simple and convenient to use but only approximate, as is appropriate for a preliminary design tool which will be subsequently verified by more sophisticated analysis. Additional subroutines have been provided to calculate laminate coefficient of thermal expansion and to calculate ply-by-ply strains within a laminate.
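The classical-lamination-theory step described above, predicting an effective elastic modulus from ply properties and orientations, can be sketched as follows. This is a generic plane-stress sketch of the in-plane (A-matrix) response of a symmetric laminate, ignoring bending-extension coupling; the carbon/epoxy ply values are hypothetical, not from the NASA program.

```python
import numpy as np

def q_matrix(E1, E2, G12, nu12):
    # reduced stiffness matrix of a unidirectional ply under plane stress
    nu21 = nu12 * E2 / E1
    d = 1.0 - nu12 * nu21
    return np.array([[E1 / d, nu12 * E2 / d, 0.0],
                     [nu12 * E2 / d, E2 / d, 0.0],
                     [0.0, 0.0, G12]])

def qbar(Q, theta_deg):
    # transform the ply stiffness from material axes to laminate axes
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    Q11, Q12, Q22, Q66 = Q[0, 0], Q[0, 1], Q[1, 1], Q[2, 2]
    Qb = np.empty((3, 3))
    Qb[0, 0] = Q11*c**4 + 2*(Q12 + 2*Q66)*s*s*c*c + Q22*s**4
    Qb[1, 1] = Q11*s**4 + 2*(Q12 + 2*Q66)*s*s*c*c + Q22*c**4
    Qb[0, 1] = Qb[1, 0] = (Q11 + Q22 - 4*Q66)*s*s*c*c + Q12*(s**4 + c**4)
    Qb[2, 2] = (Q11 + Q22 - 2*Q12 - 2*Q66)*s*s*c*c + Q66*(s**4 + c**4)
    Qb[0, 2] = Qb[2, 0] = (Q11 - Q12 - 2*Q66)*s*c**3 + (Q12 - Q22 + 2*Q66)*s**3*c
    Qb[1, 2] = Qb[2, 1] = (Q11 - Q12 - 2*Q66)*s**3*c + (Q12 - Q22 + 2*Q66)*s*c**3
    return Qb

def effective_ex(plies, t_ply):
    # A-matrix of the laminate and its effective in-plane modulus Ex
    A = sum(qbar(q_matrix(*props), ang) * t_ply for props, ang in plies)
    h = t_ply * len(plies)
    a = np.linalg.inv(A)          # in-plane compliance
    return 1.0 / (h * a[0, 0])

props = (150e9, 10e9, 5e9, 0.3)   # hypothetical E1, E2, G12, nu12 (Pa)
layup = [(props, ang) for ang in (0, 45, -45, 90)]
Ex = effective_ex(layup, 0.125e-3)
```

The resulting effective modulus can then be fed into ordinary isotropic structural formulas, which is exactly the approximation strategy the abstract describes. A sanity check: a single 0-degree ply recovers E1, and a single 90-degree ply recovers E2.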
Rapid analysis of pharmaceutical drugs using LIBS coupled with multivariate analysis.
Tiwari, P K; Awasthi, S; Kumar, R; Anand, R K; Rai, P K; Rai, A K
2018-02-01
Type 2 diabetes drug tablets containing voglibose in dose strengths of 0.2 and 0.3 mg from various brands have been examined using the laser-induced breakdown spectroscopy (LIBS) technique. Statistical methods such as principal component analysis (PCA) and partial least squares regression (PLSR) have been employed on the LIBS spectral data to classify the drug samples and develop calibration models. We have developed a ratio-based calibration model applying PLSR, in which the relative spectral intensity ratios H/C, H/N and O/N are used. Further, the developed model has been employed to predict the relative concentration of elements in unknown drug samples. The experiment has been performed in air and argon atmospheres, and the obtained results have been compared. The present model provides a rapid spectroscopic method for drug analysis with high statistical significance for online control and measurement in a wide variety of pharmaceutical industrial applications.
Formula Feedback and Central Cities: The Case of the Comprehensive Employment and Training Act.
ERIC Educational Resources Information Center
Jones, E. Terrence; Phares, Donald
1978-01-01
This study critically examines the measurement of the Comprehensive Employment and Training Act's key allocation variable, unemployment. The analysis indicates (1) unemployment rates are higher than government estimates and (2) the methods used to measure state and local unemployment have several weaknesses. (Author/RLV)
ERIC Educational Resources Information Center
Yung-Kuan, Chan; Hsieh, Ming-Yuan; Lee, Chin-Feng; Huang, Chih-Cheng; Ho, Li-Chih
2017-01-01
Under the hyper-dynamic education situation, this research, in order to comprehensively explore the interplays between Teacher Competence Demands (TCD) and Learning Organization Requests (LOR), cross-employs the data refinement methods of Descriptive Statistics (DS), Analysis of Variance (ANOVA), and Principal Components Analysis (PCA)…
ERIC Educational Resources Information Center
Torreblanca, Maximo
1988-01-01
Discusses the validity of studies of Spanish pronunciation in terms of research methods employed. Topics include data collection in the laboratory vs. in a natural setting; recorded vs. non-recorded data; quality of the recording; aural analysis vs. spectrographic analysis; and transcriber reliability. Suggestions for improving data collection are…
Pires, Adriana Elias; Honda, Neli Kiko; Cardoso, Cláudia Andréa Lima
2004-10-29
A method for sample preparation and analysis by high-performance liquid chromatography with UV detection (HPLC-UV) has been developed for routine analysis of psoralen and bergapten, photosensitizing compounds, in oral solutions of phytomedicines employed in Brazil for some illnesses. The linearity, accuracy, and the inter- and intra-day precision of the procedure were evaluated. Calibration curves for psoralen and bergapten were linear in the ranges of 1.0-600.0 microg ml(-1) and 1.0-400.0 microg ml(-1), respectively. The recoveries of the psoralens in the oral solutions analysed were 94.43-99.97%. The percentage coefficient of variation (CV) of the quantitative analysis of the psoralens in the products analysed was within 5%. In the inter-equipment study, gas chromatography with flame ionization detection (GC-FID) was employed.
Monks, K; Molnár, I; Rieger, H-J; Bogáti, B; Szabó, E
2012-04-06
Robust HPLC separations lead to fewer analysis failures and better method transfer as well as providing an assurance of quality. This work presents the systematic development of an optimal, robust, fast UHPLC method for the simultaneous assay of two APIs of an eye drop sample and their impurities, in accordance with Quality by Design principles. Chromatography software is employed to effectively generate design spaces (Method Operable Design Regions), which are subsequently employed to determine the final method conditions and to evaluate robustness prior to validation. Copyright © 2011 Elsevier B.V. All rights reserved.
Integrated Processing in Planning and Understanding.
1986-12-01
to language analysis seemed necessary. The second observation was the rather commonsense one that it is easier to understand a foreign language …syntactic analysis. Probably the most widely employed method for natural language analysis is augmented transition network parsing, or ATNs (Thorne, Bratley… accomplished. It is for this reason that the programming language Prolog, which implements that general method, has proven so well-suited to writing ATN
NASA Astrophysics Data System (ADS)
Pai, Akshay; Samala, Ravi K.; Zhang, Jianying; Qian, Wei
2010-03-01
Mammography reading by radiologists and breast tissue image interpretation by pathologists often lead to high False Positive (FP) rates. Similarly, current Computer Aided Diagnosis (CADx) methods tend to concentrate more on sensitivity, thus increasing FP rates. A novel method is introduced here which employs a similarity-based approach to decrease the FP rate in the diagnosis of microcalcifications. This method employs Principal Component Analysis (PCA) and similarity metrics to achieve the proposed goal. The training and testing sets are divided into generalized (Normal and Abnormal) and more specific (Abnormal, Normal, Benign) classes. The performance of this method as a standalone classification system is evaluated in both cases (general and specific). In another approach, the probability of each case belonging to a particular class is calculated. If the probabilities are too close to classify, the augmented CADx system can be instructed to perform a detailed analysis of such cases. For normal cases with high probability, no further processing is necessary, reducing computation time. Hence, this novel method can be employed in cascade with CADx to reduce the FP rate and also avoid unnecessary computation time. Using this methodology, false positive rates of 8% and 11% were achieved for mammography and cellular images, respectively.
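The PCA-plus-similarity idea described above can be sketched with numpy alone: project feature vectors onto the top principal components, then assign the class whose mean projection is most similar. The two-class toy features below are assumptions for illustration, not the paper's mammography data, and cosine similarity stands in for whichever similarity metric the authors used.

```python
import numpy as np

def pca_fit(X, k):
    # principal component analysis via SVD of the mean-centred data
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]                      # mean and top-k principal axes

def cosine(u, v):
    # cosine similarity between two projected feature vectors
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

def classify(x, mu, W, class_means):
    # project onto the PCA subspace, then pick the most similar class mean
    z = W @ (x - mu)
    return max(class_means, key=lambda c: cosine(z, class_means[c]))

rng = np.random.default_rng(1)
# synthetic "normal" vs "abnormal" feature vectors with opposite trends
normal = rng.normal(0.0, 0.1, size=(40, 8)) + np.linspace(0.0, 1.0, 8)
abnormal = rng.normal(0.0, 0.1, size=(40, 8)) + np.linspace(1.0, 0.0, 8)
X = np.vstack([normal, abnormal])
mu, W = pca_fit(X, 2)
means = {"normal": (W @ (normal - mu).T).mean(axis=1),
         "abnormal": (W @ (abnormal - mu).T).mean(axis=1)}
```

A probabilistic variant, as the abstract suggests, could soft-max the similarity scores and defer borderline cases to the full CADx pipeline.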
Regional analysis of annual maximum rainfall using TL-moments method
NASA Astrophysics Data System (ADS)
Shabri, Ani Bin; Daud, Zalina Mohd; Ariff, Noratiqah Mohd
2011-06-01
Information related to the distribution of rainfall amounts is of great importance for the design of water-related structures. One concern of hydrologists and engineers is the choice of probability distribution for modeling regional data. In this study, regional frequency analysis using L-moments is revisited, and an alternative regional frequency analysis using the TL-moments method is employed; the results from both methods are then compared. The analysis was based on daily annual maximum rainfall data from 40 stations in Selangor, Malaysia. TL-moments for the generalized extreme value (GEV) and generalized logistic (GLO) distributions were derived and used to develop the regional frequency analysis procedure. The TL-moment ratio diagram and Z-test were employed to determine the best-fit distribution. Comparison between the two approaches showed that L-moments and TL-moments produced equivalent results. The GLO and GEV distributions were identified as the most suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation was used for performance evaluation, and it showed that the TL-moments method was more efficient for lower quantile estimation than L-moments.
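The building block of the analysis above is the sample L-moment, computed from probability-weighted moments of the ordered data (TL-moments generalise these by trimming extreme order statistics; the untrimmed case is sketched here). A minimal numpy sketch, with a symmetric test sample as an assumed input:

```python
import numpy as np

def sample_l_moments(x):
    # first three sample L-moments via probability-weighted moments (PWMs)
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1 = b0                         # L-location (the mean)
    l2 = 2 * b1 - b0                # L-scale
    l3 = 6 * b2 - 6 * b1 + b0      # third L-moment
    return l1, l2, l3 / l2          # (l1, l2, L-skewness tau3)

# evenly spaced sample on [0, 1]: mean 0.5, L-scale near 1/6, zero L-skewness
l1, l2, t3 = sample_l_moments(np.linspace(0.0, 1.0, 1001))
```

In a regional analysis these ratios (and their trimmed TL counterparts) are pooled across stations and matched against the theoretical GEV/GLO L-moment ratio curves to select the best-fit distribution.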
USDA-ARS?s Scientific Manuscript database
Immunoassays are analytical methods that employ antibodies or molecules derived from antibodies for the essential binding reactions. The choice of immunoassay system for food safety analysis depends on the analyte, the matrix, and the requirements of the analysis (speed, throughput, sensitivity, spe...
Shanghai: a study on the spatial growth of population and economy in a Chinese metropolitan area.
Zhu, J
1995-01-01
In this study of the growth in population and industry in Shanghai, China, between the 1982 and 1990 censuses, data on administrative divisions were normalized through digitization and spatial analysis. The analysis focused on spatial units, intensity of growth, time period, distance, rate of growth, and direction of spatial growth. The trisection method divided the city into the city proper, the outskirts, and the suburbs. The distance function method modeled values as functions of distance from the city center: exponential, power, trigonometric, logarithmic, and polynomial. Population growth and employment in all sectors increased in the outskirts and suburbs and decreased in the city proper, except in the tertiary sector. Primary sector employment decreased in all three sections. Employment in the secondary sector increased faster in the outskirts and suburbs than the total rate of growth of population and employment. In the city proper, secondary sector employment decreased faster than total population and employment. The tertiary sector had the highest rate of growth in all sections, and its employment grew faster than secondary sector employment. Tertiary growth was highest in real estate, finance, and insurance. Industrial growth in the secondary sector was 160.2% in the suburbs, 156.6% in the outskirts, and 80.9% in the city proper. In the distance function analysis, industry expanded further out than the secondary sector as a whole. Commerce grew fastest in areas 15.4 km from the city center. Economic growth was faster after the economic reforms of 1978. Growth was led by industry, followed by the secondary sector, the tertiary sector, and population. Industrial expansion resulted from inner pressure, political factors controlling size, the social and economic system, and the housing construction and distribution system. Initially, sociopsychological factors affected urban concentration.
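One common instance of the distance function method above is an exponential density gradient, density(d) = a * exp(-b d), fitted by ordinary least squares on the log-linearised model. The numbers below are synthetic assumptions for illustration, not the Shanghai census data.

```python
import numpy as np

# Fit density(d) = a * exp(-b * d) by regressing log(density) on distance d.
d = np.linspace(1.0, 30.0, 30)            # km from the city centre (assumed)
density = 5000.0 * np.exp(-0.12 * d)      # synthetic population density
slope, log_a = np.polyfit(d, np.log(density), 1)
a_hat, b_hat = np.exp(log_a), -slope      # recovered gradient parameters
```

The same log-linear trick handles the power form (regress log density on log distance); the trigonometric and polynomial forms listed in the abstract would be fitted directly with `polyfit` or nonlinear least squares.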
Fuller, Daniel; Buote, Richard; Stanley, Kevin
2017-11-01
The volume and velocity of data are growing rapidly and big data analytics are being applied to these data in many fields. Population and public health researchers may be unfamiliar with the terminology and statistical methods used in big data. This creates a barrier to the application of big data analytics. The purpose of this glossary is to define terms used in big data and big data analytics and to contextualise these terms. We define the five Vs of big data and provide definitions and distinctions for data mining, machine learning and deep learning, among other terms. We provide key distinctions between big data and statistical analysis methods applied to big data. We contextualise the glossary by providing examples where big data analysis methods have been applied to population and public health research problems and provide brief guidance on how to learn big data analysis methods. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Gonzalez, Aroa Garcia; Taraba, Lukáš; Hraníček, Jakub; Kozlík, Petr; Coufal, Pavel
2017-01-01
Dasatinib is a novel oral prescription drug proposed for treating adult patients with chronic myeloid leukemia. Three analytical methods, namely ultra high performance liquid chromatography, capillary zone electrophoresis, and sequential injection analysis, were developed, validated, and compared for determination of the drug in the tablet dosage form. The total analysis time of the optimized ultra high performance liquid chromatography and capillary zone electrophoresis methods was 2.0 and 2.2 min, respectively. Direct ultraviolet detection with a detection wavelength of 322 nm was employed in both cases. The optimized sequential injection analysis method was based on spectrophotometric detection of dasatinib after a simple colorimetric reaction with Folin-Ciocalteu reagent, forming a blue-colored complex with an absorbance maximum at 745 nm. The total analysis time was 2.5 min. The ultra high performance liquid chromatography method provided the lowest detection and quantitation limits and the most precise and accurate results. All three newly developed methods were demonstrated to be specific, linear, sensitive, precise, and accurate, providing results satisfactorily meeting the requirements of the pharmaceutical industry, and can be employed for the routine determination of the active pharmaceutical ingredient in the tablet dosage form. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Foundation Analysis East Coast Air Combat Maneuvering Range Offshore Kitty Hawk, North Carolina.
1976-09-01
AD-A163 522: Foundation Analysis, East Coast Air Combat Maneuvering Range, Offshore Kitty Hawk, NC (Crest Engineering Inc., Tulsa, OK, September 1976). …piling into the desired penetration. 1.2 Methods of Analysis: The method employed to perform the computation of pipe pile capacity curves, as presented…
Kim, Hwi; Min, Sung-Wook; Lee, Byoungho
2008-12-01
Geometrical optics analysis of the structural imperfection of retroreflection corner cubes is described. In the analysis, a geometrical optics model of six-beam reflection patterns generated by an imperfect retroreflection corner cube is developed, and its structural error extraction is formulated as a nonlinear optimization problem. The nonlinear conjugate gradient method is employed for solving the nonlinear optimization problem, and its detailed implementation is described. The proposed method of analysis is a mathematical basis for the nondestructive optical inspection of imperfectly fabricated retroreflection corner cubes.
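The optimization step described above can be illustrated with a generic nonlinear conjugate gradient routine. This is a Polak-Ribiere+ sketch with Armijo backtracking on an assumed convex test function, not the authors' corner-cube error-extraction objective.

```python
import numpy as np

def ncg_minimize(f, grad, x0, iters=500, tol=1e-8):
    # nonlinear conjugate gradient (Polak-Ribiere+) with Armijo backtracking
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0.0:
            d = -g                      # safeguard: restart with steepest descent
        t, slope = 1.0, g @ d
        while f(x + t * d) > f(x) + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5                    # Armijo backtracking line search
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PR+ update
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# assumed toy objective standing in for the corner-cube error model
f = lambda v: (v[0] - 3.0) ** 2 + 5.0 * (v[1] + 1.0) ** 2
grad = lambda v: np.array([2.0 * (v[0] - 3.0), 10.0 * (v[1] + 1.0)])
x_min = ncg_minimize(f, grad, [0.0, 0.0])
```

In the paper's setting, `f` would measure the mismatch between the modeled and observed six-beam reflection patterns, with the corner-cube dihedral errors as the optimization variables.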
A Review of CEFA Software: Comprehensive Exploratory Factor Analysis Program
ERIC Educational Resources Information Center
Lee, Soon-Mook
2010-01-01
CEFA 3.02 (Browne, Cudeck, Tateneni, & Mels, 2008) is a computer program designed to perform exploratory factor analysis. It provides the main capabilities needed for exploratory factor analysis, namely a variety of factoring methods employing eight different discrepancy functions to be minimized to yield initial…
41 CFR 60-2.12 - Job group analysis.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 1 2010-07-01 true Job group analysis. 60-2... group analysis. (a) Purpose: A job group analysis is a method of combining job titles within the... employed. (b) In the job group analysis, jobs at the establishment with similar content, wage rates, and...
ERIC Educational Resources Information Center
Fong, Carlton J.; Murphy, Kathleen M.; Westbrook, John D.; Markle, Minda M.
2018-01-01
Purpose: The objective was to examine experimental and quasi-experimental studies about interventions that (i) included behavioral, psychological, educational, or vocational components; (ii) involved cancer survivors aged 18 years or older; and (iii) assessed employment outcomes. Methods: The aims were both to describe the variety of interventions…
Researching Employment Relations: A Self-Reflexive Analysis of a Multi-Method, School-Based Project
ERIC Educational Resources Information Center
McDonald, Paula; Graham, Tina
2011-01-01
Drawing on primary data and adjunct material, this article adopts a critical self-reflexive approach to a three-year, Australian Research Council-funded project that explored themes around "employment citizenship" for high school students in Queensland. The article addresses three overlapping areas that reflect some of the central…
A Quantitative Analysis of the Work Experiences of Adults with Visual Impairments in Nigeria
ERIC Educational Resources Information Center
Wolffe, Karen E.; Ajuwon, Paul M.; Kelly, Stacy M.
2013-01-01
Introduction: Worldwide, people with visual impairments often struggle to gain employment. This study attempts to closely evaluate the work experiences of employed individuals with visual impairments living in one of the world's most populous developing nations, Nigeria. Methods: The researchers developed a questionnaire that assessed personal and…
ERIC Educational Resources Information Center
DePinto, Ross M.
2013-01-01
Much of the relevant literature in the domains of leadership development, succession planning, and cross-generational issues that discusses learning paradigms associated with emerging generational cohorts has been based on qualitative research and anecdotal evidence. In contrast, this study employed quantitative research methods using a validated…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kawaguchi, Tomoya; Liu, Yihua; Reiter, Anthony
2018-04-20
Here, a one-dimensional non-iterative direct method was employed for normalized crystal truncation rod analysis. The non-iterative approach, utilizing the Kramers–Kronig relation, avoids the ambiguities due to an improper initial model or incomplete convergence in the conventional iterative methods. The validity and limitations of the present method are demonstrated through both numerical simulations and experiments with Pt(111) in a 0.1 M CsF aqueous solution. The present method is compared with conventional iterative phase-retrieval methods.
Application of software technology to automatic test data analysis
NASA Technical Reports Server (NTRS)
Stagner, J. R.
1991-01-01
The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.
An analysis of burn-off impact on the formation of the microporous structure of activated carbons
NASA Astrophysics Data System (ADS)
Kwiatkowski, Mirosław; Kopac, Türkan
2017-12-01
The paper presents the results on the application of the LBET numerical method as a tool for analysis of the microporous structure of activated carbons obtained from a bituminous coal. The LBET method was employed particularly to evaluate the impact of the burn-off on the obtained microporous structure parameters of activated carbons.
2D/3D facial feature extraction
NASA Astrophysics Data System (ADS)
Çinar Akakin, Hatice; Ali Salah, Albert; Akarun, Lale; Sankur, Bülent
2006-02-01
We propose and compare three different automatic landmarking methods for near-frontal faces. The face information is provided as 480x640 gray-level images in addition to the corresponding 3D scene depth information. All three methods follow a coarse-to-fine approach and use the 3D information in an assisting role. The first method employs a combination of principal component analysis (PCA) and independent component analysis (ICA) features to analyze the Gabor feature set. The second method uses a subset of DCT coefficients for template-based matching. These two methods employ SVM classifiers with polynomial kernel functions. The third method uses a mixture of factor analyzers to learn Gabor filter outputs. We contrast the localization performance separately with 2D texture and 3D depth information. Although the 3D depth information per se does not perform as well as texture images in landmark localization, it still has a beneficial role in eliminating the background and the false alarms.
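The first method's feature pipeline can be sketched in a few lines. This is an illustrative stand-in, not the authors' implementation: random vectors replace the Gabor feature set, and the dimensions, class separation, and SVM parameters are all assumptions.

```python
# Sketch of the first landmarking pipeline: PCA and ICA features feeding a
# polynomial-kernel SVM. Synthetic vectors stand in for Gabor responses;
# every dimension and parameter here is an illustrative guess.
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Stand-in for Gabor feature vectors of non-landmark (0) vs. landmark (1) patches.
X = rng.normal(size=(200, 64))
y = np.repeat([0, 1], 100)
X[y == 1] += 1.0  # separate the classes so the toy problem is learnable

pca = PCA(n_components=10).fit(X)
ica = FastICA(n_components=10, random_state=0).fit(X)
features = np.hstack([pca.transform(X), ica.transform(X)])  # combined feature set

clf = SVC(kernel="poly", degree=3).fit(features, y)
train_acc = clf.score(features, y)
```

In the paper the classifier scores candidate patches around coarse landmark estimates; here the pipeline is only exercised end-to-end on the training set.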
Cost-volume-profit and net present value analysis of health information systems.
McLean, R A
1998-08-01
The adoption of any information system should be justified by an economic analysis demonstrating that its projected benefits outweigh its projected costs. Analysts differ, however, on which methods to employ for such a justification. Accountants prefer cost-volume-profit analysis, and economists prefer net present value analysis. The article explains the strengths and weaknesses of each method and shows how they can be used together so that well-informed investments in information systems can be made.
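The two appraisal methods the article contrasts reduce to short formulas. The sketch below computes a break-even volume (cost-volume-profit) and a net present value for an invented information-system investment; all figures are made up for the example.

```python
# Worked example of the two methods contrasted in the article:
# cost-volume-profit (break-even volume) and net present value.

def breakeven_volume(fixed_cost, price, variable_cost):
    """CVP: volume at which the contribution margin covers fixed costs."""
    return fixed_cost / (price - variable_cost)

def npv(rate, cashflows):
    """NPV: discounted sum of cash flows; cashflows[0] occurs at time zero."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical system: $120,000 up front, $30 contribution margin per unit,
# then three years of $50,000 net benefits discounted at 10%.
volume = breakeven_volume(fixed_cost=120_000, price=50, variable_cost=20)
value = npv(0.10, [-120_000, 50_000, 50_000, 50_000])
```

Used together as the article suggests: CVP says how much activity is needed to cover costs (4,000 units here), while a positive NPV says the discounted benefits exceed the outlay.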
Ideology Awareness Project: An Exercise in Item Unit Content Analysis.
ERIC Educational Resources Information Center
Simon, David R.
1981-01-01
Describes an exercise in the content analysis of political ideologies. Advantages of the exercise include that it teaches students to employ content analysis as a method of research and that it introduces them to the ideological statements of America's leading social critics. (DB)
Watts, Logan L; Todd, E Michelle; Mulhearn, Tyler J; Medeiros, Kelsey E; Mumford, Michael D; Connelly, Shane
2017-01-01
Although qualitative research offers some unique advantages over quantitative research, qualitative methods are rarely employed in the evaluation of ethics education programs and are often criticized for a lack of rigor. This systematic review investigated the use of qualitative methods in studies of ethics education. Following a review of the literature in which 24 studies were identified, each study was coded based on 16 best practices characteristics in qualitative research. General thematic analysis and grounded theory were found to be the dominant approaches used. Researchers are effectively executing a number of best practices, such as using direct data sources, structured data collection instruments, non-leading questioning, and expert raters. However, other best practices were rarely present in the studies reviewed, such as collecting data using multiple sources, methods, raters, and timepoints, evaluating reliability, and employing triangulation analyses to assess convergence. Recommendations are presented for improving future qualitative research studies in ethics education.
Fault detection of gearbox using time-frequency method
NASA Astrophysics Data System (ADS)
Widodo, A.; Satrijo, Dj.; Prahasto, T.; Haryanto, I.
2017-04-01
This research deals with fault detection and diagnosis of a gearbox using its vibration signature. In this work, fault detection and diagnosis are approached by employing a time-frequency method, and the results are then compared with cepstrum analysis. Experimental work was conducted for data acquisition of the vibration signal through a self-designed gearbox test rig. This test rig is able to demonstrate a normal and a faulty gearbox, i.e., wear and tooth breakage. Three accelerometers were used for vibration signal acquisition from the gearbox, and an optical tachometer was used for shaft rotation speed measurement. The results show that frequency domain analysis using the fast Fourier transform was less sensitive to the wear and tooth breakage conditions. However, the short-time Fourier transform method was able to monitor the faults in the gearbox. The wavelet transform (WT) method also showed good performance in gearbox fault detection using the vibration signal after employing time synchronous averaging (TSA).
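Why a time-frequency method outperforms a plain spectrum on a localized fault can be illustrated with a synthetic signal; the tone and burst frequencies, amplitudes, and window length below are arbitrary choices, not the paper's test-rig settings.

```python
# Minimal illustration of why a time-frequency method can reveal a localized
# gearbox fault that a whole-record FFT averages away. The signal is synthetic:
# a steady gear-mesh tone plus a short burst standing in for a tooth-breakage
# impact at t = 0.5 s.
import numpy as np
from scipy.signal import stft

fs = 2000                                       # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
signal = np.sin(2 * np.pi * 100 * t)            # steady gear-mesh component
burst = (np.abs(t - 0.5) < 0.01).astype(float)  # 20 ms fault window
signal += 5 * burst * np.sin(2 * np.pi * 400 * t)

f, tau, Z = stft(signal, fs=fs, nperseg=128)    # short-time Fourier transform
power = np.abs(Z) ** 2

# The 400 Hz band should peak in the time segment nearest t = 0.5 s,
# localizing the transient in a way the global spectrum cannot.
band = np.argmin(np.abs(f - 400))
peak_time = tau[np.argmax(power[band])]
```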
NASA Astrophysics Data System (ADS)
Weng, Jiawen; Clark, David C.; Kim, Myung K.
2016-05-01
A numerical reconstruction method based on compressive sensing (CS) for self-interference incoherent digital holography (SIDH) is proposed to achieve sectional imaging from a single-shot in-line self-interference incoherent hologram. The sensing operator is built up based on the physical mechanism of SIDH according to CS theory, and a recovery algorithm is employed for image restoration. Numerical simulation and experimental studies employing LEDs as discrete point sources and resolution targets as extended sources are performed to demonstrate the feasibility and validity of the method. The intensity distribution and the axial resolution along the propagation direction of SIDH by the angular spectrum method (ASM) and by CS are discussed. The analysis shows that, compared to ASM, reconstruction by CS can improve the axial resolution of SIDH and achieve sectional imaging. The proposed method may be useful for 3D analysis of dynamic systems.
An extended abstract: A heuristic repair method for constraint-satisfaction and scheduling problems
NASA Technical Reports Server (NTRS)
Minton, Steven; Johnston, Mark D.; Philips, Andrew B.; Laird, Philip
1992-01-01
The work described in this paper was inspired by a surprisingly effective neural network developed for scheduling astronomical observations on the Hubble Space Telescope. Our heuristic constraint satisfaction problem (CSP) method was distilled from an analysis of the network. In the process of carrying out the analysis, we discovered that the effectiveness of the network has little to do with its connectionist implementation. Furthermore, the ideas employed in the network can be implemented very efficiently within a symbolic CSP framework. The symbolic implementation is extremely simple. It also has the advantage that several different search strategies can be employed, although we have found that hill-climbing methods are particularly well-suited for the applications that we have investigated. We begin the paper with a brief review of the neural network. Following this, we describe our symbolic method for heuristic repair.
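The heuristic repair idea is compact enough to sketch for n-queens, the illustration commonly associated with this method; the code below is our minimal min-conflicts hill-climbing version, not the authors' implementation.

```python
# Sketch of heuristic repair (min-conflicts hill-climbing) for n-queens:
# start from a complete, flawed assignment and repeatedly move a conflicted
# queen to the row in its column that minimizes conflicts.
import random

def conflicts(board, col, row):
    """Queens attacking (col, row): same row or same diagonal."""
    return sum(1 for c, r in enumerate(board)
               if c != col and (r == row or abs(r - row) == abs(c - col)))

def min_conflicts(n, max_steps=10_000, seed=0):
    rng = random.Random(seed)
    board = [rng.randrange(n) for _ in range(n)]  # one queen per column
    for _ in range(max_steps):
        bad = [c for c in range(n) if conflicts(board, c, board[c]) > 0]
        if not bad:
            return board  # no queen attacks another: solved
        col = rng.choice(bad)
        # repair step: move the queen to a minimum-conflict row,
        # breaking ties randomly to avoid cycling
        counts = [conflicts(board, col, r) for r in range(n)]
        best = min(counts)
        board[col] = rng.choice([r for r in range(n) if counts[r] == best])
    return None

solution = min_conflicts(20)
```

As the abstract notes, the repair heuristic, not any connectionist machinery, does the work: this symbolic version typically solves even large boards in few steps.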
On the feasibility of a transient dynamic design analysis
NASA Astrophysics Data System (ADS)
Cunniff, Patrick F.; Pohland, Robert D.
1993-05-01
The Dynamic Design Analysis Method has been used for the past 30 years as part of the Navy's efforts to shock-harden heavy shipboard equipment. This method, which has been validated several times, employs normal mode theory and design shock values. This report examines the degree of success that may be achieved by using simple equipment-vehicle models that produce time history responses equivalent to the responses that would be achieved using spectral design values employed by the Dynamic Design Analysis Method. These transient models are constructed by attaching the equipment's modal oscillators to the vehicle, which is composed of rigid masses and elastic springs. Two methods have been developed for constructing these transient models. Each method generates the parameters of the vehicle so as to approximate the required damaging effects, such that the transient model is excited by an idealized impulse applied to the vehicle mass to which the equipment modal oscillators are attached. The first method, called the Direct Modeling Method, is limited to equipment with at most three degrees of freedom and a vehicle consisting of a single lumped mass and spring. The Optimization Modeling Method, which is based on the simplex method for optimization, has been used successfully with a variety of vehicle models and equipment sizes.
A Stirling engine analysis method based upon moving gas nodes
NASA Technical Reports Server (NTRS)
Martini, W. R.
1986-01-01
A Lagrangian nodal analysis method for Stirling engines (SEs) is described, validated, and applied to a conventional SE and an isothermalized SE (with fins in the hot and cold spaces). The analysis employs a constant-mass gas node (which moves with respect to the solid nodes during each time step) instead of the fixed gas nodes of Eulerian analysis. The isothermalized SE is found to have efficiency only slightly greater than that of a conventional SE.
Computer image analysis in caryopses quality evaluation as exemplified by malting barley
NASA Astrophysics Data System (ADS)
Koszela, K.; Raba, B.; Zaborowicz, M.; Przybył, K.; Wojcieszak, D.; Czekała, W.; Ludwiczak, A.; Przybylak, A.; Boniecki, P.; Przybył, J.
2015-07-01
One of the purposes of employing modern technologies in the agricultural and food industry is to increase the efficiency and automation of production processes, which improves the productivity of business enterprises and thus makes them more competitive. The challenge for this branch of the economy is to produce agricultural and food products with the best quality parameters while maintaining optimal production and distribution costs for the processed biological material. Several scientific centers therefore seek new and improved methods and technologies in this field to meet these expectations. One solution, under constant development, is to employ so-called machine vision to replace human work in both qualitative and quantitative evaluation processes. An indisputable advantage of this method is that it keeps the evaluation unbiased while improving its speed and, importantly, eliminating expert fatigue. This paper elaborates on quality evaluation by marking the contamination in malting barley grains using computer image analysis and selected methods of artificial intelligence [4-5].
NASA Astrophysics Data System (ADS)
Crawford, I.; Ruske, S.; Topping, D. O.; Gallagher, M. W.
2015-11-01
In this paper we present improved methods for discriminating and quantifying primary biological aerosol particles (PBAPs) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet-light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1 × 106 points on a desktop computer, allowing for each fluorescent particle in a data set to be explicitly clustered. This reduces the potential for misattribution found in subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient data set. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4) where the optical size, asymmetry factor and fluorescent measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98 and 98.1 % of the data points respectively. The best-performing methods were applied to the BEACHON-RoMBAS (Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics and Nitrogen-Rocky Mountain Biogenic Aerosol Study) ambient data set, where it was found that the z-score and range normalisation methods yield similar results, with each method producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. 
The z-score result was compared to clusters generated with previous approaches (WIBS AnalysiS Program, WASP), where we observe that the subsampling and comparative attribution method employed by WASP results in the overestimation of the fungal spore concentration by a factor of 1.5 and the underestimation of bacterial aerosol concentration by a factor of 5. We suggest that this is likely due to errors arising from misattribution due to poor centroid definition and failure to assign particles to a cluster as a result of the subsampling and comparative attribution method employed by WASP. The methods used here allow for the entire fluorescent population of particles to be analysed, yielding an explicit cluster attribution for each particle and improving cluster centroid definition and our capacity to discriminate and quantify PBAP meta-classes compared to previous approaches.
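The best-performing recipe reported above, z-score normalisation followed by Ward-linkage hierarchical agglomerative clustering, can be sketched with SciPy. The two synthetic Gaussian populations below merely stand in for WIBS particle measurements (optical size, asymmetry factor, fluorescence).

```python
# Hedged sketch of the study's best-performing clustering recipe:
# z-score normalisation + hierarchical agglomerative clustering, Ward linkage.
# Two well-separated synthetic populations stand in for UV-LIF particle data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
a = rng.normal(loc=0.0, scale=1.0, size=(50, 3))  # e.g. bacteria-like particles
b = rng.normal(loc=6.0, scale=1.0, size=(50, 3))  # e.g. fungal-spore-like particles
X = np.vstack([a, b])

Xz = (X - X.mean(axis=0)) / X.std(axis=0)         # z-score normalisation
Z = linkage(Xz, method="ward")                    # agglomerative, Ward linkage
labels = fcluster(Z, t=2, criterion="maxclust")   # cut the dendrogram at 2 clusters
```

Because every particle is explicitly clustered, each one gets a label, which is the property the authors use to avoid the misattribution of subsampling approaches.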
Pre-employment and periodical health examinations, job analysis and placement of workers
Forssman, Sven
1955-01-01
A short survey has been given on the purpose and methods of pre-employment and regular examinations and job analysis. Placement of workers from the health point of view must be carried out according to the physical and mental demands of the work and the qualifications of the individual worker to fulfil those demands. Although the principles of the placement process are known, there is a great need for research into some leading problems. PMID:13276805
Yang, Mingxing; Li, Xiumin; Li, Zhibin; Ou, Zhimin; Liu, Ming; Liu, Suhuan; Li, Xuejun; Yang, Shuyu
2013-01-01
DNA microarray analysis is characterized by obtaining a large number of gene variables from a small number of observations. Cluster analysis is widely used to analyze DNA microarray data to make classification and diagnosis of disease. Because there are so many irrelevant and insignificant genes in a dataset, a feature selection approach must be employed in data analysis. The performance of cluster analysis of this high-throughput data depends on whether the feature selection approach chooses the most relevant genes associated with disease classes. Here we proposed a new method using multiple Orthogonal Partial Least Squares-Discriminant Analysis (mOPLS-DA) models and S-plots to select the most relevant genes to conduct three-class disease classification and prediction. We tested our method using Golub's leukemia microarray data. For three classes with subtypes, we proposed hierarchical orthogonal partial least squares-discriminant analysis (OPLS-DA) models and S-plots to select features for two main classes and their subtypes. For three classes in parallel, we employed three OPLS-DA models and S-plots to choose marker genes for each class. The power of feature selection to classify and predict three-class disease was evaluated using cluster analysis. Further, the general performance of our method was tested using four public datasets and compared with those of four other feature selection methods. The results revealed that our method effectively selected the most relevant features for disease classification and prediction, and its performance was better than that of the other methods.
Trends in study design and the statistical methods employed in a leading general medicine journal.
Gosho, M; Sato, Y; Nagashima, K; Takahashi, S
2018-02-01
Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. The study of the comprehensive details and current trends of study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information about evidence-based medicine. Our purpose was to illustrate the study designs and statistical methods employed in recent medical literature. This was an extension study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. Under the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (e.g., the Kaplan-Meier estimator and Cox regression model) were most frequently applied, the Gray test and Fine-Gray proportional hazards model for considering competing risks were sometimes used for a more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were more frequently used than single imputation methods. These methods are not recommended as a primary analysis, but they have been applied in many clinical trials. Group sequential design with interim analyses was one of the standard designs, and novel designs, such as adaptive dose selection and sample size re-estimation, were sometimes employed in NEJM. Model-based approaches for handling missing data should replace single imputation methods for primary analysis in light of the information found in some publications.
Use of adaptive designs with interim analyses is increasing following the FDA guidance on adaptive design.
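The most frequently applied survival method named above, the Kaplan-Meier estimator, is simple to state: at each event time, the survival curve is multiplied by (1 − d/n), where d is the number of events among the n subjects still at risk. A minimal NumPy version on toy data (not from the review):

```python
# Kaplan-Meier estimator for right-censored data in a few lines of NumPy.
import numpy as np

def kaplan_meier(times, events):
    """Return (event times, survival curve S(t)).

    times  : observed follow-up times
    events : 1 if the event occurred, 0 if the observation was censored
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv, uniq, curve = 1.0, [], []
    for tt in np.unique(times[events == 1]):
        at_risk = np.sum(times >= tt)                 # n: still under observation
        died = np.sum((times == tt) & (events == 1))  # d: events at this time
        surv *= 1 - died / at_risk
        uniq.append(tt)
        curve.append(surv)
    return np.array(uniq), np.array(curve)

# Five subjects; those with event=0 are censored and only shrink the risk set.
t, s = kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 1, 0])
```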
State-Space Formulation for Circuit Analysis
ERIC Educational Resources Information Center
Martinez-Marin, T.
2010-01-01
This paper presents a new state-space approach for temporal analysis of electrical circuits. The method systematically obtains the state-space formulation of nondegenerate linear networks without using concepts of topology. It employs nodal/mesh systematic analysis to reduce the number of undesired variables. This approach helps students to…
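A state-space formulation of the kind the paper teaches can be shown on a series RLC circuit; the component values and the use of SciPy's `StateSpace`/`step` helpers are our illustrative choices, not the paper's examples.

```python
# A series RLC circuit written in state-space form x' = Ax + Bu, y = Cx + Du,
# with capacitor voltage and inductor current as the state variables.
# Component values are arbitrary (R = L = C = 1 in SI units).
import numpy as np
from scipy.signal import StateSpace, step

R, L, C = 1.0, 1.0, 1.0
# States: x1 = capacitor voltage v, x2 = inductor current i; input u = source voltage.
#   dv/dt = i / C
#   di/dt = (u - R*i - v) / L          (Kirchhoff voltage law around the loop)
A = np.array([[0.0, 1.0 / C],
              [-1.0 / L, -R / L]])
B = np.array([[0.0], [1.0 / L]])
Cmat = np.array([[1.0, 0.0]])          # output: capacitor voltage
D = np.array([[0.0]])

sys = StateSpace(A, B, Cmat, D)
t, y = step(sys, T=np.linspace(0, 30, 500))
final_value = y[-1]                    # should settle at the DC gain of 1
```

At DC the inductor is a short and the capacitor an open circuit, so the full source voltage appears across the capacitor, which the simulated step response confirms.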
Analysis and interpretation of diffraction data from complex, anisotropic materials
NASA Astrophysics Data System (ADS)
Tutuncu, Goknur
Most materials are elastically anisotropic and exhibit additional anisotropy beyond elastic deformation. For instance, in ferroelectric materials the main inelastic deformation mode is via domains, which are highly anisotropic crystallographic features. To quantify this anisotropy of ferroelectrics, advanced X-ray and neutron diffraction methods were employed. Extensive sets of data were collected from tetragonal BaTiO3, PZT and other ferroelectric ceramics. Data analysis was challenging due to the complex constitutive behavior of these materials. To quantify the elastic strain and texture evolution in ferroelectrics under loading, a number of data analysis techniques such as the single peak and Rietveld methods were used and their advantages and disadvantages compared. It was observed that the single peak analysis fails at low peak intensities especially after domain switching while the Rietveld method does not account for lattice strain anisotropy although it overcomes the low intensity problem via whole pattern analysis. To better account for strain anisotropy the constant stress (Reuss) approximation was employed within the Rietveld method and new formulations to estimate lattice strain were proposed. Along the way, new approaches for handling highly anisotropic lattice strain data were also developed and applied. All of the ceramics studied exhibited significant changes in their crystallographic texture after loading indicating non-180° domain switching. For a full interpretation of domain switching the spherical harmonics method was employed in Rietveld. A procedure for simultaneous refinement of multiple data sets was established for a complete texture analysis. To further interpret diffraction data, a solid mechanics model based on the self-consistent approach was used in calculating lattice strain and texture evolution during the loading of a polycrystalline ferroelectric. 
The model estimates both the macroscopic average response of a specimen and its hkl-dependent lattice strains for different reflections. It also tracks the number of grains (or domains) contributing to each reflection and allows for domain switching. The agreement between the model and experimental data was found to be satisfactory.
Mixed time integration methods for transient thermal analysis of structures, appendix 5
NASA Technical Reports Server (NTRS)
Liu, W. K.
1982-01-01
Mixed time integration methods for transient thermal analysis of structures are studied. An efficient solution procedure for predicting the thermal behavior of aerospace vehicle structures was developed. A 2D finite element computer program incorporating these methodologies is being implemented. The performance of these mixed time finite element algorithms can then be evaluated employing the proposed example problem.
Employment and the Risk of Domestic Abuse among Low-Income Women
ERIC Educational Resources Information Center
Gibson-Davis, Christina M.; Magnuson, Katherine; Gennetian, Lisa A.; Duncan, Greg J.
2005-01-01
This paper uses data from 2 randomized evaluations of welfare-to-work programs--the Minnesota Family Investment Program and the National Evaluation of Welfare-to-Work Strategies--to estimate the effect of employment on domestic abuse among low-income single mothers. Unique to our analysis is the application of a 2-stage least squares method, in…
Determination of copper by isotopic dilution.
Faquim, E S; Munita, C S
1994-01-01
A rapid and selective method was used for the determination of copper by isotopic dilution employing substoichiometric extraction with dithizone in carbon tetrachloride. The appropriate pH range for the substoichiometric extraction was 2-7. In the analysis, even a large excess of elements forming extractable complexes with dithizone does not interfere. The accuracy and precision of the method were evaluated. The method has been applied to analysis of reference materials, wheat flour, wine, and beer.
Ivanov, P L; Leonov, S N; Zemskova, E Iu
2012-01-01
The present study was designed to estimate the possibilities of application of the laser capture microdissection (LCM) technology for the molecular-genetic expert analysis (genotyping) of human chromosomal DNA. The experimental method employed for the purpose was the multiplex multilocus analysis of autosomal DNA polymorphism in preparations of buccal epitheliocytes obtained by LCM. The key principles of the study were the application of physical methods for contrast enhancement of the micropreparations (such as phase-contrast microscopy and dark-field microscopy) and PCR-compatible cell lysis. Genotyping was carried out with the use of AmpFlSTR MiniFiler PCR Amplification Kits (Applied Biosystems, USA). It was shown that the technique employed in the present study ensures reliable genotyping of human chromosomal DNA in pooled preparations containing 10-20 dissected diploid cells each. This result agrees fairly well with the calculated sensitivity of the method. A few practical recommendations are offered.
14 CFR 415.115 - Flight safety.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Flight safety. 415.115 Section 415.115... From a Non-Federal Launch Site § 415.115 Flight safety. (a) Flight safety analysis. An applicant's safety review document must describe each analysis method employed to meet the flight safety analysis...
Counseling Workers over 40: GULHEMP, a New Approach.
ERIC Educational Resources Information Center
Meredith, Jack
This series of presentations describes a method of job counseling and placement for the middle-aged which combines pre-employment physical worker analysis with job analysis for effective matching of job requirements with worker capacities. The matching process involves these steps: (1) job analysis by an industrial engineer; (2) worker examination…
Cluster Analysis of Minnesota School Districts. A Research Report.
ERIC Educational Resources Information Center
Cleary, James
The term "cluster analysis" refers to a set of statistical methods that classify entities with similar profiles of scores on a number of measured dimensions, in order to create empirically based typologies. A 1980 Minnesota House Research Report employed cluster analysis to categorize school districts according to their relative mixtures…
Lerch, Oliver; Temme, Oliver; Daldrup, Thomas
2014-07-01
The analysis of opioids, cocaine, and metabolites from blood serum is a routine task in forensic laboratories. Commonly, the employed methods include many manual or partly automated steps like protein precipitation, dilution, solid phase extraction, evaporation, and derivatization preceding a gas chromatography (GC)/mass spectrometry (MS) or liquid chromatography (LC)/MS analysis. In this study, a comprehensively automated method was developed from a validated, partly automated routine method. This was possible by replicating method parameters on the automated system; only marginal optimization of parameters was necessary. The automation, relying on an x-y-z robot after manual protein precipitation, includes the solid phase extraction, evaporation of the eluate, derivatization (silylation with N-methyl-N-trimethylsilyltrifluoroacetamide, MSTFA), and injection into a GC/MS. A quantitative analysis of almost 170 authentic serum samples and more than 50 authentic samples of other matrices like urine, different tissues, and heart blood for cocaine, benzoylecgonine, methadone, morphine, codeine, 6-monoacetylmorphine, dihydrocodeine, and 7-aminoflunitrazepam was conducted with both methods, proving that the analytical results are equivalent even near the limits of quantification (low ng/ml range). To the best of our knowledge, this application is the first one reported in the literature employing this sample preparation system.
Informal employment in high-income countries for a health inequalities research: A scoping review.
Julià, Mireia; Tarafa, Gemma; O'Campo, Patricia; Muntaner, Carles; Jódar, Pere; Benach, Joan
2015-01-01
Informal employment (IE) is one of the least studied employment conditions in public health research, mainly due to the difficulty of its conceptualization and its measurement, producing a lack of a unique concept and a common method of measurement. The aim of this review is to identify literature on IE in order to improve its definition and methods of measurement, with special attention given to high-income countries, to be able to study the possible impact on health inequalities within and between countries. A scoping review of definitions and methods of measurement of IE was conducted reviewing relevant databases and grey literature and analyzing selected articles. We found a wide spectrum of terms for describing IE as well as definitions and methods of measurement. We provide a definition of IE to be used in health inequalities research in high-income countries. Direct methods such as surveys can capture more information about workers and firms in order to estimate IE. These results can be used in further investigations about the impacts of this IE on health inequalities. Public health research must improve monitoring and analysis of IE in order to know the impacts of this employment condition on health inequalities.
The Humanitarian Bailment of Foreign Possessed Territories: A Proactive Method of Legal Analysis
1996-04-01
entrusts an employee with the employer's lawn mower to mow the employer's lawn. This is analogous to the permissive entry situation where a host... If an employee feloniously takes the lawn mower from the employer's place of business to the employee's house, the servant has... committed larceny because the employee never had possession of the mower, only custody. If the employee picked up the lawn mower from the repair
An analysis of turbulent diffusion flame in axisymmetric jet
NASA Technical Reports Server (NTRS)
Chung, P. M.; Im, K. H.
1980-01-01
The kinetic theory of turbulent flow was employed to study the mixing-limited combustion of hydrogen in axisymmetric jets. The integro-differential equations in two spatial and three velocity coordinates describing the combustion were reduced to a set of hyperbolic partial differential equations in the two spatial coordinates by a binodal approximation. MacCormack's finite difference method was then employed for the solution. The flame length was longer than that predicted by the flame-sheet analysis, and was found to be in general agreement with a recent experimental result. Increase of the turbulence energy and scale resulted in an enhancement of the combustion rate and, hence, in a shorter flame length. Details of the numerical method as well as of the physical findings are discussed.
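MacCormack's predictor-corrector scheme is easy to demonstrate on a model problem far simpler than the reacting jet: linear advection of a periodic Gaussian pulse. The grid size, CFL number, and pulse shape below are arbitrary choices for the sketch.

```python
# MacCormack predictor-corrector scheme for u_t + a u_x = 0 on a periodic grid:
# forward difference in the predictor, backward difference in the corrector.
import numpy as np

def maccormack_advection(u, a, dx, dt, steps):
    c = a * dt / dx  # Courant number
    for _ in range(steps):
        # predictor: forward difference (np.roll(u, -1)[i] = u[i+1])
        up = u - c * (np.roll(u, -1) - u)
        # corrector: backward difference on the predicted values
        u = 0.5 * (u + up - c * (up - np.roll(up, 1)))
    return u

n = 200
x = np.linspace(0, 1, n, endpoint=False)
u0 = np.exp(-200 * (x - 0.25) ** 2)      # Gaussian pulse centered at x = 0.25
dx = 1.0 / n
# 400 steps at dt = 0.5*dx with a = 1 advect the pulse exactly one period,
# so it should return to its starting position.
u = maccormack_advection(u0.copy(), a=1.0, dx=dx, dt=0.5 * dx, steps=400)
```

For this linear problem the scheme is second-order accurate and conservative, which is why the pulse returns essentially undistorted after a full period.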
Yoakum, A M; Stewart, P L; Sterrett, J E
1975-01-01
An emission spectrochemical method is described for the determination of trace quantities of platinum, lead, and manganese in biological tissues. Total energy burns in an argon-oxygen atmosphere are employed. Sample preparation, conditions of analysis, and preparation of standards are discussed. The precision of the method is consistently better than +/- 15%, and comparative analyses indicate comparable accuracies. Data obtained for experimental rat tissues and for selected autopsy tissues are presented. PMID:1157798
Survey of Existing and Promising New Methods of Surface Preparation
1982-04-01
and abroad, a description and analysis are given of applicable methods including: equipment employing recycled steel shot and grit; wet blast...requirements that must be met by these methods. 23. Barrillom, P., "Preservation of Materials in the Marine Environment—Analysis of Replies to the Enquiry on...conditions, can hydrolyze or give sulfuric acid, causing renewed corrosion. Wet blasting or the use of high-pressure water jets appears to be useful in
RICH detectors: Analysis methods and their impact on physics
NASA Astrophysics Data System (ADS)
Križan, Peter
2017-12-01
The paper discusses the importance of particle identification in particle physics experiments, and reviews the impact of ring imaging Cherenkov (RICH) counters in experiments that are currently running or under construction. Several analysis methods are discussed that are needed to calibrate a RICH counter and to align its components with the rest of the detector. Finally, methods are reviewed for employing the collected data to efficiently separate one particle species from another.
METAL SPECIATION IN SOIL, SEDIMENT, AND WATER SYSTEMS VIA SYNCHROTRON RADIATION RESEARCH
Metal contaminated environmental systems (soils, sediments, and water) have challenged researchers for many years. Traditional methods of analysis have employed extraction methods to determine total metal content and define risk based on the premise that as metal concentration in...
The Impact of Intervention Methods on Emotional Intelligence
ERIC Educational Resources Information Center
Davis, Christopher M.
2013-01-01
This experimental study continued the exploration surrounding emotional intelligence (EI). Emotional intelligence was examined through past and present literature, instrumentation, didactic teaching methods employing EI concepts, and data analysis. The experiment involved participants from two sections of an undergraduate economics class at a…
Method of identifying hairpin DNA probes by partial fold analysis
Miller, Benjamin L [Penfield, NY; Strohsahl, Christopher M [Saugerties, NY
2009-10-06
Method of identifying molecular beacons in which a secondary structure prediction algorithm is employed to identify oligonucleotide sequences within a target gene having the requisite hairpin structure. Isolated oligonucleotides, molecular beacons prepared from those oligonucleotides, and their use are also disclosed.
Method of identifying hairpin DNA probes by partial fold analysis
Miller, Benjamin L.; Strohsahl, Christopher M.
2008-10-28
Methods of identifying molecular beacons in which a secondary structure prediction algorithm is employed to identify oligonucleotide sequences within a target gene having the requisite hairpin structure. Isolated oligonucleotides, molecular beacons prepared from those oligonucleotides, and their use are also disclosed.
Method of Testing and Predicting Failures of Electronic Mechanical Systems
NASA Technical Reports Server (NTRS)
Iverson, David L.; Patterson-Hine, Frances A.
1996-01-01
A method is disclosed that employs a knowledge base of human expertise derived from a reliability model analysis and implemented in diagnostic routines. The reliability analysis comprises digraph models that determine target events created by hardware failures, human actions, and other factors affecting system operation. The reliability analysis contains a wealth of human expertise that is used to build automatic diagnostic routines and provides a knowledge base that can be used to solve other artificial intelligence problems.
ERIC Educational Resources Information Center
Biçer, Nursat
2017-01-01
The aim of the present study is to determine, through meta-analysis, the influence of student-centered methods employed in Turkish language instruction on the academic success of students. To this end, a literature review was conducted on the relevant studies published between 2000 and 2016 in order to determine which studies were suitable for the…
Dynamic analysis environment for nuclear forensic analyses
NASA Astrophysics Data System (ADS)
Stork, C. L.; Ummel, C. C.; Stuart, D. S.; Bodily, S.; Goldblum, B. L.
2017-01-01
A Dynamic Analysis Environment (DAE) software package is introduced to facilitate group inclusion/exclusion method testing, evaluation and comparison for pre-detonation nuclear forensics applications. Employing DAE, the multivariate signatures of a questioned material can be compared to the signatures for different, known groups, enabling the linking of the questioned material to its potential process, location, or fabrication facility. Advantages of using DAE for group inclusion/exclusion include built-in query tools for retrieving data of interest from a database, the recording and documentation of all analysis steps, a clear visualization of the analysis steps intelligible to a non-expert, and the ability to integrate analysis tools developed in different programming languages. Two group inclusion/exclusion methods are implemented in DAE: principal component analysis, a parametric feature extraction method, and k nearest neighbors, a nonparametric pattern recognition method. Spent Fuel Isotopic Composition (SFCOMPO), an open source international database of isotopic compositions for spent nuclear fuels (SNF) from 14 reactors, is used to construct PCA and KNN models for known reactor groups, and 20 simulated SNF samples are utilized in evaluating the performance of these group inclusion/exclusion models. For all 20 simulated samples, PCA in conjunction with the Q statistic correctly excludes a large percentage of reactor groups and correctly includes the true reactor of origination. Employing KNN, 14 of the 20 simulated samples are classified to their true reactor of origination.
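The two group inclusion/exclusion steps described above can be sketched in a few lines of numpy. The isotopic signatures, group sizes, and neighbor count below are invented for illustration (the paper uses SFCOMPO data and DAE's own tooling); PCA is computed via SVD and the Q statistic is taken as the squared reconstruction residual outside the retained subspace:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated 5-feature "isotopic signatures" for two known reactor groups.
group_a = rng.normal(0.0, 0.1, size=(30, 5))
group_b = rng.normal(1.0, 0.1, size=(30, 5))

# --- PCA on one known group (via SVD), keeping 2 principal components ---
mu = group_a.mean(axis=0)
U, S, Vt = np.linalg.svd(group_a - mu, full_matrices=False)
P = Vt[:2]                      # 2 x 5 loading matrix

def q_statistic(x):
    """Squared reconstruction residual of x outside the PCA subspace;
    a large Q excludes x from the modeled group."""
    r = (x - mu) - (x - mu) @ P.T @ P
    return float(r @ r)

member = group_a[0]             # drawn from the modeled group
outsider = group_b[0]           # should be excluded from group A
assert q_statistic(member) < q_statistic(outsider)

# --- kNN: assign a questioned sample to the nearest known group ---
X = np.vstack([group_a, group_b])
y = np.array([0] * 30 + [1] * 30)
d = np.linalg.norm(X - outsider, axis=1)
pred = int(np.bincount(y[np.argsort(d)[:5]]).argmax())  # majority of 5 nearest
```

The parametric model (PCA with Q) supports exclusion decisions, while the nonparametric kNN assigns the questioned sample to a known group, mirroring the two roles the abstract describes.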
NASA Astrophysics Data System (ADS)
Uchidate, M.
2018-09-01
In this study, with the aim of establishing systematic knowledge of the impact of summit extraction methods and stochastic model selection in rough contact analysis, the contact area ratio (Ar/Aa) obtained by statistical contact models with different summit extraction methods was compared with a direct simulation using the boundary element method (BEM). Fifty areal topography datasets with different autocorrelation functions in terms of the power index and correlation length were used for the investigation. The non-causal 2D auto-regressive model, which can generate datasets with specified parameters, was employed in this research. Three summit extraction methods were examined: Nayak's theory, 8-point analysis, and watershed segmentation. With regard to the stochastic model, Bhushan's model and the BGT (Bush-Gibson-Thomas) model were applied. The values of Ar/Aa from the stochastic models tended to be smaller than those from BEM. The discrepancy between Bhushan's model with the 8-point analysis and BEM was slightly smaller than with Nayak's theory. The results with the watershed segmentation were similar to those with the 8-point analysis. The impact of Wolf pruning on the discrepancy between the stochastic analysis and BEM was not very clear. In the case of the BGT model, which employs surface gradients, good quantitative agreement with BEM was obtained when Nayak's bandwidth parameter was large.
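As a rough illustration of the 8-point analysis mentioned above, a summit can be taken as an interior grid point strictly higher than all eight of its neighbours. This minimal numpy sketch uses a tiny synthetic surface (the grid and heights are invented, not one of the paper's fifty datasets):

```python
import numpy as np

def summits_8point(z):
    """Boolean mask of interior points strictly higher than all eight
    neighbours (the '8-point analysis' summit definition)."""
    c = z[1:-1, 1:-1]
    mask = np.ones_like(c, dtype=bool)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == 0 and dj == 0:
                continue
            # Shifted view of the neighbour at offset (di, dj).
            mask &= c > z[1 + di : z.shape[0] - 1 + di,
                          1 + dj : z.shape[1] - 1 + dj]
    out = np.zeros_like(z, dtype=bool)
    out[1:-1, 1:-1] = mask      # boundary points are never summits
    return out

# Tiny synthetic surface with a single clear peak at (2, 2).
z = np.zeros((5, 5))
z[2, 2] = 1.0
peaks = summits_8point(z)
```

Summit heights and curvatures extracted this way feed the asperity statistics used by the stochastic contact models.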
Recruitment, Job Search, and the United States Employment Service. Volume II: Tables and Methods.
ERIC Educational Resources Information Center
Camil Associates, Inc., Philadelphia, PA.
This volume contains the appendixes to Volume I of the report on recruitment, job search, and the United States Employment Service in 20 middle-sized American cities. Appendix A contains 165 pages of tables. Appendix B (63 pages) contains details of sample design, data analysis, and estimate precision under the categories of: Overview of the study…
ERIC Educational Resources Information Center
McAllister, Jan; Collier, Jacqueline; Shepstone, Lee
2012-01-01
Purpose: In interview and survey studies, people who stutter report the belief that stuttering has had a negative impact on their own education and employment. This population study sought objective evidence of such disadvantage for people who stutter as a group, compared with people who do not stutter. Method: A secondary analysis of a British…
Hilton, Gillean; Unsworth, Carolyn A; Stuckey, Ruth; Murphy, Gregory C
2018-01-01
The vocational potential of people with spinal cord injury (SCI) is unrealised, with rates of employment substantially lower than the labour force participation of the general population and pre-injury employment rates. The study aimed to understand the experiences and pathways of people achieving employment outcomes after traumatic spinal cord injury by: classifying participants into employment outcome groups (stable, unstable, and without employment); identifying pre- and post-injury pathways for participants in each group; and exploring people's experiences of seeking, gaining and maintaining employment. Thirty-one participants were interviewed. A mixed methods approach was used, including interpretive phenomenological analysis and vocational pathway mapping of quantitative data. The most common pathway identified was from study and work pre-injury to stable employment post-injury. Four super-ordinate themes were identified from the interpretive phenomenological analysis: expectations of work, system impacts, worker identity and social supports. Implications for clinical practice include fostering cultural change, strategies for system navigation, promotion of worker identity and optimal use of social supports. The findings increase insight into and understanding of the complex experience of employment after spinal cord injury. There is opportunity to guide experimental research, policy development and education concerning the complexity of the return-to-work experience and the factors that influence pathways.
What does 'race' have to do with medical education research?
Muzzin, Linda; Mickleborough, Tim
2013-08-01
We live in a world of ethnoracial conflict. This is confirmed every day by opening and reading the newspaper. This everyday world seems far away in the pages of a medical education journal, but is it? The goal of this paper is to suggest that one need not look very far in medical education to encounter ethnoracial issues, and further, that research methods that are not ethnoracially biased must be employed to study these topics. We will draw attention to the relevance of employing an ethical conceptual approach to research involving 'race' by demonstrating how one author researching internationally educated health professionals has put 'race' front and centre in his analysis. He does this by using a postcolonial method of analysis termed a 'doubled-research' technique that sets up categories such as 'race' but then decolonizes them to avoid essentialism or stereotyping. We compare this method to another mainstream method employed for the same topic of inquiry, which has sidelined 'race' in the analysis, potentially hiding findings about ethnoracial relations involving health professionals in our 'multicultural' society. This demonstration leads to the important question of whether research methods can be epistemologically racist, a question that has been raised about conventional research on education in general. Our argument is not meant to be the last word on this topic, but the first in this journal. We conclude that there is an internal ethics or axiology within research perspectives and methodologies that needs to be examined where ethnoracial issues are prominent. The use of mainstream approaches to undertake research can unintentionally 'leave unsaid' central aspects of what is researched, while antiracist methods such as the one described in this article can open up the data to allow for a richer and deeper understanding of the problem. © 2013 John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Bailey, Charles-James N.
The author aims: (1) to show that generative phonology uses essentially the method of internal reconstruction which has previously been employed only in diachronic studies in setting up synchronic underlying phonological representations; (2) to show why synchronic analysis should add the comparative method to its arsenal, together with whatever…
Spatial analysis on future housing markets: economic development and housing implications.
Liu, Xin; Wang, Lizhe
2014-01-01
A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand.
Spatial Analysis on Future Housing Markets: Economic Development and Housing Implications
Liu, Xin; Wang, Lizhe
2014-01-01
A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand. PMID:24892097
Cagliero, Cecilia; Ho, Tien D; Zhang, Cheng; Bicchi, Carlo; Anderson, Jared L
2016-06-03
This study describes a simple and rapid sampling method employing a polymeric ionic liquid (PIL) sorbent coating in direct immersion solid-phase microextraction (SPME) for the trace-level analysis of acrylamide in brewed coffee and coffee powder. The crosslinked PIL sorbent coating demonstrated superior sensitivity in the extraction of acrylamide compared to all commercially available SPME coatings. A spin coating method was developed to evenly distribute the PIL coating on the SPME support and reproducibly produce fibers with a large film thickness. Ninhydrin was employed as a quenching reagent during extraction to inhibit the production of interfering acrylamide. The PIL fiber produced a limit of quantitation for acrylamide of 10 μg/L and achieved comparable results to the ISO method in the analysis of six coffee powder samples. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
1984-01-01
The atmospheric backscatter coefficient, beta, measured with an airborne CO Laser Doppler Velocimeter (LDV) system operating in a continuous-wave, focused mode is discussed. The Single Particle Mode (SPM) algorithm was developed from concept through analysis of an extensive amount of data obtained with the system on board a NASA aircraft. The SPM algorithm is intended to be employed in situations where one particle at a time appears in the sensitive volume of the LDV. In addition to giving the backscatter coefficient, the SPM algorithm also produces, as intermediate results, the aerosol density and the aerosol backscatter cross-section distribution. A second method, which measures only the atmospheric backscatter coefficient, is called the Volume Mode (VM) and was employed simultaneously. The results of the two methods differed by slightly less than an order of magnitude. The measurement uncertainties or other errors in the results of the two methods are examined.
Methods for the analysis of azo dyes employed in food industry--A review.
Yamjala, Karthik; Nainar, Meyyanathan Subramania; Ramisetti, Nageswara Rao
2016-02-01
A wide variety of azo dyes are generally added for coloring food products not only to make them visually aesthetic but also to reinstate the original appearance lost during the production process. However, many countries in the world have banned the use of most of the azo dyes in food and their usage is highly regulated by domestic and export food supplies. The regulatory authorities and food analysts adopt highly sensitive and selective analytical methods for monitoring as well as assuring the quality and safety of food products. The present manuscript presents a comprehensive review of various analytical techniques used in the analysis of azo dyes employed in food industries of different parts of the world. A brief description on the use of different extraction methods such as liquid-liquid, solid phase and membrane extraction has also been presented. Copyright © 2015 Elsevier Ltd. All rights reserved.
Mega-Analysis of School Psychology Blueprint for Training and Practice Domains
ERIC Educational Resources Information Center
Burns, Matthew K.; Kanive, Rebecca; Zaslofsky, Anne F.; Parker, David C.
2013-01-01
Meta-analytic research is an effective method for synthesizing existing research and for informing practice and policy. Hattie (2009) suggested that meta-analytic procedures could be applied to existing meta-analyses to create a mega-analysis. The current mega-analysis examined a sample of 47 meta-analyses according to the "School…
A Historical Analysis of Primary Mathematics Curricula in Terms of Teaching Principles
ERIC Educational Resources Information Center
Ozmantar, Mehmet Fatih
2017-01-01
This study carries out a comparative analysis of primary mathematics curricula put into practice during Turkish Republican period. The data for this study are composed of official curricula documents which are examined in terms of teaching principles. The study adopts a qualitative approach and employs document analysis method. The official…
Microvascular Autonomic Composites
2012-01-06
thermogravimetric analysis (TGA) was employed. The double wall allowed for increased thermal stability of the microcapsules, which was...fluorescent nanoparticles (Berfield et al. 2006). Digital Image Correlation (DIC) is a data analysis method which applies a mathematical...
NASA Astrophysics Data System (ADS)
Wasilah, S.; Fahmyddin, T.
2018-03-01
The employment of structural equation modeling (SEM) in research has attracted increasing attention among researchers in the built environment. There is a gap in understanding the attributes, application, and importance of this approach to data analysis in built environment studies. This paper intends to provide a fundamental comprehension of the SEM method in data analysis, unveiling its attributes, employment and significance, and presents cases assessing associations among variables and constructs. The study uses key literature to grasp the essence of SEM with regard to built environment research. Better acknowledgment of this analytical tool may assist researchers in the built environment in analyzing data under complex research questions and in testing multivariate models in a single study.
DETERMINING BERYLLIUM IN DRINKING WATER BY GRAPHITE FURNACE ATOMIC ABSORPTION SPECTROSCOPY
A direct graphite furnace atomic absorption spectroscopy method for the analysis of beryllium in drinking water has been derived from a method for determining beryllium in urine. Ammonium phosphomolybdate and ascorbic acid were employed as matrix modifiers. The matrix modifiers s...
Detrended fluctuation analysis for major depressive disorder.
Mumtaz, Wajid; Malik, Aamir Saeed; Ali, Syed Saad Azhar; Yasin, Mohd Azhar Mohd; Amin, Hafeezullah
2015-01-01
The clinical utility of Electroencephalography (EEG)-based diagnostic studies is less clear for major depressive disorder (MDD). In this paper, a novel machine learning (ML) scheme is presented to discriminate MDD patients from healthy controls. The proposed method inherently involves feature extraction, selection, classification and validation. The EEG data acquisition involved eyes-closed (EC) and eyes-open (EO) conditions. At the feature extraction stage, detrended fluctuation analysis (DFA) was performed on the EEG data to obtain scaling exponents. The DFA was performed to analyze the presence or absence of long-range temporal correlations (LRTC) in the recorded EEG data. The scaling exponents were used as input features to the proposed system. At the feature selection stage, 3 different techniques were used for comparison purposes. A logistic regression (LR) classifier was employed. The method was validated by 10-fold cross-validation. As a result, we observed the effect of 3 different reference montages on the computed features. The results show that the DFA performed better on LE data compared with the IR and AR data. In addition, in the Wilcoxon ranking, AR performed better than LE and IR. Based on the results, it was concluded that the DFA provided useful information to discriminate MDD patients and, with further validation, can be employed in clinics for the diagnosis of MDD.
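For readers unfamiliar with DFA, the scaling exponent used as a feature above is the slope of log F(n) versus log n, where F(n) is the RMS fluctuation after per-window linear detrending of the integrated signal. A minimal sketch on synthetic data follows; the signal lengths and window sizes are invented and this is not the paper's EEG pipeline:

```python
import numpy as np

def dfa_exponent(x, scales):
    """DFA scaling exponent: slope of log F(n) vs log n, where F(n) is
    the RMS fluctuation after linear detrending in windows of size n."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for n in scales:
        m = len(y) // n
        segs = y[: m * n].reshape(m, n)    # non-overlapping windows
        t = np.arange(n)
        ms = []
        for seg in segs:
            a, b = np.polyfit(t, seg, 1)   # linear trend per window
            ms.append(np.mean((seg - (a * t + b)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

rng = np.random.default_rng(1)
white = rng.normal(size=4000)
scales = [16, 32, 64, 128, 256]
alpha_white = dfa_exponent(white, scales)            # ~0.5: no LRTC
alpha_brown = dfa_exponent(np.cumsum(white), scales) # ~1.5: strong correlations
```

Uncorrelated noise yields an exponent near 0.5 while integrated (Brownian-like) signals yield values near 1.5, which is why the exponent serves as an LRTC marker.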
NASA Astrophysics Data System (ADS)
Oh, Han Bin; Leach, Franklin E.; Arungundram, Sailaja; Al-Mafraji, Kanar; Venot, Andre; Boons, Geert-Jan; Amster, I. Jonathan
2011-03-01
The structural characterization of glycosaminoglycan (GAG) carbohydrates by mass spectrometry has been a long-standing analytical challenge due to the inherent heterogeneity of these biomolecules, specifically polydispersity, variability in sulfation, and hexuronic acid stereochemistry. Recent advances in tandem mass spectrometry methods employing threshold and electron-based ion activation have resulted in the ability to determine the location of the labile sulfate modification as well as assign the stereochemistry of hexuronic acid residues. To facilitate the analysis of complex electron detachment dissociation (EDD) spectra, principal component analysis (PCA) is employed to differentiate the hexuronic acid stereochemistry of four synthetic GAG epimers whose EDD spectra are nearly identical upon visual inspection. For comparison, PCA is also applied to infrared multiphoton dissociation spectra (IRMPD) of the examined epimers. To assess the applicability of multivariate methods in GAG mixture analysis, PCA is utilized to identify the relative content of two epimers in a binary mixture.
Ren, Jingzheng
2018-01-01
The objective of this study is to develop a generic multi-attribute decision analysis framework for ranking technologies for ballast water treatment and determining their grades. An evaluation criteria system consisting of eight criteria in four categories was used to evaluate the technologies for ballast water treatment. The Best-Worst method, a subjective weighting method, and the Criteria Importance Through Inter-criteria Correlation (CRITIC) method, an objective weighting method, were combined to determine the weights of the evaluation criteria. Extension theory was employed to prioritize the technologies for ballast water treatment and determine their grades. An illustrative case including four technologies for ballast water treatment, i.e. Alfa Laval (T1), Hyde (T2), Unitor (T3), and NaOH (T4), was studied by the proposed method, and Hyde (T2) was recognized as the best technology. Sensitivity analysis was also carried out to investigate the effects of the combined coefficients and the weights of the evaluation criteria on the final priority order of the four technologies for ballast water treatment. The weighted sum method and TOPSIS were also employed to rank the four technologies, and the results determined by these two methods are consistent with those determined by the proposed method. Copyright © 2017 Elsevier Ltd. All rights reserved.
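The weight-combination and weighted-sum ranking steps mentioned above can be sketched as follows. The decision matrix, the subjective weights, and the 50/50 combination coefficient are all invented for illustration; the paper's actual criteria scores, Best-Worst weights, and extension-theory grading are not reproduced here:

```python
import numpy as np

# Hypothetical decision matrix: 4 treatment technologies x 3 benefit criteria.
A = np.array([
    [7.0, 6.0, 5.0],   # T1
    [8.0, 8.0, 7.0],   # T2
    [6.0, 7.0, 6.0],   # T3
    [5.0, 5.0, 8.0],   # T4
])

w_subj = np.array([0.5, 0.3, 0.2])     # e.g. elicited via a Best-Worst survey
# Objective weights in the CRITIC spirit: criteria that discriminate more
# between alternatives (higher dispersion) get more weight.
w_obj = A.std(axis=0) / A.std(axis=0).sum()
w = 0.5 * w_subj + 0.5 * w_obj         # combined weights (coefficient invented)
w /= w.sum()

scores = (A / A.max(axis=0)) @ w       # normalized weighted-sum score
best = int(np.argmax(scores))          # index 1 -> T2
```

A full CRITIC weighting would also use inter-criteria correlations, and the paper additionally grades alternatives via extension theory; this sketch only shows how subjective and objective weights can be blended before a simple ranking.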
Bageshwar, Deepak; Khanvilkar, Vineeta; Kadam, Vilasrao
2011-01-01
A specific, precise and stability indicating high-performance thin-layer chromatographic method for simultaneous estimation of pantoprazole sodium and itopride hydrochloride in pharmaceutical formulations was developed and validated. The method employed TLC aluminium plates precoated with silica gel 60F254 as the stationary phase. The solvent system consisted of methanol:water:ammonium acetate; 4.0:1.0:0.5 (v/v/v). This system was found to give compact and dense spots for both itopride hydrochloride (Rf value of 0.55±0.02) and pantoprazole sodium (Rf value of 0.85±0.04). Densitometric analysis of both drugs was carried out in the reflectance–absorbance mode at 289 nm. The linear regression analysis data for the calibration plots showed a good linear relationship with R2=0.9988±0.0012 in the concentration range of 100–400 ng for pantoprazole sodium. Also, the linear regression analysis data for the calibration plots showed a good linear relationship with R2=0.9990±0.0008 in the concentration range of 200–1200 ng for itopride hydrochloride. The method was validated for specificity, precision, robustness and recovery. Statistical analysis proves that the method is repeatable and selective for the estimation of both the said drugs. As the method could effectively separate the drug from its degradation products, it can be employed as a stability indicating method. PMID:29403710
Bageshwar, Deepak; Khanvilkar, Vineeta; Kadam, Vilasrao
2011-11-01
A specific, precise and stability indicating high-performance thin-layer chromatographic method for simultaneous estimation of pantoprazole sodium and itopride hydrochloride in pharmaceutical formulations was developed and validated. The method employed TLC aluminium plates precoated with silica gel 60F254 as the stationary phase. The solvent system consisted of methanol:water:ammonium acetate; 4.0:1.0:0.5 (v/v/v). This system was found to give compact and dense spots for both itopride hydrochloride (Rf value of 0.55±0.02) and pantoprazole sodium (Rf value of 0.85±0.04). Densitometric analysis of both drugs was carried out in the reflectance-absorbance mode at 289 nm. The linear regression analysis data for the calibration plots showed a good linear relationship with R2=0.9988±0.0012 in the concentration range of 100-400 ng for pantoprazole sodium. Also, the linear regression analysis data for the calibration plots showed a good linear relationship with R2=0.9990±0.0008 in the concentration range of 200-1200 ng for itopride hydrochloride. The method was validated for specificity, precision, robustness and recovery. Statistical analysis proves that the method is repeatable and selective for the estimation of both the said drugs. As the method could effectively separate the drug from its degradation products, it can be employed as a stability indicating method.
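As a generic illustration of the calibration-plot step reported above, the slope, intercept, and coefficient of determination R2 follow from an ordinary least-squares fit of peak area against amount applied. The amounts and areas below are invented, not the paper's data:

```python
import numpy as np

# Hypothetical calibration points: amount applied per band (ng) vs. peak
# area (arbitrary densitometric units).
amount = np.array([100, 150, 200, 250, 300, 350, 400], dtype=float)
area = np.array([410, 615, 790, 1010, 1195, 1420, 1600], dtype=float)

slope, intercept = np.polyfit(amount, area, 1)   # linear calibration
pred = slope * amount + intercept
ss_res = np.sum((area - pred) ** 2)              # residual sum of squares
ss_tot = np.sum((area - area.mean()) ** 2)       # total sum of squares
r2 = 1 - ss_res / ss_tot                         # coefficient of determination
```

An R2 close to 1 over the working range is what justifies quoting values such as 0.9988 for the calibration plots.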
Analysis method for Thomson scattering diagnostics in GAMMA 10/PDX.
Ohta, K; Yoshikawa, M; Yasuhara, R; Chikatsu, M; Shima, Y; Kohagura, J; Sakamoto, M; Nakasima, Y; Imai, T; Ichimura, M; Yamada, I; Funaba, H; Minami, T
2016-11-01
We have developed an analysis method to improve the accuracies of electron temperature measurement by employing a fitting technique for the raw Thomson scattering (TS) signals. Least square fitting of the raw TS signals enabled reduction of the error in the electron temperature measurement. We applied the analysis method to a multi-pass (MP) TS system. Because the interval between the MPTS signals is very short, it is difficult to separately analyze each Thomson scattering signal intensity by using the raw signals. We used the fitting method to obtain the original TS scattering signals from the measured raw MPTS signals to obtain the electron temperatures in each pass.
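The idea of disentangling closely spaced multi-pass signals by fitting can be sketched as a linear least-squares problem when the pulse shape and pass timings are known: one column of the design matrix per pass, solved for the amplitudes. All waveform parameters below are invented and this is not the authors' actual fitting procedure:

```python
import numpy as np

t = np.linspace(0, 100, 2001)                    # time axis, ns
shape = lambda t0: np.exp(-0.5 * ((t - t0) / 2.0) ** 2)  # assumed pulse shape

centers = [30.0, 38.0, 46.0]                     # pass timings (known)
true_amps = np.array([1.0, 0.6, 0.35])           # per-pass scattering intensity
rng = np.random.default_rng(2)
raw = sum(a * shape(c) for a, c in zip(true_amps, centers))
raw += rng.normal(0, 0.01, t.size)               # simulated detector noise

# Design matrix: one column per pass; least squares recovers the amplitudes
# even when the pulses partially overlap in the raw trace.
G = np.column_stack([shape(c) for c in centers])
amps, *_ = np.linalg.lstsq(G, raw, rcond=None)
```

Separating the per-pass intensities this way is what allows an electron temperature to be assigned to each pass of a multi-pass system.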
Active controls: A look at analytical methods and associated tools
NASA Technical Reports Server (NTRS)
Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.
1984-01-01
A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.
An advanced approach for computer modeling and prototyping of the human tooth.
Chang, Kuang-Hua; Magdum, Sheetalkumar; Khera, Satish C; Goel, Vijay K
2003-05-01
This paper presents a systematic and practical method for constructing accurate computer and physical models that can be employed for the study of human tooth mechanics. The proposed method starts with a histological section preparation of a human tooth. Through tracing outlines of the tooth on the sections, discrete points are obtained and are employed to construct B-spline curves that represent the exterior contours and dentino-enamel junction (DEJ) of the tooth using a least square curve fitting technique. The surface skinning technique is then employed to quilt the B-spline curves to create a smooth boundary and DEJ of the tooth using B-spline surfaces. These surfaces are respectively imported into SolidWorks via its application protocol interface to create solid models. The solid models are then imported into Pro/MECHANICA Structure for finite element analysis (FEA). The major advantage of the proposed method is that it first generates smooth solid models, instead of finite element models in discretized form. As a result, a more advanced p-FEA can be employed for structural analysis, which usually provides superior results to traditional h-FEA. In addition, the solid model constructed is smooth and can be fabricated with various scales using the solid freeform fabrication technology. This method is especially useful in supporting bioengineering applications, where the shape of the object is usually complicated. A human maxillary second molar is presented to illustrate and demonstrate the proposed method. Note that both the solid and p-FEA models of the molar are presented. However, comparison between p- and h-FEA models is out of the scope of the paper.
Category's analysis and operational project capacity method of transformation in design
NASA Astrophysics Data System (ADS)
Obednina, S. V.; Bystrova, T. Y.
2015-10-01
The method of transformation is attracting widespread interest in fields such as contemporary design. However, in design theory little attention has been paid to the categorical status of the term "transformation". This paper presents a conceptual analysis of transformation based on the theory of form employed in the influential essays of Aristotle and Thomas Aquinas. In the present work, transformation as a method of shaping in design is explored, and the potential application of this term in design is demonstrated.
ERIC Educational Resources Information Center
Zhuoxin, Liang
2013-01-01
The research has chosen some migrant workers of new generation from different fields in F City to investigate their current situation of vocational education. The research reveals that their education is helpful, mainly in employment, work and the methods of mentorship and further study organized by employers. The research also reveals its…
Analysis of the glow curve of SrB4O7:Dy compounds employing the GOT model
NASA Astrophysics Data System (ADS)
Ortega, F.; Molina, P.; Santiago, M.; Spano, F.; Lester, M.; Caselli, E.
2006-02-01
The glow curve of SrB4O7:Dy phosphors has been analysed with the general one trap (GOT) model. To solve the differential equation describing the GOT model, a novel algorithm has been employed that significantly reduces the deconvolution time with respect to the time required by usual integration algorithms, such as the Runge-Kutta method.
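The GOT kinetics can be sketched numerically as below. This is a hedged illustration using a standard stiff ODE solver rather than the paper's faster algorithm, and all trap parameters are illustrative, not those of SrB4O7:Dy.

```python
import numpy as np
from scipy.integrate import solve_ivp

kB = 8.617e-5          # Boltzmann constant, eV/K
E, s = 1.0, 1.0e12     # illustrative trap depth (eV) and frequency factor (1/s)
N, R = 1.0e10, 0.1     # illustrative trap concentration and retrapping ratio
beta = 1.0             # linear heating rate, K/s
n0 = 1.0e9             # initial trapped-charge concentration

def p(T):
    return (s / beta) * np.exp(-E / (kB * T))

def dndT(T, n):
    # GOT kinetics: dn/dT = -p(T) * n^2 / ((N - n) * R + n)
    return [-p(T) * n[0] ** 2 / ((N - n[0]) * R + n[0])]

T = np.linspace(300.0, 600.0, 301)
sol = solve_ivp(dndT, (T[0], T[-1]), [n0], t_eval=T, method="Radau")
n = sol.y[0]
intensity = p(T) * n**2 / ((N - n) * R + n)   # glow intensity, -beta * dn/dt
peak_T = T[np.argmax(intensity)]
```

An implicit solver ("Radau") is chosen because the emptying rate grows exponentially with temperature, which makes the equation stiff near the peak.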
Systems and methods for analyzing building operations sensor data
Mezic, Igor; Eisenhower, Bryan A.
2015-05-26
Systems and methods are disclosed for analyzing building sensor information and decomposing the information therein into a more manageable and more useful form. Certain embodiments integrate energy-based and spectral-based analysis methods with parameter sampling and uncertainty/sensitivity analysis to achieve a more comprehensive perspective of building behavior. The results of this analysis may be presented to a user via a plurality of visualizations and/or used to automatically adjust certain building operations. In certain embodiments, advanced spectral techniques, including Koopman-based operations, are employed to discern features from the collected building sensor data.
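A minimal sketch of a Koopman-style spectral analysis is dynamic mode decomposition (DMD), which approximates the Koopman operator's spectrum from snapshot data. The toy two-channel data below merely stand in for building sensor streams; the patent's actual pipeline is not reproduced here.

```python
import numpy as np

# Hidden linear dynamics generating the "sensor" snapshots (illustrative).
A = np.array([[0.9, -0.2],
              [0.2,  0.9]])
x = np.empty((2, 61))
x[:, 0] = [1.0, 0.0]
for k in range(60):
    x[:, k + 1] = A @ x[:, k]

# DMD: pair the snapshots, project the one-step map onto the POD basis.
X, Y = x[:, :-1], x[:, 1:]
U, S, Vt = np.linalg.svd(X, full_matrices=False)
Atilde = U.T @ Y @ Vt.T @ np.diag(1.0 / S)   # projected linear operator
eigvals = np.linalg.eigvals(Atilde)          # DMD (Koopman) eigenvalues
```

For data generated by a linear map, the DMD eigenvalues recover the map's eigenvalues (here 0.9 ± 0.2i), whose modulus and angle encode decay rate and oscillation frequency of each mode.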
An approximate methods approach to probabilistic structural analysis
NASA Technical Reports Server (NTRS)
Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.
1989-01-01
A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.
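As a reference baseline for the kind of probability calculation PSAM approximates, a crude Monte Carlo estimate of a failure probability for a simple limit state g = R - S (capacity minus load) can be sketched and compared with the exact normal-theory result. FPI itself is a faster approximate scheme; all statistics below are illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
muR, sdR = 10.0, 1.0     # illustrative capacity mean and std. dev.
muS, sdS = 6.0, 1.5      # illustrative load mean and std. dev.

# Brute-force Monte Carlo on the limit state g = R - S; failure when g < 0.
n = 200_000
g = rng.normal(muR, sdR, n) - rng.normal(muS, sdS, n)
pf_mc = np.mean(g < 0.0)

# Exact result for normal R and S, via the reliability index.
beta_idx = (muR - muS) / np.hypot(sdR, sdS)
pf_exact = norm.cdf(-beta_idx)
```

The point of FPI-style methods is to reach the same tail probability with orders of magnitude fewer limit-state evaluations than the sampling loop above.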
Tidal frequency estimation for closed basins
NASA Technical Reports Server (NTRS)
Eades, J. B., Jr.
1978-01-01
A method was developed for determining the fundamental tidal frequencies of closed basins of water by means of an eigenvalue analysis. The mathematical model employed was the Laplace tidal equations.
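As a toy analogue of such an eigenvalue analysis, the free-oscillation (seiche) frequencies of an idealized 1-D closed basin can be computed from a discrete Laplacian and checked against the Merian formula; the basin parameters are illustrative, and the full Laplace tidal equations are of course richer than this shallow-water caricature.

```python
import numpy as np

g, H, L = 9.81, 50.0, 1.0e5      # gravity, depth (m), basin length (m)
c = np.sqrt(g * H)               # shallow-water wave speed

# Closed (no-flow) ends give a Neumann discrete Laplacian for the modes.
n = 400
h = L / n
main = 2.0 * np.ones(n)
main[0] = main[-1] = 1.0
Lap = (np.diag(main) - np.diag(np.ones(n - 1), 1)
       - np.diag(np.ones(n - 1), -1)) / h**2

w2 = np.sort(np.linalg.eigvalsh(c**2 * Lap))   # squared eigenfrequencies
omega1 = np.sqrt(w2[1])          # first nonzero mode (w2[0] is the constant mode)
omega1_exact = np.pi * c / L     # Merian formula, omega_n = n*pi*c/L
```

With these numbers the gravest seiche period is 2L/c, roughly 2.5 hours, and the discrete eigenfrequency matches the analytic one to well under a tenth of a percent.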
Computational performance of Free Mesh Method applied to continuum mechanics problems
YAGAWA, Genki
2011-01-01
The free mesh method (FMM) is a meshless method intended for particle-like finite element analysis of problems that are difficult to handle using global mesh generation; it is a node-based finite element method that employs a local mesh generation technique and a node-by-node algorithm. The aim of the present paper is to review some unique numerical solutions in fluid and solid mechanics obtained by employing FMM as well as the enriched free mesh method (EFMM), a new version of FMM. Applications to fluid mechanics include compressible flow and the sounding mechanism in air-reed instruments; applications to solid mechanics include automatic remeshing for slow crack growth, the dynamic behavior of solids, and large-scale eigenfrequency analysis of an engine block. PMID:21558753
Han, Yang; Hou, Shao-Yang; Ji, Shang-Zhi; Cheng, Juan; Zhang, Meng-Yue; He, Li-Juan; Ye, Xiang-Zhong; Li, Yi-Min; Zhang, Yi-Xuan
2017-11-15
A novel method, real-time reverse transcription PCR (real-time RT-PCR) coupled with probe-melting curve analysis, has been established to detect two kinds of samples within one fluorescence channel. Besides a conventional TaqMan probe, this method employs a specially designed melting-probe with a 5' terminus modification that carries the same fluorescent label. By using an asymmetric PCR method, the melting-probe is able to detect an extra sample in the melting stage while having little influence on the amplification detection. Thus, this method allows both the amplification stage and the melting stage to be employed for detecting samples in one reaction. A further demonstration, the simultaneous detection of human immunodeficiency virus (HIV) and hepatitis C virus (HCV) in one channel, is presented in this article as a model system. The sensitivity of detection by real-time RT-PCR coupled with probe-melting analysis proved equal to that of conventional real-time RT-PCR. Because real-time RT-PCR coupled with probe-melting analysis can double the detection throughput within one fluorescence channel, it is expected to be a good solution to the problem of low throughput in current real-time PCR. Copyright © 2017 Elsevier Inc. All rights reserved.
A drinking water method for 12 chemicals, predominately pesticides, is presented that addresses the occurrence monitoring needs of the U.S. Environmental Protection Agency (EPA) for a future Unregulated Contaminant Monitoring Regulation (UCMR). The method employs solid phase ext...
ERIC Educational Resources Information Center
Konold, Timothy R.; Glutting, Joseph J.
2008-01-01
This study employed a correlated trait-correlated method application of confirmatory factor analysis to disentangle trait and method variance from measures of attention-deficit/hyperactivity disorder obtained at the college level. The two trait factors were "Diagnostic and Statistical Manual of Mental Disorders-Fourth Edition" ("DSM-IV")…
A drinking water method for seven pesticides and pesticide degradates is presented that addresses the occurrence monitoring needs of the US Environmental Protection Agency (EPA) for a future Unregulated Contaminant Monitoring Regulation (UCMR). The method employs online solid pha...
NASA Technical Reports Server (NTRS)
1994-01-01
General Purpose Boundary Element Solution Technology (GPBEST) software employs the boundary element method of mechanical engineering analysis, as opposed to finite element. It is, according to one of its developers, 10 times faster in data preparation and more accurate than other methods. Its use results in less expensive products because the time between design and manufacturing is shortened. A commercial derivative of a NASA-developed computer code, it is marketed by Best Corporation to solve problems in stress analysis, heat transfer, fluid analysis and yielding and cracking of solids. Other applications include designing tractor and auto parts, household appliances and acoustic analysis.
Assessment of environmental impacts part one. Intervention analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hipel, Keith William; Lettenmaier, Dennis P.; McLeod, A. Ian
The use of intervention analysis as a statistical method of gauging the effects of environmental changes is discussed. The Box-Jenkins model serves as the basis for the intervention analysis methodology. Environmental studies of the Aswan Dam, the South Saskatchewan River, and a forest fire near the Pipers Hole River, Canada, are included as case studies in which intervention analysis was employed. Methods of data collection for intervention analysis are found to have a significant impact on model reliability; effective data collection processes for the Box-Jenkins model are provided. (15 graphs, 27 references, 2 tables)
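A minimal sketch of step-intervention estimation is shown below. A full Box-Jenkins transfer-function analysis is richer than this toy AR(1) regression, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n, t0 = 500, 250
phi, omega = 0.5, 2.0                      # AR coefficient and true step effect
step = (np.arange(n) >= t0).astype(float)  # intervention indicator I_t

# Simulate an AR(1) series with a step intervention at t0.
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + omega * step[t] + rng.normal(0.0, 0.3)

# Regress y_t on [1, y_{t-1}, I_t] to estimate the intervention effect.
X = np.column_stack([np.ones(n - 1), y[:-1], step[1:]])
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
omega_hat = coef[2]                        # estimated step effect
```

The estimated long-run level shift is omega_hat / (1 - coef[1]), which is the quantity an environmental-impact study would typically report.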
Zhang, Chao; Jia, Pengli; Yu, Liu; Xu, Chang
2018-05-01
Dose-response meta-analysis (DRMA) is widely applied to investigate the dose-specific relationship between independent and dependent variables. Such methods have been in use for over 30 years and are increasingly employed in healthcare and clinical decision-making. In this article, we give an overview of the methodology used in DRMA, summarizing the commonly used regression models and pooling methods, and we use an example to illustrate how to conduct a DRMA with these methods. Five regression models (linear regression, piecewise regression, natural polynomial regression, fractional polynomial regression, and restricted cubic spline regression) are illustrated for fitting the dose-response relationship, and two pooling approaches, the one-stage approach and the two-stage approach, are illustrated for pooling the dose-response relationship across studies. The example showed similar results among these models. Several dose-response meta-analysis methods can be used for investigating the relationship between exposure level and the risk of an outcome; however, the methodology of DRMA still needs to be improved. © 2018 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
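The two-stage approach can be sketched as below: estimate a linear dose-response slope within each study, then pool the slopes by inverse-variance (fixed-effect) weighting. This is a simplified illustration (it ignores the within-study correlation of log relative risks handled by methods such as Greenland-Longnecker), and the study data are invented.

```python
import numpy as np

# Each study: doses, log relative risks (reference dose has log RR = 0),
# and variances of the log RRs. All numbers are illustrative.
studies = [
    (np.array([0., 10., 20.]), np.array([0., 0.18, 0.42]), np.array([1e-4, 4e-3, 6e-3])),
    (np.array([0., 15., 30.]), np.array([0., 0.33, 0.58]), np.array([1e-4, 5e-3, 5e-3])),
    (np.array([0., 12., 25.]), np.array([0., 0.20, 0.55]), np.array([1e-4, 4e-3, 7e-3])),
]

# Stage 1: weighted least-squares slope through the reference dose
# (no intercept, since log RR = 0 at dose 0 by construction).
slopes, variances = [], []
for dose, logrr, var in studies:
    w = 1.0 / var
    b = np.sum(w * dose * logrr) / np.sum(w * dose**2)
    slopes.append(b)
    variances.append(1.0 / np.sum(w * dose**2))

# Stage 2: inverse-variance (fixed-effect) pooling across studies.
w = 1.0 / np.array(variances)
pooled = np.sum(w * np.array(slopes)) / np.sum(w)
```

The pooled slope is the fixed-effect estimate of the log relative risk per unit dose; a one-stage approach would instead fit all study data in a single hierarchical model.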
ERIC Educational Resources Information Center
Doerann-George, Judith
The Integrated Moving Average (IMA) model of time series, and the analysis of intervention effects based on it, assume random shocks which are normally distributed. To determine the robustness of the analysis to violations of this assumption, empirical sampling methods were employed. Samples were generated from three populations: normal,…
Exploring Advertising in Higher Education: An Empirical Analysis in North America, Europe, and Japan
ERIC Educational Resources Information Center
Papadimitriou, Antigoni; Blanco Ramírez, Gerardo
2015-01-01
This empirical study explores higher education advertising campaigns displayed in five world cities: Boston, New York, Oslo, Tokyo, and Toronto. The study follows a mixed-methods research design relying on content analysis and multimodal semiotic analysis and employs a conceptual framework based on the knowledge triangle of education, research,…
Silva, Arlene S; Brandao, Geovani C; Matos, Geraldo D; Ferreira, Sergio L C
2015-11-01
The present work proposes an analytical method for the direct determination of chromium in infant formulas employing high-resolution continuum source electrothermal atomic absorption spectrometry combined with solid sample analysis (SS-HR-CS ET AAS). Sample masses up to 2.0 mg were directly weighed on a solid sampling platform and introduced into the graphite tube. In order to minimize the formation of carbonaceous residues and to improve the contact of the modifier solution with the solid sample, a volume of 10 µL of a solution containing 6% (v/v) H2O2, 20% (v/v) ethanol and 1% (v/v) HNO3 was added. The pyrolysis and atomization temperatures established were 1600 and 2400 °C, respectively, using magnesium as chemical modifier. The calibration technique was evaluated by comparing the slopes of calibration curves established using aqueous and solid standards. This test revealed that chromium can be determined employing the external calibration technique with aqueous standards. Under these conditions, the method developed allows the direct determination of chromium with a limit of quantification of 11.5 ng g(-1), precision expressed as relative standard deviation (RSD) in the range of 4.0-17.9% (n=3) and a characteristic mass of 1.2 pg of chromium. The accuracy was confirmed by analysis of a certified reference material of tomato leaves furnished by the National Institute of Standards and Technology. The proposed method was applied for the determination of chromium in five different infant formula samples. The chromium content found varied in the range of 33.9-58.1 ng g(-1) (n=3). These samples were also analyzed employing ICP-MS. A statistical test demonstrated that there is no significant difference between the results found by the two methods. The chromium concentrations found are lower than the maximum limit permitted for chromium in foods by Brazilian legislation. Copyright © 2015. Published by Elsevier B.V.
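The slope-comparison calibration check can be sketched as follows; the calibration points below are invented for illustration and are not the paper's data.

```python
import numpy as np

# Hypothetical calibration data: absorbance vs. chromium mass for
# aqueous standards and for solid standards (illustrative values only).
mass = np.array([0.0, 5.0, 10.0, 20.0, 40.0])          # pg of Cr
abs_aqueous = 0.050 * mass + np.array([0.000, 0.010, -0.010, 0.020, -0.020])
abs_solid = 0.051 * mass + np.array([0.000, -0.020, 0.010, -0.010, 0.020])

# Fit straight calibration lines and compare their slopes.
slope_aq = np.polyfit(mass, abs_aqueous, 1)[0]
slope_sd = np.polyfit(mass, abs_solid, 1)[0]
rel_diff = abs(slope_aq - slope_sd) / slope_aq
# Slopes agreeing within a few percent support external calibration
# against aqueous standards, as concluded in the abstract above.
```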
Non-destructive evaluation method employing dielectric electrostatic ultrasonic transducers
NASA Technical Reports Server (NTRS)
Yost, William T. (Inventor); Cantrell, Jr., John H. (Inventor)
2003-01-01
An acoustic nonlinearity parameter (β) measurement method and system for Non-Destructive Evaluation (NDE) of materials and structural members employs a novel loosely mounted dielectric electrostatic ultrasonic transducer (DEUT) to receive and convert ultrasonic energy into an electrical signal which can be analyzed to determine the β of the test material. The dielectric material is ferroelectric with a high dielectric constant ε. A computer-controlled measurement system coupled to the DEUT contains an excitation signal generator section and a measurement and analysis section. As a result, the DEUT measures the absolute particle displacement amplitudes in the test material, leading to derivation of the nonlinearity parameter (β) without the costly, low field reliability methods of the prior art.
Numerical Investigation of Laminar-Turbulent Transition in a Flat Plate Wake
1990-03-02
[Fragmented references and front matter: the report cites "Difference Methods" (Oxford University Press) and Swarztrauber, P. N. (1977), "The Methods of Cyclic Reduction, Fourier Analysis and the FACR Algorithm." The discretization acts in the streamwise and transverse directions; for the temporal discretization, a combination of ADI, Crank-Nicolson, and Adams-Bashforth methods is employed, with a Fourier spectral approximation in the spanwise direction.]
ERIC Educational Resources Information Center
Rukundo, Aloysius; Kibanja, Grace; Steffens, Karl
2014-01-01
Introduction: Psychoactive substance use among adolescents influences behavioral and cognitive processes and is associated with adolescents' performance in school. We therefore sought to investigate association of PASU with adolescents' school performance. Methods: We employed quantitative methods of data collection and analysis. To test the…
Implementation of radiation shielding calculation methods. Volume 2: Seminar/Workshop notes
NASA Technical Reports Server (NTRS)
Capo, M. A.; Disney, R. K.
1971-01-01
Detailed descriptions are presented of the input data for each of the MSFC computer codes applied to the analysis of a realistic nuclear propelled vehicle. The analytical techniques employed include cross-section data preparation, one- and two-dimensional discrete ordinates transport, point kernel, and single-scatter methods.
Razalas' Grouping Method and Mathematics Achievement
ERIC Educational Resources Information Center
Salazar, Douglas A.
2015-01-01
This study aimed to raise the achievement level of students in Integral Calculus using Direct Instruction with Razalas' Method of Grouping. The study employed qualitative and quantitative analysis relative to data generated by the Achievement Test and Math journal with follow-up interview. Within the framework of the limitations of the study, the…
Child-Parent Interventions for Childhood Anxiety Disorders: A Systematic Review and Meta-Analysis
ERIC Educational Resources Information Center
Brendel, Kristen Esposito; Maynard, Brandy R.
2014-01-01
Objective: This study compared the effects of direct child-parent interventions to the effects of child-focused interventions on anxiety outcomes for children with anxiety disorders. Method: Systematic review methods and meta-analytic techniques were employed. Eight randomized controlled trials examining effects of family cognitive behavior…
IMMUNOASSAY METHODS FOR MEASURING ATRAZINE AND 3,5,6-TRICHLORO-2-PYRIDINOL IN FOODS
This chapter describes the use of enzyme-linked immunosorbent assay (ELISA) methods for the analysis of two potential environmental contaminants in food sample media, atrazine and 3,5,6-trichloro-2-pyridinol (3,5,6-TCP). Two different immunoassay formats are employed: a magnetic...
[Predictors of employment intention for mentally disabled persons].
Han, Sang-Sook; Han, Jeong Hye; Yun, Eun Kyoung
2008-08-01
This study was conducted to determine the predictors of employment intention for mentally disabled persons. Mentally disabled persons who had participated in rehabilitation programs in one of 16 mental health centers and 9 community rehabilitation centers located in Seoul and Kyunggi province were recruited for this study. A random sampling method was used, and 414 respondents were included in the final analysis. Data were analyzed by Pearson's correlation and stepwise multiple regression using SPSS Win 14.0. The predictors influencing employment intention of mentally disabled persons were employment desire (beta=.48), guardian's expectation (beta=.26), professional's support (beta=.23), financial management (beta=.10), eating habits (beta=.07), and quality of life (beta=-.01). The six factors explained 61.1% of employment intention. The employment intention of a mentally disabled person was influenced by employment desire, diet self-efficacy, guardian's expectation, professional's support, quality of life, financial management and eating habits.
Viewpoints on Factors for Successful Employment for Adults with Autism Spectrum Disorder
2015-01-01
This article explores the key factors for successful employment from the viewpoints of adults with autism spectrum disorder (ASD) and employers. Two groups of individuals participated in this study, 40 adults with ASD and 35 employers. Q method was used to understand and contrast the viewpoints of the two groups. Data were analysed using by-person varimax rotation factor analysis. Results showed that although both groups appear committed to the employment process, the difference in their understanding regarding the type of workplace support required, job expectations and productivity requirements continues to hinder successful employment. These results highlight the need to facilitate communication between employees and employers to ensure the needs of both groups are clearly understood and met. The use of an ASD-specific workplace tool may assist in facilitating the necessary communication between these two groups. PMID:26462234
NASA Technical Reports Server (NTRS)
Bechtold, I. C. (Principal Investigator); Liggett, M. L.; Childs, J. F.
1973-01-01
There are no author-identified significant results in this report. Research progress in applications of ERTS-1 MSS imagery in study of Basin-Range tectonics is summarized. Field reconnaissance of ERTS-1 image anomalies has resulted in recognition of previously unreported fault zones and regional structural control of volcanic and plutonic activity. NIMBUS, Apollo 9, X-15, U-2, and SLAR imagery are discussed with specific applications, and methods of image enhancement and analysis employed in the research are summarized. Areas studied and methods employed in geologic field work are outlined.
NASA Technical Reports Server (NTRS)
Liggett, M. A.; Childs, J. F.
1973-01-01
The author has identified the following significant results. Research progress in applications of ERTS-1 MSS imagery to study of Basin-Range tectonics is summarized. Field reconnaissance of ERTS-1 image anomalies has resulted in recognition of previously unreported fault zones and regional structural control of volcanic and plutonic activity. Nimbus, Apollo 9, X-15, U-2, and SLAR imagery are discussed with specific applications, and methods of image enhancement and analysis employed in the research are summarized. Field areas studied and methods employed in geologic field work are outlined.
Analysis of Publications and Citations from a Geophysics Research Institute.
ERIC Educational Resources Information Center
Frohlich, Cliff; Resler, Lynn
2001-01-01
Performs an analysis of all 1128 publications produced by scientists during their employment at the University of Texas Institute for Geophysics, thus assessing research performance using bibliometric indicators such as publications per year, citations per paper, and cited half-lives. Evaluates five different methods for determining…
State-Space Analysis of Working Memory in Schizophrenia: An FBIRN Study
ERIC Educational Resources Information Center
Janoos, Firdaus; Brown, Gregory; Morocz, Istvan A.; Wells, William M., III
2013-01-01
The neural correlates of "working memory" (WM) in schizophrenia (SZ) have been extensively studied using the multisite fMRI data acquired by the Functional Biomedical Informatics Research Network (fBIRN) consortium. Although univariate and multivariate analysis methods have been variously employed to localize brain responses under differing task…
Free Mesh Method: fundamental conception, algorithms and accuracy study
YAGAWA, Genki
2011-01-01
The finite element method (FEM) has been commonly employed in a variety of fields as a computer simulation method to solve problems in solid and fluid mechanics, electromagnetic phenomena and so on. However, creation of a quality mesh for the problem domain is a prerequisite when using FEM, which becomes a major part of the cost of a simulation. It is natural, then, that the concept of the meshless method has evolved. The free mesh method (FMM) is among the typical meshless methods intended for particle-like finite element analysis of problems that are difficult to handle using global mesh generation, especially on parallel processors. FMM is an efficient node-based finite element method that employs a local mesh generation technique and a node-by-node algorithm for the finite element calculations. In this paper, FMM and its variations are reviewed, focusing on their fundamental conception, algorithms and accuracy. PMID:21558752
A Test Generation Framework for Distributed Fault-Tolerant Algorithms
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn; Bushnell, David; Miner, Paul; Pasareanu, Corina S.
2009-01-01
Heavyweight formal methods such as theorem proving have been successfully applied to the analysis of safety critical fault-tolerant systems. Typically, the models and proofs performed during such analysis do not inform the testing process of actual implementations. We propose a framework for generating test vectors from specifications written in the Prototype Verification System (PVS). The methodology uses a translator to produce a Java prototype from a PVS specification. Symbolic (Java) PathFinder is then employed to generate a collection of test cases. A small example is employed to illustrate how the framework can be used in practice.
Radar fall detection using principal component analysis
NASA Astrophysics Data System (ADS)
Jokanovic, Branka; Amin, Moeness; Ahmad, Fauzia; Boashash, Boualem
2016-05-01
Falls are a major cause of fatal and nonfatal injuries in people aged 65 years and older. Radar has the potential to become one of the leading technologies for fall detection, thereby enabling the elderly to live independently. Existing techniques for fall detection using radar are based on manual feature extraction and require significant parameter tuning in order to provide successful detections. In this paper, we employ principal component analysis for fall detection, wherein eigenimages of observed motions are used for classification. Using real data, we demonstrate that the PCA-based technique provides performance improvement over conventional feature extraction methods.
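A minimal sketch of PCA-based classification in the spirit described above: project flattened motion "images" onto principal components and classify by nearest class centroid in the reduced space. The synthetic vectors below merely stand in for radar time-frequency images.

```python
import numpy as np

rng = np.random.default_rng(3)
d, n = 100, 40                     # feature dimension, samples per class

# Two synthetic motion prototypes ("fall" vs. "walk") plus noise.
fall_proto = rng.normal(size=d)
walk_proto = rng.normal(size=d)
X = np.vstack([fall_proto + 0.3 * rng.normal(size=(n, d)),
               walk_proto + 0.3 * rng.normal(size=(n, d))])
labels = np.array([0] * n + [1] * n)

# PCA via SVD of the mean-centred data; keep the top 5 components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:5].T                  # scores in eigenimage space

# Nearest-centroid classification in PC space.
centroids = np.array([Z[labels == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((Z[:, None, :] - centroids) ** 2).sum(-1), axis=1)
accuracy = (pred == labels).mean()
```

Real radar data would of course require the classifier to be evaluated on held-out samples rather than the training set, as done here only for brevity.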
Abbas, Mohsin
2015-01-01
Background The present study aimed to analyze the index value trends of injured employed persons (IEPs) covered in Pakistan Labour Force Surveys from 2001–02 to 2012–13. Methods The index value method based on reference years and reference groups was used to analyze the IEP trends in terms of different criteria such as gender, area, employment status, industry type, occupational group, type of injury, injured body part, and treatment received. Pearson correlation coefficient analysis was also performed to investigate the inter-relationships of different occupational variables. Results The values of IEP increased by the end of the studied period in industry divisions such as agriculture, forestry, hunting, and fishing, followed by the manufacturing and construction divisions. People in major occupations (such as skilled agricultural and fishery workers) and elementary (unskilled) occupations were found to be at an increasing risk of occupational injuries/diseases, with an increasing IEP trend. Occupational injuries such as sprain or strain, superficial injury, and dislocation increased during the studied years. The major injured body parts, the upper and lower limbs, showed an increasing trend, while the types of treatment received, including hospitalization and no treatment, decreased. The increase in IEP can be attributed to inadequate health care facilities, especially in rural areas, as indicated by the Pearson correlation coefficient analysis of IEP in terms of gender, area, treatment received, occupational group, and employment status. Conclusion The increasing trend in IEP% of total employed persons engaged in agrarian activities shows that there is a need to improve health care setups in rural areas of Pakistan. PMID:26929831
Insausti, Matías; Gomes, Adriano A; Cruz, Fernanda V; Pistonesi, Marcelo F; Araujo, Mario C U; Galvão, Roberto K H; Pereira, Claudete F; Band, Beatriz S F
2012-08-15
This paper investigates the use of UV-vis, near infrared (NIR) and synchronous fluorescence (SF) spectrometries coupled with multivariate classification methods to discriminate biodiesel samples with respect to the base oil employed in their production. More specifically, the present work extends previous studies by investigating the discrimination of corn-based biodiesel from two other biodiesel types (sunflower and soybean). Two classification methods are compared, namely full-spectrum SIMCA (soft independent modelling of class analogies) and SPA-LDA (linear discriminant analysis with variables selected by the successive projections algorithm). Regardless of the spectrometric technique employed, full-spectrum SIMCA did not provide an appropriate discrimination of the three biodiesel types. In contrast, all samples were correctly classified on the basis of a reduced number of wavelengths selected by SPA-LDA. It can be concluded that UV-vis, NIR and SF spectrometries can be successfully employed to discriminate corn-based biodiesel from the two other biodiesel types, but wavelength selection by SPA-LDA is key to the proper separation of the classes. Copyright © 2012 Elsevier B.V. All rights reserved.
Working with Persistent Pain: An Exploration of Strategies Utilised to Stay Productive at Work.
Oakman, Jodi; Kinsman, Natasha; Briggs, Andrew M
2017-03-01
Purpose Maintaining productive employment for people with persistent pain conditions is challenging. This study aims to explore the supports, both work and non-work, used by employees to assist them in maintaining productive employment. Methods An exploratory, mixed-methods study comprising a questionnaire battery followed by semi-structured interviews to collect in-depth qualitative data was undertaken. The questionnaires measured descriptive variables used to select participants for interviews based on maximum heterogeneity sampling. Thirty-five semi-structured interviews were undertaken (14 males; 21 females). The interview schedule covered: employment situation, workplace challenges, workplace supports, coping strategies, motivations, future employment options and any other resources utilised. Inductive content analysis was undertaken using a grounded theory approach to systematically explore the data. Results Three key themes were identified: barriers to working productively, enablers to working productively, and disclosing one's condition at work. A key determinant of maintaining productive employment was a supportive employer. In addition, flexibility in the work organisation was pivotal in maintaining sustainable, productive employment. An important issue emerged with regard to disclosure of one's condition to an employer; for some, this was a significant barrier to employment. Conclusions To ensure sustainable employment is attainable for those with persistent pain conditions, a good match is required between an employee and their work. Workplace accommodations may assist with improving job fit, but this requires disclosure of a condition to an employer. Weighing up the risks and benefits of disclosure is difficult, and may be assisted by knowledge of the supports available for maintaining ongoing employment.
ERIC Educational Resources Information Center
Temel, Senar
2016-01-01
This study aims to analyse prospective chemistry teachers' cognitive structures related to the subject of oxidation and reduction through a flow map method. Purposeful sampling method was employed in this study, and 8 prospective chemistry teachers from a group of students who had taken general chemistry and analytical chemistry courses were…
ERIC Educational Resources Information Center
Usher, Wayne
2011-01-01
Introduction: To identify health website recommendation trends by Gold Coast (Australia) general practitioners (GPs) to their patients. Method: A mixed method approach to data collection and analysis was employed. Quantitative data were collected using a prepaid postal survey, consisting of 17 questions, mailed to 250 (61 per cent) of 410 GPs on…
Boundary element analysis of post-tensioned slabs
NASA Astrophysics Data System (ADS)
Rashed, Youssef F.
2015-06-01
In this paper, the boundary element method is applied to carry out the structural analysis of post-tensioned flat slabs. The shear-deformable plate-bending model is employed. The effect of the pre-stressing cables is taken into account via the equivalent load method. The formulation is automated using a computer program, which uses quadratic boundary elements. Verification samples are presented, and finally a practical application is analyzed where results are compared against those obtained from the finite element method. The proposed method is efficient in terms of computer storage and processing time as well as the ease in data input and modifications.
Lam, Lun Tak; Sun, Yi; Davey, Neil; Adams, Rod; Prapopoulou, Maria; Brown, Marc B; Moss, Gary P
2010-06-01
The aim was to employ Gaussian processes to assess mathematically the nature of a skin permeability dataset and to employ these methods, particularly feature selection, to determine the key physicochemical descriptors which exert the most significant influence on percutaneous absorption, and to compare such models with established existing models. Gaussian processes, including automatic relevance determination (GPRARD) methods, were employed to develop models of percutaneous absorption that identified key physicochemical descriptors of percutaneous absorption. Using MatLab software, the statistical performance of these models was compared with single linear networks (SLN) and quantitative structure-permeability relationships (QSPRs). Feature selection methods were used to examine in more detail the physicochemical parameters used in this study. A range of statistical measures to determine model quality were used. The inherently nonlinear nature of the skin data set was confirmed. The Gaussian process regression (GPR) methods yielded predictive models that offered statistically significant improvements over SLN and QSPR models with regard to predictivity (where the rank order was: GPR > SLN > QSPR). Feature selection analysis determined that the best GPR models were those that contained log P, melting point and the number of hydrogen bond donor groups as significant descriptors. Further statistical analysis also found that great synergy existed between certain parameters. It suggested that a number of the descriptors employed were effectively interchangeable, thus questioning the use of models where discrete variables are output, usually in the form of an equation. The use of a nonlinear GPR method produced models with significantly improved predictivity, compared with SLN or QSPR models. Feature selection methods were able to provide important mechanistic information.
However, it was also shown that significant synergy existed between certain parameters, and as such it was possible to interchange certain descriptors (i.e. molecular weight and melting point) without incurring a loss of model quality. Such synergy suggested that a model constructed from discrete terms in an equation may not be the most appropriate way of representing mechanistic understandings of skin absorption.
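The ARD-style modelling described above can be sketched in a few lines. The snippet below is a minimal illustration using scikit-learn rather than the study's MATLAB implementation, with synthetic data standing in for the three influential descriptors (log P, melting point, hydrogen-bond donor count); all values are invented.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic stand-in for a skin-permeability dataset: three descriptors
# (log P, melting point, H-bond donor count) -> log kp. Illustrative only.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))
y = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.2 * X[:, 2] + 0.1 * rng.normal(size=60)

# An anisotropic RBF kernel fits one length scale per descriptor, which is
# the automatic relevance determination mechanism: descriptors that matter
# little end up with large fitted length scales.
kernel = RBF(length_scale=[1.0, 1.0, 1.0]) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

length_scales = gpr.kernel_.k1.length_scale
print(length_scales)  # smaller length scale => more influential descriptor
```

Inspecting the fitted length scales gives the same kind of feature-relevance reading the abstract describes, without committing to a discrete equation.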
NASA Astrophysics Data System (ADS)
Kim, Jeong-Man; Koo, Min-Mo; Jeong, Jae-Hoon; Hong, Keyyong; Cho, Il-Hyoung; Choi, Jang-Young
2017-05-01
This paper reports the design and analysis of a tubular permanent magnet linear generator (TPMLG) for a small-scale wave-energy converter. The analytical field computation is performed by applying a magnetic vector potential and a 2-D analytical model to determine design parameters. Based on analytical solutions, parametric analysis is performed to meet the design specifications of a wave-energy converter (WEC). Then, 2-D FEA is employed to validate the analytical method. Finally, the experimental result confirms the predictions of the analytical and finite element analysis (FEA) methods under regular and irregular wave conditions.
An overview of the mathematical and statistical analysis component of RICIS
NASA Technical Reports Server (NTRS)
Hallum, Cecil R.
1987-01-01
Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.
Estimating short-run and long-run interaction mechanisms in interictal state.
Ozkaya, Ata; Korürek, Mehmet
2010-04-01
We address the issue of analyzing electroencephalogram (EEG) recordings from seizure patients in order to test, model and determine the statistical properties that distinguish between EEG states (interictal, pre-ictal, ictal) by introducing a new class of time series analysis methods. In the present study, we first employ statistical methods to determine the non-stationary behavior of focal interictal epileptiform series within very short time intervals; second, for such intervals that are deemed non-stationary we suggest Autoregressive Integrated Moving Average (ARIMA) process modelling, well known in time series analysis. We finally address the question of causal relationships between epileptic states and between brain areas during epileptiform activity. We estimate the interaction between different EEG series (channels) in short time intervals by performing Granger-causality analysis, and estimate such interaction in long time intervals by employing cointegration analysis; both methods are well known in econometrics. Here we find, first, that the causal relationship between neuronal assemblies can be identified according to the duration and the direction of their possible mutual influences; second, that although the estimated bidirectional causality in short time intervals indicates that the neuronal ensembles positively affect each other, in long time intervals neither of them is affected (increasing amplitudes) by this relationship. Moreover, cointegration analysis of the EEG series enables us to identify whether there is a causal link from the interictal state to the ictal state.
Screening Workers: An Examination and Analysis of Practice and Public Policy.
ERIC Educational Resources Information Center
Greenfield, Patricia A.; And Others
1989-01-01
Discusses methods of screening job applicants and issues raised by screening procedures. Includes legal ramifications, current practices in Britain and the United States, future directions, and the employment interview. (JOW)
Oldfield, Margaret; MacEachen, Ellen; MacNeill, Margaret; Kirsh, Bonnie
2018-06-01
Background Advice on fibromyalgia, a chronic illness primarily affecting women, often presents it as incompatible with work and rarely covers how to remain employed. Yet many women do. Objectives We aimed to understand how these women, their family members, and workmates portrayed employees with fibromyalgia, and how these portrayals helped women retain employment. Methods We interviewed 22 participants, comprising five triads and three dyads of people who knew each other. Using the methodology of critical discourse analysis, we analysed the interview data within and across the triads/dyads through coding, narrative summaries, and relational mapping. Results Participants reported stereotypes that employees with fibromyalgia are lazy, malingering, and less productive than healthy workers. Countering these assumptions, participants portrayed the women as normal, valuable employees who did not 'give in' to their illness. The portrayals drew on two discourses, normalcy and mind-controlling-the-body, and a related narrative, overcoming disability. We propose that participants' portrayals helped women manage their identities in competitive workplaces and thereby remain employed. Discussion Our findings augment the very sparse literature on employment with fibromyalgia. Using a new approach, critical discourse analysis, we expand on known job-retention strategies and add the perspectives of two key stakeholders: family members and workmates.
NASA Technical Reports Server (NTRS)
Bratanow, T.; Ecer, A.
1973-01-01
A general computational method for analyzing unsteady flow around pitching and plunging airfoils was developed. The finite element method was applied in developing an efficient numerical procedure for the solution of equations describing the flow around airfoils. The numerical results were employed in conjunction with computer graphics techniques to produce visualization of the flow. The investigation involved mathematical model studies of flow in two phases: (1) analysis of a potential flow formulation and (2) analysis of an incompressible, unsteady, viscous flow from Navier-Stokes equations.
[Methods of quantitative proteomics].
Kopylov, A T; Zgoda, V G
2007-01-01
In modern science, proteomic analysis is inseparable from other fields of systems biology. With its huge resources, quantitative proteomics handles colossal amounts of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structure and functional homology of proteins, molecular diagnostics, etc. More than 40 methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use various isotope labels (tags). In this review we consider the most popular and effective methods, employing both chemical modifications of proteins and metabolic and enzymatic methods of isotope labeling.
Review of Hull Structural Monitoring Systems for Navy Ships
2013-05-01
generally based on the same basic form of S-N curve, different correction methods are used by the various classification societies. ii. Methods for...Likewise there are a number of different methods employed for temperature compensation and these vary depending on the type of gauge, although typically...Analysis, Inc.[30] Figure 8. Examples of different methods of temperature compensation of fibre-optic strain sensors. It is noted in NATO
Lima, Manoel J A; Reis, Boaventura F
2017-03-01
This paper describes an environmentally friendly procedure for the determination of losartan potassium (Los-K) in pharmaceuticals. The photometric method was based on the light-scattering effect of a particle suspension formed by the reaction of Los-K with Cu(II) ions. The method was automated employing a multicommuted flow analysis approach, implemented using solenoid mini-pumps for fluid propelling and a homemade LED-based photometer. Under the optimized experimental conditions, the procedure showed a linear relationship in the concentration range of 23.2-417.6 mg L-1 (r = 0.9997, n = 6), a relative standard deviation of 1.61% (n = 10), a limit of detection (3.3σ) estimated to be 12.1 mg L-1, and a sampling rate of 140 determinations per hour. Each determination consumed 12 µg of copper(II) acetate and generated 0.54 mL of waste. Copyright © 2016 Elsevier B.V. All rights reserved.
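The figures of merit quoted above (linearity r and a 3.3σ-based detection limit) come from a standard calibration-curve workup, which can be sketched as follows. The absorbance readings below are invented for illustration; only the concentration range is taken from the abstract.

```python
import numpy as np

# Hypothetical readings over the reported linear range (the signal values
# are invented for illustration, not the paper's data).
conc = np.array([23.2, 100.0, 180.0, 260.0, 340.0, 417.6])     # mg L-1
noise = np.array([0.0008, -0.0005, 0.0006, -0.0007, 0.0004, -0.0006])
signal = 0.0021 * conc + 0.004 + noise

slope, intercept = np.polyfit(conc, signal, 1)
resid = signal - (slope * conc + intercept)
sigma = resid.std(ddof=2)            # residual standard deviation of the fit
lod = 3.3 * sigma / slope            # ICH-style 3.3*sigma/slope estimate
r = np.corrcoef(conc, signal)[0, 1]
print(f"r = {r:.4f}, LOD = {lod:.1f} mg/L")
```

The same arithmetic, applied to the real instrument readings, yields the r and LOD values reported in the record.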
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; D'Costa, Joseph F.
1991-01-01
This paper describes the evaluation of mixed implicit-explicit finite element formulations for hyperbolic heat conduction problems involving non-Fourier effects. In particular, mixed implicit-explicit formulations employing the alpha method proposed by Hughes et al. (1987, 1990) are described for the numerical simulation of hyperbolic heat conduction models, which involve time-dependent relaxation effects. Existing analytical approaches for the modeling/analysis of such models involve complex mathematical formulations for obtaining closed-form solutions, while certain numerical formulations suffer from severely oscillatory solution behavior (which often disguises the true response) in the vicinity of the thermal disturbances, which propagate with finite velocities. In view of these factors, the alpha method is evaluated to assess the control of the amount of numerical dissipation for predicting the transient propagating thermal disturbances. Numerical test models are presented, and pertinent conclusions are drawn for the mixed-time integration simulation of hyperbolic heat conduction models involving non-Fourier effects.
Droplet microfluidics with magnetic beads: a new tool to investigate drug-protein interactions.
Lombardi, Dario; Dittrich, Petra S
2011-01-01
In this study, we give the proof of concept for a method to determine binding constants of compounds in solution. By implementing a technique based on magnetic beads with a microfluidic device for segmented flow generation, we demonstrate, for individual droplets, fast, robust and complete separation of the magnetic beads. The beads are used as a carrier for one binding partner and hence, any bound molecule is separated likewise, while the segmentation into small microdroplets ensures fast mixing, and opens future prospects for droplet-wise analysis of drug candidate libraries. We employ the method for characterization of drug-protein binding, here warfarin to human serum albumin. The approach lays the basis for a microfluidic droplet-based screening device aimed at investigating the interactions of drugs with specific targets including enzymes and cells. Furthermore, the continuous method could be employed for various applications, such as binding assays, kinetic studies, and single cell analysis, in which rapid removal of a reactive component is required.
Effectiveness of Individual Placement and Support Supported Employment for Young Adults
Bond, Gary R.; Drake, Robert E.; Campbell, Kikuko
2015-01-01
Objective The Individual Placement and Support (IPS) model of supported employment was first developed in community mental health centers for adults with severe mental illness. While IPS is an established evidence-based practice in this broad population, evidence on its effectiveness focused specifically on young adults has been limited. The current study aimed to address this gap. Methods To investigate the effects of IPS on young adults, the authors conducted a secondary analysis on a pooled sample of 109 unemployed young adults (under age 30) from four randomized controlled trials employing a common research protocol that included a standardized measurement battery and rigorous fidelity monitoring. Researchers assessed these participants over 18 months on nine competitive employment outcome measures. Results On all measures the IPS group had significantly better employment outcomes. Overall, 40 (82%) of IPS participants obtained employment during follow-up compared to 25 (42%) of control participants, χ² = 17.9, p < .001. IPS participants averaged 25.0 weeks of employment, compared to 7.0 weeks for control participants, t = 4.50, p < .001. Conclusions The current analysis supports a small number of previous studies in showing that IPS is highly effective in helping young adults to attain competitive employment. When young adults acquire competitive jobs and initiate a path toward normal adult roles, they may avoid the cycle of disability and psychiatric patient roles that are demeaning and demoralizing. PMID:25138195
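The reported χ² = 17.9 can be reproduced from the abstract's counts. The group sizes below are inferred from the stated percentages (40/49 ≈ 82%, 25/60 ≈ 42%, pooled n = 109), so the table is a reconstruction rather than published raw data.

```python
from scipy.stats import chi2_contingency

# Counts inferred from the abstract's percentages; reconstruction only.
table = [[40, 9],    # IPS: employed, not employed during follow-up
         [25, 35]]   # control: employed, not employed

# Pearson chi-square without Yates continuity correction on the 2x2 table.
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.1f}, p = {p:.2g}")  # reproduces the reported 17.9
```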
Correlational Analysis of Servant Leadership and School Climate
ERIC Educational Resources Information Center
Black, Glenda Lee
2010-01-01
The purpose of this mixed-method research study was to determine the extent that servant leadership was correlated with perceptions of school climate to identify whether there was a relationship between principals' and teachers' perceived practice of servant leadership and of school climate. The study employed a mixed-method approach by first…
1980-10-01
reported using the method of Gentzkow (1942), which involves conversion of urea to ammonia with urease and measurement of the ammonia by...Nesslerization. Methods employing urease are not well suited for automated analysis since an incubation time of about 20 minutes is required for the conversion of
ERIC Educational Resources Information Center
Potter, Penny F.; Graham-Moore, Brian E.
Most organizations planning to assess adverse impact or perform a stock analysis for affirmative action planning must correctly classify their jobs into appropriate occupational categories. Two methods of job classification were assessed in a combination archival and field study. Classification results from expert judgment of functional job…
NASA Technical Reports Server (NTRS)
An, S. H.; Yao, K.
1986-01-01
Lattice algorithms have been employed in numerous adaptive filtering applications such as speech analysis/synthesis, noise canceling, spectral analysis, and channel equalization. In this paper their application to adaptive-array processing is discussed. The advantages are a fast convergence rate and computational accuracy independent of the noise and interference conditions. The results produced by this technique are compared to those obtained by the direct matrix inverse method.
Barnes, Stephen; Benton, H. Paul; Casazza, Krista; Cooper, Sara; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H.; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K.; Renfrow, Matthew B.; Tiwari, Hemant K.
2017-01-01
Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites, and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. PMID:28239968
ERIC Educational Resources Information Center
Morozov, Andrew; Kilgore, Deborah; Atman, Cynthia
2007-01-01
In this study, the authors used two methods for analyzing expert data: verbal protocol analysis (VPA) and narrative analysis. VPA has been effectively used to describe the design processes employed by engineering students, expert designers, and expert-novice comparative research. VPA involves asking participants to "think aloud" while…
ERIC Educational Resources Information Center
Embrey, Karen K.
2012-01-01
Cognitive task analysis (CTA) is a knowledge elicitation technique employed for acquiring expertise from domain specialists to support the effective instruction of novices. CTA guided instruction has proven effective in improving surgical skills training for medical students and surgical residents. The standard, current method of teaching clinical…
Language and Nutrition (Mis)Information: Food Labels, FDA Policies and Meaning
ERIC Educational Resources Information Center
Taylor, Christy Marie
2013-01-01
In this dissertation, I address the ways in which food manufacturers can exploit the often vague and ambiguous nature of FDA policies concerning language and images used on food labels. Employing qualitative analysis methods (Strauss, 1987; Denzin and Lincoln, 2003; Mackey and Gass, 2005) that drew upon critical discourse analysis (Fairclough,…
USDA-ARS?s Scientific Manuscript database
A ‘dilute-and-shoot’ method for vitamin D and triacylglycerols is demonstrated that employed four mass spectrometers, operating in different ionization modes, for a ‘quadruple parallel mass spectrometry’ analysis, plus three other detectors, for seven detectors overall. Sets of five samples of diet...
ERIC Educational Resources Information Center
Yook, Cheongmin; Lee, Yong-hun
2016-01-01
This study employed qualitative data collection and analysis methods to investigate the influence of English as a foreign language teacher education programme on Korean teachers' classroom teaching practices. Six in-service secondary-school teachers participated in semi-structured interviews. Thematic analysis was applied to the data collected…
NASA Astrophysics Data System (ADS)
Felipe-Sesé, Luis; López-Alba, Elías; Siegmann, Philip; Díaz, Francisco A.
2016-12-01
A low-cost approach for three-dimensional (3-D) full-field displacement measurement is applied to the analysis of the large displacements involved in two different mechanical events. The method is based on a combination of fringe projection and two-dimensional digital image correlation (DIC) techniques. The two techniques are employed simultaneously using an RGB camera and a color-encoding method; therefore, it is possible to measure in-plane and out-of-plane displacements at the same time with only one camera, even at high acquisition rates. The potential of the proposed methodology has been demonstrated in the analysis of large displacements during contact experiments on a soft material block. Displacement results have been successfully compared with those obtained using a 3D-DIC commercial system. Moreover, the analysis of displacements during an impact test on a metal plate was performed to emphasize the application of the methodology to dynamic events. Results show a good level of agreement, highlighting the potential of FP + 2D-DIC as a low-cost alternative for the analysis of large-deformation problems.
Huang, Weiqing; Fan, Hongbo; Qiu, Yongfu; Cheng, Zhiyu; Xu, Pingru; Qian, Yu
2016-05-01
Recently, China has frequently experienced large-scale, severe and persistent haze pollution due to surging urbanization and industrialization and rapid growth in the number of motor vehicles and in energy consumption. Vehicle emissions due to the consumption of large amounts of fossil fuels are no doubt a critical factor in haze pollution. This work focuses on the causation mechanism of haze pollution related to vehicle emissions in Guangzhou city, employing the Fault Tree Analysis (FTA) method for the first time. With the establishment of the fault tree system "Haze weather-Vehicle exhaust explosive emission", all of the important risk factors are discussed and identified using this deductive FTA method. Qualitative and quantitative assessments of the fault tree system are carried out based on the structure, probability and critical importance degree analysis of the risk factors. The study may provide a new, simple and effective tool/strategy for causation mechanism analysis and risk management of haze pollution in China. Copyright © 2016 Elsevier Ltd. All rights reserved.
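Quantitative fault-tree assessment of the kind described combines basic-event probabilities through AND/OR gates. The sketch below uses a hypothetical two-level tree and made-up probabilities, not the paper's events or values.

```python
# Minimal fault-tree evaluation (hypothetical structure and probabilities):
# top event "haze episode" fires if an emissions surge AND poor dispersion
# coincide; an emissions surge occurs if congestion OR cold starts occur.

def p_or(*ps):
    # OR gate for independent basic events: 1 - prod(1 - p_i)
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):
    # AND gate for independent basic events: prod(p_i)
    out = 1.0
    for p in ps:
        out *= p
    return out

p_congestion, p_cold_start, p_poor_dispersion = 0.30, 0.10, 0.20
p_emission_surge = p_or(p_congestion, p_cold_start)    # 1 - 0.7*0.9 = 0.37
p_top = p_and(p_emission_surge, p_poor_dispersion)     # 0.37 * 0.2 = 0.074
print(round(p_top, 3))
```

Critical importance measures then follow by perturbing each basic-event probability and observing the change in the top-event probability.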
Monitoring of an antigen manufacturing process.
Zavatti, Vanessa; Budman, Hector; Legge, Raymond; Tamer, Melih
2016-06-01
Fluorescence spectroscopy in combination with multivariate statistical methods was employed as a tool for monitoring the manufacturing process of pertactin (PRN), one of the virulence factors of Bordetella pertussis utilized in whooping cough vaccines. Fluorophores such as amino acids and co-enzymes were detected throughout the process. The fluorescence data collected at different stages of the fermentation and purification process were treated employing principal component analysis (PCA). Through PCA, it was feasible to identify sources of variability in PRN production. Then, partial least squares (PLS) regression was employed to correlate the fluorescence spectra obtained from pure PRN samples with the final protein content measured by a Kjeldahl test on these samples. Given that a statistically significant correlation was found between fluorescence and PRN levels, this approach could be further used as a method to predict the final protein content.
Judicial perspectives on child passenger protection legislation
DOT National Transportation Integrated Search
1980-08-01
This report provides an analysis of the perspectives of general sessions judges concerning the Tennessee child passenger protection law. Two methods were employed to gather information: questionnaires were mailed to 103 judges while 12 judges par...
Monitoring beach changes using GPS surveying techniques
Morton, Robert; Leach, Mark P.; Paine, Jeffrey G.; Cardoza, Michael A.
1993-01-01
The adaptation of Global Positioning System (GPS) surveying techniques to beach monitoring activities is a promising response to this challenge. An experiment that employed both GPS and conventional beach surveying was conducted, and a new beach monitoring method employing kinematic GPS surveys was devised. This new method involves the collection of precise shore-parallel and shore-normal GPS positions from a moving vehicle so that an accurate two-dimensional beach surface can be generated. Results show that the GPS measurements agree with conventional shore-normal surveys at the 1 cm level, and repeated GPS measurements employing the moving vehicle demonstrate a precision of better than 1 cm. In addition, the nearly continuous sampling and increased resolution provided by the GPS surveying technique reveals alongshore changes in beach morphology that are undetected by conventional shore-normal profiles. The application of GPS surveying techniques combined with the refinement of appropriate methods for data collection and analysis provides a better understanding of beach changes, sediment transport, and storm impacts.
2012-01-01
Background Response surface methodology by Box–Behnken design employing the multivariate approach enables substantial improvement in method development using fewer experiments, without wastage of large volumes of organic solvents, which leads to high analysis cost. This methodology has not previously been employed for development of a method for analysis of atorvastatin calcium (ATR-Ca). Results The present research study describes the use of Box–Behnken design in the optimization and validation of a new microwell-based UV-visible spectrophotometric method for determination of ATR-Ca in its tablets. By the use of quadratic regression analysis, equations were developed to describe the behavior of the response as simultaneous functions of the selected independent variables. Accordingly, the optimum conditions were determined, which included the concentration of 2,3-dichloro-5,6-dicyano-1,4-benzoquinone (DDQ), time of reaction and temperature. The absorbance of the colored CT complex was measured at 460 nm by a microwell-plate absorbance reader. The method was validated, in accordance with ICH guidelines, for accuracy, precision, selectivity and linearity (r² = 0.9993) over the concentration range of 20–200 μg/ml. The assay was successfully applied to the analysis of ATR-Ca in its pharmaceutical dosage forms with good accuracy and precision. Conclusion The assay described herein has great practical value in the routine analysis of ATR-Ca in quality control laboratories, as it has high-throughput capability and consumes a minimum volume of organic solvent; thus it reduces the exposure of analysts to the toxic effects of organic solvents (an environmentally friendly "green" approach) and reduces the analysis cost by 50-fold. PMID:23146143
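A Box–Behnken design for three factors plus a fitted quadratic response surface, as used above, can be sketched in plain NumPy. The response values below are a hypothetical quadratic with a built-in optimum, not the ATR-Ca data.

```python
import numpy as np
from itertools import combinations

# Box-Behnken design for three coded factors (e.g. DDQ concentration,
# reaction time, temperature): edge midpoints of the cube plus centers.
def box_behnken(k=3, center=3):
    rows = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * k
                row[i], row[j] = a, b
                rows.append(row)
    rows += [[0] * k] * center
    return np.array(rows, dtype=float)

X = box_behnken()                      # 12 edge runs + 3 centers = 15 runs
# Hypothetical absorbance response with a quadratic optimum (not real data):
y = 1.0 + 0.3*X[:, 0] + 0.2*X[:, 1] - 0.25*X[:, 0]**2 - 0.15*X[:, 2]**2

# Full quadratic model: intercept, linear, squared, two-factor interactions.
terms = np.column_stack([np.ones(len(X)), X, X**2,
                         X[:, 0]*X[:, 1], X[:, 0]*X[:, 2], X[:, 1]*X[:, 2]])
coef, *_ = np.linalg.lstsq(terms, y, rcond=None)
print(np.round(coef, 2))  # recovers the generating coefficients
```

The fitted coefficients define the response-surface equation from which the optimum conditions are read off.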
Dual Solutions for Nonlinear Flow Using Lie Group Analysis
Awais, Muhammad; Hayat, Tasawar; Irum, Sania; Saleem, Salman
2015-01-01
The aim of this analysis is to investigate the existence of dual solutions for magnetohydrodynamic (MHD) flow of an upper-convected Maxwell (UCM) fluid over a porous shrinking wall. We have employed Lie group analysis for the simplification of the nonlinear differential system and computed the absolute invariants explicitly. An efficient numerical technique, namely the shooting method, has been employed for the construction of solutions. Dual solutions are computed for the velocity profile of an upper-convected Maxwell (UCM) fluid flow. Plots reflecting the impact of dual solutions for variations of the Deborah number, Hartmann number and wall mass transfer are presented and analyzed. Streamlines are also plotted for the wall mass transfer effects when suction and blowing situations are considered. PMID:26575996
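The shooting method named above reduces a boundary value problem to root-finding on an initial slope. The sketch below applies it to a deliberately simple linear BVP with a known answer (y'' = y, y(0) = 0, y(1) = 1), not the UCM/MHD system; different bracketing intervals are how dual solutions would be isolated in the nonlinear case.

```python
from scipy.integrate import solve_ivp
from scipy.optimize import brentq
from math import sinh

def residual(s):
    # Integrate the IVP y'' = y with y(0) = 0, y'(0) = s, and return the
    # mismatch at the far boundary: y(1; s) - 1.
    sol = solve_ivp(lambda t, u: [u[1], u[0]], (0.0, 1.0), [0.0, s],
                    rtol=1e-10, atol=1e-12)
    return sol.y[0, -1] - 1.0

# Bracket the boundary residual's sign change and refine the slope.
s_star = brentq(residual, 0.1, 2.0)
print(s_star, 1.0 / sinh(1.0))  # converges to the exact slope 1/sinh(1)
```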
ERIC Educational Resources Information Center
Güngör, Sema Nur; Özkan, Muhlis
2016-01-01
The aim of this study is to teach enzymes, one of the biology subjects that students have great difficulty understanding, to pre-service teachers through the POE method, in the case of catalase, an oxidoreductase. Descriptive analysis method was employed in this study, in which 38 second-grade pre-service teachers attending Uludag…
Prevalence and Determinants of Contraceptive use among Employed and Unemployed Women in Bangladesh
Islam, Ahmed Zohirul; Mondal, Md. Nazrul Islam; Khatun, Mt. Laily; Rahman, Md. Mosiur; Islam, Md. Rafiqul; Mostofa, Md. Golam; Hoque, Md. Nazrul
2016-01-01
Background: Contraceptive use plays a significant role in controlling fertility, particularly in reaching the replacement level of fertility. The association between women's employment status and contraceptive use is poorly studied and understood in Bangladesh. The aim of this study was to determine the factors that influence contraceptive use among employed and unemployed women in Bangladesh. Methods: Data and necessary information on 16,616 married women were extracted from the Bangladesh Demographic and Health Survey (BDHS) 2011. The cross-sectional data were used for univariate analysis, to describe the variables; bivariate analysis, to find associations among the variables; and binary logistic regression analysis, to evaluate the effects of selected sociodemographic factors on contraceptive use. Results: The results revealed that contraceptive use was higher among employed women (67%) than among unemployed women. Women's age, education, region, number of living children, and child preference were found to be significantly associated with current use of contraception among employed women. On the other hand, women's age, education, husband's education, region, residence, religion, number of living children, having heard about family planning, and child preference were identified as significant predictors of contraceptive use among unemployed women. Conclusion and Global Health Implications: A gap in contraceptive use between employed and unemployed women is identified. Creating employment opportunities for women could enhance contraceptive use. Moreover, sociodemographic factors need to be taken into consideration in formulating policies and implementing programs to increase the contraceptive prevalence rate among women. PMID:28058196
Miranda, Tiago A; Silva, Pedro H R; Pianetti, Gerson A; César, Isabela C
2015-01-28
Chloroquine and primaquine are the first-line treatment recommended by the World Health Organization for malaria caused by Plasmodium vivax. Since the problem of counterfeit or substandard anti-malarials is well established all over the world, the development of rapid and reliable methods for quality control analysis of these drugs is essential. Thus, the aim of this study was to develop and validate a novel UPLC-DAD method for simultaneously quantifying chloroquine and primaquine in tablet formulations. The UPLC separation was carried out using a Hypersil C18 column (50 × 2.1 mm id; 1.9 μm particle size) and a mobile phase composed of acetonitrile (A) and 0.1% aqueous triethylamine, pH 3.0 adjusted with phosphoric acid (B), at a flow rate of 0.6 mL/min. Gradient elution was employed. UV detection was performed at 260 nm. The UPLC method was fully validated and the results were compared to a conventional HPLC-DAD method for the analysis of chloroquine and primaquine in tablet formulations. The UPLC method was shown to be linear (r² > 0.99), precise (CV < 2.0%), accurate (recovery rates from 98.11 to 99.83%), specific, and robust. No significant differences were observed between the chloroquine and primaquine contents obtained by the UPLC and HPLC methods. However, the UPLC method promoted faster analyses, better chromatographic performance and lower solvent consumption. The developed UPLC method was shown to be a rapid and suitable technique to quantify chloroquine and primaquine in pharmaceutical preparations and may be successfully employed for quality control analysis.
Oka, Megan; Whiting, Jason
2013-01-01
In Marriage and Family Therapy (MFT), as in many clinical disciplines, concern surfaces about the clinician/researcher gap. This gap includes a lack of accessible, practical research for clinicians. MFT clinical research often borrows from the medical tradition of randomized control trials, which typically use linear methods, or follow procedures distanced from "real-world" therapy. We review traditional research methods and their use in MFT and propose increased use of methods that are more systemic in nature and more applicable to MFTs: process research, dyadic data analysis, and sequential analysis. We will review current research employing these methods, as well as suggestions and directions for further research. © 2013 American Association for Marriage and Family Therapy.
Berridge, Georgina; Chalk, Rod; D’Avanzo, Nazzareno; Dong, Liang; Doyle, Declan; Kim, Jung-In; Xia, Xiaobing; Burgess-Brown, Nicola; deRiso, Antonio; Carpenter, Elisabeth Paula; Gileadi, Opher
2011-01-01
We have developed a method for intact mass analysis of detergent-solubilized and purified integral membrane proteins using liquid chromatography–mass spectrometry (LC–MS) with methanol as the organic mobile phase. Membrane proteins and detergents are separated chromatographically during the isocratic stage of the gradient profile from a 150-mm C3 reversed-phase column. The mass accuracy is comparable to standard methods employed for soluble proteins; the sensitivity is 10-fold lower, requiring 0.2–5 μg of protein. The method is also compatible with our standard LC–MS method used for intact mass analysis of soluble proteins and may therefore be applied on a multiuser instrument or in a high-throughput environment. PMID:21093405
Schwarz, Betje; Specht, Timo; Bethge, Matthias
2017-12-01
Purpose To explore the patient's perspective on the involvement of employers in rehabilitation. Methods Eight participants in a work-related medical rehabilitation program were interviewed by telephone 4 weeks after discharge. Qualitative content analysis was used to analyze the generated data. Results Besides poor employer involvement, the interviews revealed that the process of returning to work was characterized and hampered by unused measures for supporting vocational reintegration during rehabilitation, intersection problems in the health care and social security system, and a waiting strategy among all involved actors. Conclusion Besides improved employer involvement, systematic intersection management and full usage of existing measures are needed to support vocational reintegration. © Georg Thieme Verlag KG Stuttgart · New York.
Lattice Boltzmann methods for global linear instability analysis
NASA Astrophysics Data System (ADS)
Pérez, José Miguel; Aguilar, Alfonso; Theofilis, Vassilis
2017-12-01
Modal global linear instability analysis is performed using, for the first time ever, the lattice Boltzmann method (LBM) to analyze incompressible flows with two and three inhomogeneous spatial directions. Four linearization models have been implemented in order to recover the linearized Navier-Stokes equations in the incompressible limit. Two of those models employ the single relaxation time (SRT) and have been proposed previously in the literature as linearizations of the collision operator of the lattice Boltzmann equation. Two additional models are derived herein for the first time by linearizing the local equilibrium probability distribution function. Instability analysis results are obtained in three benchmark problems, two in closed geometries and one in open flow, namely the square and cubic lid-driven cavity flows and flow in the wake of a circular cylinder. Comparisons with results delivered by classic spectral element methods verify the accuracy of the proposed new methodologies and point out potential limitations particular to the LBM approach. The known issue of numerical instabilities appearing when the SRT model is used in direct numerical simulations employing the LBM is shown to be reflected in a spurious global eigenmode when the SRT model is used in the instability analysis. Although this mode is absent in the multiple relaxation time model, other spurious instabilities can also arise and are documented herein. Areas of potential improvement to make the proposed methodology competitive with established approaches for global instability analysis are discussed.
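As context for the linearization discussed in this abstract, the single-relaxation-time (BGK) collision-and-streaming step that such models expand about a base flow can be sketched in a few lines. This is a generic D2Q9 illustration, not the authors' implementation; the grid size and relaxation time are arbitrary choices.

```python
import numpy as np

# D2Q9 lattice velocities and weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, ux, uy):
    """Local equilibrium distribution, truncated to second order in velocity."""
    cu = 3 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
    usq = 1.5 * (ux ** 2 + uy ** 2)
    return rho * w[:, None, None] * (1 + cu + 0.5 * cu ** 2 - usq)

def srt_step(f, tau):
    """One SRT (BGK) collision + periodic streaming step."""
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f = f - (f - equilibrium(rho, ux, uy)) / tau          # collide
    for i in range(9):                                    # stream
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
    return f

# a uniform fluid at rest is a fixed point of the update
f0 = equilibrium(np.ones((8, 8)), np.zeros((8, 8)), np.zeros((8, 8)))
f1 = srt_step(f0.copy(), tau=0.8)
```

Linearization in the SRT models then amounts to perturbing `f` about such a base state and retaining first-order terms in the collision operator.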
Apparatus and method for quantitative determination of materials contained in fluids
Radziemski, Leon J.; Cremers, David A.
1985-01-01
Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is demonstrated. Significant shortening of analysis time is achieved from those of the usual chemical techniques of analysis.
Apparatus and method for quantitative determination of materials contained in fluids
Radziemski, L.J.; Cremers, D.A.
1982-09-07
Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids are described. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is shown. Significant shortening of analysis time is achieved from the usual chemical techniques of analysis.
Simões, Rodrigo Almeida; Bonato, Pierina Sueli; Mirnaghi, Fatemeh S; Bojko, Barbara; Pawliszyn, Janusz
2015-01-01
A high-throughput bioanalytical method using 96-blade thin film microextraction (TFME) and LC-MS/MS for the analysis of repaglinide (RPG) and two of its main metabolites was developed and used for an in vitro metabolism study. The target analytes were extracted from human microsomal medium by a 96-blade-TFME system employing the low-cost prototype 'SPME multi-sampler' using C18 coating. Method validation showed recoveries around 90% for all analytes, and the method was linear over the concentration range of 2-1000 ng ml(-1) for RPG and 2-500 ng ml(-1) for each RPG metabolite. The method was applied to an in vitro metabolism study of RPG employing human liver microsomes and proved to be very useful for this purpose.
An Efficient and Accurate Genetic Algorithm for Backcalculation of Flexible Pavement Layer Moduli
DOT National Transportation Integrated Search
2012-12-01
The importance of a backcalculation method in the analysis of elastic modulus in pavement engineering has been : known for decades. Despite many backcalculation programs employing different backcalculation procedures and : algorithms, accurate invers...
Validation of Physics Standardized Test Items
NASA Astrophysics Data System (ADS)
Marshall, Jill
2008-10-01
The Texas Physics Assessment Team (TPAT) examined the Texas Assessment of Knowledge and Skills (TAKS) to determine whether it is a valid indicator of physics preparation for future course work and employment, and of the knowledge and skills needed to act as an informed citizen in a technological society. We categorized science items from the 2003 and 2004 10th and 11th grade TAKS by content area(s) covered, knowledge and skills required to select the correct answer, and overall quality. We also analyzed a 5000-student sample of item-level results from the 2004 11th grade exam using standard statistical methods employed by test developers (factor analysis and Item Response Theory). Triangulation of our results revealed strengths and weaknesses of the different methods of analysis. The TAKS was found to be only weakly indicative of physics preparation, and we make recommendations for increasing the validity of standardized physics testing.
Waist circumference, body mass index, and employment outcomes.
Kinge, Jonas Minet
2017-07-01
Body mass index (BMI) is an imperfect measure of body fat. Recent studies provide evidence in favor of replacing BMI with waist circumference (WC). Hence, I investigated whether or not the association between fat mass and employment status varies by anthropometric measure. I used 15 rounds of the Health Survey for England (1998-2013), which has measures of employment status in addition to measured height, weight, and WC. WC and BMI were entered as continuous variables, and obesity was entered as a binary variable defined using both WC and BMI. I used multivariate models controlling for a set of covariates. The association of WC with employment was of greater magnitude than the association between BMI and employment. I reran the analysis using conventional instrumental variables (IV) methods. The IV models showed significant impacts of obesity on employment; however, the impacts were not more pronounced when WC was used to measure obesity compared to BMI. This means that, in the IV models, the impact of fat mass on employment did not depend on the measure of fat mass.
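The instrumental variables logic used in this abstract can be illustrated with a minimal two-stage least squares sketch on simulated data. The variable names, coefficients and the instrument below are invented for illustration and are not taken from the study; the outcome is a continuous "employment index" rather than a binary employment status.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=n)                    # hypothetical instrument
u = rng.normal(size=n)                    # unobserved confounder
bmi = 25 + 2 * z + u + rng.normal(size=n)
# continuous employment index; the true causal effect of BMI is -0.05
employ = 0.8 - 0.05 * bmi + 0.3 * u + rng.normal(size=n)

def slope_ols(y, x):
    """OLS slope of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

def slope_2sls(y, x, z):
    """Minimal 2SLS: regress x on z, then y on the fitted values."""
    Z = np.column_stack([np.ones_like(z), z])
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    return slope_ols(y, x_hat)

ols_beta = slope_ols(employ, bmi)     # biased toward zero by the confounder
iv_beta = slope_2sls(employ, bmi, z)  # recovers roughly the true effect
```

Because the confounder `u` raises both BMI and the outcome, naive OLS is biased; the instrument affects the outcome only through BMI, so the 2SLS slope is consistent.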
NASA Astrophysics Data System (ADS)
Wang, Xiao; Gao, Feng; Dong, Junyu; Qi, Qiang
2018-04-01
Synthetic aperture radar (SAR) imagery is independent of atmospheric conditions, making it an ideal image source for change detection. Existing methods directly analyze all the regions in the speckle-noise-contaminated difference image, so their performance is easily affected by small noisy regions. In this paper, we propose a novel framework for saliency-guided change detection based on pattern and intensity distinctiveness analysis. The saliency analysis step removes small noisy regions and therefore makes the proposed method more robust to speckle noise. In the proposed method, the log-ratio operator is first utilized to obtain a difference image (DI). Then, the saliency detection method based on pattern and intensity distinctiveness analysis is utilized to obtain the changed-region candidates. Finally, principal component analysis and k-means clustering are employed to analyze pixels in the changed-region candidates. Thus, the final change map can be obtained by classifying these pixels into the changed or unchanged class. The experimental results on two real SAR image datasets have demonstrated the effectiveness of the proposed method.
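Two steps of the pipeline described in this abstract, the log-ratio difference image and the final two-class clustering, can be sketched as follows. The tiny 2x2 images and the simplified one-dimensional k-means are illustrative stand-ins only; the paper's method additionally includes saliency detection and PCA.

```python
import numpy as np

def log_ratio(img1, img2, eps=1e-6):
    """Log-ratio difference image; turns multiplicative speckle additive."""
    return np.abs(np.log((img1 + eps) / (img2 + eps)))

def classify_pixels(di, n_iter=50):
    """Two-class (changed/unchanged) 1-D k-means on DI intensities;
    a simplified stand-in for the paper's PCA + k-means step."""
    x = di.ravel().astype(float)
    centers = np.array([x.min(), x.max()])       # initial centroids
    for _ in range(n_iter):
        labels = (np.abs(x - centers[0]) > np.abs(x - centers[1])).astype(int)
        for k in (0, 1):
            if np.any(labels == k):
                centers[k] = x[labels == k].mean()
    return labels.reshape(di.shape)              # 1 = changed

before = np.array([[10.0, 10.0], [10.0, 10.0]])
after = np.array([[10.0, 10.0], [10.0, 80.0]])
change_map = classify_pixels(log_ratio(before, after))
```

Only the pixel whose backscatter changed between the two acquisitions is flagged as changed.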
ERIC Educational Resources Information Center
Oo, Htun Naing; Sutheerawatthana, Pitch; Minato, Takayuki
2010-01-01
This article analyzes the practice of information dissemination regarding pesticide usage in floating gardening in a rural area. The analysis reveals reasons why the current information dissemination methods employed by relevant stakeholders do not work. It then puts forward a proposition that information sharing within organizations of and among…
The Brief History of Environmental Education and Its Changes from 1972 to Present in Iran
ERIC Educational Resources Information Center
Shobeiri, Seyed Mohammad; Meiboudi, Hossein; Kamali, Fatemeh Ahmadi
2014-01-01
The present study investigates environmental education (EE) before and after Iran's Islamic Revolution. The research method is case study, and among the case study methods, historical analysis has been used in this research. A wide array of sources were employed, from government performance reports to documents, records, books, and articles…
NASA Technical Reports Server (NTRS)
Worstell, J. H.; Daniel, S. R.
1981-01-01
A method for the separation and analysis of tetralin hydroperoxide and its decomposition products by high pressure liquid chromatography has been developed. Elution with a single, mixed solvent from a micron-Porasil column was employed. Constant response factors (internal standard method) over large concentration ranges and reproducible retention parameters are reported.
ERIC Educational Resources Information Center
Fidan, Nuray Kurtdede; Ergün, Mustafa
2016-01-01
In this study, social, literary and technological sources used by classroom teachers in social studies courses are analyzed in terms of frequency. The study employs mixed methods research and is designed following the convergent parallel design. In the qualitative part of the study, phenomenological method was used and in the quantitative…
Equal Protection and Due Process: Contrasting Methods of Review under Fourteenth Amendment Doctrine.
ERIC Educational Resources Information Center
Hughes, James A.
1979-01-01
Argues that the Court has, at times, confused equal protection and due process methods of review, primarily by employing interest balancing in certain equal protection cases that should have been subjected to due process analysis. Available from Harvard Civil Rights-Civil Liberties Law Review, Harvard Law School, Cambridge, MA 02138; sc $4.00.…
White Paper: A Defect Prioritization Method Based on the Risk Priority Number
2013-11-01
The Failure Modes and Effects Analysis (FMEA) method employs a measurement technique called Risk Priority Number (RPN) to quantify the... [Table 1 – Time Scaling Factors: "Up to an hour", 16-60 min, factor 1.5; "Brief Interrupt", 0-15 min, factor 1] In the FMEA formulation, RPN is a product of the three categories
A hybrid fault diagnosis approach based on mixed-domain state features for rotating machinery.
Xue, Xiaoming; Zhou, Jianzhong
2017-01-01
To further improve diagnosis accuracy and efficiency, a hybrid fault diagnosis approach based on mixed-domain state features, which systematically blends statistical analysis and artificial intelligence techniques, is proposed in this work for rolling element bearings. To simplify the fault diagnosis problem, the execution of the proposed method is divided into three steps, i.e., preliminary fault detection, fault type recognition and fault degree identification. In the first step, a preliminary judgment about the health status of the equipment is made by a statistical analysis method based on permutation entropy theory. If a fault exists, the following two processes based on the artificial intelligence approach are performed to further recognize the fault type and then identify the fault degree. For these two subsequent steps, mixed-domain state features containing time-domain, frequency-domain and multi-scale features are extracted to represent the fault peculiarity under different working conditions. As a powerful time-frequency analysis method, the fast EEMD method was employed to obtain the multi-scale features. Furthermore, due to information redundancy and the submergence of the original feature space, a novel manifold learning method (modified LGPCA) is introduced to realize low-dimensional representations of the high-dimensional feature space. Finally, two cases with 12 working conditions each were employed to evaluate the performance of the proposed method, using vibration signals measured from an experimental rolling element bearing test bench. The analysis results showed the effectiveness and superiority of the proposed method, whose diagnostic scheme is more suitable for practical application. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
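The preliminary detection step above relies on permutation entropy. A minimal sketch of the Bandt-Pompe estimator it builds on might look as follows; the order and delay values are common illustrative defaults, not the paper's settings.

```python
import math

def permutation_entropy(signal, order=3, delay=1):
    """Normalized Bandt-Pompe permutation entropy in [0, 1]:
    near 1 for irregular signals, low for regular ones."""
    counts = {}
    n = len(signal) - (order - 1) * delay
    for i in range(n):
        # ordinal pattern: the ranking of values inside each window
        window = tuple(signal[i + j * delay] for j in range(order))
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(math.factorial(order))   # normalize by max entropy

ordered = permutation_entropy(list(range(100)))           # one pattern only
periodic = permutation_entropy([i % 4 for i in range(200)])  # a few patterns
```

A healthy-versus-faulty decision would compare such entropy values of vibration segments against a threshold calibrated on baseline data.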
Markov Chain Monte Carlo Methods for Bayesian Data Analysis in Astronomy
NASA Astrophysics Data System (ADS)
Sharma, Sanjib
2017-08-01
Markov Chain Monte Carlo based Bayesian data analysis has now become the method of choice for analyzing and interpreting data in almost all disciplines of science. In astronomy, over the last decade, we have also seen a steady increase in the number of papers that employ Monte Carlo based Bayesian analysis. New, efficient Monte Carlo based methods are continuously being developed and explored. In this review, we first explain the basics of Bayesian theory and discuss how to set up data analysis problems within this framework. Next, we provide an overview of various Monte Carlo based methods for performing Bayesian data analysis. Finally, we discuss advanced ideas that enable us to tackle complex problems and thus hold great promise for the future. We also distribute downloadable computer software (available at https://github.com/sanjibs/bmcmc/ ) that implements some of the algorithms and examples discussed here.
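The random-walk Metropolis algorithm at the core of many of the MCMC methods surveyed in this abstract can be sketched in a few lines. This is a generic 1-D illustration under an assumed standard-normal target, not code from the bmcmc package.

```python
import math
import random

def metropolis(log_post, x0, n_samples, step=0.5, burn=500, seed=0):
    """Random-walk Metropolis sampler for a 1-D log-posterior."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for i in range(n_samples + burn):
        prop = x + rng.gauss(0.0, step)          # symmetric proposal
        lp_prop = log_post(prop)
        # accept with probability min(1, posterior ratio)
        if lp_prop >= lp or rng.random() < math.exp(lp_prop - lp):
            x, lp = prop, lp_prop
        if i >= burn:
            samples.append(x)
    return samples

# target: standard normal, log density -x^2/2 up to a constant
chain = metropolis(lambda t: -0.5 * t * t, x0=0.0, n_samples=5000)
```

The chain's sample mean and variance should approach those of the target (0 and 1) as the number of samples grows, up to Monte Carlo error.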
ERIC Educational Resources Information Center
Golden, Thomas P.; Karpur, Arun
2012-01-01
This study is a comparative analysis of the impact of traditional face-to-face training contrasted with a blended learning approach, as it relates to improving skills, knowledge and attitudes for enhancing practices for achieving improved employment outcomes for individuals with disabilities. The study included two intervention groups: one…
ERIC Educational Resources Information Center
Cooper, R. Lyle; MacMaster, Samuel; Rasch, Randolph
2010-01-01
Purpose: This study employed a static group comparison design with 106 men in residential treatment to examine the relationship of race to treatment retention. Methods: A retrospective analysis of retention, by race, including survival analysis, was undertaken. Results: Findings from the study indicated that (a) Caucasian men complete treatment…
ERIC Educational Resources Information Center
Bloh, Christopher; Axelrod, Saul
2008-01-01
With the passage of the Individuals with Disabilities Education Improvement Act, classrooms are now mandated to employ behavioral methods to address target behaviors. These relevant behavioral strategies have long been advanced and disseminated by the field of Applied Behavior Analysis (ABA). Notwithstanding this capability, proponents of the…
ERIC Educational Resources Information Center
Skinner, Anna; Diller, David; Kumar, Rohit; Cannon-Bowers, Jan; Smith, Roger; Tanaka, Alyssa; Julian, Danielle; Perez, Ray
2018-01-01
Background: Contemporary work in the design and development of intelligent training systems employs task analysis (TA) methods for gathering knowledge that is subsequently encoded into task models. These task models form the basis of intelligent interpretation of student performance within education and training systems. Also referred to as expert…
An Analysis of Variance Approach for the Estimation of Response Time Distributions in Tests
ERIC Educational Resources Information Center
Attali, Yigal
2010-01-01
Generalizability theory and analysis of variance methods are employed, together with the concept of objective time pressure, to estimate response time distributions and the degree of time pressure in timed tests. By estimating response time variance components due to person, item, and their interaction, and fixed effects due to item types and…
Performance Evaluation of Technical Institutions: An Application of Data Envelopment Analysis
ERIC Educational Resources Information Center
Debnath, Roma Mitra; Shankar, Ravi; Kumar, Surender
2008-01-01
Technical institutions (TIs) are playing an important role in making India a knowledge hub of this century. There is still great diversity in their relative performance, which is a matter of concern to the education planner. This article employs the method of data envelopment analysis (DEA) to compare the relative efficiency of TIs in India. The…
Classical Item Analysis Using Latent Variable Modeling: A Note on a Direct Evaluation Procedure
ERIC Educational Resources Information Center
Raykov, Tenko; Marcoulides, George A.
2011-01-01
A directly applicable latent variable modeling procedure for classical item analysis is outlined. The method allows one to point and interval estimate item difficulty, item correlations, and item-total correlations for composites consisting of categorical items. The approach is readily employed in empirical research and as a by-product permits…
The Effectiveness Level of Material Use in Classroom Instruction: A Meta-Analysis Study
ERIC Educational Resources Information Center
Kablan, Zeynel; Topan, Beyda; Erkan, Burak
2013-01-01
In this study, the aim was to combine the results obtained in independent studies aiming to determine the effectiveness of material use. The main question of the study is: "Does material use in classroom instruction improve students' academic achievements?" To answer this question, the meta-analysis method was employed.…
New method for stock-tank oil compositional analysis.
McAndrews, Kristine; Nighswander, John; Kotzakoulakis, Konstantin; Ross, Paul; Schroeder, Helmut
2009-01-01
A new method for accurately determining stock-tank oil composition to normal pentatriacontane using gas chromatography is developed and validated. The new method addresses the potential errors associated with the traditional equipment and technique employed for extended hydrocarbon gas chromatography outside a controlled laboratory environment, such as on an offshore oil platform. In particular, the experimental measurement of stock-tank oil molecular weight with the freezing point depression technique and the use of an internal standard to find the unrecovered sample fraction are replaced with correlations for estimating these properties. The use of correlations reduces the number of necessary experimental steps in completing the required sample preparation and analysis, resulting in reduced uncertainty in the analysis.
NASA Technical Reports Server (NTRS)
Ivanova, I. D.; Gurdev, L. L.; Mitev, V. M.
1992-01-01
Various lidar methods have been developed for measuring atmospheric temperature, making use of the temperature-dependent characteristics of rotational Raman scattering (RRS) from nitrogen and oxygen, and of Rayleigh or Rayleigh-Brillouin scattering (RS or RBS). These methods have various advantages and disadvantages relative to each other, but their potential accuracies are principal characteristics of their efficiency. No systematic attempt has been undertaken so far to compare the efficiencies, in the above sense, of different temperature lidar methods. Two RRS techniques have been compared previously. Here, we make such a comparison using two methods based on the detection and analysis of RS (RBS) spectra. Four methods are considered for measuring the atmospheric temperature. The first (Schwiesow and Lading, 1981) is based on an analysis of the RS linewidth with two Michelson interferometers (MI) in parallel. The second method (Shimizu et al., 1986) employs a high-resolution analysis of the RBS line shape. The third method (Cooney, 1972) employs the temperature dependence of the RRS spectrum envelope. The fourth method (Armstrong, 1974) makes use of a scanning Fabry-Perot interferometer (FPI) as a comb filter for processing the periodic RRS spectrum of nitrogen. Let us denote the corresponding errors in measuring the temperature by sigma(sub MI), sigma(sub HR), sigma(sub ENV), and sigma(sub FPI). Let us also define the ratios chi(sub 1) = sigma(sub MI)/sigma(sub ENV), chi(sub 2) = sigma(sub HR)/sigma(sub ENV), and chi(sub 3) = sigma(sub FPI)/sigma(sub ENV), interpreted as relative errors with respect to sigma(sub ENV).
Does maternal employment augment spending for children's health care? A test from Haryana, India.
Berman, P; Zeitlin, J; Roy, P; Khumtakar, S
1997-10-01
Evidence that women's employment and earnings foster increased allocations of household resources to children's well-being has led to advocacy of investment in women's employment as a method for targeting the social benefits of enhanced economic opportunity. Work and associated earnings are hypothesized to empower women, who can then exercise their individual preferences for spending on child well-being as well as influence household spending patterns. This paper presents results from a small, detailed household and community study of maternal employment and child health in northern India (one of six studies in a research network), which sought to show that such effects did indeed occur and that they could be linked to work characteristics. Careful analysis of employment and earnings showed that they are multidimensional and highly variable across occupations and seasons. Contrary to expectations, spending on health care for children's illness episodes was negatively associated with maternal employment and earnings variables in the econometric analysis. The expected individual effects of work and earnings on women, if they did occur, were not sufficient to alter the general spending pattern. We conclude that the attributes of work as well as the social and cultural environment are important mediators of such effects, suggesting a confluence of 'individual' and 'collective' behavioural determinants meeting in the locus of the household.
Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.
Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing
2017-01-01
Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
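The genetic-algorithm component described in this abstract can be illustrated with a toy sketch. Here a simple byte-matching score stands in for the real coverage-guided fitness, and all parameters (population size, mutation rate, the "magic" header) are invented for illustration; a real fuzzer would measure code coverage of the target protocol implementation instead.

```python
import random

def genetic_fuzz(fitness, length=16, alphabet=4, pop_size=30,
                 generations=60, seed=1):
    """Toy genetic algorithm evolving test cases; `fitness` stands in for
    the measured code coverage of the target protocol implementation."""
    rng = random.Random(seed)
    pop = [[rng.randrange(alphabet) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)      # truncation selection
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, length)       # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:               # occasional mutation
                child[rng.randrange(length)] = rng.randrange(alphabet)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# invented stand-in for coverage: bytes matching a "magic" header score points
MAGIC = [0, 1, 2, 3] * 4
coverage = lambda case: sum(int(c == m) for c, m in zip(case, MAGIC))
best = genetic_fuzz(coverage)
```

Because the best individuals survive each generation, the fitness of the returned test case is monotonically non-decreasing over generations.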
Wang, Yun-Tung; Lin, Yi-Jiun; Shu, Ching-Hsien
2012-01-01
The aim of this study is to conduct a cost-benefit analysis with monetary and non-monetary benefits for sheltered employment service programs and to provide more evidence-based information for policy makers and practitioners to understand the outcomes of sheltered employment services. This study analyzed 3 sheltered employment service programs for people with disabilities (2006-2007) implemented by the Sunshine Social Welfare Foundation in Taiwan using cost-benefit analysis (including non-monetary benefits). Three groups were analyzed: participants in the programs, taxpayers, and society (participants and taxpayers). This study found that the net social monetary benefit was $NT29,432.07 per participant per year and the benefit-cost ratio was 1.43. (In 2006-2007, $US1 = $NT32.5 on average.) The net monetary benefit for the participants was between $NT7,890.86 and $NT91,890.86 per participant per year. On the non-monetary benefit side, the physical health (up 7.49%) and social relationship (up 3.36%) domains and general quality of life (up 2.53%) improved, whereas the psychological (down 1.51%) and working/environment (down 3.85%) domains declined. In addition, the differences between pre-test and post-test average scores of all domains were not statistically significant. This study is the first to use monetary and non-monetary cost-benefit analysis methods to analyze sheltered employment service programs for people with disabilities in Taiwan. The findings indicated that sheltered employment service programs for people with disabilities can be efficient and beneficial for the whole society and for sheltered employees/clients, and also helpful for raising their quality of life.
Dynamic Analysis Method for Electromagnetic Artificial Muscle Actuator under PID Control
NASA Astrophysics Data System (ADS)
Nakata, Yoshihiro; Ishiguro, Hiroshi; Hirata, Katsuhiro
We have been studying an interior permanent magnet linear actuator for an artificial muscle. This actuator mainly consists of a mover and a stator. The mover is composed of permanent magnets, magnetic cores and a non-magnetic shaft. The stator is composed of 3-phase coils and a back yoke. In this paper, a dynamic analysis method under PID control is proposed, employing the 3-D finite element method (3-D FEM) to compute the dynamic response and current response when the positioning control is active. The computed results show good agreement with measurements from a prototype.
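The PID positioning loop discussed in this abstract can be illustrated with a minimal discrete-time sketch. The toy mass-damper plant and all gains below are hypothetical stand-ins; the paper couples the controller to a 3-D FEM model of the actuator instead.

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.001, steps=5000):
    """Discrete PID loop driving a toy mass-damper plant
    (a stand-in for the actuator's 3-D FEM model)."""
    pos, vel = 0.0, 0.0
    integ, prev_err = 0.0, setpoint
    mass, damping = 0.05, 0.5                    # invented plant parameters
    for _ in range(steps):
        err = setpoint - pos
        integ += err * dt
        force = kp * err + ki * integ + kd * (err - prev_err) / dt
        prev_err = err
        acc = (force - damping * vel) / mass     # explicit Euler update
        vel += acc * dt
        pos += vel * dt
    return pos

final_pos = simulate_pid(kp=20.0, ki=50.0, kd=1.0)  # settles near the setpoint
```

In the paper's method, the force term is instead obtained from the electromagnetic FEM solution at each time step, with the same controller structure.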
Reed, Karla S; Meade, Michelle A; Krause, James S
2016-01-01
Objective: The purpose of this study was to examine the relationship between employment and psychological health and health management as described by individuals with spinal cord injury (SCI) who were employed at least once following injury. Methods: A qualitative approach used 6 focus groups at 2 sites with 44 participants who were at least 10 years post SCI. All had been employed at some point since injury. Heterogeneous and homogeneous groups were delineated based on specific characteristics, such as education, gender, or race. Group sessions followed a semi-structured interview format with questions about personal, environmental, and policy related factors influencing employment following SCI. All group sessions were recorded, transcribed, and coded into conceptual categories to identify topics, themes, and patterns. Inferences were drawn about their meaning. NVivo 10 software using the constant comparative method was used for data analysis. Results: Narratives discussed the relationship between employment and psychological and emotional health and health management. Four themes were identified: (1) adjustment and dealing with emotional reactions, (2) gaining self-confidence, (3) preventing burnout, and (4) attitudes and perspectives. Most themes reflected issues that varied based on severity of injury as well as stage of employment. Conclusions: Individuals with SCI who are successful in working following injury must determine how to perform the behaviors necessary to manage their health and prevent emotional or physical complications. The emotional consequences of SCI must be recognized and addressed and specific behaviors enacted in order to optimize employment outcomes.
Simplified welding distortion analysis for fillet welding using composite shell elements
NASA Astrophysics Data System (ADS)
Kim, Mingyu; Kang, Minseok; Chung, Hyun
2015-09-01
This paper presents a simplified welding distortion analysis method to predict the welding deformation of both plate and stiffener in fillet welds. Currently, methods based on equivalent thermal strain, such as Strain as Direct Boundary (SDB), are widely used because they predict welding deformation effectively. For fillet welding, however, those methods cannot represent the deformation of both members at once, since the temperature degree of freedom is shared at the intersection nodes of both members. In this paper, we propose a new approach to simulate the deformation of both members. The method simulates fillet weld deformations by employing composite shell elements and using different thermal expansion coefficients through the thickness direction, with fixed temperature at the intersection nodes. For verification purposes, we compare results from experiments, 3D thermo-elastic-plastic analysis, the SDB method, and the proposed method. Compared with the experimental results, the proposed method effectively predicts welding deformation for fillet welds.
NASA Technical Reports Server (NTRS)
Knies, R. J.; Byrn, N. R.; Smith, H. T.
1972-01-01
A study program of radiation shielding against the deleterious effects of nuclear radiation on man and equipment is reported. The methods used to analyze the radiation environment from bremsstrahlung photons are discussed along with the methods employed by transport code users. The theory and numerical methods used to solve transport of neutrons and gammas are described, and the neutron and cosmic fluxes that would be present on the gamma-ray telescope were analyzed.
Multivariate analysis in thoracic research.
Mengual-Macenlle, Noemí; Marcos, Pedro J; Golpe, Rafael; González-Rivas, Diego
2015-03-01
Multivariate analysis is based on the observation and analysis of more than one statistical outcome variable at a time. In design and analysis, the technique is used to perform trade studies across multiple dimensions while taking into account the effects of all variables on the responses of interest. Multivariate methods emerged to analyze large databases and increasingly complex data. Since the best way to represent knowledge of reality is through modeling, we should use multivariate statistical methods. Multivariate methods are designed to analyze data sets simultaneously, i.e., to analyze different variables for each person or object studied. Keep in mind at all times that all variables must be treated in a way that accurately reflects the reality of the problem addressed. There are different types of multivariate analysis, and each one should be employed according to the type of variables to analyze: dependence, interdependence and structural methods. In conclusion, multivariate methods are ideal for the analysis of large data sets and for finding cause-and-effect relationships between variables; there is a wide range of analysis types that we can use.
do Lago, Ayla Campos; Marchioni, Camila; Mendes, Tássia Venga; Wisniewski, Célio; Fadini, Pedro Sergio; Luccas, Pedro Orival
2016-11-01
This work proposes a preconcentration method using an ion imprinted polymer (IIP) for the determination of cadmium in several samples, employing a mini-column filled with the polymer coupled into a flow injection analysis system with detection by thermospray flame furnace atomic absorption spectrometry (FIA-TS-FF-AAS). The polymer was synthesized via bulk polymerization using methacrylic acid and vinylimidazole as functional monomers. For the initial assessment of the FIA system, the variables pH, eluent concentration and buffer concentration were studied employing a 2³ full factorial design. To obtain the optimum values for each significant variable, a Doehlert matrix was employed. After optimization, the conditions pH 5.8, eluent (HNO3) concentration of 0.48 mol L⁻¹ and buffer concentration of 0.01 mol L⁻¹ were adopted. The proposed method showed a linear response in the range of 0.081-10.0 μg L⁻¹, limits of detection and quantification of 0.024 and 0.081 μg L⁻¹, respectively, a preconcentration factor of 165, a consumptive index of 0.06 mL, a concentration efficiency of 132 min⁻¹, and a reading frequency of 26 readings h⁻¹. The accuracy was checked by analysis of certified reference materials for trace metals and by recovery tests. The obtained results were in agreement at the 95% confidence level (t-test). The method was adequate for application to samples of jewelry (earrings) (2.38 ± 0.28 μg kg⁻¹), black tea (1.09 ± 0.15 μg kg⁻¹), green tea (3.85 ± 0.13 μg kg⁻¹), cigarette tobacco (38.27 ± 0.22 μg kg⁻¹), and hair (0.35 ± 0.02 μg kg⁻¹). © The Author(s) 2016.
Foreign Object Damage to Tires Operating in a Wartime Environment
1991-11-01
barriers were successfully overcome and the method of testing employed can now be confidently used for future test needs of this type. Data Analysis ...combined variable effects. Analysis consideration involved cut types, cut depths, number of cuts, cut/hit probabilities, tire failures, and aircraft...November 1988 with data reduction and analysis continuing into October 1989. All of the cutting tests reported in this report were conducted at the
DOT National Transportation Integrated Search
1971-04-01
An automated fluorometric trihydroxyindole procedure is described for the measurement of norepinephrine (NE) and epinephrine (E) in blood plasma or urine. The method employs conventional techniques for isolation of the catecholamines by alumina colum...
ESTER HYDROLYSIS RATE CONSTANT PREDICTION FROM INFRARED INTERFEROGRAMS
A method for predicting reactivity parameters of organic chemicals from spectroscopic data is being developed to assist in assessing the environmental fate of pollutants. The prototype system, which employs multiple linear regression analysis using selected points from the Fourier...
Mixture distributions of wind speed in the UAE
NASA Astrophysics Data System (ADS)
Shin, J.; Ouarda, T.; Lee, T. S.
2013-12-01
Wind speed probability distributions are commonly used to estimate potential wind energy. The 2-parameter Weibull distribution has been most widely used to characterize the distribution of wind speed. However, it is unable to properly model wind speed regimes when the wind speed distribution presents bimodal and kurtotic shapes. Several studies have concluded that the Weibull distribution should not be used for frequency analysis of wind speed without investigation of the wind speed distribution. Due to these mixture distributional characteristics of wind speed data, the application of mixture distributions should be further investigated in the frequency analysis of wind speed. A number of studies have investigated the potential wind energy in different parts of the Arabian Peninsula, and mixture distributional characteristics of wind speed were detected in some of these studies. Nevertheless, mixture distributions have not been employed for wind speed modeling in the Arabian Peninsula. In order to improve our understanding of wind energy potential in the Arabian Peninsula, mixture distributions should be tested for the frequency analysis of wind speed. The aim of the current study is to assess the suitability of mixture distributions for the frequency analysis of wind speed in the UAE. Hourly mean wind speed data at 10-m height from 7 stations were used in the current study. The Weibull and Kappa distributions were employed as representatives of the conventional non-mixture distributions. Ten mixture distributions were constructed by mixing four probability distributions: the Normal, Gamma, Weibull and Extreme Value type-one (EV-1) distributions. Three parameter estimation methods, namely the Expectation-Maximization algorithm, the Least Squares method and the Meta-Heuristic Maximum Likelihood (MHML) method, were employed to estimate the parameters of the mixture distributions.
In order to compare the goodness-of-fit of the tested distributions and parameter estimation methods for the sample wind data, the adjusted coefficient of determination, the Bayesian Information Criterion (BIC) and the Chi-squared statistic were computed. Results indicate that MHML presents the best parameter estimation performance for the employed mixture distributions. At most of the 7 employed stations, mixture distributions give the best fit. When the wind speed regime shows mixture distributional characteristics, most of these regimes present a kurtotic statistical character; in particular, applications of mixture distributions at these stations show a significant improvement in explaining the whole wind speed regime. In addition, the Weibull-Weibull mixture distribution presents the best fit for the wind speed data in the UAE.
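A minimal sketch of fitting a two-component Weibull mixture by maximum likelihood, in the spirit of the mixture models above (the simulated data, starting values and optimizer are our own illustrative choices; the study's MHML estimator is not reproduced here):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def neg_log_lik(theta, x):
    """Negative log-likelihood of a two-component Weibull mixture."""
    k1, l1, k2, l2, w = theta
    pdf = (w * weibull_min.pdf(x, k1, scale=l1)
           + (1 - w) * weibull_min.pdf(x, k2, scale=l2))
    return -np.sum(np.log(pdf + 1e-300))  # guard against log(0)

# Simulated bimodal "wind speed" sample: two Weibull regimes
rng = np.random.default_rng(1)
x = np.concatenate([weibull_min.rvs(2.0, scale=4.0, size=500, random_state=rng),
                    weibull_min.rvs(4.0, scale=10.0, size=500, random_state=rng)])

x0 = [1.5, 3.0, 3.0, 8.0, 0.5]  # shape1, scale1, shape2, scale2, weight
res = minimize(neg_log_lik, x0=x0, args=(x,),
               bounds=[(0.1, 20), (0.1, 50), (0.1, 20), (0.1, 50), (0.01, 0.99)],
               method="L-BFGS-B")
```

Goodness-of-fit of competing candidate distributions can then be compared with BIC, penalizing the extra parameters of the mixture.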
Lee, L.; Helsel, D.
2007-01-01
Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of the data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis", where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and computation of related confidence limits. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation or interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
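The standard trick for applying right-censored K-M routines to left-censored geoscience data is to flip the data about a constant larger than the maximum. A minimal sketch in Python (rather than the authors' S-language software; the concentrations below are invented):

```python
import numpy as np

def kaplan_meier(times, event):
    """Product-limit estimate of the survival function for right-censored data.

    times: observed values; event: 1 if observed (uncensored), 0 if censored.
    Returns the unique event times and S(t) just after each event time.
    """
    order = np.argsort(times)
    times, event = np.asarray(times)[order], np.asarray(event)[order]
    s, out_t, out_s = 1.0, [], []
    for t in np.unique(times[event == 1]):
        at_risk = np.sum(times >= t)                    # still "at risk" at t
        deaths = np.sum((times == t) & (event == 1))    # events exactly at t
        s *= 1.0 - deaths / at_risk
        out_t.append(t)
        out_s.append(s)
    return np.array(out_t), np.array(out_s)

# Left-censored concentrations ("<DL" nondetects) are handled by flipping the
# data about a constant M larger than the maximum, applying K-M, then flipping back.
conc = np.array([0.5, 1.2, 2.0, 3.1, 0.8])
detected = np.array([1, 1, 1, 1, 0])   # 0 = nondetect reported at its detection limit
M = conc.max() + 1.0
t_flip, s_flip = kaplan_meier(M - conc, detected)
```

Note the estimator only steps at observed values, consistent with the abstract's point that K-M performs no extrapolation or interpolation.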
Kaya, Yılmaz
2015-09-01
This paper proposes a novel approach to detecting epileptic seizures from electroencephalography (EEG), one of the most common methods for the diagnosis of epilepsy, based on the 1-Dimensional Local Binary Pattern (1D-LBP) and grey relational analysis (GRA) methods. The main aim of this paper is to evaluate and validate a novel computer-based quantitative EEG analysis method, based on grey systems, intended to support decision-makers. In this study, 1D-LBP, which utilizes all data points, was employed to extract features from raw EEG signals, and the Fisher score (FS) was employed to select the representative features, which can also be regarded as hidden patterns. Additionally, GRA was performed to classify EEG signals using these Fisher-scored features. The experimental results of the proposed approach, validated on a public dataset, showed high accuracy in identifying epileptic EEG signals: for the epileptic EEG cluster combinations A-E, B-E, C-E, D-E, and A-D, accuracies of 100, 96, 100, 99.00 and 100% were achieved, respectively. This work also presents an attempt to develop a new general-purpose hidden pattern determination scheme that can be utilized for different categories of time-varying signals.
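A minimal sketch of the 1D-LBP feature extraction step, under the common formulation in which each sample is compared with its neighbors to form a binary code (the radius and the toy signal are illustrative assumptions, not the paper's settings):

```python
def lbp_1d(signal, radius=4):
    """1D Local Binary Pattern: one 2*radius-bit code per interior sample."""
    codes = []
    for i in range(radius, len(signal) - radius):
        center = signal[i]
        # Neighbor indices: radius samples to the left, radius to the right
        neighbors = list(range(i - radius, i)) + list(range(i + 1, i + radius + 1))
        code = 0
        for k, j in enumerate(neighbors):
            code |= (signal[j] >= center) << k   # bit k is 1 if neighbor >= center
        codes.append(code)
    return codes

codes = lbp_1d([5, 1, 2, 3, 4, 0, 6, 7, 8, 9])
```

The histogram of these codes over a signal segment then serves as the feature vector to which the Fisher score selection and GRA classification are applied.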
Plasma and magnetospheric research
NASA Technical Reports Server (NTRS)
Comfort, R. H.; Horwitz, J. L.
1984-01-01
Methods employed in the analysis of plasmas and the magnetosphere are examined. Computer programs which generate distribution functions are used in the analysis of charging phenomena and non-Maxwellian plasmas in terms of density and average energy. An analytical model for spin curve analysis is presented. A program for the analysis of the differential ion flux probe on the space shuttle mission is complete. Satellite data analyses of ion heating, plasma flows in the polar cap, polar wind flow, and density and temperature profiles for several plasmasphere transits are included.
Life cycle cost evaluation of the digital opacity compliance system.
McFarland, Michael J; Palmer, Glenn R; Olivas, Arthur C
2010-01-01
The US Environmental Protection Agency (EPA) has established EPA Reference Method 9 (Method 9) as the preferred enforcement approach for verifying compliance with federal visible opacity standards. While Method 9 has an extensive history of successful employment, reliance on human observers to quantify visible emissions is inherently subjective, a characteristic that exposes Method 9 results to claims of inaccuracy, bias and, in some cases, outright fraud. The Digital Opacity Compliance System (DOCS), which employs commercial off-the-shelf digital photography coupled with simple computer processing, is a new approach for quantifying visible opacity. The DOCS technology has been previously demonstrated to meet and, in many cases, surpass the Method 9 accuracy and reliability standards (McFarland et al., 2006). Beyond its performance relative to Method 9, DOCS provides a permanent visual record of opacity, a vital feature in legal compliance challenges. In recent DOCS field testing, the opacity analysis of 241 regulated air emission sources from five industrial processes (industrial scrubbers, emergency generators, asphalt paving, steel production, and incineration) indicated that Method 9 and DOCS were statistically equivalent at the 99% confidence level. However, a life cycle cost analysis demonstrated that implementation of DOCS could potentially save a facility $15,732 per trained opacity observer compared to utilization of Method 9. Copyright 2009 Elsevier Ltd. All rights reserved.
Martinez Manzanera, Octavio; Elting, Jan Willem; van der Hoeven, Johannes H.; Maurits, Natasha M.
2016-01-01
In the clinic, tremor is diagnosed during a time-limited process in which patients are observed and the characteristics of tremor are visually assessed. For some tremor disorders, a more detailed analysis of these characteristics is needed. Accelerometry and electromyography can be used to obtain a better insight into tremor. Typically, routine clinical assessment of accelerometry and electromyography data involves visual inspection by clinicians and occasionally computational analysis to obtain objective characteristics of tremor. However, for some tremor disorders these characteristics may be different during daily activity. This variability in presentation between the clinic and daily life makes a differential diagnosis more difficult. A long-term recording of tremor by accelerometry and/or electromyography in the home environment could help to give a better insight into the tremor disorder. However, an evaluation of such recordings using routine clinical standards would take too much time. We evaluated a range of techniques that automatically detect tremor segments in accelerometer data, as accelerometer data is more easily obtained in the home environment than electromyography data. Time can be saved if clinicians only have to evaluate the tremor characteristics of segments that have been automatically detected in longer daily activity recordings. We tested four non-parametric methods and five parametric methods on clinical accelerometer data from 14 patients with different tremor disorders. The consensus between two clinicians regarding the presence or absence of tremor on 3943 segments of accelerometer data was employed as reference. The nine methods were tested against this reference to identify their optimal parameters. Non-parametric methods generally performed better than parametric methods on our dataset when optimal parameters were used. 
However, one parametric method, employing the high frequency content of the tremor bandwidth under consideration (High Freq) performed similarly to non-parametric methods, but had the highest recall values, suggesting that this method could be employed for automatic tremor detection. PMID:27258018
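A sketch of a band-power detector in the spirit of the parametric methods above (the tremor band, the 0.5 decision threshold, and the synthetic signals are illustrative assumptions, not the study's parameters):

```python
import numpy as np
from scipy.signal import welch

def tremor_band_ratio(accel, fs, band=(4.0, 12.0)):
    """Fraction of spectral power inside a (hypothetical) tremor band."""
    f, pxx = welch(accel, fs=fs, nperseg=min(len(accel), 256))
    in_band = (f >= band[0]) & (f <= band[1])
    return pxx[in_band].sum() / pxx.sum()

fs = 100.0                                  # accelerometer sampling rate, Hz
t = np.arange(0, 3.0, 1.0 / fs)
rng = np.random.default_rng(2)
tremor = np.sin(2 * np.pi * 6.0 * t) + 0.1 * rng.normal(size=t.size)  # 6 Hz tremor
rest = 0.1 * rng.normal(size=t.size)                                  # no tremor

# A segment would be classified as tremor when the ratio exceeds a chosen threshold
r_tremor = tremor_band_ratio(tremor, fs)
r_rest = tremor_band_ratio(rest, fs)
```

Running such a detector over consecutive segments of a long home recording yields the candidate tremor segments that clinicians would then review.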
Predictors of employment for young adults with developmental motor disabilities.
Magill-Evans, Joyce; Galambos, Nancy; Darrah, Johanna; Nickerson, Christy
2008-01-01
To identify the personal, family, and community factors that facilitate or hinder employment for young adults with developmental motor disabilities. Quantitative methods with an embedded qualitative component were used. Seventy-six persons between 20 and 30 years of age (mean = 25, SD = 3.1) with a diagnosis of either cerebral palsy or spina bifida completed questionnaires addressing factors such as depression, and participated in a semi-structured interview that allowed participants to describe their experiences with education, employment, transportation, and other services. Almost half of the participants (n = 35) were not currently employed. Hierarchical regression analyses indicated that gender (females were less likely to be employed), IQ (lower IQ was associated with unemployment), and transportation dependence accounted for 42% of the variance in employment. Themes emerging from content analysis of the interviews supported the findings related to transportation barriers. Social reactions to disability limited employment opportunities, and participants often felt stuck in terms of employment options, with limited opportunities for advancement. Transportation is a significant barrier to employment, and innovative solutions are needed. Issues related to gender need to be considered when addressing employment inequities for persons with primarily motor disabilities.
Medication management strategies used by older adults with heart failure: A systems-based analysis.
Mickelson, Robin S; Holden, Richard J
2017-09-01
Older adults with heart failure use strategies to cope with the constraining barriers impeding medication management. Strategies are behavioral adaptations that allow goal achievement despite these constraining conditions. When strategies do not exist, or are ineffective or maladaptive, medication performance and health outcomes are at risk. While constraints to medication adherence are described in the literature, the strategies patients use to manage medications are less well described or understood. Guided by cognitive engineering concepts, the aim of this study was to describe and analyze the strategies used by older adults with heart failure to achieve their medication management goals. This mixed-methods study employed an empirical strategies analysis method to elicit the medication management strategies used by older adults with heart failure. Observation and interview data collected from 61 older adults with heart failure and 31 caregivers were analyzed using qualitative content analysis to derive categories, patterns and themes within and across cases. Thematic sub-categories derived from the data described planned and ad hoc methods of strategic adaptation. Stable strategies proactively adjusted the medication management process, the environment, or the patients themselves. Patients applied situational strategies (planned or ad hoc) to irregular or unexpected situations. Medication non-adherence was itself a strategy, employed when life goals conflicted with medication adherence. The health system was a source of constraints without providing commensurate strategies. Patients strove to control their medication system and achieve goals using adaptive strategies. Future patient self-management research can benefit from the methods and theories used to study professional work, such as strategies analysis.
Analysis of a spacecraft instrument ball bearing assembly lubricated by a perfluoroalkylether
NASA Technical Reports Server (NTRS)
Morales, W.; Jones, W. R., Jr.; Buckley, D. H.
1986-01-01
An analysis of a spacecraft instrument ball bearing assembly, subjected to a scanning life test, was performed to determine the possible cause of rotational problems involving these units aboard several satellites. The analysis indicated an ineffective transfer of a fluorinated liquid lubricant from a phenolic retainer to the bearing balls. Part of the analysis led to a novel HPLC separation method employing a fluorinated mobile phase in conjunction with silica-based size exclusion columns.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.
The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
Spectroscopic Chemical Analysis Methods and Apparatus
NASA Technical Reports Server (NTRS)
Hug, William F. (Inventor); Lane, Arthur L. (Inventor); Bhartia, Rohit (Inventor); Reid, Ray D. (Inventor)
2017-01-01
Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.
Spectroscopic Chemical Analysis Methods and Apparatus
NASA Technical Reports Server (NTRS)
Hug, William F. (Inventor); Lane, Arthur L. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)
2018-01-01
Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.
Speech Emotion Feature Selection Method Based on Contribution Analysis Algorithm of Neural Network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang Xiaojia; Mao Qirong; Zhan Yongzhao
There are many emotion features. If all of these features are employed to recognize emotions, redundant features may exist; furthermore, the recognition result may be unsatisfactory and the cost of feature extraction is high. In this paper, a method to select speech emotion features based on the contribution analysis algorithm of a neural network (NN) is presented. The emotion features are selected from the 95 extracted features using the contribution analysis algorithm of the NN. Cluster analysis is applied to analyze the effectiveness of the selected features, and the time of feature extraction is evaluated. Finally, the 24 selected emotion features are used to recognize six speech emotions. The experiments show that this method can improve the recognition rate and reduce the time of feature extraction.
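The abstract does not specify the contribution formula. One common weight-based formulation, Garson's algorithm, can be sketched as follows (the network weights here are random placeholders standing in for a trained emotion-recognition NN):

```python
import numpy as np

def garson_contributions(W_ih, W_ho):
    """Garson's algorithm: relative contribution of each input feature.

    W_ih: input-to-hidden weights, shape (n_inputs, n_hidden).
    W_ho: hidden-to-output weights, shape (n_hidden,).
    """
    abs_ih = np.abs(W_ih)
    # Share of each input within each hidden unit...
    share = abs_ih / abs_ih.sum(axis=0, keepdims=True)
    # ...scaled by that hidden unit's connection strength to the output.
    weighted = share * np.abs(W_ho)
    contrib = weighted.sum(axis=1)
    return contrib / contrib.sum()             # normalize so contributions sum to 1

rng = np.random.default_rng(3)
W_ih = rng.normal(size=(95, 10))               # e.g. 95 extracted emotion features
W_ho = rng.normal(size=10)
c = garson_contributions(W_ih, W_ho)
```

Ranking features by contribution and keeping the top subset (24 in the study) is the selection step the abstract describes.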
Health assessment of self-employed in the food service industry.
Grégoris, Marina; Deschamps, Frédéric; Salles, Julie; Sanchez, Stéphane
2017-07-01
Objectives: This study's objective was to assess the morbidity of self-employed workers in the food service industry, an industry with a large amount of occupational health risks. Methods: A cross-sectional study, consisting of 437 participants, was conducted between 2011 and 2013 in Champagne-Ardenne, France. The health questionnaire included an interview, a clinical examination, and medical investigations. Results: The study population consisted of 146 self-employed workers (not working for an employer) and 291 employees (working with employment contracts for an employer). Logistic regression analysis revealed that self-employed workers had a higher morbidity than employees, after adjusting for age (OR: 3.45; 95% CI: 1.28 to 9.25). Main adverse health conditions were joint pain (71.2% self-employed vs. 38.1% employees, p < 0.001), ear disorders (54.1% self-employed vs. 33.7% employees, p < 0.001), and cardiovascular diseases (47.3% self-employed vs. 21% employees, p < 0.001). Conclusions: The study highlights the need for occupational health services for self-employed workers in France so that they may benefit from prevention of occupational risks and health surveillance. Results were presented to the self-employed healthcare insurance fund in order to establish an occupational health risks prevention system.
Performance-Based Seismic Design of Steel Frames Utilizing Colliding Bodies Algorithm
Veladi, H.
2014-01-01
A pushover analysis method based on the semirigid connection concept is developed, and the colliding bodies optimization algorithm is employed to find optimum seismic designs of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared to those of conventional design methods to show the strengths and weaknesses of the algorithm. PMID: 25202717
Performance-based seismic design of steel frames utilizing colliding bodies algorithm.
Veladi, H
2014-01-01
A pushover analysis method based on the semirigid connection concept is developed, and the colliding bodies optimization algorithm is employed to find optimum seismic designs of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared to those of conventional design methods to show the strengths and weaknesses of the algorithm.
ERIC Educational Resources Information Center
Johnson, Ian M.; Cano, Virginia
2008-01-01
Introduction: This paper draws on the results of studies undertaken between 2004 and 2007 as part of Project REVISTAS, supported by the European Commission's ALFA Programme. Method: A variety of methods was employed over the life of the project, including analysis of directories, a survey of universities in the region believed to be offering…
Redshift data and statistical inference
NASA Technical Reports Server (NTRS)
Newman, William I.; Haynes, Martha P.; Terzian, Yervant
1994-01-01
Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.
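The over-binning caution can be demonstrated directly: featureless uniform data shows much larger relative bin-count fluctuations when far more class intervals are used than rules of thumb such as Sturges' suggest (an illustrative simulation, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(4)
data = rng.uniform(0, 1, size=200)        # no real structure or periodicity

few = np.histogram(data, bins=8)[0]       # near Sturges' rule: 1 + log2(200) ~ 9 bins
many = np.histogram(data, bins=100)[0]    # far too many class intervals

# Relative scatter of bin counts: large under over-binning, inviting the
# appearance of spurious "periodicities" in featureless data.
rel_few = few.std() / few.mean()
rel_many = many.std() / many.mean()
```

With ~2 counts expected per bin in the over-binned case, Poisson scatter alone produces apparent peaks and gaps that the eye readily mistakes for structure.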
Kalafut, P; Kucera, R; Klimes, J; Sochor, J
2009-07-12
3-[4-(2-Methylpropyl)phenyl]propanoic acid has been introduced as impurity F to the European Pharmacopoeia in its Supplement 4.2. In contrast to the other impurities, which are evaluated by HPLC, the content of impurity F is determined by gas chromatography after prior derivatization. Thus a novel reversed-phase HPLC method was developed to simplify the evaluation of pharmacopoeial impurity F of ibuprofen. Favourable properties of zirconia stationary phases were employed for this purpose. The HPLC separation was achieved on a Zr-CARB column (150 mm x 4.6 mm i.d., 5 microm) using an acetonitrile-phosphate buffer (pH 3.5, 25 mM) mobile phase (38:62, v/v), a temperature of 80 degrees C and a flow rate of 1.2 ml min(-1). Fluorescence detection was employed to enhance the sensitivity of the method. Optimal detection parameters were chosen on the basis of the fluorescence spectra of the analytes; the excitation and emission wavelengths were 220 nm and 285 nm, respectively. The analysis was completed within 25 min. The subsequent validation confirmed the applicability of the method for the analytical assay of impurity F.
Menoutis, James; Parisi, Angela; Verma, Natasha
2018-04-15
In efforts to control the potential presence of heavy metals in pharmaceuticals, the United States Pharmacopeia (USP) and International Conference on Harmonization (ICH) have put forth new requirements and guidelines for their control. The new requirements and guidelines establish permitted daily exposures (PDEs) for 24 heavy metals/elemental impurities (EI) based upon their toxicological properties. USP General Chapter 〈233〉 provides a general reference procedure for preparing pharmaceutical samples for analysis employing microwave-assisted digestion (MWAD). It also provides two compendial procedures: Procedure 1, employing ICP-AES, and Procedure 2, employing ICP-MS. Given the extremely low detection limits afforded by ICP-MS, much work has been done in developing and evaluating analytical methods to support the analysis of elemental impurities in finished pharmaceutical products, active pharmaceutical ingredients, and excipients by this analytical technique. In this study, we have evaluated the use of axial ICP-AES employing ultrasonic nebulization (UN), instead of traditional pneumatic nebulization, for the determination of Class 1 and 2 EI. The study also employed closed-vessel MWAD to prepare samples for analysis. Limits of quantitation were element specific and significantly lower than the PDEs for oral drugs. Spike recoveries for the elements studied ranged between 89.3% and 109.25%, except for Os, which was subject to OsO4 formation during MWAD. The use of axial ICP-AES with UN provides an alternative to ICP-MS in the analysis of EI requiring low detection limits. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
da Silva, Jorge Alberto Valle; Modesto-Costa, Lucas; de Koning, Martijn C.; Borges, Itamar; França, Tanos Celmar Costa
2018-01-01
In this work, quaternary and non-quaternary oximes designed to bind at the peripheral site of acetylcholinesterase previously inhibited by organophosphates were investigated theoretically. Some of those oximes have a large number of degrees of freedom, thus requiring an accurate method to obtain molecular geometries. For this reason, the density functional theory (DFT) was employed to refine their molecular geometries after conformational analysis and to compare their 1H and 13C nuclear magnetic resonance (NMR) theoretical signals in gas-phase and in solvent. A good agreement with experimental data was achieved and the same theoretical approach was employed to obtain the geometries in water environment for further studies.
Nonlinear Dynamic Behavior of Impact Damage in a Composite Skin-Stiffener Structure
NASA Technical Reports Server (NTRS)
Ooijevaar, T. H.; Rogge, M. D.; Loendersloot, R.; Warnet, L.; Akkerman, R.; deBoer, A.
2013-01-01
One of the key issues in composite structures for aircraft applications is the early identification of damage. Often, service induced damage does not involve visible plastic deformation, but internal matrix related damage, like delaminations. A wide range of technologies, comprising global vibration and local wave propagation methods can be employed for health monitoring purposes. Traditional low frequency modal analysis based methods are linear methods. The effectiveness of these methods is often limited since they rely on a stationary and linear approximation of the system. The nonlinear interaction between a low frequency wave field and a local impact induced skin-stiffener failure is experimentally demonstrated in this paper. The different mechanisms that are responsible for the nonlinearities (opening, closing and contact) of the distorted harmonic waveforms are separated with the help of phase portraits. A basic analytical model is employed to support the observations.
Application of an enriched FEM technique in thermo-mechanical contact problems
NASA Astrophysics Data System (ADS)
Khoei, A. R.; Bahmani, B.
2018-02-01
In this paper, an enriched FEM technique is employed for the thermo-mechanical contact problem based on the extended finite element method. A fully coupled thermo-mechanical contact formulation is presented in the framework of the X-FEM technique that takes into account deformable continuum mechanics and transient heat transfer analysis. The Coulomb frictional law is applied for the mechanical contact problem, and a pressure-dependent thermal contact model is employed through an explicit formulation in the weak form of the X-FEM method. The equilibrium equations are discretized by the Newmark time-splitting method, and the final set of non-linear equations is solved by the Newton-Raphson method using a staggered algorithm. Finally, in order to illustrate the capability of the proposed computational model, several numerical examples are solved and the results are compared with those reported in the literature.
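The Newton-Raphson solve at the heart of such schemes can be sketched for a small generic nonlinear system (an illustration only, not the paper's coupled thermo-mechanical equations):

```python
import numpy as np

def newton_raphson(residual, jacobian, u0, tol=1e-10, max_iter=25):
    """Solve residual(u) = 0 by Newton-Raphson iteration."""
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        r = residual(u)
        if np.linalg.norm(r) < tol:
            return u
        du = np.linalg.solve(jacobian(u), -r)   # linearized correction step
        u = u + du
    raise RuntimeError("Newton-Raphson did not converge")

# A small nonlinear system standing in for the discretized equations:
#   u0^2 + u1 - 3 = 0,   u0 + u1^3 - 5 = 0
residual = lambda u: np.array([u[0]**2 + u[1] - 3.0, u[0] + u[1]**3 - 5.0])
jacobian = lambda u: np.array([[2 * u[0], 1.0], [1.0, 3 * u[1]**2]])
u = newton_raphson(residual, jacobian, [1.0, 1.0])
```

In a staggered thermo-mechanical algorithm, a solve of this form is performed for one field (e.g. displacement) while the other (temperature) is held fixed, alternating within each Newmark time step.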
Probability and Cumulative Density Function Methods for the Stochastic Advection-Reaction Equation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barajas-Solano, David A.; Tartakovsky, Alexandre M.
We present a cumulative density function (CDF) method for the probabilistic analysis of d-dimensional advection-dominated reactive transport in heterogeneous media. We employ a probabilistic approach in which epistemic uncertainty on the spatial heterogeneity of Darcy-scale transport coefficients is modeled in terms of random fields with given correlation structures. Our proposed CDF method employs a modified Large-Eddy-Diffusivity (LED) approach to close and localize the nonlocal equations governing the one-point PDF and CDF of the concentration field, resulting in a (d + 1)-dimensional PDE. Compared to the classical LED localization, the proposed modified LED localization explicitly accounts for the mean-field advective dynamics over the phase space of the PDF and CDF. To illustrate the accuracy of the proposed closure, we apply our CDF method to one-dimensional single-species reactive transport with uncertain, heterogeneous advection velocities and reaction rates modeled as random fields.
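For contrast with such closed CDF equations, the concentration CDF can also be estimated by brute-force Monte Carlo. The sketch below uses first-order decay with a random rate as a simple stand-in for the full advection-reaction problem (all distributions are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)

# Uncertain first-order reaction rate: a stand-in for the random coefficient
# fields (velocity, rate) of the full advection-reaction problem.
k = rng.lognormal(mean=-1.0, sigma=0.5, size=10_000)
c0, t = 1.0, 2.0
c = c0 * np.exp(-k * t)                   # concentration samples at time t

def empirical_cdf(samples, x):
    """Monte Carlo estimate of F(x) = P(C <= x)."""
    return np.mean(samples <= x)

F_half = empirical_cdf(c, 0.5)            # probability the concentration is <= 0.5
```

A CDF method replaces this sampling with a single deterministic PDE for F, which is the efficiency argument motivating the closure above.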
Szymańska, Ewa; Tinnevelt, Gerjen H; Brodrick, Emma; Williams, Mark; Davies, Antony N; van Manen, Henk-Jan; Buydens, Lutgarde M C
2016-08-05
Current challenges of clinical breath analysis include large data size and non-clinically relevant variations observed in exhaled breath measurements, which urgently need to be addressed with competent data analysis tools. In this study, three different baseline correction methods are evaluated within a previously developed data size reduction strategy for multi capillary column - ion mobility spectrometry (MCC-IMS) datasets. Introduced for the first time in breath data analysis, the Top-hat method is presented as the optimum baseline correction method. A refined data size reduction strategy is employed in the analysis of a large breathomics dataset covering healthy subjects and patients with respiratory disease. New insights into MCC-IMS spectral differences associated with respiratory diseases are provided, demonstrating the additional value of the refined data analysis strategy in clinical breath analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
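The Top-hat baseline correction named above is, in morphological terms, the white top-hat transform: the signal minus its grey-scale opening, which suppresses a slowly varying baseline while preserving narrow peaks. A minimal 1-D sketch on synthetic data (the spectrum, peak width, and structuring-element size are illustrative, not taken from the study):

```python
import numpy as np
from scipy.ndimage import white_tophat

# synthetic spectrum: a narrow peak riding on a broad drifting baseline
x = np.linspace(0, 1, 1000)
baseline = 2.0 + 1.5*x                      # slow baseline drift
peak = np.exp(-0.5*((x - 0.5)/0.005)**2)    # narrow Gaussian peak
signal = baseline + peak

# top-hat = signal minus its morphological opening; the structuring-element
# size must exceed the peak width but stay below the baseline's scale
corrected = white_tophat(signal, size=51)
```

After the transform the drifting baseline is flattened to near zero while the peak height is essentially preserved.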
FINITE-ELEMENT ANALYSIS OF MULTIPHASE IMMISCIBLE FLOW THROUGH SOILS
A finite-element model is developed for multiphase flow through soil involving three immiscible fluids: namely, air, water, and a nonaqueous phase liquid (NAPL). A variational method is employed for the finite-element formulation corresponding to the coupled differential equation...
Monitoring of bone regeneration process by means of texture analysis
NASA Astrophysics Data System (ADS)
Kokkinou, E.; Boniatis, I.; Costaridou, L.; Saridis, A.; Panagiotopoulos, E.; Panayiotakis, G.
2009-09-01
An image analysis method is proposed for monitoring the regeneration of the tibial bone. For this purpose, 130 digitized radiographs of 13 patients who had undergone tibial lengthening by the Ilizarov method were studied. For each patient, 10 radiographs, taken at an equal number of successive postoperative time points, were available. Employing available software, 3 Regions Of Interest (ROIs), corresponding to the (a) upper, (b) central, and (c) lower aspect of the gap where bone regeneration was expected to occur, were determined on each radiograph. Employing custom-developed algorithms, (i) a number of textural features were generated from each of the ROIs, and (ii) a texture-feature-based regression model was designed for the quantitative monitoring of the bone regeneration process. Statistically significant differences (p < 0.05) were found between the initial and final textural feature values, generated from the first and last postoperatively obtained radiographs, respectively. A quadratic polynomial regression equation fitted the data adequately (r2 = 0.9, p < 0.001). The suggested method may contribute to the monitoring of the tibial bone regeneration process.
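The quadratic polynomial regression step can be sketched as follows; the feature values below are hypothetical stand-ins, not the study's data:

```python
import numpy as np

# illustrative stand-in for one textural-feature time series (10 follow-ups)
t = np.arange(1, 11, dtype=float)                # postoperative time points
feature = 5.0 - 0.8*t + 0.05*t**2                # hypothetical trend
feature += np.random.default_rng(0).normal(0, 0.05, t.size)

coeffs = np.polyfit(t, feature, deg=2)           # quadratic polynomial fit
pred = np.polyval(coeffs, t)
ss_res = np.sum((feature - pred)**2)             # residual sum of squares
ss_tot = np.sum((feature - feature.mean())**2)   # total sum of squares
r2 = 1.0 - ss_res/ss_tot                         # coefficient of determination
```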
A solution-adaptive hybrid-grid method for the unsteady analysis of turbomachinery
NASA Technical Reports Server (NTRS)
Mathur, Sanjay R.; Madavan, Nateri K.; Rajagopalan, R. G.
1993-01-01
A solution-adaptive method for the time-accurate analysis of two-dimensional flows in turbomachinery is described. The method employs a hybrid structured-unstructured zonal grid topology in conjunction with appropriate modeling equations and solution techniques in each zone. The viscous flow region in the immediate vicinity of the airfoils is resolved on structured O-type grids while the rest of the domain is discretized using an unstructured mesh of triangular cells. Implicit, third-order accurate, upwind solutions of the Navier-Stokes equations are obtained in the inner regions. In the outer regions, the Euler equations are solved using an explicit upwind scheme that incorporates a second-order reconstruction procedure. An efficient and robust grid adaptation strategy, including both grid refinement and coarsening capabilities, is developed for the unstructured grid regions. Grid adaptation is also employed to facilitate information transfer at the interfaces between unstructured grids in relative motion. Results for grid adaptation to various features pertinent to turbomachinery flows are presented. Good comparisons between the present results and experimental measurements and earlier structured-grid results are obtained.
NASA Astrophysics Data System (ADS)
Laib dit Leksir, Y.; Mansour, M.; Moussaoui, A.
2018-03-01
Analysis and processing of databases obtained from infrared thermal inspections of electrical installations require the development of new tools that extract more information than visual inspection alone. Consequently, methods based on the capture of thermal images show great potential and are increasingly employed in this field. However, effective techniques are needed to analyse these databases and extract significant information on the state of the infrastructure. This paper explains how such an approach can be implemented and proposes a system that can help detect faults in thermal images of electrical installations. The proposed method classifies and identifies the region of interest (ROI). The identification is conducted using a support vector machine (SVM) algorithm. The aim is to capture the faults present in electrical equipment during an inspection of several machines using a FLIR A40 camera. Binarization techniques are then employed to select the region of interest. Finally, the misclassification errors of the proposed method are compared with those of fuzzy c-means and Otsu thresholding.
ERIC Educational Resources Information Center
Ayon, Cecilia; Lee, Cheryl D.
2005-01-01
Objective: The purpose of this study was to determine whether differences exist among 88 African American, Caucasian, and Latino families who received child welfare services. Method: A secondary data analysis of cross-sectional survey data employing standardized measures was used for this study. Family preservation (FP) services were received by 49…
A Metaphor Analysis of the Fifth Grade Students' Perceptions about Writing
ERIC Educational Resources Information Center
Erdogan, Tolga; Erdogan, Özge
2013-01-01
The aim of this study is to examine the fifth grade students' perceptions about writing through metaphor analysis. This is a descriptive research in nature, and a qualitative research method was employed in the study. The participants of the study are a total of 594 fifth graders in the city of Ankara. The students are asked to complete the…
ERIC Educational Resources Information Center
Soloway, Irv
An approach to the study of drug sub-culture groups and a model for predictive research in the identification and isolation of heroin addicts are developed in this thesis. The basic methodologies employed are the linguistic methods of Kenneth Pike and Claude Levi-Strauss for use in the analysis of social phenomena. Communicative mechanisms by…
Rehabilitation Associate Training for Employed Staff. Task Analysis (RA-2).
ERIC Educational Resources Information Center
Davis, Michael J.; Jensen, Mary
This learning module, which is intended for use in in-service training for vocational rehabilitation counselors, deals with writing a task analysis. Step-by-step guidelines are provided for breaking down a task into small teachable steps by analyzing the task in terms of the way in which it will be performed once learned (method), the steps to be…
Exploring Cultural Predictors of Military Intervention Success
2015-04-01
research employed a sequential, mixed-methods analysis consisting of a quantitative ex post facto analysis of United Nations (UN) interventions... research. Results In spite of the many assumptions and limitations forced upon the research by its ex post facto design, it nonetheless provided some... post facto exploration of predictors of military intervention success. As such, the research examined pre- and post-intervention
ERIC Educational Resources Information Center
Weaver, Russell
2016-01-01
This article reports on an analysis of the effects of a quasi-natural experiment in which 16 rural communities participated in public discussion, leadership training, and community visioning as part of an Extension program at Montana State University. Difference-in-differences methods reveal that key U.S. Census socioeconomic indicators either…
Comparative policy analysis for alcohol and drugs: Current state of the field.
Ritter, Alison; Livingston, Michael; Chalmers, Jenny; Berends, Lynda; Reuter, Peter
2016-05-01
A central policy research question concerns the extent to which specific policies produce certain effects, and cross-national (or between-state/province) comparisons appear to be an ideal way to answer such a question. This paper explores the current state of comparative policy analysis (CPA) with respect to alcohol and drug policies. We created a database of journal articles published between 2010 and 2014 as the body of CPA work for analysis. We used this database of 57 articles to clarify, extract and analyse the ways in which CPA has been defined. Quantitative and qualitative analyses of the CPA methods employed, the policy areas that have been studied, and differences between alcohol CPA and drug CPA are explored. There is a lack of clear definition as to what counts as a CPA. The two criteria for a CPA (explicit study of a policy, and comparison across two or more geographic locations) exclude descriptive epidemiology and single-state comparisons. Under this strict definition, most CPAs concerned alcohol (42%), although the single most commonly analysed policy was medical cannabis (23%). The vast majority of papers undertook quantitative data analysis, with a variety of advanced statistical methods. We identified five approaches to policy specification: classification or categorical coding of policy as present or absent; the use of an index; implied policy differences; described policy differences; and data-driven policy coding. Each of these has limitations, but perhaps the most common was the methods' inability to account for differences between policy-as-stated and policy-as-implemented. There is significant diversity in CPA methods for the analysis of alcohol and drug policies, and some substantial challenges with the currently employed methods.
The absence of clear boundaries to a definition of what counts as a 'comparative policy analysis' may account for the methodological plurality but also appears to stand in the way of advancing the techniques. Copyright © 2016 Elsevier B.V. All rights reserved.
Barnes, Stephen; Benton, H Paul; Casazza, Krista; Cooper, Sara J; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K; Renfrow, Matthew B; Tiwari, Hemant K
2016-08-01
Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. Copyright © 2016 John Wiley & Sons, Ltd.
Hirayama, Junichi; Tazumi, Akihiro; Hayashi, Kyohei; Tasaki, Erina; Kuribayashi, Takashi; Moore, John E; Millar, Beverley C; Matsuda, Motoo
2011-06-01
In the present study, the reliability of full-length gene sequence information for several genes, including 16S rRNA, was examined for the discrimination of the two representative Campylobacter lari taxa, namely urease-negative (UN) C. lari and urease-positive thermophilic Campylobacter (UPTC). As previously described, 16S rRNA gene sequences are not reliable for the molecular discrimination of UN C. lari from UPTC organisms, employing either the unweighted pair group method with arithmetic means (UPGMA) or the neighbor-joining (NJ) method. Three composite full-length gene sequences (ciaB, flaC and vacJ) out of the seven gene loci examined were reliable for the discrimination between UN C. lari and UPTC organisms, as well as among four thermophilic Campylobacter species, employing dendrograms constructed by the UPGMA method. In contrast, none of the NJ phylogenetic trees constructed from the nine genes was reliable for the discrimination. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
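UPGMA is agglomerative clustering with average linkage applied to a pairwise distance matrix. A minimal SciPy sketch with hypothetical distances between four isolates (illustrative values, not the study's measured sequence divergences):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# hypothetical pairwise distances among two UN C. lari and two UPTC isolates
labels = ["UN1", "UN2", "UPTC1", "UPTC2"]
D = np.array([[0.00, 0.02, 0.10, 0.11],
              [0.02, 0.00, 0.11, 0.10],
              [0.10, 0.11, 0.00, 0.03],
              [0.11, 0.10, 0.03, 0.00]])

# UPGMA = hierarchical clustering with average linkage on the distances
Z = linkage(squareform(D), method="average")
clusters = fcluster(Z, t=2, criterion="maxclust")   # cut into two taxa
```

Cutting the dendrogram at two clusters should recover the UN/UPTC split when the between-taxon distances exceed the within-taxon ones.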
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hatakeyama, Keisuke, E-mail: hatakeyamak@pref.tottori.jp; Okuda, Masukazu; Kuki, Takahiro
2012-12-15
Highlights: ► The photocatalytic property of silver orthophosphate (Ag₃PO₄) was investigated for humic acid degradation. ► Ag₃PO₄ shows high photocatalytic activity under visible light. ► The photocatalytic activity was greatly improved by employing the precipitation method. -- Abstract: In order to remove dissolved organic matter such as humic acid from water, silver orthophosphate (Ag₃PO₄) was newly employed as a heterogeneous photocatalyst. Here, Ag₃PO₄ was prepared by simple ion-exchange and precipitation methods, and its physico-chemical properties were characterized by X-ray diffraction, ultraviolet–visible diffuse reflectance spectroscopy, scanning electron microscopy, particle distribution measurements and Brunauer–Emmett–Teller (BET) analysis. The degradation of humic acid was faster over the Ag₃PO₄ catalyst than over conventional TiO₂ (P-25). The overall photocatalytic properties were improved by employing the precipitation method rather than the ion-exchange method; humic acid degradation achieved a dissolved organic carbon removal ratio of 75% under visible light (λ = 451 nm) after 2 h of irradiation.
NASA Astrophysics Data System (ADS)
Cheruku, Rajesh; Govindaraj, G.; Vijayan, Lakshmi
2017-12-01
Nanocrystalline lithium ferrite was synthesized by wet chemical methods, namely solution combustion, sol-gel, and hydrothermal techniques, for a comparative study. Characterization techniques such as X-ray powder diffraction and thermal analysis were employed to confirm the structure and phase. Temperature-dependent Raman analysis was employed to classify the phonon modes associated with specific atomic motions in the synthesized materials. The morphology of the sample surface was explored by scanning electron microscopy, and elemental analysis was done by energy-dispersive spectroscopy. The nanocrystalline nature of the materials was confirmed through transmission electron microscopy. Magnetic properties of the samples were explored with a vibrating sample magnetometer. AC electrical impedance spectroscopy data were fitted using two Cole-Cole functions, and activation energies were calculated for all materials. Among them, the solution-combustion-prepared lithium ferrite shows the highest conductivity and lowest activation energy.
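The Cole-Cole analysis mentioned above models each relaxation process as eps*(w) = eps_inf + d_eps / (1 + (i·w·tau)^(1-alpha)), and the impedance data were fitted with a sum of two such dispersions. A minimal evaluation sketch with illustrative parameters (not the paper's fitted values):

```python
import numpy as np

def cole_cole(omega, eps_inf, d_eps, tau, alpha):
    """One Cole-Cole dispersion: eps*(w) = eps_inf + d_eps/(1+(i*w*tau)^(1-alpha))."""
    return eps_inf + d_eps / (1.0 + (1j*omega*tau)**(1.0 - alpha))

omega = np.logspace(-2, 8, 400)   # angular frequency sweep (rad/s)
# illustrative parameters for two relaxation processes (e.g. grain and
# grain-boundary contributions); all values are hypothetical
spectrum = (cole_cole(omega, 5.0, 40.0, 1e-3, 0.15)
            + cole_cole(omega, 0.0, 120.0, 1e-6, 0.25))
```

Sanity checks on the model: the low-frequency limit approaches eps_inf plus both dispersion strengths, the high-frequency limit approaches eps_inf, and the imaginary (loss) part stays non-positive.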
NASA Astrophysics Data System (ADS)
Florindo, João Batista
2018-04-01
This work proposes the use of Singular Spectrum Analysis (SSA) for the classification of texture images, more specifically, to enhance the performance of the Bouligand-Minkowski fractal descriptors in this task. Fractal descriptors are known to be a powerful approach to model and, in particular, identify complex patterns in natural images. Nevertheless, the multiscale analysis involved in those descriptors makes them highly correlated. Although other attempts to address this point were proposed in the literature, none of them investigated the relation between the fractal correlation and the well-established analyses employed for time series, of which SSA is one of the most powerful techniques. The proposed method was employed for the classification of benchmark texture images and the results were compared with other state-of-the-art classifiers, confirming the potential of this analysis in image classification.
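Basic SSA proceeds by embedding the series in a Hankel trajectory matrix, taking its SVD, and reconstructing selected components by diagonal averaging. A minimal sketch on a 1-D series (window length and component count are illustrative; the paper applies the idea to fractal descriptor sequences):

```python
import numpy as np

def ssa_reconstruct(x, window, ncomp):
    """Basic SSA: Hankel embedding, SVD, reconstruction from leading components."""
    n = len(x)
    k = n - window + 1
    # trajectory (Hankel) matrix: lagged copies of the series as columns
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :ncomp] * s[:ncomp]) @ Vt[:ncomp]   # rank-ncomp approximation
    # diagonal (Hankel) averaging maps the matrix back to a series
    rec = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        rec[j:j + window] += Xr[:, j]
        counts[j:j + window] += 1.0
    return rec / counts
```

A pure sinusoid occupies exactly two SSA components, so reconstructing from the two leading components denoises a noisy sine.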
Analysis of Genome-Wide Association Studies with Multiple Outcomes Using Penalization
Liu, Jin; Huang, Jian; Ma, Shuangge
2012-01-01
Genome-wide association studies have been extensively conducted, searching for markers for biologically meaningful outcomes and phenotypes. Penalization methods have been adopted in the analysis of the joint effects of a large number of SNPs (single nucleotide polymorphisms) and in marker identification. This study is partly motivated by the analysis of a heterogeneous stock mice dataset, in which multiple correlated phenotypes and a large number of SNPs are available. Existing penalization methods designed to analyze a single response variable cannot accommodate the correlation among multiple response variables. With multiple response variables sharing the same set of markers, joint modeling is first employed to accommodate the correlation. The group Lasso approach is adopted to select markers associated with all the outcome variables, and an efficient computational algorithm is developed. A simulation study and analysis of the heterogeneous stock mice dataset show that the proposed method can outperform existing penalization methods. PMID:23272092
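The group Lasso's defining operation is block soft-thresholding: each marker's coefficient block (one coefficient per outcome) is shrunk jointly, so a SNP is selected or dropped for all phenotypes at once. A minimal sketch of the proximal operator (not the paper's full algorithm; the grouping and penalty value are illustrative):

```python
import numpy as np

def group_soft_threshold(beta, groups, lam):
    """Proximal operator of the group-lasso penalty: block soft-thresholding.

    Coefficients sharing a group label are shrunk toward zero jointly, so a
    marker is either kept or zeroed out for every outcome simultaneously.
    """
    out = np.zeros_like(beta)
    for g in set(groups):
        idx = np.flatnonzero(np.asarray(groups) == g)
        norm = np.linalg.norm(beta[idx])
        if norm > lam:                       # shrink the whole block
            out[idx] = (1.0 - lam/norm) * beta[idx]
        # else: the entire group is set exactly to zero
    return out
```

For example, a block with Euclidean norm 5 and penalty 1 is scaled by 0.8, while a block with norm below the penalty is eliminated entirely.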
An exploration of function analysis and function allocation in the commercial flight domain
NASA Technical Reports Server (NTRS)
Mcguire, James C.; Zich, John A.; Goins, Richard T.; Erickson, Jeffery B.; Dwyer, John P.; Cody, William J.; Rouse, William B.
1991-01-01
The applicability of function analysis methods to support cockpit design is explored. Specifically, alternative techniques are studied for ensuring an effective division of responsibility between the flight crew and automation. A functional decomposition of the commercial flight domain is performed to provide the information necessary to support allocation decisions and to demonstrate methodology for allocating functions to the flight crew or to automation. The function analysis employed 'bottom up' and 'top down' analyses and demonstrated the comparability of the identified functions, using the 'lift off' segment of the 'take off' phase as a test case. The normal flight mission and selected contingencies were addressed. Two alternative methods for using the functional description in the allocation of functions between man and machine were investigated and compared in order to ascertain their relative strengths and weaknesses. Finally, conclusions were drawn regarding the practical utility of function analysis methods.
Elastostatic stress analysis of orthotropic rectangular center-cracked plates
NASA Technical Reports Server (NTRS)
Gyekenyesi, G. S.; Mendelson, A.
1972-01-01
A mapping-collocation method was developed for the elastostatic stress analysis of finite, anisotropic plates with centrally located traction-free cracks. The method essentially consists of mapping the crack into the unit circle and satisfying the crack boundary conditions exactly with the help of Muskhelishvili's function extension concept. The conditions on the outer boundary are satisfied approximately by applying the method of least-squares boundary collocation. A parametric study of finite-plate stress intensity factors, employing this mapping-collocation method, is presented. It shows the effects of varying material properties, orientation angle, and crack-length-to-plate-width and plate-height-to-plate-width ratios for rectangular orthotropic plates under constant tensile and shear loads.
A Study on Predictive Analytics Application to Ship Machinery Maintenance
2013-09-01
Looking at the nature of the time series forecasting method, it would be better applied to offline analysis. The application for real-time online...other system attributes in future. Two techniques of statistical analysis, mainly time series models and cumulative sum control charts, are discussed in...statistical tool employed for the two techniques of statistical analysis. Both time series forecasting as well as CUSUM control charts are shown to be
Audio signal analysis for tool wear monitoring in sheet metal stamping
NASA Astrophysics Data System (ADS)
Ubhayaratne, Indivarie; Pereira, Michael P.; Xiang, Yong; Rolfe, Bernard F.
2017-02-01
Stamping tool wear can significantly degrade product quality, and hence online tool condition monitoring is a timely need in many manufacturing industries. Even though a large amount of research has been conducted employing different sensor signals, there is still an unmet demand for a low-cost, easy-to-set-up condition monitoring system. Audio signal analysis is a simple method that has the potential to meet this demand but has not previously been used for stamping process monitoring. Hence, this paper studies the existence and significance of the correlation between emitted sound signals and the wear state of sheet metal stamping tools. The corrupting sources generated by the tooling of the stamping press and surrounding machinery have higher amplitudes than the sound emitted by the stamping operation itself. Therefore, a newly developed semi-blind signal extraction technique was employed as a pre-processing step to mitigate the contribution of these corrupting sources. The spectral analysis results of the raw and extracted signals demonstrate a significant qualitative relationship between wear progression and the emitted sound signature. This study lays the basis for employing low-cost audio signal analysis in the development of a real-time industrial tool condition monitoring system.
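The spectral-analysis step can be sketched with a standard Welch power spectral density estimate; the sampling rate, tone frequency, and frame below are illustrative stand-ins for the press audio, and the paper's semi-blind extraction itself is not reproduced:

```python
import numpy as np
from scipy.signal import welch

fs = 44100                                  # audio sampling rate (Hz), assumed
t = np.arange(0, 1.0, 1.0/fs)
rng = np.random.default_rng(0)
# illustrative stand-in for one press-cycle audio frame: a tonal component
# (e.g. a wear-related resonance) buried in broadband background noise
frame = 0.2*np.sin(2*np.pi*3000*t) + rng.normal(0.0, 1.0, t.size)

# Welch PSD: average periodograms of overlapping windowed segments
freqs, psd = welch(frame, fs=fs, nperseg=4096)
peak_freq = freqs[np.argmax(psd)]           # dominant spectral component
```

Averaging over segments lowers the variance of the noise floor, so even a weak tonal signature stands out as a clear spectral peak.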
Schwarz, L.K.; Runge, M.C.
2009-01-01
Age estimation of individuals is often an integral part of species management research, and a number of age-estimation techniques are commonly employed. Often, the error in these techniques is not quantified or accounted for in other analyses, particularly in the growth curve models used to describe physiological responses to environment and human impacts. Noninvasive, quick, and inexpensive methods to estimate age are also needed. This research aims to provide two Bayesian methods to (i) incorporate age uncertainty into an age-length Schnute growth model and (ii) derive from the growth model a method to estimate age from length. The methods are then applied to Florida manatee (Trichechus manatus) carcasses. After quantifying the uncertainty in the aging technique (counts of ear bone growth layers), we fit age-length data to the Schnute growth model separately by sex and season. Independent prior information about population age structure and the results of the Schnute model are then combined to estimate age from length. Results describing the age-length relationship agree with our understanding of manatee biology. The new methods allow us to estimate age, with quantified uncertainty, for 98% of collected carcasses: 36% from ear bones and 62% from length.
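One common parameterization of the Schnute growth model (the case with both shape parameters nonzero) expresses size at age t through the sizes y1, y2 at two reference ages. A minimal least-squares fitting sketch on synthetic data (all parameter values are illustrative, not manatee estimates; the paper's Bayesian treatment of age uncertainty is not reproduced):

```python
import numpy as np
from scipy.optimize import curve_fit

T1, T2 = 0.0, 30.0   # reference ages (years) bracketing the data, assumed

def schnute(t, y1, y2, a, b):
    """Schnute growth model (case a != 0, b != 0); y1, y2 are sizes at T1, T2."""
    frac = (1.0 - np.exp(-a*(t - T1))) / (1.0 - np.exp(-a*(T2 - T1)))
    return (y1**b + (y2**b - y1**b)*frac)**(1.0/b)

# illustrative synthetic age-length data (not real measurements)
rng = np.random.default_rng(0)
ages = np.linspace(0.0, 30.0, 60)
true = schnute(ages, 120.0, 330.0, 0.15, 0.8)
lengths = true + rng.normal(0.0, 2.0, ages.size)

# nonlinear least squares; a Bayesian fit would place priors on y1, y2, a, b
popt, _ = curve_fit(schnute, ages, lengths, p0=[100.0, 300.0, 0.1, 1.0])
```

Inverting the fitted curve then gives a length-to-age estimate, which is the deterministic core of the paper's second method.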
NASA Astrophysics Data System (ADS)
Şahan, Mehmet Fatih
2017-11-01
In this paper, the viscoelastic damped response of cross-ply laminated shallow spherical shells is investigated numerically in a transformed Laplace space. In the proposed approach, the governing differential equations of the cross-ply laminated shallow spherical shell are derived using the dynamic version of the principle of virtual displacements. Following this, the Laplace transform is employed in the transient analysis of the viscoelastic laminated shell problem; damping can also be incorporated with ease in the transformed domain. The transformed time-independent equations in the spatial coordinate are solved numerically by Gauss elimination. Numerical inverse transformation of the results into the real domain is performed by the modified Durbin transform method. Verification of the presented method is carried out by comparing the results with those obtained by the Newmark method and the ANSYS finite element software. Furthermore, the developed solution approach is applied to problems with several impulsive loads. The novelty of the present study lies in the fact that a combination of the Navier method and the Laplace transform is employed in the analysis of cross-ply laminated shallow spherical viscoelastic shells. The numerical results show that the presented method constitutes a highly accurate and efficient solution, which can be easily applied to laminated viscoelastic shell problems.
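Durbin-type inversion belongs to the Fourier-series family of numerical inverse Laplace transforms. A minimal sketch of that family (a simplified trapezoidal/Fourier-series form, not the modified Durbin variant used in the paper; the shift and truncation parameters are rule-of-thumb choices):

```python
import numpy as np

def invert_laplace(F, t, T=None, a=None, nterms=8000):
    """Fourier-series (Durbin-type) numerical inversion of a Laplace transform.

    Approximates f(t) ~ (e^{a t}/T) * [ F(a)/2
        + sum_k Re(F(a+ik*pi/T))*cos(k*pi*t/T) - Im(F(a+ik*pi/T))*sin(k*pi*t/T) ].
    """
    t = np.atleast_1d(np.asarray(t, dtype=float))
    if T is None:
        T = 2.0 * t.max()          # series is accurate for 0 < t < 2T
    if a is None:
        a = 8.0 / T                # contour shift; rule of thumb a*T ~ 5-10
    k = np.arange(1, nterms + 1)
    s = a + 1j*np.pi*k/T           # sample points on the shifted contour
    Fk = F(s)
    arg = np.outer(t, np.pi*k/T)
    series = (np.cos(arg) @ Fk.real) - (np.sin(arg) @ Fk.imag)
    return np.exp(a*t)/T * (0.5*np.real(F(a)) + series)
```

A quick check against a known pair, F(s) = 1/(s+1) with f(t) = e^{-t}, recovers the time-domain solution to within the truncation error.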
Bryan, Kenneth; Cunningham, Pádraig
2008-01-01
Background Microarrays have the capacity to measure the expression of thousands of genes in parallel over many experimental samples. The unsupervised classification technique of bicluster analysis has been employed previously to uncover gene expression correlations over subsets of samples, with the aim of providing a more accurate model of the natural gene functional classes. This approach also has the potential to aid functional annotation of unclassified open reading frames (ORFs). Until now this aspect of biclustering has been under-explored. In this work we illustrate how bicluster analysis may be extended into a 'semi-supervised' ORF annotation approach referred to as BALBOA. Results The efficacy of the BALBOA ORF classification technique is first assessed via cross validation and compared to a multi-class k-Nearest Neighbour (kNN) benchmark across three independent gene expression datasets. BALBOA is then used to assign putative functional annotations to unclassified yeast ORFs. These predictions are evaluated using existing experimental and protein sequence information. Lastly, we employ a related semi-supervised method to predict the presence of novel functional modules within yeast. Conclusion In this paper we demonstrate how unsupervised classification methods, such as bicluster analysis, may be extended using available annotations to form semi-supervised approaches within the gene expression analysis domain. We show that such methods have the potential to improve upon supervised approaches and shed new light on the functions of unclassified ORFs and their co-regulation. PMID:18831786
Structural Analysis Methods for Structural Health Management of Future Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Tessler, Alexander
2007-01-01
Two finite element based computational methods, Smoothing Element Analysis (SEA) and the inverse Finite Element Method (iFEM), are reviewed, and examples of their use for structural health monitoring are discussed. Due to their versatility, robustness, and computational efficiency, the methods are well suited for real-time structural health monitoring of future space vehicles, large space structures, and habitats. The methods may be effectively employed to enable real-time processing of sensing information, specifically for identifying three-dimensional deformed structural shapes as well as the internal loads. In addition, they may be used in conjunction with evolutionary algorithms to design optimally distributed sensors. These computational tools have demonstrated substantial promise for utilization in future Structural Health Management (SHM) systems.
Espinosa, Nieves; Søndergaard, Roar R; Jørgensen, Mikkel; Krebs, Frederik C
2016-04-21
Silver nanowires (AgNWs) were prepared on a 5 g scale using either the well-known batch synthesis following the polyol method or a new flow synthesis method. The AgNWs were employed as semitransparent electrode materials in organic photovoltaics and compared to traditional printed silver electrodes based on micron sized silver flakes using life cycle analysis and environmental impact analysis methods. The life cycle analysis of AgNWs confirms that they provide an avenue to low-impact semitransparent electrodes. We find that the benefit of AgNWs in terms of embodied energy is less pronounced than generally assumed but that the toxicological and environmental benefits are significant. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Situational Analysis for Complex Systems: Methodological Development in Public Health Research.
Martin, Wanda; Pauly, Bernie; MacDonald, Marjorie
2016-01-01
Public health systems have suffered infrastructure losses worldwide. Strengthening public health systems requires not only good policies and programs, but also the development of new research methodologies to support public health systems renewal. Our research team considers public health systems to be complex adaptive systems, and as such, new methods are necessary to generate knowledge about the process of implementing public health programs and services. Within our program of research, we have employed situational analysis as a method for studying complex adaptive systems in four distinct research studies on public health program implementation. The purpose of this paper is to demonstrate the use of situational analysis as a method for studying complex systems and to highlight the need for further methodological development.
Oxygen Measurements in Liposome Encapsulated Hemoglobin
NASA Astrophysics Data System (ADS)
Phiri, Joshua Benjamin
Liposome encapsulated hemoglobins (LEHs) are of current interest as blood substitutes. An analytical methodology for rapid, non-invasive measurement of oxygen in artificial oxygen carriers is examined. High-resolution optical absorption spectra are calculated by means of a one-dimensional diffusion approximation. The encapsulated hemoglobin is prepared from fresh defibrinated bovine blood. Liposomes are prepared from hydrogenated soy phosphatidylcholine (HSPC), cholesterol and dicetylphosphate using a bath sonication method. An integrating sphere spectrophotometer is employed for diffuse optics measurements. Data are collected using an automated data acquisition system employing lock-in amplifiers. The concentrations of hemoglobin derivatives are evaluated from the corresponding extinction coefficients using the numerical technique of singular value decomposition, and the results are verified using Monte Carlo simulations. In situ measurements are required for the determination of hemoglobin derivatives because most encapsulation methods invariably lead to the formation of methemoglobin, a nonfunctional form of hemoglobin. The methods employed in this work lead to high-resolution absorption spectra of oxyhemoglobin and other derivatives in red blood cells and liposome encapsulated hemoglobin (LEH). The analysis using the singular value decomposition method offers a quantitative means of calculating the fractions of oxyhemoglobin and other hemoglobin derivatives in LEH samples. The analytical methods developed in this work will become even more useful when production of LEH as a blood substitute is scaled up to large volumes.
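Evaluating derivative concentrations from extinction coefficients is a linear least-squares problem, which NumPy solves via SVD. A minimal spectral-unmixing sketch with hypothetical extinction spectra (illustrative Gaussians, not measured coefficients):

```python
import numpy as np

# hypothetical extinction-coefficient spectra (columns) for oxyhemoglobin and
# methemoglobin over a wavelength grid; the band positions are illustrative
wl = np.linspace(500.0, 700.0, 100)
oxy = (np.exp(-0.5*((wl - 577.0)/12.0)**2)
       + 0.8*np.exp(-0.5*((wl - 542.0)/10.0)**2))
met = np.exp(-0.5*((wl - 630.0)/15.0)**2)
E = np.column_stack([oxy, met])             # extinction matrix (wavelength x species)

# simulated LEH absorption spectrum: 70% oxy-Hb, 30% met-Hb, plus noise
rng = np.random.default_rng(0)
measured = E @ np.array([0.7, 0.3]) + rng.normal(0.0, 0.01, wl.size)

# SVD-based least-squares solve for the derivative fractions
fractions, *_ = np.linalg.lstsq(E, measured, rcond=None)
```

Because the hypothetical oxy and met bands barely overlap, the least-squares solution recovers the mixing fractions to within the noise level.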
NASA Astrophysics Data System (ADS)
Mao, Jin-Jin; Tian, Shou-Fu; Zou, Li; Zhang, Tian-Tian
2018-05-01
In this paper, we consider a generalized Hirota equation with a bounded potential, which can be used to describe the propagation properties of optical soliton solutions. By employing the hypothetical method and the sub-equation method, we construct the bright soliton, dark soliton, complexitons and Gaussian soliton solutions of the Hirota equation. Moreover, we explicitly derive the power series solutions with their convergence analysis. Finally, we provide the graphical analysis of such soliton solutions in order to better understand their dynamical behavior.
Graf, Tyler N; Cech, Nadja B; Polyak, Stephen J; Oberlies, Nicholas H
2016-07-15
Validated methods are needed for the analysis of natural product secondary metabolites. These methods are particularly important to translate in vitro observations to in vivo studies. Herein, a method is reported for the analysis of the key secondary metabolites, a series of flavonolignans and a flavonoid, from an extract prepared from the seeds of milk thistle [Silybum marianum (L.) Gaertn. (Asteraceae)]. This report represents the first UHPLC-MS/MS method validated for quantitative analysis of these compounds. The method takes advantage of the excellent resolution achievable with UHPLC to provide a complete analysis in less than 7 min. The method is validated using both UV and MS detectors, making it applicable in laboratories with different types of analytical instrumentation available. Lower limits of quantitation achieved with this method range from 0.0400 μM to 0.160 μM with UV and from 0.0800 μM to 0.160 μM with MS. The new method is employed to evaluate variability in constituent composition in various commercial S. marianum extracts, and to show that storage of the milk thistle compounds in DMSO leads to degradation. Copyright © 2016 Elsevier B.V. All rights reserved.
Lin, Yi-Jiun; Huang, I-Chun; Wang, Yun-Tung
2014-01-01
The aim of this exploratory study is to gain an understanding of the outcomes of home-based employment service programs for people with disabilities and their related factors in Taiwan. This study used a survey method to collect 132 questionnaires. Descriptive and bivariate statistics, including chi-square (χ²), independent sample t-tests and analysis of variance, were employed. The results showed that 36.5% of the subjects improved their employment status and 75.8% of them improved in employability. Educational level and vocational categories including "web page production", "e-commerce", "internet marketing", "on-line store" and "website set-up and management" were significantly and positively associated with either of the two outcome indicators - change of employment status and employability. This study is the first evidence-based study about the outcomes of home-based employment service programs and their related factors for people with disabilities in Taiwan. The outcomes of the home-based employment service programs for people with disabilities were presented. Implications for Rehabilitation Home-based rehabilitation for people with disabilities can be effective. A programme of this kind supports participants in improving or gaining employment status as well as developing employability skills. Further consideration should be given to developing cost-effective home-based programmes and evaluating their effectiveness.
Analysis of large soil samples for actinides
Maxwell, Sherrod L., III (Aiken, SC)
2009-03-24
A method of analyzing relatively large soil samples for actinides employs a separation process that includes cerium fluoride precipitation, which removes the soil matrix and precipitates plutonium, americium, and curium with cerium and hydrofluoric acid; these actinides are then separated using chromatography cartridges.
NASA software specification and evaluation system design, part 2
NASA Technical Reports Server (NTRS)
1976-01-01
A survey and analysis of the existing methods, tools and techniques employed in the development of software are presented along with recommendations for the construction of reliable software. Functional designs for the software specification language and the database verifier are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
R.I. Rudyka; Y.E. Zingerman; K.G. Lavrov
Up-to-date mathematical methods, such as correlation analysis and expert systems, are employed in creating a model of the coking process. Automatic coking-control systems developed by Giprokoks rule out human error. At an existing coke battery, after introducing automatic control, the heating-gas consumption is reduced by ≥5%.
Geospatial methods and data analysis for assessing distribution of grazing livestock
USDA-ARS?s Scientific Manuscript database
Free-ranging livestock research must begin with a well conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...
Text Genres in Information Organization
ERIC Educational Resources Information Center
Nahotko, Marek
2016-01-01
Introduction: Text genres used by so-called information organizers in the processes of information organization in information systems were explored in this research. Method: The research employed text genre socio-functional analysis. Five genre groups in information organization were distinguished. Every genre group used in information…
The development of technology for detection of marijuana intoxication by analysis of body fluids
DOT National Transportation Integrated Search
1975-09-01
A method employing high pressure liquid chromatography plus mass spectrometry was developed for the detection of low concentrations of various marijuana metabolites in body fluids. A new marijuana metabolite was found which could be detected in blood...
DOT National Transportation Integrated Search
2007-09-01
Two competing approaches to travel demand modeling exist today. The more traditional 4-step travel demand models rely on aggregate demographic data at a traffic analysis zone (TAZ) level. Activity-based microsimulation methods employ more robus...
Apollo/Skylab suit program management systems study. Volume 2: Cost analysis
NASA Technical Reports Server (NTRS)
1974-01-01
The business management methods employed in the performance of the Apollo-Skylab Suit Program are studied. The data accumulated over the span of the contract, as well as the methods used to accumulate the data, are examined. Management methods associated with the monitoring and control of resources applied toward the performance of the contract are also studied, and recommendations are made. The primary objective is the compilation, analysis, and presentation of historical cost performance criteria. Cost data are depicted for all phases of the Apollo-Skylab program in common, meaningful terms, whereby the data may be applicable to future suit program planning efforts.
Work productivity loss from depression: evidence from an employer survey.
Rost, Kathryn M; Meng, Hongdao; Xu, Stanley
2014-12-18
National working groups identify the need for return on investment research conducted from the purchaser perspective; however, the field has not developed standardized methods for measuring the basic components of return on investment, including costing out the value of work productivity loss due to illness. Recent literature is divided on whether the most commonly used method underestimates or overestimates this loss. The goal of this manuscript is to characterize between- and within-method variation in the cost of work productivity loss from illness estimated by the most commonly used method and its two refinements. One senior health benefit specialist from each of 325 companies employing 100+ workers completed a cross-sectional survey describing their company size, industry and policies/practices regarding work loss, which allowed the research team to derive the variables needed to estimate work productivity loss from illness using three methods. Compensation estimates were derived by multiplying lost work hours from presenteeism and absenteeism by wage/fringe. Disruption correction adjusted this estimate to account for co-worker disruption, while friction correction accounted for labor substitution. The analysis compared bootstrapped means and medians between and within these three methods. The average company realized an annual $617 (SD = $75) per capita loss from depression by compensation methods and a $649 (SD = $78) loss by disruption correction, compared to a $316 (SD = $58) loss by friction correction (p < .0001). Agreement across estimates was 0.92 (95% CI 0.90, 0.93). Although the methods identify similar companies with high costs from lost productivity, friction correction reduces the size of compensation estimates of productivity loss by one half.
In analyzing the potential consequences of method selection for the dissemination of interventions to employers, intervention developers are encouraged to include friction methods in their estimate of the economic value of interventions designed to improve absenteeism and presenteeism. Business leaders in industries where labor substitution is common are encouraged to seek friction corrected estimates of return on investment. Health policy analysts are encouraged to target the dissemination of productivity enhancing interventions to employers with high losses rather than all employers. NCT01013220.
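The three costing methods compared above can be sketched as follows; the disruption multiplier, substitution share, lost hours, and wage figures are illustrative assumptions, not the survey's estimates.

```python
# Hypothetical sketch of the three productivity-loss costing methods. All
# numeric inputs and multipliers below are invented for illustration.
def compensation_cost(lost_hours, wage_with_fringe):
    """Base method: lost hours valued at wage plus fringe."""
    return lost_hours * wage_with_fringe

def disruption_corrected(lost_hours, wage_with_fringe, disruption_factor=1.05):
    """Scale the base estimate up to reflect co-worker disruption."""
    return compensation_cost(lost_hours, wage_with_fringe) * disruption_factor

def friction_corrected(lost_hours, wage_with_fringe, substitution_share=0.5):
    """Discount the hours a substitute worker covers (friction method)."""
    return compensation_cost(lost_hours, wage_with_fringe) * (1 - substitution_share)

base = compensation_cost(20, 30.0)   # 20 lost hours at $30/h incl. fringe
print(base, disruption_corrected(20, 30.0), friction_corrected(20, 30.0))
```

With these assumed parameters the friction estimate is half the compensation estimate, mirroring the roughly one-half reduction reported above.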
Wang, Mei; Wang, Yan-Hong; Avula, Bharathi; Radwan, Mohamed M; Wanas, Amira S; Mehmedic, Zlatko; van Antwerp, John; ElSohly, Mahmoud A; Khan, Ikhlas A
2017-05-01
Ultra-high-performance supercritical fluid chromatography (UHPSFC) is an efficient analytical technique and has not been fully employed for the analysis of cannabis. Here, a novel method was developed for the analysis of 30 cannabis plant extracts and preparations using UHPSFC/PDA-MS. Nine of the most abundant cannabinoids, viz. CBD, Δ8-THC, THCV, Δ9-THC, CBN, CBG, THCA-A, CBDA, and CBGA, were quantitatively determined (RSDs < 6.9%). Unlike GC methods, no derivatization or decarboxylation was required prior to UHPSFC analysis. The UHPSFC chromatographic separation of cannabinoids displayed an inverse elution order compared to UHPLC. Combined with PDA-MS detection, this orthogonality is valuable for discrimination of cannabinoids in complex matrices. The developed method was validated, and the quantification results were compared with a standard UHPLC method. The RSDs of these two methods were within ±13.0%. Finally, chemometric analyses including principal component analysis (PCA) and partial least squares-discriminant analysis (PLS-DA) were used to differentiate between cannabis samples. © 2016 American Academy of Forensic Sciences.
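A minimal sketch of the PCA step used for sample differentiation, on synthetic two-cannabinoid profiles; the concentrations and chemotype labels are invented, not the paper's measurements.

```python
import numpy as np

# Illustrative PCA on synthetic "cannabinoid concentration" profiles:
# two made-up chemotype clusters, columns = (THC %, CBD %).
rng = np.random.default_rng(1)
high_thc = rng.normal([12.0, 0.3], 0.5, size=(5, 2))
high_cbd = rng.normal([0.5, 10.0], 0.5, size=(5, 2))
X = np.vstack([high_thc, high_cbd])

Xc = X - X.mean(axis=0)                 # center before PCA
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                      # principal-component scores
explained = s**2 / np.sum(s**2)         # explained-variance ratios

# With well-separated chemotypes, PC1 carries nearly all the variance
# and cleanly separates the two groups.
print(np.round(explained[0], 2))
```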
Noninvasive deep Raman detection with 2D correlation analysis
NASA Astrophysics Data System (ADS)
Kim, Hyung Min; Park, Hyo Sun; Cho, Youngho; Jin, Seung Min; Lee, Kang Taek; Jung, Young Mee; Suh, Yung Doug
2014-07-01
The detection of poisonous chemicals enclosed in daily necessities is essential for homeland security given the increasing threat of terrorism. For the detection of toxic chemicals, we combined a sensitive deep Raman spectroscopic method with 2D correlation analysis. We obtained the Raman spectra of concealed chemicals employing spatially offset Raman spectroscopy, in which incident line-shaped light undergoes multiple scattering events before being delivered to the inner component and yielding the deep Raman signal. Furthermore, we restored the pure Raman spectrum of each component using 2D correlation spectroscopic analysis with chemical inspection. Using this method, we could identify subsurface components under thick powder and packed contents in a bottle.
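The synchronous 2D correlation map at the heart of the restoration step can be sketched as follows, assuming Noda's standard formalism; the spectra and band positions are synthetic, not measurements.

```python
import numpy as np

# Minimal synchronous 2D correlation spectrum on synthetic spectra.
# Two bands that grow together produce a positive cross peak.
rng = np.random.default_rng(2)
x = np.linspace(0, 100, 200)                        # pseudo Raman-shift axis
m = 8                                               # number of perturbed spectra
spectra = np.array([
    (1 + 0.1 * k) * np.exp(-((x - 30) / 5) ** 2)    # band at 30, intensity grows
    + (1 + 0.1 * k) * np.exp(-((x - 70) / 5) ** 2)  # correlated band at 70
    for k in range(m)
])

dynamic = spectra - spectra.mean(axis=0)            # dynamic (mean-centered) spectra
sync = dynamic.T @ dynamic / (m - 1)                # synchronous correlation map

# Bands varying in phase give a positive cross peak at (30, 70).
i30, i70 = np.argmin(np.abs(x - 30)), np.argmin(np.abs(x - 70))
print(sync[i30, i70] > 0)
```

The synchronous map is symmetric by construction; cross peaks shared with a reference spectrum of a pure chemical are what allow component assignment in the concealed-sample setting.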
NASA Astrophysics Data System (ADS)
Miura, Yasunari; Sugiyama, Yuki
2017-12-01
We present a general method for analyzing macroscopic collective phenomena observed in many-body systems. For this purpose, we employ diffusion maps, a dimensionality-reduction technique, and systematically define a few relevant coarse-grained variables for describing macroscopic phenomena. The time evolution of macroscopic behavior is described as a trajectory in the low-dimensional space constructed by these coarse variables. We apply this method to the analysis of a traffic model, the optimal velocity model, and reveal a bifurcation structure, which features a transition to the emergence of a moving cluster as a traffic jam.
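A minimal diffusion-map construction of the kind described above might look like this; the data set (a noisy circle) and kernel bandwidth are illustrative assumptions, not the optimal velocity model of the paper.

```python
import numpy as np

# Diffusion-map sketch: Gaussian kernel, row normalization to a Markov
# matrix, then leading nontrivial eigenvectors as coarse coordinates.
rng = np.random.default_rng(3)
t = rng.uniform(0, 2 * np.pi, 100)
X = np.column_stack([np.cos(t), np.sin(t)]) + rng.normal(0, 0.05, (100, 2))

d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
K = np.exp(-d2 / 0.5)                                # Gaussian affinity kernel
P = K / K.sum(axis=1, keepdims=True)                 # row-stochastic Markov matrix

evals, evecs = np.linalg.eig(P)
order = np.argsort(-evals.real)
# Skip the trivial constant eigenvector (eigenvalue 1); the next two
# eigenvectors, scaled by their eigenvalues, give the low-dimensional embedding.
coords = evecs.real[:, order[1:3]] * evals.real[order[1:3]]
print(coords.shape)
```

A trajectory of system snapshots mapped through such coordinates is what traces out the macroscopic evolution described above.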
NASA Astrophysics Data System (ADS)
Jia, Xiaofei
2018-06-01
Starting from the basic equations describing the evolution of the carriers and photons inside a semiconductor optical amplifier (SOA), the equation governing pulse propagation in the SOA is derived. By employing the homotopy analysis method (HAM), a series solution for the output pulse of the SOA is obtained, which can effectively characterize the temporal features of the nonlinear process during pulse propagation inside the SOA. Moreover, the analytical solution is compared with numerical simulations, showing good agreement. The theoretical results will benefit future analysis of other problems related to pulse propagation in the SOA.
Bacci, Silvia; Pigini, Claudia; Seracini, Marco; Minelli, Liliana
2017-02-03
Background : The mixed empirical evidence about employment conditions (i.e., permanent vs. temporary job, full-time vs. part-time job) as well as unemployment has motivated the development of conceptual models with the aim of assessing the pathways leading to effects of employment status on health. Alongside physically and psychologically riskier working conditions, one channel stems in the possibly severe economic deprivation faced by temporary workers. We investigate whether economic deprivation is able to partly capture the effect of employment status on Self-evaluated Health Status (SHS). Methods : Our analysis is based on the European Union Statistics on Income and Living Conditions (EU-SILC) survey, for a balanced sample from 26 countries from 2009 to 2012. We estimate a correlated random-effects logit model for the SHS that accounts for the ordered nature of the dependent variable and the longitudinal structure of the data. Results and Discussion : Material deprivation and economic strain are able to partly account for the negative effects on SHS from precarious and part-time employment as well as from unemployment that, however, exhibits a significant independent negative association with SHS. Conclusions : Some of the indicators used to proxy economic deprivation are significant predictors of SHS and their correlation with the employment condition is such that it should not be neglected in empirical analysis, when available and further to the monetary income.
A comparison of two microscale laboratory reporting methods in a secondary chemistry classroom
NASA Astrophysics Data System (ADS)
Martinez, Lance Michael
This study attempted to determine if there was a difference between the laboratory achievement of students who used a modified reporting method and those who used traditional laboratory reporting. The study also determined the relationships between laboratory performance scores and the independent variables: score on the Group Assessment of Logical Thinking (GALT) test, chronological age in months, gender, and ethnicity, for each of the treatment groups. The study was conducted using 113 high school students who were enrolled in first-year general chemistry classes at Pueblo South High School in Colorado. The research design used was the quasi-experimental Nonequivalent Control Group Design. The statistical treatment consisted of Multiple Regression Analysis and Analysis of Covariance. Based on the GALT, students in the two groups were generally in the concrete and transitional stages of the Piagetian cognitive levels. The findings of the study revealed that the traditional and the modified methods of laboratory reporting did not have any effect on the laboratory performance outcome of the subjects. However, the students who used the traditional method of reporting showed a higher laboratory performance score when evaluation was conducted using the New Standards rubric recommended by the state. Multiple Regression Analysis revealed that there was a significant relationship between the criterion variable, student laboratory performance outcome of individuals who employed traditional laboratory reporting methods, and the composite set of predictor variables. On the contrary, there was no significant relationship between the criterion variable, student laboratory performance outcome of individuals who employed modified laboratory reporting methods, and the composite set of predictor variables.
NASA Astrophysics Data System (ADS)
Wang, Dong; Ding, Hao; Singh, Vijay P.; Shang, Xiaosan; Liu, Dengfeng; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing
2015-05-01
For scientific and sustainable management of water resources, hydrologic and meteorologic data series need to be often extended. This paper proposes a hybrid approach, named WA-CM (wavelet analysis-cloud model), for data series extension. Wavelet analysis has time-frequency localization features, known as "mathematics microscope," that can decompose and reconstruct hydrologic and meteorologic series by wavelet transform. The cloud model is a mathematical representation of fuzziness and randomness and has strong robustness for uncertain data. The WA-CM approach first employs the wavelet transform to decompose the measured nonstationary series and then uses the cloud model to develop an extension model for each decomposition layer series. The final extension is obtained by summing the results of extension of each layer. Two kinds of meteorologic and hydrologic data sets with different characteristics and different influence of human activity from six (three pairs) representative stations are used to illustrate the WA-CM approach. The approach is also compared with four other methods, which are conventional correlation extension method, Kendall-Theil robust line method, artificial neural network method (back propagation, multilayer perceptron, and radial basis function), and single cloud model method. To evaluate the model performance completely and thoroughly, five measures are used, which are relative error, mean relative error, standard deviation of relative error, root mean square error, and Thiel inequality coefficient. Results show that the WA-CM approach is effective, feasible, and accurate and is found to be better than other four methods compared. The theory employed and the approach developed here can be applied to extension of data in other areas as well.
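The decompose-then-recombine idea behind WA-CM can be illustrated with a one-level Haar transform; this covers only the wavelet step on synthetic data (the paper extends each decomposition layer with a cloud model before summing), and the wavelet choice is an assumption.

```python
import numpy as np

# One-level Haar wavelet decomposition and perfect reconstruction of a
# synthetic "hydrologic" series. In WA-CM each layer would be extended
# separately and the extensions summed.
rng = np.random.default_rng(4)
series = np.sin(np.linspace(0, 6 * np.pi, 64)) + rng.normal(0, 0.1, 64)

approx = (series[0::2] + series[1::2]) / np.sqrt(2)   # low-frequency layer
detail = (series[0::2] - series[1::2]) / np.sqrt(2)   # high-frequency layer

# Inverse transform: interleave the recombined even/odd samples.
recon = np.empty_like(series)
recon[0::2] = (approx + detail) / np.sqrt(2)
recon[1::2] = (approx - detail) / np.sqrt(2)
print(np.allclose(recon, series))
```

Because the transform is exactly invertible, any extension model fitted per layer recombines into an extension of the original series by simple summation, which is the structure WA-CM exploits.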
A comparison of unemployed job-seekers with and without social anxiety
Himle, Joseph A; Weaver, Addie; Bybee, Deborah; O'Donnell, Lisa; Vlnka, Sarah; Laviolette, Wayne; Steinberger, Edward; Zipora, Golenberg; Levine, Debra Siegel
2014-01-01
Objective Literature consistently demonstrates that social anxiety disorder has substantial negative impacts on occupational functioning. However, to date, no identified empirical work has focused on understanding the specific nature of vocational problems among persons with social anxiety disorder. This study examines the association between employment-related factors (i.e., barriers to employment; skills related to employment; and job aspirations) and social anxiety among a sample of adults seeking vocational rehabilitation services. Methods Data from intake assessments, including a screen for social anxiety disorder, of 265 low-income, unemployed adults who initiated vocational rehabilitation services in urban Michigan was examined to assess differences in barriers to employment, employment skills, job aspirations, and demographic characteristics among participants who screened positive for social anxiety disorder compared to those who did not. Bivariate and multiple logistic regression analyses were performed. Results Multiple logistic regression analysis revealed that greater perceived experience and skill barriers to employment, fewer skills related to social-type occupations, and less education were significantly associated with social anxiety, after adjusting for other factors. Bivariate analysis also suggested that participants who screened positive for social anxiety disorder were significantly less likely to aspire to social jobs. Conclusions Employment-related factors likely impacting occupational functioning were significantly different between persons with and without social anxiety problems. Identifying these differences in employment barriers, skills, and job aspirations offer potentially important functional targets for psychosocial interventions aimed at social anxiety disorder and suggest the need for vocational service professionals to assess and address social anxiety among their clients. PMID:24733524
Global/local stress analysis of composite panels
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.; Knight, Norman F., Jr.
1989-01-01
A method for performing a global/local stress analysis is described, and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independent of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.
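The interface step of the global/local method, interpolating coarse global-model displacements onto refined local-model boundary nodes, might be sketched as follows; the displacement field and node counts are invented, and a generic cubic spline stands in for the plate-bending splines of the paper.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Illustrative global/local interface step: displacements known at coarse
# global-model nodes are spline-interpolated onto the refined local-model
# boundary nodes, which then serve as boundary conditions for the local model.
x_global = np.linspace(0.0, 1.0, 6)             # coarse interface nodes
w_global = 0.01 * np.sin(np.pi * x_global)      # assumed global displacements

spline = CubicSpline(x_global, w_global)
x_local = np.linspace(0.0, 1.0, 41)             # refined local boundary nodes
w_local = spline(x_local)                       # interpolated displacement BCs
theta_local = spline(x_local, 1)                # rotations from the spline slope

print(w_local.shape)
```

The local model is then analyzed independently with `w_local` and `theta_local` imposed on its boundary, which is what decouples the refined local analysis from the coarse global solution.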
Global/local stress analysis of composite structures. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.
1989-01-01
A method for performing a global/local stress analysis is described and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independent of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.
NASA Technical Reports Server (NTRS)
Rzasnicki, W.
1973-01-01
A method of solution is presented, which, when applied to the elasto-plastic analysis of plates having a v-notch on one edge and subjected to pure bending, will produce stress and strain fields in much greater detail than presently available. Application of the boundary integral equation method results in two coupled Fredholm-type integral equations, subject to prescribed boundary conditions. These equations are replaced by a system of simultaneous algebraic equations and solved by a successive approximation method employing Prandtl-Reuss incremental plasticity relations. The method is first applied to a number of elasto-static problems and the results are compared with available solutions. Good agreement is obtained in all cases. The elasto-plastic analysis provides detailed stress and strain distributions for several cases of plates with various notch angles and notch depths. A strain hardening material is assumed and both plane strain and plane stress conditions are considered.
NASA Technical Reports Server (NTRS)
Ibrahim, A. H.; Tiwari, S. N.; Smith, R. E.
1997-01-01
Variational methods (VM) sensitivity analysis is employed to derive the costate (adjoint) equations, the transversality conditions, and the functional sensitivity derivatives. In the derivation of the sensitivity equations, the variational methods use the generalized calculus of variations, in which the variable boundary is considered as the design function. The converged solution of the state equations together with the converged solution of the costate equations are integrated along the domain boundary to uniquely determine the functional sensitivity derivatives with respect to the design function. The application of the variational methods to aerodynamic shape optimization problems is demonstrated for internal flow problems in the supersonic Mach number range. The study shows that, while maintaining the accuracy of the functional sensitivity derivatives within a reasonable range for engineering prediction purposes, the variational methods yield a substantial gain in computational efficiency, i.e., computer time and memory, when compared with finite difference sensitivity analysis.
NASA Astrophysics Data System (ADS)
Pirmoradi, Zhila; Haji Hajikolaei, Kambiz; Wang, G. Gary
2015-10-01
Product family design is cost-efficient for achieving the best trade-off between commonalization and diversification. However, for computationally intensive design functions which are viewed as black boxes, the family design would be challenging. A two-stage platform configuration method with generalized commonality is proposed for a scale-based family with unknown platform configuration. Unconventional sensitivity analysis and information on variation in the individual variants' optimal design are used for platform configuration design. Metamodelling is employed to provide the sensitivity and variable correlation information, leading to significant savings in function calls. A family of universal electric motors is designed for product performance and the efficiency of this method is studied. The impact of the employed parameters is also analysed. Then, the proposed method is modified for obtaining higher commonality. The proposed method is shown to yield design solutions with better objective function values, allowable performance loss and higher commonality than the previously developed methods in the literature.
Lipid Vesicle Shape Analysis from Populations Using Light Video Microscopy and Computer Vision
Zupanc, Jernej; Drašler, Barbara; Boljte, Sabina; Kralj-Iglič, Veronika; Iglič, Aleš; Erdogmus, Deniz; Drobne, Damjana
2014-01-01
We present a method for giant lipid vesicle shape analysis that combines manually guided large-scale video microscopy and computer vision algorithms to enable analyzing vesicle populations. The method retains the benefits of light microscopy and enables non-destructive analysis of vesicles from suspensions containing up to several thousands of lipid vesicles (1–50 µm in diameter). For each sample, image analysis was employed to extract data on vesicle quantity and size distributions of their projected diameters and isoperimetric quotients (measure of contour roundness). This process enables a comparison of samples from the same population over time, or the comparison of a treated population to a control. Although vesicles in suspensions are heterogeneous in sizes and shapes and have distinctively non-homogeneous distribution throughout the suspension, this method allows for the capture and analysis of repeatable vesicle samples that are representative of the population inspected. PMID:25426933
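The two shape descriptors named above can be computed from a closed contour as follows; the ellipse contour is a synthetic stand-in for a segmented vesicle outline.

```python
import numpy as np

# Per-vesicle shape descriptors: projected (equal-area) diameter and the
# isoperimetric quotient Q = 4*pi*A / P**2 (1 for a circle, <1 otherwise),
# computed from a closed polygonal contour. Synthetic ellipse, semi-axes 10 and 6.
theta = np.linspace(0, 2 * np.pi, 400, endpoint=False)
x, y = 10.0 * np.cos(theta), 6.0 * np.sin(theta)

# Shoelace formula for the enclosed area of the polygonal contour.
area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
perimeter = np.sqrt(np.diff(np.append(x, x[0])) ** 2
                    + np.diff(np.append(y, y[0])) ** 2).sum()

q = 4 * np.pi * area / perimeter ** 2      # isoperimetric quotient (roundness)
d_equiv = 2 * np.sqrt(area / np.pi)        # diameter of the equal-area circle
print(round(float(q), 3), round(float(d_equiv), 2))
```

Applied per detected vesicle, these two numbers yield the size and roundness distributions that the population comparisons above are based on.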
Interactive Learning: The Casewriting Method as an Entire Semester Course for Higher Education.
ERIC Educational Resources Information Center
Bowen, Brent D.
This guide explains the reasons for employing the case method as a tool in the academic discipline of aviation. It promotes the use of case writing as a unique opportunity to derive even further benefits from case analysis. The benefits to students of using case writing as a learning strategy include a focus on the strategy of a real situation;…
ERIC Educational Resources Information Center
Namey, Emily; Guest, Greg; McKenna, Kevin; Chen, Mario
2016-01-01
Evaluators often use qualitative research methods, yet there is little evidence on the comparative cost-effectiveness of the two most commonly employed qualitative methods--in-depth interviews (IDIs) and focus groups (FGs). We performed an inductive thematic analysis of data from 40 IDIs and 40 FGs on the health-seeking behaviors of African…
ERIC Educational Resources Information Center
Ghadyani, Fariba; Tahririan, Mohammad Hassan
2014-01-01
The present study was conducted to determine whether there were any significant differences in the employment of interactional markers, in binary comparisons, among three groups of authors: Iranian ISI, Iranian non-ISI, and native. To collect the data, 90 "method sections" of English medical research articles within Iranian ISI,…
ERIC Educational Resources Information Center
Mages, Wendy Karen
2008-01-01
This systematic review of the literature synthesizes research from a number of disciplines and provides a succinct distillation of the methods and measures used to study the impact of creative drama on the language development of young children. An analysis of the merits and limitations of the reviewed studies reveals a number of methodological…
Validation and Improvement of Reliability Methods for Air Force Building Systems
focusing primarily on HVAC systems. This research used contingency analysis to assess the performance of each model for HVAC systems at six Air Force...probabilistic model produced inflated reliability calculations for HVAC systems. In light of these findings, this research employed a stochastic method, a...Nonhomogeneous Poisson Process (NHPP), in an attempt to produce accurate HVAC system reliability calculations. This effort ultimately concluded that
Libraries of Peptide Fragmentation Mass Spectra Database
National Institute of Standards and Technology Data Gateway
SRD 1C NIST Libraries of Peptide Fragmentation Mass Spectra Database (Web, free access) The purpose of the library is to provide peptide reference data for laboratories employing mass spectrometry-based proteomics methods for protein analysis. Mass spectral libraries identify these compounds in a more sensitive and robust manner than alternative methods. These databases are freely available for testing and development of new applications.
Method and apparatus for continuous flow injection extraction analysis
Hartenstein, Steven D.; Siemer, Darryl D.
1992-01-01
A method and apparatus for a continuous flow injection batch extraction analysis system is disclosed, employing extraction of a component of a first liquid into a second liquid that is a solvent for that component and immiscible with the first liquid, and separating the first liquid from the second liquid subsequent to extraction of the component of the first liquid.
Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh
2011-03-01
It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be a topic of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers to clarify their theoretical propositions and the basis of their results.
This can offer a better understanding of the links between theory and empirical findings, challenge theoretical assumptions and develop new theory. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhao, Zhonglong; Han, Bo
2017-10-01
In this paper, the Lie symmetry analysis method is employed to investigate the Lie point symmetries and the one-parameter transformation groups of a (2 + 1)-dimensional Boiti-Leon-Pempinelli system. By using Ibragimov's method, the optimal system of one-dimensional subalgebras of this system is constructed. Truncated Painlevé analysis is used for deriving the Bäcklund transformation. The method of constructing lump-type solutions of integrable equations by means of Bäcklund transformation is first presented. Meanwhile, the lump-type solutions of the (2 + 1)-dimensional Boiti-Leon-Pempinelli system are obtained. The lump-type wave is one kind of rogue wave. The fusion-type N-solitary wave solutions are also constructed. In addition, this system is integrable in terms of the consistent Riccati expansion method.
Tylová, Tereza; Kolařík, Miroslav; Olšovská, Jana
2011-07-01
A new simple ultra-high-performance liquid chromatography method with diode array detection (UHPLC-DAD) was developed for chemical fingerprinting analysis of extracellular metabolites in fermentation broth of Geosmithia spp. The SPE method employing Oasis MCX strong cation-exchange mixed-mode polymeric sorbent was chosen for extraction of the metabolites. The analyses were performed on an Acquity UPLC BEH C18 column (100 × 2.1 mm i.d.; particle size, 1.7 μm; Waters) using a gradient elution program with an aqueous solution of trifluoroacetic acid and acetonitrile as the mobile phase. The applicability of the method was proved by analysis of 38 strains of different species isolated from different sources (hosts). The results revealed the correlation of obtained UHPLC-DAD fingerprints with taxonomical identity.
Detecting chaos in particle accelerators through the frequency map analysis method.
Papaphilippou, Yannis
2014-06-01
The motion of beams in particle accelerators is dominated by a plethora of non-linear effects, which can enhance chaotic motion and limit their performance. The application of advanced non-linear dynamics methods for detecting and correcting these effects and thereby increasing the region of beam stability plays an essential role not only during the accelerator design phase but also during operation. After describing the nature of non-linear effects and their impact on performance parameters of different particle accelerator categories, the theory of non-linear particle motion is outlined. The recent developments on the methods employed for the analysis of chaotic beam motion are detailed. In particular, the ability of the frequency map analysis method to detect chaotic motion and guide the correction of non-linear effects is demonstrated in particle tracking simulations as well as in experimental data.
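The chaos indicator at the heart of frequency map analysis is tune diffusion: the shift of an orbit's dominant frequency between the two halves of a tracking record. The sketch below is a rough numpy illustration of that idea (an FFT-peak tune estimate, not the refined NAFF algorithm used in practice); the "drifting" signal, whose tune jumps mid-record, stands in for a chaotically diffusing orbit.

```python
import numpy as np

def dominant_frequency(x):
    """Dominant normalized frequency of a real signal, from the FFT peak."""
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    return np.argmax(spectrum) / len(x)

def tune_diffusion(x):
    """Frequency-map chaos indicator: shift of the dominant tune between
    the first and second half of a tracking record."""
    half = len(x) // 2
    return abs(dominant_frequency(x[:half]) - dominant_frequency(x[half:]))

t = np.arange(4096)
regular = np.cos(2 * np.pi * 0.31 * t)             # constant tune -> tiny diffusion
drifting = np.where(t < 2048,                      # tune jumps mid-record, a toy
                    np.cos(2 * np.pi * 0.31 * t),  # stand-in for a chaotically
                    np.cos(2 * np.pi * 0.35 * t))  # diffusing orbit

print(tune_diffusion(regular), tune_diffusion(drifting))
```

In a real frequency map, this diffusion is computed per initial condition and plotted in tune space to expose resonances and chaotic regions.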
Analysis of 2-Acetyl-1-Pyrroline in rice by HSSE/GC/MS.
USDA-ARS?s Scientific Manuscript database
An alternative method for the analysis of 2-acetyl-1-pyrroline (2AP) in rice, employing stir bar sorptive extraction (Twister™), is described. The Twister stir bar is placed in the headspace of a 20 ml vial containing 1 g rice kernels, 5 ml 0.1 M KOH, 2.2 g NaCl, and a second Teflon™ coated stir bar...
ERIC Educational Resources Information Center
Loeffler, Gordon
The intent of this field tested instructional package is to acquaint the student with a method of career analysis to enable him to determine whether a career is in harmony with his future employment goals. The package provides behavioral objectives, a student self-test, a basic information section, and a career analysis study to aid the student in…
Analysis of ²³⁹Pu and ²⁴¹Am in NAEG large-sized bovine samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Major, W.J.; Lee, K.D.; Wessman, R.A.
Methods are described for the analysis of environmental levels of ²³⁹Pu and ²⁴¹Am in large-sized bovine samples. Special procedure modifications to overcome the complexities of sample preparation and analysis, and special techniques employed to prepare and analyze different types of bovine samples, such as muscle, blood, liver, and bone, are discussed. (CH)
ERIC Educational Resources Information Center
Brusling, Christer; Tingsell, Jan-Gunnar
This new model for the supervision of student teachers utilizes videotaping hardware which allows the student teacher and his supervisor to evaluate teaching methods and behavior. Thus, the student teacher is better able to supervise himself. Employing Flanders Interaction Analysis, the student is able to interpret his teaching on closed-circuit…
NASA Technical Reports Server (NTRS)
Clapp, J. L. (Principal Investigator); Green, T., III; Hanson, G. F.; Kiefer, R. W.; Niemann, B. J., Jr.
1974-01-01
The author has identified the following significant results. Employing simple and economical extraction methods, ERTS can provide valuable data to the planners at the state or regional level with a frequency never before possible. Interactive computer methods of working directly with ERTS digital information show much promise for providing land use information at a more specific level, since the data format production rate of ERTS justifies improved methods of analysis.
The Determination of Uranium Burnup in MWd/ton
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rider, B.F.; Russell, J.L. Jr.; Harris, D.W.
The mass-spectrometric and radiochemical methods for the determination of burn-up in nuclear fuel are compared for reliability in the range of 5000 to 15,000 Mwd/ton. Neither appears to be clearly superior to the other. Each appears to have an uncertainty of approximately 6 to 8%. It is concluded that both methods of analysis should be employed where reliability is of great concern. Agreement between both methods is the best possible indication of reliable results. (auth)
Rapid Harmonic Analysis of Piezoelectric MEMS Resonators.
Puder, Jonathan M; Pulskamp, Jeffrey S; Rudy, Ryan Q; Cassella, Cristian; Rinaldi, Matteo; Chen, Guofeng; Bhave, Sunil A; Polcawich, Ronald G
2018-06-01
This paper reports on a novel simulation method combining the speed of analytical evaluation with the accuracy of finite-element analysis (FEA). This method is known as the rapid analytical-FEA technique (RAFT). The ability of the RAFT to accurately predict frequency response orders of magnitude faster than conventional simulation methods while providing deeper insights into device design not possible with other types of analysis is detailed. Simulation results from the RAFT across wide bandwidths are compared to measured results of resonators fabricated with various materials, frequencies, and topologies with good agreement. These include resonators targeting beam extension, disk flexure, and Lamé beam modes. An example scaling analysis is presented and other applications enabled are discussed as well. The supplemental material includes example code for implementation in ANSYS, although any commonly employed FEA package may be used.
Peterson, Debbie; Currey, Nandika; Collings, Sunny
2011-01-01
The purpose of this study was to describe the pressures surrounding disclosure of a mental illness in the New Zealand workplace. Using qualitative methods and general inductive analysis, the study included twenty-two employed New Zealanders with experience of mental illnesses. Fear of discrimination, and legal, practical and moral pressures contributed to tension between workplace disclosure and non-disclosure of a mental illness. The decision to disclose a mental illness is a dilemma throughout the employment process, not just a problem for the beginning of an employment relationship. Employees with experience of mental illnesses and their employers need to be able to access advice throughout this process on disclosure issues. Disclosure is irreversible; therefore, the decision to disclose, and its timing, must remain at the discretion of the employee.
The Mathematical Structure of Elementary Particles.
1983-10-01
(Physical Mathematics) ...is the basic method of analysis to be employed in this work. Instituto de Matematica Pura e Aplicada, Estrada Dona Castorina 110, 22460 Rio de Janeiro, Brazil. Sponsored by the United...
Funding and Cost Analysis. Policy Paper Series: Document 8.
ERIC Educational Resources Information Center
Cobb, H. Brian, Ed.; Larkin, Dave, Ed.
Five policy papers address methods of funding vocational/special education and relative benefits versus expenditures for different employment training systems for moderately and severely handicapped persons. The first paper critiques the present vocational education funding system for handicapped students. Federal funding mechanisms, state and…
Broadcast Journalism Education and the Capstone Experience
ERIC Educational Resources Information Center
Tanner, Andrea; Forde, Kathy Roberts; Besley, John C.; Weir, Tom
2012-01-01
This study assesses the current state of the television news capstone experience in accredited journalism and mass communication programs in the United States. Specifically, the authors employed a mixed-methods approach, interviewing 20 television news capstone instructors and conducting an analysis of broadcast journalism curriculum information…
Characterizing Preservice Teachers' Mathematical Understanding of Algebraic Relationships
ERIC Educational Resources Information Center
Nillas, Leah A.
2010-01-01
Qualitative research methods were employed to investigate the characterization of preservice teachers' mathematical understanding. Responses on test items involving algebraic relationships were analyzed using within-case analysis (Miles and Huberman, 1994) and Pirie and Kieren's (1994) model of growth of mathematical understanding. Five elementary…
Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing
2008-06-04
Real-time PCR techniques are widely used for nucleic acid analysis, but one limitation of current frequently employed real-time PCR is the high cost of the labeled probe for each target molecule. We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage over current real-time PCR methods of generating fluorescence by probe hydrolysis and strand displacement. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair with attached universal template (UT) and the FP are identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification, with sensitivity, reproducibility, and repeatability comparable to other real-time PCR methods. The results from gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the AUDP technique has been successfully applied to nucleic acid analysis, and that it offers an alternative for nucleic acid analysis with high efficiency, reliability, and flexibility at low cost.
Comparison of analytical methods for profiling N- and O-linked glycans from cultured cell lines
Togayachi, Akira; Azadi, Parastoo; Ishihara, Mayumi; Geyer, Rudolf; Galuska, Christina; Geyer, Hildegard; Kakehi, Kazuaki; Kinoshita, Mitsuhiro; Karlsson, Niclas G.; Jin, Chunsheng; Kato, Koichi; Yagi, Hirokazu; Kondo, Sachiko; Kawasaki, Nana; Hashii, Noritaka; Kolarich, Daniel; Stavenhagen, Kathrin; Packer, Nicolle H.; Thaysen-Andersen, Morten; Nakano, Miyako; Taniguchi, Naoyuki; Kurimoto, Ayako; Wada, Yoshinao; Tajiri, Michiko; Yang, Pengyuan; Cao, Weiqian; Li, Hong; Rudd, Pauline M.; Narimatsu, Hisashi
2016-01-01
The Human Disease Glycomics/Proteome Initiative (HGPI) is an activity in the Human Proteome Organization (HUPO) supported by leading researchers from international institutes and aims at development of disease-related glycomics/glycoproteomics analysis techniques. Since 2004, the initiative has conducted three pilot studies. The first two were N- and O-glycan analyses of purified transferrin and immunoglobulin-G and assessed the most appropriate analytical approach employed at the time. This paper describes the third study, which was conducted to compare different approaches for quantitation of N- and O-linked glycans attached to proteins in crude biological samples. The preliminary analysis on cell pellets resulted in wildly varied glycan profiles, which was probably the consequence of variations in the pre-processing sample preparation methodologies. However, the reproducibility of the data was not improved dramatically in the subsequent analysis on cell lysate fractions prepared in a specified method by one lab. The study demonstrated the difficulty of carrying out a complete analysis of the glycome in crude samples by any single technology and the importance of rigorous optimization of the course of analysis from preprocessing to data interpretation. It suggests that another collaborative study employing the latest technologies in this rapidly evolving field will help to realize the requirements of carrying out the large-scale analysis of glycoproteins in complex cell samples. PMID:26511985
Advanced building energy management system demonstration for Department of Defense buildings.
O'Neill, Zheng; Bailey, Trevor; Dong, Bing; Shashanka, Madhusudana; Luo, Dong
2013-08-01
This paper presents an advanced building energy management system (aBEMS) that employs advanced methods of whole-building performance monitoring combined with statistical methods of learning and data analysis to enable identification of both gradual and discrete performance erosion and faults. This system assimilated data collected from multiple sources, including blueprints, reduced-order models (ROM) and measurements, and employed advanced statistical learning algorithms to identify patterns of anomalies. The results were presented graphically in a manner understandable to facilities managers. A demonstration of aBEMS was conducted in buildings at Naval Station Great Lakes. The facility building management systems were extended to incorporate the energy diagnostics and analysis algorithms, producing systematic identification of more efficient operation strategies. At Naval Station Great Lakes, greater than 20% savings were demonstrated for building energy consumption by improving facility manager decision support to diagnose energy faults and prioritize alternative, energy-efficient operation strategies. The paper concludes with recommendations for widespread aBEMS success. © 2013 New York Academy of Sciences.
Zhao, Yuancun; Chen, Xiaogang; Yang, Yiwen; Zhao, Xiaohong; Zhang, Shu; Gao, Zehua; Fang, Ting; Wang, Yufang; Zhang, Ji
2018-05-07
Diatom examination has always been used for the diagnosis of drowning in forensic practice. However, traditional examination of the microscopic features of diatom frustules is time-consuming and requires taxonomic expertise. In this study, we demonstrate a potential DNA-based method of inferring suspected drowning site using pyrosequencing (PSQ) of the V7 region of 18S ribosome DNA (18S rDNA) as a diatom DNA barcode. By employing a sparse representation-based AdvISER-M-PYRO algorithm, the original PSQ signals of diatom DNA mixtures were deciphered to determine the corresponding taxa of the composite diatoms. Additionally, we evaluated the possibility of correlating water samples to collection sites by analyzing the PSQ signal profiles of diatom mixtures contained in the water samples via multidimensional scaling. The results suggest that diatomaceous PSQ profile analysis could be used as a cost-effective method to deduce the geographical origin of an environmental bio-sample.
Authenticity analysis of pear juice employing chromatographic fingerprinting.
Willems, Jamie L; Low, Nicholas H
2014-12-03
Pear juice is predominately composed of carbohydrates/polyols (>95% of the total soluble solids), making it susceptible to adulteration by the addition of less expensive commercial sweeteners. In this research, the major carbohydrate and polyol (fructose, glucose, sucrose, and sorbitol) content of 32 pure pear juices representing five world producing regions and three years of production was determined. Additionally, methods employing oligosaccharide profiling to detect the debasing of these samples with four commercial sweeteners (HFCS 55 and 90, TIS, and HIS) were developed using capillary gas chromatography with flame ionization detection (CGC-FID) and high-performance anion-exchange chromatography with pulsed amperometric detection (HPAE-PAD). Detection limits for the four commercial sweeteners ranged from 0.5 to 5.0% (v/v). In addition, the developed CGC-FID method could be used to (a) detect the addition of pear to apple juice via arbutin detection and (b) determine if a pear juice was produced using enzymatic liquefaction via the presence of O-β-d-glucopyranosyl-(1→4)-d-glucopyranose (cellobiose), all within a single chromatographic analysis.
An analysis of parameter sensitivities of preference-inspired co-evolutionary algorithms
NASA Astrophysics Data System (ADS)
Wang, Rui; Mansor, Maszatul M.; Purshouse, Robin C.; Fleming, Peter J.
2015-10-01
Many-objective optimisation problems remain challenging for many state-of-the-art multi-objective evolutionary algorithms. Preference-inspired co-evolutionary algorithms (PICEAs) which co-evolve the usual population of candidate solutions with a family of decision-maker preferences during the search have been demonstrated to be effective on such problems. However, it is unknown whether PICEAs are robust with respect to the parameter settings. This study aims to address this question. First, a global sensitivity analysis method - the Sobol' variance decomposition method - is employed to determine the relative importance of the parameters controlling the performance of PICEAs. Experimental results show that the performance of PICEAs is controlled for the most part by the number of function evaluations. Next, we investigate the effect of key parameters identified from the Sobol' test and the genetic operators employed in PICEAs. Experimental results show improved performance of the PICEAs as more preferences are co-evolved. Additionally, some suggestions for genetic operator settings are provided for non-expert users.
Airado-Rodríguez, Diego; Skaret, Josefine; Wold, Jens Petter
2010-05-12
This paper describes the fluorescent behavior of cod caviar paste, stored under different conditions, in terms of light exposure and concentration of oxygen in the headspace. Multivariate curve resolution was employed to decompose the overall fluorescence spectra into pure fluorescent components and calculate the relative concentrations of these components in the different samples. Profiles corresponding to protoporphyrin IX, photoprotoporphyrin, and fluorescent oxidation products were identified. Sensory evaluation, TBARS, and analysis of volatiles are typical methods employed in the routine analysis and quality control of such food. Successful calibration models were established between fluorescence and those routine methods. Correlation coefficients higher than 0.80 were found for 79%, and higher than 0.90 for 50%, of the assessed odors and flavors. For instance, R values of 0.94 and 0.96 were obtained for fresh and rancid flavors, respectively, and 0.89 for TBARS. On the basis of these data, it can be argued that front-face fluorescence spectroscopy can substitute for all of these expensive and tedious methodologies.
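The decomposition step described above, splitting mixture spectra into pure-component spectra and relative concentrations, is commonly done by alternating least squares with nonnegativity constraints (MCR-ALS). The sketch below is a minimal numpy version on synthetic Gaussian "spectra", not the authors' implementation; all names and parameters are illustrative.

```python
import numpy as np

def mcr_als(D, C0, n_iter=200):
    """Minimal MCR-ALS sketch: factor D (samples x wavelengths) into
    nonnegative concentrations C and pure-component spectra S, D ~ C @ S."""
    C = C0.copy()
    for _ in range(n_iter):
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0, None)       # spectra step
        C = np.clip(np.linalg.lstsq(S.T, D.T, rcond=None)[0].T, 0, None)  # concentration step
    return C, S

# Synthetic data: two Gaussian "pure spectra" mixed in known proportions.
w = np.linspace(0, 1, 80)
S_true = np.stack([np.exp(-(w - 0.3) ** 2 / 0.005),
                   np.exp(-(w - 0.7) ** 2 / 0.005)])
C_true = np.array([[0.9, 0.1], [0.5, 0.5], [0.2, 0.8], [0.7, 0.3]])
D = C_true @ S_true

rng = np.random.default_rng(0)
C_est, S_est = mcr_als(D, rng.random((4, 2)))
print(np.linalg.norm(D - C_est @ S_est))   # reconstruction error
```

Rotational ambiguity means the recovered factors are only determined up to scaling and permutation, which is why MCR workflows rely on constraints and reference profiles to identify components such as protoporphyrin IX.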
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharifi, Mahdi; Reactor and Catalysis Research Center; Haghighi, Mohammad, E-mail: haghighi@sut.ac.ir
2014-12-15
Highlights: • Synthesis of nanostructured Ni/Y catalyst by sonochemical and impregnation methods. • Enhancement of size distribution and active phase dispersion by employing sonochemical method. • Evaluation of biogas reforming over Ni/Y catalyst with different Ni-loadings. • Preparation of highly active and stable catalyst with low Ni content for biogas reforming. • Getting H₂/CO very close to equilibrium ratio by employing sonochemical method. - Abstract: The effect of ultrasound irradiation and various Ni-loadings on dispersion of active phase over zeolite Y was evaluated in biogas reforming for hydrogen production. X-ray diffraction, field emission scanning electron microscopy, energy dispersive X-ray, Brunauer–Emmett–Teller, Fourier transform infrared analysis and TEM analysis were employed to observe the characteristics of the nanostructured catalysts. The characterizations implied that utilization of ultrasound irradiation enhanced catalyst physicochemical properties, including high dispersion of Ni on the support, smallest particle size and high catalyst surface area. The reforming reactions were carried out at GHSV = 24 l/(g·h), P = 1 atm, CH₄/CO₂ = 1 and a temperature range of 550–850 °C. Activity tests displayed that ultrasound-irradiated Ni(5 wt.%)/Y had the best performance and the activity remained stable during 600 min. Furthermore, the proposed reaction mechanism showed that there are three major reaction channels in biogas reforming.
Interdisciplinary Project-Based Learning through an Environmental Water Quality Study
NASA Astrophysics Data System (ADS)
Juhl, Lorie; Yearsley, Kaye; Silva, Andrew J.
1997-12-01
An interdisciplinary environmental water quality study was designed and conducted to enhance training and employability of chemical and environmental technician students in associate degree programs. Four project objectives were identified as a means to enhance the educational experience and employability of our students: provide experience on analytical instrumentation for organic compounds (gas chromatography/mass spectrometry, GC/MS), require interdisciplinary group interactions and problem solving, provide experience with Environmental Protection Agency (EPA) procedures, and require cooperation with state agencies/private organizations. Students worked in groups that included representatives from both programs to develop project objectives and a Sampling and Analysis Plan (SAP) following EPA standards. Input from personnel at Idaho's Department of Environmental Quality and Bureau of Laboratories and from volunteers in an environmental "watch dog" organization called the Henry's Fork Foundation aided students in the development and implementation of their SAP. Subsequently, groups sampled sections of the Henry's Fork River and analyzed for organic, inorganic, and fecal contaminants. Analysis included EPA method 525.2 for pesticides using GC/MS. Data from all river segments was shared and each group submitted a final report analyzing results. Surveys completed by students and instructors indicate that the project is a successful teaching method allowing introduction of new skills as well as review of important technical and employability skills.
Broadband computation of the scattering coefficients of infinite arbitrary cylinders.
Blanchard, Cédric; Guizal, Brahim; Felbacq, Didier
2012-07-01
We employ a time-domain method to compute the near field on a contour enclosing infinitely long cylinders of arbitrary cross section and constitution. We therefore recover the cylindrical Hankel coefficients of the expansion of the field outside the circumscribed circle of the structure. The recovered coefficients enable the wideband analysis of complex systems, e.g., the determination of the radar cross section becomes straightforward. The prescription for constructing such a numerical tool is provided in great detail. The method is validated by computing the scattering coefficients for a homogeneous circular cylinder illuminated by a plane wave, a problem for which an analytical solution exists. Finally, some radiation properties of an optical antenna are examined by employing the proposed technique.
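The coefficient-recovery step described above can be sketched for the validation case the authors mention: a homogeneous circular cylinder, for which the TM scattering coefficients of a perfectly conducting cylinder are known analytically. The field on the contour is synthesized from those coefficients and then recovered by an angular Fourier transform; the wavenumber, radii, and truncation order below are illustrative assumptions.

```python
import numpy as np
from scipy.special import jv, hankel1

k, a, R = 2 * np.pi, 0.3, 1.0        # wavenumber, cylinder radius, contour radius
N = 16                               # harmonic truncation order
orders = np.arange(-N, N + 1)

# TM scattering coefficients of a PEC circular cylinder (known analytic case).
a_n = -jv(orders, k * a) / hankel1(orders, k * a)

# Synthesize the scattered near field on the contour r = R.
M = 256
theta = 2 * np.pi * np.arange(M) / M
u = (a_n[:, None] * hankel1(orders, k * R)[:, None]
     * np.exp(1j * np.outer(orders, theta))).sum(axis=0)

# Recover the Hankel coefficients by an angular Fourier transform.
c = np.fft.fft(u) / M                                # c[m] ~ coefficient of e^{i m theta}
recovered = np.array([c[n % M] / hankel1(n, k * R) for n in orders])

print(np.max(np.abs(recovered - a_n)))
```

In the paper the near field u comes from a time-domain simulation rather than an analytic expansion, but the Fourier-projection step on the circumscribed circle is the same.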
Analyzing Visibility Configurations.
Dachsbacher, C
2011-04-01
Many algorithms, such as level of detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the thus extracted feature vectors. Our method allows perceptually motivated level of detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes, as well as real applications.
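The generalization of co-occurrence matrices from pixels to surface clusters can be sketched on a binary visibility map: count joint visibility states over adjacent cluster pairs, then derive a small feature vector for classification. This is a toy numpy illustration, not the paper's exact descriptor; the two features chosen here are assumptions.

```python
import numpy as np

def cooccurrence_features(vis):
    """Build a 2x2 co-occurrence matrix over horizontally/vertically adjacent
    cluster pairs of a binary visibility map and return a small feature vector."""
    pairs = np.concatenate([
        np.stack([vis[:, :-1].ravel(), vis[:, 1:].ravel()], axis=1),  # horizontal neighbors
        np.stack([vis[:-1, :].ravel(), vis[1:, :].ravel()], axis=1),  # vertical neighbors
    ])
    C = np.zeros((2, 2))
    for a, b in pairs:
        C[a, b] += 1
    C /= C.sum()
    # Features: visible-visible homogeneity, and visible/occluded transition rate.
    return np.array([C[1, 1], C[0, 1] + C[1, 0]])

solid = np.ones((8, 8), dtype=int)          # fully visible cluster grid
striped = np.indices((8, 8)).sum(0) % 2     # checkerboard: maximal transitions

print(cooccurrence_features(solid), cooccurrence_features(striped))
```

A fully visible region and a finely interleaved visible/occluded region have the same 50%+ visibility degree yet produce very different feature vectors, which is exactly the structural information plain visibility queries discard.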
Numerical simulation of rarefied gas flow through a slit
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Jeng, Duen-Ren; De Witt, Kenneth J.; Chung, Chan-Hong
1990-01-01
Two different approaches, the finite-difference method coupled with the discrete-ordinate method (FDDO), and the direct-simulation Monte Carlo (DSMC) method, are used in the analysis of the flow of a rarefied gas from one reservoir to another through a two-dimensional slit. The cases considered are for hard vacuum downstream pressure, finite pressure ratios, and isobaric pressure with thermal diffusion, which are not well established in spite of the simplicity of the flow field. In the FDDO analysis, by employing the discrete-ordinate method, the Boltzmann equation simplified by a model collision integral is transformed to a set of partial differential equations which are continuous in physical space but are point functions in molecular velocity space. The set of partial differential equations are solved by means of a finite-difference approximation. In the DSMC analysis, three kinds of collision sampling techniques, the time counter (TC) method, the null collision (NC) method, and the no time counter (NTC) method, are used.
Undergraduate nursing assistant employment in aged care has benefits for new graduates.
Algoso, Maricris; Ramjan, Lucie; East, Leah; Peters, Kath
2018-04-20
To determine how undergraduate assistant in nursing employment in aged care helps to prepare new graduates for clinical work as a registered nurse. The amount and quality of clinical experience afforded by university programs has been the subject of constant debate in the nursing profession. New graduate nurses are often deemed inadequately prepared for clinical practice and so many nursing students seek employment as assistants in nursing whilst studying to increase their clinical experience. This paper presents the first phase of a larger mixed-methods study to explore whether undergraduate assistant in nursing employment in aged care prepares new graduate nurses for the clinical work environment. The first phase involved the collection of quantitative data from a modified Preparation for Clinical Practice survey, which contained 50-scaled items relating to nursing practice. Ethics approval was obtained prior to commencing data collection. New graduate nurses who were previously employed as assistants in nursing in aged care and had at least 3 months' experience as a registered nurse, were invited to complete the survey. Social media and professional networks were used to distribute the survey between March 2015 and May 2016 and again in January 2017 - February 2017. Purposeful and snowballing sampling methods using social media and nursing networks were used to collect survey responses. Data were analysed using principal components analysis. 110 completed surveys were returned. Principal components analysis revealed four underlying constructs (components) of undergraduate assistant in nursing employment in aged care. These were emotional literacy (component 1), clinical skills (component 2), managing complex patient care (component 3) and health promotion (component 4). 
The four extracted components reflect the development of core nursing skills that transcend technical skills and include the ability to situate oneself as a nurse in the care of an individual and in a healthcare team. This article is protected by copyright. All rights reserved.
ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data.
Promworn, Yuttachon; Kaewprommal, Pavita; Shaw, Philip J; Intarapanich, Apichart; Tongsima, Sissades; Piriyapongsa, Jittima
2017-01-01
Biochemical methods are available for enriching 5' ends of RNAs in prokaryotes, which are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5' ends from these data by statistical analysis of the enrichment. Although statistical-based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data. The more efficient enrichment method employed in Cappable-seq compared with dRNA-seq could affect data distribution and thus algorithm performance. We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5' ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5' ends than TSSAR. In general, the transcript 5' ends detected by ToNER but not TSSAR occur in regions which cannot be locally modeled by TSSAR. ToNER uses a simple yet robust statistical modeling approach, which can be used for detecting RNA 5'ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied for analyzing other RNA-seq datasets in which enrichment for other structural features of RNA is employed. 
The program is freely available for download at ToNER webpage (http://www4a.biotec.or.th/GI/tools/toner) and GitHub repository (https://github.com/PavitaKae/ToNER).
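The enrichment test ToNER describes, transforming nucleotide enrichment ratios toward normality and flagging outlying sites, can be illustrated with a minimal sketch. The simulated counts, the lambda grid, the eight-fold spike-in, and the z > 3 cutoff are all illustrative assumptions, not ToNER's actual implementation:

```python
import numpy as np

def boxcox_transform(x, lmbdas=np.linspace(0.0, 2.0, 41)):
    """Grid-search the Box-Cox lambda that maximizes the profile
    normal log-likelihood of the transformed data."""
    x = np.asarray(x, float)
    n, log_sum = len(x), np.log(x).sum()
    best = (-np.inf, None)
    for lam in lmbdas:
        y = np.log(x) if lam < 1e-12 else (x**lam - 1.0) / lam
        ll = -0.5 * n * np.log(y.var()) + (lam - 1.0) * log_sum
        if ll > best[0]:
            best = (ll, y)
    return best[1]

# Hypothetical per-nucleotide counts: the enriched library mirrors the
# control except at a few true 5' ends, which are boosted eight-fold.
rng = np.random.default_rng(0)
control = rng.poisson(50, 1000) + 1
enriched = control * rng.lognormal(0.0, 0.2, 1000)
enriched[::100] *= 8

ratio = enriched / control            # nucleotide enrichment score
y = boxcox_transform(ratio)           # transform toward normality
z = (y - y.mean()) / y.std()
hits = np.flatnonzero(z > 3)          # sites called significantly enriched
print(len(hits))
```

With these assumptions the spiked sites stand roughly seven standard deviations above the bulk, so the z > 3 cutoff recovers them cleanly; the real tool additionally pools evidence across replicates.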
Unequal cluster sizes in stepped-wedge cluster randomised trials: a systematic review.
Kristunas, Caroline; Morris, Tom; Gray, Laura
2017-11-15
To investigate the extent to which cluster sizes vary in stepped-wedge cluster randomised trials (SW-CRT) and whether any variability is accounted for during the sample size calculation and analysis of these trials. Setting: any, not limited to healthcare. Participants: any cluster taking part in an SW-CRT published up to March 2016. The primary outcome is the variability in cluster sizes, measured by the coefficient of variation (CV) in cluster size. Secondary outcomes include the difference between the cluster sizes assumed during the sample size calculation and those observed during the trial, any reported variability in cluster sizes and whether the methods of sample size calculation and methods of analysis accounted for any variability in cluster sizes. Of the 101 included SW-CRTs, 48% mentioned that the included clusters were known to vary in size, yet only 13% of these accounted for this during the calculation of the sample size. However, 69% of the trials did use a method of analysis appropriate for when clusters vary in size. Full trial reports were available for 53 trials. The CV was calculated for 23 of these: the median CV was 0.41 (IQR: 0.22-0.52). Actual cluster sizes could be compared with those assumed during the sample size calculation for 14 (26%) of the trial reports; the cluster sizes were between 29% and 480% of that which had been assumed. Cluster sizes often vary in SW-CRTs. Reporting of SW-CRTs also remains suboptimal. The effect of unequal cluster sizes on the statistical power of SW-CRTs needs further exploration and methods appropriate to studies with unequal cluster sizes need to be employed. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
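The review's primary outcome, the coefficient of variation in cluster size, is a one-line calculation; a minimal sketch with hypothetical cluster sizes:

```python
import statistics

def cluster_size_cv(sizes):
    """Coefficient of variation of cluster size: SD divided by the mean
    (population SD, as the clusters in a given trial are a fixed set)."""
    return statistics.pstdev(sizes) / statistics.fmean(sizes)

# Hypothetical cluster sizes from a six-cluster stepped-wedge trial.
sizes = [20, 35, 50, 80, 120, 200]
print(round(cluster_size_cv(sizes), 2))   # well above the review's median CV of 0.41
```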
NASA Astrophysics Data System (ADS)
Rajamanickam, Govindaraj; Narendhiran, Santhosh; Muthu, Senthil Pandian; Mukhopadhyay, Sumita; Perumalsamy, Ramasamy
2017-12-01
Titanium dioxide is a promising wide band gap semiconducting material for dye-sensitized solar cells, but poor electron transport properties remain a challenge with conventional nanoparticles. Here, we synthesized TiO2 nanorods/nanoparticles by a hydrothermal method to improve the charge transport properties. The structural and morphological characteristics of the prepared nanorods/nanoparticles were analysed with X-ray diffraction and electron microscopy, respectively. A high power conversion efficiency of 7.7% is achieved with the nanorods/nanoparticles-based device under 100 mW/cm2 illumination. Electrochemical impedance analysis showed superior electron transport properties for the synthesized TiO2 nanorods/nanoparticles device compared with the commercial P25 nanoparticle-based device.
Integrated multiplexed capillary electrophoresis system
Yeung, Edward S.; Tan, Hongdong
2002-05-14
The present invention provides an integrated multiplexed capillary electrophoresis system for the analysis of sample analytes. The system integrates and automates multiple components, such as chromatographic columns and separation capillaries, and further provides a detector for the detection of analytes eluting from the separation capillaries. The system employs multiplexed freeze/thaw valves to manage fluid flow and sample movement. The system is computer controlled and is capable of processing samples through reaction, purification, denaturation, pre-concentration, injection, separation and detection in parallel fashion. Methods employing the system of the invention are also provided.
Zeeshan, Farrukh; Tabbassum, Misbah; Jorgensen, Lene; Medlicott, Natalie J
2018-02-01
Protein drugs may encounter conformational perturbations during the formulation processing of lipid-based solid dosage forms. In aqueous protein solutions, attenuated total reflection Fourier transform infrared (ATR FT-IR) spectroscopy can investigate these conformational changes following the subtraction of the spectral interference of the solvent with protein amide I bands. However, in solid dosage forms, the possible spectral contribution of lipid carriers to the protein amide I band may be an obstacle to determining conformational alterations. The objective of this study was to develop an ATR FT-IR spectroscopic method for the analysis of protein secondary structure embedded in solid lipid matrices. Bovine serum albumin (BSA) was chosen as a model protein, while Precirol ATO5 (glycerol palmitostearate, melting point 58 °C) was employed as the model lipid matrix. Bovine serum albumin was incorporated into the lipid using physical mixing, melting and mixing, or wet granulation mixing methods. Attenuated total reflection FT-IR spectroscopy and size exclusion chromatography (SEC) were performed for the analysis of BSA secondary structure and its dissolution in aqueous media, respectively. The results showed significant interference of Precirol ATO5 with the BSA amide I band, which could be subtracted at lipid contents of up to 90% w/w to analyze BSA secondary structure. In addition, ATR FT-IR spectroscopy also detected thermally denatured BSA solid alone and in the presence of the lipid matrix, indicating its suitability for the detection of denatured protein solids in lipid matrices. Despite being in the solid state, conformational changes occurred to BSA upon incorporation into solid lipid matrices. However, the extent of these conformational alterations was found to depend on the mixing method employed, as indicated by area overlap calculations.
For instance, the melting and mixing method had a negligible effect on BSA secondary structure, whereas the wet granulation mixing method promoted more changes. Size exclusion chromatography analysis confirmed the complete dissolution of BSA in the aqueous media employed in the wet granulation method. In conclusion, an ATR FT-IR spectroscopic method was successfully developed to investigate BSA secondary structure in solid lipid matrices following the subtraction of lipid spectral interference. ATR FT-IR spectroscopy could further be applied to investigate secondary structure perturbations of therapeutic proteins during formulation development.
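The lipid-subtraction and area-overlap ideas in this abstract can be sketched with synthetic Gaussian bands. All band positions, widths, and intensities below are illustrative assumptions, not the study's measured spectra:

```python
import numpy as np

wn = np.linspace(1600, 1700, 201)          # wavenumber grid, cm^-1
dw = wn[1] - wn[0]

def band(center, width, height):
    """Gaussian stand-in for an infrared absorption band."""
    return height * np.exp(-0.5 * ((wn - center) / width) ** 2)

# Hypothetical spectra: a native protein amide I band plus a lipid band
# that leaks into the amide I region.
protein_ref = band(1655, 12, 1.0)
lipid = band(1690, 25, 0.3)
mixture = 0.7 * band(1652, 14, 1.0) + lipid

corrected = mixture - lipid                # subtract the lipid interference

def area_overlap(a, b):
    """Fractional overlap of two area-normalized band profiles."""
    a = a / (a.sum() * dw)
    b = b / (b.sum() * dw)
    return np.minimum(a, b).sum() * dw

print(round(area_overlap(corrected, protein_ref), 2))
```

An overlap near 1 indicates little conformational change relative to the reference; mixing methods that perturb the structure would lower it.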
NASA Technical Reports Server (NTRS)
Johnson, F. T.
1980-01-01
A method for solving the linear integral equations of incompressible potential flow in three dimensions is presented. Both analysis (Neumann) and design (Dirichlet) boundary conditions are treated in a unified approach to the general flow problem. The method is an influence coefficient scheme which employs source and doublet panels as boundary surfaces. Curved panels possessing singularity strengths, which vary as polynomials are used, and all influence coefficients are derived in closed form. These and other features combine to produce an efficient scheme which is not only versatile but eminently suited to the practical realities of a user-oriented environment. A wide variety of numerical results demonstrating the method is presented.
Application of a data-mining method based on Bayesian networks to lesion-deficit analysis
NASA Technical Reports Server (NTRS)
Herskovits, Edward H.; Gerring, Joan P.
2003-01-01
Although lesion-deficit analysis (LDA) has provided extensive information about structure-function associations in the human brain, LDA has suffered from the difficulties inherent to the analysis of spatial data, i.e., there are many more variables than subjects, and data may be difficult to model using standard distributions, such as the normal distribution. We herein describe a Bayesian method for LDA; this method is based on data-mining techniques that employ Bayesian networks to represent structure-function associations. These methods are computationally tractable, and can represent complex, nonlinear structure-function associations. When applied to the evaluation of data obtained from a study of the psychiatric sequelae of traumatic brain injury in children, this method generates a Bayesian network that demonstrates complex, nonlinear associations among lesions in the left caudate, right globus pallidus, right side of the corpus callosum, right caudate, and left thalamus, and subsequent development of attention-deficit hyperactivity disorder, confirming and extending our previous statistical analysis of these data. Furthermore, analysis of simulated data indicates that methods based on Bayesian networks may be more sensitive and specific for detecting associations among categorical variables than methods based on chi-square and Fisher exact statistics.
DAKOTA Design Analysis Kit for Optimization and Terascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Dalbey, Keith R.; Eldred, Michael S.
2010-02-24
The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes (computational models) and iterative analysis methods. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and analysis of computational models on high performance computers. A user provides a set of DAKOTA commands in an input file and launches DAKOTA. DAKOTA invokes instances of the computational models, collects their results, and performs systems analyses. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, polynomial chaos, stochastic collocation, and epistemic methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as hybrid optimization, surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. Services for parallel computing, simulation interfacing, approximation modeling, fault tolerance, restart, and graphics are also included.
Adaptive methods for nonlinear structural dynamics and crashworthiness analysis
NASA Technical Reports Server (NTRS)
Belytschko, Ted
1993-01-01
The objective is to describe three research thrusts in crashworthiness analysis: adaptivity; mixed time integration, or subcycling, in which different timesteps are used for different parts of the mesh in explicit methods; and methods for contact-impact which are highly vectorizable. The techniques are being developed to improve the accuracy of calculations, ease-of-use of crashworthiness programs, and the speed of calculations. The latter is still of importance because crashworthiness calculations are often made with models of 20,000 to 50,000 elements using explicit time integration and require on the order of 20 to 100 hours on current supercomputers. The methodologies are briefly reviewed and then some example calculations employing these methods are described. The methods are also of value to other nonlinear transient computations.
NASA Astrophysics Data System (ADS)
Belov, A. V.; Kurkov, Andrei S.; Chikolini, A. V.
1990-08-01
An offset method is modified to allow an analysis of the distribution of fields in a single-mode fiber waveguide without recourse to the Gaussian approximation. A new approximation for the field is obtained for fiber waveguides with a step refractive index profile and a special analysis employing the Hankel transformation is applied to waveguides with a distributed refractive index. The field distributions determined by this method are compared with the corresponding distributions calculated from the refractive index of a preform from which the fibers are drawn. It is shown that these new approaches can be used to determine the dimensions of a mode spot defined in different ways and to forecast the dispersion characteristics of single-mode fiber waveguides.
THE DETERMINATION OF TRACES OF IRON IN SAMPLES OF PLATINUM BY NEUTRON-ACTIVATION ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, D.F.C.; Killick, R.A.
1963-11-01
A neutron-activation analysis method for the determination of traces of iron in samples of purified platinum is described. The nuclear reactor BEPO at Harwell was used as the neutron source. A rapid radiochemical separation procedure using carriers was employed to decontaminate the iron activity from most other induced activities. The analysis is completed by discriminated gamma scintillation counting. Results of analyses of seven samples of platinum are quoted. The method of analysis has the advantage that it obviates difficulties caused by reagent blanks or by contamination from traces of inactive iron after irradiation. Interference resulting from nuclear reactions of elements other than iron in the samples appears to be of no consequence.
Garbarino, John R.; Taylor, Howard E.
1987-01-01
Inductively coupled plasma mass spectrometry is employed in the determination of Ni, Cu, Sr, Cd, Ba, Ti, and Pb in nonsaline, natural water samples by stable isotope dilution analysis. Hydrologic samples were directly analyzed without any unusual pretreatment. Interference effects related to overlapping isobars, formation of metal oxide and multiply charged ions, and matrix composition were identified and suitable methods of correction evaluated. A comparability study showed that single-element isotope dilution analysis was only marginally better than sequential multielement isotope dilution analysis. Accuracy and precision of the single-element method were determined on the basis of results obtained for standard reference materials. The instrumental technique was shown to be ideally suited for programs associated with certification of standard reference materials.
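Isotope dilution analysis rests on a single ratio equation: spiking the sample with an isotopically enriched standard and measuring the blended isotope ratio determines the analyte amount. A hedged sketch for a hypothetical Cu determination (natural 63Cu/65Cu abundances are the standard values; the spike enrichment and amounts are assumptions):

```python
def isotope_dilution_amount(n_spike, R_mix, nat, spike):
    """Amount of analyte from the measured isotope ratio
    R = isotope A / isotope B of the spiked sample.
    nat and spike are (fraction_A, fraction_B) abundance pairs."""
    a_n, b_n = nat
    a_s, b_s = spike
    return n_spike * (a_s - R_mix * b_s) / (R_mix * b_n - a_n)

natural = (0.6917, 0.3083)   # 63Cu, 65Cu natural abundances
spike = (0.01, 0.99)         # hypothetical 65Cu-enriched spike
n_spike = 1.0                # amount of spike added (e.g. in umol)

# Simulate the ratio a 2.0 umol sample would produce, then invert it.
n_true = 2.0
R = (n_true * natural[0] + n_spike * spike[0]) / (n_true * natural[1] + n_spike * spike[1])
print(round(isotope_dilution_amount(n_spike, R, natural, spike), 3))
```

Because the measurement is a ratio, partial analyte loss after spike equilibration cancels out, which is why the abstract notes the technique sidesteps reagent-blank and contamination problems faced by other trace methods.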
Employing broadband spectra and cluster analysis to assess thermal defoliation of cotton
USDA-ARS?s Scientific Manuscript database
Growers and field scouts need assistance in surveying cotton (Gossypium hirsutum L.) fields subjected to thermal defoliation to reap the benefits provided by this nonchemical defoliation method. A study was conducted to evaluate broadband spectral data and unsupervised classification as tools for s...
Measuring charge nonuniformity in MOS devices
NASA Technical Reports Server (NTRS)
Maserjian, J.; Zamani, N.
1980-01-01
Convenient method of determining inherent lateral charge non-uniformities along silicon dioxide/silicon interface of metal-oxide-semiconductor (MOS) employs rapid measurement of capacitance of interface as function of voltage at liquid nitrogen temperature. Charge distribution is extracted by fast-Fourier-transform analysis of capacitance voltage (C-V) measurement.
Eleventh All-Union Conference on High-Molecular-Weight Compounds
1960-07-18
report of B. P. Yershov (Scientific Research Institute for Plastics) on the employment of high-frequency titration for the analysis of polymer materials...development of a new thermometric method of control in the production of synthetic materials based on polyethylacrylates. In the reports and communications
DOT National Transportation Integrated Search
2017-04-04
This paper employs the finite element (FE) modeling method to investigate the contributing factors to the horizontal splitting cracks observed in the upper strand plane in some concrete crossties made with seven-wire strands. The concrete...
Standardised Library Instruction Assessment: An Institution-Specific Approach
ERIC Educational Resources Information Center
Staley, Shannon M.; Branch, Nicole A.; Hewitt, Tom L.
2010-01-01
Introduction: We explore the use of a psychometric model for locally-relevant, information literacy assessment, using an online tool for standardised assessment of student learning during discipline-based library instruction sessions. Method: A quantitative approach to data collection and analysis was used, employing standardised multiple-choice…
Size and shape measurement in contemporary cephalometrics.
McIntyre, Grant T; Mossey, Peter A
2003-06-01
The traditional method of analysing cephalograms--conventional cephalometric analysis (CCA)--involves the calculation of linear distance measurements, angular measurements, area measurements, and ratios. Because shape information cannot be determined from these 'size-based' measurements, an increasing number of studies employ geometric morphometric tools in the cephalometric analysis of craniofacial morphology. Most of the discussions surrounding the appropriateness of CCA, Procrustes superimposition, Euclidean distance matrix analysis (EDMA), thin-plate spline analysis (TPS), finite element morphometry (FEM), elliptical Fourier functions (EFF), and medial axis analysis (MAA) have centred upon mathematical and statistical arguments. Surprisingly, little information is available to assist the orthodontist in the clinical relevance of each technique. This article evaluates the advantages and limitations of the above methods currently used to analyse the craniofacial morphology on cephalograms and investigates their clinical relevance and possible applications.
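Procrustes superimposition, one of the geometric morphometric tools discussed, removes translation, scale, and rotation so that only shape differences remain. A minimal sketch with hypothetical 2-D landmarks (real cephalometric analyses use many more points and generalized, multi-specimen Procrustes):

```python
import numpy as np

def procrustes_distance(X, Y):
    """Ordinary Procrustes superimposition of two landmark configurations:
    centre each set, scale to unit centroid size, rotate Y optimally onto X
    via SVD, and return the root sum-of-squares shape distance."""
    Xc = X - X.mean(0)
    Yc = Y - Y.mean(0)
    Xc = Xc / np.linalg.norm(Xc)
    Yc = Yc / np.linalg.norm(Yc)
    U, _, Vt = np.linalg.svd(Xc.T @ Yc)
    R = (U @ Vt).T                     # optimal rotation of Y onto X
    return np.linalg.norm(Xc - Yc @ R)

# Hypothetical landmarks; the second set is the first rotated, scaled,
# and shifted, so its shape distance from the first should be ~0.
pts = np.array([[0., 0.], [2., 0.], [2., 1.], [0.5, 1.5]])
theta = 0.4
rot = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
same_shape = 1.7 * pts @ rot.T + np.array([5., -3.])
print(round(procrustes_distance(pts, same_shape), 6))
```

This is exactly the sense in which the 'size-based' CCA measurements and shape-based methods differ: the distance above is zero for any resized or repositioned copy of the same configuration.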
Vibrational Spectral Studies of Gemfibrozil
NASA Astrophysics Data System (ADS)
Benitta, T. Asenath; Balendiran, G. K.; James, C.
2008-11-01
The Fourier Transform Raman and infrared spectra of the crystallized drug molecule 5-(2,5-Dimethylphenoxy)-2,2-dimethylpentanoic acid (Gemfibrozil) have been recorded and analyzed. Quantum chemical computational methods have been employed using Gaussian 03 software package based on Hartree Fock method for theoretically modeling the grown molecule. The optimized geometry and vibrational frequencies have been predicted. Observed vibrational modes have been assigned with the aid of normal coordinate analysis.
Analysis of Curved Target-Type Thrust Reversers
1974-06-07
methods for two-dimensional cases, the Levi-Civita method provides a variety of bucket shapes and enables one to round off the sharp corners of...surface. In the present work three methods are employed to investigate the deflection of inviscid, incompressible flow by curved surfaces: Levi-Civita's ...shapes are shown in Fig. 3. A special case for sigma_1 = 0.31416 and sigma_2 = 0.47124, ..., is shown in Fig. 4. Evidently, Levi-Civita's
Application of the Boundary Element Method to Fatigue Crack Growth Analysis
1988-09-01
III, and Noetic PROBE in Section IV. Correlation of the boundary element method and modeling techniques employed in this study was shown with the... The purpose of this study was to apply the boundary element method (BEM) to two-dimensional fracture mechanics...problems, and to use the BEM to analyze the interference effects of holes on cracks through a parametric study of a two-hole tension strip. The study
The role of continuity in residual-based variational multiscale modeling of turbulence
NASA Astrophysics Data System (ADS)
Akkerman, I.; Bazilevs, Y.; Calo, V. M.; Hughes, T. J. R.; Hulshoff, S.
2008-02-01
This paper examines the role of continuity of the basis in the computation of turbulent flows. We compare standard finite elements and non-uniform rational B-splines (NURBS) discretizations that are employed in Isogeometric Analysis (Hughes et al. in Comput Methods Appl Mech Eng, 194:4135-4195, 2005). We make use of quadratic discretizations that are C0-continuous across element boundaries in standard finite elements, and C1-continuous in the case of NURBS. The variational multiscale residual-based method (Bazilevs in Isogeometric analysis of turbulence and fluid-structure interaction, PhD thesis, ICES, UT Austin, 2006; Bazilevs et al. in Comput Methods Appl Mech Eng, submitted, 2007; Calo in Residual-based multiscale turbulence modeling: finite volume simulation of bypass transition, PhD thesis, Department of Civil and Environmental Engineering, Stanford University, 2004; Hughes et al. in Proceedings of the XXI International Congress of Theoretical and Applied Mechanics (IUTAM), Kluwer, 2004; Scovazzi in Multiscale methods in science and engineering, PhD thesis, Department of Mechanical Engineering, Stanford University, 2004) is employed as a turbulence modeling technique. We find that C1-continuous discretizations outperform their C0-continuous counterparts on a per-degree-of-freedom basis. We also find that the effect of continuity is greater for higher Reynolds number flows.
A square wave is the most efficient and reliable waveform for resonant actuation of micro switches
NASA Astrophysics Data System (ADS)
Ben Sassi, S.; Khater, M. E.; Najar, F.; Abdel-Rahman, E. M.
2018-05-01
This paper investigates efficient actuation methods of shunt MEMS switches and other parallel-plate actuators. We start by formulating a multi-physics model of the micro switch, coupling the nonlinear Euler-Bernoulli beam theory with the nonlinear Reynolds equation to describe the structural and fluidic domains, respectively. The model takes into account fringing field effects as well as mid-plane stretching and squeeze film damping nonlinearities. Static analysis is undertaken using the differential quadrature method (DQM) to obtain the pull-in voltage, which is verified by means of the finite element model and validated experimentally. We develop a reduced order model employing the Galerkin method for the structural domain and DQM for the fluidic domain. The proposed waveforms are intended to be more suitable for integrated circuit standards. The dynamic response of the micro switch to harmonic, square and triangular waveforms are evaluated and compared experimentally and analytically. Low voltage actuation is obtained using dynamic pull-in with the proposed waveforms. In addition, global stability analysis carried out for the three signals shows advantages of employing the square signal as the actuation method in enhancing the performance of the micro switch in terms of actuation voltage, switching time, and sensitivity to initial conditions.
NASA Astrophysics Data System (ADS)
Li, Li-Na; Ma, Chang-Ming; Chang, Ming; Zhang, Ren-Cheng
2017-12-01
A novel method based on SIMPLe-to-use Interactive Self-modeling Mixture Analysis (SIMPLISMA) and Kernel Partial Least Squares (KPLS), named SIMPLISMA-KPLS, is proposed in this paper for the simultaneous selection of outlier samples and informative samples. It is a quick algorithm for model standardization (also known as model transfer) in near infrared (NIR) spectroscopy. NIR data of corn samples, used for analysis of protein content, are introduced to evaluate the proposed method. Piecewise direct standardization (PDS) is employed in model transfer, and SIMPLISMA-PDS-KPLS is compared with KS-PDS-KPLS in terms of the prediction accuracy of protein content and the calculation speed of each algorithm. The conclusions are that SIMPLISMA-KPLS can be utilized as an alternative sample selection method for model transfer. Although its accuracy is similar to that of Kennard-Stone (KS), it differs from KS in that it employs concentration information in the selection program. This ensures that analyte information is involved in the analysis and that the spectra (X) of the selected samples are correlated with the concentration (y). It can also be used simultaneously for outlier elimination, by validation of the calibration. The running-time statistics make clear that sample selection is more rapid when using KPLS. The quick SIMPLISMA-KPLS algorithm is beneficial for improving the speed of online measurement using NIR spectroscopy.
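For comparison, the classical Kennard-Stone selection that the abstract benchmarks against can be sketched in a few lines. The toy 2-D "spectra" below are an illustrative assumption; a real application would run on full NIR spectra or their scores:

```python
import numpy as np

def kennard_stone(X, k):
    """Kennard-Stone selection: start from the two most distant samples,
    then repeatedly add the sample farthest from the selected set."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(D), D.shape)
    chosen = [i, j]
    while len(chosen) < k:
        rest = [p for p in range(len(X)) if p not in chosen]
        # For each candidate, its distance to the nearest chosen sample.
        dmin = D[np.ix_(rest, chosen)].min(axis=1)
        chosen.append(rest[int(np.argmax(dmin))])
    return chosen

# The corners of the square are mutually far apart and should all be
# picked before the centre point.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.], [0.5, 0.5]])
print(sorted(kennard_stone(X, 4)))
```

Note that KS uses only the spectra (X); the paper's point is that SIMPLISMA-KPLS additionally exploits the concentration (y) during selection.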
Determination of benzylpenicillin in pharmaceuticals by capillary zone electrophoresis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoyt, A.M. Jr.; Sepaniak, M.J.
A rapid and direct method is described for the determination of benzylpenicillin (penicillin G) in pharmaceutical preparations. The method involves very little sample preparation, and the total analysis time for duplicate results is less than 30 minutes per sample. The method takes advantage of the speed and separating power of capillary zone electrophoresis (CZE). Detection of penicillin is by absorption at 228 nm. An internal standard is employed to reduce sample injection error. The method was applied successfully to both tablets and injectable preparations. 14 refs., 5 figs., 3 tabs.
Examining the extraction of artemisinin from artemisia annua using ultrasound
NASA Astrophysics Data System (ADS)
Briars, Rhianna; Paniwnyk, Larysa
2012-05-01
Artemisinin suppresses the life-cycle of the plasmodium parasite which causes malaria. It occurs naturally within the trichome glands of the Artemisia annua plant. Traditional methods for extracting artemisinin are time-consuming and have a high environmental impact due to the temperatures and organic solvents which must be employed. Ultrasound decreases these through acoustic streaming and micro-jets, but to fully utilise this technology, parameters such as frequency, temperature, and the properties of the leaf and solvent must be explored. As with the extraction process, there is no set analysis method for the identification of artemisinin, so several methods of analysing these extracts are employed. Initial results indicate that sonication is able to enhance the levels of artemisinin extracted when compared to the conventional extraction process. In addition, Thin Layer Chromatography (TLC) and High Performance Liquid Chromatography (HPLC) have been shown to give a high level of reproducible calibration.
NASA Technical Reports Server (NTRS)
Ross, M. D.; Pote, K. G.; Rarey, K. E.; Verma, L. M.
1981-01-01
The gravity receptors of all vertebrates utilize a 'test mass' consisting of a complex arrangement of mineral and organic substance that lies over the sensory receptor areas. In most vertebrates, the mineral is a polymorph of calcium carbonate in the form of minute, single crystals called otoconia. An investigation is conducted to determine the number of proteins in otoconial complexes and their molecular weights. The investigation makes use of a microdisk gel electrophoresis method reported by Gainer (1971). The most important finding of the reported research is that analysis of the proteins of the organic material of the otoconial complexes is possible when sensitive microanalytical methods are employed. Further modification of the basic technique employed and the inclusion of other sensitive staining methods should mean that, in the future, protein separation by molecular weight will be possible in sample pools containing only two otoconial masses.
Assessing digital literacy in web-based physical activity surveillance: the WIN study.
Mathew, Merly; Morrow, James R; Frierson, Georita M; Bain, Tyson M
2011-01-01
PURPOSE. Investigate relations between demographic characteristics and submission method, Internet or paper, when physical activity behaviors are reported. DESIGN. Observational. SETTING. Metropolitan. SUBJECTS. Adult women (N = 918) observed weekly for 2 years (total number of weekly reports, 44,963). MEASURES. Independent variables included age, race, education, income, employment status, and Internet skills. Dependent variables were method of submission (Internet or paper) and adherence. ANALYSIS. Logistic regression to analyze weekly odds of submitting data online and meeting study adherence criteria. Model 1 investigated method of submission, model 2 analyzed meeting the study's Internet adherence, and model 3 analyzed meeting total adherence regardless of submission method. RESULTS. Whites, those with good Internet skills, and those reporting higher incomes were more likely to log online. Those who were white, older, and reported good Internet skills were more likely to be at least 75% adherent online. Older women were more likely to be adherent regardless of method. Employed women were less likely to log online or be adherent. CONCLUSION. Providing participants with multiple submission methods may reduce potential bias and provide more generalizable results relevant for future Internet-based research.
A new ultrasonic method for measuring minute motion activities of rats.
Young, C W; Young, M S; Li, Y C; Lin, M T
1996-12-01
A new ultrasonic method is presented for measuring the minute motion activities of rats. A pair of low-cost 40 kHz ultrasonic transducers are used to transmit ultrasound toward a rat and receive the ultrasound reflected from the rat. The relative motion of the rat modulates the phase difference between the transmitted and received ultrasound signals. An 8-bit digital phase meter was designed to record the phase difference signal which was used to reconstruct the relative motion waveform of the rat in an 8751 single-chip microcomputer. The reconstructed data are then sent to a PC-AT microcomputer for further processing. This method employs a spectrum analysis for the reconstructed data and can measure three minute motion activities including locomotor activity (LMA), tremor and myoclonia. Finally, the method has been tested with real animal experiments. The main advantages of this new method are that it is non-invasive, non-contact, low cost and high precision. This new method could also be profitably employed for other behavioral studies and offer potential for research in basic medicine.
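The spectrum-analysis step described, recovering an activity frequency from the reconstructed phase-difference signal, can be sketched as follows. The signal is synthetic: the slow locomotor drift, the 8 Hz tremor-like component, and the sampling rate are illustrative assumptions:

```python
import numpy as np

# Hypothetical phase-difference record: a slow locomotor drift plus a
# tremor-like oscillation; the spectrum separates the two activities.
fs = 100.0                                # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
phase = 0.2 * t + 0.5 * np.sin(2 * np.pi * 8.0 * t)

# Remove the linear drift, then locate the dominant spectral peak.
x = phase - np.polyval(np.polyfit(t, phase, 1), t)
spec = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(x), 1 / fs)
peak = freqs[np.argmax(spec[1:]) + 1]     # skip the DC bin
print(peak)
```

In the same spirit, the paper's three activity measures would correspond to energy in different frequency bands of this spectrum.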
Report to the President of the United States on Sexual Assault Prevention and Response
2014-11-01
established research history based on laboratory-tested principles of memory retrieval, knowledge representation, and communication. AFOSI has been using CI...analysis methods, including scientific research, data analysis, focus groups, and on-site assessments to evaluate the Department's SAPR program... DMDC's focus group methodology employs a standard qualitative research approach to
Employing conservation of co-expression to improve functional inference
Daub, Carsten O; Sonnhammer, Erik LL
2008-01-01
Background Observing co-expression between genes suggests that they are functionally coupled. Co-expression of orthologous gene pairs across species may improve function prediction beyond the level achieved in a single species. Results We used orthology between genes of the three species S. cerevisiae, D. melanogaster, and C. elegans to combine co-expression across two species at a time. This increased function prediction accuracy when we incorporated expression data from either of the other two species, and accuracy increased even further when conservation across both of the other species was considered at the same time. Employing conservation across species to incorporate abundant model organism data for the prediction of protein interactions in poorly characterized species constitutes a very powerful annotation method. Conclusion To be able to employ the most suitable co-expression distance measure for our analysis, we evaluated the ability of four popular gene co-expression distance measures to detect biologically relevant interactions between pairs of genes. For the expression datasets employed in our co-expression conservation analysis above, we used the GO and the KEGG PATHWAY databases as gold standards. While the differences between distance measures were small, Spearman correlation proved to give the most robust results. PMID:18808668
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Jason P.; Pender, John; Wiser, Ryan
2012-09-02
The economic development potential from wind power installations has been a driver of public and policy support for the industry at the local and state levels for many years. The possibility for economic development has been particularly salient in rural areas of the country where new investment, earnings growth, and employment opportunities have, in many cases, otherwise trended downward for some time. Despite frequent mention of the economic development potential of wind power projects, however, questions persist on the magnitude, distribution, and durability of these impacts. Of particular concern for rural communities is whether new investment in wind power projects stimulates long-term local economic growth and employment. Questions about the economic development and employment impacts of wind power also persist at the national level. However, such debates tend to be more concerned with potential economic losses associated with displacement of other energy sources or land uses and the macroeconomic effects of policy support for renewable energy and changes in electricity rates that might result from wind energy deployment. The present analysis focuses solely on county-level impacts.
Analytical Solutions for Rumor Spreading Dynamical Model in a Social Network
NASA Astrophysics Data System (ADS)
Fallahpour, R.; Chakouvari, S.; Askari, H.
2015-03-01
In this paper, the Laplace Adomian decomposition method (LADM) is utilized to evaluate a rumor-spreading model. First, a succinct review is given of the use of analytical methods such as the Adomian decomposition method, the variational iteration method, and the homotopy analysis method for epidemic models and biomathematics. Next, a rumor-spreading model incorporating a forgetting mechanism is assumed, and LADM is applied to solve it. By means of this method, a general solution is achieved that can be readily employed to assess the rumor model without resorting to any computer program. In addition, the results obtained are discussed for different cases and parameters. Furthermore, the method is shown to be straightforward and fruitful for analyzing equations with complicated terms, such as the rumor model. Comparison with numerical methods reveals that LADM is powerful and accurate in eliciting solutions of this model. It is concluded that the method is well suited to this problem and can provide researchers with a very powerful vehicle for scrutinizing rumor models in diverse kinds of social networks such as Facebook, YouTube, Flickr, LinkedIn, and Twitter.
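For reference, the generic LADM scheme for a first-order model $\dot{u} = g(u)$ with $u(0) = u_0$ reads as follows (this is the standard textbook form of the method; the paper's specific rumor-model equations are not reproduced here):

```latex
% Laplace transform of \dot{u} = g(u), u(0) = u_0:
%   s\,\mathcal{L}[u] - u_0 = \mathcal{L}[g(u)]
\mathcal{L}[u] = \frac{u_0}{s} + \frac{1}{s}\,\mathcal{L}\!\left[\sum_{n=0}^{\infty} A_n\right],
\qquad u = \sum_{n=0}^{\infty} u_n,
% Adomian polynomials decompose the nonlinearity g:
A_n = \frac{1}{n!}\,\frac{d^n}{d\lambda^n}\,
      g\!\left(\sum_{k=0}^{\infty} \lambda^k u_k\right)\Bigg|_{\lambda=0},
% the series is then built term by term:
u_0 = u(0), \qquad
u_{n+1} = \mathcal{L}^{-1}\!\left[\frac{1}{s}\,\mathcal{L}[A_n]\right].
```

Truncating the series after a few terms gives the closed-form approximate solution that the abstract says can be evaluated "without exerting any computer program".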
Acoustic emission from a growing crack
NASA Technical Reports Server (NTRS)
Jacobs, Laurence J.
1989-01-01
An analytical method is being developed to determine the signature of an acoustic emission waveform from a growing crack and the results of this analysis are compared to experimentally obtained values. Within the assumptions of linear elastic fracture mechanics, a two dimensional model is developed to examine a semi-infinite crack that, after propagating with a constant velocity, suddenly stops. The analytical model employs an integral equation method for the analysis of problems of dynamic fracture mechanics. The experimental procedure uses an interferometric apparatus that makes very localized absolute measurements with very high fidelity and without acoustically loading the specimen.
Galli, V; Barbas, C
2004-09-10
A method has been developed for the analysis of a cough syrup containing dextromethorphan, guaifenesin, benzoic acid, saccharin and other components. Forced degradation was also studied to demonstrate that the method could be employed during a stability study of the syrup. Final conditions were phosphate buffer (25 mM, pH 2.8) with triethylamine (TEA)-acetonitrile (75:25, v/v). In such conditions, all the actives, excipients and degradation products were baseline resolved in less than 14 min, and different wavelengths were used for the different analytes and related compounds.
Reanalysis, compatibility and correlation in analysis of modified antenna structures
NASA Technical Reports Server (NTRS)
Levy, R.
1989-01-01
A simple computational procedure is synthesized to process changes in the microwave-antenna pathlength-error measure when there are changes in the antenna structure model. The procedure employs structural modification reanalysis methods combined with new extensions of correlation analysis to provide the revised rms pathlength error. Mainframe finite-element-method processing of the structure model is required only for the initial unmodified structure, and elementary postprocessor computations develop and deal with the effects of the changes. Several illustrative computational examples are included. The procedure adapts readily to processing spectra of changes for parameter studies or sensitivity analyses.
Vasil'ev, G F
2013-01-01
Owing to methodological shortcomings, control theory has yet to realize its potential for the analysis of biological systems. To obtain the full benefit of the theory, a parametric model of control is proposed for use in addition to the algorithmic model of control (to date the only model used in control theory), and the reasoning behind this proposal is explained. The suggested approach makes the full potential of modern control theory available for the analysis of biological systems. The cybernetic approach is illustrated by taking the rise of glucose concentration in blood as an example system.
NASA Technical Reports Server (NTRS)
Dash, S.; Delguidice, P.
1972-01-01
A second-order numerical method employing reference plane characteristics has been developed for the calculation of geometrically complex three-dimensional nozzle-exhaust flow fields, heretofore incalculable by existing methods. The nozzles may have irregular cross sections with swept throats and may be stacked in modules using the vehicle undersurface for additional expansion. The nozzles may have highly nonuniform entrance conditions, the medium considered being an equilibrium hydrogen-air mixture. The program calculates and carries along the underexpansion shock and contact as discrete discontinuity surfaces, for a nonuniform vehicle external flow.
Research on power market technical analysis index system employing high-low matching mechanism
NASA Astrophysics Data System (ADS)
Li, Tao; Wang, Shengyu
2018-06-01
Technical analysis of power market trading is a method that takes the bidding behavior of power market members as its research object, summarizes typical market rules and price trends by applying mathematical and logical methods, and can thereby effectively assist market members in making more reasonable trading decisions. In this paper, the following four indicators are proposed: bidding price difference scale, extreme bidding price rate, dispersion of bidding price, and monthly transaction satisfaction of electricity trading; these form the core of the index system.
Hypercuboidal renormalization in spin foam quantum gravity
NASA Astrophysics Data System (ADS)
Bahr, Benjamin; Steinhaus, Sebastian
2017-06-01
In this article, we apply background-independent renormalization group methods to spin foam quantum gravity. It is aimed at extending and elucidating the analysis of a companion paper, in which the existence of a fixed point in the truncated renormalization group flow for the model was reported. Here, we repeat the analysis with various modifications and find that both qualitative and quantitative features of the fixed point are robust in this setting. We also go into details about the various approximation schemes employed in the analysis.
Downey, Mark O; Rochfort, Simone
2008-08-01
A limitation of large-scale viticultural trials is the time and cost of comprehensive compositional analysis of the fruit by high-performance liquid chromatography (HPLC). In addition, separate methods have generally been required to identify and quantify different classes of metabolites. To address these shortcomings, a reversed-phase HPLC method was developed to simultaneously separate the anthocyanins and flavonols present in grape skins. The method employs a methanol and water gradient acidified with 10% formic acid, with a run time of 48 min including re-equilibration. The identity of the anthocyanins and flavonols in Shiraz (Vitis vinifera L.) skin was confirmed by mass spectral analysis.
Extraction of the number of peroxisomes in yeast cells by automated image analysis.
Niemistö, Antti; Selinummi, Jyrki; Saleem, Ramsey; Shmulevich, Ilya; Aitchison, John; Yli-Harja, Olli
2006-01-01
An automated image analysis method for extracting the number of peroxisomes in yeast cells is presented. Two images of the cell population are required for the method: a bright field microscope image from which the yeast cells are detected and the respective fluorescent image from which the number of peroxisomes in each cell is found. The segmentation of the cells is based on clustering the local mean-variance space. The watershed transformation is thereafter employed to separate cells that are clustered together. The peroxisomes are detected by thresholding the fluorescent image. The method is tested with several images of a budding yeast Saccharomyces cerevisiae population, and the results are compared with manually obtained results.
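As a minimal illustration of the final counting step (thresholding the fluorescent image and counting connected bright regions), here is a self-contained Python sketch. The flood-fill labeling and the toy image are illustrative stand-ins; the paper's full pipeline also includes mean-variance clustering and watershed separation of touching cells, which are omitted here.

```python
from collections import deque

def count_spots(image, threshold):
    """Count 4-connected components of pixels brighter than `threshold`."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] > threshold and not seen[r][c]:
                count += 1                      # new spot (peroxisome candidate)
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                    # flood-fill the whole component
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

# toy "fluorescent image": three bright blobs on a dark background
img = [[0] * 8 for _ in range(8)]
img[1][1] = img[1][2] = 5
img[4][5] = 7
img[6][1] = img[7][1] = 3
n = count_spots(img, threshold=0)
```

In the paper the count is accumulated per segmented cell rather than over the whole image, but the component-labeling idea is the same.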
Nonlinear analysis of a closed-loop tractor-semitrailer vehicle system with time delay
NASA Astrophysics Data System (ADS)
Liu, Zhaoheng; Hu, Kun; Chung, Kwok-wai
2016-08-01
In this paper, a nonlinear analysis is performed on a closed-loop system of articulated heavy vehicles with driver steering control. The nonlinearity arises from the nonlinear cubic tire force model. An integration method is employed to derive an analytical periodic solution of the system in the neighbourhood of the critical speed. The results show that excellent accuracy can be achieved for the calculation of periodic solutions arising from Hopf bifurcation of the vehicle motion. A criterion is obtained for detecting the Bautin bifurcation which separates branches of supercritical and subcritical Hopf bifurcations. The integration method is compared to the incremental harmonic balance method in both supercritical and subcritical scenarios.
NASA Astrophysics Data System (ADS)
Wu, Jianing; Yan, Shaoze; Xie, Liyang
2011-12-01
To address the impact of solar array anomalies, it is important to analyze solar array reliability. This paper establishes fault tree analysis (FTA) and fuzzy reasoning Petri net (FRPN) models of a solar array mechanical system and analyzes their reliability to find the mechanisms of solar array faults. The indices final truth degree (FTD) and cosine matching function (CMF) are employed to resolve the issue of how to evaluate the importance and influence of different faults. An improved reliability analysis method is thus developed by means of the sorting of FTD and CMF. An example is analyzed using the proposed method. The analysis results show that the harsh thermal environment and impacts caused by particles in space are the most important causes of solar array faults. Furthermore, other fault modes and the corresponding improvement methods are discussed. The results reported in this paper could be useful for spacecraft designers, particularly in the process of redesigning the solar array and scheduling its reliability growth plan.
Time-dependent inertia analysis of vehicle mechanisms
NASA Astrophysics Data System (ADS)
Salmon, James Lee
Two methods for performing transient inertia analysis of vehicle hardware systems are developed in this dissertation. The analysis techniques can be used to predict the response of vehicle mechanism systems to the accelerations associated with vehicle impacts. General analytical methods for evaluating translational or rotational system dynamics are generated and evaluated for various system characteristics. The utility of the derived techniques is demonstrated by applying the generalized methods to two vehicle systems. Time-dependent accelerations measured during a vehicle-to-vehicle impact are used as input to perform a dynamic analysis of an automobile liftgate latch and outside door handle. Generalized Lagrange equations for a non-conservative system are used to formulate a second-order nonlinear differential equation defining the response of the components to the transient input. The differential equation is solved by employing the fourth-order Runge-Kutta method. The events are then analyzed using commercially available two-dimensional rigid body dynamic analysis software. The results of the two analytical techniques are compared to experimental data generated by high-speed film analysis of tests of the two components performed on a high-G acceleration sled at Ford Motor Company.
[Improvement of 2-mercaptoimidazoline analysis in rubber products containing chlorine].
Kaneko, Reiko; Haneishi, Nahoko; Kawamura, Yoko
2012-01-01
An improved analysis method for 2-mercaptoimidazoline in rubber products containing chlorine was developed. In the official method, 2-mercaptoimidazoline (20 µg/mL) is detected by TLC with two developing solvents, but this method is not quantitative. Instead, we employed HPLC using water-methanol (9 : 1) as the mobile phase. This procedure decreased interfering peaks, and the quantitation limit was 2 µg/mL of standard solution. 2-Mercaptoimidazoline was confirmed by GC-MS (5 µg/mL) and LC-MS (1 µg/mL) in the scan mode. For preparation of the test solution, a soaking extraction method, in which 20 mL of methanol was added to the sample and allowed to stand overnight at about 40°C, was used. This gave values similar to the Soxhlet extraction method (official method) and was more convenient. The results indicate that our procedure is suitable for the analysis of 2-mercaptoimidazoline. When 2-mercaptoimidazoline is detected, it is confirmed by either GC-MS or LC-MS.
Yoon, Seohyun; Kim, Ja Young; Park, Jooyoung; Kim, Seung-Sup
2017-09-01
Objective Precarious employment is associated with worse mental health, but it is unclear whether changes in employment status are related to suicidal behaviors. This study examined the association between change in employment status and suicidal ideation among workers in South Korea. Methods To maximize the power of the analysis, we combined data from the ongoing Korean Welfare Panel Study. We analyzed 3793 participants who were permanent workers at baseline (2011-2014) and who either: (i) maintained permanent employment; (ii) became a full-time precarious worker; (iii) became a part-time precarious worker; or (iv) became unemployed in the following year (2012-2015). Suicidal ideation was assessed annually by asking participants, "Have you ever seriously thought about dying by suicide in the past year?" Logistic regression was applied to examine associations between change in employment status and suicidal ideation, adjusting for potential confounders such as lifetime suicidal ideation and depressive symptoms at baseline. Results Participants who became part-time precarious workers were more likely to have suicidal ideation [odds ratio (OR) 2.37, 95% confidence interval (95% CI) 1.07-5.25, P=0.033] compared to those who remained permanent workers. In analysis restricted to workers who had never previously thought about dying by suicide, suicidal ideation was more common among those who became either full-time (OR 2.33, 95% CI 1.09-4.99, P=0.029) or part-time (OR 3.94, 95% CI 1.46-10.64, P=0.007) precarious workers. Conclusions Our findings suggest that a change in employment status from permanent to precarious employment may increase suicidal ideation among workers in South Korea.
In-school service predictors of employment for individuals with intellectual disability.
Park, Jiyoon; Bouck, Emily
2018-06-01
Although there are many secondary data analyses of the National Longitudinal Transition Study-2 (NLTS-2) to investigate post-school outcome for students with disabilities, there has been a lack of research with in-school service predictors and post-school outcome for students with specific disability categories. This study was a secondary data analysis of NLTS-2 to investigate the relationship between current employment status and in-school services for individuals with intellectual disability. Statistical methods such as descriptive statistics and logistic regression were used to analyze NLTS-2 data set. The main findings included that in-school services were correlated with current employment status, and that primary disability (i.e., mild intellectual disability and moderate/severe intellectual disability) was associated with current employment status. In-school services are critical in predicting current employment for individuals with intellectual disability. Also, data suggest additional research is needed to investigate various in-school services and variables that could predict employment differences between individuals with mild and moderate/severe intellectual disability. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Lodhi, Ehtisham; Lodhi, Zeeshan; Noman Shafqat, Rana; Chen, Fieda
2017-07-01
Photovoltaic (PV) systems usually employ maximum power point tracking (MPPT) techniques to increase their efficiency. The performance of a PV system can be boosted by operating it at its peak power point, so that maximal power is delivered to the load. The efficiency of a PV system depends upon irradiance, temperature, and array architecture. A PV array exhibits a nonlinear V-I curve, and the maximum power point on the V-P curve varies with changing environmental conditions. MPPT methods guarantee that a PV module is regulated at the reference voltage so as to make full use of the maximum output power. This paper presents an analysis of the two most widely employed MPPT techniques, Perturb and Observe (P&O) and Incremental Conductance (INC). Their performance is evaluated and compared through theoretical analysis and digital simulation, on the basis of response time and efficiency under varying irradiance and temperature conditions, using Matlab/Simulink.
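As a concrete illustration of the P&O logic compared in the paper, here is a minimal Python sketch. The quadratic power curve is a hypothetical stand-in, not a Simulink PV model; the step size and iteration count are illustrative choices.

```python
def perturb_and_observe(power, v=10.0, step=0.5, iters=100):
    """Minimal P&O loop: perturb the operating voltage and keep the
    perturbation direction whenever the measured power increases."""
    p_prev = power(v)
    direction = 1.0
    for _ in range(iters):
        v += direction * step
        p = power(v)
        if p < p_prev:          # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

# hypothetical PV power-voltage curve with its maximum power point at 17 V
pv_curve = lambda v: 60.0 - (v - 17.0) ** 2
v_mpp = perturb_and_observe(pv_curve)
```

Note that P&O settles into an oscillation around the maximum power point within one step size; this steady-state ripple versus tracking speed trade-off is exactly what the paper's comparison with INC evaluates.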
Simple automatic strategy for background drift correction in chromatographic data analysis.
Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin
2016-06-03
Chromatographic background drift correction, which influences peak detection and time shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram were initially detected and organized as a new baseline vector. Iterative optimization was then employed to recognize outliers, which belong to the chromatographic peaks, in this vector, and update the outliers in the baseline until convergence. The optimized baseline vector was finally expanded into the original chromatogram, and linear interpolation was employed to estimate background drift in the chromatogram. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight samples used in the metabolic study of Escherichia coli samples. The proposed method was comparable with three classical techniques: morphological weighted penalized least squares, moving window minimum value strategy and background drift correction by orthogonal subspace projection. The proposed method allows almost automatic implementation of background drift correction, which is convenient for practical use. Copyright © 2016 Elsevier B.V. All rights reserved.
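The three stages described above can be sketched in a few lines of numpy. This is a simplified illustration of the strategy, not the paper's implementation: the window size, outlier tolerance, and the synthetic chromatogram (linear drift, one Gaussian peak, and a small periodic ripple standing in for noise) are all illustrative assumptions.

```python
import numpy as np

def correct_drift(y, window=5, tol=0.5, max_iter=50):
    """(1) take local minima as a candidate baseline, (2) iteratively pull
    down baseline points that still sit on peak flanks, (3) interpolate the
    baseline onto every sample and subtract it."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # step 1: indices of local minima in a sliding window
    idx = np.array([i for i in range(n)
                    if y[i] == y[max(0, i - window):i + window + 1].min()])
    base = y[idx].copy()
    # step 2: a point far above the average of its neighbours belongs to a
    # chromatographic peak -- replace it and repeat until convergence
    for _ in range(max_iter):
        changed = False
        for k in range(1, len(base) - 1):
            neigh = 0.5 * (base[k - 1] + base[k + 1])
            if base[k] > neigh + tol:
                base[k] = neigh
                changed = True
        if not changed:
            break
    # step 3: expand the baseline to the full chromatogram and subtract
    baseline = np.interp(np.arange(n), idx, base)
    return y - baseline

# synthetic chromatogram: linear drift + one Gaussian peak + small ripple
x = np.arange(200, dtype=float)
y = 0.01 * x + 10.0 * np.exp(-((x - 100.0) / 5.0) ** 2) \
    + 0.1 * np.sin(2 * np.pi * x / 8.0)
corrected = correct_drift(y)
```

After correction the drift at both ends is removed while the peak height is essentially preserved, which is the behavior the paper verifies on GC and LC-QTOF data.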
[Marketing research in health service].
Ameri, Cinzia; Fiorini, Fulvio
2015-01-01
Marketing research is the systematic and objective search for, and analysis of, information relevant to the identification and solution of any problem in the field of marketing. The key words in this definition are: systematic, objective and analysis. Marketing research seeks to set about its task in a systematic and objective fashion. This means that a detailed and carefully designed research plan is developed in which each stage of the research is specified. Such a research plan is only considered adequate if it specifies: the research problem in concise and precise terms, the information necessary to address the problem, the methods to be employed in gathering the information and the analytical techniques to be used to interpret it. Maintaining objectivity in marketing research is essential if marketing management is to have sufficient confidence in its results to be prepared to take risky decisions based upon those results. To this end, as far as possible, marketing researchers employ the scientific method. The characteristics of the scientific method are that it translates personal prejudices, notions and opinions into explicit propositions (or hypotheses). These are tested empirically. At the same time alternative explanations of the event or phenomena of interest are given equal consideration.
Yang, Jun-Ho; Yoh, Jack J
2018-01-01
A novel technique is reported for separating overlapping latent fingerprints using chemometric approaches that combine laser-induced breakdown spectroscopy (LIBS) and multivariate analysis. The LIBS technique provides the capability of real time analysis and high frequency scanning as well as the data regarding the chemical composition of overlapping latent fingerprints. These spectra offer valuable information for the classification and reconstruction of overlapping latent fingerprints by implementing appropriate statistical multivariate analysis. The current study employs principal component analysis and partial least square methods for the classification of latent fingerprints from the LIBS spectra. This technique was successfully demonstrated through a classification study of four distinct latent fingerprints using classification methods such as soft independent modeling of class analogy (SIMCA) and partial least squares discriminant analysis (PLS-DA). The novel method yielded an accuracy of more than 85% and was proven to be sufficiently robust. Furthermore, through laser scanning analysis at a spatial interval of 125 µm, the overlapping fingerprints were reconstructed as separate two-dimensional forms.
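The classification idea (project spectra onto principal components, then apply a simple class rule) can be illustrated with a generic numpy sketch. This is a stand-in for the paper's SIMCA/PLS-DA analysis, not its code; the two-class "spectra" are synthetic, with a single strong emission channel per class standing in for real LIBS lines.

```python
import numpy as np

rng = np.random.default_rng(1)
n_chan = 50

# two synthetic "fingerprint chemistries" with distinct spectral signatures
mean_a = np.zeros(n_chan); mean_a[10] = 5.0   # e.g. a stronger line at channel 10
mean_b = np.zeros(n_chan); mean_b[30] = 5.0   # e.g. a stronger line at channel 30
train = np.vstack([mean_a + 0.3 * rng.standard_normal((20, n_chan)),
                   mean_b + 0.3 * rng.standard_normal((20, n_chan))])
labels = np.array([0] * 20 + [1] * 20)

# PCA by SVD on the mean-centred spectra; keep the first 2 components
centre = train.mean(axis=0)
U, s, Vt = np.linalg.svd(train - centre, full_matrices=False)
scores = (train - centre) @ Vt[:2].T

# nearest-centroid rule in PCA-score space (a crude proxy for PLS-DA/SIMCA)
centroids = np.vstack([scores[labels == c].mean(axis=0) for c in (0, 1)])

def classify(spectrum):
    z = (spectrum - centre) @ Vt[:2].T
    return int(np.argmin(((centroids - z) ** 2).sum(axis=1)))

test_a = mean_a + 0.3 * rng.standard_normal(n_chan)
test_b = mean_b + 0.3 * rng.standard_normal(n_chan)
```

In the paper, classification is run per laser shot across a 125 µm scanning grid, so the per-spectrum class labels can be reassembled into separated two-dimensional fingerprint images.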
Sharma, Teenu; Khurana, Rajneet Kaur; Jain, Atul; Katare, O P; Singh, Bhupinder
2018-05-01
The current research work envisages an analytical quality by design-enabled development of a simple, rapid, sensitive, specific, robust and cost-effective stability-indicating reversed-phase high-performance liquid chromatographic method for determining stress-induced forced-degradation products of sorafenib tosylate (SFN). An Ishikawa fishbone diagram was constructed to embark upon the analytical target profile and critical analytical attributes, i.e. peak area, theoretical plates, retention time and peak tailing. Factor screening using Taguchi orthogonal arrays and quality risk assessment studies carried out using failure mode effect analysis aided the selection of critical method parameters, i.e. mobile phase ratio and flow rate, potentially affecting the chosen critical analytical attributes. Systematic optimization of the chosen critical method parameters using response surface methodology was carried out employing a two-factor, three-level, 13-run, face-centered cubic design. A method operable design region providing optimum method performance was earmarked using numerical and graphical optimization. The optimum method employed a mobile phase composition of acetonitrile and water (containing orthophosphoric acid, pH 4.1) at 65:35 v/v at a flow rate of 0.8 mL/min with UV detection at 265 nm using a C18 column. Response surface methodology validation studies confirmed good efficiency and sensitivity of the developed method for the analysis of SFN in mobile phase as well as in human plasma matrix. The forced degradation studies were conducted under the different recommended stress conditions as per ICH Q1A (R2). Mass spectrometry studies showed that SFN degrades under strongly acidic, alkaline and oxidative hydrolytic conditions at elevated temperature, while the drug was per se found to be photostable. Oxidative hydrolysis using 30% H2O2 showed maximum degradation, with products at retention times of 3.35, 3.65, 4.20 and 5.67 min.
The absence of any significant change in the retention time of SFN and degradation products, formed under different stress conditions, ratified selectivity and specificity of the systematically developed method. Copyright © 2017 John Wiley & Sons, Ltd.
Detection of the Vibration Signal from Human Vocal Folds Using a 94-GHz Millimeter-Wave Radar
Chen, Fuming; Li, Sheng; Zhang, Yang; Wang, Jianqi
2017-01-01
The detection of the vibration signal from human vocal folds provides essential information for studying human phonation and diagnosing voice disorders. Doppler radar technology has enabled the noncontact measurement of the human-vocal-fold vibration. However, existing systems must be placed in close proximity to the human throat and detailed information may be lost because of the low operating frequency. In this paper, a long-distance detection method, involving the use of a 94-GHz millimeter-wave radar sensor, is proposed for detecting the vibration signals from human vocal folds. An algorithm that combines empirical mode decomposition (EMD) and the auto-correlation function (ACF) method is proposed for detecting the signal. First, the EMD method is employed to suppress the noise of the radar-detected signal. Further, the ratio of the energy and entropy is used to detect voice activity in the radar-detected signal, following which, a short-time ACF is employed to extract the vibration signal of the human vocal folds from the processed signal. For validating the method and assessing the performance of the radar system, a vibration measurement sensor and microphone system are additionally employed for comparison. The experimental results obtained from the spectrograms, the vibration frequency of the vocal folds, and coherence analysis demonstrate that the proposed method can effectively detect the vibration of human vocal folds from a long detection distance. PMID:28282892
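The short-time autocorrelation (ACF) step can be sketched compactly in numpy: the vibration frequency of one frame is estimated from the lag of the strongest ACF peak. Radar acquisition and EMD denoising are omitted; the 100 Hz test tone, sampling rate, and search band are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def acf_frequency(frame, fs, f_min=50.0, f_max=400.0):
    """Estimate the fundamental frequency of one frame from the lag of the
    strongest autocorrelation peak within a plausible pitch-lag range."""
    frame = frame - frame.mean()
    acf = np.correlate(frame, frame, mode="full")[len(frame) - 1:]  # lags >= 0
    lo, hi = int(fs / f_max), int(fs / f_min)   # lag range for f_min..f_max
    lag = lo + int(np.argmax(acf[lo:hi + 1]))
    return fs / lag

fs = 8000.0
t = np.arange(0, 0.05, 1.0 / fs)               # one 50 ms analysis frame
frame = np.sin(2 * np.pi * 100.0 * t)          # 100 Hz "vocal fold" tone
f0 = acf_frequency(frame, fs)
```

In the full system this estimator is applied frame by frame (after EMD denoising and energy-entropy voice activity detection) to track the vibration frequency over time.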
Kernel-aligned multi-view canonical correlation analysis for image recognition
NASA Astrophysics Data System (ADS)
Su, Shuzhi; Ge, Hongwei; Yuan, Yun-Hao
2016-09-01
Existing kernel-based correlation analysis methods mainly adopt a single kernel in each view. However, a single kernel is usually insufficient to characterize the nonlinear distribution information of a view. To solve this problem, we transform each original feature vector into a two-dimensional feature matrix by means of kernel alignment, and then propose a novel kernel-aligned multi-view canonical correlation analysis (KAMCCA) method on the basis of the feature matrices. Our proposed method can simultaneously employ multiple kernels to better capture the nonlinear distribution information of each view, so that the correlation features learned by KAMCCA have good discriminating power in real-world image recognition. Extensive experiments are designed on five real-world image datasets, including NIR face images, thermal face images, visible face images, handwritten digit images, and object images. Promising experimental results on these datasets have manifested the effectiveness of our proposed method.
Martins, Angélica Rocha; Talhavini, Márcio; Vieira, Maurício Leite; Zacca, Jorge Jardim; Braga, Jez Willian Batista
2017-08-15
The discrimination of whisky brands and identification of counterfeits were performed by UV-Vis spectroscopy combined with partial least squares discriminant analysis (PLS-DA). In the proposed method, all spectra were obtained with no sample preparation. The discrimination models were built using seven whisky brands: Red Label, Black Label, White Horse, Chivas Regal (12 years), Ballantine's Finest, Old Parr and Natu Nobilis. The method was validated with an independent test set of authentic samples belonging to the seven selected brands and another eleven brands not included in the training samples. Furthermore, seventy-three counterfeit samples were also used to validate the method. Results showed correct classification rates over 98.6% and 93.1% for genuine and counterfeit samples, respectively, indicating that the method can be helpful for the forensic analysis of whisky samples. Copyright © 2017 Elsevier Ltd. All rights reserved.
Sensitivity analysis and nonlinearity assessment of steam cracking furnace process
NASA Astrophysics Data System (ADS)
Rosli, M. N.; Sudibyo, Aziz, N.
2017-11-01
In this paper, a sensitivity analysis and nonlinearity assessment of the steam cracking furnace process are presented. For the sensitivity analysis, the fractional factorial design method is employed to analyze the effect of the input parameters, which consist of four manipulated variables and two disturbance variables, on the output variables, and to identify the interactions between the parameters. The result of the factorial design is used to screen out insignificant parameters and thereby reduce the complexity of the model; it shows that four of the six input parameters are significant. After the screening is completed, step tests are performed on the significant input parameters to assess the degree of nonlinearity of the system. The result shows that the system is highly nonlinear with respect to changes in the air-to-fuel ratio (AFR) and feed composition.
Expression of masculine identity in individuals with a traumatic brain injury.
Keegan, Louise C; Togher, Leanne; Murdock, Macy; Hendry, Emma
2017-01-01
This research seeks to examine and describe how four males with a traumatic brain injury (TBI) use language to negotiate their masculine identities. Qualitative research methods were employed with a 'case study' design that allows for a detailed description of the cases, and the interactions examined. The tools of inquiry applied included a topic analysis, as well as linguistic analysis methods that incorporated the theory of Systemic Functional Linguistics. Such tools were employed in the analysis of 12, two-hour group treatment sessions in order to describe how linguistic choices contributed to the construction of a masculine identity in communicative interactions. Although all participants had significant difficulties with cognitive communication, they all demonstrated an ability to use language to assert their masculine identities. Results revealed that prominent topics used to assert masculinity included confidence, women, risk-taking behaviour and interests and that expressions of masculinity often occurred in giving information roles and involved appraisal and modality. The results have implications for the development of rehabilitation interventions for social communication that provide individuals with TBI with the linguistic tools and communication opportunities necessary in order to successfully express identity and reveal masculinity.
Dinkel, Philipp Johannes; Willmes, Klaus; Krinzinger, Helga; Konrad, Kerstin; Koten Jr, Jan Willem
2013-01-01
fMRI studies are mostly based on a group study approach, either analyzing one group or comparing multiple groups, or on approaches that correlate brain activation with clinically relevant criteria or behavioral measures. In this study we investigate the potential of fMRI techniques focusing on individual differences in brain activation within a test-retest reliability context. We employ a single-case analysis approach, which contrasts dyscalculic children with a control group of typically developing children. In a second step, support-vector machine and cluster analysis techniques served to investigate similarities in multivariate brain activation patterns. Children were confronted with a non-symbolic number comparison and a non-symbolic exact calculation task during fMRI acquisition. Conventional second-level group comparison analysis only showed small differences around the angular gyrus bilaterally and the left parieto-occipital sulcus. Analyses based on single-case statistical procedures revealed that developmental dyscalculia is characterized by individual differences predominantly in visual processing areas. Dyscalculic children seemed to compensate for relative under-activation in the primary visual cortex through an upregulation in higher visual areas. However, overlap in deviant activation was low for the dyscalculic children, indicating that developmental dyscalculia is a disorder characterized by heterogeneous brain activation differences. Using support vector machine analysis and cluster analysis, we tried to group dyscalculic and typically developing children according to brain activation. Fronto-parietal systems seem to qualify for a distinction between the two groups. However, this was only effective when reliable brain activations of both tasks were employed simultaneously.
Results suggest that deficits in number representation in the visual-parietal cortex are compensated for through finger-related aspects of number representation in the fronto-parietal cortex. We conclude that dyscalculic children show large individual differences in brain activation patterns. Nonetheless, the majority of dyscalculic children can be differentiated from controls employing brain activation patterns when appropriate methods are used. PMID:24349547
Injection Locking Techniques for Spectrum Analysis
NASA Astrophysics Data System (ADS)
Gathma, Timothy D.; Buckwalter, James F.
2011-04-01
Wideband spectrum analysis supports future communication systems that reconfigure and adapt to the capacity of the spectral environment. While test equipment manufacturers offer wideband spectrum analyzers with excellent sensitivity and resolution, these spectrum analyzers typically cannot offer acceptable size, weight, and power (SWAP). CMOS integrated circuits offer the potential to fully integrate spectrum analysis capability with analog front-end circuitry and digital signal processing on a single chip. Unfortunately, CMOS lacks the high-Q passives and wideband resonator tunability that are necessary for heterodyne implementations of spectrum analyzers. As an alternative to heterodyne receiver architectures, two nonlinear methods for performing wideband, low-power spectrum analysis are presented. The first method involves injecting the spectrum of interest into an array of injection-locked oscillators. The second method employs the closed-loop dynamics of both injection locking and phase locking to independently estimate the injected frequency and power.
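As a back-of-the-envelope companion to the injection-locking idea, Adler's phase equation (a standard textbook model, not the authors' circuit; the numbers below are invented) predicts that an injected oscillator locks only when the frequency offset lies within the locking range:

```python
import numpy as np

# Minimal sketch of Adler's injection-locking model: the phase error theta
# of an oscillator injected with an offset tone obeys
#     d(theta)/dt = dw - wL * sin(theta),
# where dw is the frequency offset and wL the locking range.
# Lock (a settled phase) occurs only when |dw| <= wL.
def adler_phase(dw, wL, t_end=200.0, dt=1e-3):
    theta = 0.0
    for _ in range(int(t_end / dt)):      # forward-Euler integration
        theta += dt * (dw - wL * np.sin(theta))
    return theta

wL = 1.0
locked = adler_phase(dw=0.5, wL=wL)       # settles to arcsin(0.5)
unlocked = adler_phase(dw=2.0, wL=wL)     # phase keeps slipping, no lock
```

Inside the locking range the phase converges to a fixed offset (which encodes the injected frequency); outside it, the phase slips continuously, which is the boundary a locking-based analyzer senses.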
White, Michael J.; Judd, Maya D.; Poliandri, Simone
2012-01-01
Although there has been much optimistic discussion of integrating quantitative and qualitative findings into sociological analysis, there remains a gap regarding the application of mixed approaches. We examine the potential gains and pitfalls of such integration in the context of the growing analytic power of contemporary qualitative data analysis software (QDAS) programs. We illustrate the issues with our own research in a mixed-methods project examining low fertility in Italy, a project that combines analysis of large nationally representative survey data with qualitative in-depth interviews with women across four cities in Italy. Despite the enthusiasm for mixed-methods research, the available software appears to be underutilized. In addition, we suggest that the sociological research community will want to address several conceptual and inferential issues with these approaches. PMID:23543938
Time for children: trends in the employment patterns of parents, 1967-2009.
Fox, Liana; Han, Wen-Jui; Ruhm, Christopher; Waldfogel, Jane
2013-02-01
Using data from the 1967-2009 years of the March Current Population Surveys (CPS), we examine two important resources for children's well-being: time and money. We document trends in parental employment, from the perspective of children, and show what underlies these trends. We find that increases in family work hours mainly reflect movements into jobs by parents, particularly mothers, who in prior decades would have remained at home. This increase in market work has raised incomes for children in the typical two-parent family but not for those in lone-parent households. Time use data from 1975 and 2003-2008 reveal that working parents spend less time engaged in primary childcare than their counterparts without jobs, but more than employed peers in previous cohorts. Analysis of 2004 work schedule data suggests that non-daytime work provides an alternative method of coordinating employment schedules for some dual-earner families.
A scoping review of spatial cluster analysis techniques for point-event data.
Fritz, Charles E; Schuurman, Nadine; Robertson, Colin; Lear, Scott
2013-05-01
Spatial cluster analysis is a uniquely interdisciplinary endeavour, so it is important to communicate and disseminate ideas, innovations, best practices and challenges across practitioners, applied epidemiology researchers and spatial statisticians. In this research we conducted a scoping review, systematically searching peer-reviewed journal databases for research that has employed spatial cluster analysis methods on individual-level, address location, or x and y coordinate derived data. To illustrate the thematic issues raised by our results, methods were tested using a dataset in which known clusters existed. Point pattern methods, spatial clustering and cluster detection tests, and a locally weighted spatial regression model were most commonly used for individual-level, address location data (n = 29). The spatial scan statistic was the most popular method for address location data (n = 19). Six themes were identified relating to the application of spatial cluster analysis methods and subsequent analyses, which we recommend researchers consider: exploratory analysis, visualization, spatial resolution, aetiology, scale and spatial weights. We intend that researchers seeking direction in the use of spatial cluster analysis methods consider the caveats and strengths of each approach, and also explore the numerous other methods available for this type of analysis. Applied spatial epidemiology researchers and practitioners should give special consideration to applying multiple tests to a dataset. Future research should focus on developing frameworks for selecting appropriate methods and the corresponding spatial weighting schemes.
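To illustrate the scan-statistic family the review finds most popular, the following is a toy Bernoulli spatial scan in the spirit of Kulldorff's statistic (a didactic sketch on simulated points; not the SaTScan implementation or any method evaluated in the review):

```python
import numpy as np

def _binom_ll(k, n, p):
    # Binomial log-likelihood with a guard against log(0).
    p = min(max(p, 1e-12), 1.0 - 1e-12)
    return k * np.log(p) + (n - k) * np.log(1.0 - p)

def scan_llr(xy, case, max_frac=0.5):
    """Best circle (centre index, radius) by Bernoulli likelihood ratio."""
    n, n_case = len(xy), int(case.sum())
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    max_r = np.quantile(d, max_frac)       # cap circle size
    best_llr, best_circle = 0.0, None
    for i in range(n):
        for r in np.unique(d[i]):
            if r == 0.0 or r > max_r:
                continue
            inside = d[i] <= r
            m, c = int(inside.sum()), int(case[inside].sum())
            if m == n:
                continue
            p_in, p_out, p0 = c / m, (n_case - c) / (n - m), n_case / n
            if p_in <= p0:                 # only excess-risk circles matter
                continue
            llr = (_binom_ll(c, m, p_in) + _binom_ll(n_case - c, n - m, p_out)
                   - _binom_ll(n_case, n, p0))
            if llr > best_llr:
                best_llr, best_circle = llr, (i, r)
    return best_llr, best_circle

# Simulated point-event data with a planted cluster near (2, 2).
rng = np.random.default_rng(1)
xy = rng.uniform(0.0, 10.0, size=(200, 2))
in_disc = np.linalg.norm(xy - [2.0, 2.0], axis=1) < 1.5
case = np.where(in_disc, rng.random(200) < 0.8, rng.random(200) < 0.1).astype(int)
llr, (centre_idx, radius) = scan_llr(xy, case)
```

In practice the maximized likelihood ratio is then compared against a Monte Carlo null distribution to obtain a p-value; that step is omitted here.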
Akhtar, Juber; Fareed, Sheeba; Aqil, Mohd
2013-07-01
A sensitive, selective, precise and stability-indicating high-performance thin-layer chromatographic (HPTLC) method for the analysis of repaglinide, both as a bulk drug and in a nanoemulsion formulation, was developed and validated. The method employed TLC aluminum plates precoated with silica gel 60 F-254 as the stationary phase. The solvent system consisted of chloroform/methanol/ammonia/glacial acetic acid (7.5:1.5:0.9:0.1, v/v/v/v). This system was found to give compact spots for repaglinide (Rf value of 0.38 ± 0.02). Repaglinide was subjected to acid and alkali hydrolysis, oxidation, photodegradation and dry heat treatment, and the degraded products were well separated from the pure drug. Densitometric analysis of repaglinide was carried out in the absorbance mode at 240 nm. The linear regression data for the calibration plots showed a good linear relationship, with r² = 0.998 ± 0.032, in the concentration range of 50-800 ng. The method was validated for precision, accuracy as recovery, robustness and specificity. The limits of detection and quantitation were 0.023 and 0.069 ng per spot, respectively. The drug undergoes degradation under acidic and basic conditions, oxidation and dry heat treatment. All the peaks of the degraded products were resolved from the standard drug with significantly different Rf values. Statistical analysis proves that the method is reproducible and selective for the estimation of the drug. As the method could effectively separate the drug from its degradation products, it can be employed as a stability-indicating one. Moreover, the proposed HPTLC method was utilized to investigate the degradation kinetics in 1 M NaOH.
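The calibration and limit arithmetic reported in such validations follows a standard pattern; here is a minimal sketch with invented peak-area data (the ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S are assumed as the convention, not taken from this record):

```python
import numpy as np

# Hedged sketch of a densitometric calibration: straight-line fit of peak
# area against amount per spot, with detection and quantitation limits from
# the ICH formulas LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where sigma is
# the residual standard deviation and S the slope. All numbers are made up.
amount = np.array([50, 100, 200, 300, 400, 600, 800], float)   # ng per spot
area = 12.0 + 0.95 * amount + np.array([3, -4, 5, -2, 1, -3, 2], float)

slope, intercept = np.polyfit(amount, area, 1)
resid = area - (slope * amount + intercept)
sigma = resid.std(ddof=2)                     # residual SD of the fit
r2 = 1 - (resid ** 2).sum() / ((area - area.mean()) ** 2).sum()
lod, loq = 3.3 * sigma / slope, 10 * sigma / slope
```

The r² and LOD/LOQ figures quoted in abstracts like the one above come directly from these three quantities of the fitted line.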
Multiscale Detrended Cross-Correlation Analysis of STOCK Markets
NASA Astrophysics Data System (ADS)
Yin, Yi; Shang, Pengjian
2014-06-01
In this paper, we employ detrended cross-correlation analysis (DCCA) to investigate the cross-correlations between different stock markets. We report results on the cross-correlated behaviors of the US, Chinese and European stock markets over the period 1997-2012. The DCCA reveals the short- and long-term cross-correlated behaviors of intra-regional and inter-regional stock markets, displaying, simply and roughly, their similarities and differences and the persistence of the cross-correlated fluctuations. Then, because of the limitations and inapplicability of the DCCA method, we propose the multiscale detrended cross-correlation analysis (MSDCCA) method, which avoids the a priori selection of the ranges of scales over which the two coefficients of the classical DCCA method are identified. We employ MSDCCA to reanalyze these cross-correlations and to exhibit details, such as the existence and position of minima, maxima and bimodal distributions, that are lost when the scale structure is described by only two coefficients, as well as essential differences and similarities in the scale structures of the cross-correlations of intra-regional and inter-regional markets. The richer statistical characteristics of the cross-correlations obtained by the MSDCCA method help us understand how two stock markets influence each other and allow a detailed analysis of the influence of the two inter-regional markets on their cross-correlation; we thus obtain a richer and more detailed picture of the complex evolution of the dynamics of the cross-correlations between stock markets. The application of MSDCCA promotes our understanding of the internal mechanisms and structures of financial markets and may help in forecasting stock indices on the basis of the cross-correlations demonstrated here.
We also discuss the MSDCCA method with secant rolling windows of different sizes and, lastly, provide some relevant implications and issues.
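The DCCA cross-correlation coefficient at a single scale can be sketched in a few lines (a generic implementation of the standard recipe on simulated series, not the authors' code):

```python
import numpy as np

# Bare-bones DCCA cross-correlation coefficient (rho_DCCA) at scale s:
# integrate both series, detrend linearly in non-overlapping boxes of size
# s, and compare the detrended covariance with the two detrended variances.
def rho_dcca(x, y, s):
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    f2x = f2y = f2xy = 0.0
    t = np.arange(s, dtype=float)
    for b in range(len(x) // s):
        xs, ys = X[b * s:(b + 1) * s], Y[b * s:(b + 1) * s]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)   # local linear detrend
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2x += (rx ** 2).mean()
        f2y += (ry ** 2).mean()
        f2xy += (rx * ry).mean()
    return f2xy / np.sqrt(f2x * f2y)

# Two synthetic "markets" sharing a common driving component.
rng = np.random.default_rng(2)
common = rng.standard_normal(4000)
x = common + 0.3 * rng.standard_normal(4000)
y = common + 0.3 * rng.standard_normal(4000)
r = rho_dcca(x, y, s=50)        # strongly cross-correlated at this scale
```

Sweeping s and examining how ρ_DCCA varies with scale is the multiscale view that MSDCCA formalizes.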
NASA Astrophysics Data System (ADS)
Bruinen, Anne L.; Fisher, Gregory L.; Balez, Rachelle; van der Sar, Astrid M.; Ooi, Lezanne; Heeren, Ron M. A.
2018-06-01
A unique method for the identification of biomolecular components in different biological specimens, while preserving the capability for high-speed 2D and 3D molecular imaging, is employed to investigate the cellular response to oxidative stress. The method enables observation of the distribution of the antioxidant α-tocopherol and other molecules in cellular structures via time-of-flight secondary ion mass spectrometry (TOF-SIMS, MS1) imaging in parallel with simultaneously collected tandem mass spectrometry (MS2) imaging. The described method is employed to examine a network formed by neuronal cells differentiated from human induced pluripotent stem cells (iPSCs), a model for investigating human neurons in vitro. The antioxidant α-tocopherol is identified in situ within different cellular layers utilizing a 3D TOF-SIMS tandem MS imaging analysis. As oxidative stress also plays an important role in mediating inflammation, the study was expanded to whole-body tissue sections of M. marinum-infected zebrafish, a model organism for tuberculosis. The TOF-SIMS tandem MS imaging results reveal an increased presence of α-tocopherol in response to the pathogen.
The Role of Principals in Professional Learning Communities
ERIC Educational Resources Information Center
Buttram, Joan L.; Farley-Ripple, Elizabeth N.
2016-01-01
The purpose of this article is to identify how principals shape the adoption and implementation of professional learning communities. The study employed a sequential mixed-methods approach in which interviews, observations, and document analysis informed survey design. Teachers were surveyed in four elementary schools about the practices and…
Individual Factors Predicting Mental Health Court Diversion Outcome
ERIC Educational Resources Information Center
Verhaaff, Ashley; Scott, Hannah
2015-01-01
Objective: This study examined which individual factors predict mental health court diversion outcome among a sample of persons with mental illness participating in a postcharge diversion program. Method: The study employed secondary analysis of existing program records for 419 persons with mental illness in a court diversion program. Results:…
A SPATIAL ANALYSIS OF FINE-ROOT BIOMASS FROM STAND DATA IN OREGON AND WASHINGTON
Because of the high spatial variability of fine roots in natural forest stands, accurate estimates of stand-level fine root biomass are difficult and expensive to obtain by standard coring methods. This study compares two different approaches that employ aboveground tree metrics...
Characteristics of Home: Perspectives of Women Who Are Homeless
ERIC Educational Resources Information Center
Walsh, Christine A.; Rutherford, Gayle E.; Kuzmak, Natasha
2009-01-01
We employed participatory, community-based research methods to explore the perceptions of home among women who are homeless. Twenty women engaged in one or more techniques including qualitative interviews, digital story telling, creative writing, photovoice, and design charrette to characterize their perceptions of home. Analysis of the data…
Implementing Service Excellence in Higher Education
ERIC Educational Resources Information Center
Khan, Hina; Matlay, Harry
2009-01-01
Purpose: The purpose of this paper is to provide a critical analysis of the importance of service excellence in higher education. Design/methodology/approach: The research upon which this paper is based employed a phenomenological approach. This method was selected for its focus on respondent perceptions and experiences. Both structured and…
An Analysis of Costs in Institutions of Higher Education in England
ERIC Educational Resources Information Center
Johnes, Geraint; Johnes, Jill; Thanassoulis, Emmanuel
2008-01-01
Cost functions are estimated, using random effects and stochastic frontier methods, for English higher education institutions. The article advances on existing literature by employing finer disaggregation by subject, institution type and location, and by introducing consideration of quality effects. Estimates are provided of average incremental…
Learning from MOOCs: A Qualitative Case Study from the Learners' Perspectives
ERIC Educational Resources Information Center
Park, Yeonjeong; Jung, Insung; Reeves, Thomas C.
2015-01-01
This study describes the massive open online course (MOOC) experiences of three educational technology scholars assuming the roles of learners. Adapting Carroll's model of school learning as a theoretical framework, the study employed an autoethnography method to collect empirical data in three different MOOCs. Data analysis from regularly…
Enhancing Digital Literacy and Learning among Adults with Blogs
ERIC Educational Resources Information Center
Sharp, Laurie A.
2017-01-01
Digital literacy and learning among adults has been identified as an area requiring research. The purpose of the present study was to explore technology acceptance and digital collaborative learning experiences with blogs among adult learners. This analysis employed a quasi-experimental mixed-methods approach guided by a sociocultural theoretical…
What Influences Agents to Pursue a Career in Extension?
ERIC Educational Resources Information Center
Arnold, Shannon; Place, Nick
2010-01-01
The qualitative study reported here explored why agricultural agents pursue an Extension career. A purposive sample was used to select twelve Florida agricultural agents. Interviews investigated positive and negative influences that affected agents' employment decisions. Grounded theory was used as the primary data analysis method (Strauss &…
NASA Technical Reports Server (NTRS)
Ohsaka, K.; Chung, S. K.; Rhim, W. K.
1997-01-01
The specific volumes and viscosities of Ni-Zr liquid alloys as a function of temperature are determined by employing a digitizing technique and numerical analysis methods applied to optical images of the electrostatically levitated liquid alloys.
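One way such a digitizing analysis can work (a hedged sketch on a synthetic image; the record does not give the authors' algorithm) is to treat the near-axisymmetric levitated drop as a solid of revolution and integrate disc volumes over the rows of its binary silhouette:

```python
import numpy as np

# Volume from a side-view silhouette, assuming axisymmetry: each pixel row
# is a disc of radius r(z), so V ~ sum of pi * r(z)^2 * dz over rows.
def drop_volume(mask, pixel_size):
    widths = mask.sum(axis=1).astype(float)   # chord length per row, pixels
    radii = 0.5 * widths * pixel_size
    return np.pi * (radii ** 2).sum() * pixel_size

# Synthetic test image: a sphere of radius 40 px, 1 unit per pixel.
yy, xx = np.mgrid[:100, :100]
mask = (yy - 50) ** 2 + (xx - 50) ** 2 <= 40 ** 2
vol = drop_volume(mask, pixel_size=1.0)       # close to (4/3)*pi*40^3
```

With a calibrated pixel size, tracking this volume across temperature yields the specific-volume curve; the discretization error here is at the percent level.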
Psychosocial and Cognitive Functioning of Children with Specific Profiles of Maltreatment
ERIC Educational Resources Information Center
Pears, Katherine C.; Kim, Hyoun K.; Fisher, Philip A.
2008-01-01
Objective: Up to 90% of child welfare system cases involve multiple types of maltreatment; however, studies have rarely incorporated multiple dimensions of maltreatment. The present study employed a latent profile analysis to identify naturally occurring subgroups of children who had experienced maltreatment. Methods: Reports of maltreatment…
Internal Labor Markets: An Empirical Investigation.
ERIC Educational Resources Information Center
Mahoney, Thomas A.; Milkovich, George T.
Methods of internal labor market analysis for three organizational areas are presented, along with some evidence about the validity and utility of conceptual descriptions of such markets. The general concept of an internal labor market refers to the process of pricing and allocation of manpower resources within an employing organization and rests…
Apprentice and Ongoing Training Needs in the Electrical and Associated Industries.
ERIC Educational Resources Information Center
Doughney, James; Howes, Jenny; Worland, David; Wragg, Cheryl
A study investigated skill shortages in the electrical and associated industries in Victoria and their nature and contributing factors. Research methods were a literature review, data analysis, and qualitative and quantitative research into apprentices, employers, and practitioners. Findings indicated a decline in the number of apprentices in…
Portrayals of People with Cerebral Palsy in Homicide News
ERIC Educational Resources Information Center
Lucardie, Richard; Sobsey, Dick
2005-01-01
Through content analysis, employing qualitative and quantitative methods, Canadian media representation of people with cerebral palsy (PWCP) in public life was examined. Canadian NewsDisc, an online bibliographic database service, was used to examine the use of stigmatizing language such as afflicted by, afflicted with, suffered from, suffers from,…
Validating a Lifestyle Physical Activity Measure for People with Serious Mental Illness
ERIC Educational Resources Information Center
Bezyak, Jill L.; Chan, Fong; Chiu, Chung-Yi; Kaya, Cahit; Huck, Garrett
2014-01-01
Purpose: To evaluate the measurement structure of the "Physical Activity Scale for Individuals With Physical Disabilities" (PASIPD) as an assessment tool of lifestyle physical activities for people with severe mental illness. Method: A quantitative descriptive research design using factor analysis was employed. A sample of 72 individuals…
A method for assessment of watershed health is developed by employing measures of reliability, resilience and vulnerability (R-R-V) using stream water quality data. Observed water quality data are usually sparse, so that a water quality time series is often reconstructed using s...
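The R-R-V bookkeeping named above is simple to state; here is a minimal sketch on an invented concentration series (the threshold and data are illustrative, and the paper's reconstruction step for sparse records is not reproduced):

```python
import numpy as np

# Reliability-resilience-vulnerability (R-R-V) measures for one
# water-quality series against a standard. "Failure" = above the standard.
def rrv(series, threshold):
    fail = series > threshold
    reliability = 1.0 - fail.mean()               # fraction of time compliant
    # Resilience: P(next step is compliant | this step failed).
    recoveries = (fail[:-1] & ~fail[1:]).sum()
    resilience = recoveries / fail[:-1].sum() if fail[:-1].any() else 1.0
    # Vulnerability: mean exceedance magnitude during failure.
    vulnerability = (series[fail] - threshold).mean() if fail.any() else 0.0
    return reliability, resilience, vulnerability

conc = np.array([3.0, 4.0, 6.5, 7.0, 4.5, 3.5, 8.0, 5.5, 4.0, 3.0])
rel, res, vul = rrv(conc, threshold=5.0)   # 0.6, 0.5, 1.75 for this series
```

The three numbers summarize, respectively, how often, how persistently, and how severely the standard is violated, which is what makes them usable as a composite watershed-health index.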
Student Perceptions of Classroom Achievement Goal Structure: Is It Appropriate to Aggregate?
ERIC Educational Resources Information Center
Lam, Arena C.; Ruzek, Erik A.; Schenke, Katerina; Conley, AnneMarie M.; Karabenick, Stuart A.
2015-01-01
Student reports are a common approach to characterizing how students experience their classrooms. We used a recently developed method--multilevel confirmatory factor analysis--to determine whether commonly employed measures of achievement goal structure constructs (mastery and performance) typically verified at the student level can be verified at…
ERIC Educational Resources Information Center
Loehlin, James H.; Norton, Alexandra P.
1988-01-01
Describes a crystallography experiment using both diffraction-angle and diffraction-intensity information to determine the lattice constant and a lattice independent molecular parameter, while still employing standard X-ray powder diffraction techniques. Details the method, experimental details, and analysis for this activity. (CW)
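The lattice-constant arithmetic behind such a powder experiment reduces to Bragg's law; a worked sketch with invented reflection angles (they roughly correspond to an fcc metal and are not taken from the article):

```python
import numpy as np

# Bragg's law per reflection: d = lambda / (2 sin(theta)); for a cubic
# lattice, a = d * sqrt(h^2 + k^2 + l^2). Angles below are illustrative.
lam = 1.5406   # Cu K-alpha wavelength, Angstrom
reflections = [((1, 1, 1), 38.43), ((2, 0, 0), 44.67), ((2, 2, 0), 65.02)]

a_estimates = []
for (h, k, l), two_theta in reflections:
    d = lam / (2.0 * np.sin(np.radians(two_theta / 2.0)))
    a_estimates.append(d * np.sqrt(h * h + k * k + l * l))
a = float(np.mean(a_estimates))   # lattice constant, Angstrom
```

The consistency of the per-reflection estimates is itself a check on the indexing; diffraction intensities then carry the lattice-independent molecular information the experiment also extracts.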
James, Abi; Draffan, E A; Wald, Mike
2017-01-01
This paper presents a gap analysis between crowdsourced functional accessibility evaluations of ebooks conducted by non-experts and the technical accessibility standards employed by developers. It also illustrates how combining these approaches can provide more appropriate information for a wider group of users with print impairments.
Eliciting Taiwanese High School Students' Scientific Ontological and Epistemic Beliefs
ERIC Educational Resources Information Center
Lin, Tzung-Jin; Tsai, Chin-Chung
2017-01-01
This study employed the interview method to clarify the underlying dimensions of and relationships between students' scientific ontological and epistemic beliefs. Forty Taiwanese high school students were invited to participate in this study. Through content analysis of the participants' interview responses, two ontological dimensions including…
Communicative Competence of the Fourth Year Students: Basis for Proposed English Language Program
ERIC Educational Resources Information Center
Tuan, Vu Van
2017-01-01
This study on the level of communicative competence, covering the linguistic/grammatical and discourse levels, has aimed at constructing a proposed English language program for 5 key universities in Vietnam. The descriptive method was employed with comparative techniques and correlational analysis. The researcher treated the surveyed data…
Evidence-Based Clearinghouses in Social Work
ERIC Educational Resources Information Center
Soydan, Haluk; Mullen, Edward J.; Alexandra, Laine; Rehnman, Jenny; Li, You-Ping
2010-01-01
Objectives: The purpose of this article is to describe several evidence-based clearinghouses focused on social work and related intervention outcomes, placing them in the context of how such clearinghouses can contribute to research dissemination to foster effective, evidence-based practice. Method: The study employed an analysis of data provided…
Group Therapy for Children after Homicide and Violence: A Pilot Study
ERIC Educational Resources Information Center
Salloum, Alison
2008-01-01
Objective: This pilot study evaluated a group intervention designed to reduce posttraumatic stress among children after homicide and/or violence. Method: Employing a secondary data analysis of 117 participants in 21 group interventions, pretest and posttest differences in posttraumatic stress levels and between child witnesses and nonwitnesses,…
Teacher Evaluation Comments Within Program Evaluation: An Analysis of Negotiation Structures.
ERIC Educational Resources Information Center
Rothe, J. Peter
How teachers methodically construct report card comments into organized social features is addressed through an examination of negotiation structures which organize anecdotes. Anecdotal comments are key communication devices which teachers employ for reporting. Four hundred and fifty anecdotal comments teachers wrote about students were collected…
Novel Electroactive Polymers as Environmentally Compliant Coatings for Corrosion Control
2006-02-03
Gravimetric Analysis (TGA) and Differential Scanning Calorimetry (DSC), respectively. In this work the polymers were characterized by cyclic voltammetry …or less. The Temperature Step / Frequency Sweep method was employed where data were collected from –40 to 100°C and 0.1-100 Hz at a resolution of
Images of Nature in Greek Primary School Textbooks
ERIC Educational Resources Information Center
Korfiatis, Kostas J.; Stamou, Anastasia G.; Paraskevopoulos, Stephanos
2004-01-01
In this article, the environmental content of the textbooks used for the teaching of natural sciences in Greek primary schools was examined. Specifically, by employing the method of content analysis, both representational (metaphors, depictions, values, etc.) and cognitive (ecological concepts) elements, building images of nature, and shaping our…
Unemployment Benefit Exhaustion: Incentive Effects on Job-Finding Rates
ERIC Educational Resources Information Center
Filges, Trine; Geerdsen, Lars Pico; Knudsen, Anne-Sofie Due; Jørgensen, Anne-Marie Klint
2015-01-01
Purpose: This systematic review studied the impact of exhaustion of unemployment benefits on the exit rate out of unemployment and into employment prior to benefit exhaustion or shortly thereafter. Method: We followed Campbell Collaboration guidelines to prepare this review, and ultimately located 12 studies for final analysis and interpretation.…
Kuroki, Naomi; Miyashita, Nana; Hino, Yoshiyuki; Kayashima, Kotaro; Fujino, Yoshihisa; Takada, Mikio; Nagata, Tomohisa; Yamataki, Hajime; Sakuragi, Sonoko; Kan, Hirohiko; Morita, Tetsuya; Ito, Akiyoshi; Mori, Koji
2009-09-01
The purpose of this study was to identify what motivates employers to promote good occupational health and safety practices in small-scale enterprises. Previous studies have shown that small-scale enterprises generally pay insufficient attention to issues of occupational health and safety. These findings were mainly derived from questionnaire based surveys. Nevertheless, some small-scale enterprises in which employers exercise good leadership do take a progressive approach to occupational health and safety. Although good practices can be identified in small-scale enterprises, it remains unclear what motivates employers in small-scale enterprises to actively implement occupational health and safety practices. We speculated that identifying employer motivations in promoting occupational health would help to spread good practices among small-scale enterprises. Using a qualitative approach based on the KJ method, we interviewed ten employers who actively promote occupational health and safety in the workplace. The employers were asked to discuss their views of occupational health and safety in their own words. A semi-structured interview format was used, and transcripts were made of the interviews. Each transcript was independently coded by two or more researchers. These transcripts and codes were integrated and then the research group members discussed the heading titles and structural relationships between them according to the KJ method. Qualitative analysis revealed that all the employers expressed a strong interest in a "good company" and "good management". They emphasized four elements of "good management", namely "securing human resources", "trust of business partners", "social responsibility" and "employer's health condition itself", and considered that addressing occupational health and safety was essential to the achievement of these four elements.
Consistent with previous findings, the results showed that implementation of occupational health and safety activities depended on "cost", "human resources", "time to perform", and "advisory organization". These results suggest that employer awareness of the relationship between good management and occupational health is essential to the implementation of occupational health and safety practices in small-scale enterprises.
Analysis of simple 2-D and 3-D metal structures subjected to fragment impact
NASA Technical Reports Server (NTRS)
Witmer, E. A.; Stagliano, T. R.; Spilker, R. L.; Rodal, J. J. A.
1977-01-01
Theoretical methods were developed for predicting the large-deflection elastic-plastic transient structural responses of metal containment or deflector (C/D) structures designed to cope with rotor-burst fragment impact attack. For two-dimensional C/D structures, both finite element and finite difference analysis methods were employed to analyze the structural response produced by either prescribed transient loads or fragment impact. For the latter category, two time-wise step-by-step analysis procedures were devised to predict the structural responses resulting from a succession of fragment impacts: the collision force method (CFM), which utilizes an approximate prediction of the force applied to the attacked structure during fragment impact, and the collision imparted velocity method (CIVM), in which the impact-induced velocity increment acquired by a region of the impacted structure near the impact point is computed. The merits and limitations of these approaches are discussed. For the analysis of 3-D responses of C/D structures, only the CIVM approach was investigated.
Singular boundary method for wave propagation analysis in periodic structures
NASA Astrophysics Data System (ADS)
Fu, Zhuojia; Chen, Wen; Wen, Pihua; Zhang, Chuanzeng
2018-07-01
A strong-form boundary collocation method, the singular boundary method (SBM), is developed in this paper for wave propagation analysis at low and moderate wavenumbers in periodic structures. The SBM has several advantages: it is mathematically simple, easy to program, and meshless, and it applies the concept of origin intensity factors to eliminate the singularity of the fundamental solutions and to avoid the numerical evaluation of the singular integrals encountered in the boundary element method. Due to the periodic behaviors of the structures, the SBM coefficient matrix can be represented as a block Toeplitz matrix. By employing three different fast Toeplitz-matrix solvers, the computational time and storage requirements are significantly reduced in the proposed SBM analysis. To demonstrate the effectiveness of the proposed SBM formulation for wave propagation analysis in periodic structures, several benchmark examples are presented and discussed. The proposed SBM results are compared with the analytical solutions, the reference results and the COMSOL software.
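The storage saving from Toeplitz structure can be sketched in a few lines: an n-by-n Toeplitz matrix is fully determined by its first column and first row (2n − 1 numbers), so a matrix-vector product never needs the dense matrix. This is a generic illustration, not the paper's solvers; the function names are ours, and the naive product below is O(n²), whereas the fast solvers the paper employs reach O(n log n) via FFT-based circulant embedding.

```python
# Sketch: exploiting Toeplitz structure without forming the dense matrix.
# T[i][j] = c[i-j] for i >= j, else r[j-i], where c is the first column
# and r the first row. Storage drops from O(n^2) to O(n).

def toeplitz_matvec(c, r, x):
    """Multiply the Toeplitz matrix defined by (c, r) by the vector x."""
    n = len(x)
    return [sum((c[i - j] if i >= j else r[j - i]) * x[j] for j in range(n))
            for i in range(n)]

def dense_toeplitz(c, r):
    """Dense reference matrix, for checking the structured product."""
    n = len(c)
    return [[c[i - j] if i >= j else r[j - i] for j in range(n)]
            for i in range(n)]
```

For a block Toeplitz matrix, as in the SBM coefficient matrix, the same indexing applies with scalar entries replaced by sub-blocks.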
A subagging regression method for estimating the qualitative and quantitative state of groundwater
NASA Astrophysics Data System (ADS)
Jeong, J.; Park, E.; Choi, J.; Han, W. S.; Yun, S. T.
2016-12-01
A subagging regression (SBR) method for the analysis of groundwater data pertaining to the estimation of trend and the associated uncertainty is proposed. The SBR method is validated against synthetic data competitively with other conventional robust and non-robust methods. From the results, it is verified that the estimation accuracies of the SBR method are consistent and superior to those of the other methods and the uncertainties are reasonably estimated where the others have no uncertainty analysis option. To validate further, real quantitative and qualitative data are employed and analyzed comparatively with Gaussian process regression (GPR). For all cases, the trend and the associated uncertainties are reasonably estimated by SBR, whereas the GPR has limitations in representing the variability of non-Gaussian skewed data. From the implementations, it is determined that the SBR method has potential to be further developed as an effective tool of anomaly detection or outlier identification in groundwater state data.
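The core idea of subagging (subsample aggregation) can be sketched briefly: fit the base estimator on many random subsamples, then report the mean estimate as the trend and the spread across subsamples as the uncertainty. This is a minimal generic sketch with a simple least-squares slope as the base estimator, not the authors' exact SBR formulation; all names are illustrative.

```python
# Sketch of subagging for trend estimation with uncertainty.
import random

def fit_slope(points):
    """Closed-form least-squares slope for a list of (x, y) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    return (n * sxy - sx * sy) / (n * sxx - sx * sx)

def subagging_slope(points, m, B, seed=0):
    """Aggregate slope estimates over B subsamples of size m (no replacement)."""
    rng = random.Random(seed)
    slopes = [fit_slope(rng.sample(points, m)) for _ in range(B)]
    mean = sum(slopes) / B
    var = sum((s - mean) ** 2 for s in slopes) / B
    return mean, var ** 0.5  # (trend estimate, uncertainty)
```

Because the aggregate is a mean over many subsample fits, a single outlying observation only enters a fraction of the subsamples, which is the source of the robustness the abstract reports.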
Employment Situation of Parents of Long-Term Childhood Cancer Survivors
Mader, Luzius; Rueegg, Corina S.; Vetsch, Janine; Rischewski, Johannes; Ansari, Marc; Kuehni, Claudia E.; Michel, Gisela
2016-01-01
Background Taking care of children diagnosed with cancer affects parents’ professional life. The impact in the long-term, however, is not clear. We aimed to compare the employment situation of parents of long-term childhood cancer survivors with control parents of the general population, and to identify clinical and socio-demographic factors associated with parental employment. Methods As part of the Swiss Childhood Cancer Survivor Study, we sent a questionnaire to parents of survivors aged 5–15 years, who survived ≥5 years after diagnosis. Information on control parents of the general population came from the Swiss Health Survey (restricted to men and women with ≥1 child aged 5–15 years). Employment was categorized as not employed, part-time, and full-time employed. We used generalized ordered logistic regression to determine associations with clinical and socio-demographic factors. Clinical data were available from the Swiss Childhood Cancer Registry. Results We included 394 parent-couples of survivors and 3’341 control parents (1’731 mothers; 1’610 fathers). Mothers of survivors were more often not employed (29% versus 22%; p-trend = 0.007). However, no differences between mothers were found in multivariable analysis. Fathers of survivors were more often employed full-time (93% versus 87%; p-trend = 0.002), which remained significant in multivariable analysis. Among parents of survivors, mothers with tertiary education (OR = 2.40, CI: 1.14–5.07) were more likely to be employed. Having a migration background (OR = 3.63, CI: 1.71–7.71) increased the likelihood of being full-time employed in mothers of survivors. Less likely to be employed were mothers of survivors diagnosed with lymphoma (OR = 0.31, CI: 0.13–0.73) and those with >2 children (OR = 0.48, CI: 0.30–0.75); and fathers of survivors who had had a relapse (OR = 0.13, CI: 0.04–0.36). Conclusion The employment situation of parents of long-term survivors reflected the more traditional parenting roles.
Specific support for parents with low education, for those with additional children, and for those whose child had a more severe cancer could improve their long-term employment situation. PMID:26990301
Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi
2016-01-01
Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405
Economic Impacts of Wind Turbine Development in U.S. Counties
DOE Office of Scientific and Technical Information (OSTI.GOV)
J., Brown; B., Hoen; E., Lantz
2011-07-25
The objective is to address the research question using post-construction, county-level data and econometric evaluation methods. Wind energy is expanding rapidly in the United States: over the last 4 years, wind power has contributed approximately 35 percent of all new electric power capacity. Wind power plants are often developed in rural areas where local economic development impacts from the installation are projected, including land lease and property tax payments and employment growth during plant construction and operation. Wind energy represented 2.3 percent of the U.S. electricity supply in 2010, but studies show that penetrations of at least 20 percent are feasible. Several studies have used input-output models to predict direct, indirect, and induced economic development impacts. These analyses have often been completed prior to project construction. Available studies have not yet investigated the economic development impacts of wind development at the county level using post-construction econometric evaluation methods. Analysis of county-level impacts is limited. However, previous county-level analyses have estimated operation-period employment at 0.2 to 0.6 jobs per megawatt (MW) of power installed and earnings at $9,000/MW to $50,000/MW. We find statistically significant evidence of positive impacts of wind development on county-level per capita income from the OLS and spatial lag models when they are applied to the full set of wind and non-wind counties. The total impact on annual per capita income of wind turbine development (measured in MW per capita) in the spatial lag model was $21,604 per MW. This estimate is within the range of values estimated in the literature using input-output models. OLS results for the wind-only counties and matched samples are similar in magnitude, but are not statistically significant at the 10-percent level.
We find a statistically significant impact of wind development on employment in the OLS analysis for wind counties only, but not in the other models. Our estimates of employment impacts are not precise enough to assess the validity of employment impacts from input-output models applied in advance of wind energy project construction. The analysis provides empirical evidence of positive income effects at the county level from cumulative wind turbine development, consistent with the range of impacts estimated using input-output models. Employment impacts are less clear.
NASA Astrophysics Data System (ADS)
Lao, Jiashun; Nie, He; Jiang, Yonghong
2018-06-01
This paper employs SBW proposed by Baker and Wurgler (2006) to investigate the nonlinear asymmetric Granger causality between investor sentiment and stock returns for US economy while considering different time-scales. The wavelet method is utilized to decompose time series of investor sentiment and stock returns at different time-scales to focus on the local analysis of different time horizons of investors. The linear and nonlinear asymmetric Granger methods are employed to examine the Granger causal relationship on similar time-scales. We find evidence of strong bilateral linear and nonlinear asymmetric Granger causality between longer-term investor sentiment and stock returns. Furthermore, we observe the positive nonlinear causal relationship from stock returns to investor sentiment and the negative nonlinear causal relationship from investor sentiment to stock returns.
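The multi-scale step of this analysis rests on wavelet decomposition of each series into coarse (longer-term) and fine (shorter-term) components. A minimal one-level Haar step illustrates the idea; the paper's specific wavelet basis is not stated in the abstract, so this is a generic sketch with illustrative names.

```python
# One-level Haar wavelet step: split an even-length series into a coarse
# "approximation" (pairwise averages, the longer-term component) and a
# "detail" (pairwise half-differences, the shorter-term component).
# Repeating the step on the approximation yields further time-scales.

def haar_step(x):
    a = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]  # averages
    d = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]  # differences
    return a, d

def haar_inverse(a, d):
    """Exact reconstruction: x[2i] = a[i] + d[i], x[2i+1] = a[i] - d[i]."""
    x = []
    for ai, di in zip(a, d):
        x += [ai + di, ai - di]
    return x
```

Causality tests can then be run scale by scale, pairing the sentiment and return components obtained at the same decomposition level.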
Preparation of Ag-loaded octahedral Bi2WO6 photocatalyst and its photocatalytic activity
NASA Astrophysics Data System (ADS)
An, Liang; Wang, Guanghui; Zhou, Xuan; Wang, Yi; Gao, Fang; Cheng, Yang
2014-12-01
In this work, an Ag-loaded octahedral Bi2WO6 photocatalyst was successfully prepared by a hydrothermal method followed by photodeposition. X-ray diffraction (XRD), energy-dispersive X-ray analysis (EDX), field-emission scanning electron microscopy (FE-SEM) and ultraviolet-visible absorption spectroscopy (UV-Vis) were employed to characterize the composite photocatalyst. Furthermore, two different photocatalysts, including the obtained Ag-loaded octahedral Bi2WO6, were employed for the photodegradation of Orange II (OII) in simulated contaminated water. Results show that the Ag-loaded Bi2WO6 photocatalyst exhibits superior photocatalytic properties compared to the undoped Bi2WO6. The reasons for the improvement in photocatalytic activity of the Ag-loaded octahedral Bi2WO6 are also discussed.
Large-scale inverse model analyses employing fast randomized data reduction
NASA Astrophysics Data System (ADS)
Lin, Youzuo; Le, Ellen B.; O'Malley, Daniel; Vesselinov, Velimir V.; Bui-Thanh, Tan
2017-08-01
When the number of observations is large, it is computationally challenging to apply classical inverse modeling techniques. We have developed a new computationally efficient technique for solving inverse problems with a large number of observations (e.g., on the order of 10^7 or greater). Our method, which we call the randomized geostatistical approach (RGA), is built upon the principal component geostatistical approach (PCGA). We employ a data reduction technique combined with the PCGA to improve the computational efficiency and reduce the memory usage. Specifically, we employ a randomized numerical linear algebra technique based on a so-called "sketching" matrix to effectively reduce the dimension of the observations without losing the information content needed for the inverse analysis. In this way, the computational and memory costs for RGA scale with the information content rather than the size of the calibration data. Our algorithm is coded in Julia and implemented in the MADS open-source high-performance computational framework (http://mads.lanl.gov). We apply our new inverse modeling method to invert for a synthetic transmissivity field. Compared to a standard geostatistical approach (GA), our method is more efficient when the number of observations is large. Most importantly, our method is capable of solving larger inverse problems than the standard GA and PCGA approaches. Therefore, our new model inversion method is a powerful tool for solving large-scale inverse problems. The method can be applied in any field and is not limited to hydrogeological applications such as the characterization of aquifer heterogeneity.
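The sketching idea itself is compact: a k-by-m random matrix S compresses m observations into k sketched rows, and the small sketched problem is solved in place of the full one. The toy below fits a one-parameter model from sketched data. It is an illustration of the general technique only, not the RGA algorithm (which combines sketching with the PCGA machinery); the function names are ours.

```python
# Sketch of randomized data reduction for least squares: compress the
# observations with a random Gaussian sketching matrix, then solve the
# small sketched normal equations.
import random

def apply_sketch(vec, S):
    """Apply the k-by-m sketching matrix S to a length-m vector."""
    return [sum(row[j] * vec[j] for j in range(len(vec))) for row in S]

def sketched_slope(xs, ys, k, seed=0):
    """Fit y ~ a*x using only the k-dimensional sketches of x and y."""
    rng = random.Random(seed)
    m = len(xs)
    S = [[rng.gauss(0.0, 1.0) for _ in range(m)] for _ in range(k)]
    sx, sy = apply_sketch(xs, S), apply_sketch(ys, S)
    # Normal equation of the sketched problem: a = <sx, sy> / <sx, sx>
    return sum(a * b for a, b in zip(sx, sy)) / sum(a * a for a in sx)
```

With k much smaller than m, the cost of the solve scales with k, which mirrors the abstract's point that cost tracks information content rather than data size.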
Liang, Xianrui; Ma, Meiling; Su, Weike
2013-01-01
Background: A method for chemical fingerprint analysis of Hibiscus mutabilis L. leaves was developed based on ultra performance liquid chromatography with photodiode array detector (UPLC-PAD) combined with similarity analysis (SA) and hierarchical clustering analysis (HCA). Materials and Methods: 10 batches of Hibiscus mutabilis L. leaf samples were collected from different regions of China. UPLC-PAD was employed to collect chemical fingerprints of Hibiscus mutabilis L. leaves. Results: The relative standard deviations (RSDs) of the relative retention times (RRT) and relative peak areas (RPA) of 10 characteristic peaks (one of them identified as rutin) in precision, repeatability and stability tests were less than 3%, and the method of fingerprint analysis was validated to be suitable for Hibiscus mutabilis L. leaves. Conclusions: The chromatographic fingerprints showed abundant qualitative diversity of chemical constituents in the 10 batches of Hibiscus mutabilis L. leaf samples from different locations by similarity analysis, on the basis of calculating the correlation coefficients between each pair of fingerprints. Moreover, the HCA method clustered the samples into four classes, and the HCA dendrogram showed the close or distant relations among the 10 samples, which was consistent with the SA result to some extent. PMID:23930008
Analysis of high-incidence separated flow past airfoils
NASA Technical Reports Server (NTRS)
Ghia, K. N.; Osswald, G. A.; Ghia, U.
1989-01-01
An unsteady Navier-Stokes (NS) analysis is developed and used to carefully examine high-incidence aerodynamic separated flows past airfoils. Clustered conformal C-grids are employed for the 12 percent thick symmetric Joukowski airfoil as well as for the NACA 0012 airfoil with a sharp trailing edge. The clustering is controlled by appropriate one-dimensional stretching transformations. An attempt is made to resolve many of the dominant scales of an unsteady flow with massive separation, while maintaining the transformation metrics to be smooth and continuous in the entire flow field. A fully implicit time-marching alternating-direction implicit-block Gaussian elimination (ADI-BGE) method is employed, in which no use is made of any explicit artificial dissipation. Detailed results are obtained for massively separated, unsteady flow past symmetric Joukowski and NACA 0012 airfoils.
Seyyed Alizadeh Ganji, Seyyed Mohammad; Hayati, Mohammad
2018-06-05
The presence of cyanide ions in wastewater is dangerous to the health and life of living creatures, especially humans. Cyanide concentration should not exceed the acceptable limit in wastewaters, to avoid adverse effects on the environment. In this paper, in order to select the most appropriate method to remove cyanide from the wastewater of the Moteh gold mine, the use of calcium hypochlorite, sodium hypochlorite, and hydrogen peroxide was chosen, based on the experts' opinions, as the set of candidate alternatives in a multi-stage model. Then, seven criteria for assessing the considered methods were determined: the amount of material consumed, ease of implementation, safety, ability to remove cyanide, pH, time, and cost of the process. Afterwards, seven experts conducted numerous experiments to examine the conditions for each of these criteria. Then, by employing a mathematical method called "numerical taxonomy," the use of sodium hypochlorite was suggested as the best method to remove cyanide from the wastewater of the Moteh gold mine. Finally, the TOPSIS model was used to validate the proposed model, and it led to the same result as the suggested method: both the taxonomic analysis and the TOPSIS method identified sodium hypochlorite as the best option for cyanide removal from the wastewater. In addition, according to the analysis of various experiments, the conditions for complete removal of cyanide using sodium hypochlorite were concentration (8.64 g/L), pH (12.3), and temperature (12 °C).
Analysis of gene expression levels in individual bacterial cells without image segmentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kwak, In Hae; Son, Minjun; Hagen, Stephen J., E-mail: sjhagen@ufl.edu
2012-05-11
Highlights:
- We present a method for extracting gene expression data from images of bacterial cells.
- The method does not employ cell segmentation and does not require high magnification.
- Fluorescence and phase contrast images of the cells are correlated through the physics of phase contrast.
- We demonstrate the method by characterizing noisy expression of comX in Streptococcus mutans.
Abstract: Studies of stochasticity in gene expression typically make use of fluorescent protein reporters, which permit the measurement of expression levels within individual cells by fluorescence microscopy. Analysis of such microscopy images is almost invariably based on a segmentation algorithm, where the image of a cell or cluster is analyzed mathematically to delineate individual cell boundaries. However, segmentation can be ineffective for studying bacterial cells or clusters, especially at lower magnification, where outlines of individual cells are poorly resolved. Here we demonstrate an alternative method for analyzing such images without segmentation. The method employs a comparison between the pixel brightness in phase contrast vs fluorescence microscopy images. By fitting the correlation between phase contrast and fluorescence intensity to a physical model, we obtain well-defined estimates for the different levels of gene expression that are present in the cell or cluster. The method reveals the boundaries of the individual cells, even if the source images lack the resolution to show these boundaries clearly.
Design of a high altitude long endurance flying-wing solar-powered unmanned air vehicle
NASA Astrophysics Data System (ADS)
Alsahlani, A. A.; Johnston, L. J.; Atcliffe, P. A.
2017-06-01
The low-Reynolds number environment of high-altitude flight places severe demands on the aerodynamic design and stability and control of a high altitude, long endurance (HALE) unmanned air vehicle (UAV). The aerodynamic efficiency of a flying-wing configuration makes it an attractive design option for such an application and is investigated in the present work. The proposed configuration has a high-aspect-ratio, swept-wing planform, the wing sweep being necessary to provide an adequate moment arm for outboard longitudinal and lateral control surfaces. A design optimization framework is developed under a MATLAB environment, combining aerodynamic, structural, and stability analysis. Low-order analysis tools are employed to facilitate efficient computations, which is important when there are multiple optimization loops for the various engineering analyses. In particular, a vortex-lattice method is used to compute the wing planform aerodynamics, coupled to a two-dimensional (2D) panel method to derive aerofoil sectional characteristics. Integral boundary-layer methods are coupled to the panel method in order to predict flow separation boundaries during the design iterations. A quasi-analytical method is adapted for application to flying-wing configurations to predict the wing weight, and a linear finite-beam element approach is used for structural analysis of the wing-box. Stability is a particular concern in the low-density environment of high-altitude flight for flying-wing aircraft, and so provision of adequate directional stability and control power forms part of the optimization process. At present, a modified Genetic Algorithm is used in all of the optimization loops. Each of the low-order engineering analysis tools is validated using higher-order methods to provide confidence in the use of these computationally-efficient tools in the present design-optimization framework.
This paper includes the results of employing the present optimization tools in the design of a HALE, flying-wing UAV to indicate that this is a viable design configuration option.
Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John
2018-03-07
DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis are employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied to single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data.
This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
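The differential-methylation score named above, the Jensen-Shannon distance, is simple to state: it is the square root of the Jensen-Shannon divergence between two discrete distributions. The sketch below is a minimal generic implementation (base-2 logarithms, so the distance lies in [0, 1]); it is not the authors' genome-wide pipeline, and the function names are ours.

```python
# Jensen-Shannon distance between two discrete probability distributions,
# the measure used to compare test and reference methylation distributions.
import math

def kl(p, q):
    """Kullback-Leibler divergence in bits; 0*log(0/q) is taken as 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_distance(p, q):
    """Square root of the Jensen-Shannon divergence (a true metric)."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return math.sqrt((kl(p, m) + kl(q, m)) / 2)
```

Identical distributions give distance 0; distributions with disjoint support give the maximum distance of 1, so the score is directly comparable across genomic regions.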
Computer simulation of magnetic resonance spectra employing homotopy.
Gates, K E; Griffin, M; Hanson, G R; Burrage, K
1998-11-01
Multidimensional homotopy provides an efficient method for accurately tracing energy levels and hence transitions in the presence of energy level anticrossings and looping transitions. Herein we describe the application and implementation of homotopy to the analysis of continuous wave electron paramagnetic resonance spectra. The method can also be applied to electron nuclear double resonance, electron spin echo envelope modulation, solid-state nuclear magnetic resonance, and nuclear quadrupole resonance spectra. Copyright 1998 Academic Press.
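The essence of homotopy continuation is predictor-corrector tracing: follow a solution branch of f(x, t) = 0 as the parameter t is stepped, using the previous solution as the predictor and Newton iteration as the corrector. The toy below traces a root of a scalar cubic; the EPR application traces energy levels with the magnetic field as the parameter, but the abstract does not give that formulation, so this is a generic sketch with an arbitrary illustrative function.

```python
# Toy predictor-corrector homotopy: follow the root branch x(t) of
# f(x, t) = 0 through a list of parameter values ts, starting from x0.

def trace_root(f, df_dx, x0, ts, tol=1e-12):
    path, x = [], x0
    for t in ts:
        for _ in range(50):                      # Newton corrector at this t
            step = f(x, t) / df_dx(x, t)
            x -= step
            if abs(step) < tol:
                break
        path.append(x)                           # converged root at this t
    return path

# Illustrative branch: roots of x^3 - 2x + t, starting at x = sqrt(2), t = 0.
f = lambda x, t: x ** 3 - 2 * x + t
df = lambda x, t: 3 * x ** 2 - 2
ts = [i * 0.01 for i in range(21)]               # t from 0.0 to 0.2
branch = trace_root(f, df, x0=2 ** 0.5, ts=ts)
```

Using the previous root as the starting guess is what keeps the tracer on the same branch through crowded regions, which is the property the abstract relies on near energy-level anticrossings.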
Heuristic decomposition for non-hierarchic systems
NASA Technical Reports Server (NTRS)
Bloebaum, Christina L.; Hajela, P.
1991-01-01
Design and optimization is substantially more complex in multidisciplinary and large-scale engineering applications due to the existing inherently coupled interactions. The paper introduces a quasi-procedural methodology for multidisciplinary optimization that is applicable for nonhierarchic systems. The necessary decision-making support for the design process is provided by means of an embedded expert systems capability. The method employs a decomposition approach whose modularity allows for implementation of specialized methods for analysis and optimization within disciplines.
2001-08-01
Figure 1: Military Exchanges' Food Sales for Fiscal Year 1999. Abbreviations: AAFES, Army and Air Force ... military installation. As the franchisee, the exchange service builds and operates the restaurants and directly employs and trains the personnel. In ... they do not routinely develop a business case analysis, which would include weighing financial benefits with other factors, when determining which
IMP: Interactive mass properties program. Volume 1: Program description
NASA Technical Reports Server (NTRS)
Stewart, W. A.
1976-01-01
A method of computing a weights and center of gravity analysis of a flight vehicle using interactive graphical capabilities of the Adage 340 computer is described. The equations used to calculate area, volume, and mass properties are based on elemental surface characteristics. The input/output methods employ the graphic support of the Adage computer. Several interactive program options are available for analyzing the mass properties of a vehicle. These options are explained.
The effectiveness of cartographic visualisations in landscape archaeology
NASA Astrophysics Data System (ADS)
Fairbairn, David
2018-05-01
The use of maps and other geovisualisation methods has been longstanding in archaeology. Archaeologists employ advanced contemporary tools in their data collection, analysis and presentation. Maps can be used to render the `big data' commonly collected by archaeological prospection techniques, but are also fundamental output instruments for the dissemination of archaeological interpretation and modelling. This paper addresses, through case studies, alternate methods of geovisualisation in archaeology and identifies the efficiencies of each.
Elastic Behavior of a Rubber Layer Bonded between Two Rigid Spheres.
1988-05-01
Keywords: Cracking, Composites, Compressibility, Deformation, Dilatancy, Elasticity, Elastomers, Failure, Fracture, Particle Reinforcement, Rubber, Stress Analysis. Abstract: Finite element methods (FEM) have been employed to calculate the stresses ... deformations set up by compression or extension of the layer, using finite element methods (FEM) and not invoking the condition of incompressibility
Hernández-Hierro, J M; García-Villanova, R J; González-Martín, I
2008-08-01
The potential of the near infrared spectroscopy (NIRS) technique for the analysis of red paprika for aflatoxin B1, ochratoxin A and total aflatoxins is explored. As a reference, the results from a chromatographic method with fluorescence detection (HPLC-FD) following an immunoaffinity cleanup (IAC) were employed. For the NIRS measurement, a remote reflectance fibre-optic probe was applied directly onto the samples of paprika. There was no need for pre-treatment or manipulation of the sample. The modified partial least squares (MPLS) algorithm was employed as the regression method. The multiple correlation coefficients (RSQ) and the prediction corrected standard errors (SEP(C)) were 0.955 and 0.2 µg/kg, 0.853 and 2.3 µg/kg, and 0.938 and 0.3 µg/kg for aflatoxin B1, ochratoxin A and total aflatoxins, respectively. The capacity for prediction of the developed model, measured as the ratio performance deviation (RPD), for aflatoxin B1 (5.2), ochratoxin A (2.8) and total aflatoxins (4.4) indicates that the NIRS technique using a fibre-optic probe offers an alternative for the determination of these three parameters in paprika, with an advantageously lower cost and higher speed compared with the chemical method. The contents of aflatoxin B1 and total aflatoxins are the parameters currently employed by food regulations to limit the levels of the four aflatoxins in many foodstuffs. In addition, aflatoxin B1 itself is an excellent indicator of aflatoxin contamination since it is always the most abundant and toxic.
Kozhimannil, Katy Backes; Attanasio, Laura B.; Johnson, Pamela Jo; Gjerdingen, Dwenda K.; McGovern, Patricia M.
2014-01-01
Background Rising rates of labor induction and cesarean delivery, especially when used without a medical reason, have generated concern among clinicians, women, and policymakers. Whether employment status affects pregnant women's childbirth-related care is not known. We estimated the relationship between prenatal employment and obstetric procedures, distinguishing whether women reported that the induction or cesarean was performed for medical reasons. Methods Using data from a nationally-representative sample of women who gave birth in U.S. hospitals (N=1,573), we used propensity score matching to reduce potential bias from non-random selection into employment. Outcomes were cesarean delivery and labor induction, with and without a self-reported medical reason. Exposure was prenatal employment status (full-time employment, not employed). We conducted separate analyses for unmatched and matched cohorts using multivariable regression models. Findings There were no differences in labor induction based on employment status. In unmatched analyses, employed women had higher odds of cesarean delivery overall (adjusted odds ratio (AOR) = 1.45, p=0.046) and cesarean delivery without medical reason (AOR=1.94, p=0.024). Adding an interaction term between employment and college education revealed no significant effects on cesarean without medical reason. There were no significant differences in cesarean delivery by employment status in the propensity score matched analysis. Conclusions Full-time prenatal employment is associated with higher odds of cesarean delivery, but this association was not explained by socio-economic status and no longer existed after accounting for socio-demographic differences by matching women employed full-time with similar women not employed during pregnancy. PMID:25213740
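Propensity score matching, the bias-reduction step described above, pairs each treated unit (here, a woman employed full-time) with a control whose estimated propensity score is closest, typically within a caliper. The sketch below shows greedy 1:1 nearest-neighbor matching given precomputed scores; it is a generic illustration, not the study's exact matching routine, and the identifiers and caliper value are ours.

```python
# Greedy 1:1 nearest-neighbor matching on propensity scores, without
# replacement and with a caliper on the allowed score difference.

def greedy_match(treated, controls, caliper=0.1):
    """treated/controls: dicts of id -> propensity score. Returns pairs."""
    available = dict(controls)
    pairs = {}
    # Match hardest cases first: treated units with the highest scores.
    for tid, ts in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        cid = min(available, key=lambda c: abs(available[c] - ts))
        if abs(available[cid] - ts) <= caliper:
            pairs[tid] = cid
            del available[cid]       # match without replacement
    return pairs

pairs = greedy_match({"t1": 0.80, "t2": 0.30},
                     {"c1": 0.75, "c2": 0.31, "c3": 0.50})
```

After matching, outcome models are fit on the matched cohort, which is why the cesarean association in the abstract attenuates once employed and non-employed women are balanced on observed covariates.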
Barnett, Scott D.; Goetz, Lance L.; Toscano, Richard
2015-01-01
Background: Designing effective vocational programs for persons with spinal cord injury (SCI) is essential for improving return to work outcome following injury. The relationship between specific vocational services and positive employment outcome has not been empirically studied. Objective: To examine the association of specific vocational service activities as predictors of employment. Method: Secondary analysis of a randomized, controlled trial of evidence-based supported employment (EBSE) with 12-month follow-up data among 81 Veteran participants with SCI. Results: Primary activities recorded were vocational counseling (23.9%) and vocational case management (23.8%). As expected, job development and employment supports were the most time-consuming activities per appointment. Though the amount of time spent in weekly appointments did not differ by employment outcome, participants obtaining competitive employment averaged significantly more individual activities per appointment. Further, for these participants, job development or placement and employment follow-along or supports were more likely to occur and vocational counseling was less likely to occur. Community-based employment services, including job development or placement and employment follow-along or supports as part of a supported employment model, were associated with competitive employment outcomes. Office-based vocational counseling services, which are common to general models of vocational rehabilitation, were associated with a lack of employment. Conclusions: Vocational services that actively engage Veterans with SCI in job seeking and acquisition and that provide on-the-job support are more likely to lead to employment than general vocational counseling that involves only job preparation. PMID:25762858
Vectorization and parallelization of the finite strip method for dynamic Mindlin plate problems
NASA Technical Reports Server (NTRS)
Chen, Hsin-Chu; He, Ai-Fang
1993-01-01
The finite strip method is a semi-analytical finite element process which allows for a discrete analysis of certain types of physical problems by discretizing the domain of the problem into finite strips. This method decomposes a single large problem into m smaller independent subproblems when m harmonic functions are employed, thus yielding natural parallelism at a very high level. In this paper we address vectorization and parallelization strategies for the dynamic analysis of simply-supported Mindlin plate bending problems and show how to prevent potential conflicts in memory access during the assemblage process. The vector and parallel implementations of this method and the performance results of a test problem under scalar, vector, and vector-concurrent execution modes on the Alliant FX/80 are also presented.
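The high-level parallelism described above, m independent subproblems with one per harmonic, can be sketched in Python. The matrix sizes and contents are illustrative, and a thread pool stands in for the Alliant FX/80's vector-concurrent execution:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def solve_harmonic(m, n_dof=50):
    """Solve the strip-level system K_m d_m = f_m for one harmonic.

    The matrices here are synthetic SPD stand-ins for the assembled
    stiffness of harmonic m; in the finite strip method each harmonic's
    system is assembled and solved with no data shared across harmonics.
    """
    rng = np.random.default_rng(m)          # deterministic per harmonic
    A = rng.normal(size=(n_dof, n_dof))
    K_m = A @ A.T + n_dof * np.eye(n_dof)   # symmetric positive definite
    f_m = rng.normal(size=n_dof)
    return np.linalg.solve(K_m, f_m)

harmonics = range(1, 9)

# the subproblems are fully independent, so they may run concurrently
with ThreadPoolExecutor() as pool:
    parallel = list(pool.map(solve_harmonic, harmonics))

# a serial reference run gives identical results
serial = [solve_harmonic(m) for m in harmonics]
print(all(np.allclose(a, b) for a, b in zip(parallel, serial)))
```

The independence of the per-harmonic systems is exactly why the method "yields natural parallelism at a very high level": no synchronization is needed until the harmonic contributions are superposed.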
Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing
2008-01-01
Background Real-time PCR techniques are being widely used for nucleic acids analysis, but one limitation of current frequently employed real-time PCR is the high cost of the labeled probe for each target molecule. Results We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage of generating fluorescence by probe hydrolysis and strand displacement over current real-time PCR methods. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair with attached universal template (UT) and the FP are identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification with comparable sensitivity, reproducibility, and repeatability with other real-time PCR methods. Conclusion The results from GMO quantification, gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the AUDP real-time PCR technique has been successfully applied in nucleic acids analysis, and the developed AUDP real-time PCR technique will offer an alternative way for nucleic acid analysis with high efficiency, reliability, and flexibility at low cost. PMID:18522756
Two Paradoxes in Linear Regression Analysis.
Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong
2016-12-25
Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.
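The abstract does not name the flawed selection procedure, but a well-known instance of the problem is screening candidate predictors on the same data later used to test them. A minimal simulation (all parameters illustrative) shows the inflated false-positive rate under a null model where no predictor is related to the outcome:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, reps = 50, 20, 200
false_hits = 0
for _ in range(reps):
    X = rng.normal(size=(n, p))
    y = rng.normal(size=n)          # outcome independent of every predictor
    # screen: pick the predictor most correlated with y in this sample
    r = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(p)])
    best = int(np.argmax(r))
    # naive t-test of the screened predictor on the SAME data
    rb = np.corrcoef(X[:, best], y)[0, 1]
    t = rb * np.sqrt((n - 2) / (1 - rb**2))
    false_hits += abs(t) > 2.01     # ~ two-sided 5% critical value, df = 48
print(false_hits / reps)
```

The screened-then-tested predictor is "significant" far more often than the nominal 5%, which is why formal procedures grounded in statistical theory are needed for model selection.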
Dynamic response of a viscoelastic Timoshenko beam
NASA Technical Reports Server (NTRS)
Kalyanasundaram, S.; Allen, D. H.; Schapery, R. A.
1987-01-01
The analysis presented in this study deals with the vibratory response of viscoelastic Timoshenko (1955) beams under the assumption of small material loss tangents. The appropriate method of analysis employed here may be applied to more complex structures. This study compares the damping ratios obtained from the Timoshenko and Euler-Bernoulli theories for a given viscoelastic material system. From this study the effect of shear deformation and rotary inertia on damping ratios can be identified.
Radar signal analysis of ballistic missile with micro-motion based on time-frequency distribution
NASA Astrophysics Data System (ADS)
Wang, Jianming; Liu, Lihua; Yu, Hua
2015-12-01
The micro-motion of ballistic missile targets induces micro-Doppler modulation on the radar return signal, which is a unique feature for warhead discrimination during flight. In order to extract the micro-Doppler feature of ballistic missile targets, time-frequency analysis is employed to process the time-varying, micro-Doppler-modulated radar signal. Images of the time-frequency distribution (TFD) reveal the micro-Doppler modulation characteristic very well. However, many time-frequency analysis methods exist for generating time-frequency distribution images, including the short-time Fourier transform (STFT), the Wigner distribution (WD), and the Cohen class of distributions. Against the background of ballistic missile defence, this paper aims to identify an effective time-frequency analysis method for discriminating ballistic missile warheads from decoys.
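As a minimal sketch of the STFT branch of this analysis, the snippet below synthesizes a sinusoidally micro-Doppler-modulated return (all signal parameters invented for illustration) and extracts the modulation ridge from its time-frequency distribution:

```python
import numpy as np
from scipy.signal import stft

fs = 1000.0                      # sampling rate, Hz (illustrative)
t = np.arange(0, 1.0, 1 / fs)
# precession-like micro-motion: sinusoidal micro-Doppler about a body Doppler
f_body, f_md, f_m = 100.0, 40.0, 4.0   # carrier, modulation depth, micro-motion rate
phase = 2 * np.pi * f_body * t - (f_md / f_m) * np.cos(2 * np.pi * f_m * t)
sig = np.exp(1j * phase)         # instantaneous freq: f_body + f_md*sin(2*pi*f_m*t)

f, tau, Z = stft(sig, fs=fs, nperseg=128, return_onesided=False)
# the ridge of |Z| traces the instantaneous frequency, i.e. the micro-Doppler
ridge = f[np.argmax(np.abs(Z), axis=0)]
print(round(ridge.min()), round(ridge.max()))
```

The sinusoidal ridge, oscillating between roughly f_body - f_md and f_body + f_md at the micro-motion rate, is the micro-Doppler signature that distinguishes a precessing warhead from decoys with different micro-motion.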
Walker, Rebecca L; Morrissey, Clair
2014-11-01
While bioethics as a field has concerned itself with methodological issues since the early years, there has been no systematic examination of how ethics is incorporated into research on the Ethical, Legal and Social Implications (ELSI) of the Human Genome Project. Yet ELSI research may bear a particular burden of investigating and substantiating its methods given public funding, an explicitly cross-disciplinary approach, and the perceived significance of adequate responsiveness to advances in genomics. We undertook a qualitative content analysis of a sample of ELSI publications appearing between 2003 and 2008 with the aim of better understanding the methods, aims, and approaches to ethics that ELSI researchers employ. We found that the aims of ethics within ELSI are largely prescriptive and address multiple groups. We also found that the bioethics methods used in the ELSI literature are both diverse between publications and multiple within publications, but are usually not themselves discussed or employed as suggested by bioethics method proponents. Ethics in ELSI is also sometimes undistinguished from related inquiries (such as social, legal, or political investigations). © 2013 John Wiley & Sons Ltd.
Guo, C; Hu, J-Y; Chen, X-Y; Li, J-Z
2008-02-01
An analytical method for the determination of imazaquin residues in soybeans was developed. The liquid/liquid partition and strong anion exchange solid-phase extraction procedures provide effective cleanup, removing most sample matrix interferences. By optimizing the pH of the water/acetonitrile mobile phase with phosphoric acid, using a C-18 reversed-phase chromatographic column, and employing ultraviolet detection, excellent peak resolution was achieved. The combined cleanup and chromatographic steps reported herein were sensitive and reliable for determining imazaquin residues in soybean samples. The method is characterized by recovery >88.4%, precision <6.7% CV, and a sensitivity of 0.005 ppm, in agreement with directives for method validation in residue analysis. Imazaquin residues in soybeans were further confirmed by high-performance liquid chromatography-mass spectrometry (LC-MS). The proposed method was successfully applied to the analysis of imazaquin residues in soybean samples grown in an experimental field after treatment with an imazaquin formulation.
NASA Technical Reports Server (NTRS)
Gaebler, John A.; Tolson, Robert H.
2010-01-01
In the study of entry, descent, and landing, Monte Carlo sampling methods are often employed to study the uncertainty in the designed trajectory. The large number of uncertain inputs and outputs, coupled with complicated non-linear models, can make interpretation of the results difficult. Three methods that provide statistical insights are applied to an entry, descent, and landing simulation. The advantages and disadvantages of each method are discussed in terms of the insights gained versus the computational cost. The first method investigated was failure domain bounding which aims to reduce the computational cost of assessing the failure probability. Next a variance-based sensitivity analysis was studied for the ability to identify which input variable uncertainty has the greatest impact on the uncertainty of an output. Finally, probabilistic sensitivity analysis is used to calculate certain sensitivities at a reduced computational cost. These methods produce valuable information that identifies critical mission parameters and needs for new technology, but generally at a significant computational cost.
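The variance-based first-order sensitivity index mentioned above, Var(E[Y|X_i])/Var(Y), can be estimated from Monte Carlo samples by simple binning. The toy model and sample sizes below are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000
# toy model: output depends strongly on x1, weakly on x2 (illustrative only)
x1 = rng.normal(size=N)
x2 = rng.normal(size=N)
y = 4.0 * x1 + x2**2

def first_order_index(x, y, bins=50):
    """Estimate Var(E[Y|X]) / Var(Y) by conditioning on quantile bins of x."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    return np.average((cond_means - y.mean()) ** 2, weights=counts) / y.var()

s1 = first_order_index(x1, y)
s2 = first_order_index(x2, y)
print(round(s1, 2), round(s2, 2))
```

For this model the analytic shares are 16/18 and 2/18, so x1 dominates; ranking inputs by such indices is how a variance-based analysis identifies which input uncertainty most drives the output uncertainty.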
Dettmer, Katja; Stevens, Axel P; Fagerer, Stephan R; Kaspar, Hannelore; Oefner, Peter J
2012-01-01
Two mass spectrometry-based methods for the quantitative analysis of free amino acids are described. The first method uses propyl chloroformate/propanol derivatization and gas chromatography-quadrupole mass spectrometry (GC-qMS) analysis in single-ion monitoring mode. Derivatization is carried out directly in aqueous samples, thereby allowing automation of the entire procedure, including addition of reagents, extraction, and injection into the GC-MS. The method delivers the quantification of 26 amino acids. The isobaric tagging for relative and absolute quantification (iTRAQ) method employs the labeling of amino acids with isobaric iTRAQ tags. The tags contain two different cleavable reporter ions, one for the sample and one for the standard, which are detected by fragmentation in a tandem mass spectrometer. Reversed-phase liquid chromatography of the labeled amino acids is performed prior to mass spectrometric analysis to separate isobaric amino acids. The commercial iTRAQ kit allows for the analysis of 42 physiological amino acids with a respective isotope-labeled standard for each of these 42 amino acids.
Borges, Sivanildo S.; Reis, Boaventura F.
2011-01-01
A photometric procedure for the determination of ClO− in tap water employing a miniaturized multicommuted flow analysis setup and an LED-based photometer is described. The analytical procedure was implemented using leucocrystal violet (LCV; 4,4′,4′′-methylidynetris (N,N-dimethylaniline), C25H31N3) as a chromogenic reagent. Solenoid micropumps employed for propelling the solutions were assembled together with the photometer to compose a compact unit of small dimensions. After optimization of the control variables, the system was applied to the determination of ClO− in samples of tap water; for accuracy assessment, the samples were also analyzed using an independent method. Applying the paired t-test to the results obtained by both methods, no significant difference at the 95% confidence level was observed. Other useful features include low reagent consumption (2.4 μg of LCV per determination), a linear response ranging from 0.02 up to 2.0 mg L−1 ClO−, a relative standard deviation of 1.0% (n = 11) for samples containing 0.2 mg L−1 ClO−, a detection limit of 6.0 μg L−1 ClO−, a sampling throughput of 84 determinations per hour, and a waste generation of 432 μL per determination. PMID:21747732
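The paired t-test used above for accuracy assessment can be sketched with scipy; the paired ClO− determinations below are hypothetical numbers, not the paper's data:

```python
import numpy as np
from scipy import stats

# hypothetical paired determinations (mg L-1) of the SAME tap-water samples
# by the proposed flow method and by an independent reference method
flow = np.array([0.21, 0.48, 0.95, 1.30, 0.10, 0.75, 1.80, 0.33])
ref  = np.array([0.20, 0.50, 0.93, 1.33, 0.11, 0.73, 1.84, 0.31])

# paired test: are the per-sample differences centered on zero?
t_stat, p_value = stats.ttest_rel(flow, ref)
print(round(p_value, 3))
# p > 0.05 here: no significant difference between methods at the 95% level
```

Pairing is essential in method-comparison studies: each sample serves as its own control, so between-sample concentration differences do not inflate the error term.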