Brief Experimental Analyses of Academic Performance: Introduction to the Special Series
ERIC Educational Resources Information Center
McComas, Jennifer J.; Burns, Matthew K.
2009-01-01
Academic skills are frequent concerns in K-12 schools and could benefit from the application of applied behavior analysis (ABA). Brief experimental analysis (BEA) of academic performance is perhaps the most promising approach for applying ABA to student learning. Although research has consistently demonstrated the effectiveness of academic…
ERIC Educational Resources Information Center
Leitner, Karl-Heinz; Prikoszovits, Julia; Schaffhauser-Linzatti, Michaela; Stowasser, Rainer; Wagner, Karin
2007-01-01
This paper explores the performance efficiency of natural and technical science departments at Austrian universities using Data Envelopment Analysis (DEA). We present DEA as an alternative tool for benchmarking and ranking the assignment of decision-making units (organisations and organisational units). The method applies a multiple input and…
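The abstract names Data Envelopment Analysis as the benchmarking tool. Below is a minimal sketch of the standard input-oriented CCR envelopment model that such analyses typically solve, one linear program per decision-making unit (DMU); the input/output matrices are invented toy data, not the Austrian department data.

```python
# Input-oriented CCR DEA: for each DMU, minimize theta subject to
# X @ lam <= theta * x0 and Y @ lam >= y0, lam >= 0.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0],    # inputs: rows = input types, cols = DMUs
              [1.0, 2.0, 1.5, 3.0]])
Y = np.array([[1.0, 2.0, 2.5, 3.0]])   # outputs: rows = output types

n_dmu = X.shape[1]
for j in range(n_dmu):
    x0, y0 = X[:, j], Y[:, j]
    # decision variables: z = [theta, lambda_1 ... lambda_n]
    c = np.zeros(1 + n_dmu)
    c[0] = 1.0                          # minimize theta
    A_in = np.hstack([-x0[:, None], X])                  # X@lam - theta*x0 <= 0
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])   # -Y@lam <= -y0
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(X.shape[0]), -y0]),
                  bounds=[(0, None)] * (1 + n_dmu), method="highs")
    print(f"DMU {j}: efficiency = {res.x[0]:.3f}")
```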
Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.
1981-01-01
Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.
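The report compares performability analysis with the fault tree method. As a point of reference, here is a minimal sketch of the fault tree side: the top-event probability of a small tree of independent basic events using the standard AND/OR gate formulas. The tree structure and probabilities are invented, not taken from the report.

```python
# Top-event probability for independent basic events:
# AND gate = product of probabilities; OR gate = 1 - product of complements.
def AND(*p):
    out = 1.0
    for q in p:
        out *= q
    return out

def OR(*p):
    out = 1.0
    for q in p:
        out *= (1.0 - q)
    return 1.0 - out

p_sensor, p_cpu_a, p_cpu_b, p_bus = 1e-3, 1e-4, 1e-4, 5e-5
# System fails if the sensor fails, the bus fails, or both redundant CPUs fail.
p_top = OR(p_sensor, p_bus, AND(p_cpu_a, p_cpu_b))
print(f"top-event probability per mission: {p_top:.3e}")
```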
NASA Technical Reports Server (NTRS)
Ruf, Joseph; Holt, James B.; Canabal, Francisco
1999-01-01
This paper presents the status of analyses on three Rocket Based Combined Cycle configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes code for ejector mode fluid dynamics. The Draco engine analysis is a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.
NASA Technical Reports Server (NTRS)
Ruf, Joseph H.; Holt, James B.; Canabal, Francisco
2001-01-01
This paper presents the status of analyses on three Rocket Based Combined Cycle (RBCC) configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics (CFD) analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes (FDNS) code for ejector mode fluid dynamics. The Draco analysis was a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.
Advancing our thinking in presence-only and used-available analysis.
Warton, David; Aarts, Geert
2013-11-01
1. The problems of analysing used-available data and presence-only data are equivalent, and this paper uses this equivalence as a platform for exploring opportunities for advancing analysis methodology. 2. We suggest some potential methodological advances in used-available analysis, made possible via lessons learnt in the presence-only literature, for example, using modern methods to improve predictive performance. We also consider the converse - potential advances in presence-only analysis inspired by used-available methodology. 3. Notwithstanding these potential advances in methodology, perhaps a greater opportunity is in advancing our thinking about how to apply a given method to a particular data set. 4. It is shown by example that strikingly different results can be achieved for a single data set by applying a given method of analysis in different ways - hence having chosen a method of analysis, the next step of working out how to apply it is critical to performance. 5. We review some key issues to consider in deciding how to apply an analysis method: apply the method in a manner that reflects the study design; consider data properties; and use diagnostic tools to assess how reasonable a given analysis is for the data at hand. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.
Data analysis techniques used at the Oak Ridge Y-12 plant flywheel evaluation laboratory
NASA Astrophysics Data System (ADS)
Steels, R. S., Jr.; Babelay, E. F., Jr.
1980-07-01
Some of the more advanced data analysis techniques applied to the problem of experimentally evaluating the performance of high-performance composite flywheels are presented. Real-time applications include polar plots of runout with interruptions relating to balance and relative motions between parts, radial growth measurements, and temperature of the spinning part. The technique used to measure torque applied to a containment housing during flywheel failure is also presented. The discussion of pre- and post-test analysis techniques includes resonant frequency determination with modal analysis, waterfall charts, and runout signals at failure.
A Data Warehouse Architecture for DoD Healthcare Performance Measurements.
1999-09-01
This thesis defines a methodology to design, develop, implement, and apply statistical analysis and data mining tools to a Data Warehouse of DoD healthcare performance metrics.
Applied Behavior Analysis and the Imprisoned Adult Felon Project 1: The Cellblock Token Economy.
ERIC Educational Resources Information Center
Milan, Michael A.; And Others
This report provides a technical-level analysis, discussion, and summary of five experiments in applied behavior analysis. Experiment 1 examined the token economy as a basis for motivating inmate behavior; Experiment 2 examined the relationship between magnitude of token reinforcement and level of inmate performance; Experiment 3 introduced a…
NASA Astrophysics Data System (ADS)
Tran, Trang; Tran, Huy; Mansfield, Marc; Lyman, Seth; Crosman, Erik
2018-03-01
Four-dimensional data assimilation (FDDA) was applied in WRF-CMAQ model sensitivity tests to study the impact of observational and analysis nudging on model performance in simulating inversion layers and O3 concentration distributions within the Uintah Basin, Utah, U.S.A. in winter 2013. Observational nudging substantially improved WRF model performance in simulating surface wind fields, correcting a 10 °C warm surface temperature bias, correcting overestimation of the planetary boundary layer height (PBLH) and correcting underestimation of inversion strengths produced by regular WRF model physics without nudging. However, the combined effects of poor performance of WRF meteorological model physical parameterization schemes in simulating low clouds, and warm and moist biases in the temperature and moisture initialization and subsequent simulation fields, likely amplified the overestimation of warm clouds during inversion days when observational nudging was applied, impacting the resulting O3 photochemical formation in the chemistry model. To reduce the impact of a moist bias in the simulations on warm cloud formation, nudging with the analysis water mixing ratio above the planetary boundary layer (PBL) was applied. However, due to poor analysis vertical temperature profiles, applying analysis nudging also increased the errors in the modeled inversion layer vertical structure compared to observational nudging. Combining both observational and analysis nudging methods resulted in unrealistically extreme stratified stability that trapped pollutants at the lowest elevations at the center of the Uintah Basin and yielded the worst WRF performance in simulating inversion layer structure among the four sensitivity tests. The results of this study illustrate the importance of carefully considering the representativeness and quality of the observational and model analysis data sets when applying nudging techniques within stable PBLs, and the need to evaluate model results on a basin-wide scale.
A New Approach to Aircraft Robust Performance Analysis
NASA Technical Reports Server (NTRS)
Gregory, Irene M.; Tierno, Jorge E.
2004-01-01
A recently developed algorithm for nonlinear system performance analysis has been applied to an F16 aircraft to begin evaluating the suitability of the method for aerospace problems. The algorithm has a potential to be much more efficient than the current methods in performance analysis for aircraft. This paper is the initial step in evaluating this potential.
DOT National Transportation Integrated Search
1978-10-01
This report presents a method that may be used to evaluate the reliability of performance of individual subjects, particularly in applied laboratory research. The method is based on analysis of variance of a tasks-by-subjects data matrix, with all sc...
NASA Technical Reports Server (NTRS)
Evers, Ken H.; Bachert, Robert F.
1987-01-01
The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.
Application of neural networks and sensitivity analysis to improved prediction of trauma survival.
Hunter, A; Kennedy, L; Henry, J; Ferguson, I
2000-05-01
The performance of trauma departments is widely audited by applying predictive models that assess probability of survival, and examining the rate of unexpected survivals and deaths. Although the TRISS methodology, a logistic regression modelling technique, is still the de facto standard, it is known that neural network models perform better. A key issue when applying neural network models is the selection of input variables. This paper proposes a novel form of sensitivity analysis, which is simpler to apply than existing techniques, and can be used for both numeric and nominal input variables. The technique is applied to the audit survival problem, and used to analyse the TRISS variables. The conclusions discuss the implications for the design of further improved scoring schemes and predictive models.
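The paper's own sensitivity analysis is not reproduced here; as a generic stand-in, the sketch below applies permutation importance, a model-agnostic input-relevance measure in the same spirit, to a neural network classifier on synthetic survival-style data, assuming scikit-learn. Feature names and data are invented.

```python
# Permutation importance: shuffle one input at a time on held-out data and
# measure how much the model's score drops; big drops mark relevant inputs.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))                 # hypothetical trauma predictors
logit = 2.0 * X[:, 0] - 1.0 * X[:, 2]          # only features 0 and 2 matter
y = (logit + rng.normal(size=1000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)
imp = permutation_importance(clf, X_te, y_te, n_repeats=20, random_state=0)
for i, (m, s) in enumerate(zip(imp.importances_mean, imp.importances_std)):
    print(f"feature {i}: importance {m:.3f} +/- {s:.3f}")
```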
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.
2010-02-28
Small signal stability problems are one of the major threats to grid stability and reliability. Prony analysis has been successfully applied on ringdown data to monitor electromechanical modes of a power system using phasor measurement unit (PMU) data. To facilitate an on-line application of mode estimation, this paper develops a recursive algorithm for implementing Prony analysis and proposes an oscillation detection method to detect ringdown data in real time. By automatically detecting ringdown data, the proposed method helps guarantee that Prony analysis is applied properly and timely on the ringdown data. Thus, the mode estimation results can be obtained reliably and in a timely manner. The proposed method is tested using Monte Carlo simulations based on a 17-machine model and is shown to be able to properly identify the oscillation data for on-line application of Prony analysis. In addition, the proposed method is applied to field measurement data from WECC to show the performance of the proposed algorithm.
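For readers unfamiliar with the technique, a compact sketch of classical batch Prony analysis on a ringdown signal follows: fit a linear-prediction model, take the roots of its characteristic polynomial, and convert them to modal frequency and damping. The recursive on-line variant and the oscillation detector developed in the paper are not shown, and the test signal is synthetic.

```python
import numpy as np

def prony_modes(x, p, dt):
    n = len(x)
    # linear prediction: x[k] = c1*x[k-1] + ... + cp*x[k-p], fit by least squares
    A = np.column_stack([x[p - 1 - i : n - 1 - i] for i in range(p)])
    c, *_ = np.linalg.lstsq(A, x[p:], rcond=None)
    roots = np.roots(np.concatenate([[1.0], -c]))   # z^p - c1*z^(p-1) - ... - cp
    freq = np.angle(roots) / (2 * np.pi * dt)       # Hz
    damping = np.log(np.abs(roots)) / dt            # 1/s, negative = decaying
    return freq, damping

dt = 0.05
t = np.arange(0, 10, dt)
x = np.exp(-0.2 * t) * np.cos(2 * np.pi * 0.7 * t)  # one 0.7 Hz decaying mode
for f, d in zip(*prony_modes(x, p=2, dt=dt)):
    if f > 0:
        print(f"mode: {f:.2f} Hz, damping {d:.2f} 1/s")
```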
Local regression type methods applied to the study of geophysics and high frequency financial data
NASA Astrophysics Data System (ADS)
Mariani, M. C.; Basu, K.
2014-09-01
In this work we applied locally weighted scatterplot smoothing techniques (Lowess/Loess) to geophysical and high frequency financial data. We first analyze and apply this technique to the California earthquake geological data. A spatial analysis was performed to show that the estimation of the earthquake magnitude at a fixed location is accurate to within a relative error of 0.01%. We also applied the same method to a high frequency data set arising in the financial sector and obtained similarly satisfactory results. The application of this approach to the two different data sets demonstrates that the overall method is accurate and efficient, and that the Lowess approach is much more desirable than the Loess method. Whereas previous works studied time series analysis, the local regression models in this paper perform a spatial analysis of the geophysics data, providing different information. For the high frequency data, our models estimate the curve of best fit where data are dependent on time.
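A minimal sketch of the Lowess smoothing step described above, using the statsmodels implementation on synthetic data; the earthquake and financial data sets themselves are not reproduced here.

```python
# Lowess fits a weighted local regression around each point; frac controls
# the fraction of the data used in each local fit (the smoothing bandwidth).
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 300))
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)   # noisy signal

smoothed = lowess(y, x, frac=0.2)   # returns an (n, 2) array: sorted x, fitted y
print(smoothed[:5])
```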
ERIC Educational Resources Information Center
Montoneri, Bernard; Lin, Tyrone T.; Lee, Chia-Chi; Huang, Shio-Ling
2012-01-01
This paper applies data envelopment analysis (DEA) to explore the quantitative relative efficiency of 18 classes of freshmen students studying a course of English conversation in a university of Taiwan from the academic year 2004-2006. A diagram of teaching performance improvement mechanism is designed to identify key performance indicators for…
ERIC Educational Resources Information Center
Luiselli, James K.; Bass, Jennifer D.; Whitcomb, Sara A.
2010-01-01
Staff training is a critical performance improvement objective within behavioral health care organizations. This study evaluated a systematic training program for teaching applied behavior analysis knowledge competencies to newly hired direct-care employees at a day and residential habilitation services agency for adults with intellectual and…
ERIC Educational Resources Information Center
Darabi, A. Aubteen
2005-01-01
This article reports a case study describing how the principles of a cognitive apprenticeship (CA) model developed by Collins, Brown, and Holum (1991) were applied to a graduate course on performance systems analysis (PSA), and the differences this application made in student performance and evaluation of the course compared to the previous…
ERIC Educational Resources Information Center
Shinn, Mark; And Others
Two studies were conducted to (1) analyze the subtest characteristics of the Woodcock-Johnson Psycho-Educational Battery, and (2) apply those results to an analysis of 50 fourth grade learning disabled (LD) students' performance on the Battery. Analyses indicated that the poorer performance of LD students on the Woodcock-Johnson Tests of Cognitive…
Nanometer scale composition study of MBE grown BGaN performed by atom probe tomography
NASA Astrophysics Data System (ADS)
Bonef, Bastien; Cramer, Richard; Speck, James S.
2017-06-01
Laser assisted atom probe tomography is used to characterize the alloy distribution in BGaN. The effect of the evaporation conditions applied to the atom probe specimens on the mass spectrum and on the quantification of the group-III site atoms is first evaluated. The evolution of the Ga++/Ga+ charge state ratio is used to monitor the strength of the applied field. Experiments revealed that applying high electric fields to the specimen results in the loss of gallium atoms, leading to over-estimation of the boron concentration. Moreover, spatial analysis of the surface field revealed a significant loss of atoms at the center of the specimen, where high fields are applied. A good agreement between X-ray diffraction and atom probe tomography concentration measurements is obtained when low fields are applied to the tip. A random distribution of boron in the BGaN layer grown by molecular beam epitaxy is obtained by performing accurate and site-specific statistical distribution analysis.
Activity analysis: contributions to the innovation of projects for aircraft cabins.
Rossi, N T; Greghi, F M; Menegon, L N; Souza, G B J
2012-01-01
This article presents results obtained from an ergonomics intervention in the design of aircraft cabins. The study's aim is to analyze the contribution of the method adopted for analyzing passengers' activities in reference situations, that is, real-use situations in aircraft cabins, applied to typical activities performed by people in their own environment. Within this perspective, the study presents two analyses that highlight the use of an electronic device. The first was recorded by filming during a real commercial flight; in the second, the device is used within the domestic environment. The same method, based on activity analysis, was applied in both contexts: filming of the activity, posture and action analysis, self-confrontation interviews, action course reconstruction, and elaboration of posture envelopes. The results point out that the developed method can be applied to different contexts, evincing different ways of occupying space to meet personal needs while performing an activity, which can help with the anticipation of users' needs as well as indicate some possibilities for innovation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giera, Brian; Bukosky, Scott; Lee, Elaine
2018-01-23
Here, quantitative color analysis is performed on videos of high contrast, low power reversible electrophoretic deposition (EPD)-based displays operated under different applied voltages. This analysis is coded in an open-source software, relies on a color differentiation metric, ΔE*00, derived from digital video, and provides an intuitive relationship between the operating conditions of the devices and their performance. Time-dependent ΔE*00 color analysis reveals color relaxation behavior, recoverability for different voltage sequences, and operating conditions that can lead to optimal performance.
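A sketch of the kind of time-dependent color analysis described: the mean CIEDE2000 difference (ΔE00) of each video frame against the first frame, using scikit-image. Frames are assumed to be RGB arrays in [0, 1]; the authors' own open-source tool is not reproduced here.

```python
import numpy as np
from skimage.color import rgb2lab, deltaE_ciede2000

def delta_e_trace(frames):
    """frames: sequence of HxWx3 float RGB arrays in [0, 1]."""
    ref_lab = rgb2lab(frames[0])
    # mean per-pixel CIEDE2000 difference of each frame vs. the first frame
    return [deltaE_ciede2000(ref_lab, rgb2lab(f)).mean() for f in frames]

# toy "video": a gray patch that gradually darkens after a voltage step
frames = [np.full((32, 32, 3), 0.8 - 0.05 * k) for k in range(8)]
for k, de in enumerate(delta_e_trace(frames)):
    print(f"frame {k}: mean Delta E00 = {de:.2f}")
```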
On the stiffness analysis of a cable driven leg exoskeleton.
Sanjeevi, N S S; Vashista, Vineet
2017-07-01
Robotic systems are being used for gait rehabilitation of patients with neurological disorders. These devices are externally powered to apply external forces on human limbs to assist leg motion. Patients walking with these devices adapt their walking pattern in response to the applied forces. The efficacy of a rehabilitation paradigm thus depends on the human-robot interaction. A cable driven leg exoskeleton (CDLE) uses actuated cables to apply external joint torques to the human leg. Cables are lightweight and flexible but can only pull, so a CDLE requires redundant cables. Redundancy in CDLE can be utilized to appropriately tune a robot's performance. In this work, we present the stiffness analysis of CDLE. Different stiffness performance indices are established to study the role of system parameters in improving the human-robot interaction.
NASA Technical Reports Server (NTRS)
Barnett, Alan R.; Widrick, Timothy W.; Ludwiczak, Damian R.
1995-01-01
Solving for the displacements of free-free coupled systems acted upon by static loads is commonly performed throughout the aerospace industry. Many times, these problems are solved using static analysis with inertia relief. This solution technique allows for a free-free static analysis by balancing the applied loads with inertia loads generated by the applied loads. For some engineering applications, the displacements of the free-free coupled system induce additional static loads. Hence, the applied loads are equal to the original loads plus displacement-dependent loads. Solving for the final displacements of such systems is commonly performed using iterative solution techniques. Unfortunately, these techniques can be time-consuming and labor-intensive. Since the coupled system equations for free-free systems with displacement-dependent loads can be written in closed-form, it is advantageous to solve for the displacements in this manner. Implementing closed-form equations in static analysis with inertia relief is analogous to implementing transfer functions in dynamic analysis. Using a MSC/NASTRAN DMAP Alter, displacement-dependent loads have been included in static analysis with inertia relief. Such an Alter has been used successfully to solve efficiently a common aerospace problem typically solved using an iterative technique.
Wavelet versus detrended fluctuation analysis of multifractal structures
NASA Astrophysics Data System (ADS)
Oświȩcimka, Paweł; Kwapień, Jarosław; Drożdż, Stanisław
2006-07-01
We perform a comparative study of applicability of the multifractal detrended fluctuation analysis (MFDFA) and the wavelet transform modulus maxima (WTMM) method in proper detecting of monofractal and multifractal character of data. We quantify the performance of both methods by using different sorts of artificial signals generated according to a few well-known exactly soluble mathematical models: monofractal fractional Brownian motion, bifractal Lévy flights, and different sorts of multifractal binomial cascades. Our results show that in the majority of situations in which one does not know a priori the fractal properties of a process, choosing MFDFA should be recommended. In particular, WTMM gives biased outcomes for the fractional Brownian motion with different values of Hurst exponent, indicating spurious multifractality. In some cases WTMM can also give different results if one applies different wavelets. We do not exclude using WTMM in real data analysis, but it occurs that while one may apply MFDFA in a more automatic fashion, WTMM must be applied with care. In the second part of our work, we perform an analogous analysis on empirical data coming from the American and from the German stock market. For this data both methods detect rich multifractality in terms of broad f(α) , but MFDFA suggests that this multifractality is poorer than in the case of WTMM.
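A condensed sketch of the detrended fluctuation analysis backbone (the q = 2 case that MFDFA generalizes over a range of moments q): divide the signal profile into windows, detrend each window with a polynomial fit, and regress log F(s) on log s to estimate the scaling exponent. The white-noise test signal and scale choices are illustrative.

```python
import numpy as np

def dfa_exponent(x, scales, order=1):
    profile = np.cumsum(x - np.mean(x))          # integrate the signal
    fluct = []
    for s in scales:
        n_win = len(profile) // s
        rms = []
        for w in range(n_win):
            seg = profile[w * s:(w + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)
            rms.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(rms)))      # F(s)
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return slope

x = np.random.default_rng(2).normal(size=2**14)  # white noise -> alpha ~ 0.5
scales = np.unique(np.logspace(1.2, 3.0, 12).astype(int))
print(f"DFA exponent: {dfa_exponent(x, scales):.2f}")
```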
ERIC Educational Resources Information Center
Ibourk, Aomar
2013-01-01
Based on data from international surveys measuring learning (TIMSS), this article focuses on the analysis of the academic performance of Moroccan students. The results of the econometric model show that the students' characteristics, their family environment and school context are key determinants of these performances. The study also shows that the…
NASA Technical Reports Server (NTRS)
Goldstein, Arthur W; Alpert, Sumner; Beede, William; Kovach, Karl
1949-01-01
In order to understand the operation and the interaction of jet-engine components during engine operation and to determine how component characteristics may be used to compute engine performance, a method to analyze and to estimate performance of such engines was devised and applied to the study of the characteristics of a research turbojet engine built for this investigation. An attempt was made to correlate turbine performance obtained from engine experiments with that obtained by the simpler procedure of separately calibrating the turbine with cold air as a driving fluid in order to investigate the applicability of component calibration. The system of analysis was also applied to prediction of the engine and component performance with assumed modifications of the burner and bearing characteristics, to prediction of component and engine operation during engine acceleration, and to estimates of the performance of the engine and the components when the exhaust gas was used to drive a power turbine.
Özdemir, Hatice; Özdoğan, Alper
2018-01-30
The aim of this study was to investigate whether heat treatments applied in different numbers for superstructure porcelain affect the microstructure and mechanical properties of lithium disilicate ceramic (LDC). Eighty disc-shaped specimens were fabricated from IPS e.max Press. Specimens were fired at the heating values of porcelain in different numbers and divided into four groups (n=5). Initial Vickers hardness was measured and X-ray diffraction (XRD) analysis was performed. Different surface treatments were applied, and then Vickers hardness, surface roughness and environmental scanning electron microscopy (ESEM) analyses were performed. Data were analyzed with analysis of variance and the Tukey HSD test (α=0.05). Initial hardness did not differ significantly among groups (p>0.05), but hardness and surface roughness after surface treatments differed significantly (p<0.05). Lithium disilicate (LD) peaks decreased depending on the number of firings. ESEM observations showed that the number of firings and the surface treatments affect the microstructure of LDC.
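A sketch of the statistical workflow named in the abstract (one-way ANOVA followed by a Tukey HSD post hoc test), here applied to synthetic hardness values for four firing groups rather than the study's measurements; scipy and statsmodels are assumed.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(3)
# hypothetical group means (Vickers hardness) for four firing counts
groups = {"1 firing": 580, "3 firings": 575, "5 firings": 560, "7 firings": 550}
data = {g: rng.normal(mu, 12, size=20) for g, mu in groups.items()}

print(f_oneway(*data.values()))                       # overall group effect
values = np.concatenate(list(data.values()))
labels = np.repeat(list(data.keys()), 20)
print(pairwise_tukeyhsd(values, labels, alpha=0.05))  # pairwise comparisons
```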
Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)
2002-01-01
In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: The OMPItrace dynamic instrumentation mechanism, which allows the tracing of processes and threads and the Paraver graphical user interface for inspection and analyses of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable for shared memory
Ecological Regional Analysis Applied to Campus Sustainability Performance
ERIC Educational Resources Information Center
Weber, Shana; Newman, Julie; Hill, Adam
2017-01-01
Purpose: Sustainability performance in higher education is often evaluated at a generalized large scale. It remains unknown to what extent campus efforts address regional sustainability needs. This study begins to address this gap by evaluating trends in performance through the lens of regional environmental characteristics.…
Lunar Exploration Architecture Level Key Drivers and Sensitivities
NASA Technical Reports Server (NTRS)
Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher
2009-01-01
Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option are properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible but still capture those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks between these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.
Dinç, Erdal; Ertekin, Zehra Ceren; Büker, Eda
2016-09-01
Two-way and three-way calibration models were applied to ultra high performance liquid chromatography with photodiode array data with coeluted peaks in the same wavelength and time regions for the simultaneous quantitation of ciprofloxacin and ornidazole in tablets. The chromatographic data cube (tensor) was obtained by recording chromatographic spectra of the standard and sample solutions containing ciprofloxacin and ornidazole with sulfadiazine as an internal standard as a function of time and wavelength. Parallel factor analysis and trilinear partial least squares were used as three-way calibrations for the decomposition of the tensor, whereas three-way unfolded partial least squares was applied as a two-way calibration to the unfolded dataset obtained from the data array of ultra high performance liquid chromatography with photodiode array detection. The validity and ability of two-way and three-way analysis methods were tested by analyzing validation samples: synthetic mixture, interday and intraday samples, and standard addition samples. Results obtained from two-way and three-way calibrations were compared to those provided by traditional ultra high performance liquid chromatography. The proposed methods, parallel factor analysis, trilinear partial least squares, unfolded partial least squares, and traditional ultra high performance liquid chromatography were successfully applied to the quantitative estimation of the solid dosage form containing ciprofloxacin and ornidazole. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
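A sketch of a PARAFAC (parallel factor analysis) decomposition of a chromatographic data cube of the kind described (samples x elution time x wavelength), assuming the tensorly library; the rank-2 synthetic cube below stands in for the ciprofloxacin/ornidazole tensor.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(4)
# rank-2 "two analytes" cube: concentrations x elution profiles x spectra
conc = rng.uniform(0.2, 1.0, size=(10, 2))                  # 10 samples
time_prof = np.exp(-0.5 * ((np.arange(50)[:, None]
                            - np.array([18, 30])) / 4.0) ** 2)  # Gaussian peaks
spectra = rng.uniform(size=(20, 2))                         # 20 wavelengths
cube = tl.cp_to_tensor((np.ones(2), [conc, time_prof, spectra]))
cube = cube + 0.01 * rng.normal(size=cube.shape)            # measurement noise

weights, factors = parafac(tl.tensor(cube), rank=2, n_iter_max=200)
print([f.shape for f in factors])  # recovered concentration/time/spectral modes
```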
Performance criteria for emergency medicine residents: a job analysis.
Blouin, Danielle; Dagnone, Jeffrey Damon
2008-11-01
A major role of admission interviews is to assess a candidate's suitability for a residency program. Structured interviews have greater reliability and validity than do unstructured ones. The development of content for a structured interview is typically based on the dimensions of performance that are perceived as important to succeed in a particular line of work. A formal job analysis is normally conducted to determine these dimensions. The dimensions essential to succeed as an emergency medicine (EM) resident have not yet been studied. We aimed to analyze the work of EM residents to determine these essential dimensions. The "critical incident technique" was used to generate scenarios of poor and excellent resident performance. Two reviewers independently read each scenario and labelled the performance dimensions that were reflected in each. All labels assigned to a particular scenario were pooled and reviewed again until a consensus was reached. Five faculty members (25% of our total faculty) comprised the subject experts. Fifty-one incidents were generated and 50 different labels were applied. Eleven dimensions of performance applied to at least 5 incidents. "Professionalism" was the most valued performance dimension, represented in 56% of the incidents, followed by "self-confidence" (22%), "experience" (20%) and "knowledge" (20%). "Professionalism," "self-confidence," "experience" and "knowledge" were identified as the performance dimensions essential to succeed as an EM resident based on our formal job analysis using the critical incident technique. Performing a formal job analysis may assist training program directors with developing admission interviews.
A comparison of heuristic and model-based clustering methods for dietary pattern analysis.
Greve, Benjamin; Pigeot, Iris; Huybrechts, Inge; Pala, Valeria; Börnhorst, Claudia
2016-02-01
Cluster analysis is widely applied to identify dietary patterns. A new method based on Gaussian mixture models (GMM) seems to be more flexible compared with the commonly applied k-means and Ward's method. In the present paper, these clustering approaches are compared to find the most appropriate one for clustering dietary data. The clustering methods were applied to simulated data sets with different cluster structures to compare their performance knowing the true cluster membership of observations. Furthermore, the three methods were applied to FFQ data assessed in 1791 children participating in the IDEFICS (Identification and Prevention of Dietary- and Lifestyle-Induced Health Effects in Children and Infants) Study to explore their performance in practice. The GMM outperformed the other methods in the simulation study in 72 % up to 100 % of cases, depending on the simulated cluster structure. Comparing the computationally less complex k-means and Ward's methods, the performance of k-means was better in 64-100 % of cases. Applied to real data, all methods identified three similar dietary patterns which may be roughly characterized as a 'non-processed' cluster with a high consumption of fruits, vegetables and wholemeal bread, a 'balanced' cluster with only slight preferences of single foods and a 'junk food' cluster. The simulation study suggests that clustering via GMM should be preferred due to its higher flexibility regarding cluster volume, shape and orientation. The k-means seems to be a good alternative, being easier to use while giving similar results when applied to real data.
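A minimal sketch of the comparison described: k-means, Ward's method, and a Gaussian mixture model applied to the same synthetic data and scored against the known cluster membership with the adjusted Rand index, assuming scikit-learn.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.mixture import GaussianMixture
from sklearn.metrics import adjusted_rand_score

# unequal cluster volumes favor the more flexible GMM
X, y_true = make_blobs(n_samples=600, centers=3,
                       cluster_std=[1.0, 2.5, 0.5], random_state=5)

labels = {
    "k-means": KMeans(n_clusters=3, n_init=10, random_state=5).fit_predict(X),
    "Ward": AgglomerativeClustering(n_clusters=3, linkage="ward").fit_predict(X),
    "GMM": GaussianMixture(n_components=3, random_state=5).fit_predict(X),
}
for name, lab in labels.items():
    print(f"{name}: ARI = {adjusted_rand_score(y_true, lab):.2f}")
```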
Quantitation of flavonoid constituents in citrus fruits.
Kawaii, S; Tomono, Y; Katase, E; Ogawa, K; Yano, M
1999-09-01
Twenty-four flavonoids have been determined in 66 Citrus species and near-citrus relatives, grown in the same field and year, by means of reversed phase high-performance liquid chromatography analysis. Statistical methods have been applied to find relations among the species. The F ratios of 21 flavonoids obtained by applying ANOVA analysis are significant, indicating that a classification of the species using these variables is reasonable to pursue. Principal component analysis revealed that the distributions of Citrus species belonging to different classes were largely in accordance with Tanaka's classification system.
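A sketch of the two statistical steps named in the abstract: per-flavonoid one-way ANOVA F ratios across species groups, followed by principal component analysis of the concentration matrix. The three "citrus classes" and their flavonoid profiles below are synthetic stand-ins.

```python
import numpy as np
from scipy.stats import f_oneway
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
n_per_group, n_flav = 22, 24      # 66 samples x 24 flavonoids, as in the study
X = np.vstack([rng.normal(loc=mu, size=(n_per_group, n_flav))
               for mu in (0.0, 0.8, -0.5)])   # three shifted profiles
groups = np.repeat([0, 1, 2], n_per_group)

for j in range(3):                # F ratio for the first few flavonoids
    F, p = f_oneway(*(X[groups == g, j] for g in (0, 1, 2)))
    print(f"flavonoid {j}: F = {F:.1f}, p = {p:.1e}")

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
print("first two PC scores:", scores[:2].round(2))
```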
Animal research in the Journal of Applied Behavior Analysis.
Edwards, Timothy L; Poling, Alan
2011-01-01
This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say-do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by a chimpanzee, and unsafe trailer entry by horses) in ways that benefited the animals and the people in charge of them, and 1 described the use of trained rats that performed a service to humans (land-mine detection). We suggest that each of these general research areas merits further attention and that the Journal of Applied Behavior Analysis is an appropriate outlet for some of these publications.
AFRL Solid Propellant Laboratory Explosive Siting and Renovation Lessons Learned
2010-07-01
• Applied Engineering Services (AES) Inc. provided consultation/support during the siting review process for each of the Area 1-30A explosive facility site plans. • AES performed a detailed structural, blast, thermal, and fragment hazard analysis to determine the appropriate siting values.
Boot, Walter R; Sumner, Anna; Towne, Tyler J; Rodriguez, Paola; Anders Ericsson, K
2017-04-01
Video games are ideal platforms for the study of skill acquisition for a variety of reasons. However, our understanding of the development of skill and the cognitive representations that support skilled performance can be limited by a focus on game scores. We present an alternative approach to the study of skill acquisition in video games based on the tools of the Expert Performance Approach. Our investigation was motivated by a detailed analysis of the behaviors responsible for the superior performance of one of the highest scoring players of the video game Space Fortress (Towne, Boot, & Ericsson). This analysis revealed how certain behaviors contributed to his exceptional performance. In this study, we recruited a participant for a similar training regimen, but we collected concurrent and retrospective verbal protocol data throughout training. Protocol analysis revealed insights into strategies, errors, mental representations, and shifting game priorities. We argue that these insights into the developing representations that guided skilled performance could only easily have been derived from the tools of the Expert Performance Approach. We propose that the described approach could be applied to understand performance and skill acquisition in many different video games (and other short- to medium-term skill acquisition paradigms) and help reveal mechanisms of transfer from gameplay to other measures of laboratory and real-world performance. Copyright © 2016 Cognitive Science Society, Inc.
NASA Astrophysics Data System (ADS)
Saini, Abhishek; Ahmad, Dilshad; Patra, Karali
2016-04-01
Dielectric elastomers have received a great deal of attention recently as potential materials for many new types of sensors, actuators and future energy generators. When subjected to a high electric field, a dielectric elastomer membrane sandwiched between compliant electrodes undergoes large deformation with a fast response speed. Moreover, dielectric elastomers have high specific energy density, toughness, flexibility and shape processability. Therefore, dielectric elastomer membranes have gained importance for application as micro pumps for microfluidics and biomedical applications. This work intends to extend the electromechanical performance analysis of inflated dielectric elastomer membranes to be applied as micro pumps. Mechanical burst tests and cyclic tests were performed to investigate the mechanical breakdown and hysteresis loss of the dielectric membrane, respectively. A varying high electric field was applied on the inflated membrane under different static pressures to determine the electromechanical behavior and nonplanar actuation of the membrane. These tests were repeated for membranes with different pre-stretch values. Results show that pre-stretching improves the electromechanical performance of the inflated membrane. The present work will help to select suitable parameters for designing micro pumps using dielectric elastomer membranes. However, this material lacks durability in operation; this issue needs to be investigated further for realizing practical micro pumps.
The parameters effect on the structural performance of damaged steel box beam using Taguchi method
NASA Astrophysics Data System (ADS)
El-taly, Boshra A.; Abd El Hameed, Mohamed F.
2018-03-01
In the current study, the influence of notch (opening) parameters and the positions of the applied load on the structural performance of steel box beams up to failure was investigated using the finite element analysis program ANSYS. The Taguchi-based design of experiments technique was used to plan the current study. The plan included 12 steel box beams: three intact beams and nine damaged beams with openings in the beam web. The numerical studies were conducted with varying spacing between the two concentrated point loads (location of applied loads), notch (opening) position, and ratio between depth and width of the notch at a constant notch area. According to the Taguchi analysis, factor X (location of the applied loads) was found to be the highest contributing parameter for the variation of the ultimate load, vertical deformation, shear stresses, and compressive normal stresses.
Applying Behavior Analytic Procedures to Effectively Teach Literacy Skills in the Classroom
ERIC Educational Resources Information Center
Joseph, Laurice M.; Alber-Morgan, Sheila; Neef, Nancy
2016-01-01
The purpose of this article is to discuss the application of behavior analytic procedures for advancing and evaluating methods for teaching literacy skills in the classroom. Particularly, applied behavior analysis has contributed substantially to examining the relationship between teacher behavior and student literacy performance. Teacher…
Jetfighter: An Experiential Value Chain Exercise
ERIC Educational Resources Information Center
Sheehan, Norman T.; Gamble, Edward N.
2010-01-01
Value chain analysis is widely taught in business schools and applied by practitioners to improve business performance. Despite its ubiquity, many students struggle to understand and apply value chain concepts in practice. JetFighter uses a complex manufacturing process (making intricate paper planes) to provide students an opportunity to enhance…
USDA-ARS?s Scientific Manuscript database
Computer Monte-Carlo (MC) simulations (Geant4) of neutron propagation and acquisition of gamma response from soil samples were applied to evaluate INS system performance characteristics [sensitivity, minimal detectable level (MDL)] for soil carbon measurement. The INS system model with best performanc...
Data-driven advice for applying machine learning to bioinformatics problems
Olson, Randal S.; La Cava, William; Mustahsan, Zairah; Varik, Akshay; Moore, Jason H.
2017-01-01
As the bioinformatics field grows, it must keep pace not only with new data but with new algorithms. Here we contribute a thorough analysis of 13 state-of-the-art, commonly used machine learning algorithms on a set of 165 publicly available classification problems in order to provide data-driven algorithm recommendations to current researchers. We present a number of statistical and visual comparisons of algorithm performance and quantify the effect of model selection and algorithm tuning for each algorithm and dataset. The analysis culminates in the recommendation of five algorithms with hyperparameters that maximize classifier performance across the tested problems, as well as general guidelines for applying machine learning to supervised classification problems. PMID:29218881
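A scaled-down sketch of this benchmarking protocol: several classifiers evaluated with the same cross-validation on the same problem. One small dataset and three algorithms stand in for the paper's 165 problems and 13 algorithms; scikit-learn is assumed.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

X, y = load_breast_cancer(return_X_y=True)
models = {
    "logistic regression": make_pipeline(StandardScaler(),
                                         LogisticRegression(max_iter=5000)),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=7),
    "gradient boosting": GradientBoostingClassifier(random_state=7),
}
# identical 5-fold CV for every algorithm gives a fair comparison
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="balanced_accuracy")
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```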
NASA Astrophysics Data System (ADS)
Geng, Li; Feng, Jiantao; Sun, Quanmei; Liu, Jing; Hua, Wenda; Li, Jing; Ao, Zhuo; You, Ke; Guo, Yanli; Liao, Fulong; Zhang, Youyi; Guo, Hongyan; Han, Jinsong; Xiong, Guangwu; Zhang, Lufang; Han, Dong
2015-09-01
Applying an atomic force microscope, we performed a nanomechanical analysis of morphologically normal cervical squamous cells (MNSCs), which are commonly used in cervical screening. Results showed that nanomechanical parameters of MNSCs correlate well with cervical malignancy, and may have potential in cancer screening to provide early diagnosis.
NASA Astrophysics Data System (ADS)
Jiang, Yao; Li, Tie-Min; Wang, Li-Ping
2015-09-01
This paper investigates the stiffness modeling of compliant parallel mechanism (CPM) based on the matrix method. First, the general compliance matrix of a serial flexure chain is derived. The stiffness modeling of CPMs is next discussed in detail, considering the relative positions of the applied load and the selected displacement output point. The derived stiffness models have simple and explicit forms, and the input, output, and coupling stiffness matrices of the CPM can easily be obtained. The proposed analytical model is applied to the stiffness modeling and performance analysis of an XY parallel compliant stage with input and output decoupling characteristics. Then, the key geometrical parameters of the stage are optimized to obtain the minimum input decoupling degree. Finally, a prototype of the compliant stage is developed and its input axial stiffness, coupling characteristics, positioning resolution, and circular contouring performance are tested. The results demonstrate the excellent performance of the compliant stage and verify the effectiveness of the proposed theoretical model. The general stiffness models provided in this paper will be helpful for performance analysis, especially in determining coupling characteristics, and the structure optimization of the CPM.
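A sketch of the matrix-method building block the paper starts from: the compliance of a serial flexure chain at an output point is the sum of each flexure's local compliance transformed to the output frame. The planar (3x3: fx, fy, moment) case with axis-aligned flexure frames is shown, and the flexure compliances and locations are invented for illustration.

```python
import numpy as np

def adjoint_planar(rx, ry):
    """Twist transformation from a frame offset by (rx, ry) to the output frame."""
    return np.array([[1.0, 0.0, -ry],
                     [0.0, 1.0,  rx],
                     [0.0, 0.0,  1.0]])

def chain_compliance(flexures):
    """flexures: list of (C_local 3x3, (rx, ry) offset from flexure to output)."""
    C = np.zeros((3, 3))
    for C_local, (rx, ry) in flexures:
        Ad = adjoint_planar(rx, ry)
        C += Ad @ C_local @ Ad.T          # compliances of a serial chain add
    return C

# two identical notch-type flexures, compliant mainly in rotation
C_hinge = np.diag([1e-8, 1e-8, 1e-3])     # m/N, m/N, rad/(N*m)
C = chain_compliance([(C_hinge, (0.00, 0.05)), (C_hinge, (0.00, 0.10))])
K = np.linalg.inv(C)                      # stiffness at the output point
print(np.array_str(K, precision=2))
```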
Separation analysis, a tool for analyzing multigrid algorithms
NASA Technical Reports Server (NTRS)
Costiner, Sorin; Taasan, Shlomo
1995-01-01
The separation of vectors by multigrid (MG) algorithms is applied to the study of convergence and to the prediction of the performance of MG algorithms. The separation operator for a two-level cycle algorithm is derived. It is used to analyze the efficiency of the cycle when mixing of eigenvectors occurs. In particular cases the separation analysis reduces to Fourier type analysis. The separation operator of a two-level cycle for a Schrödinger eigenvalue problem is derived and analyzed in a Fourier basis. Separation analysis gives information on how to choose relaxations and inter-level transfers to improve performance. Separation analysis is a tool for analyzing and designing algorithms, and for optimizing their performance.
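For context, a bare-bones two-level cycle for the 1D Poisson problem, the kind of algorithm whose vector-separation properties the paper analyzes: pre-smooth with weighted Jacobi, solve the residual equation on the coarse grid, correct, and post-smooth. Grid sizes and iteration counts are arbitrary illustrative choices.

```python
import numpy as np

def poisson_matrix(n, h):
    return (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
            - np.diag(np.ones(n - 1), -1)) / h**2

def weighted_jacobi(A, u, f, sweeps=3, w=2/3):
    D = np.diag(A)
    for _ in range(sweeps):
        u = u + w * (f - A @ u) / D
    return u

def two_grid(A_h, A_H, u, f, restrict, prolong):
    u = weighted_jacobi(A_h, u, f)          # pre-smoothing
    r_H = restrict @ (f - A_h @ u)          # restrict the residual
    e_H = np.linalg.solve(A_H, r_H)         # coarse-grid solve
    u = u + prolong @ e_H                   # coarse-grid correction
    return weighted_jacobi(A_h, u, f)       # post-smoothing

n, N = 63, 31
h, H = 1.0 / (n + 1), 2.0 / (n + 1)
A_h, A_H = poisson_matrix(n, h), poisson_matrix(N, H)
restrict = np.zeros((N, n))                 # full weighting
for i in range(N):
    restrict[i, 2 * i:2 * i + 3] = [0.25, 0.5, 0.25]
prolong = 2.0 * restrict.T                  # linear interpolation

f, u = np.ones(n), np.zeros(n)
for k in range(5):
    u = two_grid(A_h, A_H, u, f, restrict, prolong)
    print(f"cycle {k}: residual norm = {np.linalg.norm(f - A_h @ u):.2e}")
```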
A CAD Approach to Integrating NDE With Finite Element
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali; Downey, James; Ghosn, Louis J.; Baaklini, George Y.
2004-01-01
Nondestructive evaluation (NDE) is one of several technologies applied at NASA Glenn Research Center to determine atypical deformities, cracks, and other anomalies experienced by structural components. NDE consists of applying high-quality imaging techniques (such as x-ray imaging and computed tomography (CT)) to discover hidden manufactured flaws in a structure. Efforts are in progress to integrate NDE with the finite element (FE) computational method to perform detailed structural analysis of a given component. This report presents the core outlines for an in-house technical procedure that incorporates this combined NDE-FE interrelation. An example is presented to demonstrate the applicability of this analytical procedure. FE analysis of a test specimen is performed, and the resulting von Mises stresses and the stress concentrations near the anomalies are observed, which indicates the fidelity of the procedure. Additional information elaborating on the steps needed to perform such an analysis is clearly presented in the form of mini step-by-step guidelines.
Examination of Spectral Transformations on Spectral Mixture Analysis
NASA Astrophysics Data System (ADS)
Deng, Y.; Wu, C.
2018-04-01
While many spectral transformation techniques have been applied in spectral mixture analysis (SMA), few studies have examined their necessity and applicability. This paper focused on exploring the difference between spectrally transformed schemes and the untransformed scheme to find out which transformed scheme performed better in SMA. In particular, nine spectrally transformed schemes as well as the untransformed scheme were examined in two study areas. Each transformed scheme was tested 100 times using different endmember classes' spectra under the endmember model of vegetation-high albedo impervious surface area-low albedo impervious surface area-soil (V-ISAh-ISAl-S). Performance of each scheme was assessed based on mean absolute error (MAE). A statistical analysis technique, the paired-samples t test, was applied to test the significance of the difference in mean MAEs between transformed and untransformed schemes. Results demonstrated that only NSMA could exceed the untransformed scheme in all study areas. Some transformed schemes showed unstable performance, since they outperformed the untransformed scheme in one area but weakened the SMA result in another region.
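A sketch of the core unmixing step in SMA: estimate per-pixel endmember fractions under non-negativity, with the sum-to-one constraint imposed by the common trick of appending a heavily weighted row of ones before a non-negative least-squares solve. The endmember spectra below are random stand-ins for V-ISAh-ISAl-S.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(8)
n_bands, n_end = 6, 4
E = rng.uniform(0.05, 0.9, size=(n_bands, n_end))   # endmember spectra (columns)
true_f = np.array([0.5, 0.2, 0.2, 0.1])             # true fractions
pixel = E @ true_f + 0.005 * rng.normal(size=n_bands)

w = 100.0                                           # weight on the sum-to-one row
E_aug = np.vstack([E, w * np.ones(n_end)])
pix_aug = np.append(pixel, w)
fractions, _ = nnls(E_aug, pix_aug)                 # non-negative least squares
print("estimated fractions:", fractions.round(3))
print("MAE:", np.abs(fractions - true_f).mean().round(4))
```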
Rodriguez-Nogales, J M; Garcia, M C; Marina, M L
2006-02-03
A perfusion reversed-phase high performance liquid chromatography (RP-HPLC) method has been designed to allow rapid (3.4 min) separations of maize proteins with high resolution. Several factors, such as extraction conditions, temperature, detection wavelength and type and concentration of ion-pairing agent, were optimised. A fine optimisation of the gradient elution was also performed by applying experimental design. Commercial maize products for human consumption (flours, precooked flours, fried snacks and extruded snacks) were characterised for the first time by perfusion RP-HPLC, and their chromatographic profiles allowed a differentiation among products reflecting the different technological processes used for their preparation. Furthermore, applying discriminant analysis made it possible to group the samples according to the technological process undergone by the maize products, obtaining a good prediction in 92% of the samples.
Identification of human operator performance models utilizing time series analysis
NASA Technical Reports Server (NTRS)
Holden, F. M.; Shinners, S. M.
1973-01-01
The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.
NAA For Human Serum Analysis: Comparison With Conventional Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira, Laura C.; Zamboni, Cibele B.; Medeiros, Jose A. G.
2010-08-04
Instrumental and Comparator methods of Neutron Activation Analysis (NAA) were applied to determine elements of clinical relevance in serum samples of an adult population (Sao Paulo city, Brazil). A comparison with the conventional analyses (Colorimetric for calcium, Titrimetric for chlorine, and Ion Specific Electrode for sodium and potassium determination) was also performed, permitting a discussion of the performance of NAA methods for clinical chemistry research.
NASA Astrophysics Data System (ADS)
Fernandez, Carlos; Platero, Carlos; Campoy, Pascual; Aracil, Rafael
1994-11-01
This paper describes some texture-based techniques that can be applied to quality assessment of continuously produced flat products (metal strips, wooden surfaces, cork, textile products, ...). Since the most difficult task is that of inspecting for product appearance, human-like inspection ability is required. A feature common to all these products is the presence of non-deterministic texture on their surfaces. Two main subjects are discussed: statistical techniques for both surface finishing determination and surface defect analysis, as well as real-time implementation for on-line inspection in high-speed applications. For surface finishing determination, a Gray Level Difference technique is presented that performs over low-resolution images, that is, non-zoomed images. Defect analysis is performed by means of statistical texture analysis over defective portions of the surface. On-line implementation is accomplished by means of neural networks. When a defect arises, textural analysis is applied, resulting in a data vector that acts as the input of a neural net previously trained in a supervised way. This approach aims to reach on-line performance in automated visual inspection applications when texture is present on flat product surfaces.
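A sketch of a gray-level difference texture descriptor of the kind named for surface-finish determination: histogram the absolute gray-level differences at a fixed pixel displacement and derive contrast and entropy features. The image below is synthetic noise rather than product data, and this minimal feature pair is an illustrative reduction of the paper's statistics.

```python
import numpy as np

def gld_features(img, dx=1, dy=0, levels=256):
    """Gray-level difference features for a non-negative displacement (dx, dy)."""
    h, w = img.shape
    a = img[:h - dy, :w - dx].astype(int)
    b = img[dy:, dx:].astype(int)
    diffs = np.abs(a - b)                    # per-pixel gray-level differences
    hist = np.bincount(diffs.ravel(), minlength=levels) / diffs.size
    d = np.arange(hist.size)
    contrast = float(np.sum(d ** 2 * hist))  # second moment of the GLD histogram
    p = hist[hist > 0]
    entropy = float(-np.sum(p * np.log2(p)))
    return contrast, entropy

img = np.random.default_rng(9).uniform(0, 255, (128, 128)).astype(np.uint8)
print("contrast=%.1f entropy=%.2f" % gld_features(img, dx=1))
```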
A crash course on data analysis in asteroseismology
NASA Astrophysics Data System (ADS)
Appourchaux, Thierry
2014-02-01
In this course, I try to provide a few basics required for performing data analysis in asteroseismology. First, I address how one can properly treat time series: the sampling, the filtering effect, the use of the Fourier transform, and the associated statistics. Second, I address how one can apply statistics for decision making and for parameter estimation, in either a frequentist or a Bayesian framework. Last, I review how these basic principles have been applied (or not) in asteroseismology.
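As a taste of the first topic (treating a time series and its Fourier transform with the right statistics), a minimal sketch of a one-sided power spectrum of an evenly sampled series follows. The cadence and the 3 mHz "mode" are invented; the statistical point the course develops is that each raw spectral bin is chi-squared distributed with 2 degrees of freedom around the true spectrum.

```python
import numpy as np

dt = 60.0                                  # 60 s cadence (illustrative)
t = np.arange(0, 10 * 86400, dt)           # 10 days of data
nu0 = 3.0e-3                               # a hypothetical 3 mHz p-mode
rng = np.random.default_rng(10)
x = 2.0 * np.sin(2 * np.pi * nu0 * t) + rng.normal(size=t.size)

freq = np.fft.rfftfreq(t.size, dt)                        # Hz, up to Nyquist
power = np.abs(np.fft.rfft(x)) ** 2 * 2.0 * dt / t.size   # one-sided PSD
peak = np.argmax(power[1:]) + 1                           # skip the DC bin
print(f"peak at {freq[peak] * 1e3:.2f} mHz")
```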
NASA Astrophysics Data System (ADS)
Tavakoli Taba, Seyedamir; Hossain, Liaquat; Heard, Robert; Brennan, Patrick; Lee, Warwick; Lewis, Sarah
2017-03-01
Rationale and objectives: Observer performance has been widely studied through examining the characteristics of individuals. Applying a systems perspective to understanding the system's output, however, requires a study of the interactions between observers. This research explains a mixed methods approach that applies a social network analysis (SNA) together with the more traditional approach of examining personal/individual characteristics in understanding observer performance in mammography. Materials and Methods: Using social network theories and measures to understand observer performance, we designed a social networks survey instrument for collecting personal and network data about observers involved in mammography performance studies. We present the results of a study by our group in which 31 Australian breast radiologists reviewed 60 mammographic cases (20 abnormal and 40 normal) and then completed an online questionnaire about their social networks and personal characteristics. A jackknife free response operating characteristic (JAFROC) method was used to measure the performance of the radiologists. JAFROC was tested against various personal and network measures to verify the theoretical model. Results: The results from this study suggest a strong association between social networks and observer performance for Australian radiologists. Network factors accounted for 48% of the variance in observer performance, in comparison to 15.5% for personal characteristics for this study group. Conclusion: This study suggests a strong new direction for research into improving observer performance. Future studies of observer performance should consider social networks' influence as part of their research paradigm, with equal or greater vigour than traditional constructs of personal characteristics.
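A sketch of the mixed-methods idea: compute network measures for each observer with networkx and relate them to a performance score. The random network, synthetic scores, and the choice of degree/betweenness are illustrative; the study's actual survey instrument and JAFROC scores are not reproduced.

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(11)
G = nx.gnp_random_graph(31, 0.15, seed=11)          # 31 "radiologists"
degree = np.array([d for _, d in G.degree()])
betweenness = np.array(list(nx.betweenness_centrality(G).values()))

# synthetic performance partially driven by network position
perf = 0.6 + 0.02 * degree + rng.normal(scale=0.03, size=31)
for name, v in [("degree", degree), ("betweenness", betweenness)]:
    r = np.corrcoef(v, perf)[0, 1]
    print(f"{name} vs performance: r = {r:.2f}")
```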
System Sensitivity Analysis Applied to the Conceptual Design of a Dual-Fuel Rocket SSTO
NASA Technical Reports Server (NTRS)
Olds, John R.
1994-01-01
This paper reports the results of initial efforts to apply the System Sensitivity Analysis (SSA) optimization method to the conceptual design of a single-stage-to-orbit (SSTO) launch vehicle. SSA is an efficient, calculus-based MDO technique for generating sensitivity derivatives in a highly multidisciplinary design environment. The method has been successfully applied to conceptual aircraft design and has been proven to have advantages over traditional direct optimization methods. The method is applied to the optimization of an advanced, piloted SSTO design similar to vehicles currently being analyzed by NASA as possible replacements for the Space Shuttle. Powered by a derivative of the Russian RD-701 rocket engine, the vehicle employs a combination of hydrocarbon, hydrogen, and oxygen propellants. Three primary disciplines are included in the design - propulsion, performance, and weights & sizing. A complete, converged vehicle analysis depends on the use of three standalone conceptual analysis computer codes. Efforts to minimize vehicle dry (empty) weight are reported in this paper. The problem consists of six system-level design variables and one system-level constraint. Using SSA in a 'manual' fashion to generate gradient information, six system-level iterations were performed from each of two different starting points. The results showed a good pattern of convergence for both starting points. A discussion of the advantages and disadvantages of the method, possible areas of improvement, and future work is included.
Mendes, Paula; Nunes, Luis Miguel; Teixeira, Margarida Ribau
2014-09-01
This article demonstrates how decision-makers can be guided in the process of defining performance target values in the balanced scorecard system. We apply a method based on sensitivity analysis with Monte Carlo simulation to the municipal solid waste management system in Loulé Municipality (Portugal). The method includes two steps: sensitivity analysis of performance indicators to identify those performance indicators with the highest impact on the balanced scorecard model outcomes; and sensitivity analysis of the target values for the previously identified performance indicators. Sensitivity analysis shows that four strategic objectives (IPP1: Comply with the national waste strategy; IPP4: Reduce nonrenewable resources and greenhouse gases; IPP5: Optimize the life-cycle of waste; and FP1: Meet and optimize the budget) alone contribute 99.7% of the variability in overall balanced scorecard value. Thus, these strategic objectives had a much stronger impact on the estimated balanced scorecard outcome than did others, with the IPP1 and the IPP4 accounting for over 55% and 22% of the variance in overall balanced scorecard value, respectively. The remaining performance indicators contribute only marginally. In addition, a change in the value of a single indicator's target value made the overall balanced scorecard value change by as much as 18%. This may lead to involuntarily biased decisions by organizations regarding performance target-setting, if not prevented with the help of methods such as that proposed and applied in this study.
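A hedged sketch of the first step under simplifying assumptions (a linear weighted scorecard, uniform uncertainty ranges, and a correlation-ratio style variance decomposition); the weights and ranges below are invented for illustration, not the Loulé model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
# Hypothetical: four indicators with assumed weights and target uncertainty ranges.
weights = np.array([0.4, 0.25, 0.2, 0.15])
samples = rng.uniform(low=[0.5, 0.4, 0.3, 0.6], high=[1.0, 0.9, 0.8, 1.0], size=(n, 4))
bsc = samples @ weights  # overall balanced scorecard value per Monte Carlo draw

# First-order variance contribution of each indicator (squared correlation).
for i, name in enumerate(["IPP1", "IPP4", "IPP5", "FP1"]):
    r = np.corrcoef(samples[:, i], bsc)[0, 1]
    print(f"{name}: ~{100 * r**2:.1f}% of output variance")
```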
Chiu, Chi-yang; Jung, Jeesun; Chen, Wei; Weeks, Daniel E; Ren, Haobo; Boehnke, Michael; Amos, Christopher I; Liu, Aiyi; Mills, James L; Ting Lee, Mei-ling; Xiong, Momiao; Fan, Ruzong
2017-01-01
To analyze next-generation sequencing data, multivariate functional linear models are developed for a meta-analysis of multiple studies to connect genetic variant data to multiple quantitative traits adjusting for covariates. The goal is to take advantage of both meta-analysis and pleiotropic analysis in order to improve power and to carry out a unified association analysis of multiple studies and multiple traits of complex disorders. Three types of approximate F-distributions based on the Pillai–Bartlett trace, Hotelling–Lawley trace, and Wilks's lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants. Simulation analysis is performed to evaluate false-positive rates and power of the proposed tests. The proposed methods are applied to analyze lipid traits in eight European cohorts. It is shown that it is more advantageous to perform multivariate analysis than univariate analysis in general, and it is more advantageous to perform meta-analysis of multiple studies than to analyze the individual studies separately. The proposed models require individual observations. The value of the current paper can be seen for at least two reasons: (a) the proposed methods can be applied to studies that have individual genotype data; (b) the proposed methods can be used as a criterion for future work that uses summary statistics to build test statistics to meta-analyze the data. PMID:28000696
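For readers who want to see the three multivariate test statistics on toy data, statsmodels' MANOVA reports Wilks' lambda, Pillai's trace, and the Hotelling-Lawley trace; this is a plain MANOVA sketch on invented genotype dosages and traits, not the functional linear model of the paper:

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(1)
n = 200
# Toy data: two quantitative traits and two genetic-variant dosages (hypothetical).
df = pd.DataFrame({
    "g1": rng.integers(0, 3, n),
    "g2": rng.integers(0, 3, n),
})
df["trait1"] = 0.3 * df.g1 + rng.normal(size=n)
df["trait2"] = 0.2 * df.g1 - 0.1 * df.g2 + rng.normal(size=n)

# Prints Wilks' lambda, Pillai's trace, and Hotelling-Lawley trace F-approximations.
print(MANOVA.from_formula("trait1 + trait2 ~ g1 + g2", data=df).mv_test())
```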
Applying simulation to optimize plastic molded optical parts
NASA Astrophysics Data System (ADS)
Jaworski, Matthew; Bakharev, Alexander; Costa, Franco; Friedl, Chris
2012-10-01
Optical injection molded parts are used in many different industries including electronics, consumer, medical and automotive due to their cost and performance advantages compared to alternative materials such as glass. The injection molding process, however, induces elastic (residual stress) and viscoelastic (flow orientation stress) deformation in the molded article, which alters the material's refractive index, making it anisotropic in different directions. Being able to predict and correct optical performance issues associated with birefringence early in the design phase is a significant competitive advantage. This paper reviews how to apply simulation analysis of the entire molding process to optimize manufacturability and part performance.
Wear behavior of AA 5083/SiC nano-particle metal matrix composite: Statistical analysis
NASA Astrophysics Data System (ADS)
Hussain Idrisi, Amir; Ismail Mourad, Abdel-Hamid; Thekkuden, Dinu Thomas; Christy, John Victor
2018-03-01
This paper reports a statistical analysis of the wear characteristics of AA5083/SiC nanocomposite. Aluminum matrix composites with different wt% (0%, 1% and 2%) of SiC nanoparticles were fabricated by the stir casting route. The developed composites were used to manufacture spur gears, on which the study was conducted. A specially designed test rig was used to test the wear performance of the gears. Wear was investigated under different conditions of applied load (10 N, 20 N, and 30 N) and operation time (30 min, 60 min, 90 min, and 120 min). The analysis was carried out at room temperature at a constant speed of 1450 rpm. The wear parameters were optimized using Taguchi's method; in this statistical approach, an L27 orthogonal array was selected for the analysis of the output. Furthermore, analysis of variance (ANOVA) was used to investigate the influence of applied load, operation time and SiC wt% on wear behavior. Wear resistance was analyzed by selecting the "smaller is better" characteristic as the objective of the model. From this research, it is observed that operation time and SiC wt% have the most significant effect on the wear performance, followed by the applied load.
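The "smaller is better" criterion corresponds to the Taguchi signal-to-noise ratio S/N = -10 log10((1/n) Σ y_i²); a small sketch with invented wear readings:

```python
import numpy as np

def sn_smaller_is_better(y):
    """Taguchi signal-to-noise ratio for a 'smaller is better' response."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Hypothetical wear measurements (mg) for three SiC levels at one load/time setting.
for level, wear in {"0%": [4.1, 3.9, 4.3], "1%": [3.2, 3.0, 3.4], "2%": [2.6, 2.8, 2.5]}.items():
    print(level, round(sn_smaller_is_better(wear), 2), "dB")
```

Levels with the highest S/N ratio would be preferred, since a higher ratio here corresponds to consistently lower wear.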
Rater Severity in Large-Scale Assessment: Is It Invariant?
ERIC Educational Resources Information Center
McQueen, Joy; Congdon, Peter J.
A study was conducted to investigate the stability of rater severity over an extended rating period. Multifaceted Rasch analysis was applied to ratings of writing performances of 8,285 primary school (elementary) students. Each performance was rated on two performance dimensions by two trained raters over a period of 7 rating days. Performances…
Evaluating evaluation forms form.
Smith, Roger P
2004-02-01
To provide a tool for evaluating evaluation forms. A new form has been developed and tested on itself and a sample of evaluation forms obtained from the graduate medical education offices of several local universities. Additional forms from hospital administration were also subjected to analysis. The new form performed well when applied to itself. The form performed equally well when applied to the other (subject) forms, although their scores were embarrassingly poor. A new form for evaluating evaluation forms is needed, useful, and now available.
Large-scale structural analysis: The structural analyst, the CSM Testbed and the NAS System
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Mccleary, Susan L.; Macy, Steven C.; Aminpour, Mohammad A.
1989-01-01
The Computational Structural Mechanics (CSM) activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM testbed methods development environment is presented and some numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.
Analysis Tools for CFD Multigrid Solvers
NASA Technical Reports Server (NTRS)
Mineck, Raymond E.; Thomas, James L.; Diskin, Boris
2004-01-01
Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.
NASA Astrophysics Data System (ADS)
Badrzadeh, Honey; Sarukkalige, Ranjan; Jayawardena, A. W.
2013-12-01
Discrete wavelet transform was applied to decompose ANN and ANFIS inputs. A novel approach of WNF with subtractive clustering was applied for flow forecasting. Forecasting was performed 1-5 steps ahead using multivariate inputs. Forecasting accuracy of peak values and at longer lead times was significantly improved.
Dual keel Space Station payload pointing system design and analysis feasibility study
NASA Technical Reports Server (NTRS)
Smagala, Tom; Class, Brian F.; Bauer, Frank H.; Lebair, Deborah A.
1988-01-01
A Space Station attached Payload Pointing System (PPS) has been designed and analyzed. The PPS is responsible for maintaining fixed payload pointing in the presence of disturbances applied to the Space Station. The payload considered in this analysis is the Solar Optical Telescope. System performance is evaluated via digital time simulations by applying various disturbance forces to the Space Station. The PPS meets the Space Station articulated pointing requirement for all disturbances except Shuttle docking and some centrifuge cases.
Arched-outer-race ball-bearing analysis considering centrifugal forces
NASA Technical Reports Server (NTRS)
Hamrock, B. J.; Anderson, W. J.
1972-01-01
A first-order thrust load analysis that considers centrifugal forces but which neglects gyroscopics, elastohydrodynamics, and thermal effects was performed. The analysis was applied to a 150-mm-bore angular-contact ball bearing. Fatigue life, contact loads, and contact angles are shown for conventional and arched bearings. The results indicate that an arched bearing is highly desirable for high-speed applications. In particular, at an applied load of 4448 N (1000 lb) and a DN value of 3 million (20,000 rpm) the arched bearing shows an improvement in life of 306 percent over that of a conventional bearing.
Applying Statistics in the Undergraduate Chemistry Laboratory: Experiments with Food Dyes.
ERIC Educational Resources Information Center
Thomasson, Kathryn; Lofthus-Merschman, Sheila; Humbert, Michelle; Kulevsky, Norman
1998-01-01
Describes several experiments to teach different aspects of the statistical analysis of data using household substances and a simple analysis technique. Each experiment can be performed in three hours. Students learn about treatment of spurious data, application of a pooled variance, linear least-squares fitting, and simultaneous analysis of dyes…
NASA Astrophysics Data System (ADS)
Trisutomo, S.
2017-07-01
Importance-Performance Analysis (IPA) has been widely applied in many cases. In this research, IPA was applied to measure perceptions of coastal tourism objects and the possibility of developing them into coastal cruise tourism in Makassar. Three objects, i.e. the Akkarena recreational site, the Losari public waterfront space, and the Paotere traditional Phinisi ship port, were selected and assessed visually from the water by a group of purposively selected resource persons. The importance and performance of 10 attributes of each site were scored using a Likert scale from 1 to 5. Data were processed with SPSS-21, yielding a Cartesian graph in which the scores were divided into four quadrants: Quadrant I, concentrate here; Quadrant II, keep up the good work; Quadrant III, low priority; and Quadrant IV, possible overkill. The attributes in each quadrant can be considered the platform for preliminary planning of a coastal cruise tour in Makassar.
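A compact sketch of the IPA quadrant assignment, using grand means of the importance and performance scores as the axis cut-offs; the attribute names and scores below are hypothetical:

```python
import numpy as np

# Hypothetical mean Likert scores (1-5) per attribute: (importance, performance).
attrs = {"access": (4.5, 3.1), "scenery": (4.2, 4.4), "safety": (4.8, 2.9),
         "facilities": (3.0, 2.5), "price": (2.8, 4.0)}

imp = np.array([v[0] for v in attrs.values()])
perf = np.array([v[1] for v in attrs.values()])
imp_cut, perf_cut = imp.mean(), perf.mean()  # grand means split the IPA grid

for name, (i, p) in attrs.items():
    if i >= imp_cut and p < perf_cut:
        quadrant = "I: concentrate here"
    elif i >= imp_cut and p >= perf_cut:
        quadrant = "II: keep up the good work"
    elif i < imp_cut and p < perf_cut:
        quadrant = "III: low priority"
    else:
        quadrant = "IV: possible overkill"
    print(name, "->", quadrant)
```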
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubin, H.; Bemporad, G.A.
This manuscript concerns the possible improvement of conventional solar pond (CSP) performance by applying a multiselective injection and withdrawal procedure. The authors apply the term advanced solar pond (ASP) for a solar pond (SP) in which such a procedure is applied. The multiselective injection and withdrawal procedure creates in the SP a stratified thermal layer, namely a flowing layer which is subject to salinity and temperature stratification. This phenomenon is associated with a reduction of heat losses into the atmosphere and an increase of the temperature of the fluid layer adjacent to the SP bottom. In the framework of this study, transport phenomena in the ASP are analyzed and simulated by applying a simplified mathematical model. The analysis and simulations indicate that the multiselective injection and withdrawal procedure may significantly improve the performance of the SP.
ERIC Educational Resources Information Center
Montoneri, Bernard
2013-01-01
Effective teaching performance is a crucial factor contributing to students' learning improvement. Students' ratings of teachers at the end of each semester can indirectly provide valuable information about teachers' performance. This paper selects classes of freshmen students taking a course of English in a university of Taiwan from the academic…
ERIC Educational Resources Information Center
Lee, Jaekyung
2010-01-01
This study examines potential consequences of the discrepancies between national and state performance standards for school funding in Kentucky and Maine. Applying the successful schools observation method and cost function analysis method to integrated data-sets that match schools' eight-grade mathematics test performance measures to district…
Reliability analysis of composite structures
NASA Technical Reports Server (NTRS)
Kan, Han-Pin
1992-01-01
A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
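The core numerical-integration step can be illustrated with a stress-strength model in which reliability is P(strength > load); the normal distributions and parameters below are assumptions for illustration, not the report's data:

```python
from scipy import integrate, stats

# Hypothetical stress-strength model: applied load and laminate strength as normals.
load = stats.norm(loc=100.0, scale=15.0)      # applied stress (MPa), assumed scatter
strength = stats.norm(loc=160.0, scale=20.0)  # material strength (MPa), assumed scatter

# Reliability = P(strength > load) = integral of f_load(x) * SF_strength(x) dx.
reliability, _ = integrate.quad(lambda x: load.pdf(x) * strength.sf(x), 0.0, 400.0)
print(f"estimated reliability: {reliability:.5f}")
```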
Liu, Xiaona; Zhang, Qiao; Wu, Zhisheng; Shi, Xinyuan; Zhao, Na; Qiao, Yanjiang
2015-01-01
Laser-induced breakdown spectroscopy (LIBS) was applied to perform a rapid elemental analysis and provenance study of Blumea balsamifera DC. Principal component analysis (PCA) and partial least squares discriminant analysis (PLS-DA) were implemented to exploit the multivariate nature of the LIBS data. Scores and loadings of computed principal components visually illustrated the differing spectral data. The PLS-DA algorithm showed good classification performance. The PLS-DA model using complete spectra as input variables had similar discrimination performance to using selected spectral lines as input variables. The down-selection of spectral lines was specifically focused on the major elements of B. balsamifera samples. Results indicated that LIBS could be used to rapidly analyze elements and to perform provenance study of B. balsamifera. PMID:25558999
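A scikit-learn sketch of the PCA and PLS-DA steps on synthetic "spectra" (PLS-DA implemented, as is common, as PLS regression against one-hot class labels); the dimensions and the injected class difference are invented:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
# Hypothetical LIBS data: 60 samples x 500 spectral channels, 3 provenance classes.
X = rng.normal(size=(60, 500))
y = np.repeat([0, 1, 2], 20)
X[y == 1, 100] += 2.0  # pretend one emission line differs between origins

scores = PCA(n_components=2).fit_transform(X)  # PC scores for visual clustering
print("PC score matrix:", scores.shape)

# PLS-DA: PLS regression against one-hot class labels; predict via argmax.
Y = np.eye(3)[y]
pls = PLSRegression(n_components=5).fit(X, Y)
pred = pls.predict(X).argmax(axis=1)
print("training accuracy:", (pred == y).mean())
```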
A Feature Fusion Based Forecasting Model for Financial Time Series
Guo, Zhiqiang; Wang, Huaiqing; Liu, Quan; Yang, Jie
2014-01-01
Predicting the stock market has become an increasingly interesting research area for both researchers and investors, and many prediction models have been proposed. In these models, feature selection techniques are used to pre-process the raw data and remove noise. In this paper, a prediction model is constructed to forecast stock market behavior with the aid of independent component analysis, canonical correlation analysis, and a support vector machine. First, two types of features are extracted from the historical closing prices and 39 technical variables obtained by independent component analysis. Second, a canonical correlation analysis method is utilized to combine the two types of features and extract intrinsic features to improve the performance of the prediction model. Finally, a support vector machine is applied to forecast the next day's closing price. The proposed model is applied to the Shanghai stock market index and the Dow Jones index, and experimental results show that the proposed model performs better in prediction than the other two similar models. PMID:24971455
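A sketch of the three-stage pipeline (ICA features, CCA fusion, SVM prediction) using scikit-learn on synthetic prices; the lag count, component numbers, and hyperparameters are assumptions, not the paper's settings:

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.cross_decomposition import CCA
from sklearn.svm import SVR

rng = np.random.default_rng(3)
n = 300
prices = np.cumsum(rng.normal(size=n)) + 100   # hypothetical closing-price series
technical = rng.normal(size=(n, 39))           # 39 technical variables (placeholder)

# View 1: lagged closing prices; view 2: ICA components of the technical variables.
lags = np.column_stack([np.roll(prices, k) for k in range(1, 6)])[5:]
ics = FastICA(n_components=5, random_state=0).fit_transform(technical)[5:]
target = prices[5:]

# CCA fuses the two feature views into intrinsic features for the predictor.
cca = CCA(n_components=3).fit(lags, ics)
fused = np.hstack(cca.transform(lags, ics))

model = SVR(kernel="rbf", C=10.0).fit(fused[:-1], target[1:])  # next-day close
print("last prediction:", model.predict(fused[-1:]))
```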
Semi-supervised vibration-based classification and condition monitoring of compressors
NASA Astrophysics Data System (ADS)
Potočnik, Primož; Govekar, Edvard
2017-09-01
Semi-supervised vibration-based classification and condition monitoring of the reciprocating compressors installed in refrigeration appliances is proposed in this paper. The method addresses the problem of industrial condition monitoring where prior class definitions are often not available or difficult to obtain from local experts. The proposed method combines feature extraction, principal component analysis, and statistical analysis for the extraction of initial class representatives, and compares the capability of various classification methods, including discriminant analysis (DA), neural networks (NN), support vector machines (SVM), and extreme learning machines (ELM). The use of the method is demonstrated on a case study which was based on industrially acquired vibration measurements of reciprocating compressors during the production of refrigeration appliances. The paper presents a comparative qualitative analysis of the applied classifiers, confirming the good performance of several nonlinear classifiers. If the model parameters are properly selected, then very good classification performance can be obtained from NN trained by Bayesian regularization, SVM and ELM classifiers. The method can be effectively applied for the industrial condition monitoring of compressors.
NASA Technical Reports Server (NTRS)
Gleman, Stuart M. (Inventor); Rowe, Geoffrey K. (Inventor)
1999-01-01
An ultrasonic bolt gage is described which uses a cross-correlation algorithm to determine the tension applied to a fastener, such as a bolt. The cross-correlation analysis is preferably performed using a processor operating on a series of captured ultrasonic echo waveforms. The ultrasonic bolt gage is further described as using the captured ultrasonic echo waveforms to perform additional modes of analysis, such as feature recognition. Multiple tension data outputs, therefore, can be obtained from a single data acquisition for increased measurement reliability. In addition, one embodiment of the gage has been described as multi-channel, having a multiplexer for performing a tension analysis on one of a plurality of bolts.
NASA Astrophysics Data System (ADS)
El-Gafy, Inas
2017-10-01
Analyzing the water-food-energy nexus is the first step in assisting decision makers in developing and evaluating national strategies that take the nexus into account. The main objective of the current research is to provide a method for decision makers to analyze the water-food-energy nexus of the crop production system at the national level and to carry out a quantitative assessment of it. Through the proposed method, indicators considering water and energy consumption, mass productivity, and economic productivity are suggested. Based on these indicators, a water-food-energy nexus index (WFENI) was formulated. The study showed that the calculated WFENI of the Egyptian summer crops have scores that range from 0.21 to 0.79. Compared to onion (the highest-scoring WFENI, i.e., the best score), rice has the lowest WFENI among the summer food crops. An analysis of the water-food-energy nexus of forty-two Egyptian crops in year 2010 was carried out (energy consumed for irrigation represents 7.4% of the total energy footprint). WFENI can be applied to develop strategies for an optimal cropping pattern that minimizes water and energy consumption and maximizes their productivity. It can be applied as a holistic tool to evaluate progress in national water and agricultural strategies. Moreover, WFENI could be applied yearly to evaluate the performance of water-food-energy nexus management.
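The index construction can be sketched as a weighted aggregation of normalized indicators; the indicator values and equal weights below are invented, and the paper's actual indicator set and weighting may differ:

```python
import numpy as np

# Hypothetical per-crop indicators, each normalized to [0, 1] where 1 is best:
# water productivity, energy productivity, economic productivity, footprint score.
indicators = {
    "onion": [0.90, 0.80, 0.70, 0.75],
    "rice":  [0.20, 0.30, 0.25, 0.15],
}
weights = np.array([0.25, 0.25, 0.25, 0.25])  # assumed equal weighting

for crop, vals in indicators.items():
    wfeni = float(np.dot(weights, vals))  # composite water-food-energy nexus index
    print(f"{crop}: WFENI = {wfeni:.2f}")
```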
Reasons to value the health care intangible asset valuation.
Reilly, Robert F
2012-01-01
There are numerous individual reasons to conduct a health care intangible asset valuation. This discussion summarized many of these reasons and considered the common categories of these individual reasons. Understanding the reason for the intangible asset analysis is an important prerequisite to conducting the valuation, both for the analyst and the health care owner/operator. This is because an intangible asset valuation may not be the type of analysis that the owner/operator really needs. Rather, the owner/operator may really need an economic damages measurement, a license royalty rate analysis, an intercompany transfer price study, a commercialization potential evaluation, or some other type of intangible asset analysis. In addition, a clear definition of the reason for the valuation will allow the analyst to understand if (1) any specific analytical guidelines, procedures, or regulations apply and (2) any specific reporting requirement applies. For example, intangible asset valuations prepared for fair value accounting purposes should meet specific ASC 820 fair value accounting guidance. Intangible asset valuations performed for intercompany transfer price tax purposes should comply with the guidance provided in the Section 482 regulations. Likewise, intangible asset valuations prepared for Section 170 charitable contribution purposes should comply with specific reporting requirements. The individual reasons for the health care intangible asset valuation may influence the standard of value applied, the valuation date selected, the valuation approaches and methods applied, the form and format of valuation report prepared, and even the type of professional employed to perform the valuation.
NASA Astrophysics Data System (ADS)
Kim, Jeong-Man; Koo, Min-Mo; Jeong, Jae-Hoon; Hong, Keyyong; Cho, Il-Hyoung; Choi, Jang-Young
2017-05-01
This paper reports the design and analysis of a tubular permanent magnet linear generator (TPMLG) for a small-scale wave-energy converter. The analytical field computation is performed by applying a magnetic vector potential and a 2-D analytical model to determine design parameters. Based on analytical solutions, parametric analysis is performed to meet the design specifications of a wave-energy converter (WEC). Then, 2-D FEA is employed to validate the analytical method. Finally, the experimental result confirms the predictions of the analytical and finite element analysis (FEA) methods under regular and irregular wave conditions.
Enabling Efficient Climate Science Workflows in High Performance Computing Environments
NASA Astrophysics Data System (ADS)
Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.
2015-12-01
A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provide a myriad of challenges when running on a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large scale simulation and analysis work are commonplace and provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.
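A minimal mpi4py sketch of the task-parallel pattern described, in which each rank processes a strided share of files and results are reduced to rank 0; the file names and the per-file "analysis" are placeholders:

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Hypothetical list of climate data files; each rank takes a strided share.
files = [f"cam5_run_{i:04d}.nc" for i in range(1000)]
my_files = files[rank::size]

local_result = sum(len(name) for name in my_files)  # placeholder per-file analysis
total = comm.reduce(local_result, op=MPI.SUM, root=0)
if rank == 0:
    print("aggregate result:", total)
```

Run with, e.g., `mpiexec -n 4 python analyze.py`.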
Niskanen, Toivo
2017-12-12
The aim of this study was to examine how the developed taxonomy of cognitive work analysis (CWA) can be applied in combination with statistical analysis regarding different sociotechnical categories. This study applied a combination of quantitative and qualitative methodologies. Workers (n = 120) and managers (n = 85) in the chemical industry were asked in a questionnaire how different occupational safety and health (OSH) measures were being implemented. The exploration of the qualitative CWA taxonomy consisted of an analysis of the following topics: (a) work domain; (b) control task; (c) strategies; (d) social organization and cooperation; (e) worker competencies. The following hypotheses were supported - activities of the management had positive impacts on the aggregated variables: near-accident investigation and instructions (H1); OSH training (H2); operations, technical processes and safe use of chemicals (H3); use of personal protective equipment (H4); measuring, follow-up and prevention of major accidents (H5). The CWA taxonomy was applied in mixed methods when testing H1-H5. A special approach is to analyze the work demands of complex sociotechnical systems with the taxonomy of CWA. In problem-solving, the CWA taxonomy should seek to capitalize on the strengths and minimize the limitations of safety performance.
Heat exchanger selection and design analyses for metal hydride heat pump systems
Mazzucco, Andrea; Voskuilen, Tyler G.; Waters, Essene L.; ...
2016-01-01
This paper presents a design analysis for the development of highly efficient heat exchangers within stationary metal hydride heat pumps. The design constraints and selected performance criteria are applied to three representative heat exchangers. The proposed thermal model can be applied to select the most efficient heat exchanger design and provides outcomes generally valid in a pre-design stage. Heat transfer effectiveness is the principal performance parameter guiding the selection analysis, the results of which appear to be mildly (up to 13%) affected by the specific Nusselt correlation used. The thermo-physical properties of the heat transfer medium and geometrical parameters are varied in the sensitivity analysis, suggesting that the length of independent tubes is the physical parameter that influences the performance of the heat exchangers the most. The practical operative regions for each heat exchanger are identified by finding the conditions over which the heat removal from the solid bed enables a complete and continuous hydriding reaction. The most efficient solution is a design example that achieves the target effectiveness of 95%.
Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan
2018-01-01
Exhaust gas recirculation (EGR) is one of the main methods of reducing NOX emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate the performance and optimize rate determination of EGR, which currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization. PMID:29377956
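The grey relational coefficient at the heart of such methods, ξ_i(k) = (Δmin + ρΔmax) / (Δ_i(k) + ρΔmax), can be sketched as follows; the normalized test matrix and equal objective weights are invented, not the engine's data:

```python
import numpy as np

# Hypothetical EGR test matrix: rows = EGR rates, columns = normalized objectives
# (e.g., NOx, soot, fuel consumption), already mapped so that larger is better.
seq = np.array([
    [0.9, 0.4, 0.8],   # EGR 10%
    [0.7, 0.7, 0.7],   # EGR 20%
    [0.4, 0.9, 0.5],   # EGR 30%
])
ref = seq.max(axis=0)        # reference (ideal) sequence
delta = np.abs(seq - ref)    # deviation from the ideal
rho = 0.5                    # distinguishing coefficient

# Grey relational coefficients and grades (equal objective weights assumed).
grc = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
grade = grc.mean(axis=1)
print("grey relational grades:", np.round(grade, 3))
```

The EGR setting with the highest grade would be ranked best under this simplified scheme.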
Bondemark, Lars; Abdulraheem, Salem
2017-10-21
To systematically evaluate in five orthodontic journals how many randomized controlled trials (RCTs) use intention-to-treat (ITT) analysis, to assess the methodological quality of the ITT analyses, and finally, to demonstrate in an academic way how outcomes can be affected when the ITT analysis is not implemented. A search of the Medline database was performed via PubMed for publication type 'randomized controlled trial' published in each journal between 1 January 2013 and 30 April 2017. The five orthodontic journals assessed were the American Journal of Orthodontics and Dentofacial Orthopedics, Angle Orthodontics, European Journal of Orthodontics, Journal of Orthodontics, and Orthodontics and Craniofacial Research. Two independent reviewers assessed each RCT to determine whether the trial reported an ITT analysis or a per-protocol analysis. The initial search generated 137 possible trials. After applying the inclusion and exclusion criteria, 90 RCTs were included and assessed. Seventeen of the 90 RCTs (18.9%) either reported an ITT analysis in the text or supported the ITT approach with flow diagrams or tables. However, only six RCTs applied and reported the ITT analysis correctly, while the majority performed a per-protocol analysis instead. Nearly all the trials that applied the ITT analysis incorrectly analysed the results using a per-protocol analysis, thus overestimating the results and/or having a reduced sample size, which could then produce diminished statistical power.
NASA Astrophysics Data System (ADS)
Silva, N.; Esper, A.
2012-01-01
The work presented in this article represents the results of applying RAMS analysis to a critical space control system, both at system and software levels. The system-level RAMS analysis allowed the assignment of criticalities to the high-level components, which was further refined by a tailored software-level RAMS analysis. The importance of the software-level RAMS analysis in the identification of new failure modes and its impact on the system-level RAMS analysis is discussed. Changes to the software architecture have also been recommended in order to reduce the criticality of the software components to an acceptable minimum. The dependability analysis was performed in accordance with ECSS-Q-ST-80, which had to be tailored and complemented in some aspects. This tailoring is also detailed in the article, and lessons learned from its application are shared, underlining its importance for space system safety evaluations. The paper presents the applied techniques, the relevant results obtained, the effort required for performing the tasks and the planned strategy for ROI estimation, as well as the soft skills required and acquired during these activities.
Structural mode significance using INCA. [Interactive Controls Analysis computer program
NASA Technical Reports Server (NTRS)
Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.
1990-01-01
Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.
ERIC Educational Resources Information Center
Gutierrez, Michael
2017-01-01
Interteaching is a college teaching method grounded in the principles of applied behavior analysis. Research on interteaching demonstrates that it improves academic performance, and students report greater satisfaction with interteaching as compared to traditional teaching styles. The current study investigates whether discussion group size, a…
Improving Generalization of Academic Skills: Commentary on the Special Issue
ERIC Educational Resources Information Center
Skinner, Christopher H.; Daly, Edward J., III
2010-01-01
Behavior analysts have long been interested in developing and promoting the use of effective generalization strategies for behavioral interventions. Perhaps because research on academic performance has lagged behind in the field of applied behavior analysis, far less research on this topic has been conducted for academic performance problems. The…
Analysis performed in support of the Ad-Hoc Working Group of RTCA SC-159 on RAIM/FDE issues
DOT National Transportation Integrated Search
2002-01-01
In 1999, the FAA requested that RTCA SC-159 address one of the recommendations from the study performed by the Johns Hopkins University (JHU) Applied Physics Lab (APL) on the use of GPS and augmented GPS for aviation applications. This recommendation...
Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.
Stein, Manuel; Janetzko, Halldor; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlucke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A
2018-01-01
Analysts in professional team sport regularly perform analysis to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis regularly include identification of weaknesses of opposing teams, or assessing performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Also, analysts can rely on techniques from Information Visualization, to depict e.g., player or ball trajectories. However, video analysis is typically a time-consuming process, where the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is not directly linked to the observed movement context anymore. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event and player analysis in the case of soccer analysis. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.
Application of a High-Fidelity Icing Analysis Method to a Model-Scale Rotor in Forward Flight
NASA Technical Reports Server (NTRS)
Narducci, Robert; Orr, Stanley; Kreeger, Richard E.
2012-01-01
An icing analysis process involving the loose coupling of OVERFLOW-RCAS for rotor performance prediction with LEWICE3D for thermal analysis and ice accretion is applied to a model-scale rotor for validation. The process offers high-fidelity rotor analysis for non-iced and iced rotor performance evaluation that accounts for the interaction of nonlinear aerodynamics with blade elastic deformations. Ice accumulation prediction also involves loosely coupled data exchanges between OVERFLOW and LEWICE3D to produce accurate ice shapes. Validation of the process uses data collected in the 1993 icing test involving Sikorsky's Powered Force Model. Non-iced and iced rotor performance predictions are compared to experimental measurements, as are predicted ice shapes.
Tools for Large-Scale Mobile Malware Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bierma, Michael
Analyzing mobile applications for malicious behavior is an important area of research, and is made difficult, in part, by the increasingly large number of applications available for the major operating systems. There are currently over 1.2 million apps available in both the Google Play and Apple App stores (the respective official marketplaces for the Android and iOS operating systems)[1, 2]. Our research provides two large-scale analysis tools to aid in the detection and analysis of mobile malware. The first tool we present, Andlantis, is a scalable dynamic analysis system capable of processing over 3000 Android applications per hour. Traditionally, Android dynamic analysis techniques have been relatively limited in scale due to the computational resources required to emulate the full Android system to achieve accurate execution. Andlantis is the most scalable Android dynamic analysis framework to date, and is able to collect valuable forensic data, which helps reverse-engineers and malware researchers identify and understand anomalous application behavior. We discuss the results of running 1261 malware samples through the system, and provide examples of malware analysis performed with the resulting data. While techniques exist to perform static analysis on a large number of applications, large-scale analysis of iOS applications has been relatively small scale due to the closed nature of the iOS ecosystem, and the difficulty of acquiring applications for analysis. The second tool we present, iClone, addresses the challenges associated with iOS research in order to detect application clones within a dataset of over 20,000 iOS applications.
Methods for assessing the stability of slopes during earthquakes-A retrospective
Jibson, R.W.
2011-01-01
During the twentieth century, several methods to assess the stability of slopes during earthquakes were developed. Pseudostatic analysis was the earliest method; it involved simply adding a permanent body force representing the earthquake shaking to a static limit-equilibrium analysis. Stress-deformation analysis, a later development, involved much more complex modeling of slopes using a mesh in which the internal stresses and strains within elements are computed based on the applied external loads, including gravity and seismic loads. Stress-deformation analysis provided the most realistic model of slope behavior, but it is very complex and requires a high density of high-quality soil-property data as well as an accurate model of soil behavior. In 1965, Newmark developed a method that effectively bridges the gap between these two types of analysis. His sliding-block model is easy to apply and provides a useful index of co-seismic slope performance. Subsequent modifications to sliding-block analysis have made it applicable to a wider range of landslide types. Sliding-block analysis provides perhaps the greatest utility of all the types of analysis. It is far easier to apply than stress-deformation analysis, and it yields much more useful information than does pseudostatic analysis.
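A simplified rigid-block Newmark sketch: acceleration in excess of the critical (yield) acceleration is double-integrated to a permanent displacement, with sliding allowed in one direction only; the synthetic record and yield coefficient are assumptions for illustration:

```python
import numpy as np

def newmark_displacement(acc_g, dt, ky):
    """Rigid-block Newmark analysis: integrate acceleration above the yield
    acceleration ky*g; the block decelerates once shaking drops below yield,
    and its sliding velocity can never go negative."""
    g = 9.81
    a_crit = ky * g
    vel, disp = 0.0, 0.0
    for a in acc_g * g:                    # record given in g, convert to m/s^2
        a_rel = a - a_crit if (a > a_crit or vel > 0.0) else 0.0
        vel = max(vel + a_rel * dt, 0.0)   # no backward sliding in this sketch
        disp += vel * dt
    return disp

# Hypothetical record: 10 s of synthetic shaking sampled at 100 Hz.
rng = np.random.default_rng(4)
t = np.arange(0.0, 10.0, 0.01)
acc = 0.3 * np.sin(2 * np.pi * 1.5 * t) + 0.05 * rng.normal(size=t.size)
print(f"permanent displacement: {newmark_displacement(acc, 0.01, ky=0.15):.3f} m")
```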
Applying Score Analysis to a Rehearsal Pedagogy of Expressive Performance
ERIC Educational Resources Information Center
Byo, James L.
2014-01-01
The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…
A Theory of Term Importance in Automatic Text Analysis.
ERIC Educational Resources Information Center
Salton, G.; And Others
Most existing automatic content analysis and indexing techniques are based on word frequency characteristics applied largely in an ad hoc manner. Contradictory requirements arise in this connection, in that terms exhibiting high occurrence frequencies in individual documents are often useful for high recall performance (to retrieve many relevant…
Orderedness and Stratificational "and" Nodes.
ERIC Educational Resources Information Center
Herrick, Earl M.
It is possible to apply Lamb's stratificational theory and analysis to English graphonomy, but additional notation devices must be used to explain particular graphemes and their characteristics. The author presents cases where Lamb's notation is inadequate. In those cases, he devises new means for performing the analysis. The result of this…
ERIC Educational Resources Information Center
Jones, Lawrence; Graham, Ian
1986-01-01
Reviews the main principles of interfacing and discusses the software developed to perform kinetic data capture and analysis with a BBC microcomputer linked to a recording spectrophotometer. Focuses on the steps in software development. Includes results of a lactate dehydrogenase assay. (ML)
High Performance Visualization using Query-Driven Visualizationand Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, E. Wes; Campbell, Scott; Dart, Eli
2006-06-15
Query-driven visualization and analytics is a unique approach for high-performance visualization that offers new capabilities for knowledge discovery and hypothesis testing. The new capabilities, akin to finding needles in haystacks, are the result of combining technologies from the fields of scientific visualization and scientific data management. This approach is crucial for rapid data analysis and visualization in the petascale regime. This article describes how query-driven visualization is applied to a hero-sized network traffic analysis problem.
System for corrosion monitoring in pipeline applying fuzzy logic mathematics
NASA Astrophysics Data System (ADS)
Kuzyakov, O. N.; Kolosova, A. L.; Andreeva, M. A.
2018-05-01
A list of factors influencing corrosion rate on the external side of underground pipeline is determined. Principles of constructing a corrosion monitoring system are described; the system performance algorithm and program are elaborated. A comparative analysis of methods for calculating corrosion rate is undertaken. Fuzzy logic mathematics is applied to reduce calculations while considering a wider range of corrosion factors.
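A hand-rolled sketch of the fuzzy-logic step: triangular memberships over a single corrosion factor and weighted-average (Sugeno-style) defuzzification to a rate estimate. The membership breakpoints, rate values, and the use of soil resistivity alone are all invented; the actual system considers a wider set of factors:

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical input: soil resistivity (ohm-m), fuzzified into corrosivity classes.
resistivity = 18.0
mu = {
    "high": tri(resistivity, 0, 10, 25),    # low resistivity -> high corrosivity
    "med":  tri(resistivity, 15, 30, 50),
    "low":  tri(resistivity, 40, 80, 120),
}

# Assumed crisp corrosion rates (mm/year) per class; weighted-average defuzzification.
rates = {"high": 0.8, "med": 0.3, "low": 0.05}
rate = sum(mu[k] * rates[k] for k in mu) / sum(mu.values())
print(f"estimated corrosion rate: {rate:.2f} mm/year")
```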
ERIC Educational Resources Information Center
Brackenbury, Tim; Zickar, Michael J.; Munson, Benjamin; Storkel, Holly L.
2017-01-01
Purpose: Item response theory (IRT) is a psychometric approach to measurement that uses latent trait abilities (e.g., speech sound production skills) to model performance on individual items that vary by difficulty and discrimination. An IRT analysis was applied to preschoolers' productions of the words on the Goldman-Fristoe Test of…
Velocity filtering applied to optical flow calculations
NASA Technical Reports Server (NTRS)
Barniv, Yair
1990-01-01
Optical flow is a method by which a stream of two-dimensional images obtained from a forward-looking passive sensor is used to map the three-dimensional volume in front of a moving vehicle. Passive ranging via optical flow is applied here to the helicopter obstacle-avoidance problem. Velocity filtering is used as a field-based method to determine range to all pixels in the initial image. The theoretical understanding and performance analysis of velocity filtering as applied to optical flow are expanded, and experimental results are presented.
ERIC Educational Resources Information Center
Benincasa, Luciana
2017-01-01
The paper applies Goffman's frame analysis and ethnomethodology to student performance on mathematical word problems. In educational research, frame analysis has usually been limited to primary frames. Instead, in this paper I focus on the kind of secondary frame that Goffman calls 'utilitarian make-believe'. The data consist of a fragment of…
ERIC Educational Resources Information Center
Guilamo-Ramos, Vincent; Jaccard, James; Dittus, Patricia; Gonzalez, Bernardo; Bouris, Alida
2008-01-01
A framework for the analysis of adolescent problem behaviors was explicated that draws on five major theories of human behavior. The framework emphasizes intentions to perform behaviors and factors that influence intentions as well as moderate the impact of intentions on behavior. The framework was applied to the analysis of adolescent sexual risk…
Castañer, Marta; Andueza, Juan; Hileno, Raúl; Puigarnau, Silvia; Prat, Queralt; Camerino, Oleguer
2018-01-01
Laterality is a key aspect of the analysis of basic and specific motor skills. It is relevant to sports because it involves motor laterality profiles beyond left-right preference and spatial orientation of the body. The aim of this study was to obtain the laterality profiles of young athletes, taking into account the synergies between the support and precision functions of limbs and body parts in the performance of complex motor skills. We applied two instruments: (a) MOTORLAT, a motor laterality inventory comprising 30 items of basic, specific, and combined motor skills, and (b) the Precision and Agility Tapping over Hoops (PATHoops) task, in which participants had to perform a path by stepping in each of 14 hoops arranged on the floor, allowing the observation of their feet, left-right preference and spatial orientation. A total of 96 young athletes performed the PATHoops task and the 30 MOTORLAT items, allowing us to obtain data about limb dominance and spatial orientation of the body in the performance of complex motor skills. Laterality profiles were obtained by means of a cluster analysis and a correlational analysis and a contingency analysis were applied between the motor skills and spatial orientation actions performed. The results obtained using MOTORLAT show that the combined motor skills criterion (for example, turning while jumping) differentiates athletes' uses of laterality, showing a clear tendency toward mixed laterality profiles in the performance of complex movements. In the PATHoops task, the best spatial orientation strategy was “same way” (same foot and spatial wing) followed by “opposite way” (opposite foot and spatial wing), in keeping with the research assumption that actions unfolding in a horizontal direction in front of an observer's eyes are common in a variety of sports. PMID:29930527
NASA Astrophysics Data System (ADS)
Murata, Isao; Ohta, Masayuki; Miyamaru, Hiroyuki; Kondo, Keitaro; Yoshida, Shigeo; Iida, Toshiyuki; Ochiai, Kentaro; Konno, Chikara
2011-10-01
Nuclear data are indispensable for the development of fusion reactor candidate materials. However, benchmarking of the nuclear data in the MeV energy region is not yet adequate. In the present study, benchmark performance in the MeV energy region was investigated theoretically for experiments using a 14 MeV neutron source. We carried out a systematic analysis for light to heavy materials. As a result, the benchmark performance for the neutron spectrum was confirmed to be acceptable, while for gamma-rays it was not sufficiently accurate. Consequently, a spectrum shifter has to be applied; beryllium had the best performance as a shifter. Moreover, a preliminary examination was made of whether it is really acceptable that only the spectrum before the last collision is considered in the benchmark performance analysis. It was pointed out that not only the last collision but also earlier collisions should be considered equally in the benchmark performance analysis.
NASA Technical Reports Server (NTRS)
Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.
1974-01-01
A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.
Neden, Catherine A; Parkin, Claire; Blow, Carol; Siriwardena, Aloysius Niroshan
2018-05-08
The aim of this study was to assess whether the absolute standard of candidates sitting the MRCGP Applied Knowledge Test (AKT) between 2011 and 2016 had changed. It is a descriptive study comparing the performance on marker questions of a reference group of UK graduates taking the AKT for the first time between 2011 and 2016. Using aggregated examination data, the performance of individual 'marker' questions was compared using Pearson's chi-squared tests and trend-line analysis. Binary logistic regression was used to analyse changes in performance over the study period. Changes in performance of individual marker questions using Pearson's chi-squared test showed statistically significant differences in 32 of the 49 questions included in the study. Trend line analysis showed a positive trend in 29 questions and a negative trend in the remaining 23. The magnitude of change was small. Logistic regression did not demonstrate any evidence for a change in the performance of the question set over the study period. However, candidates were more likely to get items on administration wrong compared with clinical medicine or research. There was no evidence of a change in performance of the question set as a whole.
A Comparison of Vocal Demands with Vocal Performance among Classroom Student Teachers
ERIC Educational Resources Information Center
Franca, Maria Claudia
2013-01-01
Purpose: This investigation compared voice performance of student teachers across an academic semester in order to examine the effect of increasing demands on their voice. Method: A repeated measures design was applied to the data analysis: all participants were tested three separate times throughout the semester. The equipment used for…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-06
... considerations affecting the design and conduct of repellent studies when human subjects are involved. Any... recommendations for the design and execution of studies to evaluate the performance of pesticide products intended... recommends appropriate study designs and methods for selecting subjects, statistical analysis, and reporting...
Leadership: Enhancing Team Adaptability in Dynamic Settings
2008-04-01
… (University of Pennsylvania) and Steve W. J. Kozlowski (Michigan State University); sponsored by the U.S. Army Research Institute for the Behavioral…
Impact of Spatial Scales on the Intercomparison of Climate Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Wei; Steptoe, Michael; Chang, Zheng
2017-01-01
Scenario analysis has been widely applied in climate science to understand the impact of climate change on the future human environment, but intercomparison and similarity analysis of different climate scenarios based on multiple simulation runs remain challenging. Although spatial heterogeneity plays a key role in modeling climate and human systems, little research has been performed to understand the impact of spatial variations and scales on similarity analysis of climate scenarios. To address this issue, the authors developed a geovisual analytics framework that lets users perform similarity analysis of climate scenarios from the Global Change Assessment Model (GCAM) using a hierarchical clustering approach.
Hao, Bibo; Sun, Wen; Yu, Yiqin; Li, Jing; Hu, Gang; Xie, Guotong
2016-01-01
Recent advances in cloud computing and machine learning have made it more convenient for researchers to gain insights from massive healthcare data, while performing analyses on healthcare data in current practice still lacks efficiency for researchers. Moreover, collaborating among different researchers and sharing analysis results are challenging issues. In this paper, we developed a practice to make the analytics process collaborative and analysis results reproducible by exploiting and extending Jupyter Notebook. After applying this practice in our use cases, we can perform analyses and deliver results with less effort and in a shorter time compared to our previous practice.
NASA Astrophysics Data System (ADS)
Karasawa, N.; Mitsutake, A.; Takano, H.
2017-12-01
Proteins implement their functionalities when folded into specific three-dimensional structures, and their functions are related to the protein structures and dynamics. Previously, we applied a relaxation mode analysis (RMA) method to protein systems; this method approximately estimates the slow relaxation modes and times via simulation and enables investigation of the dynamic properties underlying the protein structural fluctuations. Recently, two-step RMA with multiple evolution times has been proposed and applied to a slightly complex homopolymer system, i.e., a single [n]polycatenane. This method can be applied to more complex heteropolymer systems, i.e., protein systems, to estimate the relaxation modes and times more accurately. In two-step RMA, we first perform RMA and obtain rough estimates of the relaxation modes and times. Then, we apply RMA with multiple evolution times to a small number of the slowest relaxation modes obtained in the previous calculation. Herein, we apply this method to the results of principal component analysis (PCA). First, PCA is applied to a 2-μs molecular dynamics simulation of hen egg-white lysozyme in aqueous solution. Then, the two-step RMA method with multiple evolution times is applied to the obtained principal components. The slow relaxation modes and corresponding relaxation times for the principal components are much improved by the second RMA.
Jitter Reduces Response-Time Variability in ADHD: An Ex-Gaussian Analysis.
Lee, Ryan W Y; Jacobson, Lisa A; Pritchard, Alison E; Ryan, Matthew S; Yu, Qilu; Denckla, Martha B; Mostofsky, Stewart; Mahone, E Mark
2015-09-01
"Jitter" involves randomization of intervals between stimulus events. Compared with controls, individuals with ADHD demonstrate greater intrasubject variability (ISV) performing tasks with fixed interstimulus intervals (ISIs). Because Gaussian curves mask the effect of extremely slow or fast response times (RTs), ex-Gaussian approaches have been applied to study ISV. This study applied ex-Gaussian analysis to examine the effects of jitter on RT variability in children with and without ADHD. A total of 75 children, aged 9 to 14 years (44 ADHD, 31 controls), completed a go/no-go test with two conditions: fixed ISI and jittered ISI. ADHD children showed greater variability, driven by elevations in exponential (tau), but not normal (sigma) components of the RT distribution. Jitter decreased tau in ADHD to levels not statistically different than controls, reducing lapses in performance characteristic of impaired response control. Jitter may provide a nonpharmacologic mechanism to facilitate readiness to respond and reduce lapses from sustained (controlled) performance. © 2012 SAGE Publications.
Tool Efficiency Analysis model research in SEMI industry
NASA Astrophysics Data System (ADS)
Lei, Ma; Nana, Zhang; Zhongqiu, Zhang
2018-06-01
One of the key goals in the SEMI industry is to improve equipment throughput and maximize equipment production efficiency. This paper is based on SEMI standards for semiconductor equipment control; it defines the transaction rules between different tool states and presents a TEA system model, based on a finite state machine, that analyzes tool performance automatically. The system was applied to fab tools, its effectiveness was verified, and it produced the parameter values used to measure equipment performance, together with advice for improvement.
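As an illustration of the finite-state-machine idea in such a tool-state model, here is a minimal sketch assuming SEMI E10-style state categories; the state names, transition rules, and event format are assumptions, not the paper's exact definitions.

```python
# Tool states as a finite state machine; time-in-state yields utilization.
PRODUCTIVE, STANDBY, ENGINEERING, SCHEDULED_DOWN, UNSCHEDULED_DOWN = range(5)

ALLOWED = {  # transaction rules: which state changes are legal (assumed)
    PRODUCTIVE: {STANDBY, UNSCHEDULED_DOWN},
    STANDBY: {PRODUCTIVE, ENGINEERING, SCHEDULED_DOWN},
    ENGINEERING: {STANDBY},
    SCHEDULED_DOWN: {STANDBY},
    UNSCHEDULED_DOWN: {STANDBY},
}

def utilization(events):
    """events: list of (timestamp_s, new_state); returns productive fraction."""
    time_in = [0.0] * 5
    (t0, s0), rest = events[0], events[1:]
    for t, s in rest:
        if s not in ALLOWED[s0]:
            raise ValueError(f"illegal transition {s0} -> {s}")
        time_in[s0] += t - t0
        t0, s0 = t, s
    total = sum(time_in)
    return time_in[PRODUCTIVE] / total if total else 0.0

print(utilization([(0, STANDBY), (60, PRODUCTIVE), (3660, STANDBY),
                   (3720, SCHEDULED_DOWN), (7320, STANDBY)]))  # ~0.49
```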
Overlapped Partitioning for Ensemble Classifiers of P300-Based Brain-Computer Interfaces
Onishi, Akinari; Natsume, Kiyohisa
2014-01-01
A P300-based brain-computer interface (BCI) enables a wide range of people to control devices that improve their quality of life. Ensemble classifiers with naive partitioning were recently applied to the P300-based BCI and these classification performances were assessed. However, they were usually trained on a large amount of training data (e.g., 15300). In this study, we evaluated ensemble linear discriminant analysis (LDA) classifiers with a newly proposed overlapped partitioning method using 900 training data. In addition, the classification performances of the ensemble classifier with naive partitioning and a single LDA classifier were compared. One of three conditions for dimension reduction was applied: the stepwise method, principal component analysis (PCA), or none. The results show that an ensemble stepwise LDA (SWLDA) classifier with overlapped partitioning achieved a better performance than the commonly used single SWLDA classifier and an ensemble SWLDA classifier with naive partitioning. This result implies that the performance of the SWLDA is improved by overlapped partitioning and the ensemble classifier with overlapped partitioning requires less training data than that with naive partitioning. This study contributes towards reducing the required amount of training data and achieving better classification performance. PMID:24695550
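A minimal sketch of the ensemble idea on synthetic data: classifiers are trained on overlapping, contiguous partitions of the training set and their decision scores are averaged. It uses plain LDA rather than the stepwise SWLDA of the paper, and the partitioning scheme here is an assumption rather than the authors' exact method.

```python
# Ensemble LDA with overlapping partitions of the training data (a sketch).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def overlapped_partitions(n, n_parts, overlap):
    """Yield index arrays of overlapping, contiguous partitions."""
    size = n // n_parts
    for k in range(n_parts):
        yield np.arange(max(0, k * size - overlap),
                        min(n, (k + 1) * size + overlap))

rng = np.random.default_rng(1)
X, y = rng.normal(size=(900, 32)), rng.integers(0, 2, 900)  # toy features/labels

models = [LinearDiscriminantAnalysis().fit(X[idx], y[idx])
          for idx in overlapped_partitions(len(X), n_parts=5, overlap=60)]

X_test = rng.normal(size=(100, 32))
score = np.mean([m.decision_function(X_test) for m in models], axis=0)
pred = (score > 0).astype(int)   # ensemble decision by averaged LDA scores
```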
The person's conception of the structures of developing intellect: early adolescence to middle age.
Demetriou, A; Efklides, A
1989-08-01
According to experiential structuralism, thought abilities have six capacity spheres: experimental, propositional, quantitative, imaginal, qualitative, and metacognitive. The first five are applied to the environment. The metacognitive capacity is applied to the others, serving as the interface between reality and the cognitive system or between any of the other capacities. To test this postulate, 648 subjects, aged 12 to 40 years, solved eight tasks that were addressed, in pairs, to the first four capacity spheres. One of the tasks in each pair tapped the first and the other the third formal level of the sphere. Having solved the tasks, the subjects were required to rate each pair of tasks in terms of similarity of operations, difficulty, and success of solution. Factor analysis of difficulty and success evaluation scores revealed the same capacity-specific factors as the analysis of performance scores. Factor analysis of similarity scores differentiated between same- and different-sphere pairs. Analysis of variance showed that difficulty and success evaluation scores preserved performance differences between the first and the third formal tasks. Cognitive level, age, socioeconomic status, and sex were related to the metacognitive measures in ways similar to their relations to performance measures. These findings were integrated into a model aimed at capturing real-time metacognitive functioning.
Chandra, Preeti; Kannujia, Rekha; Saxena, Ankita; Srivastava, Mukesh; Bahadur, Lal; Pal, Mahesh; Singh, Bhim Pratap; Kumar Ojha, Sanjeev; Kumar, Brijesh
2016-09-10
An ultra-high performance liquid chromatography electrospray ionization tandem mass spectrometry method has been developed and validated for simultaneous quantification of six major bioactive compounds in five varieties of Withania somnifera in various plant parts (leaf, stem and root). The analysis was accomplished on a Waters ACQUITY UPLC BEH C18 column with linear gradient elution of water/formic acid (0.1%) and acetonitrile at a flow rate of 0.3 mL min(-1). The proposed method was validated with acceptable linearity (r(2), 0.9989-0.9998), precision (RSD, 0.16-2.01%), stability (RSD, 1.04-1.62%) and recovery (RSD ≤2.45%), under optimum conditions. The method was also successfully applied for the simultaneous determination of six marker compounds in twenty-six marketed formulations. Hierarchical cluster analysis and principal component analysis were applied to discriminate these twenty-six batches based on characteristics of the bioactive compounds. The results indicated that this method is advanced, rapid, sensitive and suitable to reveal the quality of Withania somnifera and is also capable of performing quality evaluation of polyherbal formulations having similar markers/raw herbs. Copyright © 2016 Elsevier B.V. All rights reserved.
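The chemometric step is routine to reproduce. Below is a sketch, on simulated concentration data, of hierarchical (Ward) clustering and PCA applied to batches described by their six marker-compound levels; the data and the cluster count are illustrative assumptions.

```python
# Cluster formulations by marker-compound profile and inspect groupings by PCA.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
conc = rng.gamma(2.0, 1.0, size=(26, 6))   # 26 batches x 6 marker compounds

Xs = StandardScaler().fit_transform(conc)
Z = linkage(Xs, method="ward")             # hierarchical cluster analysis
clusters = fcluster(Z, t=3, criterion="maxclust")

scores = PCA(n_components=2).fit_transform(Xs)
print(clusters)   # batch groupings; scores give 2-D coordinates for plotting
```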
How to Perform a Systematic Review and Meta-analysis of Diagnostic Imaging Studies.
Cronin, Paul; Kelly, Aine Marie; Altaee, Duaa; Foerster, Bradley; Petrou, Myria; Dwamena, Ben A
2018-05-01
A systematic review is a comprehensive search, critical evaluation, and synthesis of all the relevant studies on a specific (clinical) topic that can be applied to the evaluation of diagnostic and screening imaging studies. It can be a qualitative or a quantitative (meta-analysis) review of available literature. A meta-analysis uses statistical methods to combine and summarize the results of several studies. In this review, a 12-step approach to performing a systematic review (and meta-analysis) is outlined under the four domains: (1) Problem Formulation and Data Acquisition, (2) Quality Appraisal of Eligible Studies, (3) Statistical Analysis of Quantitative Data, and (4) Clinical Interpretation of the Evidence. This review is specifically geared toward the performance of a systematic review and meta-analysis of diagnostic test accuracy (imaging) studies. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Zhou, Xiao-Rong; Huang, Shui-Sheng; Gong, Xin-Guo; Cen, Li-Ping; Zhang, Cong; Zhu, Hong; Yang, Jun-Jing; Chen, Li
2012-04-01
To construct a performance evaluation and management system for advanced schistosomiasis medical treatment, and to analyze and evaluate the advanced schistosomiasis medical treatment work over the years. By applying database management and C++ programming techniques, we entered the information on advanced schistosomiasis cases into the system and comprehensively evaluated the medical treatment work through cost-effect, cost-effectiveness, and cost-benefit analyses. We produced a set of software routines for cost-effect, cost-effectiveness, and cost-benefit analysis. The system has a clear structure, is easy to operate, offers a friendly interface, and supports convenient information entry and retrieval. It can benefit the performance evaluation of the province's advanced schistosomiasis medical treatment work. The system satisfies the current needs of advanced schistosomiasis medical treatment work and can easily be widely adopted.
Multidisciplinary design optimization using multiobjective formulation techniques
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Pagaldipti, Narayanan S.
1995-01-01
This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
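The verification step mentioned above, checking analytic sensitivities against finite differences, is a generic pattern worth illustrating. A minimal sketch with a stand-in objective function (not the report's aerodynamic solver); the step size is a typical choice, not the authors'.

```python
# Compare an analytic design sensitivity with a central finite-difference
# estimate; agreement validates the analytic derivation.
import numpy as np

def objective(x):           # stand-in for an aerodynamic/structural response
    return np.sum(x**2) + np.prod(np.sin(x))

def analytic_grad(x):       # hand-derived sensitivity of the stand-in
    s = np.prod(np.sin(x))
    return 2 * x + s * np.cos(x) / np.sin(x)

def fd_grad(f, x, h=1e-6):  # central finite differences
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

x = np.array([0.3, 1.1, 0.7])
print(np.max(np.abs(analytic_grad(x) - fd_grad(objective, x))))  # ~1e-9
```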
Liu, Boquan; Polce, Evan; Sprott, Julien C; Jiang, Jack J
2018-05-17
The purpose of this study is to introduce a chaos level test to evaluate linear and nonlinear voice type classification method performances under varying signal chaos conditions without subjective impression. Voice signals were constructed with differing degrees of noise to model signal chaos. Within each noise power, 100 Monte Carlo experiments were applied to analyze the output of jitter, shimmer, correlation dimension, and spectrum convergence ratio. The computational output of the 4 classifiers was then plotted against signal chaos level to investigate the performance of these acoustic analysis methods under varying degrees of signal chaos. A diffusive behavior detection-based chaos level test was used to investigate the performances of different voice classification methods. Voice signals were constructed by varying the signal-to-noise ratio to establish differing signal chaos conditions. Chaos level increased sigmoidally with increasing noise power. Jitter and shimmer performed optimally when the chaos level was less than or equal to 0.01, whereas correlation dimension was capable of analyzing signals with chaos levels of less than or equal to 0.0179. Spectrum convergence ratio demonstrated proficiency in analyzing voice signals with all chaos levels investigated in this study. The results of this study corroborate the performance relationships observed in previous studies and, therefore, demonstrate the validity of the validation test method. The presented chaos level validation test could be broadly utilized to evaluate acoustic analysis methods and establish the most appropriate methodology for objective voice analysis in clinical practice.
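Two of the four acoustic measures compared above have compact textbook definitions: local jitter is the mean absolute cycle-to-cycle period difference normalized by the mean period, and shimmer is the analogous quantity for cycle amplitudes. A minimal sketch on synthetic cycle data (extracting periods and amplitudes from a real waveform is a separate step not shown):

```python
# Local jitter and shimmer from per-cycle periods and amplitudes.
import numpy as np

def local_perturbation(values):
    """Mean absolute consecutive difference, normalized by the mean value."""
    values = np.asarray(values, dtype=float)
    return np.mean(np.abs(np.diff(values))) / np.mean(values)

rng = np.random.default_rng(3)
periods = 0.008 + rng.normal(0, 5e-5, 200)   # ~125 Hz voice, small perturbation
amps = 1.0 + rng.normal(0, 0.02, 200)

print("jitter  (%):", 100 * local_perturbation(periods))
print("shimmer (%):", 100 * local_perturbation(amps))
```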
NASA Technical Reports Server (NTRS)
1974-01-01
Studies were conducted to develop appropriate space shuttle electrical power distribution and control (EPDC) subsystem simulation models and to apply the computer simulations to systems analysis of the EPDC. A previously developed software program (SYSTID) was adapted for this purpose. The following objectives were attained: (1) significant enhancement of the SYSTID time domain simulation software, (2) generation of functionally useful shuttle EPDC element models, and (3) illustrative simulation results in the analysis of EPDC performance, under the conditions of fault, current pulse injection due to lightning, and circuit protection sizing and reaction times.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atzeni, Simone; Ahn, Dong; Gopalakrishnan, Ganesh
2017-01-12
Archer is built on top of the LLVM/Clang compilers that support OpenMP. It applies static and dynamic analysis techniques to detect data races in OpenMP programs while incurring very low runtime and memory overhead. Static analyses identify data-race-free OpenMP regions and exclude them from runtime analysis, which is performed by the ThreadSanitizer included in LLVM/Clang.
ERIC Educational Resources Information Center
Burton, D. Bradley; And Others
1994-01-01
A maximum-likelihood confirmatory factor analysis was performed by applying LISREL VII to the Wechsler Adult Intelligence Scale-Revised results of a normal elderly sample of 225 adults. Results indicate that a three-factor model fits best across all sample combinations. A mild gender effect is discussed. (SLD)
Microfluidic Gel Electrophoresis in the Undergraduate Laboratory Applied to Food Analysis
ERIC Educational Resources Information Center
Chao, Tzu-Chiao; Bhattacharya, Sanchari; Ros, Alexandra
2012-01-01
A microfluidics-based laboratory experiment for the analysis of DNA fragments in an analytical undergraduate course is presented. The experiment is set within the context of food species identification via amplified DNA fragments. The students are provided with berry samples from which they extract DNA and perform polymerase chain reaction (PCR)…
USDA-ARS?s Scientific Manuscript database
A rapid, effective technique applying vortex-assisted liquid–liquid microextraction (VALLME) prior to determination by ultra-high performance liquid chromatography with evaporative light scattering detection/mass spectrometry (UHPLC-ELSD/MS) was developed for the analysis of four cucurbitane triterpenoi…
14 CFR 1274.801 - Adjustments to performance costs.
Code of Federal Regulations, 2010 CFR
2010-01-01
... NASA's initial cost share or funding levels, detailed cost analysis techniques may be applied, which... shall continue to maintain the share ratio requirements (normally 50/50) stated in § 1274.204(b). ...
Virginia Bridge Information Systems Laboratory.
DOT National Transportation Integrated Search
2014-06-01
This report presents the results of applied data mining of legacy bridge databases, focusing on the Pontis and National Bridge Inventory databases maintained by the Virginia Department of Transportation (VDOT). Data analysis was performed using a…
Helicopter Dynamic Performance Program. Volume 2. User’s Manual
1980-07-01
[Report documentation fragment: …Technologies Corporation, Stratford, Connecticut 06602. July 1980, Final Report. Approved for public release; distribution unlimited. Prepared for: Applied Technology Laboratory, U.S. Army Research and Technology Laboratories (AVRADCOM), Fort Eustis, Va. 23604. Technical review of this report was also provided by Messrs. W. A. Pleasants of the Design Integration and Analysis Technical Area.]
ERIC Educational Resources Information Center
Henninger, Jacqueline C.; Flowers, Patricia J.; Councill, Kimberly H.
2006-01-01
The purpose of the study was to examine the effect of teacher experience on student progress and performance quality in an introductory applied lesson. Nine experienced teachers and 15 pre-service teachers taught an adult beginner to play "Mary Had a Little Lamb" on a wind instrument. The lessons were videotaped for subsequent analysis of teaching…
A Quad-Cantilevered Plate micro-sensor for intracranial pressure measurement.
Lalkov, Vasko; Qasaimeh, Mohammad A
2017-07-01
This paper proposes a new design for a pressure-sensing micro-plate platform that brings higher sensitivity to a pressure sensor based on a piezoresistive MEMS sensing mechanism. The proposed design is composed of a suspended plate having four stepped cantilever beams connected to its corners, and is thus defined as a Quad-Cantilevered Plate (QCP). Finite element analysis was performed to determine the optimal design for sensitivity and structural stability under a range of applied forces. Furthermore, a piezoresistive analysis was performed to calculate sensor sensitivity. Both the maximum stress and the change in resistance of the piezoresistor associated with the QCP were found to be higher compared to previously published designs, and linearly related to the applied pressure as desired. Therefore, the QCP demonstrates greater sensitivity and could potentially be used as an efficient pressure sensor for intracranial pressure measurement.
Clerici, Nicola; Bodini, Antonio; Ferrarini, Alessandro
2004-10-01
In order to achieve improved sustainability, local authorities need tools that adequately describe and synthesize environmental information. This article illustrates a methodological approach that organizes a wide suite of environmental indicators into a few aggregated indices, making use of correlation, principal component analysis, and fuzzy sets. Furthermore, a weighting system, which includes stakeholders' priorities and ambitions, is applied. As a case study, the described methodology is applied to the Reggio Emilia Province in Italy, considering environmental information from 45 municipalities. Principal component analysis is used to condense an initial set of 19 indicators into 6 fundamental dimensions that highlight patterns of environmental conditions at the provincial scale. These dimensions are further aggregated into two indices of environmental performance through fuzzy sets. The simple form of these indices makes them particularly suitable for public communication, as they condense a wide set of heterogeneous indicators. The main outcomes of the analysis and the potential applications of the method are discussed.
Error analysis of filtering operations in pixel-duplicated images of diabetic retinopathy
NASA Astrophysics Data System (ADS)
Mehrubeoglu, Mehrube; McLauchlan, Lifford
2010-08-01
In this paper, diabetic retinopathy is chosen as a sample target image to demonstrate the effectiveness of image enlargement through pixel duplication in identifying regions of interest. Pixel duplication is presented as a simpler alternative to data interpolation techniques for detecting small structures in the images. A comparative analysis is performed on different image processing schemes applied to both original and pixel-duplicated images. Structures of interest are detected and classification parameters optimized for minimum false positive detection in the original and enlarged retinal pictures. The error analysis demonstrates the advantages as well as the shortcomings of pixel duplication in image enhancement when spatial averaging operations (smoothing filters) are also applied.
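Pixel duplication itself is a one-liner, which is the point of the paper's comparison with interpolation. A sketch of the operation, assuming a NumPy image array:

```python
# Enlarge an image by replicating each pixel k times along both axes.
import numpy as np

def duplicate_pixels(img, k=2):
    return np.repeat(np.repeat(img, k, axis=0), k, axis=1)

img = np.arange(12).reshape(3, 4)
big = duplicate_pixels(img, k=2)   # shape (6, 8); no new intensity values
# Smoothing filters applied after duplication average across duplicated
# blocks, which is the interaction the paper's error analysis examines.
```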
Numerical analysis of helical dielectric elastomer actuator
NASA Astrophysics Data System (ADS)
Park, Jang Ho; Nair, Saurabh; Kim, Daewon
2017-04-01
Dielectric elastomer actuators (DEA) are known for their capability of experiencing extreme strains, as they can expand and contract based on the specific actuation voltage applied. In contrast, the helical DEA (HDEA), with its unique configuration, not only provides contractile and extendable capabilities but can also aid in attaining bending and torsion. The concept of the HDEA embraces many new techniques and can be applied in multiple disciplines. Thus, this paper focuses on the simulation of an HDEA with helical compliant electrodes, a major factor prior to its application. The attributes of the material used to build the structure play a vital role in the behavior of the system. For numerical analysis of the HDEA, the material characteristics are input into commercial-grade software, and the appropriate analysis is performed to retrieve its outcome. By applying the material characteristics in numerical analysis modeling, the functionality of the HDEA for various activations can be assessed and used to test the fabricated final product for compliance.
Tipping point analysis of atmospheric oxygen concentration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Livina, V. N.; Forbes, A. B.; Vaz Martins, T. M.
2015-03-15
We apply tipping point analysis to nine observational oxygen concentration records around the globe, analyse their dynamics, and perform projections under possible future scenarios that lead to oxygen deficiency in the atmosphere. The analysis is based on a statistical physics framework with stochastic modelling, in which we represent the observed data as a composition of deterministic and stochastic components estimated using Bayesian and wavelet techniques.
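As a flavor of tipping point analysis, one widely used indicator (not necessarily the authors' full Bayesian/wavelet machinery) is rising lag-1 autocorrelation in a sliding window, which signals critical slowing down before a transition. A self-contained sketch on a synthetic record:

```python
# Sliding-window lag-1 autocorrelation as an early-warning indicator.
import numpy as np

def sliding_ac1(x, window=200):
    """Lag-1 autocorrelation over a sliding window."""
    out = []
    for i in range(len(x) - window):
        w = x[i:i + window]
        out.append(np.corrcoef(w[:-1], w[1:])[0, 1])
    return np.array(out)

rng = np.random.default_rng(12)
# Synthetic record whose memory increases over time (approaching a transition).
x, phi = np.zeros(2000), np.linspace(0.1, 0.95, 2000)
for t in range(1, 2000):
    x[t] = phi[t] * x[t - 1] + rng.normal()

ac1 = sliding_ac1(x)
print(ac1[0], ac1[-1])   # autocorrelation drifts upward toward the transition
```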
Stability and Hopf bifurcation in a simplified BAM neural network with two time delays.
Cao, Jinde; Xiao, Min
2007-03-01
Various local periodic solutions may represent different classes of storage patterns or memory patterns, and they arise from the different equilibrium points of neural networks (NNs) through the Hopf bifurcation technique. In this paper, a bidirectional associative memory NN with four neurons and multiple delays is considered. By applying the normal form theory and the center manifold theorem, an analysis of its linear stability and Hopf bifurcation is performed. An algorithm is worked out for determining the direction and stability of the bifurcated periodic solutions. Numerical simulation results supporting the theoretical analysis are also given.
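The normal form and center manifold computations are symbolic, but the underlying phenomenon is easy to observe numerically. Below is a sketch of the model class only, a two-neuron delayed feedback loop integrated by Euler's method rather than the paper's four-neuron BAM network; a sufficiently large delay produces a sustained periodic orbit past the Hopf point, and all parameter values are illustrative.

```python
# Two-neuron delayed feedback loop; large delay yields sustained oscillation.
import numpy as np

def simulate(tau=1.8, a=2.0, b=-1.5, dt=1e-3, T=60.0):
    n, d = int(T / dt), int(tau / dt)
    x, y = np.zeros(n), np.zeros(n)
    x[:d + 1], y[:d + 1] = 0.1, -0.1          # constant initial history
    for t in range(d, n - 1):
        x[t + 1] = x[t] + dt * (-x[t] + a * np.tanh(y[t - d]))
        y[t + 1] = y[t] + dt * (-y[t] + b * np.tanh(x[t - d]))
    return x, y

x, y = simulate()
print(np.ptp(x[-10000:]))  # large peak-to-peak amplitude: post-Hopf limit cycle
```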
Development of a probabilistic analysis methodology for structural reliability estimation
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.
1991-01-01
A novel probabilistic analysis method for assessing structural reliability is presented, which combines fast convolution with an efficient structural reliability analysis. After identifying the most important point of a limit state, the method establishes a quadratic performance function, transforms the quadratic function into a linear one, and applies fast convolution. The method is applicable to problems requiring computer-intensive structural analysis. Five illustrative examples of the method's application are given.
Sciutto, Giorgia; Oliveri, Paolo; Catelli, Emilio; Bonacini, Irene
2017-01-01
In the field of applied research in heritage science, the use of multivariate approaches is still quite limited, and the chemometric results obtained are often underinterpreted. Within this scenario, the present paper aims at disseminating the use of suitable multivariate methodologies and proposes a procedural workflow applied to a representative group of case studies, of considerable importance for conservation purposes, as a guideline on the processing and interpretation of FTIR data. Initially, principal component analysis (PCA) is performed and the score values are converted into chemical maps. Successively, the brushing approach is applied, demonstrating its usefulness for a deep understanding of the relationships between the multivariate map and the PC score space, as well as for the identification of the spectral bands mainly involved in the definition of each area localised within the score maps. PMID:29333162
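The first step of the workflow, PCA scores folded back into chemical maps, can be sketched in a few lines on a simulated FTIR image cube; the cube dimensions and component count are illustrative assumptions.

```python
# PCA on pixel spectra, with scores reshaped to image coordinates ("maps").
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
h, w, n_wavenumbers = 32, 32, 400
cube = rng.normal(size=(h, w, n_wavenumbers))   # hypothetical FTIR image cube

X = cube.reshape(-1, n_wavenumbers)             # pixels x spectral channels
scores = PCA(n_components=3).fit_transform(X)
maps = scores.reshape(h, w, 3)                  # one score map per component
# Each maps[:, :, i] can be rendered as an image to localize chemical variation.
```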
Evaluating health service quality: using importance performance analysis.
Izadi, Azar; Jahani, Younes; Rafiei, Sima; Masoud, Ali; Vali, Leila
2017-08-14
Purpose: Measuring healthcare service quality provides an objective guide for managers and policy makers to improve their services and patient satisfaction. Consequently, the purpose of this paper is to measure the service quality provided to surgical and medical inpatients at Kerman Medical Sciences University (KUMS) in 2015. Design/methodology/approach: A descriptive-analytic study, using a cross-sectional method in the KUMS training hospitals, was implemented between October 2 and March 15, 2015. Using stratified random sampling, 268 patients were selected. Data were collected using an importance-performance analysis (IPA) questionnaire, which measures current performance and determines each item's importance from the patients' perspectives. These data indicate overall satisfaction and appropriate practical strategies for managers to plan accordingly. Findings: Findings revealed a significant gap between service importance and performance. From the patients' viewpoint, tangibility was the highest priority (mean=3.54), while reliability was given the highest performance (mean=3.02). The least important and lowest performance level was social accountability (mean=1.91 and 1.98, respectively). Practical implications: Healthcare managers should focus on patient viewpoints and apply patient comments to solve problems, improve service quality and patient satisfaction. Originality/value: The authors applied an IPA questionnaire to measure the service quality provided to surgical and medical ward patients. This method identifies and corrects service quality shortcomings, improving service recipients' perceptions.
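The IPA logic reduces to comparing each item's importance and performance means against the grand means and reading off a quadrant. In the sketch below, the tangibility, reliability, and social accountability means come from the abstract; the remaining values and item names are illustrative assumptions.

```python
# Importance-performance analysis: gap per item plus quadrant assignment.
import numpy as np

items = ["tangibility", "reliability", "responsiveness", "social accountability"]
importance = np.array([3.54, 3.10, 2.80, 1.91])    # partly illustrative means
performance = np.array([2.60, 3.02, 2.70, 1.98])

imp_c, perf_c = importance.mean(), performance.mean()
for item, i, p in zip(items, importance, performance):
    quadrant = ("Concentrate here" if i >= imp_c and p < perf_c else
                "Keep up the good work" if i >= imp_c else
                "Low priority" if p < perf_c else "Possible overkill")
    print(f"{item:22s} gap={i - p:+.2f}  ->  {quadrant}")
```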
A novel non-contact radar sensor for affective and interactive analysis.
Lin, Hong-Dun; Lee, Yen-Shien; Shih, Hsiang-Lan; Chuang, Bor-Nian
2013-01-01
Currently, many physiological signal sensing techniques are applied for affective analysis in Human-Computer Interaction applications. Most mature sensing methods (EEG, ECG, EMG, temperature, BP, etc.) rely on contact with the subject to obtain the desired physiological information for further data analysis. However, those methods can be inconvenient and uncomfortable, and are not easy to use for affective analysis during interactive performances. To address this issue, a novel technology based on low-power radar (Nanosecond Pulse Near-field Sensing, NPNS) operating at a 300 MHz radio frequency is proposed to detect the human pulse signal in a non-contact way for heartbeat signal extraction. In this paper, a modified nonlinear HRV calculation algorithm is also developed and applied to analyzing affective status using Peak-to-Peak Interval (PPI) information extracted from the detected pulse signal. The proposed affective analysis method is designed to continuously collect physiological signals, and was validated in a preliminary experiment with sound, light, and motion interactive performance. As a result, the mean bias between the PPI (from NPNS) and the RRI (from ECG) is less than 1 ms, and the correlation is over 0.88.
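The PPI extraction step can be sketched generically: detect pulse peaks and difference their times. The snippet below uses scipy's find_peaks on a synthetic pulse waveform; the sampling rate, refractory distance, and signal model are assumptions, not details of the NPNS hardware.

```python
# Peak detection on a pulse waveform, then peak-to-peak intervals for HRV.
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                   # assumed sampling rate, Hz
t = np.arange(0, 60, 1 / fs)
pulse = (np.sin(2 * np.pi * 1.2 * t)         # ~72 bpm synthetic pulse
         + 0.1 * np.random.default_rng(5).normal(size=t.size))

peaks, _ = find_peaks(pulse, distance=int(0.5 * fs))  # >= 0.5 s between beats
ppi_ms = np.diff(peaks) / fs * 1000          # peak-to-peak intervals in ms
print(f"mean PPI = {ppi_ms.mean():.1f} ms")  # feeds the HRV calculation
```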
NASA Astrophysics Data System (ADS)
Rasia, Rodolfo J.; Rasia-Valverde, Juana R.; Stoltz, Jean F.
1996-01-01
Laser backscattering is an excellent tool to investigate the size and concentration of suspended particles, and it has been successfully applied to the analysis of erythrocyte aggregation. A method is proposed that applies laser backscattering to evaluate the strength of immunologic erythrocyte agglutination by estimating the energy required for the mechanical dissociation of agglutinates. Mills and Snabre have proposed a theory of laser backscattering for erythrocyte aggregation analysis. It is applied here to analyze the dissociation process of erythrocyte agglutinates, performed by imposing a constant shear rate on the agglutinate suspension in a Couette viscometer until a dispersion of isolated red cells is attained. Experimental verifications of the method were performed on erythrocytes of the ABO group reacting against an anti-A test serum in twofold series dilutions. The energy expended is estimated by a numerical process carried out on the backscattered intensity data registered during mechanical dissociation. Velocities of agglutination and dissociation lead to the calculation of dissociation parameters. These values are used to evaluate the strength of the immunological reaction and to discriminate weak subgroups of the ABO system.
Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas
2016-01-01
Introduction: Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. Materials and Methods: We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health threatening events. Results: We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was no key success factor for situational awareness. Conclusions: We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped to interpret complex, mixed data on small-N cases and resulted in concrete and practically relevant findings. PMID:27182731
ERIC Educational Resources Information Center
Chen, Sheng-Tung; Kuo, Hsiao-I.; Chen, Chi-Chung
2012-01-01
The two-stage least squares approach together with quantile regression analysis is adopted here to estimate the educational production function. Such a methodology is able to capture the extreme behaviors of the two tails of students' performance and the estimation outcomes have important policy implications. Our empirical study is applied to the…
ERIC Educational Resources Information Center
Lee, Jeong W.
Quantitative financial measures were applied to evaluate the performance of the North Dakota Public Employee Retirement System (NDPERS) pension fund portfolios and the Teachers Insurance and Annuity Association (TIAA)/College Retirement Equities Fund (CREF) portfolios, thus providing a relative performance assessment. Ten years of data were…
Predicting Salesperson Performance: A Review of the Literature
1987-02-01
[Literature review fragment: journals searched include the Academy of Management Journal, Academy of Management Review, Journal of Marketing, Journal of Marketing Research, Industrial Marketing Management, Personnel Psychology, Journal of Vocational Behavior, Journal of Applied …, …Science Quarterly; cites "…, W., & Walker, O. C., Jr. (1985). The determinants of salesperson performance: A meta-analysis. Journal of Marketing Research, 22, 103-118."]
ERIC Educational Resources Information Center
Bransky, Judith; Qualter, Anne
1993-01-01
Describes the findings of secondary analysis of data from the Assessment of Performance Unit (APU) Science. The most striking feature of the study is the extremely low level of scores obtained for questions which invite a written response. The results also clearly show the consistent negative reaction of girls to the technical context of…
ERIC Educational Resources Information Center
Meredith, Tamara R.
2017-01-01
Facebook studio groups/pages are commonly used by applied music faculty to communicate with current students, recruit new students, share students' activities, and promote faculty members' professional performances and academic endeavors. However, the blurred lines between academic, professional performance, and social activities in the field have…
Non-destructive Analysis Reveals Effect of Installation Details on Plywood Siding Performance
Christopher G. Hunt; Gregory T. Schueneman; Steven Lacher; Xiping Wang; R. Sam Williams
2015-01-01
This study evaluated the influence of a variety of construction techniques on the performance of plywood siding and the applied paint, using both ultrasound and conventional visual inspection techniques. The impact of bottom edge contact, flashing vs. caulking board ends, priming the bottom edge, location (Wisconsin vs. Mississippi) and a gap behind the siding to...
NASA Astrophysics Data System (ADS)
Tolba, Khaled Ibrahim; Morgenthal, Guido
2018-01-01
This paper presents an analysis of the scalability and efficiency of a simulation framework based on the vortex particle method. The code is applied for the numerical aerodynamic analysis of line-like structures. The numerical code runs on multicore CPU and GPU architectures using the OpenCL framework. The focus of this paper is the analysis of the parallel efficiency and scalability of the method applied to an engineering test case, specifically the aeroelastic response of a long-span bridge girder at the construction stage. The target is to assess the optimal configuration and the required computer architecture, such that it becomes feasible to efficiently utilise the method within the computational resources available to a regular engineering office. The simulations and the scalability analysis are performed on a regular gaming-type computer.
High-performance concrete : applying life-cycle cost analysis and developing specifications.
DOT National Transportation Integrated Search
2016-12-01
Numerous studies and transportation agency experience across the nation have established that high-performance concrete (HPC) technology improves concrete quality and extends the service life of concrete structures at risk of chloride-induced cor…
Airport Landside. Volume I. Planning Guide.
DOT National Transportation Integrated Search
1982-01-01
This volume describes a methodology for performing airport landside planning by applying the Airport Landside Simulation Model (ALSIM) developed by TSC. For this analysis, the airport landside is defined as extending from the airport boundary to the ...
Purification of crime scene DNA extracts using centrifugal filter devices
2013-01-01
Background: The success of forensic DNA analysis is limited by the size, quality and purity of biological evidence found at crime scenes. Sample impurities can inhibit PCR, resulting in partial or negative DNA profiles. Various DNA purification methods are applied to remove impurities, for example, employing centrifugal filter devices. However, irrespective of method, DNA purification leads to DNA loss. Here we evaluate the filter devices Amicon Ultra 30 K and Microsep 30 K with respect to recovery rate and general performance for various types of PCR-inhibitory crime scene samples. Methods: Recovery rates for DNA purification using Amicon Ultra 30 K and Microsep 30 K were gathered using quantitative PCR. Mock crime scene DNA extracts were analyzed using quantitative PCR and short tandem repeat (STR) profiling to test the general performance and inhibitor-removal properties of the two filter devices. Additionally, the outcome of long-term routine casework DNA analysis applying each of the devices was evaluated. Results: Applying Microsep 30 K, 14 to 32% of the input DNA was recovered, whereas Amicon Ultra 30 K retained 62 to 70% of the DNA. The improved purity following filter purification counteracted some of this DNA loss, leading to slightly increased electropherogram peak heights for blood on denim (Amicon Ultra 30 K and Microsep 30 K) and saliva on envelope (Amicon Ultra 30 K). Comparing Amicon Ultra 30 K and Microsep 30 K for purification of DNA extracts from mock crime scene samples, the former generated significantly higher peak heights for rape case samples (P-values <0.01) and for hairs (P-values <0.036). In long-term routine use of the two filter devices, DNA extracts purified with Amicon Ultra 30 K were considerably less PCR-inhibitory in Quantifiler Human qPCR analysis compared to Microsep 30 K. Conclusions: Amicon Ultra 30 K performed better than Microsep 30 K due to higher DNA recovery and more efficient removal of PCR-inhibitory substances. The different performances of the filter devices are likely caused by the quality of the filters and plastic wares, for example, their DNA binding properties. DNA purification using centrifugal filter devices can be necessary for successful DNA profiling of impure crime scene samples and for consistency between different PCR-based analysis systems, such as quantification and STR analysis. In order to maximize the possibility of obtaining complete STR DNA profiles and to create an efficient workflow, the level of DNA purification applied should be matched to the inhibitor tolerance of the STR analysis system used. PMID:23618387
Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling
NASA Technical Reports Server (NTRS)
Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2005-01-01
Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
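The general pattern, propagating input uncertainties through a performance model by Monte Carlo sampling so the output is a distribution rather than a single value, can be sketched with a toy solar-array power model. This illustrates the probabilistic idea only, not NASA's SPACE model or its advanced probabilistic techniques; every number below is an assumption.

```python
# Monte Carlo propagation of input uncertainty through a toy power model.
import numpy as np

rng = np.random.default_rng(11)
n = 100_000
solar_flux = rng.normal(1361.0, 10.0, n)   # W/m^2, assumed uncertainty
area_eff = rng.normal(0.85, 0.02, n)       # effective array area fraction
efficiency = rng.normal(0.14, 0.005, n)    # cell efficiency
degradation = rng.uniform(0.95, 1.0, n)    # seasonal/aging factor

power = solar_flux * 100.0 * area_eff * efficiency * degradation  # 100 m^2 array
print(np.percentile(power, [5, 50, 95]))   # capability with uncertainty bounds
```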
A Bayesian hierarchical diffusion model decomposition of performance in Approach–Avoidance Tasks
Krypotos, Angelos-Miltiadis; Beckers, Tom; Kindt, Merel; Wagenmakers, Eric-Jan
2015-01-01
Common methods for analysing response time (RT) tasks, frequently used across different disciplines of psychology, suffer from a number of limitations such as the failure to directly measure the underlying latent processes of interest and the inability to take into account the uncertainty associated with each individual's point estimate of performance. Here, we discuss a Bayesian hierarchical diffusion model and apply it to RT data. This model allows researchers to decompose performance into meaningful psychological processes and to account optimally for individual differences and commonalities, even with relatively sparse data. We highlight the advantages of the Bayesian hierarchical diffusion model decomposition by applying it to performance on Approach–Avoidance Tasks, widely used in the emotion and psychopathology literature. Model fits for two experimental data-sets demonstrate that the model performs well. The Bayesian hierarchical diffusion model overcomes important limitations of current analysis procedures and provides deeper insight in latent psychological processes of interest. PMID:25491372
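Although the paper's contribution is the hierarchical Bayesian estimation, the generative process being fit is simple to simulate: evidence accumulates with drift v between two bounds separated by a, starting at z, and the crossing time plus non-decision time t0 gives the RT. A minimal Euler-Maruyama sketch with illustrative parameters:

```python
# Simulate single trials of a two-boundary drift-diffusion process.
import numpy as np

def simulate_ddm(v=0.25, a=1.5, z=0.75, t0=0.3, dt=1e-3, sigma=1.0, rng=None):
    """One trial: drift v, boundary separation a, start z, non-decision t0."""
    rng = rng or np.random.default_rng()
    x, t = z, 0.0
    while 0.0 < x < a:
        x += v * dt + sigma * np.sqrt(dt) * rng.normal()
        t += dt
    return t + t0, int(x >= a)   # (response time, 1 = upper bound reached)

rng = np.random.default_rng(6)
trials = [simulate_ddm(rng=rng) for _ in range(200)]
rts = [rt for rt, _ in trials]
print(f"mean RT = {np.mean(rts):.3f} s, "
      f"upper-bound rate = {np.mean([c for _, c in trials]):.2f}")
```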
Analysis of complex network performance and heuristic node removal strategies
NASA Astrophysics Data System (ADS)
Jahanpour, Ehsan; Chen, Xin
2013-12-01
Removing important nodes from complex networks is a great challenge in fighting against criminal organizations and preventing disease outbreaks. Six network performance metrics, including four new metrics, are applied to quantify networks' diffusion speed, diffusion scale, homogeneity, and diameter. In order to efficiently identify nodes whose removal maximally destroys a network, i.e., minimizes network performance, ten structured heuristic node removal strategies are designed using different node centrality metrics including degree, betweenness, reciprocal closeness, complement-derived closeness, and eigenvector centrality. These strategies are applied to remove nodes from the September 11, 2001 hijackers' network, and their performance is compared to that of a random strategy, which removes randomly selected nodes, and the locally optimal solution (LOS), which removes nodes to minimize network performance at each step. The computational complexity of the 11 strategies and LOS is also analyzed. Results show that the node removal strategies using degree and betweenness centralities are more efficient than the other strategies.
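One of the structured strategies is easy to sketch with networkx: repeatedly remove the current highest-betweenness node and track a performance proxy. The graph below is a synthetic stand-in for the hijackers' network, and the largest-component size stands in for the paper's six metrics.

```python
# Betweenness-based node removal on a synthetic network.
import networkx as nx

G = nx.barabasi_albert_graph(60, 2, seed=7)   # stand-in for the covert network

sizes = []
H = G.copy()
for _ in range(10):
    bc = nx.betweenness_centrality(H)         # recomputed after each removal
    H.remove_node(max(bc, key=bc.get))
    sizes.append(len(max(nx.connected_components(H), key=len)))

print(sizes)   # degradation curve under the betweenness-based strategy
```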
Relative performance of academic departments using DEA with sensitivity analysis.
Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S P
2009-05-01
The process of liberalization and globalization of the Indian economy has brought new opportunities and challenges in all areas of human endeavor, including education. Educational institutions have to adopt new strategies to make the best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of academic programs based on multiple criteria. Keeping this in view, this paper evaluates the performance efficiencies of 19 academic departments of IIT Roorkee (India) through the data envelopment analysis (DEA) technique. The technique has been used to assess the performance of academic institutions in a number of countries, such as the USA, UK, and Australia, but to the best of our knowledge this is its first application in the Indian context. Applying DEA models, we calculate technical, pure technical, and scale efficiencies and identify the reference sets for inefficient departments. Input and output projections are also suggested for inefficient departments to reach the frontier. Overall performance, research performance, and teaching performance are assessed separately using sensitivity analysis.
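The DEA computation itself is a small linear program per department. Below is a minimal input-oriented CCR sketch using scipy's linprog on made-up data (two inputs, two outputs, four departments); it illustrates the technical-efficiency score only, not the paper's pure technical/scale decomposition.

```python
# Input-oriented CCR DEA: for each DMU o, solve
#   min theta  s.t.  X @ lam <= theta * x_o,  Y @ lam >= y_o,  lam >= 0.
import numpy as np
from scipy.optimize import linprog

X = np.array([[20, 1.2], [35, 2.0], [15, 0.9], [40, 2.6]]).T  # inputs x DMUs
Y = np.array([[60, 30], [80, 55], [50, 28], [70, 60]]).T      # outputs x DMUs
m, n = X.shape
s = Y.shape[0]

for o in range(n):
    c = np.r_[1.0, np.zeros(n)]                 # decision vars: [theta, lam]
    A_in = np.hstack([-X[:, [o]], X])           # X @ lam - theta * x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -Y @ lam <= -y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(0, None)] * (n + 1))
    print(f"department {o}: technical efficiency = {res.x[0]:.3f}")
```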
Combination of PCA and LORETA for sources analysis of ERP data: an emotional processing study
NASA Astrophysics Data System (ADS)
Hu, Jin; Tian, Jie; Yang, Lei; Pan, Xiaohong; Liu, Jiangang
2006-03-01
The purpose of this paper is to study spatiotemporal patterns of neuronal activity in emotional processing through analysis of ERP data. 108 pictures (categorized as positive, negative, and neutral) were presented to 24 healthy, right-handed subjects while 128-channel EEG data were recorded. A two-step analysis was applied to the ERP data. First, principal component analysis was performed to obtain significant ERP components. Then LORETA was applied to each component to localize its brain sources. The first six principal components were extracted, each of which showed different spatiotemporal patterns of neuronal activity. The results agree with other emotion studies using fMRI or PET. The combination of PCA and LORETA can be used to analyze spatiotemporal patterns of ERP data in emotional processing.
Siddiqui, Mohd Maroof; Srivastava, Geetika; Saeed, Syed Hasan
2016-01-01
Insomnia is a sleep disorder in which the subject encounters problems in sleeping. The aim of this study is to identify insomnia events in normal and affected persons using time-frequency analysis of the PSD approach applied to EEG signals from the ROC-LOC channel. In this research article, the attributes and waveforms of human EEG signals are examined, and the results are expressed as a spectral analysis of the changes across the different stages of sleep. The PSD of each EEG segment is calculated for all stages of sleep. Results indicate the possibility of recognizing insomnia events based on the delta, theta, alpha, and beta segments of EEG signals.
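The band-power computation underlying such PSD analyses is standard: estimate the PSD with Welch's method and integrate over the delta, theta, alpha, and beta bands. A sketch on a synthetic epoch; the sampling rate, epoch length, and band edges are common conventions, not the study's stated settings.

```python
# Welch PSD of an EEG epoch, integrated over the classical frequency bands.
import numpy as np
from scipy.signal import welch

fs = 256
x = np.random.default_rng(8).normal(size=30 * fs)   # 30-s synthetic EEG epoch
f, pxx = welch(x, fs=fs, nperseg=4 * fs)

bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
for name, (lo, hi) in bands.items():
    mask = (f >= lo) & (f < hi)
    power = np.trapz(pxx[mask], f[mask])            # absolute band power
    print(f"{name:5s}: {power:.4f}")
```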
Unsupervised analysis of small animal dynamic Cerenkov luminescence imaging
NASA Astrophysics Data System (ADS)
Spinelli, Antonello E.; Boschi, Federico
2011-12-01
Clustering analysis (CA) and principal component analysis (PCA) were applied to dynamic Cerenkov luminescence images (dCLI). In order to investigate the performance of the proposed approaches, two distinct dynamic data sets obtained by injecting mice with 32P-ATP and 18F-FDG were acquired using the IVIS 200 optical imager. The k-means clustering algorithm was applied to dCLI and was implemented using Interactive Data Language 8.1. We show that cluster analysis allows us to obtain good agreement between the clustered regions and the corresponding emission regions, such as the bladder, the liver, and the tumor. We also show a good correspondence between the time-activity curves of the different regions obtained using CA and those from manual region-of-interest analysis on the dCLI and PCA images. We conclude that CA provides an automatic unsupervised method for the analysis of preclinical dynamic Cerenkov luminescence image data.
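The clustering step treats each pixel's time-activity curve as a feature vector. A sketch using scikit-learn's k-means on a synthetic dynamic image stack (the study used IDL; the frame count and cluster number here are illustrative):

```python
# k-means on pixel time-activity curves from a dynamic image stack.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(9)
h, w, n_frames = 40, 40, 20
frames = rng.poisson(5.0, size=(n_frames, h, w)).astype(float)  # synthetic dCLI

tacs = frames.reshape(n_frames, -1).T        # pixels x time points
labels = KMeans(n_clusters=4, n_init=10).fit_predict(tacs)
cluster_map = labels.reshape(h, w)           # render to compare with organs
mean_tacs = [tacs[labels == k].mean(axis=0) for k in range(4)]  # per-cluster TAC
```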
Using Solid State Disk Array as a Cache for LHC ATLAS Data Analysis
NASA Astrophysics Data System (ADS)
Yang, W.; Hanushevsky, A. B.; Mount, R. P.; Atlas Collaboration
2014-06-01
User data analysis in high energy physics presents a challenge to spinning-disk based storage systems. The analysis is data intense, yet reads are small, sparse and cover a large volume of data files. It is also unpredictable due to users' response to storage performance. We describe here a system with an array of Solid State Disk as a non-conventional, standalone file level cache in front of the spinning disk storage to help improve the performance of LHC ATLAS user analysis at SLAC. The system uses several days of data access records to make caching decisions. It can also use information from other sources such as a work-flow management system. We evaluate the performance of the system both in terms of caching and its impact on user analysis jobs. The system currently uses Xrootd technology, but the technique can be applied to any storage system.
Dynamic and thermal analysis of high speed tapered roller bearings under combined loading
NASA Technical Reports Server (NTRS)
Crecelius, W. J.; Milke, D. R.
1973-01-01
The development of a computer program capable of predicting the thermal and kinetic performance of high-speed tapered roller bearings operating with fluid lubrication under applied axial, radial and moment loading (five degrees of freedom) is detailed. Various methods of applying lubrication can be considered as well as changes in bearing internal geometry which occur as the bearing is brought to operating speeds, loads and temperatures.
Using Decision Analysis to Select Facility Maintenance Management Information Systems
2010-03-01
…efficient way possible. Many of today's maintenance managers thus apply computerized tools that come in the form of information systems that assist in… …apply to effectively select a maintenance management information system that enables them to meet the needs of their customers. …recession of the early 1990s. During this time, companies downsized their white-collar workforce performing daily operation and maintenance functions
Incorporating Resilience into Transportation Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connelly, Elizabeth; Melaina, Marc
To aid decision making for developing transportation infrastructure, the National Renewable Energy Laboratory has developed the Scenario Evaluation, Regionalization and Analysis (SERA) model. The SERA model is a geospatially and temporally oriented model that has been applied to determine optimal production and delivery scenarios for hydrogen, given resource availability and technology cost and performance, for use in fuel cell vehicles. In addition, the SERA model has been applied to plug-in electric vehicles.
NASA Astrophysics Data System (ADS)
Asztalos, Stephen J.; Hennig, Wolfgang; Warburton, William K.
2016-01-01
Pulse shape discrimination applied to certain fast scintillators is usually performed offline. In sufficiently high-event-rate environments, data transfer and storage become problematic, which suggests a different analysis approach. In response, we have implemented a general purpose pulse shape analysis algorithm in the XIA Pixie-500 and Pixie-500 Express digital spectrometers. In this implementation waveforms are processed in real time, reducing the pulse characteristics to a few pulse shape analysis parameters and eliminating time-consuming waveform transfer and storage. We discuss the implementation of these features, their advantages, the necessary trade-offs, and their performance. Measurements from bench-top and experimental setups using fast scintillators and XIA processors are presented.
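A common pulse shape analysis reduction for fast scintillators, not necessarily the exact algorithm in the XIA firmware, is the charge-comparison ratio of the tail integral to the total integral, which separates particle types by decay shape. A minimal sketch on synthetic pulses:

```python
# Charge-comparison pulse shape discrimination: tail-to-total integral ratio.
import numpy as np

def tail_total_ratio(pulse, peak_offset=5):
    """Ratio of the integral after the peak region to the total integral."""
    peak = np.argmax(pulse)
    return pulse[peak + peak_offset:].sum() / pulse.sum()

t = np.arange(200)
fast = np.exp(-t / 10.0)                                   # gamma-like pulse
slow = 0.7 * np.exp(-t / 10.0) + 0.3 * np.exp(-t / 80.0)   # neutron-like tail
print(tail_total_ratio(fast), tail_total_ratio(slow))      # slow ratio is larger
```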
The research of suspen-dome structure
NASA Astrophysics Data System (ADS)
Gong, Shengyuan
2017-09-01
By overcoming the shortcomings of the single-layer latticed shell and the cable dome while inheriting their advantages, the suspen-dome has become recognized and applied as a new type of prestressed large-span space structure. Based on an analysis of the background and mechanical principles, research on the suspen-dome is reviewed, including form-finding analysis, static and stability analysis, dynamic and earthquake-resistant behavior, prestressing force analysis and optimization design, and the current state of research on fire-resistant design. This paper summarizes the methods of these studies as a reference for further structural performance research and structural engineering applications.
Comparative Analysis of English Language Students' School Paths at a Mexican University
ERIC Educational Resources Information Center
Robelo, Octaviano García; Marquez, Jorge Hernández; Pérez, Ileana Casasola
2017-01-01
Seven factors related to the academic paths of students in the Bachelor of English Language program at a public university in Mexico are investigated. Using a non-experimental descriptive design, a Likert scale was applied to evaluate the college students' perception of these factors. A comparative analysis between three types of school paths was performed. It…
ERIC Educational Resources Information Center
Yan, Xi
2014-01-01
This paper explores the ideologies of English in China through a meta-discursive analysis of Chinese netizens' comments on the performance of English by Huang Xiaoming, a famous Chinese actor. By applying Park and Wee's framework for analysing ideological evaluations of appropriation (i.e. ideologies of allegiance, competence, and authenticity) to…
ERIC Educational Resources Information Center
Cowan, Richard J.; Abel, Leah; Candel, Lindsay
2017-01-01
We conducted a meta-analysis of single-subject research studies investigating the effectiveness of antecedent strategies grounded in behavioral momentum for improving compliance and on-task performance for students with autism. First, we assessed the research rigor of those studies meeting our inclusionary criteria. Next, in order to apply a…
Water quality parameter measurement using spectral signatures
NASA Technical Reports Server (NTRS)
White, P. E.
1973-01-01
Regression analysis is applied to the problem of measuring water quality parameters from remote sensing spectral signature data. The equations necessary to perform regression analysis are presented and methods of testing the strength and reliability of a regression are described. An efficient algorithm for selecting an optimal subset of the independent variables available for a regression is also presented.
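The report's specific selection algorithm is not reproduced in the abstract, so the following sketch substitutes a brute-force best-subset search scored by adjusted R-squared, which captures the same goal (choosing an optimal subset of spectral predictors) at small scale. All names and limits are hypothetical:

```python
import numpy as np
from itertools import combinations

def best_subset(X, y, max_vars=3):
    """Exhaustive best-subset linear regression, scored by adjusted R^2.

    A brute-force stand-in for an efficient subset-selection algorithm;
    feasible only for a modest number of spectral bands.
    """
    n, p = X.shape
    best_cols, best_score = None, -np.inf
    ss_tot = float(((y - y.mean()) ** 2).sum())
    for k in range(1, max_vars + 1):
        for cols in combinations(range(p), k):
            A = np.column_stack([np.ones(n), X[:, cols]])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ coef
            r2 = 1.0 - float(resid @ resid) / ss_tot
            r2_adj = 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)
            if r2_adj > best_score:
                best_cols, best_score = cols, r2_adj
    return best_cols, best_score
```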
Soft X-ray astronomy using grazing incidence optics
NASA Technical Reports Server (NTRS)
Davis, John M.
1989-01-01
The instrumental background of X-ray astronomy with an emphasis on high resolution imagery is outlined. Optical and system performance, in terms of resolution, are compared and methods for improving the latter in finite length instruments described. The method of analysis of broadband images to obtain diagnostic information is described and is applied to the analysis of coronal structures.
Utility of KTEA-3 Error Analysis for the Diagnosis of Specific Learning Disabilities
ERIC Educational Resources Information Center
Flanagan, Dawn P.; Mascolo, Jennifer T.; Alfonso, Vincent C.
2017-01-01
Through the use of excerpts from one of our own case studies, this commentary applied concepts inherent in, but not limited to, the neuropsychological literature to the interpretation of performance on the Kaufman Tests of Educational Achievement-Third Edition (KTEA-3), particularly at the level of error analysis. The approach to KTEA-3 test…
ERIC Educational Resources Information Center
Tyner, Bryan C.; Fienup, Daniel M.
2016-01-01
Task analyses are ubiquitous to applied behavior analysis interventions, yet little is known about the factors that make them effective. Numerous task analyses have been published in behavior analytic journals for constructing single-subject design graphs; however, learner outcomes using these task analyses may fall short of what could be…
Martin, Steve; Nutley, Sandra; Downe, James; Grace, Clive
2016-03-01
Approaches to performance assessment have been described as 'performance regimes', but there has been little analysis of what is meant by this concept and whether it has any real value. We draw on four perspectives on regimes - 'institutions and instruments', 'risk regulation regimes', 'internal logics and effects' and 'analytics of government' - to explore how the concept of a multi-dimensional regime can be applied to performance assessment in public services. We conclude that the concept is valuable. It helps to frame comparative and longitudinal analyses of approaches to performance assessment and draws attention to the ways in which public service performance regimes operate at different levels, how they change over time and what drives their development. Areas for future research include analysis of the impacts of performance regimes and interactions between their visible features (such as inspections, performance indicators and star ratings) and the veiled rationalities which underpin them.
Spatial analysis of groundwater levels using Fuzzy Logic and geostatistical tools
NASA Astrophysics Data System (ADS)
Theodoridou, P. G.; Varouchakis, E. A.; Karatzas, G. P.
2017-12-01
The spatial variability evaluation of the water table of an aquifer provides useful information in water resources management plans. Geostatistical methods are often employed to map the free surface of an aquifer. In geostatistical analysis using Kriging techniques the selection of the optimal variogram is very important for the optimal method performance. This work compares three different criteria to assess the theoretical variogram that fits to the experimental one: the Least Squares Sum method, the Akaike Information Criterion and the Cressie's Indicator. Moreover, variable distance metrics such as the Euclidean, Minkowski, Manhattan, Canberra and Bray-Curtis are applied to calculate the distance between the observation and the prediction points, that affects both the variogram calculation and the Kriging estimator. A Fuzzy Logic System is then applied to define the appropriate neighbors for each estimation point used in the Kriging algorithm. The two criteria used during the Fuzzy Logic process are the distance between observation and estimation points and the groundwater level value at each observation point. The proposed techniques are applied to a data set of 250 hydraulic head measurements distributed over an alluvial aquifer. The analysis showed that the Power-law variogram model and Manhattan distance metric within ordinary kriging provide the best results when the comprehensive geostatistical analysis process is applied. On the other hand, the Fuzzy Logic approach leads to a Gaussian variogram model and significantly improves the estimation performance. The two different variogram models can be explained in terms of a fractional Brownian motion approach and of aquifer behavior at local scale. Finally, maps of hydraulic head spatial variability and of predictions uncertainty are constructed for the area with the two different approaches comparing their advantages and drawbacks.
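As a rough illustration of the workflow described, the sketch below computes an experimental semivariogram under the Manhattan distance metric and fits the power-law model by least squares. It is a minimal sketch under assumed bin counts and starting values, not the authors' full Kriging-plus-Fuzzy-Logic pipeline:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.spatial.distance import pdist

def experimental_variogram(coords, values, n_bins=15, metric="cityblock"):
    """Experimental semivariogram of hydraulic heads; metric='cityblock'
    is the Manhattan distance the study found best within ordinary kriging."""
    v = np.asarray(values, dtype=float)[:, None]
    h = pdist(coords, metric=metric)                       # pairwise lags
    g = 0.5 * pdist(v, metric="sqeuclidean")               # semivariances
    edges = np.linspace(0.0, h.max(), n_bins + 1)
    idx = np.digitize(h, edges) - 1
    lags, gamma = [], []
    for i in range(n_bins):
        m = idx == i
        if m.any():                                        # skip empty bins
            lags.append(h[m].mean())
            gamma.append(g[m].mean())
    return np.array(lags), np.array(gamma)

def power_model(h, c, a):
    """Power-law variogram c*h**a (0 < a < 2), consistent with the
    fractional-Brownian-motion interpretation discussed in the paper."""
    return c * h ** a

# usage (coords: (N, 2) array, heads: (N,) hydraulic head values):
# lags, gamma = experimental_variogram(coords, heads)
# (c, a), _ = curve_fit(power_model, lags, gamma, p0=(1.0, 1.0))
```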
Intraoral distalizer effects with conventional and skeletal anchorage: a meta-analysis.
Grec, Roberto Henrique da Costa; Janson, Guilherme; Branco, Nuria Castello; Moura-Grec, Patrícia Garcia; Patel, Mayara Paim; Castanha Henriques, José Fernando
2013-05-01
The aims of this meta-analysis were to quantify and to compare the amounts of distalization and anchorage loss of conventional and skeletal anchorage methods in the correction of Class II malocclusion with intraoral distalizers. The literature was searched through 5 electronic databases, and inclusion criteria were applied. Articles that presented pretreatment and posttreatment cephalometric values were preferred. Quality assessments of the studies were performed. The averages and standard deviations of molar and premolar effects were extracted from the studies to perform a meta-analysis. After applying the inclusion and exclusion criteria, 40 studies were included in the systematic review. After the quality analysis, 2 articles were classified as high quality, 27 as medium quality, and 11 as low quality. For the meta-analysis, 6 studies were included, and they showed average molar distalization amounts of 3.34 mm with conventional anchorage and 5.10 mm with skeletal anchorage. The meta-analysis of premolar movement showed estimates of combined effects of 2.30 mm (mesialization) in studies with conventional anchorage and -4.01 mm (distalization) in studies with skeletal anchorage. There was scientific evidence that both anchorage systems are effective for distalization; however, with skeletal anchorage, there was no anchorage loss when direct anchorage was used. Copyright © 2013 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Khatri, Kshitij; Pu, Yi; Klein, Joshua A.; Wei, Juan; Costello, Catherine E.; Lin, Cheng; Zaia, Joseph
2018-04-01
Analysis of singly glycosylated peptides has evolved to a point where large-scale LC-MS analyses can be performed at almost the same scale as proteomics experiments. While collisionally activated dissociation (CAD) remains the mainstay of bottom-up analyses, it performs poorly for the middle-down analysis of multiply glycosylated peptides. With improvements in instrumentation, electron-activated dissociation (ExD) modes are becoming increasingly prevalent for proteomics experiments and for the analysis of fragile modifications such as glycosylation. While these methods have been applied for glycopeptide analysis in isolated studies, an organized effort to compare their efficiencies, particularly for analysis of multiply glycosylated peptides (termed here middle-down glycoproteomics), has not been made. We therefore compared the performance of different ExD modes for middle-down glycopeptide analyses. We identified key features among the different dissociation modes and show that increased electron energy and supplemental activation provide the most useful data for middle-down glycopeptide analysis.
Apollo/Skylab suit program management systems study. Volume 2: Cost analysis
NASA Technical Reports Server (NTRS)
1974-01-01
The business management methods employed in the performance of the Apollo-Skylab Suit Program are studied. The data accumulated over the span of the contract, as well as the methods used to accumulate the data, are examined. Management methods associated with monitoring and controlling the resources applied to contract performance are also studied, and recommendations are made. The primary objective is the compilation, analysis, and presentation of historical cost performance criteria. Cost data are depicted for all phases of the Apollo-Skylab program in common, meaningful terms, so that the data may be applicable to future suit program planning efforts.
An analysis of the ArcCHECK-MR diode array's performance for ViewRay quality assurance.
Ellefson, Steven T; Culberson, Wesley S; Bednarz, Bryan P; DeWerd, Larry A; Bayouth, John E
2017-07-01
The ArcCHECK-MR diode array utilizes a correction system with a virtual inclinometer to correct the angular response dependencies of the diodes. However, this correction system cannot be applied to measurements on the ViewRay MR-IGRT system due to the virtual inclinometer's incompatibility with the ViewRay's multiple simultaneous beams. Additionally, the ArcCHECK's current correction factors were determined without magnetic field effects taken into account. In the course of performing ViewRay IMRT quality assurance with the ArcCHECK, measurements were observed to be consistently higher than the ViewRay TPS predictions. The goals of this study were to quantify the observed discrepancies and test whether applying the current factors improves the ArcCHECK's accuracy for measurements on the ViewRay. Gamma and frequency analysis were performed on 19 ViewRay patient plans. Ion chamber measurements were performed at a subset of diode locations using a PMMA phantom with the same dimensions as the ArcCHECK. A new method for applying directionally dependent factors utilizing beam information from the ViewRay TPS was developed in order to analyze the current ArcCHECK correction factors. To test the current factors, nine ViewRay plans were altered to be delivered with only a single simultaneous beam and were measured with the ArcCHECK. The current correction factors were applied using both the new and current methods. The new method was also used to apply corrections to the original 19 ViewRay plans. It was found the ArcCHECK systematically reports doses higher than those actually delivered by the ViewRay. Application of the current correction factors by either method did not consistently improve measurement accuracy. As dose deposition and diode response have both been shown to change under the influence of a magnetic field, it can be concluded the current ArcCHECK correction factors are invalid and/or inadequate to correct measurements on the ViewRay system. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
NASA Technical Reports Server (NTRS)
Baumeister, Joseph F.
1990-01-01
Analysis of energy emitted from simple or complex cavity designs can lead to intricate solutions due to nonuniform radiosity and irradiation within a cavity. A numerical ray tracing technique was applied to simulate radiation propagating within and from various cavity designs. To obtain the energy balance relationships between isothermal and nonisothermal cavity surfaces and space, the computer code NEVADA was utilized for its statistical technique applied to numerical ray tracing. The analysis method was validated by comparing results with known theoretical and limiting solutions, and the electrical resistance network method. In general, for nonisothermal cavities the performance (apparent emissivity) is a function of cylinder length-to-diameter ratio, surface emissivity, and cylinder surface temperatures. The extent of nonisothermal conditions in a cylindrical cavity significantly affects the overall cavity performance. Results are presented over a wide range of parametric variables for use as a possible design reference.
NASA Technical Reports Server (NTRS)
Stokes, LeBarian
2009-01-01
This procedure establishes a system for performing testing in the Six-Degree-Of-Freedom Dynamic Test System (SDTS). Testing includes development and verification testing of customer supplied Test Articles (TAs) and other testing requirements, as requested. This procedure applies to all SDTS testing operations and equipment. The procedure provides an overview of testing performed in the SDTS including test identification requirements, test planning and procedure development, test and performance inspection, test data analysis, and test report generation.
The Thistle Field - Analysis of its past performance and optimisation of its future development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bayat, M.G.; Tehrani, D.H.
1985-01-01
The Thistle Field geology and its reservoir performance over the past six years have been reviewed. The latest reservoir simulation study of the field, covering the performance history-matching, and the conclusions of various prediction cases are reported. The special features of PORES, Britoil in-house 3D 3-phase fully implicit numerical simulator and its modeling aids as applied to the Thistle Field are presented.
Ruijter, Jan M; Pfaffl, Michael W; Zhao, Sheng; Spiess, Andrej N; Boggy, Gregory; Blom, Jochen; Rutledge, Robert G; Sisti, Davide; Lievens, Antoon; De Preter, Katleen; Derveaux, Stefaan; Hellemans, Jan; Vandesompele, Jo
2013-01-01
RNA transcripts such as mRNA or microRNA are frequently used as biomarkers to determine disease state or response to therapy. Reverse transcription (RT) in combination with quantitative PCR (qPCR) has become the method of choice to quantify small amounts of such RNA molecules. In parallel with the democratization of RT-qPCR and its increasing use in biomedical research or biomarker discovery, we witnessed a growth in the number of gene expression data analysis methods. Most of these methods are based on the principle that the position of the amplification curve with respect to the cycle-axis is a measure for the initial target quantity: the later the curve, the lower the target quantity. However, most methods differ in the mathematical algorithms used to determine this position, as well as in the way the efficiency of the PCR reaction (the fold increase of product per cycle) is determined and applied in the calculations. Moreover, there is dispute about whether the PCR efficiency is constant or continuously decreasing. Together, this has led to the development of different methods to analyze amplification curves. In published comparisons of these methods, available algorithms were typically applied in a restricted or outdated way, which does not do them justice. Therefore, we aimed to develop a framework for robust and unbiased assessment of curve analysis performance whereby various publicly available curve analysis methods were thoroughly compared using a previously published large clinical data set (Vermeulen et al., 2009) [11]. The original developers of these methods applied their algorithms and are co-authors of this study. We assessed the curve analysis methods' impact on transcriptional biomarker identification in terms of expression level, statistical significance, and patient-classification accuracy. The concentration series per gene, together with data sets from unpublished technical performance experiments, were analyzed in order to assess the algorithms' precision, bias, and resolution. While large differences exist between methods when considering the technical performance experiments, most methods perform relatively well on the biomarker data. The data and the analysis results per method are made available to serve as a benchmark for further development and evaluation of qPCR curve analysis methods (http://qPCRDataMethods.hfrc.nl). Copyright © 2012 Elsevier Inc. All rights reserved.
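Most of the compared methods reduce to fitting the exponential (log-linear) phase of the amplification curve; a minimal sketch of such a "window-of-linearity" efficiency estimate follows. The fixed window length is an assumption for illustration, and the compared implementations differ precisely in how this fit is chosen and used:

```python
import numpy as np

def pcr_efficiency(fluorescence, window=5):
    """Estimate amplification efficiency from the exponential phase.

    Fits log10(F) vs. cycle over the steepest `window` cycles, a
    simplified 'window-of-linearity' approach.  The returned efficiency
    E = 10**slope is the fold increase of product per cycle (max 2).
    """
    f = np.asarray(fluorescence, dtype=float)
    logf = np.log10(np.clip(f, 1e-12, None))
    slopes = [np.polyfit(np.arange(window), logf[i:i + window], 1)[0]
              for i in range(len(f) - window)]
    i0 = int(np.argmax(slopes))        # steepest log-linear stretch
    return 10.0 ** slopes[i0]          # e.g. 1.9 means 90% efficient
```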
Aguilar, María Esther Urrutia; Rosas, Efrén Raúl Ponce; León, Silvia Ortiz; Ochoa, Laura Peñaloza; Guzmán, Rosalinda Guevara
2017-01-01
To identify and compare the predictive factors associated with the academic performance of medical students taking cellular biology and human histology, and of physiotherapy students taking molecular, cellular and tissue biology. An academic follow-up was carried out during the school year. Instruments assessing previous knowledge, vocation, and psychological and coping factors were applied at the beginning of the school year, and the last two were applied twice more afterwards. Data were analyzed using descriptive, comparative, correlational and predictive statistics. The students' participation was voluntary and data confidentiality was safeguarded. Copyright: © 2017 Secretaría de Salud
Preprocessing of 2-Dimensional Gel Electrophoresis Images Applied to Proteomic Analysis: A Review.
Goez, Manuel Mauricio; Torres-Madroñero, Maria Constanza; Röthlisberger, Sarah; Delgado-Trejos, Edilson
2018-02-01
Various methods and specialized software programs are available for processing two-dimensional gel electrophoresis (2-DGE) images. However, due to the anomalies present in these images, a reliable, automated, and highly reproducible system for 2-DGE image analysis has still not been achieved. The most common anomalies found in 2-DGE images include vertical and horizontal streaking, fuzzy spots, and background noise, which greatly complicate computational analysis. In this paper, we review the preprocessing techniques applied to 2-DGE images for noise reduction, intensity normalization, and background correction. We also present a quantitative comparison of non-linear filtering techniques applied to synthetic gel images, through analyzing the performance of the filters under specific conditions. Synthetic proteins were modeled into a two-dimensional Gaussian distribution with adjustable parameters for changing the size, intensity, and degradation. Three types of noise were added to the images: Gaussian, Rayleigh, and exponential, with signal-to-noise ratios (SNRs) ranging 8-20 decibels (dB). We compared the performance of wavelet, contourlet, total variation (TV), and wavelet-total variation (WTTV) techniques using parameters SNR and spot efficiency. In terms of spot efficiency, contourlet and TV were more sensitive to noise than wavelet and WTTV. Wavelet worked the best for images with SNR ranging 10-20 dB, whereas WTTV performed better with high noise levels. Wavelet also presented the best performance with any level of Gaussian noise and low levels (20-14 dB) of Rayleigh and exponential noise in terms of SNR. Finally, the performance of the non-linear filtering techniques was evaluated using a real 2-DGE image with previously identified proteins marked. Wavelet achieved the best detection rate for the real image. Copyright © 2018 Beijing Institute of Genomics, Chinese Academy of Sciences and Genetics Society of China. Production and hosting by Elsevier B.V. All rights reserved.
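Of the compared filters, the wavelet route is the easiest to sketch. The snippet below applies universal-threshold soft shrinkage with PyWavelets and computes the SNR measure used to grade the synthetic gels; the wavelet family, decomposition level, and threshold rule are illustrative assumptions rather than the paper's exact settings:

```python
import numpy as np
import pywt

def wavelet_denoise(img, wavelet="db2", level=2):
    """Soft-threshold wavelet denoising of a 2-DGE image (sketch of the
    'wavelet' technique in the comparison)."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # noise sigma from the finest diagonal subband (robust MAD estimate)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    t = sigma * np.sqrt(2.0 * np.log(img.size))   # universal threshold
    out = [coeffs[0]] + [tuple(pywt.threshold(c, t, "soft") for c in lvl)
                         for lvl in coeffs[1:]]
    return pywt.waverec2(out, wavelet)

def snr_db(clean, noisy):
    """SNR in dB, as used for the 8-20 dB synthetic-gel conditions."""
    err = clean - noisy
    return 10.0 * np.log10((clean ** 2).sum() / (err ** 2).sum())
```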
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Xi; School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332; Thadesar, Paragkumar A.
2014-09-15
In-situ microscale thermomechanical strain measurements have been performed in combination with synchrotron x-ray microdiffraction to understand the fundamental cause of failures in microelectronics devices with through-silicon vias. The physics behind the raster scan and data analysis of the measured strain distribution maps is explored utilizing the energies of indexed reflections from the measured data and applying them for beam intensity analysis and effective penetration depth determination. Moreover, a statistical analysis is performed for the beam intensity and strain distributions along the beam penetration path to account for the factors affecting peak search and strain refinement procedure.
NASA Technical Reports Server (NTRS)
Clare, L. P.; Yan, T.-Y.
1985-01-01
The analysis of the ALOHA random access protocol for communications channels with fading is presented. The protocol is modified to send multiple contiguous copies of a message at each transmission attempt. Both pure and slotted ALOHA channels are considered. A general two state model is used for the channel error process to account for the channel fading memory. It is shown that greater throughput and smaller delay may be achieved using repetitions. The model is applied to the analysis of the delay-throughput performance in a fading mobile communications environment. Numerical results are given for NASA's Mobile Satellite Experiment.
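The closed-form analysis is not reproduced in the abstract, but the mechanism (k contiguous copies per message over a two-state fading channel) is easy to check by simulation. The following Monte Carlo sketch estimates slotted-ALOHA throughput under assumed Gilbert-channel transition probabilities; all parameter values are illustrative, not those of the MSAT-X study:

```python
import numpy as np

rng = np.random.default_rng(0)

def aloha_throughput(G=0.5, copies=2, p_gb=0.1, p_bg=0.3, slots=50_000):
    """Monte Carlo throughput (messages/slot) of slotted ALOHA with
    `copies` contiguous repetitions per message on a two-state
    (good/bad) channel.  A message succeeds if at least one copy is
    collision-free and transmitted while the channel is good."""
    arrivals = rng.poisson(G / copies, slots)   # new messages per slot
    busy = np.zeros(slots + copies, dtype=int)  # copies occupying each slot
    good = np.ones(slots + copies, dtype=bool)  # Gilbert channel state
    for t in range(1, slots + copies):
        good[t] = rng.random() > (p_gb if good[t - 1] else 1.0 - p_bg)
    for t in range(slots):                      # lay down all transmissions
        busy[t:t + copies] += arrivals[t]       # each message sends k copies
    ok = 0
    for t in range(slots):
        for _ in range(arrivals[t]):
            # a copy in slot t+i succeeds if it is alone and channel is good
            if any(busy[t + i] == 1 and good[t + i] for i in range(copies)):
                ok += 1
    return ok / slots
```

Sweeping `copies` at fixed total channel load reproduces the qualitative result: on a bursty channel, repetitions can raise throughput and cut delay.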
Process safety improvement--quality and target zero.
Van Scyoc, Karl
2008-11-15
Process safety practitioners have adopted quality management principles in the design of process safety management systems with positive effect, yet achieving safety objectives sometimes remains a distant target. Companies regularly apply tools and methods which have roots in quality and productivity improvement. The "plan, do, check, act" improvement loop, statistical analysis of incidents (non-conformities), and performance trending popularized by Dr. Deming are now commonly used in the context of process safety. Significant advancements in HSE performance are reported after applying methods viewed as fundamental for quality management. In pursuit of continual process safety improvement, the paper examines various quality improvement methods, and explores how methods intended for product quality can be additionally applied to continual improvement of process safety. Methods such as Kaizen, Poka-yoke, and TRIZ, while long established for quality improvement, are quite unfamiliar in the process safety arena. These methods are discussed for application in improving both process safety leadership and field work team performance. Practical ways to advance process safety, based on the methods, are given.
Nichitayio, M Yu; Bazyak, A M; Klochan, V V; Grusha, P K; Goman, A V
2015-02-01
A comparative analysis of the results of laser diode (wavelength 940 nm) and electrocoagulation application while performing laparoscopic cholecystectomy was conducted. For acute calculous cholecystitis, 52 patients were operated on with the laser applied instead of electrocoagulation; this reduced the thermal impact on tissues, produced no complications, and reduced the duration of postoperative in-hospital treatment from (5.2 ± 1.2) to (4.9 ± 0.6) days.
CSM Testbed Development and Large-Scale Structural Applications
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.
1989-01-01
A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.
ERIC Educational Resources Information Center
Huang, Wen-Hao David; Hood, Denice Ward; Yoo, Sun Joo
2014-01-01
Web 2.0 applications have been widely applied for teaching and learning in US higher education in recent years. Their potential impact on learning motivation and learner performance, however, has not attracted substantial research efforts. To better understand how Web 2.0 applications might impact learners' motivation in higher education…
NASA Astrophysics Data System (ADS)
Azougagh, Yassine; Benhida, Khalid; Elfezazi, Said
2016-02-01
In this paper, the focus is on studying the performance of complex systems in a supply chain context by developing a structured modelling approach based on the ASDI methodology (Analysis, Specification, Design and Implementation), combining Petri net modelling with simulation in ARENA. The linear approach typically followed for this kind of problem faces modelling difficulties due to the complexity and the number of parameters involved. The approach used in this work therefore structures the modelling so as to cover all aspects of the performance study. The structured modelling approach is first introduced and then applied to an industrial system in the phosphate field. Results of the performance indicators obtained from the developed models made it possible to test the behaviour and fluctuations of this system and to develop improved models of the current situation. In addition, this paper shows how the ARENA software can be used to simulate complex systems effectively. The method in this research can be applied to investigate various improvement scenarios and their consequences before implementing them in reality.
Analysis Method for Non-Nominal First Acquisition
NASA Technical Reports Server (NTRS)
Sieg, Detlef; Mugellesi-Dow, Roberta
2007-01-01
First, this paper describes how the launcher trajectory can be modelled for contingency analysis without much information about the launch vehicle itself. From a dense sequence of state vectors a velocity profile is derived which is sufficiently accurate to enable the Flight Dynamics Team to integrate parts of the launcher trajectory on its own and to simulate contingency cases by modifying the velocity profile. The paper then focuses on the thorough visibility analysis which has to follow the contingency case or burn performance simulations. In the ideal case it is possible to identify a ground station which is able to acquire the satellite independently of the burn performance. The correlations between the burn performance and the pointing at subsequent ground stations are derived with the aim of establishing simple guidelines which can be applied quickly and which significantly improve the chance of acquisition at subsequent ground stations. In the paper the method is applied to the Soyuz/Fregat launch with the MetOp satellite. Overall, the paper shows that launcher trajectory modelling with the simulation of contingency cases, in connection with a ground station visibility analysis, leads to a proper selection of ground stations and acquisition methods. In the MetOp case this ensured successful contact at all ground stations during the first hour after separation without having to rely on any early orbit determination result or state vector update.
Portugal, Fátima C M; Pinto, Moisés L; Pires, João; Nogueira, J M F
2010-06-04
Polyurethane (PU) foams were applied for stir bar sorptive extraction of five triazinic metabolites (desethyl-2-hydroxyatrazine, desisopropylatrazine, desethylatrazine, 2-hydroxyatrazine and desethylterbuthylazine) in water matrices, followed by liquid desorption and high performance liquid chromatography with diode array detection (SBSE(PU)-LD/HPLC-DAD). The optimum conditions for SBSE(PU)-LD were 5h of extraction (1000 rpm) and 5% (v/v) of methanol for the analysis of desethyl-2-hydroxyatrazine and 2-hydroxyatrazine, 15% (w/v) of sodium chloride for the remaining compounds and acetonitrile as back-extraction solvent (5 mL) under ultrasonic treatment (60 min). The methodology provided recoveries up to 26.3%, remarkable precision (RSD<2.4%), excellent linear dynamic ranges between 5.0 and 122.1 microg/L (r(2)>0.9993) and convenient detection limits (0.4-1.3 microg/L). The proposed method was applied in the analysis of triazinic metabolites in tap, river and ground waters, with remarkable performance and negligible matrix effects. The comparison of the recoveries obtained by PU and commercial stir bars was also performed, where the yields achieved with the former were up to ten times higher proving that PU is appropriate for analysis at trace level of this type of polar compounds in water matrices. Copyright 2010 Elsevier B.V. All rights reserved.
Smart Sensor-Based Motion Detection System for Hand Movement Training in Open Surgery.
Sun, Xinyao; Byrns, Simon; Cheng, Irene; Zheng, Bin; Basu, Anup
2017-02-01
We introduce a smart sensor-based motion detection technique for objective measurement and assessment of surgical dexterity among users at different experience levels. The goal is to allow trainees to evaluate their performance based on a reference model shared through communication technology, e.g., the Internet, without the physical presence of an evaluating surgeon. While in the current implementation we used a Leap Motion Controller to obtain motion data for analysis, our technique can be applied to motion data captured by other smart sensors, e.g., OptiTrack. To differentiate motions captured from different participants, measurement and assessment in our approach are achieved using two strategies: (1) low level descriptive statistical analysis, and (2) Hidden Markov Model (HMM) classification. Based on our surgical knot tying task experiment, we can conclude that finger motions generated from users with different surgical dexterity, e.g., expert and novice performers, display differences in path length, number of movements and task completion time. In order to validate the discriminatory ability of HMM for classifying different movement patterns, a non-surgical task was included in our analysis. Experimental results demonstrate that our approach had 100 % accuracy in discriminating between expert and novice performances. Our proposed motion analysis technique applied to open surgical procedures is a promising step towards the development of objective computer-assisted assessment and training systems.
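A common way to realize the HMM strategy described is to train one generative model per skill class and classify a new sequence by log-likelihood. The sketch below uses the hmmlearn package under assumed state counts and per-frame features; it illustrates the approach, not the authors' exact configuration:

```python
import numpy as np
from hmmlearn import hmm

def train_class_hmm(sequences, n_states=4, seed=0):
    """Fit one Gaussian HMM to all training sequences of a skill class
    (e.g. expert or novice).  Each sequence is an (n_frames, n_features)
    array of motion features such as fingertip position and velocity."""
    X = np.vstack(sequences)
    lengths = [len(s) for s in sequences]
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                            n_iter=100, random_state=seed)
    model.fit(X, lengths)
    return model

def classify(seq, models):
    """models: dict label -> fitted HMM; returns the label whose model
    assigns the new sequence the highest log-likelihood."""
    return max(models, key=lambda label: models[label].score(seq))

# usage:
# models = {"expert": train_class_hmm(expert_seqs),
#           "novice": train_class_hmm(novice_seqs)}
# label = classify(new_seq, models)
```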
NASA Astrophysics Data System (ADS)
Curceac, S.; Ternynck, C.; Ouarda, T.
2015-12-01
Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach is in expressing the data as curves. In the present work, the focus is on daily forecasting and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting based on the bandwidth obtained by a cross validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) Information Criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
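For the SARIMA benchmark, a minimal statsmodels sketch is shown below. The (1,0,1)x(1,0,1,24) orders are an assumption for hourly data with a daily cycle; the study selected its orders by AIC/BIC and residual diagnostics:

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

def sarima_forecast(y_train, steps=24):
    """Fit an illustrative daily-cycle SARIMA to an hourly series and
    forecast the next `steps` hours."""
    model = SARIMAX(y_train, order=(1, 0, 1),
                    seasonal_order=(1, 0, 1, 24))
    fit = model.fit(disp=False)
    return fit.forecast(steps=steps)

def rmse(y_true, y_pred):
    """Root mean square error, one of the indices used in the study."""
    e = np.asarray(y_true, float) - np.asarray(y_pred, float)
    return float(np.sqrt(np.mean(e ** 2)))
```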
Generic Degraded Configuration Probability Analysis for DOE Codisposal Waste Package
DOE Office of Scientific and Technical Information (OSTI.GOV)
S.F.A. Deng; M. Saglam; L.J. Gratton
2001-05-23
In accordance with the technical work plan, "Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages" (CRWMS M&O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the "Disposal Criticality Analysis Methodology Topical Report" (YMP 2000, Section 3) to qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k_eff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having potential for exceeding a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations and apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package and is long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded mode criticality analysis internal to the waste package.
NASA Technical Reports Server (NTRS)
Fink, Pamela K.; Palmer, Karol K.
1988-01-01
The development of a probabilistic structural analysis methodology (PSAM) is described. In the near term, the methodology will be applied to designing critical components of the next generation space shuttle main engine. In the long term, PSAM will be applied very broadly, providing designers with a new technology for more effective design of structures whose character and performance are significantly affected by random variables. The software under development to implement the ideas developed in PSAM resembles, in many ways, conventional deterministic structural analysis code. However, several additional capabilities regarding the probabilistic analysis make the input data requirements and the resulting output even more complex. As a result, an intelligent front- and back-end to the code is being developed to assist the design engineer in providing the input data in a correct and appropriate manner. The type of knowledge that this entails is, in general, heuristically based, allowing the fairly well-understood technology of production rules to apply with little difficulty. However, the PSAM code, called NESSUS, is written in FORTRAN-77 and runs on a DEC VAX. Thus, the associated expert system, called NESSUS/EXPERT, must run on a DEC VAX as well, and integrate effectively and efficiently with the existing FORTRAN code. This paper discusses the process undergone to select a suitable tool, to identify an appropriate division between the functions that should be performed in FORTRAN and those that should be performed by production rules, and to achieve integration of the conventional and AI technologies.
Use of model calibration to achieve high accuracy in analysis of computer networks
Frogner, Bjorn; Guarro, Sergio; Scharf, Guy
2004-05-11
A system and method are provided for creating a network performance prediction model, and calibrating the prediction model, through application of network load statistical analyses. The method includes characterizing the measured load on the network, which may include background load data obtained over time, and may further include directed load data representative of a transaction-level event. Probabilistic representations of load data are derived to characterize the statistical persistence of the network performance variability and to determine delays throughout the network. The probabilistic representations are applied to the network performance prediction model to adapt the model for accurate prediction of network performance. Certain embodiments of the method and system may be used for analysis of the performance of a distributed application characterized as data packet streams.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.
2010-07-31
Small signal stability problems are one of the major threats to grid stability and reliability. Prony analysis has been successfully applied to ringdown data to monitor electromechanical modes of a power system using phasor measurement unit (PMU) data. To facilitate on-line application of mode estimation, this paper developed a recursive algorithm for implementing Prony analysis and proposed an oscillation detection method to detect ringdown data in real time. By automatically detecting ringdown data, the proposed method helps guarantee that Prony analysis is applied properly and promptly to the ringdown data, so that mode estimation can be performed reliably and in a timely manner. The proposed method is tested using Monte Carlo simulations based on a 17-machine model and is shown to be able to properly identify the oscillation data for on-line application of Prony analysis.
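The recursive formulation is the paper's contribution and is not reproduced in the abstract, but the underlying batch Prony computation on a detected ringdown window can be sketched as follows; the model order and variable names are illustrative:

```python
import numpy as np

def prony_modes(y, dt, order=6):
    """Classical (batch) Prony analysis of an equally sampled ringdown.

    Solves a linear-prediction least-squares problem, takes the roots of
    the characteristic polynomial, and converts them to continuous-time
    damping and frequency; complex amplitudes follow from a second
    least-squares fit.  The on-line detector and recursive update of the
    paper are not included in this sketch.
    """
    y = np.asarray(y, dtype=float)
    N = len(y)
    # linear prediction: y[n] = c0*y[n-1] + ... + c_{p-1}*y[n-p]
    A = np.column_stack([y[order - 1 - k: N - 1 - k] for k in range(order)])
    c, *_ = np.linalg.lstsq(A, y[order:], rcond=None)
    z = np.roots(np.concatenate(([1.0], -c)))     # discrete-time poles
    s = np.log(z) / dt                            # continuous-time poles
    V = np.vander(z, N, increasing=True).T        # V[n, i] = z_i**n
    b, *_ = np.linalg.lstsq(V, y.astype(complex), rcond=None)
    return {"damping": s.real,                    # 1/s (negative = stable)
            "freq_hz": s.imag / (2.0 * np.pi),
            "amp": b}
```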
Evaluation of FTIR spectroscopy as diagnostic tool for colorectal cancer using spectral analysis
NASA Astrophysics Data System (ADS)
Dong, Liu; Sun, Xuejun; Chao, Zhang; Zhang, Shiyun; Zheng, Jianbao; Gurung, Rajendra; Du, Junkai; Shi, Jingsen; Xu, Yizhuang; Zhang, Yuanfu; Wu, Jinguang
2014-03-01
The aim of this study is to confirm FTIR spectroscopy as a diagnostic tool for colorectal cancer. 180 freshly removed colorectal samples were collected from 90 patients for spectral analysis. Ratios of spectral intensity and of relative intensity (/I1460) were calculated. Principal component analysis (PCA) and Fisher's discriminant analysis (FDA) were applied to distinguish malignant from normal tissue. The FTIR parameters of colorectal cancer and normal tissues could be distinguished on the basis of the contents or configurations of nucleic acids, proteins, lipids and carbohydrates. Parameters related to nitrogen-containing groups, water, protein and nucleic acid were significantly increased in the malignant group. Six parameters were selected as independent factors for the discriminant functions. The sensitivity of FTIR in diagnosing colorectal cancer was 96.6% by discriminant analysis. Our study demonstrates that FTIR can be a useful technique for the detection of colorectal cancer and may be applied in clinical colorectal cancer diagnosis.
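The PCA-plus-Fisher-discriminant pipeline maps directly onto standard tooling; a minimal scikit-learn sketch is given below. The six-component choice mirrors the six selected parameters, but the feature matrix and cross-validation scheme are assumptions for illustration:

```python
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

def fit_pca_fda(X, y, n_components=6):
    """PCA followed by Fisher's linear discriminant.

    X: (n_samples, n_features) spectral parameters or absorbances;
    y: 0 = normal, 1 = malignant.  Returns the fitted pipeline and a
    cross-validated accuracy as a rough performance estimate."""
    clf = make_pipeline(PCA(n_components=n_components),
                        LinearDiscriminantAnalysis())
    acc = cross_val_score(clf, X, y, cv=5).mean()
    return clf.fit(X, y), acc
```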
Cost/benefit analysis of advanced materials technology candidates for the 1980's, part 2
NASA Technical Reports Server (NTRS)
Dennis, R. E.; Maertins, H. F.
1980-01-01
Cost/benefit analyses to evaluate advanced material technologies projects considered for general aviation and turboprop commuter aircraft through estimated life-cycle costs, direct operating costs, and development costs are discussed. Specifically addressed is the selection of technologies to be evaluated; development of property goals; assessment of candidate technologies on typical engines and aircraft; sensitivity analysis of the changes in property goals on performance and economics, cost, and risk analysis for each technology; and ranking of each technology by relative value. The cost/benefit analysis was applied to a domestic, nonrevenue producing, business-type jet aircraft configured with two TFE731-3 turbofan engines, and to a domestic, nonrevenue producing, business type turboprop aircraft configured with two TPE331-10 turboprop engines. In addition, a cost/benefit analysis was applied to a commercial turboprop aircraft configured with a growth version of the TPE331-10.
NASA Astrophysics Data System (ADS)
Vieira, Rodrigo Drumond; Kelly, Gregory J.
2014-11-01
In this paper, we present and apply a multi-level method for discourse analysis in science classrooms. This method is based on the structure of human activity (activity, actions, and operations) and it was applied to study a pre-service physics teacher methods course. We argue that such an approach, based on a cultural psychological perspective, affords opportunities for analysts to perform a theoretically based detailed analysis of discourse events. Along with the presentation of analysis, we show and discuss how the articulation of different levels offers interpretative criteria for analyzing instructional conversations. We synthesize the results into a model for a teacher's practice and discuss the implications and possibilities of this approach for the field of discourse analysis in science classrooms. Finally, we reflect on how the development of teachers' understanding of their activity structures can contribute to forms of progressive discourse of science education.
Analysis of cold worked holes for structural life extension
NASA Technical Reports Server (NTRS)
Wieland, David H.; Cutshall, Jon T.; Burnside, O. Hal; Cardinal, Joseph W.
1994-01-01
Cold working holes for improved fatigue life of fastener holes are widely used on aircraft. This paper presents methods used by the authors to determine the percent of cold working to be applied and to analyze fatigue crack growth of cold worked fastener holes. An elastic, perfectly-plastic analysis of a thick-walled tube is used to determine the stress field during the cold working process and the residual stress field after the process is completed. The results of the elastic/plastic analysis are used to determine the amount of cold working to apply to a hole. The residual stress field is then used to perform damage tolerance analysis of a crack growing out of a cold worked fastener hole. This analysis method is easily implemented in existing crack growth computer codes so that the cold worked holes can be used to extend the structural life of aircraft. Analytical results are compared to test data where appropriate.
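For reference, the classical elastic/perfectly-plastic thick-walled tube result that this kind of analysis builds on can be stated compactly. This is the textbook Tresca-yield form for an internally pressurized tube, given only as a plausible reconstruction of the relations the abstract alludes to; the authors' mandrel cold-working analysis, being displacement-driven, may differ in detail:

```latex
% Thick-walled tube, inner radius a, outer radius b, yield stress Y
% (elastic/perfectly-plastic, Tresca).  The internal pressure needed to
% push the plastic front to radius c, and the residual field obtained by
% subtracting the purely elastic (Lame) unloading solution:
\[
  p \;=\; Y\!\left[\ln\frac{c}{a} + \frac{b^{2}-c^{2}}{2b^{2}}\right],
  \qquad
  \sigma_{ij}^{\mathrm{res}} \;=\; \sigma_{ij}^{\mathrm{ep}}(p)\;-\;\sigma_{ij}^{\mathrm{el}}(p).
\]
```

The residual compressive hoop stress near the hole edge obtained this way is what retards fatigue crack growth in the damage tolerance analysis.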
NASA Astrophysics Data System (ADS)
Mitsutake, Ayori; Takano, Hiroshi
2015-09-01
It is important to extract reaction coordinates or order parameters from protein simulations in order to investigate the local minimum-energy states and the transitions between them. The most popular method to obtain such data is principal component analysis, which extracts modes of large conformational fluctuations around an average structure. We recently applied relaxation mode analysis for protein systems, which approximately estimates the slow relaxation modes and times from a simulation and enables investigations of the dynamic properties underlying the structural fluctuations of proteins. In this study, we apply this relaxation mode analysis to extract reaction coordinates for a system in which there are large conformational changes such as those commonly observed in protein folding/unfolding. We performed a 750-ns simulation of chignolin protein near its folding transition temperature and observed many transitions between the most stable, misfolded, intermediate, and unfolded states. We then applied principal component analysis and relaxation mode analysis to the system. In the relaxation mode analysis, we could automatically extract good reaction coordinates. The free-energy surfaces provide a clearer understanding of the transitions not only between local minimum-energy states but also between the folded and unfolded states, even though the simulation involved large conformational changes. Moreover, we propose a new analysis method called Markov state relaxation mode analysis. We applied the new method to states with slow relaxation, which are defined by the free-energy surface obtained in the relaxation mode analysis. Finally, the relaxation times of the states obtained with a simple Markov state model and the proposed Markov state relaxation mode analysis are compared and discussed.
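The PCA baseline that relaxation mode analysis is compared against, together with the free-energy surfaces mentioned, can be sketched in a few lines. Coordinates, temperature units, and bin counts below are assumptions; RMA itself additionally solves a time-lagged generalized eigenvalue problem that is omitted here:

```python
import numpy as np
from sklearn.decomposition import PCA

def free_energy_surface(coords, kT=1.0, bins=60):
    """PCA of an aligned MD trajectory and a 2-D free-energy surface
    F = -kT ln P(PC1, PC2).

    coords: (n_frames, n_atoms * 3) Cartesian coordinates after
    removing overall translation/rotation."""
    pcs = PCA(n_components=2).fit_transform(coords - coords.mean(axis=0))
    H, xe, ye = np.histogram2d(pcs[:, 0], pcs[:, 1], bins=bins, density=True)
    with np.errstate(divide="ignore"):
        F = -kT * np.log(H)                      # +inf where unsampled
    F = F - np.nanmin(F[np.isfinite(F)])         # zero at the global minimum
    return F, xe, ye
```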
The MEM of spectral analysis applied to L.O.D.
NASA Astrophysics Data System (ADS)
Fernandez, L. I.; Arias, E. F.
The maximum entropy method (MEM) has been widely applied in polar motion studies, taking advantage of its performance in handling complex time series. The authors used the MEM algorithm to estimate the cross-spectral function in order to compare interannual length-of-day (LOD) time series with Southern Oscillation Index (SOI) and Sea Surface Temperature (SST) series, which are closely related to El Niño-Southern Oscillation (ENSO) events.
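The univariate core of the MEM is Burg's autoregressive spectral estimator; a self-contained sketch follows. The cross-spectral extension the authors actually used is omitted, and the model order is an illustrative choice:

```python
import numpy as np

def burg_psd(x, order=20, nfreq=512):
    """Maximum-entropy (Burg) power spectrum of a time series: the
    textbook recursion behind MEM spectral analysis."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    f, b = x[1:].copy(), x[:-1].copy()       # forward/backward errors
    A = np.array([1.0])                      # AR polynomial coefficients
    E = np.dot(x, x) / len(x)                # prediction error power
    for _ in range(order):
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))
        A = np.concatenate([A, [0.0]]) + k * np.concatenate([[0.0], A[::-1]])
        E *= 1.0 - k * k
        f, b = (f + k * b)[1:], (b + k * f)[:-1]
    w = np.linspace(0.0, np.pi, nfreq)       # frequencies (rad/sample)
    Aw = np.exp(-1j * np.outer(w, np.arange(order + 1))) @ A
    return w, E / np.abs(Aw) ** 2
```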
NASA Technical Reports Server (NTRS)
Rockfeller, W C
1939-01-01
Equations have been developed for the analysis of the performance of the ideal airplane, leading to an approximate physical interpretation of the performance problem. The basic sea-level airplane parameters have been generalized to altitude parameters and a new parameter has been introduced and physically interpreted. The performance analysis for actual airplanes has been obtained in terms of the equivalent ideal airplane in order that the charts developed for use in practical calculations will for the most part apply to any type of engine-propeller combination and system of control, the only additional material required consisting of the actual engine and propeller curves for propulsion unit. Finally, a more exact method for the calculation of the climb characteristics for the constant-speed controllable propeller is presented in the appendix.
NASA Armstrong's Approach to Store Separation Analysis
NASA Technical Reports Server (NTRS)
Acuff, Chris; Bui, Trong
2015-01-01
This presentation will give an overview of NASA Armstrong's store separation capabilities and how they have been applied recently. The objective of the presentation is to brief Generation Orbit and other potential partners on NASA Armstrong's store separation capabilities. It will include discussions on the use of NAVSEP and Cart3D, as well as some Python scripting work to perform the analysis, and a short overview of this methodology applied to the Towed Glider Air Launch System. Collaboration with potential customers in this area could lead to funding for the further development of a store separation capability at NASA Armstrong, which would boost the portfolio of engineering expertise at the center.
NASA Astrophysics Data System (ADS)
Petrus, H. T. B. M.; Diga, A.; Rhamdani, A. R.; Warmada, I. W.; Yuliansyah, A. T.; Perdana, I.
2017-04-01
The performance and kinetics of nickel laterite reduction were studied. In this work, the reduction of nickel laterite ores by anthracite coal, representing a high-carbon reductant, and by lamtoro charcoal, representing a bioreductant, was conducted in air and in a CO2 atmosphere at temperatures ranging from 800°C to 1000°C. XRD analysis was applied to observe the performance of anthracite and lamtoro as reductants. Two models were applied to extract the kinetic parameters: a spherical particle geometry model and the Ginstling-Brounshtein diffusion model. The results indicated that the type of reductant and the reduction atmosphere greatly influence the kinetic parameters. The obtained activation energies vary in the range of 13.42-18.12 kcal/mol.
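The Ginstling-Brounshtein step can be made concrete: g(α) = 1 − 2α/3 − (1 − α)^(2/3) is regressed against time to obtain a rate constant at each temperature, and an Arrhenius fit then yields the activation energy. The sketch below assumes measured conversion fractions α(t); variable names are illustrative:

```python
import numpy as np

def gb_rate_constant(alpha, t):
    """Fit the Ginstling-Brounshtein diffusion model
        g(alpha) = 1 - 2*alpha/3 - (1 - alpha)**(2/3) = k * t
    by least squares through the origin.  alpha: extent of reduction."""
    a = np.asarray(alpha, dtype=float)
    g = 1.0 - 2.0 * a / 3.0 - (1.0 - a) ** (2.0 / 3.0)
    t = np.asarray(t, dtype=float)
    return float(np.dot(t, g) / np.dot(t, t))    # slope k (1/time)

def activation_energy(ks, temps_K):
    """Arrhenius fit ln k = ln A - Ea/(R*T); returns Ea in kcal/mol,
    the unit of the reported 13.42-18.12 kcal/mol range."""
    R = 1.987e-3                                 # kcal/(mol K)
    slope, _ = np.polyfit(1.0 / np.asarray(temps_K, float), np.log(ks), 1)
    return float(-slope * R)
```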
Factors affecting construction performance: exploratory factor analysis
NASA Astrophysics Data System (ADS)
Soewin, E.; Chinda, T.
2018-04-01
The present work attempts to develop a multidimensional performance evaluation framework for a construction company by considering all relevant measures of performance. Based on previous studies, this study hypothesizes nine key factors, with a total of 57 associated items. The hypothesized factors, with their associated items, are then used to develop a questionnaire survey to gather data. Exploratory factor analysis (EFA) applied to the collected data yielded 10 factors, spanning the 57 items, that affect construction performance. The resulting ten key performance factors (KPIs) are: 1) Time, 2) Cost, 3) Quality, 4) Safety & Health, 5) Internal Stakeholder, 6) External Stakeholder, 7) Client Satisfaction, 8) Financial Performance, 9) Environment, and 10) Information, Technology & Innovation. The analysis helps to develop a multi-dimensional performance evaluation framework for effective measurement of construction performance. The 10 key performance factors can be broadly categorized into economic, social, environmental, and technology aspects. Understanding a multi-dimensional performance evaluation framework that includes all key factors affecting a company's construction performance allows management to plan and implement an effective performance development plan matched to the mission and vision of the company.
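A minimal sketch of the EFA step on the 57-item instrument might look as follows; the Likert coding, standardization, and varimax rotation are assumptions (rotation requires scikit-learn >= 0.24), and a full EFA study would also run adequacy tests and scree-based factor retention:

```python
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

def run_efa(responses, n_factors=10):
    """Exploratory factor analysis of questionnaire data.

    responses: (n_respondents, 57) array of item scores.  Returns the
    (57, n_factors) loading matrix; items with |loading| >= 0.4 on a
    factor are conventionally read as belonging to that factor
    (e.g. Time, Cost, Quality, ...)."""
    Z = StandardScaler().fit_transform(responses)
    fa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(Z)
    return fa.components_.T
```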
Biosignals Analysis for Kidney Function Effect Analysis of Fennel Aromatherapy
Kim, Bong-Hyun; Cho, Dong-Uk; Seo, Ssang-Hee
2015-01-01
Human efforts to enjoy a healthy life are diverse, and IT technology has been applied to analyze the results of such efforts. Health management is accordingly oriented toward diagnosis, care and prevention rather than treatment. In particular, aromatherapy is easy to use, has no side effects or irritation, and is widely used in modern society. In this paper, we measured the effects of aroma by applying biosignal analysis techniques, designing the methods and processes of the research around the theory that aroma affects renal function. Biosignals were measured before and after fennel aromatherapy treatment, and the measurements were compared and analyzed to assess the effect of fennel aromatherapy on kidney function. PMID:25977696
FaSTR DNA: a new expert system for forensic DNA analysis.
Power, Timothy; McCabe, Brendan; Harbison, Sally Ann
2008-06-01
The automation of DNA profile analysis of reference and crime samples continues to gain pace, driven in part by a realisation by the criminal justice system of the positive impact DNA technology can have in aiding in the solution of crime and the apprehension of suspects. Expert systems to automate the profile analysis component of the process are beginning to be developed. In this paper, we report the validation of FaSTR DNA, a new expert system suitable for the analysis of DNA profiles from single source reference samples and from crime samples. We compare the performance of FaSTR DNA with that of other equivalent systems, GeneMapper ID v3.2 (Applied Biosystems, Foster City, CA) and FSS-i(3) v4 (The Forensic Science Service® DNA expert System Suite FSS-i(3), Forensic Science Service, Birmingham, UK) with GeneScan Analysis v3.7/Genotyper v3.7 software (Applied Biosystems, Foster City, CA, USA) with manual review. We have shown that FaSTR DNA provides an alternative solution to automating DNA profile analysis and is appropriate for implementation into forensic laboratories. The FaSTR DNA system was demonstrated to be comparable in performance to that of GeneMapper ID v3.2 and superior to that of FSS-i(3) v4 for the analysis of DNA profiles from crime samples.
Speed Accuracy Tradeoffs in Human Speech Production
2017-05-01
...for considering Fitts' law in the domain of speech production is elucidated. Methodological challenges in applying Fitts-style analysis are addressed... ...order to assess whether articulatory kinematics conform to Fitts' law. A second, associated goal is to address the methodological challenges inherent in... ...performing Fitts-style analysis on rtMRI data of speech production. Methodological challenges include segmenting continuous speech into specific motor...
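For context, Fitts-style analysis regresses movement time on an index of difficulty. The excerpt does not show which formulation the report uses, so both the original and the Shannon forms are given here:

```latex
% Fitts' law: movement time MT grows linearly with the index of
% difficulty ID of a target of width W at distance D; a and b are
% fitted per effector (here, per articulator).
\[
  MT \;=\; a + b\,\mathrm{ID},
  \qquad
  \mathrm{ID} \;=\; \log_{2}\!\left(\frac{2D}{W}\right)
  \;\;\text{(original)}
  \quad\text{or}\quad
  \log_{2}\!\left(\frac{D}{W} + 1\right)
  \;\;\text{(Shannon form)}.
\]
```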
ERIC Educational Resources Information Center
Steinhauser, Marco; Hubner, Ronald
2009-01-01
It has been suggested that performance in the Stroop task is influenced by response conflict as well as task conflict. The present study investigated the idea that both conflict types can be isolated by applying ex-Gaussian distribution analysis which decomposes response time into a Gaussian and an exponential component. Two experiments were…
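The ex-Gaussian decomposition is directly available in SciPy via the exponnorm distribution (shape parameter K = tau/sigma); a minimal fitting sketch under that standard parameterization follows:

```python
import numpy as np
from scipy.stats import exponnorm

def fit_ex_gaussian(rt):
    """Fit an ex-Gaussian to response times.

    Returns mu and sigma of the Gaussian component and tau of the
    exponential tail, the components used to separate task conflict
    from response conflict in this line of research."""
    K, loc, scale = exponnorm.fit(np.asarray(rt, dtype=float))
    mu, sigma = loc, scale
    tau = K * scale
    return mu, sigma, tau

# usage: mu, sigma, tau = fit_ex_gaussian(rts_incongruent)
```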
LC-MS and MS/MS in the analysis of recombinant proteins
NASA Astrophysics Data System (ADS)
Coulot, M.; Domon, B.; Grossenbacher, H.; Guenat, C.; Maerki, W.; Müller, D. R.; Richter, W. J.
1993-03-01
The applicability and performance of electrospray ionization mass spectrometry (ESIMS) are demonstrated for protein analysis. ESIMS is applied in conjunction with on-line HPLC (LC-ESIMS) and direct tandem mass spectrometry (positive and negative ion mode ESIMS/MS) to the structural characterization of a recombinant protein (r-hirudin variant 1) and a congener phosphorylated at threonine 45 (RP-1).
White matter degeneration in schizophrenia: a comparative diffusion tensor analysis
NASA Astrophysics Data System (ADS)
Ingalhalikar, Madhura A.; Andreasen, Nancy C.; Kim, Jinsuh; Alexander, Andrew L.; Magnotta, Vincent A.
2010-03-01
Schizophrenia is a serious and disabling mental disorder. Diffusion tensor imaging (DTI) studies performed on schizophrenia have demonstrated white matter degeneration either due to loss of myelination or deterioration of fiber tracts although the areas where the changes occur are variable across studies. Most of the population based studies analyze the changes in schizophrenia using scalar indices computed from the diffusion tensor such as fractional anisotropy (FA) and relative anisotropy (RA). The scalar measures may not capture the complete information from the diffusion tensor. In this paper we have applied the RADTI method on a group of 9 controls and 9 patients with schizophrenia. The RADTI method converts the tensors to log-Euclidean space where a linear regression model is applied and hypothesis testing is performed between the control and patient groups. Results show that there is a significant difference in the anisotropy between patients and controls especially in the parts of forceps minor, superior corona radiata, anterior limb of internal capsule and genu of corpus callosum. To check if the tensor analysis gives a better idea of the changes in anisotropy, we compared the results with voxelwise FA analysis as well as voxelwise geodesic anisotropy (GA) analysis.
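The scalar indices the paper contrasts with full-tensor analysis are simple functions of the diffusion-tensor eigenvalues. A sketch of FA, plus a geodesic anisotropy computed from log-eigenvalues in keeping with the log-Euclidean framing, is given below; the eigenvalue layout and clipping are assumptions:

```python
import numpy as np

def fractional_anisotropy(evals):
    """FA per voxel from the three diffusion-tensor eigenvalues.

    evals: (..., 3) array.  FA = sqrt(3/2) * ||l - mean(l)|| / ||l||."""
    l = np.asarray(evals, dtype=float)
    md = l.mean(axis=-1, keepdims=True)
    num = np.sqrt(1.5 * ((l - md) ** 2).sum(axis=-1))
    den = np.sqrt((l ** 2).sum(axis=-1))
    return np.where(den > 0, num / den, 0.0)

def geodesic_anisotropy(evals):
    """GA: dispersion of the log-eigenvalues, the natural anisotropy
    measure in the log-Euclidean framework used for the voxelwise
    tensor regression."""
    L = np.log(np.clip(np.asarray(evals, dtype=float), 1e-12, None))
    return np.sqrt(((L - L.mean(axis=-1, keepdims=True)) ** 2).sum(axis=-1))
```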
Song, Hongjun; Cai, Ziliang; Noh, Hongseok Moses; Bennett, Dawn J
2010-03-21
In this paper we present a numerical and experimental investigation of a chaotic mixer in a microchannel via low frequency switching transverse electroosmotic flow. By applying a low frequency, square-wave electric field to a pair of parallel electrodes placed at the bottom of the channel, a complex 3D spatial and time-dependence flow was generated to stretch and fold the fluid. This significantly enhanced the mixing effect. The mixing mechanism was first investigated by numerical and experimental analysis. The effects of operational parameters such as flow rate, frequency, and amplitude of the applied voltage have also been investigated. It is found that the best mixing performance is achieved when the frequency is around 1 Hz, and the required mixing length is about 1.5 mm for the case of applied electric potential 5 V peak-to-peak and flow rate 75 microL h(-1). The mixing performance was significantly enhanced when the applied electric potential increased or the flow rate of fluids decreased.
NASA Astrophysics Data System (ADS)
Syarifah, V. B.; Rafi, M.; Wahyuni, W. T.
2017-05-01
Brotowali (Tinospora crispa) is widely used in Indonesia as an ingredient of herbal medicine formulations. To ensure the quality, safety, and efficacy of herbal medicine products, their chemical constituents should be continuously evaluated. High performance liquid chromatography (HPLC) fingerprinting is a powerful technique for this quality control process. In this study, an HPLC fingerprint analysis method was developed for the quality control of brotowali. HPLC analysis was performed on a C18 column with photodiode array detection. The optimum mobile phase for the brotowali fingerprint was acetonitrile (ACN) and 0.1% formic acid in gradient elution mode at a flow rate of 1 mL/min. The HPLC fingerprints contained 32 peaks for stems and 23 peaks for leaves. Berberine, the marker compound, was detected at a retention time of 20.525 minutes. Evaluation of analytical performance, including precision, reproducibility, and stability, showed that this HPLC fingerprint analysis is reliable and can be applied to the quality control of brotowali.
Fluid-structure interaction analysis of deformation of sail of 30-foot yacht
NASA Astrophysics Data System (ADS)
Bak, Sera; Yoo, Jaehoon; Song, Chang Yong
2013-06-01
Most yacht sails are made of thin fabric, and they have a cambered shape to generate lift force; however, their shape can be easily deformed by wind pressure. Deformation of the sail shape changes the flow characteristics over the sail, which in turn further deforms the sail shape. Therefore, fluid-structure interaction (FSI) analysis is applied for the precise evaluation or optimization of the sail design. In this study, fluid flow analyses are performed for the main sail of a 30-foot yacht, and the results are applied to loading conditions for structural analyses. By applying the supporting forces from the rig, such as the mast and boom-end outhaul, as boundary conditions for structural analysis, the deformed sail shape is identified. Both the flow analyses and the structural analyses are iteratively carried out for the deformed sail shape. A comparison of the flow characteristics and surface pressures over the deformed sail shape with those over the initial shape shows that a considerable difference exists between the two and that FSI analysis is suitable for application to sail design.
Jatobá, Alessandro; de Carvalho, Paulo Victor R; da Cunha, Amauri Marques
2012-01-01
Work in organizations requires a minimum level of consensus on the understanding of the practices performed. In environments where work is complex, characterized by interdependence among a large number of variables, adopting technological devices to support activities makes understanding how work is done both more important and more difficult. This study therefore presents a method for modeling work in complex systems, one that improves knowledge of how activities are actually performed when they do not simply follow procedures. Uniting techniques of Cognitive Task Analysis with the concept of Work Process, this work seeks to provide a method capable of giving a detailed and accurate view of how people perform their tasks, so that information systems can be applied to support work in organizations.
Development of a model to assess environmental performance, concerning HSE-MS principles.
Abbaspour, M; Hosseinzadeh Lotfi, F; Karbassi, A R; Roayaei, E; Nikoomaram, H
2010-06-01
The main objective of the present study was to develop a valid and appropriate model to evaluate companies' efficiency and environmental performance with respect to health, safety, and environmental management system principles. The proposed model overcomes the shortcomings of previous models developed in this area. It is based on a mathematical method known as Data Envelopment Analysis (DEA). In order to differentiate high-performing companies from weak ones, one of the DEA non-radial models, the enhanced Russell graph efficiency measure, was applied. Since some environmental performance indicators cannot be controlled by companies' managers, the model was developed so that it can be applied when discretionary and/or nondiscretionary factors are involved. The model was then tested on a real case comprising 12 oil and gas general contractors. The results showed the relative efficiency, the sources of inefficiency, and the ranking of the contractors.
Kremen, Arie; Tsompanakis, Yiannis
2010-04-01
The slope stability of a proposed vertical extension of a balefill was investigated in the present study, in an attempt to determine a geotechnically conservative design, compliant with New Jersey Department of Environmental Protection regulations, that maximizes the utilization of unclaimed disposal capacity. Conventional geotechnical analytical methods are generally limited to well-defined failure modes, which may not occur in landfills or balefills due to the presence of preferential slip surfaces. In addition, these models assume an a priori stress distribution to solve essentially indeterminate problems. In this work, a different approach has been applied, which avoids several of the drawbacks of conventional methods. Specifically, the analysis was performed in a two-stage process: (a) calculation of the stress distribution, and (b) application of an optimization technique to identify the most probable failure surface. The stress analysis was performed using a finite element formulation, and the failure surface was located with a dynamic programming optimization method. A sensitivity analysis was performed to evaluate the effect of the various waste strength parameters of the underlying mathematical model on the results, namely the factor of safety of the landfill. Although this study focuses on the stability investigation of an expanded balefill, the methodology presented can easily be applied to general geotechnical investigations.
Cassimatis, Constantine; Liu, Karen P Y; Fahey, Paul; Bissett, Michelle
2016-09-01
A systematic review with meta-analysis was performed to investigate the effect of external sensory cued therapy on activities of daily living (ADL) performance, including walking and daily tasks such as dressing, for individuals with Parkinson's disease (PD). A detailed computer-aided search of the literature was applied to MEDLINE, Cumulative Index to Nursing and Allied Health Literature, EMBASE and PubMed. Studies investigating the effects of external sensory cued therapy on ADL performance for individuals with PD in all stages of disease progression were collected. Relevant articles were critically reviewed and study results were synthesized by two independent researchers. A data-analysis method was used to extract data from selected articles. A meta-analysis was carried out for all randomized controlled trials. Six studies with 243 individuals with PD were included in this review. All six studies yielded positive findings in favour of external sensory cues. The meta-analysis showed statistically significant improvements in ADL performance after treatment (P=0.011) and at follow-up (P<0.001). The results of this review provided evidence of an improvement in ADL performance in general in individuals with PD. It is recommended that clinicians incorporate external sensory cues into training programmes focused on improving daily task performance.
The development of a reliable amateur boxing performance analysis template.
Thomson, Edward; Lamb, Kevin; Nicholas, Ceri
2013-01-01
The aim of this study was to devise a valid performance analysis system for the assessment of the movement characteristics associated with competitive amateur boxing and assess its reliability using analysts of varying experience of the sport and performance analysis. Key performance indicators to characterise the demands of an amateur contest (offensive, defensive and feinting) were developed and notated using a computerised notational analysis system. Data were subjected to intra- and inter-observer reliability assessment using median sign tests and calculating the proportion of agreement within predetermined limits of error. For all performance indicators, intra-observer reliability revealed non-significant differences between observations (P > 0.05) and high agreement was established (80-100%) regardless of whether exact or the reference value of ±1 was applied. Inter-observer reliability was less impressive for both analysts (amateur boxer and experienced analyst), with the proportion of agreement ranging from 33-100%. Nonetheless, there was no systematic bias between observations for any indicator (P > 0.05), and the proportion of agreement within the reference range (±1) was 100%. A reliable performance analysis template has been developed for the assessment of amateur boxing performance and is available for use by researchers, coaches and athletes to classify and quantify the movement characteristics of amateur boxing.
Effect of pole zero location on system dynamics of boost converter for micro grid
NASA Astrophysics Data System (ADS)
Lavanya, A.; Vijayakumar, K.; Navamani, J. D.; Jayaseelan, N.
2018-04-01
Green, clean energy sources such as photovoltaics, wind energy, and fuel cells can be brought together by a microgrid. For low-voltage sources like photovoltaic cells, a boost converter is essential. This paper explores the dynamic analysis of a boost converter in continuous conduction mode (CCM). The transient performance and stability analysis are carried out using time-domain and frequency-domain techniques. The boost converter is simulated using both PSIM and MATLAB software. Furthermore, a state-space model is obtained and the transfer function is derived. The converter's behaviour under a step input is analyzed, and open-loop stability is assessed from the Bode plot. The effect of the pole and zero locations in the boost converter transfer function on the performance parameters is discussed. Closed-loop performance with a PI controller is also analyzed.
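As a rough illustration of the open-loop analysis described (not the paper's actual model), the sketch below builds the textbook CCM boost control-to-output transfer function in SciPy rather than MATLAB, then extracts step-response and Bode data; the component values are illustrative assumptions. Note the right-half-plane zero, a characteristic of boost converters that limits achievable closed-loop bandwidth.

```python
import numpy as np
from scipy import signal

# Illustrative CCM boost parameters (assumed values, not from the paper).
Vin, D = 12.0, 0.5          # input voltage, duty cycle
L, C, R = 100e-6, 220e-6, 10.0
Dp = 1.0 - D                # D' = 1 - D

# Textbook control-to-output transfer function for a CCM boost converter:
# Gvd(s) = (Vin/Dp^2) * (1 - s*L/(Dp^2*R)) / (s^2*L*C/Dp^2 + s*L/(Dp^2*R) + 1)
num = (Vin / Dp**2) * np.array([-L / (Dp**2 * R), 1.0])
den = np.array([L * C / Dp**2, L / (Dp**2 * R), 1.0])
sys = signal.TransferFunction(num, den)

t, y = signal.step(sys)              # open-loop step response
w, mag, phase = signal.bode(sys)     # Bode data for stability assessment
print("poles:", sys.poles, "zeros:", sys.zeros)  # zero in the right half-plane
```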
Apparatus and system for multivariate spectral analysis
Keenan, Michael R.; Kotula, Paul G.
2003-06-24
An apparatus and system for determining the properties of a sample from measured spectral data collected from the sample by performing a method of multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = CS^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used by a spectrum analyzer to process X-ray spectral data generated by a spectral analysis system that can include a Scanning Electron Microscope (SEM) with an Energy Dispersive Detector and Pulse Height Analyzer.
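A minimal sketch of the core loop, constrained alternating least squares on a weighted data matrix followed by unweighting, is shown below. The nonnegativity clipping and the simple column weighting are illustrative assumptions, not the patent's exact procedure.

```python
import numpy as np

def constrained_als(D, k, n_iter=200, eps=1e-12):
    """Alternating least squares factoring D ~ C @ S.T with nonnegativity
    clipping, a rough stand-in for the patent's constrained procedure."""
    rng = np.random.default_rng(0)
    C = rng.random((D.shape[0], k))
    for _ in range(n_iter):
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, eps, None)
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, eps, None)
    return C, S

# Weight the measured data (here a simple column scaling), factor it,
# then unweight S by applying the inverse scaling.
A = np.abs(np.random.default_rng(1).random((100, 64)))  # measured spectra
w = 1.0 / np.sqrt(A.mean(axis=0) + 1e-9)                # assumed column weights
D = A * w                                               # weighted data matrix
C, S = constrained_als(D, k=3)
S = S / w[:, None]                                      # unweight spectral shapes
```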
Multi-class ERP-based BCI data analysis using a discriminant space self-organizing map.
Onishi, Akinari; Natsume, Kiyohisa
2014-01-01
Emotional or non-emotional image stimuli have recently been applied to event-related potential (ERP) based brain-computer interfaces (BCI). Though single-trial classification performance exceeds 80%, discrimination between those ERPs has not been considered. In this research we tried to clarify the discriminability of four-class ERP-based BCI target data elicited by desk, seal and spider images and letter intensifications. A conventional self-organizing map (SOM) and a newly proposed discriminant space SOM (ds-SOM) were applied, and the discriminabilities were visualized. We also classified all pairs of those ERPs by stepwise linear discriminant analysis (SWLDA) to verify the visualization of discriminabilities. As a result, the ds-SOM showed understandable visualization of the data with a shorter computational time than the traditional SOM. We also confirmed a clear boundary between the letter cluster and the other clusters, coherent with the classification performances by SWLDA. The method might be helpful not only for developing new BCI paradigms, but also for big data analysis.
Environment, power, and society. [stressing energy language and energy analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Odum, H.T.
Studies of the energetics of ecological systems suggest general means for applying basic laws of energy and matter to the complex systems of nature and man. In this book, energy language is used to consider the pressing problem of survival in our time--the partnership of man in nature. An effort is made to show that energy analysis can help answer many of the questions of economics, law, and religion. Models for the analysis of a system are made by recognizing major divisions whose causal relationships are indicated by the pathways of interchange of energy and work. Then simulation allows the model's performance to be tested against the performance of the real system. Ideal energy flows are illustrated with ecological systems and then applied to all kinds of situations from very small biochemical processes to the large overall systems of man and the biosphere. Energy diagraming is included to consider the great problems of power, pollution, population, food, and war. This account also attempts to introduce ecology through the energy language.
Jiménez-Carvelo, Ana M; Pérez-Castaño, Estefanía; González-Casado, Antonio; Cuadros-Rodríguez, Luis
2017-04-15
A new method for differentiating olive oil (independently of its quality category) from other vegetable oils (canola, safflower, corn, peanut, seeds, grapeseed, palm, linseed, sesame and soybean) has been developed. The analytical procedure for chromatographic fingerprinting of the methyl-transesterified fraction of each vegetable oil, using normal-phase liquid chromatography, is described, and the chemometric strategies applied are discussed. Several chemometric methods, such as k-nearest neighbours (kNN), partial least squares-discriminant analysis (PLS-DA), support vector machine classification analysis (SVM-C), and soft independent modelling of class analogies (SIMCA), were applied to build classification models. Classification performance was evaluated and ranked using several quality metrics. Discriminant analysis based on one input class (plus a dummy class) was applied for the first time in this study. Copyright © 2016 Elsevier Ltd. All rights reserved.
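For readers unfamiliar with this kind of chemometric workflow, the sketch below shows how two of the named classifiers could be benchmarked with cross-validation. The fingerprint matrix and labels are synthetic placeholders, and scikit-learn's SVC stands in for the paper's SVM-C implementation.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical data: rows are chromatographic fingerprints, y = 1 for olive oil.
rng = np.random.default_rng(0)
X = rng.random((120, 50))
y = rng.integers(0, 2, 120)

# Standardize, then score each classifier with 5-fold cross-validation;
# PLS-DA and SIMCA would slot into the same evaluation loop.
for name, clf in [("kNN", KNeighborsClassifier(5)), ("SVM-C", SVC(kernel="rbf"))]:
    scores = cross_val_score(make_pipeline(StandardScaler(), clf), X, y, cv=5)
    print(name, "mean accuracy:", scores.mean())
```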
UPLC analysis of free amino acids in wines: profiling of on-lees aged wines.
Fiechter, G; Pavelescu, D; Mayer, H K
2011-05-15
The evolution of free amino acid (FAA) profiles intrinsic to on-lees aged white wines was determined by ultra performance liquid chromatography (UPLC™). On the basis of the AccQ.Tag™ method, a commercialized amino acid analysis solution for HPLC, a new protocol for dedicated amino acid analysis using 6-aminoquinolyl-N-hydroxysuccinimidyl carbamate (AQC) for pre-column derivatization was established by method transfer onto UPLC™ conditions. Since AQC derivatives enable both fluorescence (AccQ.Tag™ method) and UV detection, the method transfer additionally included a change to more versatile UV detection. Exploiting the enhanced performance of UPLC™, the newly established protocol facilitated rapid and reliable separations of 24 amino acids within 23 min, proving superior to the original HPLC protocol owing to significant improvements in resolution and reduced runtime. UV detection enabled adequate quantification of AQC amino acid derivatives at the μM level (LOQs from 0.12 to 1.10 μM), providing sufficient sensitivity for amino acid profiling in wine samples. Moreover, this methodology was successfully applied to monitor the changes in FAA concentrations in four distinct sets of on-lees aged white wines (fermented with different yeasts) at three progressive ripening periods (control, 3 and 6 months aging). For the control wines, the applied winery yeast significantly affected total FAA amounts (1450-1740 mg L(-1)). During maturation, the proceeding yeast autolysis had a rather complex impact on FAAs, yielding total FAA excretions of up to 360 mg L(-1). However, the magnitude of the increases in specific FAAs (up to +200%) depended strongly on the individual amino acids as well as on the applied fermenting yeast. Given the overall complexity of yeast autolysis in winemaking, efficient LC techniques such as UPLC™ may indeed contribute as a valuable tool in wine research for product monitoring and characterization of intrinsic developments during wine maturation. Copyright © 2011 Elsevier B.V. All rights reserved.
An item response curves analysis of the Force Concept Inventory
NASA Astrophysics Data System (ADS)
Morris, Gary A.; Harshman, Nathan; Branum-Martin, Lee; Mazur, Eric; Mzoughi, Taha; Baker, Stephen D.
2012-09-01
Several years ago, we introduced the idea of item response curves (IRC), a simplistic form of item response theory (IRT), to the physics education research community as a way to examine item performance on diagnostic instruments such as the Force Concept Inventory (FCI). We noted that a full-blown analysis using IRT would be a next logical step, which several authors have since taken. In this paper, we show that our simple approach not only yields similar conclusions in the analysis of the performance of items on the FCI to the more sophisticated and complex IRT analyses but also permits additional insights by characterizing both the correct and incorrect answer choices. Our IRC approach can be applied to a variety of multiple-choice assessments but, as applied to a carefully designed instrument such as the FCI, allows us to probe student understanding as a function of ability level through an examination of each answer choice. We imagine that physics teachers could use IRC analysis to identify prominent misconceptions and tailor their instruction to combat those misconceptions, fulfilling the FCI authors' original intentions for its use. Furthermore, the IRC analysis can assist test designers to improve their assessments by identifying nonfunctioning distractors that can be replaced with distractors attractive to students at various ability levels.
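The IRC idea is simple to implement: bin students by total score as an ability proxy, then compute the fraction selecting each answer option per bin, for distractors as well as the keyed answer. A minimal sketch with hypothetical FCI-style data (the option indices and keyed answer are invented for illustration):

```python
import numpy as np

def item_response_curves(choices, scores, n_options, n_bins=8):
    """For one item: fraction of students at each ability level (binned
    total score) selecting each answer option, correct or not."""
    bins = np.linspace(scores.min(), scores.max(), n_bins + 1)
    idx = np.clip(np.digitize(scores, bins) - 1, 0, n_bins - 1)
    curves = np.zeros((n_options, n_bins))
    for b in range(n_bins):
        in_bin = idx == b
        if in_bin.any():
            for opt in range(n_options):
                curves[opt, b] = np.mean(choices[in_bin] == opt)
    return bins, curves

# Hypothetical data: 300 students, a 5-option item with option 2 keyed correct.
rng = np.random.default_rng(0)
scores = rng.integers(0, 31, 300)              # total test score, 0-30
p_correct = scores / 30                        # higher ability -> keyed answer
choices = np.where(rng.random(300) < p_correct, 2,
                   rng.integers(0, 5, 300))    # otherwise a random option
bins, curves = item_response_curves(choices, scores, n_options=5)
```

Plotting each row of `curves` against the bin centers gives one trace per answer choice; a distractor whose trace stays flat and low at every ability level is a candidate for replacement.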
Yoo, Doo Han; Lee, Jae Shin
2016-07-01
[Purpose] This study examined the clinical usefulness of the clock drawing test, applying Rasch analysis, for predicting the level of cognitive impairment. [Subjects and Methods] A total of 187 stroke patients with cognitive impairment were enrolled in this study. The 187 patients were evaluated with the clock drawing test developed through Rasch analysis along with the mini-mental state examination as a cognitive evaluation tool. An analysis of variance was performed to examine the significance of the mini-mental state examination and the clock drawing test according to the general characteristics of the subjects. Receiver operating characteristic analysis was performed to determine the cutoff point for cognitive impairment and to calculate sensitivity and specificity values. [Results] Comparison of the clock drawing test with the mini-mental state examination showed significant differences according to gender, age, education, and affected side. A total CDT score of 10.5, selected as the cutoff point to identify cognitive impairment, showed sensitivity, specificity, Youden index, positive predictive, and negative predictive values of 86.4%, 91.5%, 0.8, 95%, and 88.2%, respectively. [Conclusion] The clock drawing test is believed to be useful in assessments and interventions based on its excellent ability to identify cognitive disorders.
Computer assisted analysis of auroral images obtained from high altitude polar satellites
NASA Technical Reports Server (NTRS)
Samadani, Ramin; Flynn, Michael
1993-01-01
Automatic techniques that allow the extraction of physically significant parameters from auroral images were developed. This allows the processing of a much larger number of images than is currently possible with manual techniques. Our techniques were applied to diverse auroral image datasets. These results were made available to geophysicists at NASA and at universities in the form of a software system that performs the analysis. After some feedback from users, an upgraded system was transferred to NASA and to two universities. The feasibility of user-trained search and retrieval of large amounts of data using our automatically derived parameter indices was demonstrated. Techniques based on classification and regression trees (CART) were developed and applied to broaden the types of images to which the automated search and retrieval may be applied. Our techniques were tested with DE-1 auroral images.
Investigation of energy management strategies for photovoltaic systems - An analysis technique
NASA Technical Reports Server (NTRS)
Cull, R. C.; Eltimsahy, A. H.
1982-01-01
Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.
Adjoint equations and analysis of complex systems: Application to virus infection modelling
NASA Astrophysics Data System (ADS)
Marchuk, G. I.; Shutyaev, V.; Bocharov, G.
2005-12-01
Recent development of applied mathematics is characterized by ever increasing attempts to apply modelling and computational approaches across various areas of the life sciences. The need for a rigorous analysis of complex system dynamics in immunology has been recognized for more than three decades. The aim of the present paper is to draw attention to the method of adjoint equations. The methodology enables one to obtain information about physical processes and to examine the sensitivity of complex dynamical systems. This provides a basis for a better understanding of the causal relationships between the immune system's performance and its parameters, and helps to improve experimental design in the solution of applied problems. We show how the adjoint equations can be used to explain the changes in hepatitis B virus infection dynamics between individual patients.
The BioMedical Evidence Graph (BMEG) | Informatics Technology for Cancer Research (ITCR)
The BMEG is a cancer data integration platform that utilizes methods collected from DREAM challenges, applies them to large datasets such as TCGA, and makes them available for analysis using a high-performance graph database.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prowell, Stacy J; Symons, Christopher T
2015-01-01
Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.
Crosta, Fernando; Nishiwaki-Dantas, Maria Cristina; Silvino, Wilmar; Dantas, Paulo Elias Correa
2005-01-01
To verify the frequency of study design, applied statistical analysis, and approval by institutional review offices (Ethics Committees) of articles published in the "Arquivos Brasileiros de Oftalmologia" during a 10-year interval, with subsequent comparative and critical analysis against some of the main international journals in the field of Ophthalmology. A systematic review without meta-analysis was performed. Scientific papers published in the "Arquivos Brasileiros de Oftalmologia" between January 1993 and December 2002 were reviewed by two independent reviewers and classified according to the applied study design, statistical analysis, and approval by the institutional review offices. To categorize those variables, a descriptive statistical analysis was used. After applying inclusion and exclusion criteria, 584 articles were reviewed for evaluation of statistical analysis and 725 articles for evaluation of study design. Contingency tables (23.10%) were the most frequently applied statistical method, followed by non-parametric tests (18.19%), Student's t test (12.65%), central tendency measures (10.60%) and analysis of variance (9.81%). Of the 584 reviewed articles, 291 (49.82%) presented no statistical analysis. Observational case series (26.48%) was the most frequently used study design, followed by interventional case series (18.48%), observational case description (13.37%), non-random clinical study (8.96%) and experimental study (8.55%). We found a higher frequency of observational clinical studies and a lack of statistical analysis in almost half of the published papers. An increase in studies with Ethics Committee approval was noted after it became mandatory in 1996.
NASA Technical Reports Server (NTRS)
Egolf, T. Alan; Anderson, Olof L.; Edwards, David E.; Landgrebe, Anton J.
1988-01-01
A computer program, the Propeller Nacelle Aerodynamic Performance Prediction Analysis (PANPER), was developed for the prediction and analysis of the performance and airflow of propeller-nacelle configurations operating over a forward speed range inclusive of high speed flight typical of recent propfan designs. A propeller lifting line, wake program was combined with a compressible, viscous center body interaction program, originally developed for diffusers, to compute the propeller-nacelle flow field, blade loading distribution, propeller performance, and the nacelle forebody pressure and viscous drag distributions. The computer analysis is applicable to single and coaxial counterrotating propellers. The blade geometries can include spanwise variations in sweep, droop, taper, thickness, and airfoil section type. In the coaxial mode of operation the analysis can treat both equal and unequal blade number and rotational speeds on the propeller disks. The nacelle portion of the analysis can treat both free air and tunnel wall configurations including wall bleed. The analysis was applied to many different sets of flight conditions using selected aerodynamic modeling options. The influence of different propeller nacelle-tunnel wall configurations was studied. Comparisons with available test data for both single and coaxial propeller configurations are presented along with a discussion of the results.
White blood cell counting analysis of blood smear images using various segmentation strategies
NASA Astrophysics Data System (ADS)
Safuan, Syadia Nabilah Mohd; Tomari, Razali; Zakaria, Wan Nurshazwani Wan; Othman, Nurmiza
2017-09-01
In white blood cell (WBC) diagnosis, the most crucial measurement parameter is the WBC count. Such information is widely used to evaluate the effectiveness of cancer therapy and to diagnose hidden infections within the human body. The current practice of manual WBC counting is laborious and highly subjective, which has led to the development of computer aided systems (CAS) with rigorous image processing solutions. In CAS counting, segmentation is the crucial step for ensuring the accuracy of the counted cells. An optimal segmentation strategy that can work under various blood smear image acquisition conditions remains a great challenge. In this paper, a comparison between different segmentation methods based on color space analysis is elaborated to find the best counting outcome. Initially, color space correction is applied to the original blood smear image to standardize the image color intensity level. Next, white blood cell segmentation is performed using a combination of several color analysis subtractions (RGB, CMYK and HSV) and Otsu thresholding. Noise and unwanted regions that remain after the segmentation process are eliminated by applying a combination of morphological and Connected Component Labelling (CCL) filters. Eventually, the Circle Hough Transform (CHT) method is applied to the segmented image to estimate the number of WBCs, including those in clump regions. From the experiments, it is found that G-S yields the best performance.
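A plausible reconstruction of such a pipeline in OpenCV is sketched below: Otsu thresholding on one color channel, morphological cleanup, small-component removal via CCL, and circle Hough counting. The choice of the HSV saturation channel, the area cutoff, the file path, and the Hough parameters are all assumptions for illustration, not the paper's tuned values.

```python
import cv2
import numpy as np

img = cv2.imread("smear.png")                  # hypothetical blood smear image
# Stained WBC nuclei are often prominent in HSV saturation (assumption).
s = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)[:, :, 1]
_, mask = cv2.threshold(s, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Morphological opening removes small debris; CCL drops tiny components.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
small = np.where(stats[:, cv2.CC_STAT_AREA] < 200)[0]
mask[np.isin(labels, small)] = 0

# Circle Hough Transform estimates the count, including cells in clumps.
circles = cv2.HoughCircles(mask, cv2.HOUGH_GRADIENT, dp=2, minDist=30,
                           param1=50, param2=25, minRadius=15, maxRadius=60)
print("estimated WBC count:", 0 if circles is None else circles.shape[1])
```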
ERIC Educational Resources Information Center
Murillo, F. Javier; Roman, Marcela
2011-01-01
The purpose of this investigation is to determine the incidence of school infrastructure and resources and its impact on the academic performance of primary education students in Latin America. A 4-level multilevel model was applied to the data of the Second Regional Comparative and Explanatory Study (SERCE) conducted by UNESCO, which researched…
40 CFR Table 7 to Subpart Mmmmm of... - Applicability of General Provisions to Subpart MMMMM
Code of Federal Regulations, 2010 CFR
2010-07-01
... alternative test method Yes. § 63.7(g) Performance test data analysis, recordkeeping, and reporting Yes. § 63... emissions monitoring systems (CEMS). § 63.8(g) Data reduction Yes Applies as modified by § 63.8794(g). § 63... CMS notifications—date of CMS performance evaluation Yes. § 63.9(g)(2) Use of COMS data No Subpart...
Noise distribution and denoising of current density images
Beheshti, Mohammadali; Foomany, Farbod H.; Magtibay, Karl; Jaffray, David A.; Krishnan, Sridhar; Nanthakumar, Kumaraswamy; Umapathy, Karthikeyan
2015-01-01
Current density imaging (CDI) is a magnetic resonance (MR) imaging technique that can be used to study current pathways inside tissue. The current distribution is measured indirectly as phase changes. The inherent noise in the MR imaging technique degrades the accuracy of phase measurements, leading to imprecise current variations. The outcome can be affected significantly, especially at a low signal-to-noise ratio (SNR). We have shown the residual noise distribution of the phase to be Gaussian-like, and the noise in CDI images can be approximated as Gaussian; this finding matches experimental results. We further investigated this finding by performing a comparative analysis with denoising techniques, using two CDI datasets with two different currents (20 and 45 mA). We found that the block-matching and three-dimensional (BM3D) technique outperforms other techniques when applied to current density (J). The minimum gain in noise power by BM3D applied to J compared with the next best technique in the analysis was found to be around 2 dB per pixel. We characterize the noise profile in CDI images and provide insights on the performance of different denoising techniques when applied at two different stages of current density reconstruction. PMID:26158100
NASA Astrophysics Data System (ADS)
Zhao, Yan-Ru; Yu, Ke-Qiang; Li, Xiaoli; He, Yong
2016-12-01
Infected petals are often regarded as the source for the spread of fungi Sclerotinia sclerotiorum in all growing process of rapeseed (Brassica napus L.) plants. This research aimed to detect fungal infection of rapeseed petals by applying hyperspectral imaging in the spectral region of 874-1734 nm coupled with chemometrics. Reflectance was extracted from regions of interest (ROIs) in the hyperspectral image of each sample. Firstly, principal component analysis (PCA) was applied to conduct a cluster analysis with the first several principal components (PCs). Then, two methods including X-loadings of PCA and random frog (RF) algorithm were used and compared for optimizing wavebands selection. Least squares-support vector machine (LS-SVM) methodology was employed to establish discriminative models based on the optimal and full wavebands. Finally, area under the receiver operating characteristics curve (AUC) was utilized to evaluate classification performance of these LS-SVM models. It was found that LS-SVM based on the combination of all optimal wavebands had the best performance with AUC of 0.929. These results were promising and demonstrated the potential of applying hyperspectral imaging in fungus infection detection on rapeseed petals.
Dai, Wensheng; Wu, Jui-Yu; Lu, Chi-Jie
2014-01-01
Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales since an IT chain store has many branches. Integrating feature extraction method and prediction tool, such as support vector regression (SVR), is a useful method for constructing an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to deal with various forecasting problems. But, up to now, only the basic ICA method (i.e., temporal ICA model) was applied to sale forecasting problem. In this paper, we utilize three different ICA methods including spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA) to extract features from the sales data and compare their performance in sales forecasting of IT chain store. Experimental results from a real sales data show that the sales forecasting scheme by integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data and the extracted features can improve the prediction performance of SVR for sales forecasting.
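A minimal sketch of the ICA-plus-SVR scheme with scikit-learn follows. The sales matrix is synthetic, FastICA stands in for the paper's specific sICA/tICA/stICA variants (transposing the input is one simple way to switch between temporal and spatial views), and the one-step-ahead setup is an illustrative simplification.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.svm import SVR

# Hypothetical branch-sales matrix: rows = weeks, columns = branches.
rng = np.random.default_rng(0)
sales = np.cumsum(rng.normal(100, 10, (104, 8)), axis=0)

# Extract independent components as features (temporal-ICA-style view).
ica = FastICA(n_components=3, random_state=0)
features = ica.fit_transform(sales)

# One-step-ahead forecast of a target branch's weekly change from the
# previous week's ICA features.
X, y = features[:-1], np.diff(sales[:, 0])
model = SVR(kernel="rbf", C=10.0).fit(X, y)
print("in-sample MAE:", np.abs(model.predict(X) - y).mean())
```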
Šiljić Tomić, Aleksandra; Antanasijević, Davor; Ristić, Mirjana; Perić-Grujić, Aleksandra; Pocajt, Viktor
2018-01-01
Accurate prediction of water quality parameters (WQPs) is an important task in the management of water resources. Artificial neural networks (ANNs) are frequently applied for dissolved oxygen (DO) prediction, but often only their interpolation performance is checked. The aims of this research, besides interpolation, were to determine the extrapolation performance of an ANN model developed for the prediction of DO content in the Danube River, and to assess the relationship between the significance of the inputs and the prediction error in the presence of values outside the range of the training data. The applied ANN is a polynomial neural network (PNN), which performs embedded selection of the most important inputs during learning and provides a model in the form of linear and non-linear polynomial functions, which can then be used for a detailed analysis of input significance. The available dataset of 1912 monitoring records for 17 water quality parameters was split into a "regular" subset containing normally distributed, low-variability data, and an "extreme" subset containing monitoring records with outlier values. The results revealed that the non-linear PNN model has good interpolation performance (R² = 0.82), but it was not robust in extrapolation (R² = 0.63). The analysis of extrapolation results showed that the prediction errors are correlated with the significance of the inputs. Namely, out-of-training-range values of inputs with low importance do not significantly affect PNN model performance, but their influence can be biased by the presence of multi-outlier monitoring records. Subsequently, linear PNN models were successfully applied to study the effect of water quality parameters on DO content. It was observed that DO level is mostly affected by temperature, pH, biological oxygen demand (BOD) and phosphorus concentration, while in extreme conditions the importance of alkalinity and bicarbonates rises over pH and BOD. Copyright © 2017 Elsevier B.V. All rights reserved.
Militello, L G; Hutton, R J
1998-11-01
Cognitive task analysis (CTA) is a set of methods for identifying cognitive skills, or mental demands, needed to perform a task proficiently. The product of the task analysis can be used to inform the design of interfaces and training systems. However, CTA is resource intensive and has previously been of limited use to design practitioners. A streamlined method of CTA, Applied Cognitive Task Analysis (ACTA), is presented in this paper. ACTA consists of three interview methods that help the practitioner to extract information about the cognitive demands and skills required for a task. ACTA also allows the practitioner to represent this information in a format that will translate more directly into applied products, such as improved training scenarios or interface recommendations. This paper will describe the three methods, an evaluation study conducted to assess the usability and usefulness of the methods, and some directions for future research for making cognitive task analysis accessible to practitioners. ACTA techniques were found to be easy to use, flexible, and to provide clear output. The information and training materials developed based on ACTA interviews were found to be accurate and important for training purposes.
Artificial intelligence in sports on the example of weight training.
Novatchkov, Hristo; Baca, Arnold
2013-01-01
The overall goal of the present study was to illustrate the potential of artificial intelligence (AI) techniques in sports on the example of weight training. The research focused in particular on the implementation of pattern recognition methods for the evaluation of performed exercises on training machines. The data acquisition was carried out using way and cable force sensors attached to various weight machines, thereby enabling the measurement of essential displacement and force determinants during training. On the basis of the gathered data, it was consequently possible to deduce other significant characteristics like time periods or movement velocities. These parameters were applied for the development of intelligent methods adapted from conventional machine learning concepts, allowing an automatic assessment of the exercise technique and providing individuals with appropriate feedback. In practice, the implementation of such techniques could be crucial for the investigation of the quality of the execution, the assistance of athletes but also coaches, the training optimization and for prevention purposes. For the current study, the data was based on measurements from 15 rather inexperienced participants, performing 3-5 sets of 10-12 repetitions on a leg press machine. The initially preprocessed data was used for the extraction of significant features, on which supervised modeling methods were applied. Professional trainers were involved in the assessment and classification processes by analyzing the video recorded executions. The so far obtained modeling results showed good performance and prediction outcomes, indicating the feasibility and potency of AI techniques in assessing performances on weight training equipment automatically and providing sportsmen with prompt advice. Key points: Artificial intelligence is a promising field for sport-related analysis. Implementations integrating pattern recognition techniques enable the automatic evaluation of data measurements. Artificial neural networks applied for the analysis of weight training data show good performance and high classification rates.
In-injection port thermal desorption for explosives trace evidence analysis.
Sigman, M E; Ma, C Y
1999-10-01
A gas chromatographic method utilizing thermal desorption of a dry surface wipe for the analysis of explosives trace chemical evidence has been developed and validated using electron capture and negative ion chemical ionization mass spectrometric detection. Thermal desorption was performed within a split/splitless injection port with minimal instrument modification. Surface-abraded Teflon tubing provided the solid support for sample collection and desorption. Performance was characterized by desorption efficiency, reproducibility, linearity of the calibration, and method detection and quantitation limits. Method validation was performed with a series of dinitrotoluenes, trinitrotoluene, two nitroester explosives, and one nitramine explosive. The method was applied to the sampling of a single piece of debris from an explosion containing trinitrotoluene.
EEG source analysis of data from paralysed subjects
NASA Astrophysics Data System (ADS)
Carabali, Carmen A.; Willoughby, John O.; Fitzgibbon, Sean P.; Grummett, Tyler; Lewis, Trent; DeLosAngeles, Dylan; Pope, Kenneth J.
2015-12-01
One of the limitations of electroencephalography (EEG) data is its quality, as it is usually contaminated with electrical signal from muscle. This research studies the results of two EEG source analysis methods applied to scalp recordings taken in paralysis and in normal conditions during the performance of a cognitive task. The aim is to determine which types of analysis are appropriate for dealing with EEG data containing myogenic components. The data used are the scalp recordings of six subjects in normal conditions and during paralysis while performing different cognitive tasks, including the oddball task, which is the object of this research. The data were pre-processed by filtering and artefact correction; then, one-second epochs for targets and distractors were extracted. Distributed source analysis was performed in BESA Research 6.0. Using its results and information from the literature, nine ideal locations for source dipoles were identified. The nine dipoles were used to perform discrete source analysis, fitting them to the averaged epochs to obtain source waveforms. The results were statistically analysed, comparing the outcomes before and after the subjects were paralysed. Finally, frequency analysis was performed to better explain the results. The findings were that distributed source analysis can produce confounded results for EEG contaminated with myogenic signals; conversely, statistical analysis of the results from discrete source analysis showed that this method can help in dealing with EEG data contaminated with muscle electrical signal.
Development of Officer Selection Battery Forms 3 and 4
1986-03-01
This report describes the development, standardization, and validation of two parallel forms of a test to be used for assessing young men and women applying to ROTC, designed for appropriate difficulty, high reliability, and state-of-the-art validity and fairness for minorities and women. EDGAR M. JOHNSON, Technical Director. The objective was to develop an easily administrable test for use in assessing young men and women applying to Advanced Army ROTC. Procedure: Earlier research had performed an analysis of the
Meta-analyzing dependent correlations: an SPSS macro and an R script.
Cheung, Shu Fai; Chan, Darius K-S
2014-06-01
The presence of dependent correlation is a common problem in meta-analysis. Cheung and Chan (2004, 2008) have shown that samplewise-adjusted procedures perform better than the more commonly adopted simple within-sample mean procedures. However, samplewise-adjusted procedures have rarely been applied in meta-analytic reviews, probably due to the lack of suitable ready-to-use programs. In this article, we compare the samplewise-adjusted procedures with existing procedures to handle dependent effect sizes, and present the samplewise-adjusted procedures in a way that will make them more accessible to researchers conducting meta-analysis. We also introduce two tools, an SPSS macro and an R script, that researchers can apply to their meta-analyses; these tools are compatible with existing meta-analysis software packages.
Benchmarking hydrological model predictive capability for UK River flows and flood peaks.
NASA Astrophysics Data System (ADS)
Lane, Rosanna; Coxon, Gemma; Freer, Jim; Wagener, Thorsten
2017-04-01
Data and hydrological models are now available for national hydrological analyses. However, hydrological model performance varies between catchments, and lumped conceptual models are not able to produce adequate simulations everywhere. This study aims to benchmark hydrological model performance for catchments across the United Kingdom within an uncertainty analysis framework. We have applied four hydrological models from the FUSE framework to 1128 catchments across the UK. These are all lumped models run at a daily timestep, but they differ in model structural architecture and process parameterisations, therefore producing different but equally plausible simulations. We apply FUSE over the 20-year period 1988-2008 within a GLUE Monte Carlo uncertainty analysis framework. Model performance was evaluated for each catchment, model structure and parameter set using standard performance metrics, calculated both for the whole time series and to assess seasonal differences in model performance. The GLUE uncertainty analysis framework was then applied to produce simulated 5th and 95th percentile uncertainty bounds for the daily flow time series, and additionally annual maximum prediction bounds for each catchment. The results show that model performance varies significantly in space and time depending on catchment characteristics including climate, geology and human impact. We identify regions where models systematically fail to produce good results and present reasons why this could be the case. We also identify regions or catchment characteristics where one model performs better than others, and we have explored which structural component or parameterisation enables certain models to produce better simulations in these catchments. Model predictive capability was assessed for each catchment by examining the ability of the models to produce discharge prediction bounds that successfully bound the observed discharge. These results improve our understanding of the predictive capability of simple conceptual hydrological models across the UK and help us to identify where further effort is needed to develop modelling approaches that better represent different catchment and climate typologies.
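The GLUE procedure described is straightforward to sketch: score Monte Carlo parameter sets with a skill metric, keep the "behavioural" ones above a threshold, and form likelihood-weighted flow percentiles per timestep. The sketch below uses Nash-Sutcliffe efficiency and synthetic data; the threshold and quantiles are illustrative assumptions, not the study's settings.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency, a standard hydrological skill score."""
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def glue_bounds(obs, simulations, threshold=0.3, q=(0.05, 0.95)):
    """GLUE-style bounds: keep behavioural runs (NSE above threshold),
    weight by score, and take per-timestep weighted flow percentiles."""
    scores = np.array([nse(obs, s) for s in simulations])
    sims = simulations[scores > threshold]
    w = scores[scores > threshold]
    w = w / w.sum()                          # likelihood weights
    order = np.argsort(sims, axis=0)
    lower, upper = [], []
    for t in range(sims.shape[1]):
        s = sims[order[:, t], t]             # sorted flows at time t
        cw = np.cumsum(w[order[:, t]])       # cumulative weights
        lower.append(np.interp(q[0], cw, s))
        upper.append(np.interp(q[1], cw, s))
    return np.array(lower), np.array(upper)

# Hypothetical ensemble: 1000 Monte Carlo runs over 365 days.
rng = np.random.default_rng(0)
obs = 5 + 2 * np.sin(np.linspace(0, 12, 365))
sims = obs + rng.normal(0, 1, (1000, 365)) + rng.normal(0, 0.5, (1000, 1))
lo, hi = glue_bounds(obs, sims)
print("observed flow inside bounds:", np.mean((obs >= lo) & (obs <= hi)))
```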
Integrated flight/propulsion control - Subsystem specifications for performance
NASA Technical Reports Server (NTRS)
Neighbors, W. K.; Rock, Stephen M.
1993-01-01
A procedure is presented for calculating multiple subsystem specifications given a number of performance requirements on the integrated system. This procedure applies to problems where the control design must be performed in a partitioned manner. It is based on a structured singular value analysis, and generates specifications as magnitude bounds on subsystem uncertainties. The performance requirements should be provided in the form of bounds on transfer functions of the integrated system. This form allows the expression of model following, command tracking, and disturbance rejection requirements. The procedure is demonstrated on a STOVL aircraft design.
Identifying the "Right Stuff": An Exploration-Focused Astronaut Job Analysis
NASA Technical Reports Server (NTRS)
Barrett, J. D.; Holland, A. W.; Vessey, W. B.
2015-01-01
Industrial and organizational (I/O) psychologists play a key role in NASA astronaut candidate selection through the identification of the competencies necessary to successfully engage in the astronaut job. A set of psychosocial competencies, developed by I/O psychologists during a prior job analysis conducted in 1996 and updated in 2003, were identified as necessary for individuals working and living in the space shuttle and on the International Space Station (ISS). This set of competencies applied to the space shuttle and applies to current ISS missions, but may not apply to longer-duration or long-distance exploration missions. With the 2015 launch of the first 12- month ISS mission and the shift in the 2020s to missions beyond low earth orbit, the type of missions that astronauts will conduct and the environment in which they do their work will change dramatically, leading to new challenges for these crews. To support future astronaut selection, training, and research, I/O psychologists in NASA's Behavioral Health and Performance (BHP) Operations and Research groups engaged in a joint effort to conduct an updated analysis of the astronaut job for current and future operations. This project will result in the identification of behavioral competencies critical to performing the astronaut job, along with relative weights for each of the identified competencies, through the application of job analysis techniques. While this job analysis is being conducted according to job analysis best practices, the project poses a number of novel challenges. These challenges include the need to identify competencies for multiple mission types simultaneously, to evaluate jobs that have no incumbents as they have never before been conducted, and working with a very limited population of subject matter experts. Given these challenges, under the guidance of job analysis experts, we used the following methods to conduct the job analysis and identify the key competencies for current and potential future missions.
Knight, Jo; North, Bernard V; Sham, Pak C; Curtis, David
2003-12-31
This paper presents a method of performing model-free LOD-score based linkage analysis on quantitative traits. It is implemented in the QMFLINK program. The method is used to perform a genome screen on the Framingham Heart Study data. A number of markers that show some support for linkage in our study coincide substantially with those implicated in other linkage studies of hypertension. Although the new method needs further testing on additional real and simulated data sets we can already say that it is straightforward to apply and may offer a useful complementary approach to previously available methods for the linkage analysis of quantitative traits.
Sensitivity of VIIRS Polarization Measurements
NASA Technical Reports Server (NTRS)
Waluschka, Eugene
2010-01-01
The design of an optical system typically involves a sensitivity analysis where the various lens parameters, such as lens spacing and curvatures, to name two, are (slightly) varied to see what, if any, effect this has on performance and to establish manufacturing tolerances. A similar analysis was performed for the VIIRS instrument's polarization measurements to see how real-world departures from perfectly linearly polarized light entering VIIRS affect the polarization measurement. The methodology and a few of the results of this polarization sensitivity analysis are presented and applied to the construction of a single polarizer which will cover the VIIRS VIS/NIR spectral range. Keywords: VIIRS, polarization, ray trace, polarizers, Bolder Vision, MOXTEK
Dynamic test/analysis correlation using reduced analytical models
NASA Technical Reports Server (NTRS)
Mcgowan, Paul E.; Angelucci, A. Filippo; Javeed, Mehzad
1992-01-01
Test/analysis correlation is an important aspect of the verification of analysis models which are used to predict on-orbit response characteristics of large space structures. This paper presents results of a study using reduced analysis models for performing dynamic test/analysis correlation. The reduced test-analysis model (TAM) has the same number and orientation of degrees of freedom (DOF) as the test measurements. Two reduction methods, static (Guyan) reduction and the Improved Reduced System (IRS) reduction, are applied to the test/analysis correlation of a laboratory truss structure. Simulated test results and modal test data are used to examine the performance of each method. It is shown that the selection of DOF to be retained in the TAM is critical when large structural masses are involved. In addition, the use of modal test results may introduce difficulties in TAM accuracy even if a large number of DOF are retained in the TAM.
Performance Analysis of Receive Diversity in Wireless Sensor Networks over GBSBE Models
Goel, Shivali; Abawajy, Jemal H.; Kim, Tai-hoon
2010-01-01
Wireless sensor networks have attracted a lot of attention recently. In this paper, we develop a channel model based on the elliptical model for multipath components involving randomly placed scatterers in the scattering region, with sensors deployed on a field. We verify that in a sensor network, the use of receive diversity techniques improves the performance of the system. Extensive performance analysis of the system is carried out for both single and multiple antennas with the applied receive diversity techniques. Performance analyses based on variations in receiver height, maximum multipath delay and transmit power have been performed considering different numbers of antenna elements present in the receiver array. Our results show that increasing the number of antenna elements in a wireless sensor network does indeed improve the achievable bit error rate (BER). PMID:22163510
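The improvement that receive diversity brings can be illustrated with a short Monte Carlo sketch. The snippet below is a generic flat Rayleigh-fading stand-in, not the paper's GBSBE channel model: it transmits BPSK over N receive branches, combines them with maximal ratio combining (MRC), and reports the BER, which drops as antennas are added.

```python
# Monte Carlo sketch: BPSK over flat Rayleigh fading with maximal ratio
# combining (MRC) across N receive antennas. Generic channel stand-in,
# not the GBSBE scattering model from the paper.
import numpy as np

rng = np.random.default_rng(0)
n_bits, snr_db = 200_000, 10.0
noise_sigma = np.sqrt(1.0 / (2.0 * 10 ** (snr_db / 10)))  # per real dimension

for n_ant in (1, 2, 4):
    bits = rng.integers(0, 2, n_bits)
    s = 2.0 * bits - 1.0                                   # BPSK symbols
    h = (rng.standard_normal((n_ant, n_bits)) +
         1j * rng.standard_normal((n_ant, n_bits))) / np.sqrt(2)
    noise = noise_sigma * (rng.standard_normal((n_ant, n_bits)) +
                           1j * rng.standard_normal((n_ant, n_bits)))
    r = h * s + noise
    # MRC: weight each branch by its conjugate channel gain, then sum.
    decision = np.sum(np.conj(h) * r, axis=0).real > 0
    print(n_ant, "antennas, BER =", np.mean(decision != (bits == 1)))
```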
Boakye-Dankwa, Ernest; Teeple, Erin; Gore, Rebecca; Punnett, Laura
2017-11-01
We performed an integrated cross-sectional analysis of relationships between long-term care work environments, employee and resident satisfaction, and quality of patient care. Facility-level data came from a network of 203 skilled nursing facilities in 13 states in the eastern United States owned or managed by one company. K-means cluster analysis was applied to investigate clustered associations between safe resident handling program (SRHP) performance, resident care outcomes, employee satisfaction, rates of workers' compensation claims, and resident satisfaction. Facilities in the better-performing cluster were found to have better patient care outcomes and resident satisfaction; lower rates of workers compensation claims; better SRHP performance; higher employee retention; and greater worker job satisfaction and engagement. The observed clustered relationships support the utility of integrated performance assessment in long-term care facilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakajima, K.; Bunko, H.; Tada, A.
1984-01-01
Phase analysis has been applied to Wolff-Parkinson-White syndrome (WPW) to detect the site of the accessory conduction pathway (ACP); however, planar phase analysis is limited in its ability to estimate the precise location of the ACP. In this study, the authors applied phase analysis to gated blood pool tomography. Twelve patients with WPW who underwent epicardial mapping and surgical division of the ACP were studied by both gated emission computed tomography (GECT) and a routine gated blood pool study (GBPS). The GBPS was performed with Tc-99m red blood cells in multiple projections: modified left anterior oblique, right anterior oblique and/or left lateral views. In GECT, short axial, horizontal and vertical long axial blood pool images were reconstructed. Phase analysis was performed using the fundamental frequency of the Fourier transform in both GECT and GBPS images, and abnormal initial contractions on both the planar and tomographic phase analysis were compared with the location of surgically confirmed ACPs. In planar phase analysis, an abnormal initial phase was identified in 7 out of 12 (58%) patients, while in tomographic phase analysis, the localization of the ACP was predicted in 11 out of 12 (92%) patients. Tomographic phase analysis was superior to planar phase imaging in 8 out of 12 patients for estimating the location of the ACP. Phase analysis by GECT avoids overlap of the blood pools in the cardiac chambers and has the advantage of identifying the propagation of phase three-dimensionally. Tomographic phase analysis is a good adjunctive method for estimating the site of the ACP in patients with WPW.
Towards a Grand Unified Theory of sports performance.
Glazier, Paul S
2017-12-01
Sports performance is generally considered to be governed by a range of interacting physiological, biomechanical, and psychological variables, amongst others. Despite sports performance being multi-factorial, however, the majority of performance-oriented sports science research has been predominantly monodisciplinary in nature, presumably due, at least in part, to the lack of a unifying theoretical framework required to integrate the various subdisciplines of sports science. In this target article, I propose a Grand Unified Theory (GUT) of sports performance (and, by elaboration, sports science) based around the constraints framework introduced originally by Newell (1986). A central tenet of this GUT is that, at both the intra- and inter-individual levels of analysis, patterns of coordination and control, which directly determine the performance outcome, emerge from the confluence of interacting organismic, environmental, and task constraints via the formation and self-organisation of coordinative structures. It is suggested that this GUT could be used to: foster interdisciplinary research collaborations; break down the silos that have developed in sports science and restore greater disciplinary balance to the field; promote a more holistic understanding of sports performance across all levels of analysis; increase the explanatory power of applied research work; provide a stronger rationale for data collection and variable selection; and direct the development of integrated performance monitoring technologies. This GUT could also provide a scientifically rigorous basis for integrating the subdisciplines of sports science in applied sports science support programmes adopted by high-performance agencies and national governing bodies for various individual and team sports. Copyright © 2017 Elsevier B.V. All rights reserved.
Representing energy efficiency diagnosis strategies in cognitive work analysis.
Hilliard, Antony; Jamieson, Greg A
2017-03-01
This article describes challenges encountered in applying Jens Rasmussen's Cognitive Work Analysis (CWA) framework to the practice of energy efficiency Monitoring & Targeting (M&T). Eight theoretic issues encountered in the analysis are described with respect to Rasmussen's work and the modeling solutions we adopted. We grappled with how to usefully apply Work Domain Analysis (WDA) to analyze categories of domains with secondary purposes and no ideal grain of decomposition. This difficulty encouraged us to pursue Control Task Analysis (ConTA) and Strategies Analysis (StrA), which are under-explored as bases for interface design. In ConTA we found M&T was best represented by two interlinked work functions: one controlling energy, the other maintaining knowledge representations. From StrA, we identified a popular representation-dependent strategy and inferred the information required to diagnose faults in system performance and knowledge representation. This article presents and discusses excerpts from our analysis, and outlines their application to diagnosis support tools. Copyright © 2015 Elsevier Ltd. All rights reserved.
Analysis of Photovoltaic System Energy Performance Evaluation Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, S.; Newmiller, J.; Kimber, A.
2013-11-01
Documentation of the energy yield of a large photovoltaic (PV) system over a substantial period can be useful to measure a performance guarantee, as an assessment of the health of the system, for verification of a performance model to then be applied to a new system, or for a variety of other purposes. Although the measurement of this performance metric might appear to be straightforward, there are a number of subtleties associated with variations in weather and imperfect data collection that complicate the determination and data analysis. A performance assessment is most valuable when it is completed with a very low uncertainty and when the subtleties are systematically addressed, yet currently no standard exists to guide this process. This report summarizes a draft methodology for an Energy Performance Evaluation Method, the philosophy behind the draft method, and the lessons that were learned by implementing the method.
Space Suit Performance: Methods for Changing the Quality of Quantitative Data
NASA Technical Reports Server (NTRS)
Cowley, Matthew; Benson, Elizabeth; Rajulu, Sudhakar
2014-01-01
NASA is currently designing a new space suit capable of working in deep space and on Mars. Designing a suit is very difficult and often requires trade-offs between performance, cost, mass, and system complexity. To verify that new suits will enable astronauts to perform to their maximum capacity, prototype suits must be built and tested with human subjects. However, engineers and flight surgeons often have difficulty understanding and applying traditional representations of human data without training. To overcome these challenges, NASA is developing modern simulation and analysis techniques that focus on 3D visualization. Understanding actual performance early in the design cycle is extremely advantageous for increasing performance capabilities, reducing the risk of injury, and reducing costs. The primary objective of this project was to test modern simulation and analysis techniques for evaluating the performance of a human operating in extra-vehicular space suits.
Analysis of Expedited Defense Contracting Methods in the Acquisition of Emerging Technology
2016-12-01
Naval Postgraduate School, Monterey, California. MBA Professional Report by Jacob D. Sabin and Mark K. Zakner. The DOD has authority for applying non-traditional contracting methods to better adapt to this competitive marketplace. This project studied non…
ERIC Educational Resources Information Center
Alahiotis, Stamatis N.; Karatzia-Stavlioti, Eleni
2006-01-01
In this paper we perform text analysis on the new Cross Thematic Curriculum Framework Syllabus Design for compulsory education, which was constructed by the Hellenic Pedagogical Institute and is soon going to be applied in Greek schools. This curriculum text is treated as a policy text which introduces important changes in Greek school practice,…
NASA Astrophysics Data System (ADS)
Crawford, I.; Ruske, S.; Topping, D. O.; Gallagher, M. W.
2015-07-01
In this paper we present improved methods for discriminating and quantifying Primary Biological Aerosol Particles (PBAP) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1×10^6 points on a desktop computer, allowing each fluorescent particle in a dataset to be explicitly clustered. This reduces the potential for misattribution found in the subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient dataset. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4), where the optical size, asymmetry factor and fluorescence measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98 and 98.1 % of the data points respectively. The best performing methods were applied to the BEACHON-RoMBAS ambient dataset, where it was found that the z-score and range normalisation methods yield similar results, with each method producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. The z-score result was compared to clusters generated with previous approaches (WIBS AnalysiS Program, WASP), where we observe that the subsampling and comparative attribution method employed by WASP results in overestimation of the fungal spore concentration by a factor of 1.5 and underestimation of the bacterial aerosol concentration by a factor of 5. We suggest that this is likely due to errors arising from misattribution caused by poor centroid definition and failure to assign particles to a cluster, both consequences of the subsampling and comparative attribution method employed by WASP. The methods used here allow the entire fluorescent particle population to be analysed, yielding an explicit cluster attribution for each particle and improving cluster centroid definition and our capacity to discriminate and quantify PBAP meta-classes compared to previous approaches.
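The core of the approach, Ward-linkage agglomerative clustering on z-score-normalised particle parameters, can be sketched in a few lines. The snippet below uses synthetic stand-ins for the WIBS-4 inputs (optical size, asymmetry factor, fluorescence) and a toy particle count; it is meant only to show the normalisation and linkage steps named above, not the paper's full pipeline.

```python
# Sketch of hierarchical agglomerative clustering with Ward linkage and
# z-score normalisation. Data are synthetic stand-ins for WIBS-4
# measurements (optical size, asymmetry factor, fluorescence intensity).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Two hypothetical particle populations with different parameter means.
pop_a = rng.normal([2.0, 10.0, 50.0], [0.3, 2.0, 5.0], size=(500, 3))
pop_b = rng.normal([4.0, 25.0, 200.0], [0.5, 3.0, 20.0], size=(500, 3))
X = np.vstack([pop_a, pop_b])

# z-score normalisation so each parameter contributes comparably.
Xz = (X - X.mean(axis=0)) / X.std(axis=0)

# Ward linkage on the normalised data; cut the tree into two clusters.
Z = linkage(Xz, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
print(np.bincount(labels)[1:])  # particles attributed to each cluster
```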
Fertigation uniformity under sprinkler irrigation: evaluation and analysis
USDA-ARS?s Scientific Manuscript database
In modern farming systems, fertigation is widely practiced as a cost-effective and convenient method for applying soluble fertilizers to crops. Along with efficiency and adequacy, uniformity is an important fertigation performance evaluation criterion. Fertigation uniformity is defined here as a comp...
DOT National Transportation Integrated Search
2009-10-01
In this study, the concept of hybrid FRP-concrete structural systems was applied to both bridge superstructure and deck systems. Results from both the experimental and computational analysis for both the hybrid bridge superstructure and deck ...
NASA Astrophysics Data System (ADS)
Slooff, J. W.
1985-05-01
The physical mechanisms governing the hydrodynamics of sailing yacht keels, and the parameters that, through these mechanisms, determine keel performance, are discussed. It is concluded that, due to the presence of the free water surface, optimum keel shapes differ from optimum shapes for aircraft wings. Utilizing computational fluid dynamics analysis and optimization, it is found that the performance of conventional keels can be improved significantly by reducing taper or even applying inverse taper (upside-down keel), and that decisive improvements in performance can be realized through keels with winglets.
Structural Margins Assessment Approach
NASA Technical Reports Server (NTRS)
Ryan, Robert S.
1988-01-01
A general approach to the structural design and verification used to determine the structural margins of the space vehicle elements under Marshall Space Flight Center (MSFC) management is described. The Space Shuttle results and organization are used as illustrations for the techniques discussed. Also given are (1) the system analyses performed or to be performed and (2) the element analyses performed by MSFC and its contractors. Analysis approaches and their verification are addressed. The Shuttle procedures are general in nature and apply to space vehicles other than the Shuttle.
Wouters, Bert; Broeckhoven, Ken; Wouters, Sam; Bruggink, Cees; Agroskin, Yury; Pohl, Christopher A; Eeltink, Sebastiaan
2014-11-28
The gradient-performance limits of capillary ion chromatography have been assessed at maximum system pressure (34.5 MPa) using capillary columns packed with 4.1 μm macroporous anion-exchange particles coated with 65 nm positively-charged nanobeads. In analogy to the van Deemter curve, the gradient performance was assessed by applying different flow rates while decreasing the gradient time inversely proportionally to the increase in flow rate, in order to maintain the same retention properties. The gradient kinetic-performance limits were determined at maximum system pressure, applying tG/t0 = 5, 10, and 20. In addition, the effect of retention on peak width was assessed in gradient mode for mono-, di-, and trivalent inorganic anions. The peak width of late-eluting ions can be significantly reduced by using a concave gradient, resulting in better detection sensitivity. A signal enhancement factor of 8 was measured for a late-eluting ion when applying a concave instead of a linear gradient. For the analysis of a complex anion mixture, a coupled column with a total length of 1.05 m was operated at the kinetic-performance limit applying a linear 250 min gradient (tG/t0 = 10). The peak capacity varied between 200 and 380 depending on analyte retention, and hence on the charge and size of the ion. Copyright © 2014 Elsevier B.V. All rights reserved.
ERP correlates of error processing during performance on the Halstead Category Test.
Santos, I M; Teixeira, A R; Tomé, A M; Pereira, A T; Rodrigues, P; Vagos, P; Costa, J; Carrito, M L; Oliveira, B; DeFilippis, N A; Silva, C F
2016-08-01
The Halstead Category Test (HCT) is a neuropsychological test that measures a person's ability to formulate and apply abstract principles. Performance must be adjusted based on feedback after each trial, and errors are common until the underlying rules are discovered. Event-related potential (ERP) studies associated with the HCT are lacking. This paper demonstrates the use of a methodology inspired by Singular Spectrum Analysis (SSA), applied to EEG signals, to remove high-amplitude ocular and movement artifacts during performance on the test. This filtering technique introduces no phase or latency distortions, with minimal loss of relevant EEG information. Importantly, the test was applied in its original clinical format, without introducing adaptations for ERP recordings. After signal treatment, the feedback-related negativity (FRN) wave, which is related to error processing, was identified. This component peaked around 250 ms after feedback in fronto-central electrodes. As expected, errors elicited more negative amplitudes than correct responses. Results are discussed in terms of the increased clinical potential that coupling ERP information with behavioral performance data can bring to the specificity of the HCT in diagnosing different types of impairment in frontal brain function. Copyright © 2016. Published by Elsevier B.V.
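A minimal version of the SSA-style filtering idea, embedding the signal in a trajectory matrix, keeping the leading singular components (assumed here to capture the high-amplitude artifact), and reconstructing by diagonal averaging, might look as follows. The window length, component choice, and the synthetic blink-like artifact are illustrative assumptions, not the paper's settings.

```python
# Minimal Singular Spectrum Analysis (SSA) sketch: build a trajectory
# matrix, keep the leading singular components (assumed to capture a
# high-amplitude artifact), and reconstruct by diagonal averaging.
import numpy as np

def ssa_reconstruct(x, window, components):
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix: column j holds x[j:j+window].
    T = np.column_stack([x[j:j + window] for j in range(k)])
    U, s, Vt = np.linalg.svd(T, full_matrices=False)
    # Rank-limited reconstruction from the selected components.
    R = (U[:, components] * s[components]) @ Vt[components, :]
    # Diagonal averaging (Hankelisation) back to a 1-D series.
    out = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        out[j:j + window] += R[:, j]
        counts[j:j + window] += 1
    return out / counts

t = np.linspace(0, 2, 1000)
eeg = np.sin(2 * np.pi * 10 * t)                # stand-in "brain" rhythm
artifact = 5 * np.exp(-((t - 1) / 0.05) ** 2)   # blink-like high-amplitude bump
x = eeg + artifact
artifact_est = ssa_reconstruct(x, window=100, components=[0, 1])
cleaned = x - artifact_est                       # artifact-reduced signal
```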
Stressors in elite sport: a coach perspective.
Thelwell, Richard C; Weston, Neil J V; Greenlees, Iain A; Hutchings, Nicholas V
2008-07-01
We examined the varying performance and organizational stressors experienced by coaches who operate with elite athletes. Following interviews with eleven coaches, content analysis of the data revealed coaches to experience comparable numbers of performance and organizational stressors. Performance stressors were divided between their own performance and that of their athletes, while organizational stressors included environmental, leadership, personal, and team factors. The findings provide evidence that coaches experience a variety of stressors that adds weight to the argument that they should be labelled as "performers" in their own right. A variety of future research topics and applied issues are also discussed.
Closed-loop, pilot/vehicle analysis of the approach and landing task
NASA Technical Reports Server (NTRS)
Schmidt, D. K.; Anderson, M. R.
1985-01-01
Optimal-control-theoretic modeling and frequency-domain analysis are proposed as the methodology for analytically evaluating the handling qualities of higher-order manually controlled dynamic systems. Fundamental to the methodology is evaluating the interplay between pilot workload and closed-loop pilot/vehicle performance and stability robustness. The model-based metric for pilot workload is the required pilot phase compensation. Pilot/vehicle performance and loop stability are then evaluated using frequency-domain techniques. When these techniques were applied to the flight-test data for thirty-two highly augmented fighter configurations, strong correlation was obtained between the analytical and experimental results.
NASA Technical Reports Server (NTRS)
Pellett, G. L.; Adams, B. R.
1983-01-01
A performance evaluation is conducted for a molecular beam/mass spectrometer (MB/MS) system, as applied to a 1-30 torr microwave-discharge flow reactor (MWFR) used in the formation of the methylperoxy radical and a study of its subsequent destruction in the presence or absence of NO(x). The modulated MB/MS system is four-staged and differentially pumped. The results obtained from the MWFR study are illustrative of overall system performance, including digital waveform analysis; significant improvements over previous designs are noted in attainable S/N ratio, detection limit, and accuracy.
NASA Astrophysics Data System (ADS)
Raju, B. S.; Sekhar, U. Chandra; Drakshayani, D. N.
2017-08-01
The paper investigates optimization of the stereolithography process for SL5530 epoxy resin material to enhance part quality. The major performance characteristics selected to evaluate the process are tensile strength, flexural strength, impact strength and density, and the corresponding process parameters are layer thickness, orientation and hatch spacing. Because the process intrinsically involves tuning multiple parameters, grey relational analysis, which uses the grey relational grade as a performance index, is adopted to determine the optimal combination of process parameters. Moreover, principal component analysis is applied to evaluate the weighting values corresponding to the various performance characteristics so that their relative importance can be properly and objectively described. The results of confirmation experiments reveal that grey relational analysis coupled with principal component analysis can effectively acquire the optimal combination of process parameters. Hence, this confirms that the proposed approach can be a useful tool for improving the process parameters in stereolithography, which is very useful information for machine designers as well as RP machine users.
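The grey relational computation itself is compact. The sketch below runs the standard steps, larger-the-better normalisation, deviation from the ideal sequence, grey relational coefficients, and an averaged grade per run, on a hypothetical response matrix; PCA-derived weights, as in the paper, could replace the equal weights used here.

```python
# Illustrative grey relational analysis (GRA): normalise each response
# (larger-the-better), compute grey relational coefficients against the
# ideal sequence, and average them into a grade per parameter setting.
# The response matrix below is hypothetical, not data from the paper.
import numpy as np

# rows: experimental runs; columns: tensile, flexural, impact, density
Y = np.array([[42.0, 68.0, 3.1, 1.18],
              [45.0, 72.0, 3.4, 1.20],
              [40.0, 65.0, 2.9, 1.17],
              [47.0, 75.0, 3.6, 1.21]])

# Larger-the-better normalisation to [0, 1] per response.
N = (Y - Y.min(axis=0)) / (Y.max(axis=0) - Y.min(axis=0))

delta = 1.0 - N                      # deviation from the ideal sequence
zeta = 0.5                           # distinguishing coefficient
coef = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

grade = coef.mean(axis=1)            # equal weights; PCA could supply weights
print(grade, "best run:", int(np.argmax(grade)))
```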
LLIMAS: Revolutionizing integrated modeling and analysis at MIT Lincoln Laboratory
NASA Astrophysics Data System (ADS)
Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.
2017-08-01
MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.
Modeling and performance analysis of QoS data
NASA Astrophysics Data System (ADS)
Strzeciwilk, Dariusz; Zuberek, Włodzimierz M.
2016-09-01
The article presents the results of modeling and analysis of data transmission performance in systems that support quality of service (QoS). Models are designed and tested taking into account a multiservice network architecture, i.e. one supporting the transmission of data belonging to different traffic classes. Traffic-shaping mechanisms based on Priority Queuing were studied, with both an integrated data source and various types of generated data sources. The basic problems of QoS-supporting architectures and queuing systems are discussed. Models based on Petri nets, supported by temporal logics, were designed and built, and simulation tools were used to verify the traffic-shaping mechanisms with the applied queuing algorithms. It is shown that temporal models of Petri nets can be effectively used in modeling and analysis of the performance of computer networks.
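As a complement to the Petri-net models, the behaviour of Priority Queuing can be illustrated with a toy discrete-event simulation. The sketch below, with illustrative arrival and service rates, serves two traffic classes from one server, always draining the high-priority queue first, and reports the mean waiting time per class.

```python
# Toy discrete-event simulation of non-preemptive Priority Queuing: two
# traffic classes share one server, and the high-priority queue is always
# drained first. Arrival and service rates are illustrative assumptions.
import heapq
import random

random.seed(1)
SIM_TIME = 10_000.0
arrival_rate = {0: 0.3, 1: 0.5}    # class 0 = high priority
service_rate = 1.0

events = []                         # (arrival time, class)
for cls, lam in arrival_rate.items():
    t = random.expovariate(lam)
    while t < SIM_TIME:
        heapq.heappush(events, (t, cls))
        t += random.expovariate(lam)

queue = {0: [], 1: []}
waits = {0: [], 1: []}
server_free_at = 0.0

while events:
    t, cls = heapq.heappop(events)
    queue[cls].append(t)
    next_arrival = events[0][0] if events else float("inf")
    # Serve queued packets until the next arrival, high priority first.
    while server_free_at <= next_arrival and (queue[0] or queue[1]):
        served = 0 if queue[0] else 1
        arrived = queue[served].pop(0)
        start = max(server_free_at, arrived)
        waits[served].append(start - arrived)
        server_free_at = start + random.expovariate(service_rate)

for cls in (0, 1):
    print("class", cls, "mean wait:", sum(waits[cls]) / len(waits[cls]))
```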
Method of multivariate spectral analysis
Keenan, Michael R.; Kotula, Paul G.
2004-01-06
A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = CS^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
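The factoring step can be illustrated with a small numpy sketch of constrained (non-negative) alternating least squares on synthetic data; the weighting/unweighting steps and the EDS-specific handling described in the patent are omitted for brevity.

```python
# Hedged sketch of constrained alternating least squares for D = C @ S.T,
# with non-negativity enforced by clipping. Synthetic stand-in data:
# 200 pixels x 50 spectral channels generated from 3 known components.
import numpy as np

rng = np.random.default_rng(0)
C_true = rng.random((200, 3))
S_true = rng.random((50, 3))
D = C_true @ S_true.T + 0.01 * rng.standard_normal((200, 50))

k = 3
C = rng.random((200, k))                       # random initial guess
for _ in range(200):
    # Solve D ≈ C S^T for S with C fixed, then for C with S fixed,
    # clipping negatives to enforce the non-negativity constraint.
    S = np.linalg.lstsq(C, D, rcond=None)[0].T.clip(min=0)
    C = np.linalg.lstsq(S, D.T, rcond=None)[0].T.clip(min=0)

residual = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
print(f"relative residual: {residual:.3f}")
```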
Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M.
2015-01-01
Modern statistical methods using incomplete data have been increasingly applied in a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used in evaluating diagnostic tests or biomarkers in medical research, has become increasingly popular in both its development and application. While missing-data methods have been applied in ROC analysis, the impact of model mis-specification and/or the assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. In particular, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Depending on the coherency of the imputation model with the underlying data generation mechanism, our results show that MI generally leads to well-calibrated inferences under ignorable missingness mechanisms. PMID:26379316
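A bare-bones version of the MI-ROC workflow, impose missing-at-random gaps on a synthetic biomarker, impute repeatedly, and average the per-imputation AUCs, might look as follows (point estimate only; Rubin's rules would additionally pool the variances). The data-generating model is an assumption for illustration.

```python
# Illustrative multiple-imputation ROC workflow on synthetic data:
# marker values go missing at random (MAR, driven by a covariate),
# are imputed M times, and an AUC is computed per completed dataset.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
y = rng.integers(0, 2, n)                     # disease status
marker = y + rng.standard_normal(n)           # diagnostic biomarker
covar = marker + rng.standard_normal(n)       # correlated covariate

# MAR mechanism: marker more likely missing when the covariate is low.
X = np.column_stack([marker, covar]).astype(float)
miss = rng.random(n) < 1.0 / (1.0 + np.exp(covar))
X[miss, 0] = np.nan

aucs = []
for m in range(20):                           # M = 20 imputations
    imp = IterativeImputer(sample_posterior=True, random_state=m)
    X_completed = imp.fit_transform(X)
    aucs.append(roc_auc_score(y, X_completed[:, 0]))
print("pooled AUC point estimate:", np.mean(aucs))
```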
Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.
Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin
2017-08-16
The objective consensus methodology has recently been applied in consensus finding in several studies on medical decision-making among clinical experts or guidelines. The main advantages of this method are an automated analysis and comparison of treatment algorithms of the participating centers which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for the successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for successful collection of decision trees and summarizes important aspects at each point of the analysis.
Silicone Disclosing Material used after Ceramic Surface Treatment Reduces Bond Strength.
Fraga, Sara; Oliveira, Sara Cioccari; Pereira, Gabriel Kalil Rocha; Beekman, Pieter; Rippe, Marília Pivetta; Kleverlaan, Cornelis J
To evaluate the effect of a silicone disclosing procedure performed at different timepoints on the shear bond strength (SBS) of cements (self-adhesive composite cement, self-etch composite cement, resin-reinforced glass-ionomer cement) to different substrates (zirconia, lithium disilicate, bovine dentin). The substrate/cement combinations were assigned to two groups (n = 15) according to the timepoint, at which the vinyl polyether silicone disclosing agent was applied: after (experimental groups, EXP) or before (control groups, CTRL) specific micromechanical treatments of the substrate surface. To increase standardization, the cements were applied into rubber rings (2.2 mm diameter x 1.0 mm thickness) positioned on the substrate surface. After luting procedures, all specimens were stored in 37°C distilled water for 24 h, then subjected to SBS testing using a wire loop of 0.2 mm diameter at a crosshead speed of 1 mm/min until failure. Failure analysis was performed for all tested specimens. SBS data were submitted to Weibull analysis. The silicone disclosing procedure performed after micromechanical surface treatment reduced the characteristic shear bond strength to zirconia and lithium disilicate when compared to CTRL. However, for dentin specimens, there was no significant difference between CTRL and EXP for any of the cements investigated. Failure analysis showed a predominance of interfacial failures. The silicone disclosing procedure performed after the micromechanical treatment of ceramic surfaces negatively affected the cement bond strength. Therefore, after using it to check the fit of a prosthesis, clinicians should carefully clean the ceramic surface.
Inertial Sensor Technology for Elite Swimming Performance Analysis: A Systematic Review
Mooney, Robert; Corley, Gavin; Godfrey, Alan; Quinlan, Leo R; ÓLaighin, Gearóid
2015-01-01
Technical evaluation of swimming performance is an essential factor of elite athletic preparation. Novel methods of analysis, incorporating body worn inertial sensors (i.e., Microelectromechanical systems, or MEMS, accelerometers and gyroscopes), have received much attention recently from both research and commercial communities as an alternative to video-based approaches. This technology may allow for improved analysis of stroke mechanics, race performance and energy expenditure, as well as real-time feedback to the coach, potentially enabling more efficient, competitive and quantitative coaching. The aim of this paper is to provide a systematic review of the literature related to the use of inertial sensors for the technical analysis of swimming performance. This paper focuses on providing an evaluation of the accuracy of different feature detection algorithms described in the literature for the analysis of different phases of swimming, specifically starts, turns and free-swimming. The consequences associated with different sensor attachment locations are also considered for both single and multiple sensor configurations. Additional information such as this should help practitioners to select the most appropriate systems and methods for extracting the key performance related parameters that are important to them for analysing their swimmers’ performance and may serve to inform both applied and research practices. PMID:26712760
Computing Lives And Reliabilities Of Turboprop Transmissions
NASA Technical Reports Server (NTRS)
Coy, J. J.; Savage, M.; Radil, K. C.; Lewicki, D. G.
1991-01-01
Computer program PSHFT calculates lifetimes of a variety of aircraft transmissions. It consists of a main program, a series of subroutines applying to specific configurations, generic subroutines for analysis of component properties, subroutines for analysis of the system, and a common block. The main program selects the routines used in the analysis and causes them to operate in the desired sequence. The configuration-specific subroutines read in configuration data, perform force and life analyses for the components (with the help of the generic component-property-analysis subroutines), fill the property array, call the system-analysis routines, and finally print out the results of the analysis for the system and components. Written in FORTRAN 77(IV).
Classification of fMRI resting-state maps using machine learning techniques: A comparative study
NASA Astrophysics Data System (ADS)
Gallos, Ioannis; Siettos, Constantinos
2017-11-01
We compare the efficiency of Principal Component Analysis (PCA) and nonlinear manifold learning algorithms (ISOMAP and Diffusion Maps) for classifying brain maps between groups of schizophrenia patients and healthy controls from fMRI scans acquired during a resting-state experiment. After a standard pre-processing pipeline, we applied spatial Independent Component Analysis (ICA) to reduce (a) noise and (b) the spatial-temporal dimensionality of the fMRI maps. On the cross-correlation matrix of the ICA components, we applied PCA, ISOMAP and Diffusion Maps to find an embedded low-dimensional space. Finally, support vector machine (SVM) and k-NN algorithms were used to evaluate the performance of the algorithms in classifying between the two groups.
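A compact version of such a comparison pipeline is sketched below with scikit-learn, using synthetic stand-ins for the ICA cross-correlation features; diffusion maps are omitted because scikit-learn has no stock implementation, so only PCA and Isomap are shown.

```python
# Sketch of the comparison pipeline: embed connectivity features with PCA
# or Isomap, then classify with SVM and k-NN under cross-validation.
# Synthetic features and labels stand in for the real fMRI-derived data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((80, 300))   # 80 subjects x 300 connectivity features
y = rng.integers(0, 2, 80)           # patient vs. control labels (synthetic)

for reducer in (PCA(n_components=10), Isomap(n_components=10)):
    for clf in (SVC(kernel="rbf"), KNeighborsClassifier(n_neighbors=5)):
        pipe = make_pipeline(StandardScaler(), reducer, clf)
        score = cross_val_score(pipe, X, y, cv=5).mean()
        print(type(reducer).__name__, type(clf).__name__, round(score, 2))
```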
Shape Optimization by Bayesian-Validated Computer-Simulation Surrogates
NASA Technical Reports Server (NTRS)
Patera, Anthony T.
1997-01-01
A nonparametric-validated, surrogate approach to optimization has been applied to the computational optimization of eddy-promoter heat exchangers and to the experimental optimization of a multielement airfoil. In addition to the baseline surrogate framework, a surrogate-Pareto framework has been applied to the two-criteria, eddy-promoter design problem. The Pareto analysis improves the predictability of the surrogate results, preserves generality, and provides a means to rapidly determine design trade-offs. Significant contributions have been made in the geometric description used for the eddy-promoter inclusions as well as to the surrogate framework itself. A level-set based geometric description has been developed to define the shape of the eddy-promoter inclusions. The level-set technique allows for topology changes (from single-body, eddy-promoter configurations to two-body configurations) without requiring any additional logic. The continuity of the output responses for input variations that cross the boundary between topologies has been demonstrated. Input-output continuity is required for the straightforward application of surrogate techniques in which simplified, interpolative models are fitted through a construction set of data. The surrogate framework developed previously has been extended in a number of ways. First, the formulation for a general, two-output, two-performance-metric problem is presented. Surrogates are constructed and validated for the outputs. The performance metrics can be functions of both outputs, as well as explicitly of the inputs, and serve to characterize the design preferences. By segregating the outputs and the performance metrics, an additional level of flexibility is provided to the designer. The validated outputs can be used in future design studies, and the error estimates provided by the output validation step still apply, requiring no additional appeals to the expensive analysis. Second, a candidate-based a posteriori error analysis capability has been developed which provides probabilistic error estimates on the true performance for a design randomly selected near the surrogate-predicted optimal design.
Reduced Order Model Implementation in the Risk-Informed Safety Margin Characterization Toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, Diego; Smith, Curtis L.; Alfonsi, Andrea
2015-09-01
The RISMC project aims to develop new advanced simulation-based tools to perform Probabilistic Risk Analysis (PRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermo-hydraulic behavior of the reactor primary and secondary systems but also the temporal evolution of external events and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, ms-s-minutes-years). As part of the RISMC PRA approach, a large number of computationally expensive simulation runs are required. An important aspect is that even though computational power is regularly growing, the overall computational cost of a RISMC analysis may not be viable for certain cases. A solution that is being evaluated is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs to perform and by employing surrogate models instead of the actual simulation codes. This report focuses on the use of reduced order modeling techniques that can be applied to any RISMC analysis to generate, analyze and visualize data. In particular, we focus on surrogate models that approximate the simulation results in a much shorter time (µs instead of hours/days). We apply reduced order and surrogate modeling techniques to several RISMC types of analyses using RAVEN and RELAP-7 and show the advantages that can be gained.
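The surrogate idea can be illustrated with a one-dimensional sketch: fit a cheap emulator to a handful of "expensive" runs, then query it densely. The stand-in simulator and the Gaussian-process choice below are illustrative assumptions, not RELAP-7 or the project's actual reduced order models.

```python
# Minimal surrogate-modelling sketch: fit a Gaussian-process emulator to a
# small number of expensive simulation runs, then evaluate it cheaply.
# The "simulator" here is a stand-in analytic function, not a physics code.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_simulation(x):
    # Placeholder for a long-running thermo-hydraulic code.
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(0)
X_train = rng.uniform(0, 2, size=(12, 1))        # 12 "simulation runs"
y_train = expensive_simulation(X_train).ravel()

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(),
                              normalize_y=True)
gp.fit(X_train, y_train)

X_query = np.linspace(0, 2, 1000).reshape(-1, 1)
y_pred, y_std = gp.predict(X_query, return_std=True)  # near-instant queries
print(float(y_std.max()))  # high uncertainty marks where to add new runs
```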
Reliability in content analysis: The case of semantic feature norms classification.
Bolognesi, Marianna; Pilgram, Roosmaryn; van den Heerik, Romy
2017-12-01
Semantic feature norms (e.g., STIMULUS: car → RESPONSE:
A root cause analysis project in a medication safety course.
Schafer, Jason J
2012-08-10
To develop, implement, and evaluate team-based root cause analysis projects as part of a required medication safety course for second-year pharmacy students. Lectures, in-class activities, and out-of-class reading assignments were used to develop students' medication safety skills and introduce them to the culture of medication safety. Students applied these skills within teams by evaluating cases of medication errors using root cause analyses. Teams also developed error prevention strategies and formally presented their findings. Student performance was assessed using a medication errors evaluation rubric. Of the 211 students who completed the course, the majority performed well on root cause analysis assignments and rated them favorably on course evaluations. Medication error evaluation and prevention was successfully introduced in a medication safety course using team-based root cause analysis projects.
Analysis of a Hovering Rotor in Icing Conditions
NASA Technical Reports Server (NTRS)
Narducci, Robert; Kreeger, Richard E.
2012-01-01
A high fidelity analysis method is proposed to evaluate the ice accumulation and the ensuing rotor performance degradation for a helicopter flying through an icing cloud. The process uses computational fluid dynamics (CFD) coupled to a rotorcraft comprehensive code to establish the aerodynamic environment of a trimmed rotor prior to icing. Based on local aerodynamic conditions along the rotor span and accounting for the azimuthal variation, an ice accumulation analysis using NASA's Lewice3D code is made to establish the ice geometry. Degraded rotor performance is quantified by repeating the high fidelity rotor analysis with updates which account for ice shape and mass. The process is applied on a full-scale UH-1H helicopter in hover using data recorded during the Helicopter Icing Flight Test Program.
NASA Astrophysics Data System (ADS)
Zargari, Abolfazl; Du, Yue; Thai, Theresa C.; Gunderson, Camille C.; Moore, Kathleen; Mannel, Robert S.; Liu, Hong; Zheng, Bin; Qiu, Yuchen
2018-02-01
The objective of this study is to investigate the performance of global and local features in estimating the characteristics of highly heterogeneous metastatic tumours, in order to accurately predict treatment effectiveness for advanced-stage ovarian cancer patients. To achieve this, a quantitative image analysis scheme was developed to estimate a total of 103 features from three different groups: shape and density, wavelet, and Gray Level Difference Method (GLDM) features. Shape and density features are global features, which are applied directly to the entire target image; wavelet and GLDM features are local features, which are applied to divided blocks of the target image. To assess performance, the new scheme was applied to a retrospective dataset containing 120 recurrent and high-grade ovarian cancer patients. The results indicate that the three best-performing features are skewness, root-mean-square (rms) and the mean of local GLDM texture, indicating the importance of integrating local features. In addition, the average predictive performance is comparable among the three categories. This investigation concluded that local features contain at least as much tumour heterogeneity information as global features, which may be meaningful for improving the predictive performance of quantitative image markers for the diagnosis and prognosis of ovarian cancer patients.
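The global/local distinction can be made concrete with a short sketch: global statistics computed over the whole image versus block-wise statistics aggregated afterwards. The synthetic image, block size, and specific statistics below are illustrative, not the study's 103-feature scheme.

```python
# Sketch of the global/local feature split: global shape/density statistics
# on the whole region of interest versus block-wise (local) statistics
# aggregated over sub-regions. The "tumour" image is synthetic.
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(0)
img = rng.gamma(shape=2.0, scale=30.0, size=(64, 64))  # stand-in tumour ROI

# Global features: applied directly to the entire target image.
global_skewness = skew(img.ravel())
global_rms = np.sqrt(np.mean(img.ravel() ** 2))

# Local features: statistics computed per 16x16 block, then aggregated.
blocks = img.reshape(4, 16, 4, 16).swapaxes(1, 2).reshape(-1, 16 * 16)
local_means = blocks.mean(axis=1)
local_feature = local_means.mean()     # e.g., mean of a local texture measure

print(global_skewness, global_rms, local_feature)
```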
A comparison of economic evaluation models as applied to geothermal energy technology
NASA Technical Reports Server (NTRS)
Ziman, G. M.; Rosenberg, L. S.
1983-01-01
Several cost estimation and financial cash flow models have been applied to a series of geothermal case studies. In order to draw conclusions about the relative performance and applicability of these models to geothermal projects, the consistency of results was assessed. The model outputs of principal interest in this study were net present value, internal rate of return, or levelized breakeven price. The models used were VENVAL, a venture analysis model; the Geothermal Probabilistic Cost Model (GPC Model); the Alternative Power Systems Economic Analysis Model (APSEAM); the Geothermal Loan Guarantee Cash Flow Model (GCFM); and the GEOCOST and GEOCITY geothermal models. The case studies to which the models were applied include a geothermal reservoir at Heber, CA; a geothermal electric power plant to be located at the Heber site; an alcohol fuels production facility to be built at Raft River, ID; and a direct-use, district heating system in Susanville, CA.
Sun, Jianghao; Chen, Pei
2012-03-05
A practical ultra-high-performance liquid chromatography (UHPLC) method was developed for fingerprint analysis of, and determination of yohimbine in, yohimbe barks and related dietary supplements. Good separation was achieved using a Waters Acquity BEH C(18) column with gradient elution using 0.1% (v/v) aqueous ammonium hydroxide and 0.1% ammonium hydroxide in methanol as the mobile phases. This is the first reported chromatographic method that separates corynanthine from yohimbine in yohimbe bark extract. The chromatographic fingerprint analysis was applied to 18 commercial yohimbe dietary supplement samples. Quantitation of yohimbine, the traditional method for analysis of yohimbe barks, was also performed to evaluate the results of the fingerprint analysis. Wide variability was observed in fingerprints and yohimbine content among yohimbe dietary supplement samples. For most of the dietary supplements, the yohimbine content was not consistent with the label claims. Copyright © 2011. Published by Elsevier B.V.
DOT National Transportation Integrated Search
2012-03-01
This study was undertaken to: 1) apply a benchmarking process to identify best practices within four areas of Wisconsin Department of Transportation (WisDOT) construction management and 2) analyze two performance metrics, % Cost vs. % Time, tracked by t...
Qualitative analysis of precipitation distribution in Poland with use of different data sources
NASA Astrophysics Data System (ADS)
Walawender, J.; Dyras, I.; Łapeta, B.; Serafin-Rek, D.; Twardowski, A.
2008-04-01
Geographical Information Systems (GIS) can be used to integrate data from different sources and in different formats to perform innovative spatial and temporal analysis. GIS can also be applied in climatic research to manage, investigate and display all kinds of weather data. The main objective of this study is to demonstrate that GIS is a useful tool for examining and visualising precipitation distributions obtained from different data sources: ground measurements, satellite and radar data. Three selected days (30 cases) with convective rainfall situations were analysed. Firstly, a scalable GRID-based approach was applied to store data from the three different sources in a comparable layout. Then, a geoprocessing algorithm was created within the ArcGIS 9.2 environment. The algorithm included: GRID definition, reclassification and raster algebra. All of the calculations and procedures were performed automatically. Finally, contingency tables and pie charts were created to show the relationship between ground measurements and both satellite- and radar-derived data.
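The reclassification, raster algebra, and contingency-table steps translate naturally into array operations. The sketch below mimics them in numpy on synthetic gauge and radar fields with assumed intensity classes, rather than within ArcGIS.

```python
# Sketch of the GRID-based comparison step: put two precipitation fields
# on the same grid, reclassify into intensity classes, and cross-tabulate
# them into a contingency table. Fields and class bins are synthetic.
import numpy as np

rng = np.random.default_rng(0)
gauge = rng.gamma(2.0, 2.0, size=(100, 100))                   # gauges, mm
radar = gauge * rng.normal(1.0, 0.3, size=(100, 100)).clip(0)  # radar, mm

bins = [0.0, 1.0, 5.0, 10.0, np.inf]        # assumed intensity classes, mm
gauge_cls = np.digitize(gauge, bins) - 1
radar_cls = np.digitize(radar, bins) - 1

# Contingency table: rows = gauge class, columns = radar class.
n_cls = len(bins) - 1
table = np.zeros((n_cls, n_cls), dtype=int)
np.add.at(table, (gauge_cls.ravel(), radar_cls.ravel()), 1)
print(table)
print("fraction of cells in the same class:",
      np.trace(table) / table.sum())
```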
DC Potentials Applied to an End-cap Electrode of a 3-D Ion Trap for Enhanced MSn Functionality
Prentice, Boone M.; Xu, Wei; Ouyang, Zheng; McLuckey, Scott A.
2010-01-01
The effects of the application of various DC magnitudes and polarities to an end-cap of a 3-D quadrupole ion trap throughout a mass spectrometry experiment were investigated. Application of a monopolar DC field was achieved by applying a DC potential to the exit end-cap electrode, while maintaining the entrance end-cap electrode at ground potential. Control over the monopolar DC magnitude and polarity during time periods associated with ion accumulation, mass analysis, ion isolation, ion/ion reaction, and ion activation can have various desirable effects. Included amongst these are increased ion capture efficiency, increased ion ejection efficiency during mass analysis, effective isolation of ions using lower AC resonance ejection amplitudes, improved temporal control of the overlap of oppositely charged ion populations, and the performance of “broad-band” collision induced dissociation (CID). These results suggest general means to improve the performance of the 3-D ion trap in a variety of mass spectrometry and tandem mass spectrometry experiments. PMID:21927573
Decision analysis in clinical cardiology: When is coronary angiography required in aortic stenosis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Georgeson, S.; Meyer, K.B.; Pauker, S.G.
1990-03-15
Decision analysis offers a reproducible, explicit approach to complex clinical decisions. It consists of developing a model, typically a decision tree, that separates choices from chances and that specifies and assigns relative values to outcomes. Sensitivity analysis allows exploration of alternative assumptions. Cost-effectiveness analysis shows the relation between dollars spent and improved health outcomes achieved. In a tutorial format, this approach is applied to the decision whether to perform coronary angiography in a patient who requires aortic valve replacement for critical aortic stenosis.
How GPs value guidelines applied to patients with multimorbidity: a qualitative study
Luijks, Hilde; Lucassen, Peter; van Weel, Chris; Loeffen, Maartje; Lagro-Janssen, Antoine; Schermer, Tjard
2015-01-01
Objectives: To explore and describe the value general practitioners (GPs) attribute to medical guidelines when they are applied to patients with multimorbidity, and to describe the benefits GPs experience from guideline adherence in these patients. We also aimed to identify the limitations of guideline adherence in patients with multimorbidity, as perceived by GPs, and to describe their empirical solutions for managing these obstacles. Design: Focus group study with purposive sampling of participants. Focus groups were guided by an experienced moderator who used an interview guide. Interviews were transcribed verbatim. Data analysis was performed by two researchers using the constant comparison analysis technique, and field notes were used in the analysis. Data collection proceeded until saturation was reached. Setting: Primary care, eastern part of The Netherlands. Participants: Dutch GPs, heterogeneous in age, sex and academic involvement. Results: 25 GPs participated in five focus groups. GPs valued the guidance that guidelines provide, but experienced shortcomings when they were applied to patients with multimorbidity. Taking these patients’ personal circumstances into account was regarded as important, but it was impeded by a consistent focus on guideline adherence. Preventative measures were considered less appropriate in (elderly) patients with multimorbidity. Moreover, the applicability of guidelines in patients with multimorbidity was questioned. GPs’ extensive practical experience with managing multimorbidity resulted in several empirical solutions, for example, using their ‘common sense’ to respond to the perceived shortcomings. Conclusions: GPs applying guidelines for patients with multimorbidity integrate patient-specific factors into their medical decisions, aiming for patient-centred solutions. Such integration of clinical experience and best evidence is required to practise evidence-based medicine. More flexibility in pay-for-performance systems is needed to facilitate this integration. Several improvements in guideline reporting are necessary to enhance the applicability of guidelines in patients with multimorbidity. PMID:26503382
Bertelli, Davide; Brighenti, Virginia; Marchetti, Lucia; Reik, Anna; Pellati, Federica
2018-06-01
Humulus lupulus L. (hop) is one of the most widely cultivated crops, being a key ingredient in the brewing process. Many health-related properties have been described for hop extracts, and the plant has gained increasing interest in pharmaceutical and nutraceutical research. Among the analytical tools available for the phytochemical characterization of plant extracts, quantitative nuclear magnetic resonance (qNMR) represents a new and powerful technique. In this context, the present study aimed to develop a new, simple, and efficient qNMR method for the metabolite fingerprinting of bioactive compounds in hop cones, taking advantage of the novel ERETIC 2 tool. To the best of our knowledge, this is the first attempt to apply this method to complex matrices of natural origin, such as hop extracts. The qNMR method set up in this study was applied to the quantification of both prenylflavonoids and bitter acids in eight hop cultivars. The performance of this analytical method was compared with that of HPLC-UV/DAD, which represents the most frequently used technique in the field of natural product analysis. The quantitative data obtained for the hop samples by the two techniques indicated that the amounts of bioactive compounds were slightly higher when qNMR was applied, although the order of magnitude of the values was the same. The accuracy of qNMR was comparable to that of the chromatographic method, thus proving to be a reliable tool for the analysis of these secondary metabolites in hop extracts. Graphical abstract: the extraction and analytical methods applied in this work for the analysis of bioactive compounds in Humulus lupulus L. (hop) cones.
Viscoelastic properties of chalcogenide glasses and the simulation of their molding processes
NASA Astrophysics Data System (ADS)
Liu, Weiguo; Shen, Ping; Jin, Na
In order to simulate the precision molding process, the viscoelastic properties of chalcogenide glasses at high temperatures were investigated. Thermomechanical analyses were performed to measure and characterize the thermomechanical properties of the chalcogenide glasses, and the creep responses of the glasses at different temperatures were obtained. Finite element analysis was applied to simulate the molding processes. The simulation results were consistent with previously reported experimental results. Stress concentration and evolution during the molding processes were also described with the simulation results.
NASA Technical Reports Server (NTRS)
Hulka, J. R.; Jones, G. W.
2010-01-01
Liquid rocket engines using oxygen and methane propellants are being considered by the National Aeronautics and Space Administration (NASA) for in-space vehicles. This propellant combination has not been previously used in a flight-qualified engine system, so limited test data and analysis results are available at this stage of early development. NASA has funded several hardware-oriented activities with oxygen and methane propellants over the past several years with the Propulsion and Cryogenic Advanced Development (PCAD) project, under the Exploration Technology Development Program. As part of this effort, the NASA Marshall Space Flight Center has conducted combustion, performance, and combustion stability analyses of several of the configurations. This paper summarizes the analyses of combustion and performance as a follow-up to a paper published in the 2008 JANNAF/LPS meeting. Combustion stability analyses are presented in a separate paper. The current paper includes test and analysis results of coaxial element injectors using liquid oxygen and liquid methane or gaseous methane propellants. Several thrust chamber configurations have been modeled, including thrust chambers with multi-element swirl coax element injectors tested at the NASA MSFC, and a uni-element chamber with shear and swirl coax injectors tested at The Pennsylvania State University. Configurations were modeled with two one-dimensional liquid rocket combustion analysis codes, the Rocket Combustor Interaction Design and Analysis (ROCCID), and the Coaxial Injector Combustion Model (CICM). Significant effort was applied to show how these codes can be used to model combustion and performance with oxygen/methane propellants a priori, and what anchoring or calibrating features need to be applied or developed in the future. This paper describes the test hardware configurations, presents the results of all the analyses, and compares the results from the two analytical methods
Váradi, Csaba; Mittermayr, Stefan; Millán-Martín, Silvia; Bones, Jonathan
2016-12-01
Capillary electrophoresis (CE) offers excellent efficiency and orthogonality to liquid chromatographic (LC) separations for oligosaccharide structural analysis. Combination of CE with high resolution mass spectrometry (MS) for glycan analysis remains a challenging task due to the MS incompatibility of background electrolyte buffers and additives commonly used in offline CE separations. Here, a novel method is presented for the analysis of 2-aminobenzoic acid (2-AA) labelled glycans by capillary electrophoresis coupled to mass spectrometry (CE-MS). To ensure maximum resolution and excellent precision without the requirement for excessive analysis times, CE separation conditions including the concentration and pH of the background electrolyte, the effect of applied pressure on the capillary inlet and the capillary length were evaluated. Using readily available 12C6/13C6 stable isotopologues of 2-AA, the developed method can be applied for quantitative glycan profiling in a twoplex manner based on the generation of extracted ion electropherograms (EIE) for 12C6 'light' and 13C6 'heavy' 2-AA labelled glycan isotope clusters. The twoplex quantitative CE-MS glycan analysis platform is ideally suited for comparability assessment of biopharmaceuticals, such as monoclonal antibodies, for differential glycomic analysis of clinical material for potential biomarker discovery, or for quantitative microheterogeneity analysis of different glycosylation sites within a glycoprotein. Additionally, due to the low injection volume requirements of CE, subsequent LC-MS analysis of the same sample can be performed, facilitating the use of orthogonal separation techniques for structural elucidation or verification of quantitative performance.
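The twoplex quantitation step reduces to integrating the light and heavy extracted ion electropherograms and taking their area ratio, as in the following sketch on synthetic Gaussian peaks (peak positions, widths, and noise levels are illustrative assumptions).

```python
# Illustrative twoplex quantitation: integrate the extracted ion
# electropherograms (EIEs) of the 12C6 "light" and 13C6 "heavy" 2-AA
# labelled glycan and report their area ratio. Signals are synthetic.
import numpy as np

t = np.linspace(0, 60, 6000)                     # migration time, min
dt = t[1] - t[0]

def gaussian_peak(t0, amplitude, rng_seed):
    rng = np.random.default_rng(rng_seed)
    return (amplitude * np.exp(-((t - t0) / 0.8) ** 2)
            + 0.01 * rng.random(t.size))         # small baseline noise

eie_light = gaussian_peak(30.0, 1.0, 0)          # 12C6-labelled channel
eie_heavy = gaussian_peak(30.0, 0.7, 1)          # 13C6-labelled channel

# Simple rectangle-rule integration of each channel.
area_light = eie_light.sum() * dt
area_heavy = eie_heavy.sum() * dt
print(f"light/heavy area ratio: {area_light / area_heavy:.2f}")
```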
Tao, Lingyan; Zhang, Qing; Wu, Yongjiang; Liu, Xuesong
2016-12-01
In this study, a fast and effective high-performance liquid chromatography method was developed to obtain a fingerprint chromatogram and simultaneous quantitative analysis of four index compounds (gallic acid, chlorogenic acid, albiflorin and paeoniflorin) of the traditional Chinese medicine Moluodan Concentrated Pill. The method was performed on a Waters X-Bridge C18 reversed-phase column on an Agilent 1200S high-performance liquid chromatography system coupled with diode array detection. The mobile phase was composed of 20 mmol/L phosphate solution and acetonitrile at a flow rate of 1 mL/min, with a temperature of 30°C and a UV detection wavelength of 254 nm. After method validation, 16 batches of Moluodan Concentrated Pill were analyzed by this high-performance liquid chromatography method, and both qualitative and quantitative evaluation results were obtained by similarity analysis, principal component analysis and hierarchical cluster analysis. The results of these three chemometric methods were in good agreement and all indicated that batch 10 and batch 16 showed significant differences from the other 14 batches. This suggests that the developed high-performance liquid chromatography method can be applied in the quality evaluation of Moluodan Concentrated Pill. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
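The three chemometric evaluations can be prototyped in a few lines, as sketched below on synthetic fingerprints: correlation with the mean fingerprint for similarity analysis, PCA scores, and Ward-linkage hierarchical clustering. The two injected outlier batches stand in for batches 10 and 16.

```python
# Sketch of the chemometric evaluation: correlation-based similarity of
# batch fingerprints, PCA scores, and hierarchical clustering. Sixteen
# synthetic chromatographic fingerprints stand in for the real batches.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
base = np.abs(np.sin(np.linspace(0, 12, 400)))          # reference profile
batches = base + 0.05 * rng.standard_normal((16, 400))
# Two deviant batches, mimicking the outliers found in the study.
batches[[9, 15]] += 0.4 * np.abs(np.cos(np.linspace(0, 12, 400)))

# Similarity analysis: correlation of each batch with the mean fingerprint.
mean_fp = batches.mean(axis=0)
similarity = [np.corrcoef(b, mean_fp)[0, 1] for b in batches]

scores = PCA(n_components=2).fit_transform(batches)     # PCA score plot input
labels = fcluster(linkage(batches, method="ward"), t=2, criterion="maxclust")
print(np.round(similarity, 3))
print(labels)   # the deviant batches should separate into their own cluster
```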
Full long-term design response analysis of a wave energy converter
Coe, Ryan G.; Michelen, Carlos; Eckert-Gallup, Aubrey; ...
2017-09-21
Efficient design of wave energy converters requires an accurate understanding of expected loads and responses during the deployment lifetime of a device. A study has been conducted to better understand best practices for prediction of design responses in a wave energy converter. A case study was performed in which a simplified wave energy converter was analyzed to predict several important device design responses. The application and performance of a full long-term analysis, in which numerical simulations were used to predict the device response for a large number of distinct sea states, was studied. Environmental characterization and selection of sea states for this analysis at the intended deployment site were performed using principal component analysis. The full long-term analysis applied here was shown to be stable when implemented with a relatively low number of sea states and convergent with an increasing number of sea states. As the number of sea states utilized in the analysis was increased, predicted response levels did not change appreciably. Furthermore, uncertainty in the response levels was reduced as more sea states were utilized.
Time-dependent inertia analysis of vehicle mechanisms
NASA Astrophysics Data System (ADS)
Salmon, James Lee
Two methods for performing transient inertia analysis of vehicle hardware systems are developed in this dissertation. The analysis techniques can be used to predict the response of vehicle mechanism systems to the accelerations associated with vehicle impacts. General analytical methods for evaluating translational or rotational system dynamics are generated and evaluated for various system characteristics. The utility of the derived techniques is demonstrated by applying the generalized methods to two vehicle systems. Time-dependent accelerations measured during a vehicle-to-vehicle impact are used as input to perform a dynamic analysis of an automobile liftgate latch and outside door handle. Generalized Lagrange equations for a non-conservative system are used to formulate a second-order nonlinear differential equation defining the response of the components to the transient input. The differential equation is solved by employing the fourth-order Runge-Kutta method. The events are then analyzed using commercially available two-dimensional rigid-body dynamic analysis software. The results of the two analytical techniques are compared to experimental data generated by high-speed film analysis of tests of the two components performed on a high-G acceleration sled at Ford Motor Company.
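The solution pipeline described here (a Lagrange-derived second-order equation reduced to first order, then integrated with classical fourth-order Runge-Kutta) can be sketched in a few lines. A minimal sketch, assuming an illustrative damped linear restoring term and an exponential stand-in for the measured crash pulse rather than the dissertation's actual latch model or data:

```python
import numpy as np

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Hypothetical component model: theta'' = -c*theta' - k*theta + a(t),
# with a(t) standing in for the measured transient acceleration.
c, k = 2.0, 400.0
a = lambda t: 50.0 * np.exp(-t / 0.05)

def f(t, y):
    theta, omega = y          # reduce the 2nd-order ODE to a 1st-order system
    return np.array([omega, -c * omega - k * theta + a(t)])

t, h, y = 0.0, 1e-4, np.array([0.0, 0.0])
peak = 0.0
while t < 0.2:                # integrate over the impact event
    y = rk4_step(f, t, y, h)
    t += h
    peak = max(peak, abs(y[0]))
print(f"peak rotation: {peak:.4f} rad")
```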
Computational structural mechanics methods research using an evolving framework
NASA Technical Reports Server (NTRS)
Knight, N. F., Jr.; Lotts, C. G.; Gillian, R. E.
1990-01-01
Advanced structural analysis and computational methods that exploit high-performance computers are being developed in a computational structural mechanics research activity sponsored by the NASA Langley Research Center. These new methods are developed in an evolving framework and applied to representative complex structural analysis problems from the aerospace industry. An overview of the methods development environment is presented, and methods research areas are described. Selected application studies are also summarized.
Analysis of NREL Cold-Drink Vending Machines for Energy Savings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deru, M.; Torcellini, P.; Bottom, K.
As part of Sustainable NREL, an initiative to improve the overall energy and environmental performance of the laboratory, NREL staff decided to control how its vending machines used energy. The cold-drink vending machines across the lab were analyzed for potential energy savings opportunities. This report presents the monitoring and analysis of two energy conservation measures applied to the cold-drink vending machines at NREL.
Quantitative analysis of voids in percolating structures in two-dimensional N-body simulations
NASA Technical Reports Server (NTRS)
Harrington, Patrick M.; Melott, Adrian L.; Shandarin, Sergei F.
1993-01-01
We present in this paper a quantitative method for defining void size in large-scale structure based on percolation threshold density. Beginning with two-dimensional gravitational clustering simulations smoothed to the threshold of nonlinearity, we perform percolation analysis to determine the large scale structure. The resulting objective definition of voids has a natural scaling property, is topologically interesting, and can be applied immediately to redshift surveys.
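The percolation criterion sketched below thresholds a smoothed density field and watches for the appearance of a single dominant connected cluster; the Gaussian random field is a hypothetical stand-in for the smoothed N-body output:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
density = ndimage.gaussian_filter(rng.normal(size=(256, 256)), sigma=4)

def largest_cluster_fraction(field, threshold):
    """Fraction of over-threshold cells lying in the largest connected cluster."""
    mask = field > threshold
    labels, n = ndimage.label(mask)
    if n == 0:
        return 0.0
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    return sizes.max() / mask.sum()

# Percolation sets in where the largest cluster abruptly dominates;
# voids are then the under-dense regions that cluster does not reach.
for thr in np.linspace(-1.0, 1.0, 9):
    frac = largest_cluster_fraction(density, thr)
    print(f"threshold {thr:+.2f}: largest-cluster fraction {frac:.2f}")
```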
Implementing a Quantitative Analysis Design Tool for Future Generation Interfaces
2012-03-01
with Remotely Piloted Aircraft (RPA) has resulted in the need for a platform to evaluate interface design. The Vigilant Spirit Control Station (VSCS) ... Spirit interface. A modified version of the HCI Index was successfully applied to perform a quantitative analysis of the baseline VSCS interface and ... time of the original VSCS interface. These results revealed the effectiveness of the tool and demonstrated in the design of future generation
ERIC Educational Resources Information Center
Jimenez, Evelyn
2013-01-01
This capstone project applied Clark and Estes' (2008) gap analysis framework to identify performance gaps, develop perceived root causes, validate the causes, and formulate research-based solutions to present to Trojan High School. The purpose was to examine ways to increase the academic achievement of ELL students, specifically Latinos, by…
Volumetric neuroimage analysis extensions for the MIPAV software package.
Bazin, Pierre-Louis; Cuzzocreo, Jennifer L; Yassa, Michael A; Gandler, William; McAuliffe, Matthew J; Bassett, Susan S; Pham, Dzung L
2007-09-15
We describe a new collection of publicly available software tools for performing quantitative neuroimage analysis. The tools perform semi-automatic brain extraction, tissue classification, Talairach alignment, and atlas-based measurements within a user-friendly graphical environment. They are implemented as plug-ins for MIPAV, a freely available medical image processing software package from the National Institutes of Health. Because the plug-ins and MIPAV are implemented in Java, both can be utilized on nearly any operating system platform. In addition to the software plug-ins, we have also released a digital version of the Talairach atlas that can be used to perform regional volumetric analyses. Several studies are conducted applying the new tools to simulated and real neuroimaging data sets.
Zhao, Bing Tian; Kim, Eun Jung; Son, Kun Ho; Son, Jong Keun; Min, Byung Sun; Woo, Mi Hee
2015-08-01
To establish a standard of quality control and to identify different origins for the Rutaceae family [Citri Unshiu Peel (CU), Citri Unshiu Immature Peel (CI), Ponciri Immature Fructus (PI), Aurantii Immature Fructus (AI), and Aurantii Fructus (AU)], 13 standards including rutin (1), narirutin (2), naringin (3), hesperidin (4), neohesperidin (5), neoponcirin (6), poncirin (7), naringenin (8), isosinensetin (9), sinensetin (10), nobiletin (11), heptamethoxyflavone (12), and tangeretin (13) were determined by high performance liquid chromatography (HPLC)/photo-diode array (PDA) analysis. A YMC ODS C18 (250 × 4.6 mm, 5 µm) column was used, and gradient elution was applied with water (A) and acetonitrile (B) as the mobile phases. This method was fully validated with respect to linearity, accuracy, precision, stability, and robustness. The HPLC/PDA method was applied successfully to quantify 13 major compounds in the extracts of CU, CI, PI, AI, and AU. Pattern recognition analysis combined with the LC chromatographic data was performed by repeated analysis of 27 reference samples of the above five Rutaceae oriental medicinal drugs. The established HPLC method was rapid and reliable for quantitative analysis and quality control of multiple components in five Rutaceae species with different origins.
Leavesley, Silas J; Sweat, Brenner; Abbott, Caitlyn; Favreau, Peter; Rich, Thomas C
2018-01-01
Spectral imaging technologies have been used for many years by the remote sensing community. More recently, these approaches have been applied to biomedical problems, where they have shown great promise. However, biomedical spectral imaging has been complicated by the high variance of biological data and the reduced ability to construct test scenarios with fixed ground truths. Hence, it has been difficult to objectively assess and compare biomedical spectral imaging assays and technologies. Here, we present a standardized methodology that allows assessment of the performance of biomedical spectral imaging equipment, assays, and analysis algorithms. This methodology incorporates real experimental data and a theoretical sensitivity analysis, preserving the variability present in biomedical image data. We demonstrate that this approach can be applied in several ways: to compare the effectiveness of spectral analysis algorithms, to compare the response of different imaging platforms, and to assess the level of target signature required to achieve a desired performance. Results indicate that it is possible to compare even very different hardware platforms using this methodology. Future applications could include a range of optimization tasks, such as maximizing detection sensitivity or acquisition speed, providing high utility for investigators ranging from design engineers to biomedical scientists. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Nanoliter-Scale Oil-Air-Droplet Chip-Based Single Cell Proteomic Analysis.
Li, Zi-Yi; Huang, Min; Wang, Xiu-Kun; Zhu, Ying; Li, Jin-Song; Wong, Catherine C L; Fang, Qun
2018-04-17
Single cell proteomic analysis provides crucial information on cellular heterogeneity in biological systems. Herein, we describe a nanoliter-scale oil-air-droplet (OAD) chip for achieving multistep complex sample pretreatment and injection for single cell proteomic analysis in the shotgun mode. By using miniaturized stationary droplet microreaction and manipulation techniques, our system allows all sample pretreatment and injection procedures to be performed in a nanoliter-scale droplet with minimum sample loss and a high sample injection efficiency (>99%), thus substantially increasing the analytical sensitivity for single cell samples. We applied the present system to the proteomic analysis of 100 ± 10, 50 ± 5, 10, and 1 HeLa cell(s), identifying 1360, 612, 192, and 51 protein IDs, respectively. The OAD chip-based system was further applied to single mouse oocyte analysis, with 355 protein IDs identified at the single oocyte level, demonstrating its advantages in sequence coverage, enrichment of hydrophobic proteins, and enzymatic digestion efficiency over the traditional in-tube system.
The use of the wavelet cluster analysis for asteroid family determination
NASA Technical Reports Server (NTRS)
Benjoya, Phillippe; Slezak, E.; Froeschle, Claude
1992-01-01
The determination of asteroid families has long been dependent on the analysis method. A new cluster analysis based on the wavelet transform has allowed an automatic definition of families with a degree of significance versus randomness. This method is rather general and can be applied to any kind of structural analysis; we concentrate here on its main features. The analysis has been performed on the set of 4100 asteroid proper elements computed by Milani and Knezevic (see Milani and Knezevic 1990). Twenty-one families have been found, and the influence of the chosen metric has been tested. The results have been compared to those of Zappala et al. (see Zappala et al. 1990), obtained with a completely different method applied to the same set of data. For the first time, good agreement has been found between the results of both methods, not only for the large, well-known families but also for the smallest ones.
Li, Zenghui; Xu, Bin; Yang, Jian; Song, Jianshe
2015-01-01
This paper focuses on suppressing spectral overlap for sub-band spectral estimation, with which we can greatly decrease the computational complexity of existing spectral estimation algorithms, such as nonlinear least squares spectral analysis and non-quadratic regularized sparse representation. Firstly, our study shows that the nominal ability of the high-order analysis filter to suppress spectral overlap is greatly weakened when filtering a finite-length sequence, because many meaningless zeros are used as samples in convolution operations. Next, an extrapolation-based filtering strategy is proposed to produce a series of estimates as the substitutions of the zeros and to recover the suppression ability. Meanwhile, a steady-state Kalman predictor is applied to perform a linearly-optimal extrapolation. Finally, several typical methods for spectral analysis are applied to demonstrate the effectiveness of the proposed strategy. PMID:25609038
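The extrapolation idea can be sketched as follows: fit a linear predictor to the observed record and use its recursive forecasts in place of the zeros beyond the record's end. The sketch below substitutes a least-squares autoregressive predictor for the paper's steady-state Kalman predictor; it is a labeled simplification, not the authors' implementation:

```python
import numpy as np

def ar_extrapolate(x, order=8, n_ext=32):
    """Fit x[n] ~ sum_k a[k] * x[n-k] by least squares, then predict
    recursively past the end of the record."""
    rows = [x[i:i + order][::-1] for i in range(len(x) - order)]
    a, *_ = np.linalg.lstsq(np.array(rows), x[order:], rcond=None)
    ext = list(x)
    for _ in range(n_ext):
        ext.append(np.dot(a, ext[-1:-order - 1:-1]))
    return np.array(ext)

# Two close sinusoids: the extrapolated tail replaces the meaningless
# zeros a high-order analysis filter would otherwise convolve with.
n = np.arange(128)
x = np.sin(0.2 * np.pi * n) + 0.5 * np.sin(0.23 * np.pi * n)
print(ar_extrapolate(x)[126:134].round(3))
```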
Energy and exergy assessments for an enhanced use of energy in buildings
NASA Astrophysics Data System (ADS)
Goncalves, Pedro Manuel Ferreira
Exergy analysis has been found to be a useful method for improving the conversion efficiency of energy resources, since it helps to identify locations, types and true magnitudes of wastes and losses. It has also been applied for other purposes, such as distinguishing high- from low-quality energy sources or defining the engineering technological limits in designing more energy-efficient systems. In this doctoral thesis, exergy analysis is widely applied in order to highlight and demonstrate it as a significant method of performing energy assessments of buildings and related energy supply systems. It aims to make the concept more familiar and accessible for building professionals and to encourage its wider use in engineering practice. Case study I aims to show the importance of exergy analysis in the energy performance assessment of eight space heating building options evaluated under different outdoor environmental conditions. This study is concerned with the so-called "reference state", which in this study is calculated using the average outdoor temperature for a given period of analysis. Primary energy and related exergy ratios are assessed and compared. Higher primary exergy ratios are obtained for low outdoor temperatures, while the primary energy ratios are assumed as constant for the same scenarios. The outcomes of this study demonstrate the significance of exergy analysis in comparison with energy analysis when different reference states are compared. Case study II and Case study III present two energy and exergy assessment studies applied to a hotel and a student accommodation building, respectively. Case study II compares the energy and exergy performance of the main end uses of a hotel building located in Coimbra in central Portugal, using data derived from an energy audit. Case study III uses data collected from energy utility bills to estimate the energy and exergy performance associated with each building end use. Additionally, a set of energy supply options are proposed and assessed in terms of primary energy demand and exergy efficiency, showing this as a possible benchmarking method for future legislative frameworks regarding the energy performance assessment of buildings. Case study IV proposes a set of complementary indicators for comparing cogeneration and separate heat and electricity production systems. It aims to identify the advantages of exergy analysis relative to energy analysis, giving particular examples where these advantages are significant. The results demonstrate that exergy analysis can reveal meaningful information that might not be accessible using a conventional energy analysis approach, which is particularly evident when cogeneration and separated systems provide heat at very different temperatures. Case study V follows the exergy analysis method to evaluate the energy and exergy performance of a desiccant cooling system, aiming to assess and locate sources of irreversibility. The results reveal that the natural gas boiler is the most inefficient component of the plant in question, followed by the chiller and heating coil. A set of alternative heating supply options for desiccant wheel regeneration is proposed, showing that, while some renewables may effectively reduce the primary energy demand of the plant, this may not correspond to the optimum level of exergy efficiency. The thermal and chemical exergy components of moist air are also evaluated, as well as the influence of outdoor environmental conditions on the energy/exergy performance of the plant.
This research provides knowledge that is essential for the future development of complementary energy- and exergy-based indicators, helping to improve the current methodologies on performance assessments of buildings, cogeneration and desiccant cooling systems. The significance of exergy analysis is demonstrated for different types of buildings, which may be located in different climates (reference states) and be supplied by different types of energy sources. (Abstract shortened by ProQuest.).
Exploiting spectral content for image segmentation in GPR data
NASA Astrophysics Data System (ADS)
Wang, Patrick K.; Morton, Kenneth D., Jr.; Collins, Leslie M.; Torrione, Peter A.
2011-06-01
Ground-penetrating radar (GPR) sensors provide an effective means for detecting changes in the sub-surface electrical properties of soils, such as changes indicative of landmines or other buried threats. However, most GPR-based pre-screening algorithms only localize target responses along the surface of the earth, and do not provide information regarding an object's position in depth. As a result, feature extraction algorithms are forced to process entire cubes of data around pre-screener alarms, which can reduce feature fidelity and hamper performance. In this work, spectral analysis is investigated as a method for locating subsurface anomalies in GPR data. In particular, a 2-D spatial/frequency decomposition is applied to pre-screener-flagged GPR B-scans. Analysis of these spatial/frequency regions suggests that aspects (e.g. moments, maxima, mode) of the frequency distribution of GPR energy can be indicative of the presence of target responses. After translating a GPR image into a function of the spatial/frequency distributions at each pixel, several image segmentation approaches can be applied to perform segmentation in this new transformed feature space. To illustrate the efficacy of the approach, a performance comparison between feature processing with and without the image segmentation algorithm is provided.
Methodology, Methods, and Metrics for Testing and Evaluating Augmented Cognition Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greitzer, Frank L.
The augmented cognition research community seeks cognitive neuroscience-based solutions to improve warfighter performance by applying and managing mitigation strategies to reduce workload and improve the throughput and quality of decisions. The focus of augmented cognition mitigation research is to define, demonstrate, and exploit neuroscience and behavioral measures that support inferences about the warfighter's cognitive state that prescribe the nature and timing of mitigation. A research challenge is to develop valid evaluation methodologies, metrics and measures to assess the impact of augmented cognition mitigations. Two considerations are external validity, which is the extent to which the results apply to operational contexts; and internal validity, which reflects the reliability of performance measures and the conclusions based on analysis of results. The scientific rigor of the research methodology employed in conducting empirical investigations largely affects the validity of the findings. External validity requirements also compel us to demonstrate operational significance of mitigations. Thus it is important to demonstrate effectiveness of mitigations under specific conditions. This chapter reviews some cognitive science and methodological considerations in designing augmented cognition research studies and associated human performance metrics and analysis methods to assess the impact of augmented cognition mitigations.
Analysis performance of proton exchange membrane fuel cell (PEMFC)
NASA Astrophysics Data System (ADS)
Mubin, A. N. A.; Bahrom, M. H.; Azri, M.; Ibrahim, Z.; Rahim, N. A.; Raihan, S. R. S.
2017-06-01
Recently, the proton exchange membrane fuel cell (PEMFC) has gained much attention in renewable energy technology as a mechanically simple, zero-emission power source. PEMFC performance depends on surrounding conditions such as temperature and pressure. This paper presents an analysis of the performance of the PEMFC by developing a mathematical thermodynamic model using Matlab/Simulink. The differential equations of the thermodynamic model are used to explain the contribution of heat to the output voltage performance of the PEMFC. In addition, the partial pressure equation of hydrogen is included in the mathematical model to study the PEMFC voltage behaviour in relation to the input hydrogen pressure. The efficiency of the model is 33.8%, calculated by applying the energy conversion device equations for thermal efficiency. The PEMFC's voltage output performance is increased by increasing the hydrogen input pressure and temperature.
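The reported trend (higher hydrogen pressure and temperature raising cell voltage) follows from a Nernst-plus-losses model. A hedged sketch; the open-circuit potential, Tafel and ohmic parameters below are illustrative assumptions, not values from the paper:

```python
import numpy as np

R, F = 8.314, 96485.0                      # gas constant, Faraday constant

def cell_voltage(T, p_h2, p_o2, i, e0=1.229, r_ohm=2.5e-4, i0=1e-4):
    """Simplified PEMFC polarization: Nernst potential minus activation
    (Tafel, alpha = 0.5 assumed) and ohmic losses."""
    e_nernst = e0 + (R * T / (2 * F)) * np.log(p_h2 * np.sqrt(p_o2))
    v_act = (R * T / F) * np.log(i / i0)   # Tafel slope for alpha = 0.5, n = 2
    v_ohm = i * r_ohm
    return e_nernst - v_act - v_ohm

for p in (1.0, 2.0, 3.0):                  # hydrogen feed pressure, atm
    print(f"p_H2 = {p} atm -> V = {cell_voltage(353.15, p, 0.21, 500.0):.3f} V")
```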
A case study using the PrOACT-URL and BRAT frameworks for structured benefit risk assessment.
Nixon, Richard; Dierig, Christoph; Mt-Isa, Shahrul; Stöckert, Isabelle; Tong, Thaison; Kuhls, Silvia; Hodgson, Gemma; Pears, John; Waddingham, Ed; Hockley, Kimberley; Thomson, Andrew
2016-01-01
While benefit-risk assessment is a key component of the drug development and maintenance process, it is often described in a narrative. In contrast, structured benefit-risk assessment builds on established ideas from decision analysis and comprises a qualitative framework and quantitative methodology. We compare two such frameworks, applying multi-criteria decision analysis (MCDA) within the PrOACT-URL framework and weighted net clinical benefit (wNCB) within the BRAT framework. These are applied to a case study of natalizumab for the treatment of relapsing remitting multiple sclerosis. We focus on the practical considerations of applying these methods and give recommendations for visual presentation of results. In the case study, we found structured benefit-risk analysis to be a useful tool for structuring, quantifying, and communicating the relative benefit and safety profiles of drugs in a transparent, rational and consistent way. The two frameworks were similar. MCDA is a generic and flexible methodology that can be used to perform a structured benefit-risk assessment in any common context. wNCB is a special case of MCDA and is shown to be equivalent to an extension of the number needed to treat (NNT) principle. It is simpler to apply and understand than MCDA and can be applied when all outcomes are measured on a binary scale. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
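The quantitative core of an MCDA is a weighted sum of preference-scaled criterion scores. A minimal sketch with hypothetical criteria, weights, and scores, not the natalizumab case-study values:

```python
import numpy as np

criteria = ["relapse reduction", "disability benefit",
            "PML risk", "infusion reactions"]
weights = np.array([0.4, 0.3, 0.2, 0.1])       # elicited swing weights, sum to 1
# Preference-scaled scores on 0-100 (higher is always better, so risks
# are scored by their absence):
scores = {"treatment":  np.array([80.0, 70.0, 40.0, 60.0]),
          "comparator": np.array([20.0, 30.0, 95.0, 90.0])}
for alt, s in scores.items():
    print(f"{alt:10s} overall benefit-risk score: {weights @ s:.1f}")
```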
NASA Astrophysics Data System (ADS)
Volodin, Boris; Dolgy, Sergei; Ban, Vladimir S.; Gracin, Davor; Juraić, Krunoslav; Gracin, Leo
2014-03-01
Shifted Excitation Raman Difference Spectroscopy (SERDS) has proven an effective method for performing Raman analysis of fluorescent samples. This technique achieves excellent signal-to-noise performance with shorter excitation wavelengths, thus taking full advantage of the superior signal strength afforded by shorter excitation wavelengths and the superior performance, combined with lower cost, delivered by silicon CCDs. The technique is enabled by the use of two closely spaced fixed-wavelength laser diode sources stabilized with volume Bragg gratings (VBGs). A side-by-side comparison reveals that the SERDS technique delivers superior signal-to-noise ratio and better detection limits in most situations, even when a longer excitation wavelength is employed for the purpose of eliminating fluorescence. We have applied the SERDS technique to the quantitative analysis of trace amounts of methanol in red wines, which is an important task in quality control operations within the wine industry and is currently difficult to perform in the field. So far, conventional Raman spectroscopy analysis of red wines has been impractical due to the high degree of fluorescence.
Regalia, Giulia; Coelli, Stefania; Biffi, Emilia; Ferrigno, Giancarlo; Pedrocchi, Alessandra
2016-01-01
Neuronal spike sorting algorithms are designed to retrieve neuronal network activity on a single-cell level from extracellular multiunit recordings with Microelectrode Arrays (MEAs). In typical analysis of MEA data, one spike sorting algorithm is applied indiscriminately to all electrode signals. However, this approach neglects the dependency of algorithms' performances on the neuronal signals properties at each channel, which require data-centric methods. Moreover, sorting is commonly performed off-line, which is time and memory consuming and prevents researchers from having an immediate glance at ongoing experiments. The aim of this work is to provide a versatile framework to support the evaluation and comparison of different spike classification algorithms suitable for both off-line and on-line analysis. We incorporated different spike sorting "building blocks" into a Matlab-based software, including 4 feature extraction methods, 3 feature clustering methods, and 1 template matching classifier. The framework was validated by applying different algorithms on simulated and real signals from neuronal cultures coupled to MEAs. Moreover, the system has been proven effective in running on-line analysis on a standard desktop computer, after the selection of the most suitable sorting methods. This work provides a useful and versatile instrument for a supported comparison of different options for spike sorting towards more accurate off-line and on-line MEA data analysis.
NASA Astrophysics Data System (ADS)
Kamakoshi, Y.; Nishida, S.; Kanbe, K.; Shohji, I.
2017-10-01
In recent years, powder metallurgy (P/M) materials have increasingly been considered for automotive products, which require not only high cost performance but also greater strength, wear resistance, and longer life. Densification is expected to be one of the effective processes for improving the mechanical properties of P/M materials. In this study, to examine the densification behaviour of Mo-alloyed sintered steel in a cold-forging process, finite element method (FEM) analysis was performed. Firstly, a columnar specimen was cut out from the inner part of a sintered specimen and a load-stroke diagram was obtained by a compression test. 2D FEM analysis was performed using the obtained load-stroke diagram. To correct the errors in stress between the porous mode and the rigid-elastic mode of the analysis software, a polynomial approximation analysis was performed. As a result, a modified true stress-true strain diagram was obtained for the sintered steel with densification. Afterwards, 3D FEM analysis of backward extrusion was carried out using the modified true stress-true strain diagram. It was confirmed that both the shape and density of the sintered steel predicted by the proposed FEM analysis correspond well with the experimental ones.
10 CFR 431.17 - Determination of efficiency.
Code of Federal Regulations, 2014 CFR
2014-01-01
... characteristics of that basic model, and (ii) Based on engineering or statistical analysis, computer simulation or... simulation or modeling, and other analytic evaluation of performance data on which the AEDM is based... applied. (iii) If requested by the Department, the manufacturer shall conduct simulations to predict the...
10 CFR 431.17 - Determination of efficiency.
Code of Federal Regulations, 2012 CFR
2012-01-01
... characteristics of that basic model, and (ii) Based on engineering or statistical analysis, computer simulation or... simulation or modeling, and other analytic evaluation of performance data on which the AEDM is based... applied. (iii) If requested by the Department, the manufacturer shall conduct simulations to predict the...
ERIC Educational Resources Information Center
Science and Children, 1992
1992-01-01
Presents an activity that is part of the National Science and Technology Week (NSTW) 1993. Students apply principles of biomechanics to find the most effective techniques for performing a standing broad jump and use that analysis to improve their own jumping. Instructions include procedures, materials needed, and possible extensions to the…
ERIC Educational Resources Information Center
Lehrer, Adrienne
1975-01-01
A structural analysis of the wine vocabulary used by wine experts is given. Experiments involving typical wine drinkers show that there is little consensus in how the words are applied to wine. Communication tasks show that the sender and receiver of messages about wine perform little better than chance. (Author/RM)
Construction of a Physician Skills Inventory
ERIC Educational Resources Information Center
Richard, George V.; Zarconi, Joseph; Savickas, Mark L.
2012-01-01
The current study applied Holland's RIASEC typology to develop a "Physician Skills Inventory". We identified the transferable skills and abilities that are critical to effective performance in medicine and had 140 physicians in 25 different specialties rate the importance of those skills. Principal component analysis of their responses produced…
Battery energy storage sizing when time of use pricing is applied.
Carpinelli, Guido; Khormali, Shahab; Mottola, Fabio; Proto, Daniela
2014-01-01
Battery energy storage systems (BESSs) are considered a key device to be introduced to actuate the smart grid paradigm. However, the most critical aspect related to the use of such a device is its economic feasibility, as it is still a developing technology characterized by high costs and limited life duration. In particular, the sizing of BESSs must be performed in an optimized way in order to maximize the benefits related to their use. This paper presents a simple and quick closed-form procedure for the sizing of BESSs in residential and industrial applications when time-of-use tariff schemes are applied. A sensitivity analysis is also performed to consider different perspectives in terms of life span and future costs.
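The flavor of such a closed-form sizing reduces to time-of-use arbitrage arithmetic. A sketch with illustrative tariff, efficiency, and cost figures (not the paper's coefficients):

```python
e_peak_daily = 20.0          # kWh drawn daily during peak-tariff hours
p_peak, p_off = 0.30, 0.12   # tariffs, currency per kWh
eta, dod = 0.90, 0.80        # round-trip efficiency, usable depth of discharge
c_batt, life_yr = 300.0, 10  # battery cost per kWh and service life

capacity = e_peak_daily / (eta * dod)        # rated size covering the peak window
# Daily saving: avoid peak-priced energy, buy it off-peak with efficiency losses.
annual_saving = e_peak_daily * (p_peak - p_off / eta) * 365
payback = capacity * c_batt / annual_saving
print(f"rated capacity: {capacity:.1f} kWh")
print(f"annual saving : {annual_saving:.0f} per year")
print(f"simple payback: {payback:.1f} years (battery life {life_yr} years)")
```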
NASA Astrophysics Data System (ADS)
Agafonova, N.; Aleksandrov, A.; Anokhina, A.; Aoki, S.; Ariga, A.; Ariga, T.; Bender, D.; Bertolin, A.; Bozza, C.; Brugnera, R.; Buonaura, A.; Buontempo, S.; Büttner, B.; Chernyavsky, M.; Chukanov, A.; Consiglio, L.; D'Ambrosio, N.; De Lellis, G.; De Serio, M.; Del Amo Sanchez, P.; Di Crescenzo, A.; Di Ferdinando, D.; Di Marco, N.; Dmitrievski, S.; Dracos, M.; Duchesneau, D.; Dusini, S.; Dzhatdoev, T.; Ebert, J.; Ereditato, A.; Fini, R. A.; Fukuda, T.; Galati, G.; Garfagnini, A.; Giacomelli, G.; Göllnitz, C.; Goldberg, J.; Gornushkin, Y.; Grella, G.; Guler, M.; Gustavino, C.; Hagner, C.; Hara, T.; Hollnagel, A.; Hosseini, B.; Ishida, H.; Ishiguro, K.; Jakovcic, K.; Jollet, C.; Kamiscioglu, C.; Kamiscioglu, M.; Kawada, J.; Kim, J. H.; Kim, S. H.; Kitagawa, N.; Klicek, B.; Kodama, K.; Komatsu, M.; Kose, U.; Kreslo, I.; Lauria, A.; Lenkeit, J.; Ljubicic, A.; Longhin, A.; Loverre, P.; Malgin, A.; Malenica, M.; Mandrioli, G.; Matsuo, T.; Matveev, V.; Mauri, N.; Medinaceli, E.; Meregaglia, A.; Mikado, S.; Monacelli, P.; Montesi, M. C.; Morishima, K.; Muciaccia, M. T.; Naganawa, N.; Naka, T.; Nakamura, M.; Nakano, T.; Nakatsuka, Y.; Niwa, K.; Ogawa, S.; Okateva, N.; Olshevsky, A.; Omura, T.; Ozaki, K.; Paoloni, A.; Park, B. D.; Park, I. G.; Pasqualini, L.; Pastore, A.; Patrizii, L.; Pessard, H.; Pistillo, C.; Podgrudkov, D.; Polukhina, N.; Pozzato, M.; Pupilli, F.; Roda, M.; Rokujo, H.; Roganova, T.; Rosa, G.; Ryazhskaya, O.; Sato, O.; Schembri, A.; Shakiryanova, I.; Shchedrina, T.; Sheshukov, A.; Shibuya, H.; Shiraishi, T.; Shoziyoev, G.; Simone, S.; Sioli, M.; Sirignano, C.; Sirri, G.; Spinetti, M.; Stanco, L.; Starkov, N.; Stellacci, S. M.; Stipcevic, M.; Strauss, T.; Strolin, P.; Takahashi, S.; Tenti, M.; Terranova, F.; Tioukov, V.; Tufanli, S.; Vilain, P.; Vladimirov, M.; Votano, L.; Vuilleumier, J. L.; Wilquet, G.; Wonsak, B.; Yoon, C. S.; Zemskova, S.; Zghiche, A.
2014-08-01
The OPERA experiment, designed to perform the first observation of νμ → ντ oscillations in appearance mode through the detection of the τ leptons produced in charged current ντ interactions, has collected data from 2008 to 2012. In the present paper, the procedure developed to detect τ particle decays, occurring over distances of the order of 1 mm from the neutrino interaction point, is described in detail and applied to the search for charmed hadrons, which show decay topologies similar to those of the τ lepton. In the analysed sample, 50 charm decay candidate events are observed, in agreement with the expected number, proving that the detector performance and the analysis chain applied to neutrino events are well reproduced by the OPERA simulation, thus validating the methods for ντ appearance detection.
Buckling Testing and Analysis of Space Shuttle Solid Rocket Motor Cylinders
NASA Technical Reports Server (NTRS)
Weidner, Thomas J.; Larsen, David V.; McCool, Alex (Technical Monitor)
2002-01-01
A series of full-scale buckling tests were performed on the space shuttle Reusable Solid Rocket Motor (RSRM) cylinders. The tests were performed to determine the buckling capability of the cylinders and to provide data for analytical comparison. A nonlinear ANSYS Finite Element Analysis (FEA) model was used to represent and evaluate the testing. Analytical results demonstrated excellent correlation to test results, predicting the failure load within 5%. The analytical value was on the conservative side, predicting a lower failure load than was applied to the test. The resulting study and analysis indicated the important parameters for FEA to accurately predict buckling failure. The resulting method was subsequently used to establish the pre-launch buckling capability of the space shuttle system.
Near infrared spectroscopy of human muscles
NASA Astrophysics Data System (ADS)
Gasbarrone, R.; Currà, A.; Cardillo, A.; Bonifazi, G.; Serranti, S.
2018-02-01
Optical spectroscopy is a powerful tool in research and industrial applications. Its properties of being rapid, non-invasive and non-destructive make it a promising technique for qualitative as well as quantitative analysis in medicine. Recent advances in materials and fabrication techniques have provided portable, high-performing sensing spectrometers readily operated by user-friendly cabled or wireless systems. We used such a system to test whether infrared spectroscopy techniques, currently utilized in many areas such as the primary/secondary raw materials sector, cultural heritage, the agricultural/food industry, environmental remote and proximal sensing, and the pharmaceutical industry, could be applied in living humans to categorize muscles. We acquired muscle infrared spectra in the Vis-SWIR region (350-2500 nm), utilizing an ASD FieldSpec 4 Standard-Res Spectroradiometer with a spectral sampling capability of 1.4 nm at 350-1000 nm and 1.1 nm at 1001-2500 nm. After preliminary spectra pre-processing (i.e. signal scattering reduction), Principal Component Analysis (PCA) was applied to identify the presence of similar spectral features and to realize their further grouping. Partial Least-Squares Discriminant Analysis (PLS-DA) was utilized to implement discrimination/prediction models. We studied 22 healthy subjects (age 25-89 years, 11 females) by acquiring Vis-SWIR spectra from the upper limb muscles (i.e. biceps, a forearm flexor, and triceps, a forearm extensor). Spectroscopy was performed in fixed limb postures (elbow angle approximately 90°). We found that optical spectroscopy can be applied to study human tissues in vivo: Vis-SWIR spectra acquired from the arm detect muscles and distinguish flexors from extensors.
Park, Daeryong; Roesner, Larry A
2012-12-15
This study examined pollutant loads released to receiving water from a typical urban watershed in the Los Angeles (LA) Basin of California by applying a best management practice (BMP) performance model that includes uncertainty. This BMP performance model uses the k-C model and incorporates uncertainty analysis and the first-order second-moment (FOSM) method to assess the effectiveness of BMPs for removing stormwater pollutants. Uncertainties were considered for the influent event mean concentration (EMC) and the areal removal rate constant of the k-C model. The storage treatment overflow and runoff model (STORM) was used to simulate the flow volume from the watershed, the bypass flow volume and the flow volume that passes through the BMP. Detention basins and total suspended solids (TSS) were chosen as representative of stormwater BMPs and pollutants, respectively. This paper applies load frequency curves (LFCs), which replace the exceedance percentage with an exceedance frequency, as an alternative to load duration curves (LDCs) for evaluating the effectiveness of BMPs. An evaluation method based on uncertainty analysis is suggested because it applies a water quality standard exceedance based on frequency and magnitude. As a result, the incorporation of uncertainty in the estimates of pollutant loads can assist stormwater managers in determining the degree of total maximum daily load (TMDL) compliance that could be expected from a given BMP in a watershed. Copyright © 2012 Elsevier Ltd. All rights reserved.
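The two ingredients named, first-order removal and FOSM uncertainty propagation, combine in a few lines. A sketch using the k-C* form common in BMP modeling; the numbers and the background concentration C* are assumed for illustration:

```python
import numpy as np

def kc_effluent(c_in, k, q, c_star=5.0):
    """k-C* removal: effluent concentration for areal rate constant k
    and hydraulic loading rate q (same length units per year)."""
    return c_star + (c_in - c_star) * np.exp(-k / q)

def fosm_variance(c_in, k, q, var_cin, var_k, c_star=5.0):
    """First-order second-moment propagation of uncertainty in the
    influent EMC and in k through the k-C* model."""
    d_cin = np.exp(-k / q)                       # d(c_out)/d(c_in)
    d_k = -(c_in - c_star) * np.exp(-k / q) / q  # d(c_out)/d(k)
    return d_cin**2 * var_cin + d_k**2 * var_k

c_in, k, q = 100.0, 30.0, 50.0                   # illustrative TSS case
mean = kc_effluent(c_in, k, q)
sd = np.sqrt(fosm_variance(c_in, k, q, var_cin=400.0, var_k=25.0))
print(f"effluent TSS ~ {mean:.1f} +/- {sd:.1f} mg/L (FOSM)")
```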
Azadeh, Ali; Sheikhalishahi, Mohammad
2015-06-01
A unique framework for performance optimization of generation companies (GENCOs) based on health, safety, environment, and ergonomics (HSEE) indicators is presented. To rank this sector of industry, a combination of data envelopment analysis (DEA), principal component analysis (PCA), and Taguchi methods is used for all branches of GENCOs. These methods are applied in an integrated manner to measure GENCO performance. The preferred model among DEA, PCA, and Taguchi is selected based on sensitivity analysis and maximum correlation between rankings. To achieve the stated objectives, noise is introduced into the input data. The results show that Taguchi outperforms the other methods. Moreover, a comprehensive experiment is carried out to identify the most influential factor for ranking GENCOs. The approach developed in this study could be used for continuous assessment and improvement of GENCO performance in supplying energy with respect to HSEE factors. The results of such studies would help managers to gain a better understanding of weak and strong points in terms of HSEE factors.
Feature-space-based FMRI analysis using the optimal linear transformation.
Sun, Fengrong; Morris, Drew; Lee, Wayne; Taylor, Margot J; Mills, Travis; Babyn, Paul S
2010-09-01
The optimal linear transformation (OLT), an image analysis technique of feature space, was first presented in the field of MRI. This paper proposes a method of extending OLT from MRI to functional MRI (fMRI) to improve the activation-detection performance over conventional approaches of fMRI analysis. In this method, first, ideal hemodynamic response time series for different stimuli were generated by convolving the theoretical hemodynamic response model with the stimulus timing. Second, constructing hypothetical signature vectors for different activity patterns of interest by virtue of the ideal hemodynamic responses, OLT was used to extract features of fMRI data. The resultant feature space had particular geometric clustering properties. It was then classified into different groups, each pertaining to an activity pattern of interest; the applied signature vector for each group was obtained by averaging. Third, using the applied signature vectors, OLT was applied again to generate fMRI composite images with high SNRs for the desired activity patterns. Simulations and a blocked fMRI experiment were employed for the method to be verified and compared with the general linear model (GLM)-based analysis. The simulation studies and the experimental results indicated the superiority of the proposed method over the GLM-based analysis in detecting brain activities.
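The first step, building ideal response time series by convolving a hemodynamic response model with the stimulus timing, is easily sketched. The double-gamma HRF parameters below are common SPM-style defaults, assumed here rather than taken from the paper:

```python
import numpy as np
from scipy.stats import gamma

tr, n_scans = 2.0, 120
t = np.arange(0, 32, tr)                      # HRF support in seconds
hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6  # double-gamma shape (assumed)
hrf /= hrf.max()

stimulus = np.zeros(n_scans)                  # blocked design: three epochs
stimulus[15:30] = stimulus[45:60] = stimulus[75:90] = 1.0

ideal = np.convolve(stimulus, hrf)[:n_scans]  # ideal hemodynamic time series
print(ideal.round(2)[:40])
```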
Sleep-deprivation effect on human performance: a meta-analysis approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Candice D. Griffith; Sankaran Mahadevan
Human fatigue is hard to define since there is no direct measure of fatigue, much like stress. Instead fatigue must be inferred from measures that are affected by fatigue. One such measurable output affected by fatigue is reaction time. In this study the relationship of reaction time to sleep deprivation is studied. These variables were selected because reaction time and hours of sleep deprivation are straightforward characteristics of fatigue with which to begin the investigation of fatigue effects on performance. Meta-analysis, a widely used procedure in medical and psychological studies, is applied in this study to the variety of fatigue literature collected from various fields. Meta-analysis establishes a procedure for coding and analyzing information from various studies to compute an effect size. In this research the effect size reported is the difference between standardized means, and is found to be -0.6341, implying a strong relationship between sleep deprivation and performance degradation.
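The reported statistic is a standardized difference between means, pooled across studies by inverse-variance weighting. A sketch with hypothetical per-study summaries (rested vs. sleep-deprived performance scores), not the data collected in this study:

```python
import numpy as np

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference (group 1 minus group 2, pooled SD)."""
    s_pool = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / s_pool

# (rested mean, SD, n, deprived mean, SD, n); higher score = better.
studies = [(100, 15, 30, 90, 16, 28),
           (98, 14, 25, 89, 15, 25),
           (102, 13, 40, 95, 14, 41)]
ds, ws = [], []
for m1, s1, n1, m2, s2, n2 in studies:
    d = cohens_d(m2, s2, n2, m1, s1, n1)       # deprived minus rested
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    ds.append(d)
    ws.append(1.0 / var_d)
pooled = np.dot(ws, ds) / np.sum(ws)           # fixed-effect pooling
print(f"pooled standardized mean difference: {pooled:.3f}")
```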
Zheng, Yunliang; Luan, Lianjun; Chen, Yong; Ren, Yiping; Wu, Yongjiang
2012-12-01
Physalins are important bioactive compounds from the genus Physalis. They often occur as isomers, which makes structural elucidation difficult. In the present study, the fragmentation behavior and UV characteristics of seven physalins from the genus Physalis were first investigated using electrospray ionization tandem mass spectrometry (ESI-MS/MS) and diode array detection (DAD). Combined with ultra-performance liquid chromatography (UPLC) and DAD, the established approach to the structural identification of physalins by ESI-MS/MS was then applied to the analysis of Physalis alkekengi L. According to the UPLC retention behavior, the diagnostic UV spectra and the molecular structural information provided by the MS/MS spectra, 19 fingerprint peaks were identified, including 14 physalins and 5 other compounds. Finally, the established fingerprint method was applied to the analysis of 31 P. alkekengi L. samples collected from different locations, which reflected their similar chemical constituent properties. The proposed method provides a scientific and technical platform for the herbal industry for quality control and safety assurance of herbal preparations that contain this class of physalins. Copyright © 2012 Elsevier B.V. All rights reserved.
Comparative analysis of peak-detection techniques for comprehensive two-dimensional chromatography.
Latha, Indu; Reichenbach, Stephen E; Tao, Qingping
2011-09-23
Comprehensive two-dimensional gas chromatography (GC×GC) is a powerful technology for separating complex samples. The typical goal of GC×GC peak detection is to aggregate data points of analyte peaks based on their retention times and intensities. Two techniques commonly used for two-dimensional peak detection are the two-step algorithm and the watershed algorithm. A recent study [4] compared the performance of the two-step and watershed algorithms for GC×GC data with retention-time shifts in the second-column separations. In that analysis, the peak retention-time shifts were corrected while applying the two-step algorithm but the watershed algorithm was applied without shift correction. The results indicated that the watershed algorithm has a higher probability of erroneously splitting a single two-dimensional peak than the two-step approach. This paper reconsiders the analysis by comparing peak-detection performance for resolved peaks after correcting retention-time shifts for both the two-step and watershed algorithms. Simulations with wide-ranging conditions indicate that when shift correction is employed with both algorithms, the watershed algorithm detects resolved peaks with greater accuracy than the two-step method. Copyright © 2011 Elsevier B.V. All rights reserved.
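A minimal sketch of watershed-based 2-D peak detection on a synthetic GC×GC tile, using scikit-image. Marker seeding and thresholds are assumptions, and the retention-time shift correction discussed above is omitted:

```python
import numpy as np
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# Synthetic tile: two partially overlapping Gaussian analyte peaks.
y, x = np.mgrid[0:80, 0:80]
img = (np.exp(-((x - 30)**2 + (y - 40)**2) / 60.0)
       + np.exp(-((x - 48)**2 + (y - 42)**2) / 60.0))

# Seed markers at local maxima, then flood the inverted surface; each
# basin aggregates the data points belonging to one analyte peak.
coords = peak_local_max(img, min_distance=5, threshold_abs=0.1)
markers = np.zeros(img.shape, dtype=int)
markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
labels = watershed(-img, markers, mask=img > 0.05)
print("detected peaks:", labels.max())
```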
Effects of automation of information-processing functions on teamwork.
Wright, Melanie C; Kaber, David B
2005-01-01
We investigated the effects of automation as applied to different stages of information processing on team performance in a complex decision-making task. Forty teams of 2 individuals performed a simulated Theater Defense Task. Four automation conditions were simulated with computer assistance applied to realistic combinations of information acquisition, information analysis, and decision selection functions across two levels of task difficulty. Multiple measures of team effectiveness and team coordination were used. Results indicated different forms of automation have different effects on teamwork. Compared with a baseline condition, an increase in automation of information acquisition led to an increase in the ratio of information transferred to information requested; an increase in automation of information analysis resulted in higher team coordination ratings; and automation of decision selection led to better team effectiveness under low levels of task difficulty but at the cost of higher workload. The results support the use of early and intermediate forms of automation related to acquisition and analysis of information in the design of team tasks. Decision-making automation may provide benefits in more limited contexts. Applications of this research include the design and evaluation of automation in team environments.
Almeida, Mariana R; Correa, Deleon N; Zacca, Jorge J; Logrado, Lucio Paulo Lima; Poppi, Ronei J
2015-02-20
The aim of this study was to develop a methodology using Raman hyperspectral imaging and chemometric methods for identification of pre- and post-blast explosive residues on banknote surfaces. The explosives studied were of military, commercial and propellant types. After acquisition of the hyperspectral image, independent component analysis (ICA) was applied to extract the pure spectra and the distribution of the corresponding image constituents. The performance of the methodology was evaluated by the explained variance and the lack of fit of the models, by comparing the ICA recovered spectra with the reference spectra using correlation coefficients, and by the presence of rotational ambiguity in the ICA solutions. The methodology was applied to forensic samples to solve an automated teller machine explosion case. Independent component analysis proved to be a suitable curve resolution method, achieving performance equivalent to multivariate curve resolution with alternating least squares (MCR-ALS). At low concentrations, MCR-ALS presents some limitations, as it did not provide the correct solution. The detection limit of the methodology presented in this study was 50 μg cm⁻². Copyright © 2014 Elsevier B.V. All rights reserved.
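The resolution step, recovering pure spectra and their per-pixel distributions from the unrolled hyperspectral cube, can be sketched with scikit-learn's FastICA on synthetic data; the two Gaussian "spectra" and noise level are stand-ins, not measured explosive signatures:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
wn = np.linspace(0, 1, 300)                         # pseudo Raman-shift axis
pure = np.vstack([np.exp(-(wn - 0.3)**2 / 2e-3),    # two "pure" spectra
                  np.exp(-(wn - 0.7)**2 / 2e-3)])
abund = rng.random((64 * 64, 2))                    # unrolled abundance maps
cube = abund @ pure + 0.01 * rng.normal(size=(64 * 64, 300))

ica = FastICA(n_components=2, random_state=0)
maps = ica.fit_transform(cube)                      # constituent distributions
spectra = ica.mixing_.T                             # recovered spectra (sign/scale free)
corr = np.abs(np.corrcoef(np.vstack([spectra, pure]))[:2, 2:])
print("best |correlation| with true spectra:", corr.max(axis=1).round(3))
```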
String Stability of a Linear Formation Flight Control System
NASA Technical Reports Server (NTRS)
Allen, Michael J.; Ryan, Jack; Hanson, Curtis E.; Parle, James F.
2002-01-01
String stability analysis of an autonomous formation flight system was performed using linear and nonlinear simulations. String stability is a measure of how position errors propagate from one vehicle to another in a cascaded system. In the formation flight system considered here, each ith aircraft uses information from itself and the preceding (i-1)th aircraft to track a commanded relative position. A possible solution for meeting performance requirements with such a system is to allow string instability. This paper explores two results of string instability and outlines analysis techniques for string unstable systems. The three analysis techniques presented here are: linear, nonlinear formation performance, and ride quality. The linear technique was developed from a worst-case scenario and could be applied to the design of a string unstable controller. The nonlinear formation performance and ride quality analysis techniques both use nonlinear formation simulation. Three of the four formation-controller gain-sets analyzed in this paper were limited more by ride quality than by performance. Formations of up to seven aircraft in a cascaded formation could be used in the presence of light gusts with this string unstable system.
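String stability can be checked by asking whether the error-propagation transfer function from one aircraft to the next has magnitude at most one at every frequency. The sketch below uses a hypothetical unit-mass PD predecessor-following model, not the paper's controller:

```python
import numpy as np
from scipy import signal

# Spacing-error propagation for a unit-mass PD follower:
#   H(s) = (kd*s + kp) / (s^2 + kd*s + kp);
# string stability requires |H(jw)| <= 1 for all w.
kp, kd = 1.0, 1.4
H = signal.TransferFunction([kd, kp], [1.0, kd, kp])
w, mag_db, _ = signal.bode(H, np.logspace(-2, 1, 500))
peak = 10 ** (mag_db.max() / 20)
print(f"peak |H(jw)| = {peak:.3f} -> "
      f"{'string stable' if peak <= 1.0 else 'string unstable'}")
```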
NASA Astrophysics Data System (ADS)
Uysal, Selcuk Can
In this research, MATLAB Simulink was used to develop a cooled engine model for industrial gas turbines and aero-engines. The model consists of an uncooled on-design, mean-line turbomachinery design and a cooled off-design analysis in order to evaluate the engine performance parameters by using operating conditions, polytropic efficiencies, material information and cooling system details. The cooling analysis algorithm involves a second-law analysis to calculate losses from the cooling technique applied. The model is used in a sensitivity analysis that evaluates the impacts of variations in metal Biot number, thermal barrier coating Biot number, film cooling effectiveness, internal cooling effectiveness and maximum allowable blade temperature on the main engine performance parameters of aero and industrial gas turbine engines. The model is subsequently used to analyze the relative performance impact of employing anti-vortex film cooling holes (AVH) by means of data obtained for these holes by detached eddy simulation CFD techniques that are valid for engine-like turbulence intensity conditions. Cooled blade configurations with AVH and other external cooling techniques were used in a performance comparison study. (Abstract shortened by ProQuest.)
[Balanced scorecard for performance measurement of a nursing organization in a Korean hospital].
Hong, Yoonmi; Hwang, Kyung Ja; Kim, Mi Ja; Park, Chang Gi
2008-02-01
The purpose of this study was to develop a balanced scorecard (BSC) for performance measurement of a Korean hospital nursing organization and to evaluate the validity and reliability of the performance measurement indicators. Two hundred fifty-nine nurses in a Korean hospital participated in a survey questionnaire that included 29 performance evaluation indicators developed by the investigators of this study based on Kaplan and Norton's BSC (1992). Cronbach's alpha was used to test the reliability of the BSC. Exploratory and confirmatory factor analysis with a structural equation model (SEM) was applied to assess the construct validity of the BSC. Cronbach's alpha for the 29 items was .948. Factor analysis of the BSC showed 5 principal components (eigenvalue > 1.0) which explained 62.7% of the total variance, including a new one, community service. The SEM analysis results showed that the 5 components were significant for the hospital BSC tool. The high degree of reliability and validity of this BSC suggests that it may be used for performance measurement of a Korean hospital nursing organization. Future studies may consider including a balanced number of nurse managers and staff nurses. Further data analysis on the relationships among factors is recommended.
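Cronbach's alpha for a multi-item instrument is a short computation over the item variances. The simulated 259 × 29 response matrix below mirrors the study's dimensions but is synthetic, not the survey data:

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(259, 1))                   # common performance factor
survey = latent + 0.6 * rng.normal(size=(259, 29))   # 29 indicators, as in the BSC
print(f"alpha = {cronbach_alpha(survey):.3f}")
```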
NASA Astrophysics Data System (ADS)
Nakashima, Hiroshi; Takatsu, Yuzuru; Shinone, Hisanori; Matsukawa, Hisao; Kasetani, Takahiro
Soil-tire system interaction is a fundamental and important research topic in terramechanics. We applied a 2D finite element, discrete element method (FE-DEM), using FEM for the tire and the bottom soil layer and DEM for the surface soil layer. Satisfactory performance analysis was achieved. In this study, to clarify the capabilities and limitations of the method for soil-tire interaction analysis, the tractive performance of real automobile tires with two different tread patterns—smooth and grooved—was analyzed by FE-DEM, and the numerical results compared with the experimental results obtained using an indoor traction measurement system. The analysis of tractive performance could be performed with sufficient accuracy by the proposed 2D dynamic FE-DEM. FE-DEM obtained larger drawbar pull for a tire with a grooved tread pattern, which was verified by the experimental results. Moreover, the result for the grooved tire showed almost the same gross tractive effort and similar running resistance as in experiments. However, for a tire with smooth tread pattern, the analyzed gross tractive effort and running resistance behaved differently than the experimental results, largely due to the difference in tire sinkage in FE-DEM.
NASA Astrophysics Data System (ADS)
Liu, Yande; Ying, Yibin; Lu, Huishan; Fu, Xiaping
2005-11-01
A new method is proposed to eliminate varying background and noise simultaneously for multivariate calibration of Fourier transform near-infrared (FT-NIR) spectral signals. An ideal spectrum signal prototype was constructed based on the FT-NIR spectrum of fruit sugar content measurement. The performances of wavelet-based threshold de-noising approaches with different combinations of wavelet base functions were compared. Three families of wavelet base functions (Daubechies, Symlets and Coiflets) were applied to evaluate the wavelet bases and threshold selection rules in a series of experiments. The experimental results show that the best de-noising performance is reached with the Daubechies 4 or Symlet 4 wavelet base functions. Based on the optimized parameters, wavelet regression models for the sugar content of pear were also developed and resulted in a smaller prediction error than a traditional partial least squares regression (PLSR) model.
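As an illustration of the de-noising step described above, the following minimal sketch soft-thresholds the detail coefficients of a 'db4' (Daubechies 4) decomposition using the PyWavelets library. The synthetic Gaussian "spectrum" and the universal-threshold rule are assumptions for demonstration, not the paper's exact settings.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold wavelet de-noising with a universal threshold."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise sigma estimated from the finest detail coefficients (MAD rule)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    uthresh = sigma * np.sqrt(2.0 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, uthresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Hypothetical noisy absorption band on 1024 sample points
x = np.linspace(0, 1, 1024)
spectrum = np.exp(-((x - 0.5) ** 2) / 0.01) + 0.05 * np.random.randn(1024)
clean = wavelet_denoise(spectrum, wavelet="db4")
```

Swapping "db4" for "sym4" reproduces the other base function the study found best.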
Evaluating the interior thermal performance of mosques in the tropical environment
NASA Astrophysics Data System (ADS)
Nordin, N. I.; Misni, A.
2018-02-01
This study introduces the methodology applied in data collection and data analysis. Data collection is the process of gathering and measuring information on targeted variables in an established systematic manner. Qualitative and quantitative methods are combined in collecting data from government departments, site experiments and observation. The indoor thermal performance data for the heritage and new mosques were gathered using thermal monitoring tests and validated against meteorological data. Origin 8 software was used to analyse all the data. Comparison techniques were applied to analyse several factors that influence the indoor thermal performance of mosques, namely building envelope characteristics including floor area, openings, and materials used. Building orientation, location, surrounding vegetation and water elements were also recorded as supporting primary data. A comparison of the primary data on these variables for four mosques, covering both heritage and new buildings, is presented.
Meyer, Linda
2002-03-01
This study examined the antecedents and determinants predictive of whether nursing students (N = 92) intend to ask for assignments to perform nursing behaviors after using a database to record essential clinical behaviors. The results of applying the theory of planned behavior (TPB) to behavioral intention using multivariate path analysis suggested that the endogenous variables, attitude and subjective norms, had a significant effect on the intention to ask for assignments to perform nursing behaviors. In addition, it was primarily through attitudes and subjective norms that the respective antecedents or exogenous variables, behavioral beliefs and normative beliefs, affected the intention to ask for assignments to perform nursing behaviors. The lack of direct influence of perceived behavioral control on intention and the direct negative impact of control belief on intention were contrary to expectations, given the tenets of the TPB.
NASA Astrophysics Data System (ADS)
von Larcher, Thomas; Harlander, Uwe; Alexandrov, Kiril; Wang, Yongtai
2010-05-01
Experiments on baroclinic wave instabilities in a rotating cylindrical gap have long been performed, e.g., to reveal regular waves of different zonal wave numbers, to better understand the transition to the quasi-chaotic regime, and to uncover the underlying dynamical processes of complex wave flows. We present the application of appropriate multivariate data analysis methods to time series data sets acquired with non-intrusive measurement techniques of quite different natures. While highly accurate Laser Doppler velocimetry (LDV) is used for measurements of the radial velocity component at equidistant azimuthal positions, a highly sensitive thermographic camera measures the surface temperature field. The measurements are performed at particular parameter points where our former studies showed that complex wave patterns occur [1, 2]. Owing to the particular measurement techniques, the temperature data set has much more information content than the velocity data set. Both sets of time series data are analyzed using multivariate statistical techniques. While the LDV data sets are studied by applying Multi-Channel Singular Spectrum Analysis (M-SSA), the temperature data sets are analyzed by applying Empirical Orthogonal Functions (EOF). Our goals are (a) to verify the results yielded by the analysis of the velocity data and (b) to compare the data analysis methods. Therefore, the temperature data are processed to become comparable to the LDV data, i.e., the data set is reduced as if the temperature measurements had been performed at equidistant azimuthal positions only. This approach initially results in a great loss of information, but applying the M-SSA to the reduced temperature data sets enables us to compare the methods. [1] Th. von Larcher and C. Egbers, Experiments on transitions of baroclinic waves in a differentially heated rotating annulus, Nonlinear Processes in Geophysics, 12, 1033-1041, 2005, NPG Print: ISSN 1023-5809, NPG Online: ISSN 1607-7946. [2] U. Harlander, Th. von Larcher, Y. Wang and C. Egbers, PIV- and LDV-measurements of baroclinic wave interactions in a thermally driven rotating annulus, Experiments in Fluids, 2009, DOI: 10.1007/s00348-009-0792-5.
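For context, an EOF analysis of the kind mentioned above reduces to a singular value decomposition of the time-by-space anomaly matrix. The sketch below is a minimal illustration on a synthetic azimuthal wave field; the 500 x 64 array and the wavenumber-3 signal are assumptions, not the experimental data.

```python
import numpy as np

def eof_analysis(field, n_modes=3):
    """EOFs of a (time, space) data matrix via SVD of the anomalies."""
    anomalies = field - field.mean(axis=0)      # remove the time mean
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    pcs = u[:, :n_modes] * s[:n_modes]          # principal components (time series)
    eofs = vt[:n_modes]                         # spatial patterns
    explained = s**2 / np.sum(s**2)             # variance fraction per mode
    return pcs, eofs, explained[:n_modes]

# Hypothetical surface-temperature record: 500 time steps x 64 azimuthal positions
rng = np.random.default_rng(1)
t = np.linspace(0, 50, 500)[:, None]
phi = np.linspace(0, 2 * np.pi, 64)[None, :]
field = np.cos(3 * phi - 2 * np.pi * 0.1 * t) + 0.1 * rng.standard_normal((500, 64))
pcs, eofs, var = eof_analysis(field)  # leading pair of modes captures the drifting wave
```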
Steam generator tubing NDE performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henry, G.; Welty, C.S. Jr.
1997-02-01
Steam generator (SG) non-destructive examination (NDE) is a fundamental element in the broader SG in-service inspection (ISI) process, a cornerstone in the management of PWR steam generators. Based on objective performance measures (tube leak forced outages and SG-related capacity factor loss), ISI performance has shown a continually improving trend over the years. Performance of the NDE element is a function of the fundamental capability of the technique, and the ability of the analysis portion of the process in field implementation of the technique. The technology continues to improve in several areas, e.g. system sensitivity, data collection rates, probe/coil design, and data analysis software. With these improvements comes the attendant requirement for qualification of the technique on the damage form(s) to which it will be applied, and for training and qualification of the data analysis element of the ISI process on the field implementation of the technique. The introduction of data transfer via fiber optic line allows for remote data acquisition and analysis, thus improving the efficiency of analysis for a limited pool of data analysts. This paper provides an overview of the current status of SG NDE, and identifies several important issues to be addressed.
Yan, Wenying; Han, Qingjie; Guo, Panpan; Wang, Chunying; Zhang, Zijian
2016-01-01
Abri Herba has remarkable properties, such as clearing heat and detoxifying, eliminating dampness, and activating blood circulation to dissipate blood stasis; as a result, it has been applied to treat acute or chronic hepatitis and mastitis. Abri mollis Herba is often used as Abri Herba. Hierarchical cluster analysis (HCA) was applied to compare the similarities and differences of the chemical compositions of the two types of medicinal materials. The aim was to establish a high-performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) method for the simultaneous analysis of 15 flavonoids, two phenolic acids and three alkaloids in Abri Herba and Abri mollis Herba. The chromatographic separation was performed on a C18 column with a mobile phase of methanol (A), acetonitrile (B) and 0.5‰ acetic acid in water (C) using gradient elution. The detection of the target compounds was performed in multiple-reaction monitoring (MRM) mode using a hybrid quadrupole linear ion trap mass spectrometer equipped with a positive/negative ion-switching electrospray ionisation (ESI) source. The developed method is reliable, sensitive and specific. In addition, the method has been successfully applied to differentiate 15 batches of Abri Herba and 27 batches of Abri mollis Herba stems. Furthermore, a comparison of the contents among stems, roots and leaves from the same strain in seven batches of Abri mollis Herba and four batches of Abri Herba has also been performed. The HPLC-MS/MS method is sensitive and selective and is suitable for the reliable quality control of Abri mollis Herba and Abri Herba. Copyright © 2015 John Wiley & Sons, Ltd.
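Hierarchical cluster analysis of the kind used here operates on a batches-by-components concentration matrix. A minimal SciPy sketch follows; the random two-population matrix is a stand-in for measured component contents and is purely hypothetical.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical matrix: rows = herb batches, columns = quantified components
rng = np.random.default_rng(2)
abri = rng.normal(1.0, 0.1, size=(15, 20))         # e.g. Abri Herba batches
abri_mollis = rng.normal(1.4, 0.1, size=(27, 20))  # e.g. Abri mollis Herba batches
X = np.vstack([abri, abri_mollis])

Z = linkage(X, method="ward")                  # Ward linkage on Euclidean distances
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)  # batches grouped into two chemically similar clusters
```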
Applying systems ergonomics methods in sport: A systematic review.
Hulme, Adam; Thompson, Jason; Plant, Katherine L; Read, Gemma J M; Mclean, Scott; Clacy, Amanda; Salmon, Paul M
2018-04-16
As sports systems become increasingly more complex, competitive, and technology-centric, there is a greater need for systems ergonomics methods to consider the performance, health, and safety of athletes in context with the wider settings in which they operate. Therefore, the purpose of this systematic review was to identify and critically evaluate studies which have applied a systems ergonomics research approach in the context of sports performance and injury management. Five databases (PubMed, Scopus, ScienceDirect, Web of Science, and SPORTDiscus) were searched for the dates 01 January 1990 to 01 August 2017, inclusive, for original peer-reviewed journal articles and conference papers. Reported analyses were underpinned by a recognised systems ergonomics method, and study aims were related to the optimisation of sports performance (e.g. communication, playing style, technique, tactics, or equipment), and/or the management of sports injury (i.e. identification, prevention, or treatment). A total of seven articles were identified. Two articles were focussed on understanding and optimising sports performance, whereas five examined sports injury management. The methods used were the Event Analysis of Systemic Teamwork, Cognitive Work Analysis (the Work Domain Analysis Abstraction Hierarchy), Rasmussen's Risk Management Framework, and the Systems Theoretic Accident Model and Processes method. The individual sport application was distance running, whereas the team sports contexts examined were cycling, football, Australian Football League, and rugby union. The included systems ergonomics applications were highly flexible, covering both amateur and elite sports contexts. The studies were rated as valuable, providing descriptions of injury controls and causation, the factors influencing injury management, the allocation of responsibilities for injury prevention, as well as the factors and their interactions underpinning sports performance. Implications and future directions for research are described. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Luthfie, A. A.; Pratiwi, S. E.; Hidayatulloh, P.
2018-03-01
Indonesia is a country with abundant renewable energy resources, comprising water, solar, geothermal, wind, bioenergy, and ocean energy. Utilization of water energy through micro hydro power (MHP) plants is widely applied in remote areas in Indonesia. This utilization requires a water-converting device known as a water turbine. Rosefsky (2010) developed a water turbine known as the Hydrocoil turbine. This turbine is an axial turbine that is a modification of the screw turbine. It has a pitch length that decreases in the direction of the water flow and is able to work at relatively low water flow and head. The Hydrocoil turbine has not been widely applied in Indonesia; therefore, this research focuses on analyzing its performance. The analysis was performed using the Computational Fluid Dynamics (CFD) method at heads of 3 m, 4 m, and 5 m and at rotational speeds of 100 rpm, 300 rpm, 500 rpm, 700 rpm, 900 rpm, 1,100 rpm, 1,300 rpm, 1,500 rpm, 1,700 rpm, and 1,900 rpm. Based on the simulation results, the largest power generated by the turbine at 3 m head is 1,134.06 W, while at 4 m and 5 m it is 1,722.39 W and 2,231.49 W respectively. The largest turbine efficiency at 3 m head is 93.22%, while at 4 m and 5 m head it is 94.6% and 89.88% respectively. The results also show that the larger the head, the greater the operational rotational speed range.
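As a reading aid, turbine efficiency in such studies is the ratio of shaft power to the available hydraulic power rho*g*Q*H. The sketch below computes this ratio; the flow rate of 0.0413 m^3/s is an assumption chosen so that the reported 1,134.06 W at 3 m head reproduces roughly the reported 93.22% efficiency.

```python
RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def turbine_efficiency(power_w, head_m, flow_m3s):
    """Hydraulic efficiency = shaft power / available water power."""
    return power_w / (RHO * G * flow_m3s * head_m)

# Assumed flow rate; not reported in the abstract above
print(turbine_efficiency(1134.06, 3.0, 0.0413))  # ~0.93
```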
Vasa previa screening strategies: a decision and cost-effectiveness analysis.
Sinkey, R G; Odibo, A O
2018-05-22
The aim of this study is to perform a decision and cost-effectiveness analysis comparing four screening strategies for the antenatal diagnosis of vasa previa among singleton pregnancies. A decision-analytic model was constructed comparing vasa previa screening strategies. Published probabilities and costs were applied to four transvaginal screening scenarios which occurred at the time of mid-trimester ultrasound: no screening, ultrasound-indicated screening, screening pregnancies conceived by in vitro fertilization (IVF), and universal screening. Ultrasound-indicated screening was defined as performing a transvaginal ultrasound at the time of routine anatomy ultrasound in response to one of the following sonographic findings associated with an increased risk of vasa previa: low-lying placenta, marginal or velamentous cord insertion, or bilobed or succenturiate lobed placenta. The primary outcome was cost per quality adjusted life year (QALY) in U.S. dollars. The analysis was from a healthcare system perspective with a willingness to pay (WTP) threshold of $100,000 per QALY selected. One-way and multivariate sensitivity analyses (Monte-Carlo simulation) were performed. This decision-analytic model demonstrated that screening pregnancies conceived by IVF was the most cost-effective strategy with an incremental cost-effectiveness ratio (ICER) of $29,186.50/QALY. Ultrasound-indicated screening was the second most cost-effective with an ICER of $56,096.77/QALY. These results were robust to all one-way and multivariate sensitivity analyses performed. Within our baseline assumptions, transvaginal ultrasound screening for vasa previa appears to be most cost-effective when performed among IVF pregnancies. However, both IVF and ultrasound-indicated screening strategies fall within contemporary willingness-to-pay thresholds, suggesting that both strategies may be appropriate to apply in clinical practice. This article is protected by copyright. All rights reserved.
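The decision metric used above is the incremental cost-effectiveness ratio, ICER = (C1 - C0) / (QALY1 - QALY0), compared against the willingness-to-pay threshold. A minimal sketch follows; the cost and QALY figures are invented for illustration and are not the study's inputs.

```python
def icer(cost_new, cost_base, qaly_new, qaly_base):
    """Incremental cost-effectiveness ratio in $ per QALY gained."""
    return (cost_new - cost_base) / (qaly_new - qaly_base)

WTP = 100_000  # willingness-to-pay threshold, $/QALY, as in the study

# Hypothetical strategy values for illustration only (not from the study)
value = icer(cost_new=1_250_000, cost_base=1_100_000,
             qaly_new=30.0, qaly_base=25.0)          # -> 30,000 $/QALY
print(value, "cost-effective" if value <= WTP else "not cost-effective")
```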
Performance assessment in algebra learning process
NASA Astrophysics Data System (ADS)
Lestariani, Ida; Sujadi, Imam; Pramudya, Ikrar
2017-12-01
The purpose of this research is to describe the implementation of performance assessment in the algebra learning process. The subject of this research is a class X mathematics educator at SMAN 1 Ngawi. This is a descriptive qualitative study. Data were collected through observation, interviews, and documentation. Data analysis was performed by data reduction, data presentation, and conclusion drawing. The results indicated that the steps taken by the educator in applying performance assessment are 1) preparing individual worksheets and group worksheets, 2) preparing assessment rubrics for the individual and group worksheets, and 3) applying the performance assessment rubrics to learners' results on individual or group tasks.
Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Wucherl; Koo, Michelle; Cao, Yu
Big data is prevalent in HPC. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze performance when terabytes or petabytes of workflow data or execution measurement data are involved, from complex workflows over a large number of nodes and multiple parallel task executions. To help identify performance bottlenecks or debug the performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework, using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply the most sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from the genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workflows.
Yi, YaXiong; Zhang, Yong; Ding, Yue; Lu, Lu; Zhang, Tong; Zhao, Yuan; Xu, XiaoJun; Zhang, YuXin
2016-11-01
J. Sep. Sci. 2016, 39, 4147-4157, DOI: 10.1002/jssc.201600284. Yinchenhao decoction (YCHD) is a famous Chinese herbal formula recorded in the Shang Han Lun, prescribed by Zhongjing Zhang (150-219 AD). A novel quantitative analysis method was developed based on ultrahigh performance liquid chromatography coupled with a diode array detector for the simultaneous determination of 14 main active components in Yinchenhao decoction. Furthermore, the method was applied to analyze compositional differences in the 14 components across eight normal extraction samples of Yinchenhao decoction, with the aid of hierarchical clustering analysis and similarity analysis. The present research could help hospitals, factories and laboratories choose the best way to prepare Yinchenhao decoction with better efficacy. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Liang, Xianrui; Zhao, Cui; Su, Weike
2015-11-01
An ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry method integrating multi-constituent determination and fingerprint analysis has been established for quality assessment and control of Scutellaria indica L. The optimized method is fast and efficient, allowing multi-constituent determination and fingerprint analysis in one chromatographic run within 11 min. Thirty-six compounds were detected, and 23 of them were unequivocally identified or tentatively assigned. The established fingerprint method was applied to the analysis of ten S. indica samples from different geographic locations. The quality assessment was achieved using principal component analysis. The proposed method is useful and reliable for the characterization of multi-constituents in a complex chemical system and the overall quality assessment of S. indica. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
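Principal component analysis as used for quality assessment here projects each sample's constituent profile onto a few orthogonal axes of maximal variance, so samples of similar composition fall close together. A minimal scikit-learn sketch follows; the 10 x 23 log-normal peak-area matrix is hypothetical.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical peak-area matrix: 10 S. indica samples x 23 identified compounds
rng = np.random.default_rng(3)
peak_areas = rng.lognormal(mean=1.0, sigma=0.3, size=(10, 23))

pca = PCA(n_components=2)
scores = pca.fit_transform(peak_areas)   # sample coordinates on PC1/PC2
print(pca.explained_variance_ratio_)     # variance captured by each component
print(scores)                            # geographic groups would separate here
```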
Model-Based Fault Diagnosis: Performing Root Cause and Impact Analyses in Real Time
NASA Technical Reports Server (NTRS)
Figueroa, Jorge F.; Walker, Mark G.; Kapadia, Ravi; Morris, Jonathan
2012-01-01
Generic, object-oriented fault models, built according to causal-directed graph theory, have been integrated into an overall software architecture dedicated to monitoring and predicting the health of mission-critical systems. Processing over the generic fault models is triggered by event detection logic that is defined according to the specific functional requirements of the system and its components. Once triggered, the fault models provide an automated way for performing both upstream root cause analysis (RCA), and for predicting downstream effects or impact analysis. The methodology has been applied to integrated system health management (ISHM) implementations at NASA SSC's Rocket Engine Test Stands (RETS).
Deductive Evaluation: Formal Code Analysis With Low User Burden
NASA Technical Reports Server (NTRS)
Di Vito, Ben L.
2016-01-01
We describe a framework for symbolically evaluating iterative C code using a deductive approach that automatically discovers and proves program properties. Although verification is not performed, the method can infer detailed program behavior. Software engineering work flows could be enhanced by this type of analysis. Floyd-Hoare verification principles are applied to synthesize loop invariants, using a library of iteration-specific deductive knowledge. When needed, theorem proving is interleaved with evaluation and performed on the fly. Evaluation results take the form of inferred expressions and type constraints for values of program variables. An implementation using PVS (Prototype Verification System) is presented along with results for sample C functions.
Exploratory reconstructability analysis of accident TBI data
NASA Astrophysics Data System (ADS)
Zwick, Martin; Carney, Nancy; Nettleton, Rosemary
2018-02-01
This paper describes the use of reconstructability analysis to perform a secondary study of traumatic brain injury data from automobile accidents. Neutral searches were done and their results displayed with a hypergraph. Directed searches, using both variable-based and state-based models, were applied to predict performance on two cognitive tests and one neurological test. Very simple state-based models gave large uncertainty reductions for all three DVs and sizeable improvements in percent correct for the two cognitive test DVs which were equally sampled. Conditional probability distributions for these models are easily visualized with simple decision trees. Confounding variables and counter-intuitive findings are also reported.
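In reconstructability analysis, the strength of a directed model is commonly reported as the fractional reduction in Shannon uncertainty (entropy) of the DV given the IV states. A minimal sketch of that computation follows; the 3 x 2 contingency table is invented for illustration.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def uncertainty_reduction(joint):
    """(H(DV) - H(DV|IV)) / H(DV) for a joint IV-state x DV-state count table."""
    joint = joint / joint.sum()
    p_dv = joint.sum(axis=0)                 # DV marginal
    p_iv = joint.sum(axis=1)                 # IV-state marginal
    h_dv = entropy(p_dv)
    h_dv_given_iv = entropy(joint.ravel()) - entropy(p_iv)  # H(IV,DV) - H(IV)
    return (h_dv - h_dv_given_iv) / h_dv

# Hypothetical table: 3 predictor states (rows) x 2 outcome states (columns)
table = np.array([[30.0, 5.0], [10.0, 20.0], [5.0, 30.0]])
print(uncertainty_reduction(table))  # fraction of DV uncertainty explained
```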
Measurement uncertainty analysis techniques applied to PV performance measurements
NASA Astrophysics Data System (ADS)
Wells, C.
1992-10-01
The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
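The core arithmetic of such an analysis is the root-sum-square combination of the standard uncertainties of the independent error sources, expanded by a coverage factor k (k = 2 gives roughly a 95% interval). A minimal sketch follows; the four uncertainty components for a PV module power measurement are invented for illustration.

```python
import numpy as np

def expanded_uncertainty(u_components, k=2.0):
    """Root-sum-square combination of standard uncertainties, expanded by k."""
    u_c = np.sqrt(np.sum(np.square(u_components)))
    return k * u_c

# Hypothetical uncertainty budget for a module power measurement (watts):
# irradiance sensor, temperature correction, DAQ calibration, repeatability
u = [0.8, 0.5, 0.3, 0.4]
U = expanded_uncertainty(u, k=2.0)   # half-width of the ~95% coverage interval
print(f"P = P_measured +/- {U:.2f} W")
```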
NASA Technical Reports Server (NTRS)
Levison, William H.
1988-01-01
This study explored the application of a closed-loop pilot/simulator model to the analysis of some simulator fidelity issues. The model was applied to two databases: (1) a NASA ground-based simulation of an air-to-air tracking task in which nonvisual cueing devices were explored, and (2) a ground-based and in-flight study performed by the Calspan Corporation to explore the effects of simulator delay on attitude tracking performance. The model predicted the major performance trends obtained in both studies. A combined analytical and experimental procedure for exploring simulator fidelity issues is outlined.
Adaptive automation of human-machine system information-processing functions.
Kaber, David B; Wright, Melanie C; Prinzel, Lawrence J; Clamann, Michael P
2005-01-01
The goal of this research was to describe the ability of human operators to interact with adaptive automation (AA) applied to various stages of complex systems information processing, defined in a model of human-automation interaction. Forty participants operated a simulation of an air traffic control task. Automated assistance was adaptively applied to information acquisition, information analysis, decision making, and action implementation aspects of the task based on operator workload states, which were measured using a secondary task. The differential effects of the forms of automation were determined and compared with a manual control condition. Results of two 20-min trials of AA or manual control revealed a significant effect of the type of automation on performance, particularly during manual control periods as part of the adaptive conditions. Humans appear to better adapt to AA applied to sensory and psychomotor information-processing functions (action implementation) than to AA applied to cognitive functions (information analysis and decision making), and AA is superior to completely manual control. Potential applications of this research include the design of automation to support air traffic controller information processing.
Garten, Justin; Hoover, Joe; Johnson, Kate M; Boghrati, Reihane; Iskiwitch, Carol; Dehghani, Morteza
2018-02-01
Theory-driven text analysis has made extensive use of psychological concept dictionaries, leading to a wide range of important results. These dictionaries have generally been applied through word count methods which have proven to be both simple and effective. In this paper, we introduce Distributed Dictionary Representations (DDR), a method that applies psychological dictionaries using semantic similarity rather than word counts. This allows for the measurement of the similarity between dictionaries and spans of text ranging from complete documents to individual words. We show how DDR enables dictionary authors to place greater emphasis on construct validity without sacrificing linguistic coverage. We further demonstrate the benefits of DDR on two real-world tasks and finally conduct an extensive study of the interaction between dictionary size and task performance. These studies allow us to examine how DDR and word count methods complement one another as tools for applying concept dictionaries and where each is best applied. Finally, we provide references to tools and resources to make this method both available and accessible to a broad psychological audience.
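The essence of DDR as described above is to represent both a concept dictionary and a span of text as the average of their word embeddings and to score their cosine similarity. A minimal sketch follows; the random 300-dimensional vectors stand in for pretrained embedding lookups (e.g., word2vec or GloVe) and are purely hypothetical.

```python
import numpy as np

def ddr_score(dictionary_vecs, text_vecs):
    """Cosine similarity between averaged dictionary and text word vectors."""
    d = np.mean(dictionary_vecs, axis=0)
    t = np.mean(text_vecs, axis=0)
    return float(d @ t / (np.linalg.norm(d) * np.linalg.norm(t)))

# Hypothetical 300-d embeddings; in practice, each dictionary seed word and
# each word of the document would be looked up in a pretrained embedding model
rng = np.random.default_rng(4)
dict_vecs = rng.standard_normal((8, 300))    # seed words of one concept
text_vecs = rng.standard_normal((120, 300))  # all words of one document
print(ddr_score(dict_vecs, text_vecs))
```

Because the score is continuous, it applies equally to a full document or a single word, which is the coverage advantage over word counting noted in the abstract.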
Multilevel microvibration test for performance predictions of a space optical load platform
NASA Astrophysics Data System (ADS)
Li, Shiqi; Zhang, Heng; Liu, Shiping; Wang, Yue
2018-05-01
This paper presents a framework for the multilevel microvibration analysis and test of a space optical load platform. The test framework is conducted on three levels: instrument, subsystem, and system. Disturbance source experimental investigations are performed to evaluate the vibration amplitude and study the vibration mechanism. Transfer characteristics of the space camera are validated by a subsystem test, which allows the calculation of transfer functions from various disturbance sources to optical performance outputs. In order to identify the influence of the source on spacecraft performance, a system-level microvibration measurement test has been performed on the ground. From the time domain and spectrum analyses of the multilevel microvibration tests, we conclude that the disturbance source has a significant effect at its installation position. After transmission through mechanical links, the residual vibration is reduced to a background noise level. In addition, the angular microvibration of the platform jitter is mainly concentrated in rotation about the y-axis. This framework is applied to a practical application involving a high-resolution satellite camera system.
Time Series Imputation via L1 Norm-Based Singular Spectrum Analysis
NASA Astrophysics Data System (ADS)
Kalantari, Mahdi; Yarmohammadi, Masoud; Hassani, Hossein; Silva, Emmanuel Sirimal
Missing values in time series data are a well-known and important problem which many researchers have studied extensively in various fields. In this paper, a new nonparametric approach for missing value imputation in time series is proposed. The main novelty of this research is applying the L1 norm-based version of Singular Spectrum Analysis (SSA), namely L1-SSA, which is robust against outliers. The performance of the new imputation method has been compared with many other established methods. The comparison is done by applying them to various real and simulated time series. The obtained results confirm that the SSA-based methods, especially L1-SSA, can provide better imputation in comparison to other methods.
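For readers new to SSA, the method embeds the series into a trajectory matrix, truncates its decomposition to a low rank, and diagonally averages back to a smoothed series; imputation then iterates this reconstruction over the gaps. The sketch below shows the standard L2 (SVD) variant only; the paper's L1-SSA replaces the decomposition with an outlier-robust L1 version, and the window, rank, and signal here are assumptions.

```python
import numpy as np

def ssa_reconstruct(series, window, rank):
    """Rank-r SSA approximation of a 1D series (standard L2/SVD variant)."""
    n = len(series)
    k = n - window + 1
    X = np.column_stack([series[i:i + window] for i in range(k)])  # trajectory matrix
    u, s, vt = np.linalg.svd(X, full_matrices=False)
    Xr = (u[:, :rank] * s[:rank]) @ vt[:rank]
    # Diagonal averaging (Hankelization) back to a series
    rec = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        rec[j:j + window] += Xr[:, j]
        counts[j:j + window] += 1
    return rec / counts

# Imputation idea: fill gaps with an initial guess, then iterate the
# reconstruction until the imputed values stabilize.
t = np.linspace(0, 8 * np.pi, 200)
y = np.sin(t) + 0.1 * np.random.randn(200)
approx = ssa_reconstruct(y, window=40, rank=2)
```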
NASA Technical Reports Server (NTRS)
Garcia-Espada, Susana; Haas, Rudiger; Colomer, Francisco
2010-01-01
An important limitation on the precision of the results obtained by space geodetic techniques like VLBI and GPS is tropospheric delays caused by the neutral atmosphere, see e.g. [1]. In recent years numerical weather models (NWM) have been applied to improve the mapping functions which are used for tropospheric delay modeling in VLBI and GPS data analyses. In this manuscript we use raytracing to calculate slant delays and apply these to the analysis of European VLBI data. The raytracing is performed through the limited-area numerical weather prediction (NWP) model HIRLAM. The advantages of this model are high spatial resolution (0.2 deg. x 0.2 deg.) and high temporal resolution (three hours in prediction mode).
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-22
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Onshore Hazardous Liquid Low-Stress Lines AGENCY: Pipeline and Hazardous Materials Safety Administration... pipelines to perform a complete ``could affect'' analysis to determine which rural low-stress pipeline...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-05
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... to All Rural Onshore Hazardous Liquid Low-Stress Lines AGENCY: Pipeline and Hazardous Materials... burdensome to require operators of these pipelines to perform a complete ``could affect'' analysis to...
Wear studies made of slip rings and gas bearing components
NASA Technical Reports Server (NTRS)
Furr, A. K.
1967-01-01
Neutron activation analysis techniques were employed for the study of the wear and performance characteristics of slip ring and rotor assemblies and of the problems arising from environmental conditions with special reference to surface contamination. Results showed that the techniques could be successfully applied to measurement of wear parameters.
The Relationship between Performance and Satisfaction: A Utility Analysis.
1985-03-01
Journal of Applied Psychology, 52, 343-347. Korman, A.K. (1980). Career success/personal failure. Englewood Cliffs, N.J.: Prentice Hall. Lawler...this extensively in his book Career Success/Personal Failure (1980). Korman quotes from Matters (1976, p. 124): "After you've sold widgets for twenty
ERIC Educational Resources Information Center
Ribeiro, Maria Miguel; Hoover, Elona; Burford, Gemma; Buchebner, Julia; Lindenthal, Thomas
2016-01-01
Purpose: The purpose of this paper is to illustrate that values-focused assessment can provide a useful lens for integrating sustainability and institutional performance assessment in universities. Design/methodology/approach: This study applies a values elicitation methodology for indicator development, through thematic analysis of…
Politicizing Articulation: Applying Lyotard's Work to the Use of Standards in Educational Leadership
ERIC Educational Resources Information Center
Niesche, Richard
2013-01-01
This paper presents a case for the importance of an application of Jean-Francois Lyotard's ideas to the analysis of educational leadership. Through exploring Lyotard's concepts of "language games", the "differend" and "performativity", this paper argues that the approach taken through the development of leadership…
DEAN: A Program for Dynamic Engine Analysis.
1985-01-01
hardware and memory limitations. DIGTEM (ref. 4), a recently written code, allows steady-state as well as transient calculations to be performed. DIGTEM has...Computer Program for Generating Dynamic Turbofan Engine Models (DIGTEM)," NASA TM-83446. 5. Carnahan, B., Luther, H.A., and Wilkes, J.O., Applied Numerical
Nasiripour, Amir Ashkan; Toloie-Ashlaghy, Abbas; Ta-Bibi, Seyed Jamaleddin; Maleki, Mohammad Reza; Gorji, Hassan Abolghasem
2014-01-01
Universities of Medical Science and Health Services (UMSHSs) are among the main organizations in Iran's health-care sector. Improving their efficiency in financial resource management through creating an appropriate coordination between consumption and resources is strategically vital. The research objective is to investigate the financial performance of the Iranian UMSHSs and to rank them. The study is of a descriptive and applied type. The study population includes the UMSHSs of Iran (n=42), among which 24 UMSHSs were selected. DEA is used to model and assess financial performance, with 4 inputs and 3 outputs. Also, linear regression is applied to determine the effectiveness of the applied indices as well as the level of financial performance. Data were obtained from the Budgeting Center of the Ministry of Health and Medical Education for 2010, mainly through forms designed based on the available balance sheets. The average financial performance score for the UMSHSs based on input-oriented DEA is 0.74, assuming constant returns to scale (DEA-CRS). Thus, approximately 25% of the studied UMSHSs have maximum relative performance, and in total there is about a 30% capacity to increase financial performance in these UMSHSs. Most Iranian UMSHSs do not have high financial performance. This can be due to problems in financial resource management, especially in asset combining. Therefore, the compilation and execution of a comprehensive program for organizational change and agility, aimed at creating an optimized combination of resources and assets, is strongly recommended.
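An input-oriented, constant-returns-to-scale DEA score of the kind reported above can be computed per unit with a small linear program: minimize theta such that some nonnegative combination of all units uses at most theta times the evaluated unit's inputs while producing at least its outputs. A sketch with SciPy follows; the 24 x 4 input and 24 x 3 output matrices are random placeholders for the budget data.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0.
    X: (n_units, n_inputs), Y: (n_units, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                 # minimize theta over [theta, lambdas]
    # inputs:  sum_j lam_j * x_ij <= theta * x_i,j0
    A_in = np.c_[-X[j0].reshape(m, 1), X.T]
    # outputs: sum_j lam_j * y_rj >= y_r,j0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[j0]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]                             # 1.0 means relatively efficient

# Hypothetical data: 24 universities, 4 inputs, 3 outputs (as in the study design)
rng = np.random.default_rng(5)
X = rng.uniform(1, 10, size=(24, 4))
Y = rng.uniform(1, 10, size=(24, 3))
scores = [dea_ccr_input(X, Y, j) for j in range(24)]
```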
Thermal Analysis of a Disposable, Instrument-Free DNA Amplification Lab-on-a-Chip Platform.
Pardy, Tamás; Rang, Toomas; Tulp, Indrek
2018-06-04
Novel second-generation rapid diagnostics based on nucleic acid amplification tests (NAAT) offer performance metrics on par with clinical laboratories in detecting infectious diseases at the point of care. The diagnostic assay is typically performed within a Lab-on-a-Chip (LoC) component with integrated temperature regulation. However, constraints on device dimensions, cost and power supply inherent with the device format apply to temperature regulation as well. Thermal analysis on simplified thermal models for the device can help overcome these barriers by speeding up thermal optimization. In this work, we perform experimental thermal analysis on the simplified thermal model for our instrument-free, single-use LoC NAAT platform. The system is evaluated further by finite element modelling. Steady-state as well as transient thermal analysis are performed to evaluate the performance of a self-regulating polymer resin heating element in the proposed device geometry. Reaction volumes in the target temperature range of the amplification reaction are estimated in the simulated model to assess compliance with assay requirements. Using the proposed methodology, we demonstrated our NAAT device concept capable of performing loop-mediated isothermal amplification in the 20-25 °C ambient temperature range with 32 min total assay time.
Functional Characterization of Two Novel Human Prostate Cancer Metastasis Related Genes
2008-02-01
systems (27-29), a major leap in functional genomic investigation would be the ability to perform genetic subtractive analysis with in vivo-derived...been designed to detect and isolate different DNA sequences present in one complementary (31) or genomic (32) DNA library but absent in another. The...many disorders if applied correctly, the use of control specimens different from the native tissue for subtractive genomic analysis in some studies has
Morvannou, A; Forquet, N; Michel, S; Troesch, S; Molle, P
2015-01-01
Approximately 3,500 constructed wetlands (CWs) provide raw wastewater treatment in France for small communities (<5,000 people equivalent). Built during the past 30 years, most consist of two vertical flow constructed wetlands (VFCWs) in series (stages). Many configurations exist: systems associated with horizontal flow filters or waste stabilization ponds, vertical flow with recirculation, partially saturated systems, etc. A database analysis performed 10 years earlier summarized the global performance data for the classical French system. This paper provides a similar analysis of performance data from 415 full-scale two-stage VFCWs, based on an improved database expanded with monitoring data available from Irstea and the French technical department. The trends presented in the first study are confirmed, exhibiting high chemical oxygen demand (COD), total suspended solids (TSS) and total Kjeldahl nitrogen (TKN) removal rates (87%, 93% and 84%, respectively). Typical concentrations at the second-stage outlet are 74 mgCOD L(-1), 17 mgTSS L(-1) and 11 mgTKN L(-1). Pollutant removal performances are summarized in relation to the loads applied at the first treatment stage. While COD and TSS removal rates remain stable over the range of applied loads, the spread of TKN removal rates increases as applied loads increase.
Performance characteristics of LOX-H2, tangential-entry, swirl-coaxial, rocket injectors
NASA Technical Reports Server (NTRS)
Howell, Doug; Petersen, Eric; Clark, Jim
1993-01-01
Development of a high-performing swirl-coaxial injector requires an understanding of fundamental performance characteristics. This paper addresses the findings of cold-flow atomization characterization studies, which provided information on the influence of fluid properties and element operating conditions on the produced droplet sprays. These findings are applied to actual rocket conditions. The performance characteristics of swirl-coaxial injection elements under multi-element hot-fire conditions were obtained by analysis of combustion performance data from three separate test series. The injection elements are described and test results are analyzed using multivariable linear regression. A direct comparison of test results indicated that reduced fuel injection velocity improved injection element performance through improved propellant mixing.
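A multivariable linear regression of the kind used to analyze the hot-fire data fits a performance measure against several element operating variables at once. The NumPy sketch below illustrates the mechanics only; the predictor names, ranges, and coefficients are invented, and the synthetic data are built so that higher fuel injection velocity lowers efficiency, mirroring the reported trend.

```python
import numpy as np

# Hypothetical hot-fire records: combustion efficiency vs. element variables
rng = np.random.default_rng(6)
n = 40
fuel_velocity = rng.uniform(50, 200, n)    # injection velocity, m/s (assumed)
momentum_ratio = rng.uniform(0.5, 5.0, n)  # (assumed)
chamber_pressure = rng.uniform(3, 12, n)   # MPa (assumed)
eta_c = (0.98 - 0.0004 * fuel_velocity + 0.004 * momentum_ratio
         + 0.001 * chamber_pressure + 0.005 * rng.standard_normal(n))

# Least-squares fit of efficiency on the three predictors plus an intercept
A = np.column_stack([np.ones(n), fuel_velocity, momentum_ratio, chamber_pressure])
coef, *_ = np.linalg.lstsq(A, eta_c, rcond=None)
print(coef)  # negative velocity coefficient mirrors the reported trend
```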
Policy design and performance of emissions trading markets: an adaptive agent-based analysis.
Bing, Zhang; Qinqin, Yu; Jun, Bi
2010-08-01
Emissions trading is considered to be a cost-effective environmental economic instrument for pollution control. However, the pilot emissions trading programs in China have failed to bring remarkable success to the campaign for pollution control. The policy design of an emissions trading program is found to have a decisive impact on its performance. In this study, an artificial market for sulfur dioxide (SO2) emissions trading was constructed using an agent-based model, and the performance of the Jiangsu SO2 emissions trading market under different policy design scenarios was examined. Results show that the market efficiency of emissions trading is significantly affected by policy design and existing policies. China's coal-electricity price system is the principal factor influencing the performance of the SO2 emissions trading market. Transaction costs also reduce market efficiency. In addition, current-level emissions discharge fees/taxes and banking mechanisms do not distinctly affect policy performance. Thus, applying emissions trading to emission control in China should consider policy design and interaction with other existing policies.
NASA Astrophysics Data System (ADS)
Núñez, M.; Robie, T.; Vlachos, D. G.
2017-10-01
Kinetic Monte Carlo (KMC) simulation provides insights into catalytic reactions unobtainable with either experiments or mean-field microkinetic models. Sensitivity analysis of KMC models assesses the robustness of the predictions to parametric perturbations and identifies rate determining steps in a chemical reaction network. Stiffness in the chemical reaction network, a ubiquitous feature, demands lengthy run times for KMC models and renders efficient sensitivity analysis based on the likelihood ratio method unusable. We address the challenge of efficiently conducting KMC simulations and performing accurate sensitivity analysis in systems with unknown time scales by employing two acceleration techniques: rate constant rescaling and parallel processing. We develop statistical criteria that ensure sufficient sampling of non-equilibrium steady state conditions. Our approach provides the twofold benefit of accelerating the simulation itself and enabling likelihood ratio sensitivity analysis, which provides further speedup relative to finite difference sensitivity analysis. As a result, the likelihood ratio method can be applied to real chemistry. We apply our methodology to the water-gas shift reaction on Pt(111).
Comprehensive Structural Dynamic Analysis of the SSME/AT Fuel Pump First-Stage Turbine Blade
NASA Technical Reports Server (NTRS)
Brown, A. M.
1998-01-01
A detailed structural dynamic analysis of the Pratt & Whitney high-pressure fuel pump first-stage turbine blades has been performed to identify the cause of the tip cracking found in the turbomachinery in November 1997. The analysis was also used to help evaluate potential fixes for the problem. Many of the methods available in structural dynamics were applied, including modal displacement and stress analysis, frequency and transient response to tip loading from the first-stage Blade Outer Gas Seals (BOGS), Fourier analysis, and shock spectra analysis of the transient response. The primary findings were that the BOGS tip loading is impulsive in nature, thereby exciting many modes of the blade that exhibit high stress at the tip cracking location. Therefore, a proposed BOGS count change would not help the situation because a clearly identifiable resonance situation does not exist. The recommendations for the resolution of the problem are to maintain the existing BOGS count, eliminate the stress concentration in the blade due to its geometric design, and reduce the applied load on the blade by adding shiplaps in the BOGS.
Design and Evaluation of Glass/epoxy Composite Blade and Composite Tower Applied to Wind Turbine
NASA Astrophysics Data System (ADS)
Park, Hyunbum
2018-02-01
In this study, the analysis and manufacturing of a small-class wind turbine blade were performed. In the structural design, the loading conditions were first defined through load case analysis. The proposed structural configuration of the blade is a sandwich-type composite structure with E-glass/epoxy face sheets and a urethane foam core, chosen for lightness, structural stability, low manufacturing cost and an easy manufacturing process. This work also proposes a design procedure and results for a tower for small-scale wind turbine systems. Structural analysis of the blade, including load cases, stress, deformation, buckling, vibration and fatigue life, was performed using the finite element method, load spectrum analysis and the Miner rule. The structural safety of the tower was likewise verified through FEM analysis. The blade and tower were manufactured based on the structural design. In order to validate the designed structure, structural tests were conducted and their results were compared with the calculated results. It is confirmed that the final proposed blade and tower meet the design requirements.
Remote Sensing Information Science Research
NASA Technical Reports Server (NTRS)
Clarke, Keith C.; Scepan, Joseph; Hemphill, Jeffrey; Herold, Martin; Husak, Gregory; Kline, Karen; Knight, Kevin
2002-01-01
This document is the final report summarizing research conducted by the Remote Sensing Research Unit, Department of Geography, University of California, Santa Barbara under National Aeronautics and Space Administration Research Grant NAG5-10457. This document describes work performed during the period of 1 March 2001 through 30 September 2002. This report includes a survey of research proposed and performed within RSRU and the UCSB Geography Department during the past 25 years. A broad suite of RSRU research conducted under NAG5-10457 is also described under themes of Applied Research Activities and Information Science Research. This research includes: 1. NASA ESA Research Grant Performance Metrics Reporting. 2. Global Data Set Thematic Accuracy Analysis. 3. ISCGM/Global Map Project Support. 4. Cooperative International Activities. 5. User Model Study of Global Environmental Data Sets. 6. Global Spatial Data Infrastructure. 7. CIESIN Collaboration. 8. On the Value of Coordinating Landsat Operations. 10. The California Marine Protected Areas Database: Compilation and Accuracy Issues. 11. Assessing Landslide Hazard Over a 130-Year Period for La Conchita, California. Remote Sensing and Spatial Metrics for Applied Urban Area Analysis, including: (1) IKONOS Data Processing for Urban Analysis. (2) Image Segmentation and Object Oriented Classification. (3) Spectral Properties of Urban Materials. (4) Spatial Scale in Urban Mapping. (5) Variable Scale Spatial and Temporal Urban Growth Signatures. (6) Interpretation and Verification of SLEUTH Modeling Results. (7) Spatial Land Cover Pattern Analysis for Representing Urban Land Use and Socioeconomic Structures. 12. Colorado River Flood Plain Remote Sensing Study Support. 13. African Rainfall Modeling and Assessment. 14. Remote Sensing and GIS Integration.
Munro, Sarah A; Lund, Steven P; Pine, P Scott; Binder, Hans; Clevert, Djork-Arné; Conesa, Ana; Dopazo, Joaquin; Fasold, Mario; Hochreiter, Sepp; Hong, Huixiao; Jafari, Nadereh; Kreil, David P; Łabaj, Paweł P; Li, Sheng; Liao, Yang; Lin, Simon M; Meehan, Joseph; Mason, Christopher E; Santoyo-Lopez, Javier; Setterquist, Robert A; Shi, Leming; Shi, Wei; Smyth, Gordon K; Stralis-Pavese, Nancy; Su, Zhenqiang; Tong, Weida; Wang, Charles; Wang, Jian; Xu, Joshua; Ye, Zhan; Yang, Yong; Yu, Ying; Salit, Marc
2014-09-25
There is a critical need for standard approaches to assess, report and compare the technical performance of genome-scale differential gene expression experiments. Here we assess technical performance with a proposed standard 'dashboard' of metrics derived from analysis of external spike-in RNA control ratio mixtures. These control ratio mixtures with defined abundance ratios enable assessment of diagnostic performance of differentially expressed transcript lists, limit of detection of ratio (LODR) estimates and expression ratio variability and measurement bias. The performance metrics suite is applicable to analysis of a typical experiment, and here we also apply these metrics to evaluate technical performance among laboratories. An interlaboratory study using identical samples shared among 12 laboratories with three different measurement processes demonstrates generally consistent diagnostic power across 11 laboratories. Ratio measurement variability and bias are also comparable among laboratories for the same measurement process. We observe different biases for measurement processes using different mRNA-enrichment protocols.
Motivation, Compensation, and Performance for Science and Technological Teachers
NASA Astrophysics Data System (ADS)
Abast, R. M.; Sangi, N. M.; Tumanduk, M. S. S. S.; Roring, R.
2018-02-01
This research is operationally aimed at obtaining analysis and interpretation results concerning the relationship of achievement motive and compensation with performance at a junior high school in Manado, Indonesia. This research applies a quantitative approach with a correlation analysis method. The research was conducted at one junior high school in Manado, Indonesia. The results showed that the achievement motive of the school's teachers is quite high, meaning that, generally, the teachers have a desire to improve achievement, and that performance at the school is good enough, meaning that the performance of teachers at the school is increasing. There is a degree of linkage and determinative power between achievement motive and teacher performance of 0.773, or 77.3%. Compensation for the school's teachers is good enough, meaning that the compensation received is satisfactory, and there is a degree of linkage and determinative power between compensation and teacher performance of 0.582, or 58.2%.
Quantitative phenotyping via deep barcode sequencing.
Smith, Andrew M; Heisler, Lawrence E; Mellor, Joseph; Kaper, Fiona; Thompson, Michael J; Chee, Mark; Roth, Frederick P; Giaever, Guri; Nislow, Corey
2009-10-01
Next-generation DNA sequencing technologies have revolutionized diverse genomics applications, including de novo genome sequencing, SNP detection, chromatin immunoprecipitation, and transcriptome analysis. Here we apply deep sequencing to genome-scale fitness profiling to evaluate yeast strain collections in parallel. This method, Barcode analysis by Sequencing, or "Bar-seq," outperforms the current benchmark barcode microarray assay in terms of both dynamic range and throughput. When applied to a complex chemogenomic assay, Bar-seq quantitatively identifies drug targets, with performance superior to the benchmark microarray assay. We also show that Bar-seq is well-suited for a multiplex format. We completely re-sequenced and re-annotated the yeast deletion collection using deep sequencing, found that approximately 20% of the barcodes and common priming sequences varied from expectation, and used this revised list of barcode sequences to improve data quality. Together, this new assay and analysis routine provide a deep-sequencing-based toolkit for identifying gene-environment interactions on a genome-wide scale.
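The counting step at the heart of a barcode-sequencing assay such as Bar-seq is simple: extract the barcode region from each read and tally exact matches against a strain-barcode map. A minimal Python sketch follows; the 20-nt barcodes, read layout, and strain names are invented for illustration.

```python
from collections import Counter

def count_barcodes(reads, barcode_to_strain, start=0, length=20):
    """Tally exact barcode matches from sequencing reads (a minimal sketch)."""
    counts = Counter()
    for read in reads:
        tag = read[start:start + length]
        strain = barcode_to_strain.get(tag)
        if strain is not None:
            counts[strain] += 1
    return counts

# Hypothetical 20-nt barcodes keyed to deletion-strain names
barcodes = {"ACGTACGTACGTACGTACGT": "yfg1", "TTGCAATTGCAATTGCAATT": "yfg2"}
reads = ["ACGTACGTACGTACGTACGTNNNN", "TTGCAATTGCAATTGCAATTNNNN",
         "ACGTACGTACGTACGTACGTNNNN"]
counts = count_barcodes(reads, barcodes)
# Relative fitness is then derived from, e.g., log2 treatment/control count ratios
print(counts)
```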
Popa, Stefan Octavian; Ferrari, Myriam; Andreozzi, Giuseppe Maria; Martini, Romeo; Bagno, Andrea
2015-11-01
Laser Doppler fluxmetry is used to evaluate the hemodynamics of skin microcirculation. Laser Doppler signals contain oscillations due to fluctuations of microvascular perfusion. By performing spectral analysis, six frequency intervals from 0.005 to 2 Hz have been identified and assigned to distinct cardiovascular structures: heart, respiration, vascular myocytes, sympathetic terminations and endothelial cells (nitric oxide-dependent and -independent). Transcutaneous electrical pulses are currently applied to treat several diseases, i.e. neuropathies and chronic painful leg ulcers. Recently, FREMS (Frequency Rhythmic Electrical Modulation System) has been applied to vasculopathic patients, too. In this study Laser Doppler signals of skin microcirculation were measured in five patients with intermittent claudication, before and after the FREMS therapy. Changes in vascular activities were assessed by wavelet transform analysis. Preliminary results demonstrate that FREMS induces alterations in vascular activities. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
Pairwise Classifier Ensemble with Adaptive Sub-Classifiers for fMRI Pattern Analysis.
Kim, Eunwoo; Park, HyunWook
2017-02-01
The multi-voxel pattern analysis technique is applied to fMRI data for classification of high-level brain functions using pattern information distributed over multiple voxels. In this paper, we propose a classifier ensemble for multiclass classification in fMRI analysis, exploiting the fact that specific neighboring voxels can contain spatial pattern information. The proposed method converts the multiclass classification to a pairwise classifier ensemble, and each pairwise classifier consists of multiple sub-classifiers using an adaptive feature set for each class-pair. Simulated and real fMRI data were used to verify the proposed method. Intra- and inter-subject analyses were performed to compare the proposed method with several well-known classifiers, including single and ensemble classifiers. The comparison results showed that the proposed method can be generally applied to multiclass classification in both simulations and real fMRI analyses.
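The pairwise decomposition described above turns one K-class problem into K(K-1)/2 binary problems. A minimal scikit-learn sketch of that skeleton follows; note the paper goes further by training several sub-classifiers per class pair on adaptive voxel subsets, whereas this sketch uses one linear learner per pair, and the 120 x 500 "voxel" matrix is purely hypothetical.

```python
import numpy as np
from sklearn.multiclass import OneVsOneClassifier
from sklearn.svm import LinearSVC

# Hypothetical voxel-pattern data: 120 trials x 500 voxels, 4 task classes
rng = np.random.default_rng(7)
X = rng.standard_normal((120, 500))
y = rng.integers(0, 4, size=120)

# One binary classifier per class pair (4 classes -> 6 pairwise classifiers)
clf = OneVsOneClassifier(LinearSVC())
clf.fit(X[:100], y[:100])
print(clf.score(X[100:], y[100:]))  # held-out accuracy on the last 20 trials
```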
Approach to proliferation risk assessment based on multiple objective analysis framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrianov, A.; Kuptsov, I.
2013-07-01
The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Second, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Third, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.
A Bibliometric Analysis on Cancer Population Science with Topic Modeling.
Li, Ding-Cheng; Rastegar-Mojarad, Majid; Okamoto, Janet; Liu, Hongfang; Leichow, Scott
2015-01-01
Bibliometric analysis is a research method used in library and information science to evaluate research performance. It applies quantitative and statistical analyses to describe patterns observed in a set of publications and can help identify previous, current, and future research trends or foci. To better guide our institutional strategic plan in cancer population science, we conducted a bibliometric analysis of publications by investigators currently funded by either the Division of Cancer Prevention (DCP) or the Division of Cancer Control and Population Sciences (DCCPS) at the National Cancer Institute. We applied two topic modeling techniques: author topic modeling (AT) and dynamic topic modeling (DTM). Our initial results show that AT can reasonably address questions about investigators' research interests and the distribution and popularity of research topics. Complementarily, DTM can capture the evolving trend of each topic by displaying changes in the proportions of key words, which is consistent with changes in MeSH headings.
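Both AT and DTM extend latent Dirichlet allocation (LDA), which models each document as a mixture of topics, each topic being a distribution over words. As orientation, here is a minimal plain-LDA sketch with scikit-learn; the three toy abstracts are invented, and the author and dynamic variants (available in, e.g., gensim) add author and time structure on top of this core.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical publication abstracts from funded investigators
abstracts = [
    "smoking cessation intervention trial among rural adults",
    "colorectal cancer screening uptake and survivorship outcomes",
    "diet physical activity and obesity in cancer prevention",
]

vec = CountVectorizer(stop_words="english")
dtm = vec.fit_transform(abstracts)                      # document-term matrix
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)

terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-5:][::-1]]  # top words per topic
    print(f"topic {k}:", top)
```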
Developing focused wellness programs: using concept analysis to increase business value.
Byczek, Lance; Kalina, Christine M; Levin, Pamela F
2003-09-01
Concept analysis is a useful tool in providing clarity to an abstract idea as well as an objective basis for developing wellness program products, goals, and outcomes. To plan for and develop successful wellness programs, it is critical for occupational health nurses to clearly understand a program concept as applied to a particular community or population. Occupational health nurses can use the outcome measures resulting from the concept analysis process to help demonstrate the business value of their wellness programs. This concept analysis demonstrates a predominance of the performance related attributes of fitness in the scientific literature.
Hayes, Timothy; Usami, Satoshi; Jacobucci, Ross; McArdle, John J
2015-12-01
In this article, we describe a recent development in the analysis of attrition: using classification and regression trees (CART) and random forest methods to generate inverse sampling weights. These flexible machine learning techniques have the potential to capture complex nonlinear, interactive selection models, yet to our knowledge, their performance in the missing data analysis context has never been evaluated. To assess the potential benefits of these methods, we compare their performance with commonly employed multiple imputation and complete case techniques in 2 simulations. These initial results suggest that weights computed from pruned CART analyses performed well in terms of both bias and efficiency when compared with other methods. We discuss the implications of these findings for applied researchers.
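The weighting idea can be sketched as follows (a hedged illustration, not the authors' code): a pruned classification tree models each case's probability of remaining in the study, and completers are weighted by the inverse of that probability. All variable names and noise settings are invented for the example:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                      # baseline covariates
# Nonlinear, interactive selection: dropout depends on the product x0*x1.
p_stay = 1 / (1 + np.exp(-(0.5 + X[:, 0] * X[:, 1])))
stayed = rng.uniform(size=500) < p_stay

# ccp_alpha > 0 applies cost-complexity pruning, analogous to a pruned CART.
tree = DecisionTreeClassifier(ccp_alpha=0.01).fit(X, stayed)
p_hat = tree.predict_proba(X)[:, list(tree.classes_).index(True)]

# Inverse sampling weights for the completers (clipped to avoid blow-ups).
weights = 1.0 / np.clip(p_hat[stayed], 0.05, None)
# 'weights' would now be passed to a weighted analysis of the complete cases.
```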
Applied Meteorology Unit (AMU) Quarterly Report - Fourth Quarter FY-09
NASA Technical Reports Server (NTRS)
Bauman, William; Crawford, Winifred; Barrett, Joe; Watson, Leela; Wheeler, Mark
2009-01-01
This report summarizes the Applied Meteorology Unit (AMU) activities for the fourth quarter of Fiscal Year 2009 (July - September 2009). Task reports include: (1) Peak Wind Tool for User Launch Commit Criteria (LCC), (2) Objective Lightning Probability Tool, Phase III, (3) Peak Wind Tool for General Forecasting, Phase II, (4) Update and Maintain Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS), (5) Verify MesoNAM Performance, and (6) Develop a Graphical User Interface to update selected parameters for the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model.
New prospects of VESUVIO applied to measurements in water mixtures
NASA Astrophysics Data System (ADS)
Rodríguez Palomino, L. A.; Dawidowski, J.; Blostein, J. J.; Cuello, G. J.
2014-12-01
We present new measurements on mixtures of light and heavy water in the spectrometer VESUVIO (Rutherford Appleton Laboratory, UK), and analyze them from the perspective of different kinds of applications. We perform a single-detector analysis and show the multiple scattering and attenuation corrections, with the aim of employing them in mass spectrometry. We also show the capability to perform transmission measurements, by means of the transmission monitor, to determine total cross sections of acceptable quality.
Genetic programming based ensemble system for microarray data classification.
Liu, Kun-Hong; Tong, Muchenxuan; Xie, Shu-Tong; Yee Ng, Vincent To
2015-01-01
Recently, more and more machine learning techniques have been applied to microarray data analysis. The aim of this study is to propose a new genetic programming (GP) based ensemble system (named GPES), which can be used to effectively classify different types of cancers. Decision trees are deployed as base classifiers in this ensemble framework with three operators: Min, Max, and Average. Each individual of the GP is an ensemble system, and these become more and more accurate over the evolutionary process. Feature selection and balanced subsampling techniques are applied to increase the diversity in each ensemble system. The final ensemble committee is selected by a forward search algorithm, which is shown to be capable of fitting data automatically. The performance of GPES is evaluated using five binary-class and six multiclass microarray datasets, and the results show that the algorithm achieves better results in most cases compared with some other ensemble systems. By using more elaborate base classifiers or applying other sampling techniques, the performance of GPES may be further improved.
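A simplified sketch of the combination step only, with the GP search that evolves the ensemble omitted: decision-tree base classifiers trained on subsamples, fused by the Min, Max, and Average operators over predicted class probabilities:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Stand-in for a microarray dataset: 200 samples, 50 features.
X, y = make_classification(n_samples=200, n_features=50, random_state=0)

rng = np.random.default_rng(0)
probas = []
for seed in range(7):
    # Diversity via random subsampling of the training set.
    idx = rng.choice(len(X), size=len(X) // 2, replace=False)
    tree = DecisionTreeClassifier(max_depth=3, random_state=seed).fit(X[idx], y[idx])
    probas.append(tree.predict_proba(X))

P = np.stack(probas)                          # (n_trees, n_samples, n_classes)
fused = {"min": P.min(axis=0), "max": P.max(axis=0), "avg": P.mean(axis=0)}
for name, probs in fused.items():
    acc = (probs.argmax(axis=1) == y).mean()
    print(f"{name}: resubstitution accuracy = {acc:.2f}")
```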
Earth resources mission performance studies. Volume 2: Simulation results
NASA Technical Reports Server (NTRS)
1974-01-01
Simulations were made at three month intervals to investigate the EOS mission performance over the four seasons of the year. The basic objectives of the study were: (1) to evaluate the ability of an EOS type system to meet a representative set of specific collection requirements, and (2) to understand the capabilities and limitations of the EOS that influence the system's ability to satisfy certain collection objectives. Although the results were obtained from a consideration of a two sensor EOS system, the analysis can be applied to any remote sensing system having similar optical and operational characteristics. While the category related results are applicable only to the specified requirement configuration, the results relating to general capability and limitations of the sensors can be applied in extrapolating to other U.S. based EOS collection requirements. The TRW general purpose mission simulator and analytic techniques discussed in this report can be applied to a wide range of collection and planning problems of earth orbiting imaging systems.
NASA Technical Reports Server (NTRS)
Brown, Jonathan M.; Petersen, Jeremy D.
2014-01-01
NASA's WIND mission has been operating in a large amplitude Lissajous orbit in the vicinity of the interior libration point of the Sun-Earth/Moon system since 2004. Regular stationkeeping maneuvers are required to maintain the orbit due to the instability around the collinear libration points. Historically these stationkeeping maneuvers have been performed by applying an incremental change in velocity, or Δv, along the spacecraft-Sun vector as projected into the ecliptic plane. Previous studies have shown that the magnitude of libration point stationkeeping maneuvers can be minimized by applying the Δv in the direction of the local stable manifold found using dynamical systems theory. This paper presents the analysis of this new maneuver strategy, which shows that the magnitude of stationkeeping maneuvers can be decreased by 5 to 25 percent, depending on the location in the orbit where the maneuver is performed. The implementation of the optimized maneuver method into operations is discussed and results are presented for the first two optimized stationkeeping maneuvers executed by WIND.
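Conceptually, the stable-manifold direction can be approximated from the monodromy matrix (the state transition matrix over one orbit revolution); the sketch below, which is not WIND flight software, selects the eigenvector whose eigenvalue lies inside the unit circle and takes its velocity components as the Δv direction:

```python
import numpy as np

def stable_manifold_dv(monodromy, dv_mag):
    """monodromy: 6x6 state transition matrix over one revolution;
    dv_mag: maneuver magnitude to apply along the stable direction."""
    vals, vecs = np.linalg.eig(monodromy)
    stable = np.argmin(np.abs(vals))          # eigenvalue with |lambda| < 1
    direction = np.real(vecs[:, stable])
    direction /= np.linalg.norm(direction)
    return dv_mag * direction[3:6]            # velocity part of the 6-state

# Usage with a hypothetical monodromy matrix Phi:
# dv = stable_manifold_dv(Phi, dv_mag=0.05)   # m/s along the local stable manifold
```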
Applying Mathematical Optimization Methods to an ACT-R Instance-Based Learning Model.
Said, Nadia; Engelhart, Michael; Kirches, Christian; Körkel, Stefan; Holt, Daniel V
2016-01-01
Computational models of cognition provide an interface to connect advanced mathematical tools and methods to empirically supported theories of behavior in psychology, cognitive science, and neuroscience. In this article, we consider a computational model of instance-based learning, implemented in the ACT-R cognitive architecture. We propose an approach for obtaining mathematical reformulations of such cognitive models that improve their computational tractability. For the well-established Sugar Factory dynamic decision making task, we conduct a simulation study to analyze central model parameters. We show how mathematical optimization techniques can be applied to efficiently identify optimal parameter values with respect to different optimization goals. Beyond these methodological contributions, our analysis reveals the sensitivity of this particular task with respect to initial settings and yields new insights into how average human performance deviates from potential optimal performance. We conclude by discussing possible extensions of our approach as well as future steps towards applying more powerful derivative-based optimization methods.
Chang, Lun-Ching; Lin, Hui-Min; Sibille, Etienne; Tseng, George C
2013-12-21
As high-throughput genomic technologies become accurate and affordable, an increasing number of data sets have been accumulated in the public domain, and genomic information integration and meta-analysis have become routine in biomedical research. In this paper, we focus on microarray meta-analysis, where multiple microarray studies with relevant biological hypotheses are combined in order to improve candidate marker detection. Many methods have been developed and applied in the literature, but their performance and properties have only been minimally investigated. There is currently no clear conclusion or guideline as to the proper choice of a meta-analysis method given an application; the decision essentially requires both statistical and biological considerations. We evaluated 12 microarray meta-analysis methods for combining multiple simulated expression profiles; these methods can be categorized by hypothesis setting: (1) HS(A): DE genes with non-zero effect sizes in all studies, (2) HS(B): DE genes with non-zero effect sizes in one or more studies, and (3) HS(r): DE genes with non-zero effects in the "majority" of studies. We then performed a comprehensive comparative analysis through six large-scale real applications using four quantitative statistical evaluation criteria: detection capability, biological association, stability and robustness. We elucidated the hypothesis settings behind the methods and further applied multi-dimensional scaling (MDS) and an entropy measure to characterize the meta-analysis methods and data structure, respectively. The aggregated results from the simulation study categorized the 12 methods into three hypothesis settings (HS(A), HS(B), and HS(r)). Evaluation in real data and results from the MDS and entropy analyses provide an insightful and practical guideline to the choice of the most suitable method in a given application. All source files for simulation and real data are available on the author's publication website.
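As a concrete instance of one such method, Fisher's method (a classical HS(B)-type combination rule) merges per-gene p-values across studies; SciPy provides it directly:

```python
import numpy as np
from scipy.stats import combine_pvalues

# Hypothetical p-values for one gene across three independent studies.
p_per_study = np.array([0.04, 0.20, 0.01])

# Fisher's method: -2 * sum(log p) follows a chi-square distribution
# with 2K degrees of freedom under the null across K studies.
stat, p_combined = combine_pvalues(p_per_study, method="fisher")
print(f"chi2 = {stat:.2f}, combined p = {p_combined:.4f}")
```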
Performance of Oil Pumping Rings: An Analytical and Experimental Study
NASA Technical Reports Server (NTRS)
Eusepi, M. W.; Walowit, J. A.; Pinkus, O.; Holmes, P.
1986-01-01
A steady-state design computer program was developed to predict the performance of pumping rings as functions of geometry, applied loading, speed, ring modulus, and fluid viscosity. Additional analyses were developed to predict transient behavior of the ring and the effects of temperature rises occurring in the hydrodynamic film between the ring and shaft. The analysis was initially compared with previous experimental data and then used to design additional rings for further testing. Tests were performed with Rulon, carbon-graphite, and babbitt rings. The design analysis was used to size all of the rings and to select the ranges of clearances, thickness, and loading. Although full quantitative agreement was lacking, relative agreement existed in that rings that were predicted to perform well theoretically generally performed well experimentally. Some causes for discrepancies between theory and experiment are believed to be starvation, leakage past the secondary seal at high pressures, and uncertainties in the small clearances and local inlet temperatures to the pumping ring. A separate preliminary analysis was performed for a pumping Leningrader seal. This analysis can be used to predict the film thickness and flow rate through the seal as a function of pressure, speed, loading, and geometry.
Textural and Mineralogical Analysis of Volcanic Rocks by µ-XRF Mapping.
Germinario, Luigi; Cossio, Roberto; Maritan, Lara; Borghi, Alessandro; Mazzoli, Claudio
2016-06-01
In this study, µ-XRF was applied as a novel surface technique for quick acquisition of elemental X-ray maps of rocks, image analysis of which provides quantitative information on texture and rock-forming minerals. Bench-top µ-XRF is cost-effective, fast, and non-destructive, can be applied to both large (up to a few tens of cm) and fragile samples, and yields major and trace element analysis with good sensitivity. Here, X-ray mapping was performed with a resolution of 103.5 µm and spot size of 30 µm over sample areas of about 5×4 cm of Euganean trachyte, a volcanic porphyritic rock from the Euganean Hills (NE Italy) traditionally used in cultural heritage. The relative abundance of phenocrysts and groundmass, as well as the size and shape of the various mineral phases, were obtained from image analysis of the elemental maps. The quantified petrographic features allowed identification of various extraction sites, revealing an objective method for archaeometric provenance studies exploiting µ-XRF imaging.
Piccinini, Filippo; Balassa, Tamas; Szkalisity, Abel; Molnar, Csaba; Paavolainen, Lassi; Kujala, Kaisa; Buzas, Krisztina; Sarazova, Marie; Pietiainen, Vilja; Kutay, Ulrike; Smith, Kevin; Horvath, Peter
2017-06-28
High-content, imaging-based screens now routinely generate data on a scale that precludes manual verification and interrogation. Software applying machine learning has become an essential tool to automate analysis, but these methods require annotated examples to learn from. Efficiently exploring large datasets to find relevant examples remains a challenging bottleneck. Here, we present Advanced Cell Classifier (ACC), a graphical software package for phenotypic analysis that addresses these difficulties. ACC applies machine-learning and image-analysis methods to high-content data generated by large-scale, cell-based experiments. It features methods to mine microscopic image data, discover new phenotypes, and improve recognition performance. We demonstrate that these features substantially expedite the training process, successfully uncover rare phenotypes, and improve the accuracy of the analysis. ACC is extensively documented, designed to be user-friendly for researchers without machine-learning expertise, and distributed as a free open-source tool at www.cellclassifier.org.
Fatigue Analysis of Rotating Parts. A Case Study for a Belt Driven Pulley
NASA Astrophysics Data System (ADS)
Sandu, Ionela; Tabacu, Stefan; Ducu, Catalin
2017-10-01
The present study is focused on the life estimation of a rotating part, a component of an engine assembly, namely the pulley of the coolant pump. The goal of the paper is to develop a model, supported by numerical analysis, capable of predicting the lifetime of the part. Starting from the functional drawing, CAD model, and technical specifications of the part, a numerical model was developed. MATLAB code was used to develop a tool to apply the load over the selected area. The numerical analysis was performed in two steps. The first simulation concerned the inertia relief due to rotational motion about the shaft (of the pump). Results from this simulation were saved, and the stress-strain state was used as the initial condition for the analysis with the load applied. The lifetime of a good part was estimated. A defect was then introduced in order to investigate its influence on the working requirements; it was found to have little influence on the prescribed lifetime.
The fundamental parameter method applied to X-ray fluorescence analysis with synchrotron radiation
NASA Astrophysics Data System (ADS)
Pantenburg, F. J.; Beier, T.; Hennrich, F.; Mommsen, H.
1992-05-01
Quantitative X-ray fluorescence analysis applying the fundamental parameter method is usually restricted to monochromatic excitation sources. It is shown here that such analyses can be performed as well with a white synchrotron radiation spectrum. To determine absolute elemental concentration values it is necessary to know the spectral distribution of this spectrum. A newly designed and tested experimental setup, which uses the synchrotron radiation emitted from electrons in a bending magnet of ELSA (the electron stretcher accelerator of the University of Bonn), is presented. The determination of the exciting spectrum, described by the given electron beam parameters, is limited by uncertainties in the vertical electron beam size and divergence. We describe a method which allows us to determine the relative and absolute spectral distributions needed for accurate analysis. First test measurements of different alloys and standards of known composition demonstrate that it is possible to determine exact concentration values in bulk and trace element analysis.
From LIDAR Scanning to 3d FEM Analysis for Complex Surface and Underground Excavations
NASA Astrophysics Data System (ADS)
Chun, K.; Kemeny, J.
2017-12-01
Light detection and ranging (LIDAR) has been a prevalent remote-sensing technology applied in the geological fields due to its high precision and ease of use. One of the major applications is to use the detailed geometrical information of underground structures as a basis for the generation of a three-dimensional numerical model that can be used in FEM analysis. To date, however, straightforward techniques for reconstructing numerical models from the scanned data of underground structures have not been well established or tested. In this paper, we propose a comprehensive approach integrating LIDAR scanning with finite element numerical analysis, specifically converting LIDAR 3D point clouds of objects containing complex surface geometry into a finite element model. This methodology has been applied to the Kartchner Caverns in Arizona for stability analysis. Numerical simulations were performed using the finite element code ABAQUS. The results indicate that the proposed LIDAR-based workflow is effective and can serve as a reference for similar engineering projects in practice.
Huebsch, Nathaniel; Loskill, Peter; Mandegar, Mohammad A; Marks, Natalie C; Sheehan, Alice S; Ma, Zhen; Mathur, Anurag; Nguyen, Trieu N; Yoo, Jennie C; Judge, Luke M; Spencer, C Ian; Chukka, Anand C; Russell, Caitlin R; So, Po-Lin; Conklin, Bruce R; Healy, Kevin E
2015-05-01
Contractile motion is the simplest metric of cardiomyocyte health in vitro, but unbiased quantification is challenging. We describe a rapid automated method, requiring only standard video microscopy, to analyze the contractility of human induced pluripotent stem cell-derived cardiomyocytes (iPS-CM). New algorithms for generating and filtering motion vectors, combined with a newly developed isogenic iPSC line harboring the genetically encoded calcium indicator GCaMP6f, allow simultaneous user-independent measurement and analysis of the coupling between calcium flux and contractility. The relative performance of these algorithms, in terms of improving signal to noise, was tested. Applying these algorithms allowed analysis of contractility in iPS-CM cultured over multiple spatial scales, from single cells to three-dimensional constructs. This open-source software was validated by analysis of the isoproterenol response in these cells, and can be applied in future studies comparing the drug responsiveness of iPS-CM cultured in different microenvironments in the context of tissue engineering.
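A deliberately simplified stand-in for the motion analysis pipeline (the published algorithms compute full block-matching motion vectors): the mean absolute frame-to-frame intensity difference yields a one-dimensional contraction signal whose peaks mark beats:

```python
import numpy as np
from scipy.signal import find_peaks

def contraction_signal(frames):
    """frames: (n_frames, height, width) grayscale video array."""
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return diffs.mean(axis=(1, 2))            # one motion value per frame pair

def beat_times(frames, fps, min_interval_s=0.3):
    """Return beat times in seconds, enforcing a minimum inter-beat interval."""
    signal = contraction_signal(frames)
    peaks, _ = find_peaks(signal, distance=int(min_interval_s * fps))
    return peaks / fps
```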
The application of decision analysis to life support research and technology development
NASA Technical Reports Server (NTRS)
Ballin, Mark G.
1994-01-01
Applied research and technology development is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Decision making regarding which technologies to advance and what resources to devote to them is a challenging but essential task. In the application of life support technology to future manned space flight, new technology concepts typically are characterized by nonexistent data and rough approximations of technology performance, uncertain future flight program needs, and a complex, time-intensive process to develop technology to a flight-ready status. Decision analysis is a quantitative, logic-based discipline that imposes formalism and structure on complex problems. It also accounts for the limits of knowledge that may be available at the time a decision is needed. The utility of decision analysis for life support technology R & D was evaluated by applying it to two case studies. The methodology was found to provide insight that is not possible with more traditional analysis approaches.
Ibrahim, Reham S; Fathy, Hoda
2018-03-30
This work tracked the impact of commonly applied post-harvest and industrial processing practices on the compositional integrity of ginger rhizome. Untargeted metabolite profiling was performed using a digitally-enhanced HPTLC method in which the chromatographic fingerprints were extracted using ImageJ software and then analysed with multivariate Principal Component Analysis (PCA) for pattern recognition. A targeted approach was applied using a new, validated, simple and fast HPTLC image analysis method for simultaneous quantification of the officially recognized markers 6-, 8-, and 10-gingerol and 6-shogaol, in conjunction with chemometric Hierarchical Clustering Analysis (HCA). The results of both targeted and untargeted metabolite profiling revealed that the peeling, drying, and storage employed during processing have a great influence on the ginger chemo-profile, and that the different forms of processed ginger should not be used interchangeably. Moreover, it is deemed necessary to consider the holistic metabolic profile for comprehensive evaluation of ginger during processing.
Systems design analysis applied to launch vehicle configuration
NASA Technical Reports Server (NTRS)
Ryan, R.; Verderaime, V.
1993-01-01
As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost improvement techniques from design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.
NASA Technical Reports Server (NTRS)
1976-01-01
A complete motion analysis laboratory has evolved out of analyzing walking patterns of crippled children at Stanford Children's Hospital. Data are collected by placing tiny electrical sensors over muscle groups of the child's legs and inserting step-sensing switches in the soles of shoes. Miniature radio transmitters send signals to a receiver for continuous recording of the abnormal walking pattern. Engineers are working to apply space electronics miniaturization techniques to further reduce the size and weight of the telemetry system, as well as striving to increase signal bandwidth so analysis can be performed faster and more accurately using a mini-computer.
NASA Astrophysics Data System (ADS)
Dobrynin, S. A.; Kolubaev, E. A.; Smolin, A. Yu.; Dmitriev, A. I.; Psakhie, S. G.
2010-07-01
Time-frequency analysis of sound waves detected by a microphone during the friction of Hadfield’s steel has been performed using wavelet transform and window Fourier transform methods. This approach reveals a relationship between the appearance of quasi-periodic intensity outbursts in the acoustic response signals and the processes responsible for the formation of wear products. It is shown that the time-frequency analysis of acoustic emission in a tribosystem can be applied, along with traditional approaches, to studying features in the wear and friction process.
Lin, Zhichao; Wu, Zhongyu
2009-05-01
A rapid and reliable radiochemical method, coupled with a simple and compact plating apparatus, was developed, validated, and applied for the analysis of (210)Po in a variety of food products and bioassay samples. The method performance characteristics, including accuracy, precision, robustness, and specificity, were evaluated along with a detailed measurement uncertainty analysis. With high Po recovery, improved energy resolution, and effective removal of interfering elements by chromatographic extraction, the overall method accuracy was determined to be better than 5%, with a measurement precision of 10%, at the 95% confidence level.
Berthelet, M; Whyte, L G; Greer, C W
1996-04-15
Polyvinylpolypyrrolidone spin columns were used to rapidly purify crude soil DNA extracts from humic materials for polymerase chain reaction (PCR) analysis. The PCR detection limit for the tfdC gene, encoding chlorocatechol dioxygenase from the 2,4-dichlorophenoxyacetic acid degradation pathway, was 10^1-10^2 cells/g soil in inoculated soils. The procedure could be applied to the amplification of biodegradative genes from indigenous microbial populations from a wide variety of soil types, and the entire analysis could be performed within 8 h.
NASA Astrophysics Data System (ADS)
Crawford, I.; Ruske, S.; Topping, D. O.; Gallagher, M. W.
2015-11-01
In this paper we present improved methods for discriminating and quantifying primary biological aerosol particles (PBAPs) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet-light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1 × 10^6 points on a desktop computer, allowing each fluorescent particle in a data set to be explicitly clustered. This reduces the potential for misattribution found in the subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient data set. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4), where the optical size, asymmetry factor and fluorescence measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98 and 98.1% of the data points, respectively. The best-performing methods were applied to the BEACHON-RoMBAS (Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics and Nitrogen-Rocky Mountain Biogenic Aerosol Study) ambient data set, where it was found that the z-score and range normalisation methods yield similar results, with each method producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. The z-score result was compared to clusters generated with previous approaches (WIBS AnalysiS Program, WASP), where we observe that the subsampling and comparative attribution method employed by WASP results in overestimation of the fungal spore concentration by a factor of 1.5 and underestimation of the bacterial aerosol concentration by a factor of 5. We suggest that this is likely due to misattribution errors arising from poor centroid definition and from failure to assign particles to a cluster, both consequences of the subsampling and comparative attribution method employed by WASP. The methods used here allow the entire fluorescent particle population to be analysed, yielding an explicit cluster attribution for each particle and improving cluster centroid definition and our capacity to discriminate and quantify PBAP meta-classes compared to previous approaches.
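The best-performing recipe reported here, z-score normalisation followed by Ward-linkage clustering, reduces to a few lines with SciPy; the feature columns are illustrative WIBS-style parameters and the data are stand-ins:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# X: (n_particles, n_features), e.g. [optical size, asymmetry factor,
# fluorescence channels 1-3]; random stand-in data for illustration.
X = np.random.default_rng(0).normal(size=(1000, 5))

Xz = (X - X.mean(axis=0)) / X.std(axis=0)         # z-score normalisation
Z = linkage(Xz, method="ward")                    # Ward linkage, Euclidean metric
labels = fcluster(Z, t=3, criterion="maxclust")   # cut the tree into 3 clusters
print(np.bincount(labels)[1:])                    # particles per cluster
```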
NASA Technical Reports Server (NTRS)
Boyd, R. K.; Brumfield, J. O.; Campbell, W. J.
1984-01-01
Three feature extraction methods, canonical analysis (CA), principal component analysis (PCA), and band selection, have been applied to Thematic Mapper Simulator (TMS) data in order to evaluate the relative performance of the methods. The results obtained show that CA is capable of providing a transformation of TMS data which leads to better classification results than provided by all seven bands, by PCA, or by band selection. A second conclusion drawn from the study is that TMS bands 2, 3, 4, and 7 (thermal) are most important for landcover classification.
Ueda, Masanori; Iwaki, Masafumi; Nishihara, Tokihiro; Satoh, Yoshio; Hashimoto, Ken-ya
2008-04-01
This paper describes a circuit model for the analysis of nonlinearity in filters based on radiofrequency (RF) bulk acoustic wave (BAW) resonators. The nonlinear output is expressed by a current source connected in parallel with the linear resonator. The amplitude of the nonlinear current source is set proportional to the product of the linear currents flowing in the resonator. Thus, the nonlinear analysis is performed by common linear analysis, even for complex device structures. The analysis is applied to a ladder-type RF BAW filter, and the frequency dependence of the nonlinear output is discussed. Furthermore, the analysis is verified through comparison with experiments.
Trompier, François; Burbidge, Christopher; Bassinet, Céline; Baumann, Marion; Bortolin, Emanuela; De Angelis, Cinzia; Eakins, Jonathan; Della Monaca, Sara; Fattibene, Paola; Quattrini, Maria Cristina; Tanner, Rick; Wieser, Albrecht; Woda, Clemens
2017-01-01
In the EC-funded project RENEB (Realizing the European Network in Biodosimetry), physical methods applied to fortuitous dosimetric materials are used to complement biological dosimetry, to increase dose assessment capacity for large-scale radiation/nuclear accidents. This paper describes the work performed to implement Optically Stimulated Luminescence (OSL) and Electron Paramagnetic Resonance (EPR) dosimetry techniques. OSL is applied to electronic components and EPR to touch-screen glass from mobile phones. To implement these new approaches, several blind tests and inter-laboratory comparisons (ILC) were organized for each assay. OSL systems have shown good performance. EPR systems also show good performance in controlled conditions, but ILC have also demonstrated that post-irradiation exposure to sunlight increases the complexity of the EPR signal analysis. Physically-based dosimetry techniques offer high capacity and new possibilities for accident dosimetry, especially in the case of large-scale events. Some of the techniques applied can be considered operational (e.g. OSL on Surface Mounting Devices [SMD]) and provide a large increase in measurement capacity for existing networks. Other techniques and devices currently undergoing validation or development in Europe could lead to considerable increases in the capacity of the RENEB accident dosimetry network.
Inertial Sensor Error Reduction through Calibration and Sensor Fusion.
Lambrecht, Stefan; Nogueira, Samuel L; Bortole, Magdo; Siqueira, Adriano A G; Terra, Marco H; Rocon, Eduardo; Pons, José L
2016-02-17
This paper presents a comparison between cooperative and local Kalman filters (KF) for estimating the absolute segment angle under two calibration conditions: a simplified calibration that can be replicated in most laboratories, and a complex calibration similar to that applied by commercial vendors. The cooperative filters use information either from all inertial sensors attached to the body (Matricial KF) or from the inertial sensors and the potentiometers of an exoskeleton (Markovian KF). A one-minute walking trial of a subject walking with a 6-DoF exoskeleton was used to assess the absolute segment angle of the trunk, thigh, shank, and foot. The results indicate that regardless of the segment and filter applied, the more complex calibration always results in significantly better performance compared to the simplified calibration. The interaction between filter and calibration suggests that when the quality of the calibration is unknown, the Markovian KF is recommended. Applying the complex calibration, the Matricial and Markovian KF perform similarly, with average RMSE below 1.22 degrees. Cooperative KFs perform better than, or at least as well as, local KFs; we therefore recommend using cooperative KFs instead of local KFs for the control or analysis of walking.
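A local, single-segment version of the angle estimation can be sketched as a scalar Kalman filter fusing gyroscope rate (prediction) with accelerometer inclination (correction); the cooperative Matricial and Markovian filters extend this across segments, and all noise values below are assumptions:

```python
import numpy as np

def kf_segment_angle(gyro_rate, accel_angle, dt=0.01, q=0.01, r=0.1):
    """gyro_rate: rad/s samples; accel_angle: rad samples (same length).
    q, r: assumed process and measurement noise variances."""
    angle, P = accel_angle[0], 1.0
    out = np.empty(len(gyro_rate))
    for k in range(len(gyro_rate)):
        angle += gyro_rate[k] * dt             # predict by gyro integration
        P += q
        K = P / (P + r)                        # Kalman gain
        angle += K * (accel_angle[k] - angle)  # correct with accelerometer
        P *= (1 - K)
        out[k] = angle
    return out
```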
Performance optimization of helicopter rotor blades
NASA Technical Reports Server (NTRS)
Walsh, Joanne L.
1991-01-01
As part of a center-wide activity at NASA Langley Research Center to develop multidisciplinary design procedures by accounting for discipline interactions, a performance design optimization procedure is developed. The procedure optimizes the aerodynamic performance of rotor blades by selecting the point of taper initiation, root chord, taper ratio, and maximum twist which minimize hover horsepower while not degrading forward flight performance. The procedure uses HOVT (a strip theory momentum analysis) to compute the horsepower required for hover and the comprehensive helicopter analysis program CAMRAD to compute the horsepower required for forward flight and maneuver. The optimization algorithm consists of the general-purpose optimization program CONMIN and approximate analyses. Sensitivity analyses, consisting of derivatives of the objective function and constraints, are carried out by forward finite differences. The procedure is applied to a test problem which is an analytical model of a wind tunnel model of a utility rotor blade.
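A schematic analogue of this optimization set-up (with CONMIN and the HOVT/CAMRAD analyses replaced by invented stand-in functions) can be written with SciPy, using finite-difference gradients as in the paper:

```python
import numpy as np
from scipy.optimize import minimize

def hover_power(x):
    # Stand-in for the HOVT momentum analysis (not the real model).
    taper_start, root_chord, taper_ratio, twist = x
    return (1 - 0.3 * twist) * root_chord**1.5 / (taper_ratio * taper_start)

def forward_flight_margin(x):
    # Stand-in for the CAMRAD-based constraint; must remain >= 0.
    return 1.2 - 0.9 * hover_power(x)

x0 = np.array([0.5, 0.3, 0.6, 0.2])   # taper start, root chord, ratio, twist
res = minimize(hover_power, x0, method="SLSQP",
               # jac=None: gradients from finite differences, as in the paper.
               bounds=[(0.2, 0.9), (0.1, 0.6), (0.3, 1.0), (0.0, 0.5)],
               constraints=[{"type": "ineq", "fun": forward_flight_margin}])
print(res.x, res.fun)
```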
Energy-Discriminative Performance of a Spectral Micro-CT System
He, Peng; Yu, Hengyong; Bennett, James; Ronaldson, Paul; Zainon, Rafidah; Butler, Anthony; Butler, Phil; Wei, Biao; Wang, Ge
2013-01-01
Experiments were performed to evaluate the energy-discriminative performance of a spectral (multi-energy) micro-CT system. The system, designed by MARS (Medipix All Resolution System) Bio-Imaging Ltd. (Christchurch, New Zealand), employs a photon-counting energy-discriminative detector technology developed by CERN (European Organization for Nuclear Research). We used the K-edge attenuation characteristics of known materials to calibrate the detector's photon energy discrimination. For tomographic analysis, we used the compressed sensing (CS) based ordered-subset simultaneous algebraic reconstruction technique (OS-SART) to reconstruct sample images, which is effective in reducing noise and suppressing artifacts. Unlike with conventional CT, the principal component analysis (PCA) method can be applied to extract and quantify additional attenuation information from a spectral CT dataset. Our results show that the spectral CT has good energy-discriminative performance and provides more attenuation information than conventional CT. PMID:24004864
Target recognition of ladar range images using even-order Zernike moments.
Liu, Zheng-Jun; Li, Qi; Xia, Zhi-Wei; Wang, Qi
2012-11-01
Ladar range images have attracted considerable attention in automatic target recognition fields. In this paper, Zernike moments (ZMs) are applied to classify the target of a range image from an arbitrary azimuth angle. However, ZMs suffer from high computational costs. To improve the performance of target recognition based on small samples, even-order ZMs with serial-parallel backpropagation neural networks (BPNNs) are applied to recognize the target of the range image. It is found that both the rotation invariance and the classification performance of the even-order ZMs are better than those of odd-order moments and of moments compressed by principal component analysis. The experimental results demonstrate that combining the even-order ZMs with serial-parallel BPNNs can significantly improve the recognition rate for small samples.
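The feature-extraction step can be sketched with the mahotas library's Zernike implementation; the even-order selection and the serial-parallel BPNN of the paper are not reproduced here, and a generic MLP stands in for the classifier:

```python
import numpy as np
import mahotas
from sklearn.neural_network import MLPClassifier

def zernike_features(range_image, radius=64, degree=8):
    """range_image: 2-D ladar range image with background set to zero.
    Returns Zernike moment magnitudes up to the given degree."""
    return mahotas.features.zernike_moments(range_image, radius, degree=degree)

# Hypothetical usage with segmented range images and their target classes:
# X = np.array([zernike_features(im) for im in images])
# clf = MLPClassifier(hidden_layer_sizes=(32,)).fit(X, labels)
```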
Li, Xiaomeng; Fang, Dansi; Cong, Xiaodong; Cao, Gang; Cai, Hao; Cai, Baochang
2012-12-01
A method is described using rapid and sensitive Fourier transform near-infrared spectroscopy combined with high-performance liquid chromatography-diode array detection for the simultaneous identification and determination of four bioactive compounds in crude Radix Scrophulariae samples. Partial least squares regression was selected as the analysis type, and multiplicative scatter correction, second derivative, and Savitzky-Golay filtering were adopted for spectral pretreatment. The correlation coefficients (R) of the calibration models were above 0.96 and the root mean square errors of prediction were under 0.028. The developed models were applied to unknown samples with satisfactory results. The established method was validated and can be applied to the intrinsic quality control of crude Radix Scrophulariae.
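A hedged sketch of such a calibration pipeline: Savitzky-Golay filtering with a second derivative as spectral pretreatment, followed by PLS regression from NIR spectra to reference concentrations (multiplicative scatter correction omitted, data invented):

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
# spectra: (n_samples, n_wavelengths) NIR absorbances (stand-in data);
# y: (n_samples, 4) HPLC-DAD reference values for the four compounds.
spectra = rng.normal(size=(40, 700)).cumsum(axis=1)
y = rng.uniform(size=(40, 4))

# Second-derivative Savitzky-Golay pretreatment along the wavelength axis.
X = savgol_filter(spectra, window_length=11, polyorder=2, deriv=2, axis=1)
pls = PLSRegression(n_components=5).fit(X, y)
print("calibration R^2:", pls.score(X, y))
```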
NASA Astrophysics Data System (ADS)
Jacobson, Gloria; Rella, Chris; Farinas, Alejandro
2014-05-01
Technological advancement of instrumentation in atmospheric and other geoscience disciplines over the past decade has led to a shift from discrete sample analysis to continuous, in-situ monitoring. Standard error analysis used for discrete measurements is not sufficient to assess and compare the error contribution of noise and drift from continuous-measurement instruments, and a different statistical analysis approach should be applied. The Allan standard deviation analysis technique developed for atomic clock stability assessment by David W. Allan [1] can be effectively and gainfully applied to continuous measurement instruments. As an example, P. Werle et al. have applied these techniques to signal averaging for atmospheric monitoring by Tunable Diode-Laser Absorption Spectroscopy (TDLAS) [2]. This presentation will build on and translate prior foundational publications to provide contextual definitions and guidelines for the practical application of this analysis technique to continuous scientific measurements. The specific example of a Picarro G2401 Cavity Ringdown Spectroscopy (CRDS) analyzer used for continuous atmospheric monitoring of CO2, CH4 and CO will be used to define the basic features of the Allan deviation, assess factors affecting the analysis, and explore the translation from time series to Allan deviation plot for different types of instrument noise (white noise, linear drift, and interpolated data). In addition, the application of the Allan deviation to optimizing and predicting the performance of different calibration schemes will be presented. Even though this presentation uses the specific example of the Picarro G2401 CRDS analyzer for atmospheric monitoring, the objective is to present the information such that it can be successfully applied to other instrument sets and disciplines. [1] D.W. Allan, "Statistics of Atomic Frequency Standards," Proc. IEEE, vol. 54, pp. 221-230, Feb 1966. [2] P. Werle, R. Mücke, F. Slemr, "The Limits of Signal Averaging in Atmospheric Trace-Gas Monitoring by Tunable Diode-Laser Absorption Spectroscopy (TDLAS)," Applied Physics B, 57, pp. 131-139, April 1993.
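The non-overlapping Allan deviation follows directly from its definition and fits in a few lines of NumPy; for averaging time tau = m·dt, sigma^2(tau) is half the mean squared difference of successive bin averages:

```python
import numpy as np

def allan_deviation(y, dt, m_values):
    """y: evenly sampled time series; dt: sample interval in seconds;
    m_values: bin sizes (number of samples averaged per bin)."""
    taus, adevs = [], []
    for m in m_values:
        n_bins = len(y) // m
        if n_bins < 2:
            break
        means = y[:n_bins * m].reshape(n_bins, m).mean(axis=1)
        avar = 0.5 * np.mean(np.diff(means) ** 2)   # Allan variance at tau = m*dt
        taus.append(m * dt)
        adevs.append(np.sqrt(avar))
    return np.array(taus), np.array(adevs)

# White noise gives a -1/2 slope on a log-log Allan plot; a drifting
# instrument departs upward from that line at long averaging times.
y = np.random.default_rng(0).normal(size=100_000)
taus, adevs = allan_deviation(y, dt=1.0, m_values=2 ** np.arange(12))
```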
Baldelli, Sara; Marrubini, Giorgio; Cattaneo, Dario; Clementi, Emilio; Cerea, Matteo
2017-10-01
The application of Quality by Design (QbD) principles in clinical laboratories can help to develop an analytical method through a systematic approach, providing a significant advance over the traditional heuristic and empirical methodology. In this work, we applied the QbD concept for the first time to the development of a method for drug quantification in human plasma, using elvitegravir as the test molecule. The goal of the study was to develop a fast and inexpensive quantification method, with precision and accuracy as requested by the European Medicines Agency guidelines on bioanalytical method validation. The method was divided into operative units, and for each unit critical variables affecting the results were identified. A risk analysis was performed to select critical process parameters that should be introduced into the design of experiments (DoEs). Different DoEs were used depending on the phase of advancement of the study. Protein precipitation and high-performance liquid chromatography-tandem mass spectrometry were selected as the techniques to be investigated. For every operative unit (sample preparation, chromatographic conditions, and detector settings), a model based on the factors affecting the responses was developed and optimized. The obtained method was validated and clinically applied with success. To the best of our knowledge, this is the first investigation thoroughly addressing the application of QbD to the analysis of a drug in a biological matrix in a clinical laboratory. The extensive optimization process generated a robust method compliant with its intended use. The performance of the method is continuously monitored using control charts.
An active simulator for neonatal intubation: Design, development and assessment.
Baldoli, Ilaria; Tognarelli, Selene; Vangi, Ferdinando; Panizza, Davide; Scaramuzzo, Rosa T; Cuttano, Armando; Laschi, Cecilia; Menciassi, Arianna
2017-01-01
This study describes the technical realization and the pre-clinical validation of an instrumented neonatal intubation skill trainer able to provide objective feedback for the improvement of the clinical competences required for such a delicate procedure. The Laerdal® Neonatal Intubation Trainer was modified by applying pressure sensors on areas that are mainly subject to stress and potential injuries. Punctual Force Sensing Resistors (FSRs) were characterized and fixed on the external side of the airway structure on the dental arches and epiglottis. A custom silicone tongue was designed and developed to integrate a matrix textile sensor for mapping the pressure applied on its whole surface. The assessment of the developed tool was performed by nine clinical experts who were asked to practice three intubation procedures apiece. Median and maximum forces, over-threshold events (i.e. 2 N for gingival arch sensors and 7 N for epiglottis and tongue sensors, respectively) and execution time were measured for each trainee. Data analysis from the training sessions revealed that the epiglottis is the point mainly stressed during an intubation procedure (maximum value: 16.69 N, median value: 3.11 N), while the analysis carried out on the pressure distribution on the instrumented tongue provided information on both force values and distribution, according to clinicians' performance. The debriefing phase was used to enhance the clinicians' awareness of the applied force and gestures performed, confirming that the present study is an adequate starting point for achieving and optimizing neonatal intubation skills for both residents and expert clinicians.
Linear regression analysis: part 14 of a series on evaluation of scientific publications.
Schneider, Astrid; Hommel, Gerhard; Blettner, Maria
2010-11-01
Regression analysis is an important statistical method for the analysis of medical data. It enables the identification and characterization of relationships among multiple factors. It also enables the identification of prognostically relevant risk factors and the calculation of risk scores for individual prognostication. This article is based on selected textbooks of statistics, a selective review of the literature, and our own experience. After a brief introduction of the uni- and multivariable regression models, illustrative examples are given to explain what the important considerations are before a regression analysis is performed, and how the results should be interpreted. The reader should then be able to judge whether the method has been used correctly and interpret the results appropriately. The performance and interpretation of linear regression analysis are subject to a variety of pitfalls, which are discussed here in detail. The reader is made aware of common errors of interpretation through practical examples. Both the opportunities for applying linear regression analysis and its limitations are presented.
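A small worked example of the method under review, univariable linear regression on simulated data with statsmodels (variable names and effect sizes are invented):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
age = rng.uniform(20, 80, size=100)                    # predictor
sbp = 100 + 0.6 * age + rng.normal(0, 10, size=100)    # simulated outcome

model = sm.OLS(sbp, sm.add_constant(age)).fit()
print(model.params)      # intercept and slope (mmHg per year of age)
print(model.conf_int())  # 95% confidence intervals for both coefficients
```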
Melucci, Dora; Bendini, Alessandra; Tesini, Federica; Barbieri, Sara; Zappi, Alessandro; Vichi, Stefania; Conte, Lanfranco; Gallina Toschi, Tullia
2016-08-01
At present, the geographical origin of extra virgin olive oils can be ensured by documented traceability, although chemical analysis may add information that is useful for possible confirmation. This preliminary study investigated the effectiveness of flash gas chromatography electronic nose and multivariate data analysis to perform rapid screening of commercial extra virgin olive oils characterized by a different geographical origin declared on the label. A comparison with solid-phase microextraction coupled to gas chromatography mass spectrometry was also performed. The new method is suitable for verifying the geographic origin of extra virgin olive oils based on principal component analysis and discriminant analysis applied to the volatile profile of the headspace as a fingerprint. The selected variables were suitable for discriminating between "100% Italian" and "non-100% Italian" oils. Partial least squares discriminant analysis also allowed prediction of the degree of membership of unknown samples to the classes examined.
Independent component analysis decomposition of hospital emergency department throughput measures
NASA Astrophysics Data System (ADS)
He, Qiang; Chu, Henry
2016-05-01
We present a method adapted from medical sensor data analysis, viz. independent component analysis of electroencephalography data, to health system analysis. Timely and effective care in a hospital emergency department is measured by throughput measures such as median times patients spent before they were admitted as an inpatient, before they were sent home, before they were seen by a healthcare professional. We consider a set of five such measures collected at 3,086 hospitals distributed across the U.S. One model of the performance of an emergency department is that these correlated throughput measures are linear combinations of some underlying sources. The independent component analysis decomposition of the data set can thus be viewed as transforming a set of performance measures collected at a site to a collection of outputs of spatial filters applied to the whole multi-measure data. We compare the independent component sources with the output of the conventional principal component analysis to show that the independent components are more suitable for understanding the data sets through visualizations.
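The decomposition described above reduces to a few lines with scikit-learn's FastICA; the data matrix here is a random stand-in for the hospitals-by-measures table:

```python
import numpy as np
from sklearn.decomposition import FastICA, PCA

# X: (n_hospitals, 5) throughput measures, e.g. median minutes before
# admission, before discharge home, before first professional contact, etc.
X = np.random.default_rng(0).normal(size=(3086, 5))

ica = FastICA(n_components=3, random_state=0)
S = ica.fit_transform(X)     # (n_hospitals, 3) independent component scores
A = ica.mixing_              # (5, 3) how the measures mix the sources

# Baseline for comparison with the conventional decomposition.
pca_scores = PCA(n_components=3).fit_transform(X)
```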
Measurement of Workload: Physics, Psychophysics, and Metaphysics
NASA Technical Reports Server (NTRS)
Gopher, D.
1984-01-01
The present paper reviews the results of two experiments in which workload analysis was conducted based upon performance measures, brain evoked potentials and magnitude estimations of subjective load. The three types of measures were jointly applied to the description of the behavior of subjects in a wide battery of experimental tasks. Data analysis shows both instances of association and dissociation between types of measures. A general conceptual framework and methodological guidelines are proposed to account for these findings.
NASA Technical Reports Server (NTRS)
Spiller, Olaf
1991-01-01
The provisions applied to the Airbus A340 wing wiring against lightning indirect effects are presented. The construction and installation of the wiring's shielding systems are described, and the analysis and tests performed to determine the effectiveness of the measures taken are discussed. A first evaluation of the results of the theoretical analysis together with the provisional results of tests indicate a sufficient safety margin between required and achieved protection levels.
An Evaluation of Material Properties Using EMA and FEM
NASA Astrophysics Data System (ADS)
Ďuriš, Rastislav; Labašová, Eva
2016-12-01
The main goal of the paper is the determination of material properties from experimentally measured natural frequencies. A combination of two approaches to structural dynamics testing was applied: the experimental measurements of natural frequencies were performed by Experimental Modal Analysis (EMA), and the numerical simulations were carried out by Finite Element Analysis (FEA). Optimization methods were used to determine the values of the density and elasticity modulus of a specimen based on the experimental results.
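As an illustration of the inverse identification idea under simple assumptions, the first bending frequency of a cantilever beam can be inverted for the elastic modulus; the dimensions, density, and measured frequency below are examples, not the specimen from the paper:

```python
import numpy as np

# Cantilever beam: f1 = (lambda^2 / 2*pi) * sqrt(E*I / (rho*A*L^4)),
# so a measured f1 from EMA can be solved for the elastic modulus E.
L, b, h = 0.30, 0.02, 0.004        # length, width, thickness (m)
rho = 7850.0                        # assumed density (kg/m^3)
A, I = b * h, b * h**3 / 12         # cross-section area and second moment

f1_measured = 36.0                  # first natural frequency from EMA (Hz)
lam = 1.875104                      # first cantilever eigenvalue
E = (2 * np.pi * f1_measured / lam**2) ** 2 * rho * A * L**4 / I
print(f"identified E = {E / 1e9:.1f} GPa")   # ~197 GPa for these inputs
```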
A visiting scientist program in atmospheric sciences for the Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Davis, M. H.
1989-01-01
A visiting scientist program was conducted in the atmospheric sciences and related areas at the Goddard Laboratory for Atmospheres. Research was performed in mathematical analysis as applied to computer modeling of the atmospheres; development of atmospheric modeling programs; analysis of remotely sensed atmospheric, surface, and oceanic data and its incorporation into atmospheric models; development of advanced remote sensing instrumentation; and related research areas. The specific research efforts are detailed by tasks.
Self-stress control of real civil engineering tensegrity structures
NASA Astrophysics Data System (ADS)
Kłosowska, Joanna; Obara, Paulina; Gilewski, Wojciech
2018-01-01
The paper investigates the impact of the self-stress level on the behaviour of tensegrity truss structures. Displacements of real civil engineering tensegrity structures are analysed. The full-scale Warnow Tower, a tensegrity tower consisting of six Simplex modules, is considered in this paper. Three models, consisting of one, two and six modules, are analysed. The analysis is performed using second- and third-order theory. Mathematica software and the Sofistik program are applied in the analysis.
NASA Astrophysics Data System (ADS)
Black, Joshua A.; Knowles, Peter J.
2018-06-01
The performance of quasi-variational coupled-cluster (QV) theory applied to the calculation of activation and reaction energies has been investigated. A statistical analysis of results obtained for six different sets of reactions has been carried out, and the results have been compared to those from standard single-reference methods. In general, the QV methods lead to increased activation energies and larger absolute reaction energies compared to those obtained with traditional coupled-cluster theory.
Fanali, Chiara; Tripodo, Giusy; Russo, Marina; Della Posta, Susanna; Pasqualetti, Valentina; De Gara, Laura
2018-03-22
Hazelnut kernel phenolic compounds were recovered by applying two different extraction approaches, namely ultrasound-assisted solid/liquid extraction (UA-SLE) and solid-phase extraction (SPE). Different solvents were tested, evaluating total phenolic and total flavonoid contents together with antioxidant activity. The optimum extraction conditions, in terms of the highest value of total phenolic compounds extracted, together with other parameters such as simplicity and cost, were selected for method validation and individual phenolic compound analysis. The UA-SLE protocol performed using 0.1 g of defatted sample and 15 mL of extraction solvent (1 mL methanol/1 mL water/8 mL methanol 0.1% formic acid/5 mL acetonitrile) was selected. The analysis of individual phenolic compounds in hazelnut kernels was performed by HPLC coupled with DAD and MS detection. Quantitative analysis was performed using a mixture of six phenolic compounds belonging to phenolic classes representative of hazelnut. The method was then fully validated, and the resulting RSD% values for retention time repeatability were below 1%. Good linearity was obtained, with R^2 values no lower than 0.997. The accuracy of the extraction method was also assessed. Finally, the method was applied to the analysis of phenolic compounds in three different hazelnut kernel varieties, revealing a similar qualitative profile with differences in the quantities of the detected compounds.
Terrill, Philip I; Wilson, Stephen J; Suresh, Sadasivam; Cooper, David M; Dakin, Carolyn
2012-08-01
Previous work has identified that non-linear variables calculated from respiratory data vary between sleep states, and that variables derived from the non-linear analytical tool recurrence quantification analysis (RQA) are accurate infant sleep state discriminators. This study aims to apply these discriminators to automatically classify 30 s epochs of infant sleep as REM, non-REM and wake. Polysomnograms were obtained from 25 healthy infants at 2 weeks, 3, 6 and 12 months of age, and manually sleep staged as wake, REM and non-REM. Inter-breath interval data were extracted from the respiratory inductive plethysmograph, and RQA applied to calculate radius, determinism and laminarity. Time-series statistics and spectral analysis variables were also calculated. A nested cross-validation method was used to identify the optimal feature subset, and to train and evaluate a linear discriminant analysis-based classifier. The RQA features radius and laminarity were reliably selected. Mean agreement was 79.7, 84.9, 84.0 and 79.2% at 2 weeks, 3, 6 and 12 months, and the classifier performed better than a comparison classifier not including RQA variables. The performance of this sleep-staging tool compares favourably with inter-human agreement rates, and improves upon previous systems using only respiratory data. Applications include diagnostic screening and population-based sleep research.
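A compact sketch of the recurrence quantification step (no phase-space embedding, so the paper's exact RQA settings may differ): build a recurrence matrix from inter-breath intervals and compute determinism as the fraction of recurrent points lying on diagonal lines of length >= 2:

```python
import numpy as np

def determinism(x, radius):
    """x: 1-D series (e.g. inter-breath intervals); radius: recurrence threshold."""
    R = (np.abs(x[:, None] - x[None, :]) < radius).astype(int)
    recurrent = R.sum() - len(x)              # exclude the main diagonal
    in_lines = 0
    for k in range(1, len(x)):                # scan the upper off-diagonals
        diag = np.diagonal(R, offset=k)
        run = 0
        for v in np.append(diag, 0):          # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= 2:                  # only count lines of length >= 2
                    in_lines += run
                run = 0
    # Factor 2 restores the symmetric lower triangle.
    return 2 * in_lines / recurrent if recurrent else 0.0

x = np.random.default_rng(0).normal(size=300)
print(determinism(x, radius=0.5))
```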
Guo, Long; Dou, Li-Li; Duan, Li; Liu, Ke; Bi, Zhi-Ming; Li, Ping; Liu, E-Hu
2015-09-01
Xingxiong injection (XXI) is a widely used Chinese herbal formula prepared from Folium Ginkgo extract and ligustrazine for the treatment of cardiovascular and cerebrovascular diseases. Compared with the pharmacological studies, chemical analysis and quality control studies on this formula are relatively limited. In the present study, a high performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (HPLC-QTOF MS) method was applied to the comprehensive analysis of constituents in XXI. According to fragmentation rules and previous reports, thirty ginkgo flavonoids, four ginkgo terpene lactones, and one alkaloid were identified. A high performance liquid chromatography coupled with triple quadrupole mass spectrometry (HPLC-QQQ MS) method was then applied to quantify ten major constituents in XXI. The method validation results indicated that the developed method had desirable specificity, linearity, precision and accuracy. In six batches of XXI samples, the total contents of ginkgo flavonoids were about 22.05-25.51 μg·mL(-1) and those of ginkgo terpene lactones about 4.41-8.70 μg·mL(-1). Furthermore, the cosine ratio algorithm and distance measurements were employed to evaluate the similarity of XXI samples, and the results demonstrated high quality consistency. This work provides comprehensive information for the quality control of Xingxiong injection, which may be helpful in establishing a rational quality control standard. Copyright © 2015 China Pharmaceutical University. Published by Elsevier B.V. All rights reserved.
Autotasked Performance in the NAS Workload: A Statistical Analysis
NASA Technical Reports Server (NTRS)
Carter, R. L.; Stockdale, I. E.; Kutler, Paul (Technical Monitor)
1998-01-01
A statistical analysis of the workload performance of a production quality FORTRAN code for five different Cray Y-MP hardware and system software configurations is performed. The analysis was based on an experimental procedure designed to minimize correlations between the number of requested CPUs and the time of day the runs were initiated. Observed autotasking overheads were significantly larger for the set of jobs that requested the maximum number of CPUs. UNICOS 6 releases show consistent wall clock speedups in the workload of around 2, which is quite good. The observed speedups were very similar for the set of jobs that requested 8 CPUs and the set that requested 4 CPUs. The original NAS algorithm for determining charges to the user discourages autotasking in the workload. A new charging algorithm to be applied to jobs run in the NQS multitasking queues also discourages NAS users from using autotasking: it favors jobs requesting 8 CPUs over those that request fewer, although the jobs requesting 8 CPUs experienced significantly higher overhead and presumably degraded system throughput. A charging algorithm is presented that has the following desirable characteristic when applied to the data: higher-overhead jobs requesting 8 CPUs are penalized when compared to moderate-overhead jobs requesting 4 CPUs, thereby providing a charging incentive for NAS users to use autotasking in a manner that gives them significantly improved turnaround while also maintaining system throughput.
Charehsaz, Mohammad; Gürbay, Aylin; Aydin, Ahmet; Sahin, Gönül
2014-01-01
In this study, a high-performance liquid chromatography (HPLC) method and a UV spectrophotometric method were developed, validated and applied for the determination of theophylline in biological fluids. Liquid-liquid extraction was performed to isolate the drug and eliminate plasma and saliva interferences; urine samples were analyzed without any extraction. In the HPLC method, chromatographic separation was achieved on a C18 column using 60:40 methanol:water as mobile phase under isocratic conditions at a flow rate of 0.75 mL/min, with UV detection at 280 nm. UV spectrophotometric analysis was performed at 275 nm. For the HPLC method, the limit of quantification was 1.1 µg/mL for urine, 1.9 µg/mL for saliva and 3.1 µg/mL for plasma; recovery was 94.85% for plasma, 100.45% for saliva and 101.39% for urine; intra-day precision was 0.22-2.33% and inter-day precision 3.17-13.12%. Spectrophotometric results were as follows: limit of quantitation 5.23 µg/mL for plasma and 8.7 µg/mL for urine; recovery 98.27% for plasma and 95.25% for urine; intra-day precision 2.37-3.00%; inter-day precision 5.43-7.91%. It can be concluded that this validated HPLC method is easy, precise, accurate, sensitive and selective for the determination of theophylline in biological samples. The spectrophotometric method can also be used where applicable.
NASA Astrophysics Data System (ADS)
Arif, Sajjad; Tanwir Alam, Md; Ansari, Akhter H.; Bilal Naim Shaikh, Mohd; Arif Siddiqui, M.
2018-05-01
The tribological performance of aluminium hybrid composites reinforced with micro SiC (5 wt%) and nano zirconia (0, 3, 6 and 9 wt%), fabricated through a powder metallurgy technique, was investigated using statistical and artificial neural network (ANN) approaches. The influence of zirconia reinforcement, sliding distance and applied load was analyzed with tests based on a full factorial design of experiments. Analysis of variance (ANOVA) was used to evaluate the percentage contribution of each process parameter to wear loss. The ANOVA results indicated that wear loss is mainly influenced by sliding distance, followed by zirconia reinforcement and applied load. Further, a feed-forward back-propagation neural network was applied to the input/output data for predicting and analyzing the wear behaviour of the fabricated composites, and a very close correlation between experimental and ANN outputs was achieved. Finally, the ANN model was effectively used to assess the influence of the various control factors on the wear behaviour of the hybrid composites.
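A minimal sketch, under assumed data files and network size, of a feed-forward back-propagation regressor for wear loss with the three inputs named above; it illustrates the approach, not the authors' model.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Columns: zirconia wt%, sliding distance (m), applied load (N), wear loss (mg)
    data = np.loadtxt("wear_experiments.csv", delimiter=",", skiprows=1)
    X, y = data[:, :3], data[:, 3]

    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                     max_iter=5000, random_state=0))
    model.fit(X, y)

    # Sensitivity probe: vary sliding distance with the other factors fixed.
    probe = np.array([[6.0, d, 20.0] for d in (500.0, 1000.0, 1500.0)])
    print(model.predict(probe))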
A ranking index for quality assessment of forensic DNA profiles
2010-01-01
Background Assessment of DNA profile quality is vital in forensic DNA analysis, both in order to determine the evidentiary value of DNA results and to compare the performance of different DNA analysis protocols. Generally the quality assessment is performed through manual examination of the DNA profiles based on empirical knowledge, or by comparing the intensities (allelic peak heights) of the capillary electrophoresis electropherograms. Results We recently developed a ranking index for unbiased and quantitative quality assessment of forensic DNA profiles, the forensic DNA profile index (FI) (Hedman et al. Improved forensic DNA analysis through the use of alternative DNA polymerases and statistical modeling of DNA profiles, Biotechniques 47 (2009) 951-958). FI uses electropherogram data to combine the intensities of the allelic peaks with the balances within and between loci, using Principal Components Analysis. Here we present the construction of FI. We explain the mathematical and statistical methodologies used and present details about the applied data reduction method. Thereby we show how to adapt the ranking index for any Short Tandem Repeat-based forensic DNA typing system through validation against a manual grading scale and calibration against a specific set of DNA profiles. Conclusions The developed tool provides unbiased quality assessment of forensic DNA profiles. It can be applied for any DNA profiling system based on Short Tandem Repeat markers. Apart from crime related DNA analysis, FI can therefore be used as a quality tool in paternal or familial testing as well as in disaster victim identification. PMID:21062433
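A hedged sketch of the core idea: standardize per-profile peak-height and balance features and take the first principal component as a single quality score. The feature file and column definitions are simplified assumptions, not the published FI construction.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Rows = profiles; columns = mean allelic peak height, intra-locus balance,
    # inter-locus balance (computed upstream from the electropherograms).
    features = np.loadtxt("profile_features.csv", delimiter=",", skiprows=1)

    z = StandardScaler().fit_transform(features)
    fi_scores = PCA(n_components=1).fit_transform(z).ravel()  # one score per profile
    ranking = np.argsort(fi_scores)[::-1]                     # best-ranked first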
Van Bockstaele, Femke; Janssens, Ann; Piette, Anne; Callewaert, Filip; Pede, Valerie; Offner, Fritz; Verhasselt, Bruno; Philippé, Jan
2006-07-15
ZAP-70 has been proposed as a surrogate marker for immunoglobulin heavy-chain variable region (IgV(H)) mutation status, which is known as a prognostic marker in B-cell chronic lymphocytic leukemia (CLL). The flow cytometric analysis of ZAP-70 suffers from difficulties in standardization and interpretation. We applied the Kolmogorov-Smirnov (KS) statistical test to make analysis more straightforward. We examined ZAP-70 expression by flow cytometry in 53 patients with CLL. Analysis was performed as initially described by Crespo et al. (New England J Med 2003; 348:1764-1775) and alternatively by application of the KS statistical test comparing T cells with B cells. Receiver-operating-characteristics (ROC)-curve analyses were performed to determine the optimal cut-off values for ZAP-70 measured by the two approaches. ZAP-70 protein expression was compared with ZAP-70 mRNA expression measured by a quantitative PCR (qPCR) and with the IgV(H) mutation status. Both flow cytometric analyses correlated well with the molecular technique and proved to be of equal value in predicting the IgV(H) mutation status. Applying the KS test is reproducible, simple, straightforward, and overcomes a number of difficulties encountered in the Crespo method. The KS statistical test is an essential part of the software delivered with modern routine analytical flow cytometers and is well suited for analysis of ZAP-70 expression in CLL. (c) 2006 International Society for Analytical Cytology.
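The KS-based readout can be sketched in a few lines: compare the ZAP-70 intensity distributions of gated B and T cells with a two-sample Kolmogorov-Smirnov test. Input arrays are assumed to come from upstream flow-cytometry gating; the decision cut-off would be set by ROC analysis as in the study.

    import numpy as np
    from scipy.stats import ks_2samp

    t_cells = np.load("zap70_t_cells.npy")   # per-event ZAP-70 intensities, gated T cells
    b_cells = np.load("zap70_b_cells.npy")   # per-event ZAP-70 intensities, gated B cells

    d_stat, p_value = ks_2samp(b_cells, t_cells)
    # A small D means the B-cell distribution tracks the ZAP-70-positive T-cell
    # reference; the cut-off on D would be chosen by ROC analysis.
    print("KS D = %.3f (p = %.2g)" % (d_stat, p_value))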
ERIC Educational Resources Information Center
Wolf, Katharina
2015-01-01
Industry placements are popular means to provide students with an opportunity to apply their skills, knowledge and experience in a "real world" setting. Within this context, supervisor feedback allows educators to measure students' performance beyond academic objectives, by benchmarking it against industry expectations. However, industry…
In this study, the concept of scale analysis is applied to evaluate two state-of-science meteorological models, namely MM5 and RAMS3b, currently being used to drive regional-scale air quality models. To this end, seasonal time series of observations and predictions for temperatur...
40 CFR 1065.275 - N2O measurement devices.
Code of Federal Regulations, 2010 CFR
2010-07-01
... for interpretation of infrared spectra. For example, EPA Test Method 320 is considered a valid method... and length to achieve adequate resolution of the N2O peak for analysis. Examples of acceptable columns....550(b) that would otherwise apply. For example, you may perform a span gas measurement before and...
40 CFR 1065.275 - N2O measurement devices.
Code of Federal Regulations, 2011 CFR
2011-07-01
... for interpretation of infrared spectra. For example, EPA Test Method 320 is considered a valid method... and length to achieve adequate resolution of the N2O peak for analysis. Examples of acceptable columns....550(b) that would otherwise apply. For example, you may perform a span gas measurement before and...
The Role of Group Interaction in Collective Efficacy and CSCL Performance
ERIC Educational Resources Information Center
Wang, Shu-Ling; Hsu, Hsien-Yuan; Lin, Sunny S. J.; Hwang, Gwo-Jen
2014-01-01
Although research has identified the importance of interaction behaviors in computer-supported collaborative learning (CSCL), very few attempts have been made to carry out in-depth analysis of interaction behaviors. This study thus applies both qualitative (e.g., content analyses, interviews) and quantitative methods in an attempt to investigate…
10 CFR 431.445 - Determination of small electric motor efficiency.
Code of Federal Regulations, 2010 CFR
2010-01-01
... determined either by testing in accordance with § 431.444 of this subpart, or by application of an... method. An AEDM applied to a basic model must be: (i) Derived from a mathematical model that represents... statistical analysis, computer simulation or modeling, or other analytic evaluation of performance data. (3...
The Relationship among Correct and Error Oral Reading Rates and Comprehension.
ERIC Educational Resources Information Center
Roberts, Michael; Smith, Deborah Deutsch
1980-01-01
Eight learning disabled boys (10 to 12 years old) who were seriously deficient in both their oral reading and comprehension performances participated in the study which investigated, through an applied behavior analysis model, the interrelationships of three reading variables--correct oral reading rates, error oral reading rates, and percentage of…
The Stability of Rater Severity in Large-Scale Assessment Programs.
ERIC Educational Resources Information Center
Congdon, Peter J.; McQueen, Joy
2000-01-01
Studied the stability of rater severity over an extended rating period by applying multifaceted Rasch analysis to ratings by 16 raters of writing performances of 8,285 elementary school students. Findings cast doubt on the practice of using a single calibration of rater severity as the basis for adjustment of person measures. (SLD)
Analyzing seasonal patterns of wildfire exposure factors in Sardinia, Italy
Michele Salis; Alan A. Ager; Fermin J. Alcasena; Bachisio Arca; Mark A. Finney; Grazia Pellizzaro; Donatella Spano
2015-01-01
In this paper, we applied landscape scale wildfire simulation modeling to explore the spatiotemporal patterns of wildfire likelihood and intensity in the island of Sardinia (Italy). We also performed wildfire exposure analysis for selected highly valued resources on the island to identify areas characterized by high risk. We observed substantial variation in burn...
ERIC Educational Resources Information Center
Texas State Technical Coll., Waco.
A project was conducted to determine if interactive video programs could produce positive results in literacy programs. During the project, staff from a technical college developed a task analysis, curriculum, and evaluation measures for the training of facilities maintenance workers in mathematical concepts. From this activity, an instructional…
ERIC Educational Resources Information Center
Kalayci, Nurdan; Cimen, Orhan
2012-01-01
The aim of this study is to examine the questionnaires used to evaluate teaching performance in higher education institutes and called "Instructor and Course Evaluation Questionnaires (ICEQ)" in terms of questionnaire preparation techniques and components of curriculum. Obtaining at least one ICEQ belonging to any state and private…
Informing Instruction of Students with Autism in Public School Settings
ERIC Educational Resources Information Center
Kuo, Nai-Cheng
2016-01-01
The number of applied behavior analysis (ABA) classrooms for students with autism is increasing in K-12 public schools. To inform instruction of students with autism in public school settings, this study examined the relation between performance on mastery learning assessments and standardized achievement tests for students with autism spectrum…
ERIC Educational Resources Information Center
Brackney, Ryan J.; Cheung, Timothy H. C.; Neisewander, Janet L.; Sanabria, Federico
2011-01-01
Dissociating motoric and motivational effects of pharmacological manipulations on operant behavior is a substantial challenge. To address this problem, we applied a response-bout analysis to data from rats trained to lever press for sucrose on variable-interval (VI) schedules of reinforcement. Motoric, motivational, and schedule factors (effort…
Using Home Irrigation Users' Perceptions to Inform Water Conservation Programs
ERIC Educational Resources Information Center
Warner, Laura A.; Chaudhary, Anil Kumar; Lamm, Alexa J.; Rumble, Joy N.; Momol, Esen
2017-01-01
Targeted agricultural education programs can play a role in solving complex water issues. This article applies importance-performance analysis to examine dimensions of water resources that may inform local water conservation campaigns in the United States. The purpose of this study was to generate a deep understanding of home irrigation users'…
Automating Risk Analysis of Software Design Models
Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.
2014-01-01
The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
Automating risk analysis of software design models.
Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P
2014-01-01
The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
Guidelines for reporting and using prediction tools for genetic variation analysis.
Vihinen, Mauno
2013-02-01
Computational prediction methods are widely used for the analysis of human genome sequence variants and their effects on gene/protein function, splice site aberration, pathogenicity, and disease risk. New methods are frequently developed. We believe that guidelines are essential for those writing articles about new prediction methods, as well as for those applying these tools in their research, so that the necessary details are reported. This will enable readers to gain the full picture of technical information, performance, and interpretation of results, and to facilitate comparisons of related methods. Here, we provide instructions on how to describe new methods, report datasets, and assess the performance of predictive tools. We also discuss what details of predictor implementation are essential for authors to understand. Similarly, these guidelines for the use of predictors provide instructions on what needs to be delineated in the text, as well as how researchers can avoid unwarranted conclusions. They are applicable to most prediction methods currently utilized. By applying these guidelines, authors will help reviewers, editors, and readers to more fully comprehend prediction methods and their use. © 2012 Wiley Periodicals, Inc.
Reliability considerations for the total strain range version of strainrange partitioning
NASA Technical Reports Server (NTRS)
Wirsching, P. H.; Wu, Y. T.
1984-01-01
A proposed total strainrange version of strainrange partitioning (SRP) to enhance the manner in which SRP is applied to life prediction is considered, with emphasis on how advanced reliability technology can be applied to perform risk analysis and to derive safety check expressions. Uncertainties existing in the design factors associated with life prediction of a component which experiences the combined effects of creep and fatigue can be identified. Examples illustrate how reliability analyses of such a component can be performed when all design factors in the SRP model are random variables reflecting these uncertainties. The Rackwitz-Fiessler and Wu algorithms are used, and estimates of the safety index and the probability of failure are demonstrated for a SRP problem. Methods of analysis of creep-fatigue data, with emphasis on procedures for producing synoptic statistics, are presented. The importance of the contribution of the uncertainties associated with small sample sizes (fatigue data) to risk estimates is also demonstrated. The procedure for deriving a safety check expression for possible use in a design criteria document is presented.
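As a hedged illustration of the risk-analysis step (not the paper's Rackwitz-Fiessler implementation), a crude Monte Carlo estimate of the probability of failure and the corresponding safety index for a creep-fatigue limit state with assumed lognormal design factors:

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    n = 1_000_000
    # Assumed medians and log-standard deviations, for illustration only.
    cycles_to_failure = rng.lognormal(np.log(1.0e4), 0.5, n)  # capacity
    applied_cycles    = rng.lognormal(np.log(1.0e3), 0.3, n)  # demand

    p_f = np.mean(cycles_to_failure <= applied_cycles)  # probability of failure
    beta = -norm.ppf(p_f)                               # equivalent safety index
    print("P_f = %.2e, beta = %.2f" % (p_f, beta))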
Enhancing coronary Wave Intensity Analysis robustness by high order central finite differences.
Rivolo, Simone; Asrress, Kaleab N; Chiribiri, Amedeo; Sammut, Eva; Wesolowski, Roman; Bloch, Lars Ø; Grøndal, Anne K; Hønge, Jesper L; Kim, Won Y; Marber, Michael; Redwood, Simon; Nagel, Eike; Smith, Nicolas P; Lee, Jack
2014-09-01
Coronary Wave Intensity Analysis (cWIA) is a technique capable of separating the effects of proximal arterial haemodynamics from cardiac mechanics. Studies have identified WIA-derived indices that are closely correlated with several disease processes and predictive of functional recovery following myocardial infarction. The cWIA clinical application has, however, been limited by technical challenges including a lack of standardization across different studies and the derived indices' sensitivity to the processing parameters. Specifically, a critical step in WIA is the noise removal for evaluation of derivatives of the acquired signals, typically performed by applying a Savitzky-Golay filter, to reduce the high frequency acquisition noise. The impact of the filter parameter selection on cWIA output, and on the derived clinical metrics (integral areas and peaks of the major waves), is first analysed. The sensitivity analysis is performed either by using the filter as a differentiator to calculate the signals' time derivative or by applying the filter to smooth the ensemble-averaged waveforms. Furthermore, the power-spectrum of the ensemble-averaged waveforms contains little high-frequency components, which motivated us to propose an alternative approach to compute the time derivatives of the acquired waveforms using a central finite difference scheme. The cWIA output and consequently the derived clinical metrics are significantly affected by the filter parameters, irrespective of its use as a smoothing filter or a differentiator. The proposed approach is parameter-free and, when applied to the 10 in-vivo human datasets and the 50 in-vivo animal datasets, enhances the cWIA robustness by significantly reducing the outcome variability (by 60%).
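The two derivative estimators at issue can be sketched side by side: a Savitzky-Golay differentiator, whose output depends on window length and polynomial order, against a fourth-order central finite difference with no tunable parameters. The signal and sampling rate below are placeholders.

    import numpy as np
    from scipy.signal import savgol_filter

    fs = 1000.0                        # sampling rate in Hz (assumed)
    t = np.arange(0.0, 1.0, 1.0 / fs)
    p = np.sin(2 * np.pi * 2 * t)      # stand-in for an ensemble-averaged waveform

    # Savitzky-Golay as a differentiator: output depends on window and order.
    dp_sg = savgol_filter(p, window_length=51, polyorder=3, deriv=1, delta=1.0 / fs)

    # Fourth-order central finite difference: parameter-free (endpoints wrap
    # with np.roll and should be discarded in practice).
    h = 1.0 / fs
    dp_cd = (-np.roll(p, -2) + 8 * np.roll(p, -1)
             - 8 * np.roll(p, 1) + np.roll(p, 2)) / (12 * h)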
A framework for characterizing eHealth literacy demands and barriers.
Chan, Connie V; Kaufman, David R
2011-11-17
Consumer eHealth interventions are of a growing importance in the individual management of health and health behaviors. However, a range of access, resources, and skills barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy names a set of skills and knowledge that are essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies, and communicating health concepts effectively. We propose a theoretical and methodological framework for characterizing complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user as well as of 20 users on the same task to illustrate both the detailed analysis and the aggregate measures obtained and potential analyses that can be performed using this method. The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase accuracy of predictions. The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum.
A Framework for Characterizing eHealth Literacy Demands and Barriers
Chan, Connie V
2011-01-01
Background Consumer eHealth interventions are of a growing importance in the individual management of health and health behaviors. However, a range of access, resources, and skills barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy names a set of skills and knowledge that are essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies, and communicating health concepts effectively. Objective We propose a theoretical and methodological framework for characterizing complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. Methods We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user as well as of 20 users on the same task to illustrate both the detailed analysis and the aggregate measures obtained and potential analyses that can be performed using this method. Results The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase accuracy of predictions. Conclusions The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum. PMID:22094891
NASA Astrophysics Data System (ADS)
Peiris, T. S. G.; Nanayakkara, K. A. D. S. A.
2017-09-01
Mathematics plays a key role in the engineering sciences, as it helps develop the intellectual maturity and analytical thinking of engineering students, and exploring student academic performance has received great attention recently. The lack of control over covariates motivates the need for their adjustment when measuring the degree of association between two sets of variables in canonical correlation analysis (CCA). Thus, to examine the individual effects of mathematics in Level 1 and Level 2 on engineering performance in Level 2, two adjusted analyses in CCA, part CCA and partial CCA, were applied to the raw marks of engineering undergraduates in three different disciplines at the Faculty of Engineering, University of Moratuwa, Sri Lanka. The joint influence of mathematics in Level 1 and Level 2 on engineering performance in Level 2 is significant irrespective of the engineering discipline. The individual effect of mathematics in Level 2 is significantly higher than that of mathematics in Level 1, and the individual effect of mathematics in Level 1 is negligible; there would, however, be a notable indirect effect of mathematics in Level 1 on engineering performance in Level 2. It can be concluded that the joint effect of mathematics in Level 1 and Level 2 is immensely beneficial for improving overall academic performance at the end of Level 2, and the impact of mathematics varies among engineering disciplines. As part CCA and partial CCA are not widely explored in applied work, it is recommended to use these techniques in various applications.
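A hedged sketch of partial CCA as used here: residualize both variable sets on the covariate block (Level 1 mathematics marks) by least squares, then run ordinary CCA on the residuals. Data files and column layouts are assumptions.

    import numpy as np
    from sklearn.cross_decomposition import CCA
    from sklearn.linear_model import LinearRegression

    X = np.load("level2_math_marks.npy")   # set 1: Level 2 mathematics modules
    Y = np.load("level2_eng_marks.npy")    # set 2: Level 2 engineering modules
    Z = np.load("level1_math_marks.npy")   # covariate block to partial out

    def residualize(A, Z):
        """Remove the part of A linearly explained by Z."""
        return A - LinearRegression().fit(Z, A).predict(Z)

    U, V = CCA(n_components=1).fit_transform(residualize(X, Z), residualize(Y, Z))
    print("first partial canonical correlation:",
          np.corrcoef(U[:, 0], V[:, 0])[0, 1])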
Cluster Correspondence Analysis.
van de Velden, M; D'Enza, A Iodice; Palumbo, F
2017-03-01
A method is proposed that combines dimension reduction and cluster analysis for categorical data by simultaneously assigning individuals to clusters and optimal scaling values to categories in such a way that a single between variance maximization objective is achieved. In a unified framework, a brief review of alternative methods is provided and we show that the proposed method is equivalent to GROUPALS applied to categorical data. Performance of the methods is appraised by means of a simulation study. The results of the joint dimension reduction and clustering methods are compared with the so-called tandem approach, a sequential analysis of dimension reduction followed by cluster analysis. The tandem approach is conjectured to perform worse when variables are added that are unrelated to the cluster structure. Our simulation study confirms this conjecture. Moreover, the results of the simulation study indicate that the proposed method also consistently outperforms alternative joint dimension reduction and clustering methods.
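For concreteness, a sketch of the tandem baseline the study argues against: dimension reduction on indicator-coded categorical data followed by k-means on the reduced scores. The input file is an assumption; the proposed joint method itself is not reproduced here.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import OneHotEncoder

    cats = np.load("categorical_data.npy", allow_pickle=True)  # (n, p) categories
    dummies = OneHotEncoder().fit_transform(cats).toarray()    # indicator coding

    scores = PCA(n_components=2).fit_transform(dummies)           # step 1: reduce
    labels = KMeans(n_clusters=3, n_init=10).fit_predict(scores)  # step 2: cluster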
NASA Technical Reports Server (NTRS)
Chatterjee, Sharmista
1993-01-01
Our first goal in this project was to perform a systems analysis of a closed-loop Environmental Control and Life Support System (ECLSS). This pertains to the development of a model of an existing real system from which to assess the state or performance of that system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulator tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. Use is made of the second law of thermodynamics to determine the irreversibility (energy loss) of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
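A minimal sketch, assuming steady state, of the second-law bookkeeping involved: the entropy generation of a component from its inlet and outlet streams and boundary heat exchange. The numbers are placeholders, not ECLSS data.

    def entropy_generation(streams_in, streams_out, q_dot=0.0, t_boundary=298.15):
        """streams_*: lists of (mass_flow kg/s, specific_entropy J/(kg K)) tuples.
        q_dot: heat received by the component (W) across a boundary at t_boundary (K).
        Returns S_gen in W/K; a larger value means a less reversible component."""
        s_out = sum(m * s for m, s in streams_out)
        s_in = sum(m * s for m, s in streams_in)
        return s_out - s_in - q_dot / t_boundary

    # Example: one stream through a component rejecting 500 W at 298.15 K.
    print(entropy_generation([(0.01, 3500.0)], [(0.01, 3400.0)], q_dot=-500.0))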
NASA Astrophysics Data System (ADS)
Feng, Jianjun; Li, Chengzhe; Wu, Zhi
2017-08-01
As an important part of the valve opening and closing mechanism in an engine, the camshaft has high machining accuracy requirements. Taking the high-speed camshaft grinder spindle system as the research object and spindle system performance as the optimization target, this paper first uses SolidWorks to establish a three-dimensional finite element model (FEM) of the spindle system, then conducts static and modal analyses of the established FEM in ANSYS Workbench, and finally uses the design optimization function of ANSYS Workbench to optimize the structural parameters of the spindle system. The results show that the design of the spindle system fully meets production requirements and that the performance of the optimized spindle system is improved. This paper thus provides an analysis and optimization method for other grinder spindle systems.
Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.
Gao, Yi; Bouix, Sylvain
2016-05-01
Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two population of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on the volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied on real shape data sets in brain structures. Copyright © 2016 Elsevier B.V. All rights reserved.
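A hedged two-dimensional analogue (not the authors' 3-D implementation) of the Poisson step: solve a Poisson equation on a binary shape mask by Jacobi iteration with a zero boundary, one of the two solves from which a signed map could be assembled.

    import numpy as np

    def poisson_on_mask(mask, rhs=1.0, n_iter=5000):
        """Solve -laplace(u) = rhs inside a 2-D boolean mask (u = 0 outside),
        unit grid spacing, by Jacobi iteration."""
        u = np.zeros(mask.shape)
        for _ in range(n_iter):
            nb = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                  np.roll(u, 1, 1) + np.roll(u, -1, 1))
            u = np.where(mask, (nb + rhs) / 4.0, 0.0)
        return u

    yy, xx = np.mgrid[0:64, 0:64]
    disk = (xx - 32) ** 2 + (yy - 32) ** 2 < 20 ** 2
    u = poisson_on_mask(disk)   # the solution peaks near the shape's centre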
High-performance parallel analysis of coupled problems for aircraft propulsion
NASA Technical Reports Server (NTRS)
Felippa, C. A.; Farhat, C.; Lanteri, S.; Maman, N.; Piperno, S.; Gumaste, U.
1994-01-01
This research program deals with the application of high-performance computing methods to the analysis of complete jet engines. We initiated this program by applying the two-dimensional parallel aeroelastic codes to the interior gas flow problem of a bypass jet engine. The fluid mesh generation, domain decomposition, and solution capabilities were successfully tested. We then focused attention on methodology for the partitioned analysis of the interaction of the gas flow with a flexible structure and with the fluid mesh motion that results from these structural displacements. This is treated by a new arbitrary Lagrangian-Eulerian (ALE) technique that models the fluid mesh motion as that of a fictitious mass-spring network. New partitioned analysis procedures to treat this coupled three-component problem are developed. These procedures involve delayed corrections and subcycling. Preliminary results on the stability, accuracy, and MPP computational efficiency are reported.
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.
1998-01-01
The use of response surface models and kriging models are compared for approximating non-random, deterministic computer analyses. After discussing the traditional response surface approach for constructing polynomial models for approximation, kriging is presented as an alternative statistical-based approximation method for the design and analysis of computer experiments. Both approximation methods are applied to the multidisciplinary design and analysis of an aerospike nozzle which consists of a computational fluid dynamics model and a finite element analysis model. Error analysis of the response surface and kriging models is performed along with a graphical comparison of the approximations. Four optimization problems are formulated and solved using both approximation models. While neither approximation technique consistently outperforms the other in this example, the kriging models using only a constant for the underlying global model and a Gaussian correlation function perform as well as the second order polynomial response surface models.
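The two approximation types can be sketched on a common toy sample: a second-order polynomial response surface and a kriging (Gaussian process) model with a constant trend and Gaussian correlation. The test function below stands in for the expensive CFD/FEA analyses.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    rng = np.random.default_rng(1)
    X = rng.uniform(-2, 2, size=(30, 2))        # sampled design points
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2    # stand-in "analysis" output

    rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)
    krig = GaussianProcessRegressor(kernel=ConstantKernel() * RBF()).fit(X, y)

    x_new = np.array([[0.5, -1.0]])
    print("response surface:", rsm.predict(x_new))
    print("kriging:         ", krig.predict(x_new))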
Pellegrini, Manuela; Rotolo, Maria Concetta; Busardò, Francesco Paolo; Pacifici, Roberta; Pichini, Simona
2017-01-01
Background: Recently, a large number of physical and sexual performance-enhancing products have started to be freely sold, mainly on internet web sites, as dietary supplements. However, there is a high suspicion that pharmacologically active substances, prohibited in these products, may be present to provide the expected effect. Methods: A simple and rapid systematic toxicological analysis by gas chromatography-mass spectrometry and liquid chromatography-tandem mass spectrometry was applied after a liquid-liquid extraction at acidic, neutral and alkaline pH with chloroform-isopropanol (9:1 v/v). The assays were validated in the range from 10 mg to 250 mg/g product, showing good linearity for the calibration curves (r² ≥ 0.99). Mean extraction recoveries of analytes from the different products were always higher than 90%, and intra-assay and inter-assay precision and accuracy were always better than 15%. Results: The developed method was applied to the analysis of products with a high percentage of sales on websites and in smart shops and sex shops. In twelve of eighty supplements, anabolic steroids, anti-estrogenic drugs, psychoactive substances, and sildenafil and its analogs were identified and quantified. Conclusion: Potential health hazards caused by the hidden presence of pharmacologically active substances in physical and sexual performance-enhancing products are reported. PMID:27799033
Solid oxide fuel cell simulation and design optimization with numerical adjoint techniques
NASA Astrophysics Data System (ADS)
Elliott, Louie C.
This dissertation reports on the application of numerical optimization techniques as applied to fuel cell simulation and design. Due to the "multi-physics" inherent in a fuel cell, which results in a highly coupled and non-linear behavior, an experimental program to analyze and improve the performance of fuel cells is extremely difficult. This program applies new optimization techniques with computational methods from the field of aerospace engineering to the fuel cell design problem. After an overview of fuel cell history, importance, and classification, a mathematical model of solid oxide fuel cells (SOFC) is presented. The governing equations are discretized and solved with computational fluid dynamics (CFD) techniques including unstructured meshes, non-linear solution methods, numerical derivatives with complex variables, and sensitivity analysis with adjoint methods. Following the validation of the fuel cell model in 2-D and 3-D, the results of the sensitivity analysis are presented. The sensitivity derivative for a cost function with respect to a design variable is found with three increasingly sophisticated techniques: finite difference, direct differentiation, and adjoint. A design cycle is performed using a simple optimization method to improve the value of the implemented cost function. The results from this program could improve fuel cell performance and lessen the world's dependence on fossil fuels.
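The complex-variable derivative mentioned above is simple to illustrate: f'(x) ≈ Im(f(x + ih))/h, which avoids the subtractive cancellation of finite differences. The cost function here is a toy placeholder, not the SOFC model.

    import numpy as np

    def f(x):
        return x * np.exp(x) / (np.sin(x) ** 2 + 1.0)   # any analytic function

    x0, h = 1.3, 1e-30
    d_cs = np.imag(f(x0 + 1j * h)) / h                  # complex-step derivative
    d_fd = (f(x0 + 1e-6) - f(x0 - 1e-6)) / 2e-6         # central finite difference
    print(d_cs, d_fd)   # the complex step is accurate to machine precision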
Chen, Karen Hui-Jung
2017-12-01
Evaluation capacity building (ECB) is a context-dependent process. Contextual factors affecting ECB implementation have been explored theoretically and practically, but their influence within a changing environment has seldom been discussed. This study examined essential context-sensitive parameters, particularly those involved in implementing new governmental policies regarding higher education. Taiwan was used as a case study for exploring the effect of contextual change on ECB attributes from the perspectives of training receivers and providers. Surveys and interviews were used for data collection, and importance-performance analysis was applied for data analysis. Four prominent features were identified. First, the ECB attributes perceived as important by receivers were performed adequately, whereas those perceived as less important were performed less well. Second, under the new policies, training providers designed training covering a wide range of ECB, whereas receivers focused on skills that can be directly applied in the evaluation process. Third, in a small education system such as Taiwan's, the complexity of peer review is high and ethical issues become important. Fourth, because the evaluation structure has been changed from single- to dual-track, receivers expect more training for institution staff, whereas providers insist on hierarchical training. Aligning ECB supply and needs is paramount for adaptation to new policies. Copyright © 2017 Elsevier Ltd. All rights reserved.
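A minimal sketch of the importance-performance analysis step, under assumed ratings files: each attribute is placed in a quadrant according to whether its mean importance and mean performance fall above or below the grand means.

    import numpy as np

    imp = np.load("importance_ratings.npy")    # (n_respondents, n_attributes)
    perf = np.load("performance_ratings.npy")
    mean_imp, mean_perf = imp.mean(axis=0), perf.mean(axis=0)

    quadrants = {(True, True): "keep up the good work",
                 (True, False): "concentrate here",
                 (False, True): "possible overkill",
                 (False, False): "low priority"}
    for k, (i, p) in enumerate(zip(mean_imp, mean_perf)):
        key = (i >= mean_imp.mean(), p >= mean_perf.mean())
        print("attribute %d: %s" % (k, quadrants[key]))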
Allan Cheyne, J; Solman, Grayden J F; Carriere, Jonathan S A; Smilek, Daniel
2009-04-01
We present arguments and evidence for a three-state attentional model of task engagement/disengagement. The model postulates three states of mind-wandering: occurrent task inattention, generic task inattention, and response disengagement. We hypothesize that all three states are both causes and consequences of task performance outcomes and apply across a variety of experimental and real-world tasks. We apply this model to the analysis of a widely used GO/NOGO task, the Sustained Attention to Response Task (SART). We identify three performance characteristics of the SART that map onto the three states of the model: RT variability, anticipations, and omissions. Predictions based on the model are tested, and largely corroborated, via regression and lag-sequential analyses of both successful and unsuccessful withholding on NOGO trials as well as self-reported mind-wandering and everyday cognitive errors. The results revealed theoretically consistent temporal associations among the state indicators and between these and SART errors as well as with self-report measures. Lag analysis was consistent with the hypotheses that temporal transitions among states are often extremely abrupt and that the association between mind-wandering and performance is bidirectional. The bidirectional effects suggest that errors constitute important occasions for reactive mind-wandering. The model also enables concrete phenomenological, behavioral, and physiological predictions for future research.
Matysiak, W; Królikowska-Prasał, I; Staszyc, J; Kifer, E; Romanowska-Sarlej, J
1989-01-01
The studies were performed on 44 white female Wistar rats which were intratracheally administered suspensions of soil dust and electro-energetic ashes. The electro-energetic ashes were collected from 6 different local heat and power generating plants, while the soil dust was collected from several randomly chosen places in the country. Statistical analysis of the body and lung mass of the animals subjected to a single dust or ash insufflation was performed. The applied variants showed statistically significant differences in body and lung mass. The observed differences are connected with the kinds of dust and ash used in the experiment.
NASA Technical Reports Server (NTRS)
Kwon, Youngwoo; Pavlidis, Dimitris; Tutt, Marcel N.
1991-01-01
A large-signal analysis method based on a harmonic balance technique and a 2-D cubic spline interpolation function has been developed and applied to the prediction of InP-based HEMT oscillator performance for frequencies extending up to the submillimeter-wave range. The large-signal analysis method uses a limited number of DC and small-signal S-parameter data and allows the accurate characterization of HEMT large-signal behavior. The method has been validated experimentally using load-pull measurements. Oscillation frequency, power performance, and load requirements are discussed, with an operation capability of 300 GHz predicted using state-of-the-art devices (fmax ≈ 450 GHz).
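The 2-D cubic-spline ingredient can be sketched as follows: interpolate a sparse grid of DC characteristics, e.g., drain current over (Vgs, Vds), so a harmonic-balance solver can evaluate the nonlinearity continuously. The current values are placeholders, not measured HEMT data.

    import numpy as np
    from scipy.interpolate import RectBivariateSpline

    vgs = np.linspace(-1.0, 0.2, 7)      # gate bias grid (V)
    vds = np.linspace(0.0, 2.0, 9)       # drain bias grid (V)
    # Toy drain-current surface standing in for measured DC data.
    ids = np.maximum(0.0, vgs[:, None] + 0.8) * np.tanh(3.0 * vds[None, :])

    spline = RectBivariateSpline(vgs, vds, ids, kx=3, ky=3)
    print(spline(-0.35, 1.2))            # current at an off-grid bias point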
Iris recognition based on robust principal component analysis
NASA Astrophysics Data System (ADS)
Karn, Pradeep; He, Xiao Hai; Yang, Shuai; Wu, Xiao Hong
2014-11-01
Iris images acquired under different conditions often suffer from blur, occlusion due to eyelids and eyelashes, specular reflection, and other artifacts. Existing iris recognition systems do not perform well on these types of images. To overcome these problems, we propose an iris recognition method based on robust principal component analysis. The proposed method decomposes all training images into a low-rank matrix and a sparse error matrix, where the low-rank matrix is used for feature extraction. The sparsity concentration index approach is then applied to validate the recognition result. Experimental results using the CASIA V4 and IIT Delhi V1 iris image databases showed that the proposed method achieved competitive performance in both recognition accuracy and computational efficiency.
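A compact sketch of one standard robust PCA solver (principal component pursuit by inexact ALM), shown as an illustration rather than the paper's implementation; training images would be stacked as the columns of M.

    import numpy as np

    def rpca(M, n_iter=200):
        """Decompose M into low-rank L plus sparse S (inexact ALM)."""
        m, n = M.shape
        lam = 1.0 / np.sqrt(max(m, n))
        mu = m * n / (4.0 * np.abs(M).sum())
        Y = np.zeros_like(M)
        S = np.zeros_like(M)
        for _ in range(n_iter):
            # Low-rank update: singular value thresholding.
            U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
            L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
            # Sparse update: elementwise soft thresholding.
            R = M - L + Y / mu
            S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
            Y += mu * (M - L - S)
        return L, S

    # Usage: L, S = rpca(images_as_columns); features are then taken from L.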
Local and Global Gestalt Laws: A Neurally Based Spectral Approach.
Favali, Marta; Citti, Giovanna; Sarti, Alessandro
2017-02-01
This letter presents a mathematical model of figure-ground articulation that takes into account both local and global gestalt laws and is compatible with the functional architecture of the primary visual cortex (V1). The local gestalt law of good continuation is described by means of suitable connectivity kernels that are derived from Lie group theory and quantitatively compared with long-range connectivity in V1. Global gestalt constraints are then introduced in terms of spectral analysis of a connectivity matrix derived from these kernels. This analysis performs grouping of local features and individuates perceptual units with the highest salience. Numerical simulations are performed, and results are obtained by applying the technique to a number of stimuli.
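A toy sketch of the spectral grouping step: build a connectivity (affinity) matrix over local elements and read the most salient perceptual unit off the leading eigenvector. The Gaussian affinity here is a simplification of the Lie-group kernels in the letter.

    import numpy as np

    pos = np.random.default_rng(2).uniform(0, 1, size=(40, 2))  # element positions
    d2 = ((pos[:, None, :] - pos[None, :, :]) ** 2).sum(-1)
    A = np.exp(-d2 / 0.02)                  # symmetric connectivity matrix

    vals, vecs = np.linalg.eigh(A)          # real spectrum (A is symmetric)
    salience = np.abs(vecs[:, -1])          # leading eigenvector
    unit = salience > salience.mean()       # elements of the most salient group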