Model-Based Linkage Analysis of a Quantitative Trait.
Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H
2017-01-01
Linkage analysis is a family-based method for examining whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but it has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because, in addition to performing linkage analysis, it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best-fitting statistical model for the trait.
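For readers unfamiliar with the underlying statistic, the sketch below shows the two-point LOD computation on which programs such as LODLINK are built; it is a minimal illustration, assuming the pedigree likelihoods have already been computed elsewhere, and is not S.A.G.E. code.

```python
import math

def lod_score(likelihood_theta: float, likelihood_half: float) -> float:
    """Two-point LOD: log10 of the pedigree likelihood at a candidate
    recombination fraction theta over the likelihood at theta = 0.5
    (no linkage). A LOD of 3 or more is the classical evidence threshold."""
    return math.log10(likelihood_theta / likelihood_half)

# Hypothetical pedigree likelihoods evaluated at theta = 0.1 and theta = 0.5
print(lod_score(likelihood_theta=2.4e-12, likelihood_half=1.1e-15))
```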
NASA Technical Reports Server (NTRS)
1972-01-01
The tug design and performance data base for the economic analysis of space tug operation are presented. A compendium of the detailed design and performance information from the data base is developed. The design data are parametric across a range of reusable space tug sizes. The performance curves are generated for selected point designs of expendable orbit injection stages and reusable tugs. Data are presented in the form of graphs for various modes of operation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Chavarría-Miranda, Daniel
Contingency analysis is a key function in the Energy Management System (EMS) to assess the impact of various combinations of power system component failures based on state estimation. Contingency analysis is also extensively used in power market operation for feasibility test of market solutions. High performance computing holds the promise of faster analysis of more contingency cases for the purpose of safe and reliable operation of today’s power grids with less operating margin and more intermittent renewable energy sources. This paper evaluates the performance of counter-based dynamic load balancing schemes for massive contingency analysis under different computing environments. Insights from the performance evaluation can be used as guidance for users to select suitable schemes in the application of massive contingency analysis. Case studies, as well as MATLAB simulations, of massive contingency cases using the Western Electricity Coordinating Council power grid model are presented to illustrate the application of high performance computing with counter-based dynamic load balancing schemes.
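As a rough illustration of the counter-based scheme the paper evaluates, the sketch below lets workers claim the next contingency case by atomically bumping a shared counter, so faster workers naturally process more cases; the case solver and all sizes are placeholders, not the paper's implementation.

```python
import multiprocessing as mp

def solve_contingency(case: int) -> float:
    """Placeholder for one contingency power-flow solve (hypothetical)."""
    return float(case)

def worker(counter, lock, n_cases, results):
    # Claim cases by atomically bumping the shared counter: dynamic
    # load balancing, since faster workers simply claim more cases.
    while True:
        with lock:
            case = counter.value
            if case >= n_cases:
                return
            counter.value += 1
        results[case] = solve_contingency(case)

if __name__ == "__main__":
    n_cases = 1000
    counter = mp.Value("i", 0)   # the shared case counter
    lock = mp.Lock()
    results = mp.Array("d", n_cases)
    procs = [mp.Process(target=worker, args=(counter, lock, n_cases, results))
             for _ in range(8)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print("cases solved:", counter.value)
```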
Performance analysis of mini-propellers based on FlightGear
NASA Astrophysics Data System (ADS)
Vogeltanz, Tomáš
2016-06-01
This paper presents a performance analysis of three mini-propellers based on the FlightGear flight simulator. Although a basic propeller analysis has to be performed before the use of FlightGear, for a complex and more practical performance analysis, it is advantageous to use a propeller model in cooperation with a particular aircraft model. This approach may determine whether the propeller has sufficient quality in respect of aircraft requirements. In the first section, the software used for the analysis is illustrated. Then, the parameters of the analyzed mini-propellers and the tested UAV are described. Finally, the main section shows and discusses the results of the performance analysis of the mini-propellers.
Analysis of swimming performance: perceptions and practices of US-based swimming coaches.
Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid
2016-01-01
In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using these methods at least monthly, and analyses are mainly qualitative in nature rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors, such as sources of information on swimming performance and analysis, and control over service provision, are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies.
ERIC Educational Resources Information Center
Zheng, Lanqin
2016-01-01
This meta-analysis examined research on the effects of self-regulated learning scaffolds on academic performance in computer-based learning environments from 2004 to 2015. A total of 29 articles met inclusion criteria and were included in the final analysis with a total sample size of 2,648 students. Moderator analyses were performed using a…
Analysis of a Rocket Based Combined Cycle Engine during Rocket Only Operation
NASA Technical Reports Server (NTRS)
Smith, T. D.; Steffen, C. J., Jr.; Yungster, S.; Keller, D. J.
1998-01-01
The all rocket mode of operation is a critical factor in the overall performance of a rocket based combined cycle (RBCC) vehicle. However, outside of performing experiments or a full three dimensional analysis, there are no first order parametric models to estimate performance. As a result, an axisymmetric RBCC engine was used to analytically determine specific impulse efficiency values based upon both full flow and gas generator configurations. Design of experiments methodology was used to construct a test matrix and statistical regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, percent of injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inlet diameter ratio. A perfect gas computational fluid dynamics analysis was performed to obtain values of vacuum specific impulse. Statistical regression analysis was performed based on both full flow and gas generator engine cycles. Results were also found to depend on the overall cycle assumptions. The statistical regression analysis determined that there were five significant linear effects, six interactions, and one second-order effect. Two parametric models were created to provide performance assessments of an RBCC engine in the all rocket mode of operation.
Evaluation Process through Capabilities-Based Analysis
Lednicky, Eric J.
Naval Postgraduate School, Monterey, CA
2011-09-01
[Report documentation page residue; the recoverable table-of-contents entries cover measures of effectiveness and measures of performance.]
Peterson, Leif E
2002-01-01
CLUSFAVOR (CLUSter and Factor Analysis with Varimax Orthogonal Rotation) 5.0 is a Windows-based computer program for hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles. CLUSFAVOR 5.0 standardizes input data; sorts data according to gene-specific coefficient of variation, standard deviation, average and total expression, and Shannon entropy; performs hierarchical cluster analysis using nearest-neighbor, unweighted pair-group method using arithmetic averages (UPGMA), or furthest-neighbor joining methods, and Euclidean, correlation, or jack-knife distances; and performs principal-component analysis. PMID:12184816
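The sketch below reproduces the two core CLUSFAVOR steps, standardization followed by UPGMA clustering and principal-component analysis, using SciPy/NumPy on synthetic data; it is an illustration of the methods named in the abstract, not the CLUSFAVOR code itself.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import pdist

# expr: genes x arrays matrix of log-expression values (synthetic here)
rng = np.random.default_rng(0)
expr = rng.normal(size=(50, 8))

# Standardize each gene profile, as CLUSFAVOR standardizes its input
z = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(axis=1, keepdims=True)

# UPGMA (average-linkage) hierarchical clustering on correlation distance
tree = linkage(pdist(z, metric="correlation"), method="average")

# Principal components via SVD of the standardized matrix
u, s, vt = np.linalg.svd(z, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(tree.shape, "variance explained by PC1-3:", explained[:3])
```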
Li, Zhiming; Yu, Lan; Wang, Xin; Yu, Haiyang; Gao, Yuanxiang; Ren, Yande; Wang, Gang; Zhou, Xiaoming
2017-11-09
The purpose of this study was to investigate the diagnostic performance of mammographic texture analysis in the differential diagnosis of benign and malignant breast tumors. Digital mammography images were obtained from the Picture Archiving and Communication System at our institute. Texture features of mammographic images were calculated. The Mann-Whitney U test was used to identify differences between the benign and malignant groups. Receiver operating characteristic (ROC) curve analysis was used to assess the diagnostic performance of the texture features. Significant differences in texture features of the histogram, gray-level co-occurrence matrix (GLCM) and run length matrix (RLM) were found between the benign and malignant breast groups (P < .05). The areas under the ROC curve (AUROC) for histogram, GLCM, and RLM features were 0.800, 0.787, and 0.761, with no differences between them (P > .05). The AUROCs of imaging-based diagnosis, texture analysis, and imaging-based diagnosis combined with texture analysis were 0.873, 0.863, and 0.961, respectively. When imaging-based diagnosis was combined with texture analysis, the AUROC was higher than that of imaging-based diagnosis or texture analysis alone (P < .05). Mammographic texture analysis is a reliable technique for the differential diagnosis of benign and malignant breast tumors. Furthermore, the combination of imaging-based diagnosis and texture analysis can significantly improve diagnostic performance. Copyright © 2017 Elsevier Inc. All rights reserved.
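A minimal sketch of this kind of pipeline, GLCM texture features scored by ROC analysis, is shown below using scikit-image and scikit-learn on synthetic patches; the feature set and data are placeholders, not the study's software.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.metrics import roc_auc_score

def glcm_features(roi: np.ndarray) -> list:
    """Contrast/homogeneity/energy from a gray-level co-occurrence
    matrix, one common GLCM feature set in texture analysis."""
    glcm = graycomatrix(roi, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return [graycoprops(glcm, p)[0, 0] for p in
            ("contrast", "homogeneity", "energy")]

# Synthetic stand-ins for benign/malignant ROIs (8-bit mammogram patches)
rng = np.random.default_rng(1)
rois = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(40)]
labels = np.array([0] * 20 + [1] * 20)
contrast = np.array([glcm_features(r)[0] for r in rois])
print("AUROC of the contrast feature:", roc_auc_score(labels, contrast))
```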
NASA Technical Reports Server (NTRS)
1972-01-01
An economic analysis of space tug operations is presented. The subjects discussed are: (1) data base for orbit injection stages, (2) data base for reusable space tug, (3) performance equations, (4) data integration and interpretation, (5) tug performance and mission model accomodation, (6) total program cost, (7) payload analysis, (8) computer software, and (9) comparison of tug concepts.
Role of IAC in large space systems thermal analysis
NASA Technical Reports Server (NTRS)
Jones, G. K.; Skladany, J. T.; Young, J. P.
1982-01-01
Computer analysis programs to evaluate critical coupling effects that can significantly influence spacecraft system performance are described. These coupling effects arise from the varied parameters of the spacecraft systems, environments, and forcing functions associated with disciplines such as thermal, structures, and controls. Adverse effects can be expected to significantly impact system design aspects such as structural integrity, controllability, and mission performance. One needed design analysis capability is a software system that can integrate individual discipline computer codes into a highly user-oriented, interactive-graphics-based analysis capability. The Integrated Analysis Capability (IAC) system can be viewed both as a core framework that serves as an integrating base to which users can readily add desired analysis modules, and as a self-contained interdisciplinary system analysis capability with a specific set of fully integrated multidisciplinary analysis programs covering the coupling of the thermal, structures, controls, antenna radiation performance, and instrument optical performance disciplines.
NASA Astrophysics Data System (ADS)
Zafar, I.; Edirisinghe, E. A.; Acar, S.; Bez, H. E.
2007-02-01
Automatic vehicle Make and Model Recognition (MMR) systems provide useful performance enhancements to vehicle recognition systems that are based solely on Automatic License Plate Recognition (ALPR). Several car MMR systems have been proposed in the literature; however, these approaches are based on feature detection algorithms that can perform sub-optimally under adverse lighting and/or occlusion conditions. In this paper we propose a real-time, appearance-based car MMR approach using Two-Dimensional Linear Discriminant Analysis (2D-LDA) that is capable of addressing this limitation. We provide experimental results to analyse the proposed algorithm's robustness under varying illumination and occlusion conditions. We show that the best performance with the proposed 2D-LDA based car MMR approach is obtained when the eigenvectors of lower significance are ignored. For the given database of 200 car images of 25 different make-model classifications, a best accuracy of 91% was obtained with the 2D-LDA approach. We use a direct Principal Component Analysis (PCA) based approach as a benchmark to compare and contrast the performance of the proposed 2D-LDA approach to car MMR. We conclude that in general the 2D-LDA based algorithm surpasses the performance of the PCA based approach.
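A toy version of the 2D-LDA projection step is sketched below: scatter matrices are built directly from image matrices (no vectorization), and the eigenvectors of lower significance are the ones dropped, as the abstract describes. The formulation details are assumptions and not necessarily the authors' exact variant.

```python
import numpy as np

def two_d_lda(images, labels, n_components=5):
    """Toy 2D-LDA: build column-wise between/within scatter matrices
    from image matrices and keep the leading generalized eigenvectors
    as a projection W; features are then Y = A @ W."""
    images = np.asarray(images, dtype=float)
    labels = np.asarray(labels)
    global_mean = images.mean(axis=0)
    w = global_mean.shape[1]
    S_b = np.zeros((w, w))
    S_w = np.zeros((w, w))
    for c in np.unique(labels):
        A_c = images[labels == c]
        M_c = A_c.mean(axis=0)
        d = M_c - global_mean
        S_b += len(A_c) * d.T @ d
        for A in A_c:
            e = A - M_c
            S_w += e.T @ e
    # Generalized eigenproblem S_w^{-1} S_b; small-eigenvalue vectors
    # (lower significance) are the ones discarded.
    evals, evecs = np.linalg.eig(np.linalg.solve(S_w, S_b))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:n_components]]

rng = np.random.default_rng(2)
imgs = rng.normal(size=(20, 32, 24))      # 20 synthetic 32x24 images
labs = np.repeat(np.arange(4), 5)         # 4 hypothetical make-model classes
W = two_d_lda(imgs, labs)
features = imgs[0] @ W                    # projected 32x5 feature matrix
```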
NASA Technical Reports Server (NTRS)
DeBonis, J. R.; Trefny, C. J.; Steffen, C. J., Jr.
1999-01-01
Design and analysis of the inlet for a rocket based combined cycle engine is discussed. Computational fluid dynamics was used in both the design and the subsequent analysis. Reynolds-averaged Navier-Stokes simulations were performed using both perfect gas and real gas assumptions. An inlet design that operates over the required Mach number range from 0 to 12 was produced. Performance data for cycle analysis were post-processed using a stream thrust averaging technique. A detailed performance database for cycle analysis is presented. The effect of vehicle forebody compression on air capture is also examined.
Ishikawa, Sohta A; Inagaki, Yuji; Hashimoto, Tetsuo
2012-01-01
In phylogenetic analyses of nucleotide sequences, 'homogeneous' substitution models, which assume the stationarity of base composition across a tree, are widely used, although individual sequences may bear distinctive base frequencies. In the worst-case scenario, a homogeneous model-based analysis can yield an artifactual union of two distantly related sequences that achieved similar base frequencies in parallel. Such potential difficulty can be countered by two approaches, 'RY-coding' and 'non-homogeneous' models. The former approach converts the four bases into purine and pyrimidine to normalize base frequencies across a tree, while the heterogeneity in base frequency is explicitly incorporated in the latter approach. The two approaches have been applied to real-world sequence data; however, their basic properties have not been fully examined by simulation studies. Here, we assessed the performance of maximum-likelihood analyses incorporating RY-coding and a non-homogeneous model (RY-coding and non-homogeneous analyses) on simulated data with parallel convergence to similar base composition. Both RY-coding and non-homogeneous analyses showed superior performance compared with homogeneous model-based analyses. Curiously, the performance of the RY-coding analysis appeared to be significantly affected by the setting of the substitution process used for sequence simulation, relative to that of the non-homogeneous analysis. The performance of a non-homogeneous analysis was also validated by analyzing a real-world sequence data set with significant base heterogeneity.
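RY-coding itself is a one-line transformation; a minimal sketch (ignoring ambiguity codes and gaps) follows.

```python
def ry_code(seq: str) -> str:
    """RY-coding: collapse A/G to purine (R) and C/T to pyrimidine (Y),
    normalizing base composition across sequences before tree inference."""
    table = str.maketrans({"A": "R", "G": "R", "C": "Y", "T": "Y"})
    return seq.upper().translate(table)

print(ry_code("ATGCGTAC"))  # -> RYRYRYRY
```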
NASA Technical Reports Server (NTRS)
Smith, Timothy D.; Steffen, Christopher J., Jr.; Yungster, Shaye; Keller, Dennis J.
1998-01-01
The all rocket mode of operation is shown to be a critical factor in the overall performance of a rocket based combined cycle (RBCC) vehicle. An axisymmetric RBCC engine was used to determine specific impulse efficiency values based upon both full flow and gas generator configurations. Design of experiments methodology was used to construct a test matrix and multiple linear regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inlet diameter ratio. A perfect gas computational fluid dynamics analysis, using both the Spalart-Allmaras and k-omega turbulence models, was performed with the NPARC code to obtain values of vacuum specific impulse. Results from the multiple linear regression analysis showed that for both the full flow and gas generator configurations increasing mixer-ejector area ratio and rocket area ratio increase performance, while increasing mixer-ejector inlet area ratio and mixer-ejector length-to-diameter ratio decrease performance. Increasing injected secondary flow increased performance for the gas generator analysis, but was not statistically significant for the full flow analysis. Chamber pressure was found to be not statistically significant.
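A minimal sketch of the regression step, fitting a linear-plus-interaction parametric model of vacuum specific impulse to the six design parameters by least squares, is given below; the DOE matrix and Isp values are synthetic placeholders, not the study's CFD data.

```python
import numpy as np

# Columns: chamber pressure, rocket area ratio, secondary flow fraction,
# mixer-ejector inlet area, area ratio, length-to-inlet-diameter ratio.
rng = np.random.default_rng(3)
X = rng.uniform(0.0, 1.0, size=(32, 6))   # normalized DOE test matrix
isp = rng.normal(340.0, 15.0, size=32)    # stand-in CFD vacuum Isp values

# Design matrix with intercept, linear terms, and two-way interactions
cols = [np.ones(len(X))] + [X[:, i] for i in range(6)]
cols += [X[:, i] * X[:, j] for i in range(6) for j in range(i + 1, 6)]
A = np.column_stack(cols)

beta, *_ = np.linalg.lstsq(A, isp, rcond=None)  # regression coefficients
print("intercept and linear effects:", beta[:7])
```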
Cooperation, Technology, and Performance: A Case Study.
ERIC Educational Resources Information Center
Cavanagh, Thomas; Dickenson, Sabrina; Brandt, Suzanne
1999-01-01
Describes the CTP (Cooperation, Technology, and Performance) model and explains how it is used by the Department of Veterans Affairs-Veteran's Benefit Administration (VBA) for training. Discusses task analysis; computer-based training; cooperative-based learning environments; technology-based learning; performance-assessment methods; courseware…
Performance analysis and dynamic modeling of a single-spool turbojet engine
NASA Astrophysics Data System (ADS)
Andrei, Irina-Carmen; Toader, Adrian; Stroe, Gabriela; Frunzulica, Florin
2017-01-01
The purposes of modeling and simulation of a turbojet engine are steady-state analysis and transient analysis. The steady-state analysis consists of investigating the operating (equilibrium) regimes and is based on appropriate modeling of turbojet operation at design and off-design regimes; from it results the performance analysis, summarized in the engine's operational maps (i.e., the altitude map, velocity map and speed map) and the engine's universal map. The mathematical model that allows calculation of the design and off-design performance of a single-spool turbojet is detailed. An in-house code was developed and calibrated using the J85 turbojet engine as the test case. The dynamic model of the turbojet engine is obtained from the energy balance equations for the compressor, combustor and turbine, the engine's main parts. The transient analysis, which is based on appropriate modeling of the engine and its main parts, expresses the dynamic behavior of the turbojet engine and, further, provides details regarding engine control. The aim of the dynamic analysis is to determine a control program for the turbojet, based on the results provided by the performance analysis. In the case of a single-spool turbojet engine with fixed nozzle geometry, thrust is controlled by one parameter, the fuel flow rate. The design and management of aircraft engine controls are based on the results of the transient analysis. The construction of the design model is complex, since it is based on both steady-state and transient analysis, further allowing flight path cycle analysis and optimization. This paper presents numerical simulations for a single-spool turbojet engine (J85 as the test case), with appropriate modeling for steady-state and dynamic analysis.
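As a toy illustration of the transient model's core, the sketch below integrates the rotor energy balance I*omega*(d omega/dt) = P_turbine - P_compressor with forward Euler; both power laws and all constants are placeholders, not the paper's J85 model.

```python
import numpy as np

I = 0.05  # spool moment of inertia, kg*m^2 (illustrative)

def turbine_power(omega, wf):
    return 3.0e3 * wf * omega / 10_000.0   # placeholder power law

def compressor_power(omega):
    return 1.2e-5 * omega**2.5             # placeholder power law

def spool_transient(omega0, wf, dt=0.01, t_end=200.0):
    """Integrate I*omega*domega/dt = P_t - P_c with forward Euler."""
    omega = omega0
    for _ in np.arange(0.0, t_end, dt):
        domega = (turbine_power(omega, wf) - compressor_power(omega)) / (I * omega)
        omega += dt * domega
    return omega

# Spool speed approaching equilibrium after a fuel-flow step
print("spool speed (rad/s):", spool_transient(omega0=800.0, wf=0.05))
```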
NASA Technical Reports Server (NTRS)
Ruf, Joseph; Holt, James B.; Canabal, Francisco
1999-01-01
This paper presents the status of analyses on three Rocket Based Combined Cycle configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes code for ejector mode fluid dynamics. The Draco engine analysis is a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Rice, Mark J.
Contingency analysis studies are necessary to assess the impact of possible power system component failures. The results of contingency analysis are used to ensure grid reliability, and in power market operation for the feasibility testing of market solutions. Currently, these studies are performed in real time based on the current operating conditions of the grid with a pre-selected contingency list, which might result in overlooking some critical contingencies caused by variable system status. To have a complete picture of a power grid, more contingencies need to be studied to improve grid reliability. High-performance computing techniques hold the promise of being able to perform the analysis for more contingency cases within a much shorter time frame. This paper evaluates the performance of counter-based dynamic load balancing schemes for a massive contingency analysis program on 10,000+ cores. One million N-2 contingency analysis cases with a Western Electricity Coordinating Council power grid model have been used to demonstrate the performance. Speedups of 3964 with 4096 cores and 7877 with 10240 cores are obtained. This paper reports the performance of the load balancing scheme with a single counter and with two counters, describes disk I/O issues, and discusses other potential techniques for further improving the performance.
Evaluating Web-Based Nursing Education's Effects: A Systematic Review and Meta-Analysis.
Kang, Jiwon; Seomun, GyeongAe
2017-09-01
This systematic review and meta-analysis investigated whether using web-based nursing educational programs increases a participant's knowledge and clinical performance. We performed a meta-analysis of studies published between January 2000 and July 2016 and identified through RISS, CINAHL, ProQuest Central, Embase, the Cochrane Library, and PubMed. Eleven studies were eligible for inclusion in this analysis. The results of the meta-analysis demonstrated significant differences not only for the overall effect but also specifically for blended programs and short (2 weeks or 4 weeks) intervention periods. To present more evidence supporting the effectiveness of web-based nursing educational programs, further research is warranted.
Development of the Performance Confirmation Program at Yucca Mountain, Nevada
LeCain, G.D.; Barr, D.; Weaver, D.; Snell, R.; Goodin, S.W.; Hansen, F.D.
2006-01-01
The Yucca Mountain Performance Confirmation program consists of tests, monitoring activities, experiments, and analyses to evaluate the adequacy of assumptions, data, and analyses that form the basis of the conceptual and numerical models of flow and transport associated with a proposed radioactive waste repository at Yucca Mountain, Nevada. The Performance Confirmation program uses an eight-stage risk-informed, performance-based approach. Selection of the Performance Confirmation activities for inclusion in the Performance Confirmation program was done using a risk-informed, performance-based decision analysis. The result of this analysis was a Performance Confirmation base portfolio that consists of 20 activities. The 20 Performance Confirmation activities include geologic, hydrologic, and construction/engineering testing. Some of the activities began during site characterization, and others will begin during construction or post-emplacement, continuing until repository closure.
Probabilistic performance-based design for high performance control systems
NASA Astrophysics Data System (ADS)
Micheli, Laura; Cao, Liang; Gong, Yongqiang; Cancelli, Alessandro; Laflamme, Simon; Alipour, Alice
2017-04-01
High performance control systems (HPCS) are advanced damping systems capable of high damping performance over a wide frequency bandwidth, ideal for the mitigation of multi-hazards. They include active, semi-active, and hybrid damping systems. However, HPCS are more expensive than typical passive mitigation systems, rely on power and hardware (e.g., sensors, actuators) to operate, and require maintenance. In this paper, a life cycle cost analysis (LCA) approach is proposed to estimate the economic benefit of these systems over the entire life of the structure. The novelty resides in embedding the life cycle cost analysis in a performance-based design (PBD) procedure tailored to multi-level wind hazards. This yields a probabilistic performance-based design approach for HPCS. Numerical simulations are conducted on a building located in Boston, MA. LCAs are conducted for passive control systems and HPCS, and the concept of controller robustness is demonstrated. Results highlight the promise of the proposed performance-based design procedure.
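The sketch below illustrates the life cycle cost comparison in miniature: sample an annual wind hazard, push it through a toy fragility to get losses, and add purchase and upkeep costs; every number and the fragility form are hypothetical, not the paper's Boston case study.

```python
import numpy as np

rng = np.random.default_rng(4)

def life_cycle_cost(initial, annual_upkeep, frag_scale, years=50, n=100_000):
    """Monte Carlo life cycle cost: sample annual peak wind hazard,
    convert it to loss through a toy fragility, add upkeep and purchase."""
    # Annual peak wind speed (m/s), a heavy-tailed toy hazard model
    wind = rng.gumbel(loc=30.0, scale=6.0, size=(n, years))
    loss = frag_scale * np.maximum(wind - 40.0, 0.0) ** 2  # damage cost
    return initial + annual_upkeep * years + loss.sum(axis=1).mean()

# A better (cheaper) fragility is assumed for the HPCS-equipped building
passive = life_cycle_cost(initial=1.0e6, annual_upkeep=5.0e3, frag_scale=4e3)
hpcs = life_cycle_cost(initial=2.5e6, annual_upkeep=2.0e4, frag_scale=1e3)
print(f"passive: {passive:.3e}  HPCS: {hpcs:.3e}")
```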
Digital microarray analysis for digital artifact genomics
NASA Astrophysics Data System (ADS)
Jaenisch, Holger; Handley, James; Williams, Deborah
2013-06-01
We implement a Spatial Voting (SV) based analogy of microarray analysis for digital gene marker identification in malware code sections. We examine a famous set of malware formally analyzed by Mandiant and code-named Advanced Persistent Threat 1 (APT1). APT1 is a Chinese organization formed with the specific intent to infiltrate and exploit US resources. Mandiant provided a detailed behavior and string analysis report for the 288 malware samples available. We performed an independent analysis using a new alternative to traditional dynamic analysis and static analysis that we call Spatial Analysis (SA). We perform unsupervised SA on the APT1-originating malware code sections and report our findings. We also show the results of SA performed on some members of the families associated by Mandiant. We conclude that SV based SA is a practical, fast alternative to dynamic analysis and static analysis.
NASA Technical Reports Server (NTRS)
Ruf, Joseph H.; Holt, James B.; Canabal, Francisco
2001-01-01
This paper presents the status of analyses on three Rocket Based Combined Cycle (RBCC) configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics (CFD) analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes (FDNS) code for ejector mode fluid dynamics. The Draco analysis was a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.
Metrological analysis of a virtual flowmeter-based transducer for cryogenic helium
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arpaia, P., E-mail: pasquale.arpaia@unina.it; Technology Department, European Organization for Nuclear Research; Girone, M., E-mail: mario.girone@cern.ch
2015-12-15
The metrological performance of a virtual flowmeter-based transducer for monitoring helium under cryogenic conditions is assessed. To this aim, an uncertainty model of the transducer is presented, based mainly on a valve model exploiting a finite-element approach and a virtual flowmeter model based on the Sereg-Schlumberger method. The models are validated experimentally on a case study for helium monitoring in cryogenic systems at the European Organization for Nuclear Research (CERN). The impact of uncertainty sources on the transducer's metrological performance is assessed by a sensitivity analysis based on statistical experiment design and analysis of variance. In this way, the uncertainty sources most influencing the metrological performance of the transducer are singled out over the input range as a whole, at varying operating and setting conditions. This analysis turns out to be important for CERN cryogenics operation because the metrological design of the transducer is validated, and its components and working conditions with critical specifications for future improvements are identified.
CASAS: Cancer Survival Analysis Suite, a web based application
Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne
2017-01-01
We present CASAS, a shiny R based tool for interactive survival analysis and visualization of results. The tool provides a web-based one stop shop to perform the following types of survival analysis: quantile, landmark and competing risks, in addition to standard survival analysis. The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots. Univariate analysis can be performed on one or several user specified variable(s) simultaneously, the results of which are displayed in a single table that includes log rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/. PMID:28928946
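CASAS itself is a Shiny/R application; the sketch below shows the equivalent standard step, a Kaplan-Meier fit and log-rank test, in Python using the lifelines package on synthetic data.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(5)
# Synthetic survival times (months) and event flags for two groups
t_a, t_b = rng.exponential(30, 80), rng.exponential(45, 80)
e_a, e_b = rng.integers(0, 2, 80), rng.integers(0, 2, 80)

kmf = KaplanMeierFitter()
kmf.fit(t_a, event_observed=e_a, label="group A")
print("median survival, group A:", kmf.median_survival_time_)

res = logrank_test(t_a, t_b, event_observed_A=e_a, event_observed_B=e_b)
print("log-rank p-value:", res.p_value)
```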
A Cost-Based Analysis on Using DoD Civilian Workforce to Perform Ordnance Support in Pearl Harbor
This study examines whether using the government civilian workforce to perform ordnance handling generates cost savings when compared with contracting... Using a cost-based analysis, this study reviews all the associated costs of converting to a government civilian workforce and compares them with the cost
Design and performance analysis of gas and liquid radial turbines
NASA Astrophysics Data System (ADS)
Tan, Xu
In the first part of the research, pumps running in reverse as turbines are studied. This work uses experimental data from a wide range of pumps representing centrifugal pump configurations in terms of specific speed. Based on specific speed and specific diameter, an accurate correlation is developed to predict performance at the best efficiency point of a centrifugal pump in its turbine-mode operation. The proposed prediction method yields very good results compared with previous attempts. The present method is compared with nine previous methods found in the literature, and the comparison shows that the method proposed in this paper is the most accurate. The proposed method can be further complemented and supplemented by future tests to increase its accuracy, and it is meaningful because it is based on both specific speed and specific diameter. The second part of the research is focused on the design and analysis of a radial gas turbine. The specification of the turbine is obtained from a solar biogas hybrid system; the system is theoretically analyzed and constructed based on the purchased compressor. Theoretical analysis results in a specification of 100 lb/min flow, 900 °C inlet total temperature and 1.575 atm inlet total pressure. 1-D and 3-D geometries of the rotor are generated based on Aungier's method. 1-D loss model analysis and 3-D CFD simulations are performed to examine the performance of the rotor. The total-to-total efficiency of the rotor is more than 90%. With the help of CFD analysis, modifications to the preliminary design yielded optimized aerodynamic performance. Finally, a theoretical performance analysis of the hybrid system is performed with the designed turbine.
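The sketch below shows the shape of such a correlation: compute dimensionless specific speed and specific diameter at the pump's best efficiency point, then feed them to a prediction function. The correlation coefficients here are placeholders, not the values developed in the work.

```python
import math

def specific_speed(n_rpm: float, q_m3s: float, head_m: float) -> float:
    """Dimensionless specific speed ns = omega*sqrt(Q)/(g*H)^0.75."""
    omega = 2.0 * math.pi * n_rpm / 60.0
    return omega * math.sqrt(q_m3s) / (9.81 * head_m) ** 0.75

def specific_diameter(d_m: float, q_m3s: float, head_m: float) -> float:
    """Dimensionless specific diameter ds = D*(g*H)^0.25/sqrt(Q)."""
    return d_m * (9.81 * head_m) ** 0.25 / math.sqrt(q_m3s)

def head_ratio_turbine(ns: float, ds: float) -> float:
    """Hypothetical correlation for turbine-mode BEP head ratio."""
    return 1.2 + 0.5 / ns + 0.05 * ds   # placeholder coefficients

ns = specific_speed(1450.0, 0.08, 22.0)
ds = specific_diameter(0.25, 0.08, 22.0)
print(ns, ds, head_ratio_turbine(ns, ds))
```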
Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.
2016-01-01
A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
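The advantage of analytic derivatives can be seen in a toy numpy comparison (below); this is not PyCycle/OpenMDAO code, just an illustration that finite-difference gradients cost one model evaluation per input and carry step-size-dependent error, while analytic gradients are exact.

```python
import numpy as np

def f(x):
    """Toy stand-in for a cycle model output."""
    return np.sum(x**2) + np.sin(5.0 * x).sum()

def grad_analytic(x):
    """Exact (analytic) derivative of f."""
    return 2.0 * x + 5.0 * np.cos(5.0 * x)

def grad_fd(x, h=1e-6):
    """Forward finite differences: one extra model run per input."""
    g = np.zeros_like(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h
        g[i] = (f(xp) - f(x)) / h
    return g

x = np.linspace(-1.0, 1.0, 8)
err = np.abs(grad_fd(x) - grad_analytic(x)).max()
print("worst finite-difference error:", err)  # step-size dependent
```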
Work domain constraints for modelling surgical performance.
Morineau, Thierry; Riffaud, Laurent; Morandi, Xavier; Villain, Jonathan; Jannin, Pierre
2015-10-01
Three main approaches can be identified for modelling surgical performance: a competency-based approach and a task-based approach, both largely explored in the literature, and a less well-known work domain-based approach. The work domain-based approach first describes the work domain properties that constrain the agent's actions and shape the performance. This paper presents a work domain-based approach for modelling performance during cervical spine surgery, based on the idea that anatomical structures delineate the surgical performance. This model was evaluated through an analysis of junior and senior surgeons' actions. Twenty-four cervical spine surgeries performed by two junior and two senior surgeons were recorded in real time by an expert surgeon. Against a work domain-based model describing an optimal progression through anatomical structures, the degree of fit of each surgical procedure to a statistical polynomial function was assessed. Each surgical procedure fit the model significantly, with regression coefficient values around 0.9. However, the surgeries performed by senior surgeons fitted this model significantly better than those performed by junior surgeons. Analysis of the relative frequencies of actions on anatomical structures showed that some specific anatomical structures discriminate senior from junior performances. The work domain-based modelling approach can provide an overall statistical indicator of surgical performance, but in particular it can highlight specific points of interest among the anatomical structures that surgeons dwelled on according to their level of expertise.
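A minimal sketch of the assessment step follows, assuming actions are encoded as ranks of anatomical structures along the optimal progression (a hypothetical encoding); it fits a polynomial and reports the coefficient of determination.

```python
import numpy as np

# Observed progression: rank of the anatomical structure touched at each
# action step of one recorded procedure (hypothetical encoding)
steps = np.arange(30)
observed = np.array([0, 0, 1, 1, 2, 2, 2, 3, 3, 4, 4, 5, 5, 5, 6, 6, 7, 7,
                     8, 8, 8, 9, 9, 10, 10, 11, 11, 12, 12, 13], dtype=float)

coeffs = np.polyfit(steps, observed, deg=3)   # polynomial progression model
fitted = np.polyval(coeffs, steps)
ss_res = np.sum((observed - fitted) ** 2)
ss_tot = np.sum((observed - observed.mean()) ** 2)
print("R^2:", 1.0 - ss_res / ss_tot)          # ~0.9 indicates suitability
```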
Performance Based Logistics... What’s Stopping Us
2016-03-01
performance-based life cycle product support, where outcomes are acquired through performance-based arrangements that deliver Warfighter requirements and... correlates to the acquisition life cycle framework: spend the time and effort to identify and lock in the PBL requirements; conduct an analysis to... (PDASD[L&MR]) on PBL strategies. The study, Project Proof Point: A Study to Determine the Impact of Performance Based Logistics (PBL) on Life Cycle...
Performer-centric Interface Design.
ERIC Educational Resources Information Center
McGraw, Karen L.
1995-01-01
Describes performer-centric interface design and explains a model-based approach for conducting performer-centric analysis and design. Highlights include design methodology, including cognitive task analysis; creating task scenarios; creating the presentation model; creating storyboards; proof of concept screens; object models and icons;…
Greensmith, David J
2014-01-01
Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs equally well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
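The two processes map onto a few lines of code; the sketch below applies the standard single-wavelength calibration [Ca] = Kd*(F - Fmin)/(Fmax - F) and then estimates a rate of change by linear regression over a window, on a synthetic transient. This is an illustration of the calculations involved, not the Excel program itself, and the Kd and window are assumptions.

```python
import numpy as np

def f_to_ca(f, f_min, f_max, kd_nM=400.0):
    """Single-wavelength calibration: [Ca] = Kd*(F - Fmin)/(Fmax - F)."""
    return kd_nM * (f - f_min) / (f_max - f)

t = np.linspace(0.0, 1.0, 500)                     # time, s
f = 1.0 + 0.8 * np.exp(-((t - 0.2) / 0.15) ** 2)   # synthetic transient
ca = f_to_ca(f, f_min=0.9, f_max=3.0)

# Rate of Ca change over the rising phase by simple linear regression
rise = (t > 0.05) & (t < 0.2)
slope, intercept = np.polyfit(t[rise], ca[rise], 1)
print("rise rate (nM/s):", slope)
```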
SEP thrust subsystem performance sensitivity analysis
NASA Technical Reports Server (NTRS)
Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.
1973-01-01
This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of SEP thrust system performance for an Encke rendezvous mission. A detailed description of the effects of thrust subsystem hardware tolerances on mission performance is included, together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and the graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.
Dispersion analysis for baseline reference mission 2
NASA Technical Reports Server (NTRS)
Snow, L. S.
1975-01-01
A dispersion analysis considering uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for baseline reference mission (BRM) 2. The dispersion analysis is based on the nominal trajectory for BRM 2. The analysis was performed to determine state vector and performance dispersions (or variations) which result from the indicated uncertainties. The dispersions are determined at major mission events and fixed times from liftoff (time slices). The dispersion results will be used to evaluate the capability of the vehicle to perform the mission within a specified level of confidence and to determine flight performance reserves.
Low-Latency Embedded Vision Processor (LLEVS)
2016-03-01
[Table-of-contents residue: 3.2.3 Task 3, Projected Performance Analysis of FPGA-based Vision Processor; 3.2.3.1 Algorithms Latency Analysis; Field Programmable Gate Array Custom Hardware for Real-Time Multiresolution Analysis.] ...conduct data analysis for performance projections. The data acquired through measurements, simulation and estimation provide the requisite platform for...
Structural reliability analysis under evidence theory using the active learning kriging model
NASA Astrophysics Data System (ADS)
Yang, Xufeng; Liu, Yongshou; Ma, Panke
2017-11-01
Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
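A sketch of a sign-focused active learning loop of this kind (in the spirit of AK-MCS, with the classical U learning function) is given below, using scikit-learn's Gaussian process as the kriging surrogate; the performance function, sample population, and stopping threshold are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def g(x):
    """Toy performance function; failure corresponds to g < 0."""
    return x[:, 0] ** 3 + x[:, 1] + 6.0

rng = np.random.default_rng(6)
pool = rng.normal(size=(5000, 2))           # Monte Carlo population (toy)
idx = list(rng.choice(len(pool), 12, replace=False))  # initial design

for _ in range(40):                         # active learning iterations
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True)
    gp.fit(pool[idx], g(pool[idx]))
    mu, sd = gp.predict(pool, return_std=True)
    u = np.abs(mu) / np.maximum(sd, 1e-12)  # U function: sign uncertainty
    cand = int(np.argmin(u))
    if u[cand] > 2.0:                       # sign known w.h.p. everywhere
        break
    idx.append(cand)                        # evaluate g where sign is unsure

print("failure probability estimate:", np.mean(mu < 0.0))
```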
Funding Ohio Community Colleges: An Analysis of the Performance Funding Model
ERIC Educational Resources Information Center
Krueger, Cynthia A.
2013-01-01
This study examined Ohio's community college performance funding model that is based on seven student success metrics. A percentage of the regular state subsidy is withheld from institutions; funding is earned back based on the three-year average of success points achieved in comparison to other community colleges in the state. Analysis of…
ERIC Educational Resources Information Center
Park, Sanghoon
2017-01-01
This paper reports the findings of a comparative analysis of online learner behavioral interactions, time-on-task, attendance, and performance at different points throughout a semester (beginning, during, and end) based on two online courses: one course offering authentic discussion-based learning activities and the other course offering authentic…
ERIC Educational Resources Information Center
Muslihah, Oleh Eneng
2015-01-01
The research examines the correlation between the understanding of school-based management, emotional intelligence and headmaster performance. Data was collected using quantitative methods. The statistical analyses used were Pearson correlation and multivariate regression analysis. The results of this research suggest firstly that there is…
An Analysis of Performance-Based Funding Policies and Recommendations for the Florida College System
ERIC Educational Resources Information Center
Balog, Scott E.
2016-01-01
Nearly 30 states have adopted or are transitioning to performance-based funding programs for community colleges that allocate funding based on institutional performance according to defined metrics. While embraced by state lawmakers and promoted by outside advocacy groups as a method to improve student outcomes, enhance accountability and ensure…
NASA Astrophysics Data System (ADS)
Mercer, Gary J.
This quantitative study examined the relationship between secondary students with math anxiety and physics performance in an inquiry-based constructivist classroom. The Revised Math Anxiety Rating Scale was used to evaluate math anxiety levels. The results were then compared to the performance on a physics standardized final examination. A simple correlation was performed, followed by a multivariate regression analysis to examine effects based on gender and prior math background. The correlation showed statistical significance between math anxiety and physics performance. The regression analysis showed statistical significance for math anxiety, physics performance, and prior math background, but did not show statistical significance for math anxiety, physics performance, and gender.
NASA Astrophysics Data System (ADS)
Razali, Nur Fhathyhah; Mohd Suradi, Nur Riza; Ahmad Shahabuddin, Faridatul Azna; Ismail, Wan Rosmanira; Abidin, Norkisme Zainal; Ahmad, Nor Amalina; Mustafa, Zainol
2013-04-01
This study aims to identify the determinants of the technological innovation capability of Malaysian-owned companies in resource-based manufacturing, and to identify the relationship between technological innovation capability (TIC) and technological innovation performance (TIP) for resource-based manufacturing. Furthermore, this study aims to identify innovation capability factors that need more emphasis and improvement from the respective authorities. The scope of the study covers four industries, petrochemical, pharmaceutical, palm oil-based and food processing, located in the state of Selangor. Descriptive analysis, correlation analysis and performance capability analysis were used in this study. It was found that technological innovation capability (TIC) for companies in resource-based manufacturing is moderate. Factors such as policy capability, human resources capability and facilities capability have a positive relationship with technological innovation performance (TIP). These findings will help the government in making decisions and better implementing policies to strengthen the competitiveness of companies, particularly in resource-based manufacturing.
Qu, Yongzhi; He, David; Yoon, Jae; Van Hecke, Brandon; Bechhoefer, Eric; Zhu, Junda
2014-01-01
In recent years, acoustic emission (AE) sensors and AE-based techniques have been developed and tested for gearbox fault diagnosis. In general, AE-based techniques require much higher sampling rates than vibration analysis-based techniques for gearbox fault diagnosis. Therefore, it is questionable whether an AE-based technique would give a better or at least the same performance as the vibration analysis-based techniques using the same sampling rate. To answer the question, this paper presents a comparative study for gearbox tooth damage level diagnostics using AE and vibration measurements, the first known attempt to compare the gearbox fault diagnostic performance of AE- and vibration analysis-based approaches using the same sampling rate. Partial tooth cut faults are seeded in a gearbox test rig and experimentally tested in a laboratory. Results have shown that the AE-based approach has the potential to differentiate gear tooth damage levels in comparison with the vibration-based approach. While vibration signals are easily affected by mechanical resonance, the AE signals show more stable performance. PMID:24424467
Kandaswamy, Umasankar; Rotman, Ziv; Watt, Dana; Schillebeeckx, Ian; Cavalli, Valeria; Klyachko, Vitaly
2013-01-01
High-resolution live-cell imaging studies of neuronal structure and function are characterized by large variability in image acquisition conditions due to background and sample variations as well as low signal-to-noise ratio. The lack of automated image analysis tools that can be generalized for varying image acquisition conditions represents one of the main challenges in the field of biomedical image analysis. Specifically, segmentation of the axonal/dendritic arborizations in brightfield or fluorescence imaging studies is extremely labor-intensive and still performed mostly manually. Here we describe a fully automated machine-learning approach based on textural analysis algorithms for segmenting neuronal arborizations in high-resolution brightfield images of live cultured neurons. We compare the performance of our algorithm to manual segmentation and show that it combines 90% accuracy with similarly high levels of specificity and sensitivity. Moreover, the algorithm maintains high performance levels under a wide range of image acquisition conditions, indicating that it is largely condition-invariable. We further describe an application of this algorithm to fully automated synapse localization and classification in fluorescence imaging studies based on synaptic activity. The textural analysis-based machine-learning approach thus offers a high performance, condition-invariable tool for automated neurite segmentation. PMID:23261652
Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J
2014-02-01
Visual Basic simulation software (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly, menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results corroborate the experimental findings very well, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.
Fruehwald-Pallamar, J; Hesselink, J R; Mafee, M F; Holzer-Fruehwald, L; Czerny, C; Mayerhoefer, M E
2016-02-01
To evaluate whether texture-based analysis of standard MRI sequences can help in the discrimination between benign and malignant head and neck tumors. The MR images of 100 patients with a histologically clarified head or neck mass, from two different institutions, were analyzed. Texture-based analysis was performed using texture analysis software, with region-of-interest measurements for 2D and 3D evaluation independently for all axial sequences. COC, RUN, GRA, ARM, and WAV features were calculated for all ROIs. Ten texture feature subsets were used for a linear discriminant analysis, in combination with k-nearest-neighbor classification. Benign and malignant tumors were compared with regard to texture-based values. There were differences in the images from different field-strength scanners, as well as from different vendors. For the differentiation of benign and malignant tumors, we found differences on STIR and T2-weighted images for 2D evaluation, and on contrast-enhanced T1-TSE with fat saturation for 3D evaluation. In a separate analysis of the 1.5 and 3 Tesla subgroups, more discriminating features were found. Texture-based analysis is a useful tool in the discrimination of benign and malignant tumors when performed on one scanner with the same protocol. We cannot recommend this technique for use in multicenter studies with clinical data. 2D/3D texture-based analysis can be performed in head and neck tumors. Texture-based analysis can differentiate between benign and malignant masses. Analyzed MR images should originate from one scanner with an identical protocol. © Georg Thieme Verlag KG Stuttgart · New York.
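The classification step described, linear discriminant analysis on texture feature subsets followed by k-nearest-neighbor classification, can be sketched with scikit-learn as below; the features are synthetic stand-ins for the COC/RUN/GRA/ARM/WAV values.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
# Synthetic stand-ins for texture features of 50 benign + 50 malignant ROIs
X = np.vstack([rng.normal(0.0, 1.0, (50, 30)),
               rng.normal(0.7, 1.0, (50, 30))])
y = np.array([0] * 50 + [1] * 50)

# LDA projection followed by k-nearest-neighbor classification
clf = make_pipeline(LinearDiscriminantAnalysis(), KNeighborsClassifier(3))
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```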
Wang, Jing; Wu, Chen-Jiang; Bao, Mei-Ling; Zhang, Jing; Wang, Xiao-Ning; Zhang, Yu-Dong
2017-10-01
To investigate whether machine learning-based analysis of MR radiomics can help improve the performance of PI-RADS v2 in clinically relevant prostate cancer (PCa). This IRB-approved study included 54 patients with PCa undergoing multi-parametric (mp) MRI before prostatectomy. Imaging analysis was performed on 54 tumours, 47 normal peripheral zone (PZ) and 48 normal transitional zone (TZ) regions based on histological-radiological correlation. Mp-MRI was scored via PI-RADS, and quantified by measuring radiomic features. A predictive model was developed using a novel support vector machine trained with: (i) radiomics, (ii) PI-RADS scores, (iii) radiomics and PI-RADS scores. Paired comparison was made via ROC analysis. For PCa versus normal TZ, the model trained with radiomics had a significantly higher area under the ROC curve (Az) (0.955 [95% CI 0.923-0.976]) than PI-RADS (Az: 0.878 [0.834-0.914], p < 0.001). The difference in Az between them was insignificant for PCa versus PZ (0.972 [0.945-0.988] vs. 0.940 [0.905-0.965], p = 0.097). When radiomics was added, the performance of PI-RADS was significantly improved for PCa versus PZ (Az: 0.983 [0.960-0.995]) and PCa versus TZ (Az: 0.968 [0.940-0.985]). Machine learning analysis of MR radiomics can help improve the performance of PI-RADS in clinically relevant PCa. • Machine-based analysis of MR radiomics outperformed PI-RADS for TZ cancer. • Adding MR radiomics significantly improved the performance of PI-RADS. • DKI-derived Dapp and Kapp were two strong markers for the diagnosis of PCa.
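A minimal sketch of the comparison design, an SVM trained on radiomics alone versus radiomics plus the PI-RADS score, with cross-validated AUC, follows; all data are synthetic and the model settings are assumptions, not the study's classifier.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(8)
# Synthetic radiomic features: 60 normal-zone vs 60 tumour regions
radiomics = np.vstack([rng.normal(0.0, 1.0, (60, 20)),
                       rng.normal(0.6, 1.0, (60, 20))])
pirads = np.clip(np.round(rng.normal([2.5] * 60 + [3.8] * 60, 0.8)), 1, 5)
y = np.array([0] * 60 + [1] * 60)

def cv_auc(X):
    """Cross-validated AUC of a standardized SVM on feature matrix X."""
    clf = make_pipeline(StandardScaler(), SVC(probability=True))
    p = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
    return roc_auc_score(y, p)

print("radiomics alone:", cv_auc(radiomics))
print("radiomics + PI-RADS:", cv_auc(np.column_stack([radiomics, pirads])))
```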
Application of tissue mesodissection to molecular cancer diagnostics.
Krizman, David; Adey, Nils; Parry, Robert
2015-02-01
To demonstrate clinical application of a mesodissection platform that was developed to combine the advantages of laser-based instrumentation with the speed/ease of manual dissection for automated dissection of tissue off standard glass slides. Genomic analysis for KRAS gene mutation was performed on formalin-fixed paraffin-embedded (FFPE) cancer patient tissue that was dissected using the mesodissection platform. Selected reaction monitoring proteomic analysis for quantitative Her2 protein expression was performed on FFPE patient tumour tissue dissected by a laser-based instrument and by the MilliSect instrument. Genomic analysis demonstrates highly confident detection of KRAS mutation specifically in lung cancer cells and not in the surrounding benign, non-tumour tissue. Proteomic analysis demonstrates Her2 quantitative protein expression in breast cancer cells dissected manually, by laser-based instrumentation and by MilliSect instrumentation (mesodissection). Slide-mounted tissue dissection is commonly performed using laser-based instruments or by manually scraping tissue with a scalpel. Here we demonstrate that the mesodissection platform, as performed by the MilliSect instrument for tissue dissection, is cost-effective; it functions comparably to laser-based dissection and can be adopted into a clinical diagnostic workflow. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
ERIC Educational Resources Information Center
Joyce, Bruce R.
This analysis, a review of literature and experience on performance-based education, is divided into nine chapters: chapter one, "The Short Form: Best-Guess Working Hypotheses for Competency-Based Education" (a summary of the frame of reference for this study and the recommendations for performance-based education which resulted from the effort);…
Jin, H; Yuan, L; Li, C; Kan, Y; Hao, R; Yang, J
2014-03-01
The purpose of this study was to systematically review and perform a meta-analysis of published data regarding the diagnostic performance of positron emission tomography (PET) or PET/computed tomography (PET/CT) in prosthetic infection after arthroplasty. A comprehensive computer literature search of studies published through May 31, 2012 regarding PET or PET/CT in patients with suspected prosthetic infection was performed in the PubMed/MEDLINE, Embase and Scopus databases. Pooled sensitivity and specificity of PET or PET/CT in patients with suspected prosthetic infection were calculated on a per-prosthesis basis. The area under the receiver-operating characteristic (ROC) curve was calculated to measure the accuracy of PET or PET/CT in patients with suspected prosthetic infection. Fourteen studies comprising 838 prostheses with suspected prosthetic infection after arthroplasty were included in this meta-analysis. The pooled sensitivity of PET or PET/CT in detecting prosthetic infection was 86% (95% confidence interval [CI] 82-90%) on a per-prosthesis-based analysis. The pooled specificity of PET or PET/CT in detecting prosthetic infection was 86% (95% CI 83-89%) on a per-prosthesis-based analysis. The area under the ROC curve was 0.93 on a per-prosthesis-based analysis. In patients with suspected prosthetic infection, FDG PET or PET/CT demonstrated high sensitivity and specificity and are accurate methods in this setting. Nevertheless, possible sources of false-positive results and influencing factors should be kept in mind.
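One common way to pool per-study sensitivities is inverse-variance weighting on the logit scale, sketched below; the paper's exact pooling model is not specified here, and the study counts are illustrative.

```python
import numpy as np

def pooled_logit(successes, totals):
    """Inverse-variance fixed-effect pooling of proportions on the
    logit scale; returns the pooled proportion."""
    p = (successes + 0.5) / (totals + 1.0)   # continuity correction
    logit = np.log(p / (1.0 - p))
    var = 1.0 / (totals * p * (1.0 - p))     # approximate logit variance
    w = 1.0 / var
    pooled = np.sum(w * logit) / np.sum(w)
    return 1.0 / (1.0 + np.exp(-pooled))     # back-transform to proportion

tp = np.array([20, 35, 14, 41])    # true positives per study (illustrative)
dis = np.array([24, 40, 16, 50])   # infected prostheses per study
print("pooled sensitivity:", pooled_logit(tp, dis))
```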
Pugh, Carla M; DaRosa, Debra A
2013-10-01
There is a paucity of performance-based assessments that focus on intraoperative decision making. The purpose of this article is to review the performance outcomes and usefulness of two performance-based assessments that were developed using cognitive task analysis (CTA) frameworks. Assessment-A used CTA to create a "think aloud" oral examination that was administered while junior residents (PGY 1-2's, N = 69) performed a porcine-based laparoscopic cholecystectomy. Assessment-B used CTA to create a simulation-based, formative assessment of senior residents' (PGY 4-5's, N = 29) decision making during a laparoscopic ventral hernia repair. In addition to survey-based assessments of usefulness, a multiconstruct evaluation was performed using eight variables. When comparing performance outcomes, both approaches revealed major deficiencies in residents' intraoperative decision-making skills. Multiconstruct evaluation of the two CTA approaches revealed assessment method advantages for five of the eight evaluation areas: (1) Cognitive Complexity, (2) Content Quality, (3) Content Coverage, (4) Meaningfulness, and (5) Transfer and Generalizability. The two CTA performance assessments were useful in identifying significant training needs. While there are pros and cons to each approach, the results serve as a useful blueprint for program directors seeking to develop performance-based assessments for intraoperative decision making. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
Plenis, Alina; Rekowska, Natalia; Bączek, Tomasz
2016-01-01
This article focuses on correlating the column classification obtained from the method created at the Katholieke Universiteit Leuven (KUL), with the chromatographic resolution attained in biomedical separation. In the KUL system, each column is described with four parameters, which enables estimation of the FKUL value characterising similarity of those parameters to the selected reference stationary phase. Thus, a ranking list based on the FKUL value can be calculated for the chosen reference column, then correlated with the results of the column performance test. In this study, the column performance test was based on analysis of moclobemide and its two metabolites in human plasma by liquid chromatography (LC), using 18 columns. The comparative study was performed using traditional correlation of the FKUL values with the retention parameters of the analytes describing the column performance test. In order to deepen the comparative assessment of both data sets, factor analysis (FA) was also used. The obtained results indicated that the stationary phase classes, closely related according to the KUL method, yielded comparable separation for the target substances. Therefore, the column ranking system based on the FKUL-values could be considered supportive in the choice of the appropriate column for biomedical analysis. PMID:26805819
Amplify-and-forward cooperative diversity for green UWB-based WBSNs.
Shaban, Heba; Abou El-Nasr, Mohamad
2013-01-01
This paper proposes a novel green cooperative diversity technique based on suboptimal template-based ultra-wideband (UWB) wireless body sensor networks (WBSNs) using amplify-and-forward (AF) relays. In addition, it analyzes the bit-error-rate (BER) performance of the proposed nodes. The analysis is based on the moment-generating function (MGF) of the total signal-to-noise ratio (SNR) at the destination. It also provides an approximate value for the total SNR. The analysis studies the performance of equally correlated binary pulse position modulation (EC-BPPM) assuming the sinusoidal and square suboptimal template pulses. Numerical results are provided for the performance evaluation of optimal and suboptimal template-based nodes with and without relay cooperation. Results show that one relay node provides ~23 dB performance enhancement at a BER of 10⁻³, which mitigates the effect of the undesirable non-line-of-sight (NLOS) links in WBSNs.
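For intuition about the relay gain reported above, a minimal sketch of amplify-and-forward SNR combining is given below. It uses the standard AF end-to-end SNR approximation g_sd + g_sr*g_rd/(g_sr + g_rd + 1) and a generic Q(sqrt(SNR)) BER curve; this is not the paper's exact MGF-based EC-BPPM analysis, and the relay-link SNR advantages are assumed values.

    import numpy as np
    from scipy.stats import norm

    def af_total_snr(g_sd, g_sr, g_rd):
        # Standard AF end-to-end SNR approximation: direct plus relayed link
        return g_sd + (g_sr * g_rd) / (g_sr + g_rd + 1.0)

    snr_db = np.arange(0, 21, 4)
    g = 10.0 ** (snr_db / 10.0)
    ber_direct = norm.sf(np.sqrt(g))               # generic Q(sqrt(SNR)) curve
    ber_relay = norm.sf(np.sqrt(af_total_snr(g, 2 * g, 2 * g)))  # assumed link gains
    for s, b0, b1 in zip(snr_db, ber_direct, ber_relay):
        print(f"{s:2d} dB: direct {b0:.2e}, with one AF relay {b1:.2e}")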
Ataer-Cansizoglu, Esra; Bolon-Canedo, Veronica; Campbell, J Peter; Bozkurt, Alican; Erdogmus, Deniz; Kalpathy-Cramer, Jayashree; Patel, Samir; Jonas, Karyn; Chan, R V Paul; Ostmo, Susan; Chiang, Michael F
2015-11-01
We developed and evaluated the performance of a novel computer-based image analysis system for grading plus disease in retinopathy of prematurity (ROP), and identified the image features, shapes, and sizes that best correlate with expert diagnosis. A dataset of 77 wide-angle retinal images from infants screened for ROP was collected. A reference standard diagnosis was determined for each image by combining image grading from 3 experts with the clinical diagnosis from ophthalmoscopic examination. Manually segmented images were cropped into a range of shapes and sizes, and a computer algorithm was developed to extract tortuosity and dilation features from arteries and veins. Each feature was fed into our system to identify the set of characteristics that yielded the highest-performing system compared to the reference standard, which we refer to as the "i-ROP" system. Among the tested crop shapes, sizes, and measured features, point-based measurements of arterial and venous tortuosity (combined), and a large circular cropped image (with radius 6 times the disc diameter), provided the highest diagnostic accuracy. The i-ROP system achieved 95% accuracy for classifying preplus and plus disease compared to the reference standard. This was comparable to the performance of the 3 individual experts (96%, 94%, 92%), and significantly higher than the mean performance of 31 nonexperts (81%). This comprehensive analysis of computer-based plus disease diagnosis suggests that it may be feasible to develop a fully automated system based on wide-angle retinal images that performs comparably to expert graders at three-level plus disease discrimination. Computer-based image analysis, using objective and quantitative retinal vascular features, has the potential to complement clinical ROP diagnosis by ophthalmologists.
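As a concrete, though simplified, illustration of one such vessel feature, the snippet below computes a basic arc-to-chord tortuosity for a sampled centerline. Both the definition and the synthetic vessel are assumptions for illustration only; the i-ROP system's point-based tortuosity measure is more elaborate.

    import numpy as np

    def tortuosity(points):
        # Arc-length over chord-length ratio for a sampled centerline
        seg = np.diff(points, axis=0)
        arc = np.linalg.norm(seg, axis=1).sum()
        chord = np.linalg.norm(points[-1] - points[0])
        return arc / chord

    t = np.linspace(0.0, np.pi, 200)
    vessel = np.column_stack([t, 0.2 * np.sin(5 * t)])  # synthetic wiggly vessel
    print(f"tortuosity = {tortuosity(vessel):.3f}")     # 1.0 = perfectly straight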
Faradji, Farhad; Ward, Rabab K; Birch, Gary E
2009-06-15
The feasibility of having a self-paced brain-computer interface (BCI) based on mental tasks is investigated. The EEG signals of four subjects performing five mental tasks each are used in the design of a 2-state self-paced BCI. The output of the BCI should only be activated when the subject performs a specific mental task and should remain inactive otherwise. For each subject and each task, the feature coefficient and the classifier that yield the best performance are selected, using the autoregressive coefficients as the features. The classifier with a zero false positive rate and the highest true positive rate is selected as the best classifier. The classifiers tested include: linear discriminant analysis, quadratic discriminant analysis, Mahalanobis discriminant analysis, support vector machine, and radial basis function neural network. The results show that: (1) some classifiers obtained the desired zero false positive rate; (2) the linear discriminant analysis classifier does not yield acceptable performance; (3) the quadratic discriminant analysis classifier outperforms the Mahalanobis discriminant analysis classifier and performs almost as well as the radial basis function neural network; and (4) the support vector machine classifier has the highest true positive rates but unfortunately has nonzero false positive rates in most cases.
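A minimal sketch of this kind of classifier comparison is shown below using scikit-learn, with synthetic stand-ins for the autoregressive feature vectors (Mahalanobis discriminant analysis and the RBF network are omitted for brevity). In the paper's selection scheme, one would keep, per subject and task, only classifiers with a zero false positive rate and then pick the one with the highest true positive rate.

    import numpy as np
    from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                               QuadraticDiscriminantAnalysis)
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(0)
    # Synthetic stand-ins for AR feature vectors (not real EEG features)
    X = np.vstack([rng.normal(0.0, 1.0, (200, 6)), rng.normal(1.5, 1.0, (200, 6))])
    y = np.repeat([0, 1], 200)
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0, stratify=y)

    for name, clf in {"LDA": LinearDiscriminantAnalysis(),
                      "QDA": QuadraticDiscriminantAnalysis(),
                      "SVM": SVC()}.items():
        tn, fp, fn, tp = confusion_matrix(yte, clf.fit(Xtr, ytr).predict(Xte)).ravel()
        print(f"{name}: FPR = {fp / (fp + tn):.3f}, TPR = {tp / (tp + fn):.3f}")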
ERIC Educational Resources Information Center
Burk, Erlan
2012-01-01
Aerospace companies needed additional research on technology-based training to verify expectations when enhancing human capital through online systems analysis training. The research for online systems analysis training provided aerospace companies a means to verify expectations for systems analysis technology-based training on business…
Fan, Chunlin; Deng, Jiewei; Yang, Yunyun; Liu, Junshan; Wang, Ying; Zhang, Xiaoqi; Fai, Kuokchiu; Zhang, Qingwen; Ye, Wencai
2013-10-01
An ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UPLC-QTOF-MS) method integrating multi-ingredient determination and fingerprint analysis has been established for quality assessment and control of leaves from Ilex latifolia. The method possesses the advantages of speed, efficiency and accuracy, and allows multi-ingredient determination and fingerprint analysis in one chromatographic run within 13 min. Multi-ingredient determination was performed based on the extracted ion chromatograms of the exact pseudo-molecular ions (with a 0.01 Da window), and fingerprint analysis was performed based on the base peak chromatograms, obtained by negative-ion electrospray ionization QTOF-MS. The method validation results demonstrated that the developed method possesses desirable specificity, linearity, precision and accuracy. The method was utilized to analyze 22 I. latifolia samples from different origins. The quality assessment was achieved by using both similarity analysis (SA) and principal component analysis (PCA), and the results from SA were consistent with those from PCA. Our experimental results demonstrate that the strategy of integrating multi-ingredient determination and fingerprint analysis using the UPLC-QTOF-MS technique is a useful approach for rapid pharmaceutical analysis, with promising prospects for the differentiation of origin, the determination of authenticity, and the overall quality assessment of herbal medicines. Copyright © 2013 Elsevier B.V. All rights reserved.
Performance analysis of a coherent free space optical communication system based on experiment.
Cao, Jingtai; Zhao, Xiaohui; Liu, Wei; Gu, Haijun
2017-06-26
Based on our previous study and a designed experimental AO system with a 97-element continuous-surface deformable mirror, we conduct a performance analysis of a coherent free space optical communication (FSOC) system in terms of mixing efficiency (ME), bit error rate (BER) and outage probability under different Greenwood frequencies and atmospheric coherence lengths. The results show that the influence of the atmospheric temporal characteristics on performance is slightly stronger than that of the spatial characteristics when the receiving aperture and the number of sub-apertures are given. This analysis provides a reference for the design of coherent FSOC systems.
Using enterprise architecture to analyse how organisational structure impact motivation and learning
NASA Astrophysics Data System (ADS)
Närman, Pia; Johnson, Pontus; Gingnell, Liv
2016-06-01
When technology, environment, or strategies change, organisations need to adjust their structures accordingly. These structural changes do not always enhance organisational performance as intended, partly because organisational developers do not understand the consequences of structural changes for performance. This article presents a model-based analysis framework for quantitative analysis of the effect of organisational structure on organisational performance in terms of employee motivation and learning. The model is based on Mintzberg's work on organisational structure. The quantitative analysis is formalised using the Object Constraint Language (OCL) and the Unified Modelling Language (UML) and implemented in an enterprise architecture tool.
Lunar base surface mission operations. Lunar Base Systems Study (LBSS) task 4.1
NASA Technical Reports Server (NTRS)
1987-01-01
The purpose was to perform an analysis of the surface operations associated with a human-tended lunar base. Specifically, the study defined surface elements and developed mission manifests for a selected base scenario, determined the nature of surface operations associated with this scenario, generated a preliminary crew extravehicular and intravehicular activity (EVA/IVA) time resource schedule for conducting the missions, and proposed concepts for utilizing remotely operated equipment to perform repetitious or hazardous surface tasks. The operations analysis was performed on a 6-year period of human-tended lunar base operation prior to permanent occupancy. The baseline scenario was derived from a modified version of the civil needs database (CNDB) scenario. This scenario emphasizes achievement of a limited set of science and exploration objectives while emplacing the minimum habitability elements required for a permanent base.
NASA Technical Reports Server (NTRS)
Johnston, John D.; Parrish, Keith; Howard, Joseph M.; Mosier, Gary E.; McGinnis, Mark; Bluth, Marcel; Kim, Kevin; Ha, Hong Q.
2004-01-01
This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical, often referred to as "STOP", analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. The paper begins with an overview of multi-disciplinary engineering analysis, or integrated modeling, which is a critical element of the JWST mission. The STOP analysis process is then described. This process consists of the following steps: thermal analysis, structural analysis, and optical analysis. Temperatures predicted using geometric and thermal math models are mapped to the structural finite element model in order to predict thermally-induced deformations. Motions and deformations at optical surfaces are input to optical models, and optical performance is predicted using either an optical ray trace or WFE estimation techniques based on prior ray traces or first-order optics. Following the discussion of the analysis process, results are presented based on models representing the design at the time of the System Requirements Review. In addition to baseline performance predictions, sensitivity studies are performed to assess modeling uncertainties. Of particular interest is the sensitivity of optical performance to uncertainties in temperature predictions and variations in metal properties. The paper concludes with a discussion of modeling uncertainty as it pertains to STOP analysis.
Positioning performance analysis of the time sum of arrival algorithm with error features
NASA Astrophysics Data System (ADS)
Gong, Feng-xun; Ma, Yan-qiu
2018-03-01
The theoretical positioning accuracy of multilateration (MLAT) with the time difference of arrival (TDOA) algorithm is very high. However, there are some problems in practical applications. Here we analyze the location performance of the time sum of arrival (TSOA) algorithm in terms of the root mean square error (RMSE) and geometric dilution of precision (GDOP) in an additive white Gaussian noise (AWGN) environment. The TSOA localization model is constructed, and the distribution of the location ambiguity region is presented for four base stations. The location performance analysis then proceeds from the four-base-station case by calculating the RMSE and GDOP variation. Subsequently, when location parameters such as the number of base stations and the base station layout are changed, the performance patterns of the TSOA location algorithm are shown, revealing the TSOA location characteristics and performance. The RMSE and GDOP trends demonstrate the anti-noise performance and robustness of the TSOA localization algorithm. This anti-noise performance can be used to reduce the blind zone and the false location rate of MLAT systems.
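A minimal numeric sketch of a GDOP computation for range-sum (TSOA) measurements follows. The station layout and the reference-station pairing are assumptions for illustration; the paper's analysis additionally covers RMSE and varying station counts and layouts.

    import numpy as np

    stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])

    def tsoa_gdop(p, pairs=((0, 1), (0, 2), (0, 3))):
        # A TSOA measurement is the sum of ranges to a station pair, so its
        # gradient is the sum of the unit vectors from the two stations to p.
        u = p - stations
        u = u / np.linalg.norm(u, axis=1, keepdims=True)
        H = np.array([u[i] + u[j] for i, j in pairs])
        return np.sqrt(np.trace(np.linalg.inv(H.T @ H)))

    for p in ([5.0, 5.0], [2.0, 8.0], [9.0, 1.0]):
        print(p, f"GDOP = {tsoa_gdop(np.array(p)):.2f}")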
Using Android-Based Educational Game for Learning Colloid Material
NASA Astrophysics Data System (ADS)
Sari, S.; Anjani, R.; Farida, I.; Ramdhani, M. A.
2017-09-01
This research is based on the importance of developing students' chemical literacy on Colloid material using Android-based educational game media. Educational game products are developed through a research and development design. In the analysis phase, material analysis is performed to generate concept maps, determine chemical literacy indicators, define game strategies and set game paths. In the design phase, product packaging is carried out, and then validation and feasibility tests are performed. The research produced an Android-based educational game with the following characteristics: the Colloid material is presented in 12 game levels in the form of questions and challenges, with contextual visualization of discourse, images and animation to develop thinking processes and attitudes. Based on the analysis of validation and trial results, the product is considered feasible to use.
Fuzzy logic based sensor performance evaluation of vehicle mounted metal detector systems
NASA Astrophysics Data System (ADS)
Abeynayake, Canicious; Tran, Minh D.
2015-05-01
Vehicle Mounted Metal Detector (VMMD) systems are widely used for detection of threat objects in humanitarian demining and military route clearance scenarios. Due to the diverse nature of such operational conditions, operational use of VMMD without a proper understanding of its capability boundaries may lead to heavy casualties. Multi-criteria fitness evaluations are crucial for determining capability boundaries of any sensor-based demining equipment. Evaluation of sensor-based military equipment is a multi-disciplinary topic combining the efforts of researchers, operators, managers and commanders having different professional backgrounds and knowledge profiles. Information acquired through field tests usually involves uncertainty, vagueness and imprecision due to variations in test and evaluation conditions during a single test or series of tests. This report presents a fuzzy logic based methodology for experimental data analysis and performance evaluation of VMMD. This data evaluation methodology has been developed to evaluate sensor performance by consolidating expert knowledge with experimental data. A case study is presented by implementing the proposed data analysis framework in a VMMD evaluation scenario. The results of this analysis confirm the accuracy, practicability and reliability of the fuzzy logic based sensor performance evaluation framework.
ERIC Educational Resources Information Center
Pitas, Nicholas; Murray, Alison; Olsen, Max; Graefe, Alan
2017-01-01
This article describes a modified importance-performance framework for use in evaluation of recreation-based experiential learning programs. Importance-performance analysis (IPA) provides an effective and readily applicable means of evaluating many programs, but the near universal satisfaction associated with recreation inhibits the use of IPA in…
Lam, H K; Leung, Frank H F
2007-10-01
This correspondence presents the stability analysis and performance design of the continuous-time fuzzy-model-based control systems. The idea of the nonparallel-distributed-compensation (non-PDC) control laws is extended to the continuous-time fuzzy-model-based control systems. A nonlinear controller with non-PDC control laws is proposed to stabilize the continuous-time nonlinear systems in Takagi-Sugeno form. To produce the stability-analysis result, a parameter-dependent Lyapunov function (PDLF) is employed. However, two difficulties are usually encountered: 1) the time-derivative terms produced by the PDLF will complicate the stability analysis and 2) the stability conditions are not in the form of linear-matrix inequalities (LMIs) that aid the design of feedback gains. To tackle the first difficulty, the time-derivative terms are represented by some weighted-sum terms in some existing approaches, which will increase the number of stability conditions significantly. In view of the second difficulty, some positive-definite terms are added in order to cast the stability conditions into LMIs. In this correspondence, the favorable properties of the membership functions and nonlinear control laws, which allow the introduction of some free matrices, are employed to alleviate the two difficulties while retaining the favorable properties of the PDLF-based approach. LMI-based stability conditions are derived to ensure the system stability. Furthermore, based on a common scalar performance index, LMI-based performance conditions are derived to guarantee the system performance. Simulation examples are given to illustrate the effectiveness of the proposed approach.
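As a toy illustration of casting stability conditions as LMIs, the sketch below checks quadratic Lyapunov stability of a single linear system with CVXPY. The paper's conditions are considerably richer (parameter-dependent Lyapunov functions, membership-function information, performance indices); the system matrix here is an assumed example.

    import numpy as np
    import cvxpy as cp

    # Assumed example system (a single linear system, not a T-S fuzzy model)
    A = np.array([[0.0, 1.0], [-2.0, -3.0]])

    P = cp.Variable((2, 2), symmetric=True)
    eps = 1e-6
    constraints = [P >> eps * np.eye(2),                 # P positive definite
                   A.T @ P + P @ A << -eps * np.eye(2)]  # Lyapunov LMI
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve(solver=cp.SCS)
    print(prob.status)
    print(P.value)

A feasible P certifies that V(x) = x'Px decreases along trajectories; fuzzy-model-based conditions stack analogous LMIs over the rules and their interactions.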
Kopriva, Ivica; Persin, Antun; Puizina-Ivić, Neira; Mirić, Lina
2010-07-02
This study was designed to demonstrate robust performance of the novel dependent component analysis (DCA)-based approach to demarcation of the basal cell carcinoma (BCC) through unsupervised decomposition of the red-green-blue (RGB) fluorescent image of the BCC. Robustness to intensity fluctuation is due to the scale invariance property of DCA algorithms, which exploit spectral and spatial diversities between the BCC and the surrounding tissue. The filtering-based DCA approach used here represents an extension of independent component analysis (ICA) and is necessary in order to account for the statistical dependence induced by spectral similarity between the BCC and surrounding tissue. This similarity generates weak edges, which represents a challenge for other segmentation methods as well. By comparative performance analysis with state-of-the-art image segmentation methods such as active contours (level set), K-means clustering, non-negative matrix factorization, ICA and ratio imaging, we experimentally demonstrate good performance of DCA-based BCC demarcation in two demanding scenarios where the intensity of the fluorescent image has been varied by almost two orders of magnitude. Copyright 2010 Elsevier B.V. All rights reserved.
A Systemic Cause Analysis Model for Human Performance Technicians
ERIC Educational Resources Information Center
Sostrin, Jesse
2011-01-01
This article presents a systemic, research-based cause analysis model for use in the field of human performance technology (HPT). The model organizes the most prominent barriers to workplace learning and performance into a conceptual framework that explains and illuminates the architecture of these barriers that exist within the fabric of everyday…
Performance analysis of an integrated GPS/inertial attitude determination system. M.S. Thesis - MIT
NASA Technical Reports Server (NTRS)
Sullivan, Wendy I.
1994-01-01
The performance of an integrated GPS/inertial attitude determination system is investigated using a linear covariance analysis. The principles of GPS interferometry are reviewed, and the major error sources of both interferometers and gyroscopes are discussed and modeled. A new figure of merit, attitude dilution of precision (ADOP), is defined for two possible GPS attitude determination methods, namely single difference and double difference interferometry. Based on this figure of merit, a satellite selection scheme is proposed. The performance of the integrated GPS/inertial attitude determination system is determined using a linear covariance analysis. Based on this analysis, it is concluded that the baseline errors (i.e., knowledge of the GPS interferometer baseline relative to the vehicle coordinate system) are the limiting factor in system performance. By reducing baseline errors, it should be possible to use lower quality gyroscopes without significantly reducing performance. For the cases considered, single difference interferometry is only marginally better than double difference interferometry. Finally, the performance of the system is found to be relatively insensitive to the satellite selection technique.
Relation between brain architecture and mathematical ability in children: a DBM study.
Han, Zhaoying; Davis, Nicole; Fuchs, Lynn; Anderson, Adam W; Gore, John C; Dawant, Benoit M
2013-12-01
Population-based studies indicate that between 5 and 9 percent of US children exhibit significant deficits in mathematical reasoning, yet little is understood about the brain morphological features related to mathematical performances. In this work, deformation-based morphometry (DBM) analyses have been performed on magnetic resonance images of the brains of 79 third graders to investigate whether there is a correlation between brain morphological features and mathematical proficiency. Group comparison was also performed between Math Difficulties (MD-worst math performers) and Normal Controls (NC), where each subgroup consists of 20 age and gender matched subjects. DBM analysis is based on the analysis of the deformation fields generated by non-rigid registration algorithms, which warp the individual volumes to a common space. To evaluate the effect of registration algorithms on DBM results, five nonrigid registration algorithms have been used: (1) the Adaptive Bases Algorithm (ABA); (2) the Image Registration Toolkit (IRTK); (3) the FSL Nonlinear Image Registration Tool; (4) the Automatic Registration Tool (ART); and (5) the normalization algorithm available in SPM8. The deformation field magnitude (DFM) was used to measure the displacement at each voxel, and the Jacobian determinant (JAC) was used to quantify local volumetric changes. Results show there are no statistically significant volumetric differences between the NC and the MD groups using JAC. However, DBM analysis using DFM found statistically significant anatomical variations between the two groups around the left occipital-temporal cortex, left orbital-frontal cortex, and right insular cortex. Regions of agreement between at least two algorithms based on voxel-wise analysis were used to define Regions of Interest (ROIs) to perform an ROI-based correlation analysis on all 79 volumes. Correlations between average DFM values and standard mathematical scores over these regions were found to be significant. We also found that the choice of registration algorithm has an impact on DBM-based results, so we recommend using more than one algorithm when conducting DBM studies. To the best of our knowledge, this is the first study that uses DBM to investigate brain anatomical features related to mathematical performance in a relatively large population of children. © 2013.
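The two voxel-wise DBM quantities used here have compact definitions: the deformation field magnitude is the per-voxel displacement length, and the Jacobian determinant is det(I + grad u) for displacement field u. A small self-contained sketch on a synthetic field (unit voxel spacing assumed) follows.

    import numpy as np

    rng = np.random.default_rng(1)
    shape = (32, 32, 32)
    # Synthetic displacement field u: 3 components per voxel (not real MRI data)
    u = rng.normal(0.0, 0.05, size=(3,) + shape)

    # Deformation field magnitude (DFM): per-voxel displacement length
    dfm = np.sqrt((u ** 2).sum(axis=0))

    # Jacobian determinant of phi(x) = x + u(x): det(I + grad u)
    J = np.zeros(shape + (3, 3))
    for i in range(3):
        grads = np.gradient(u[i])        # grads[j] = d u_i / d x_j
        for j in range(3):
            J[..., i, j] = grads[j]
    J += np.eye(3)
    jac = np.linalg.det(J)
    print(f"mean DFM = {dfm.mean():.4f}, mean Jacobian = {jac.mean():.4f}")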
Elastic-plastic mixed-iterative finite element analysis: Implementation and performance assessment
NASA Technical Reports Server (NTRS)
Sutjahjo, Edhi; Chamis, Christos C.
1993-01-01
An elastic-plastic algorithm based on von Mises and associative flow criteria is implemented in MHOST, a mixed-iterative finite element analysis computer program developed by NASA Lewis Research Center. The performance of the resulting elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors of 4-node quadrilateral shell finite elements are tested for elastic-plastic performance. Generally, the membrane results are excellent, indicating that the implementation of elastic-plastic mixed-iterative analysis is appropriate.
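For reference, the von Mises criterion reduces to comparing an equivalent stress against the material's yield strength. A minimal check is sketched below; the stress state and yield strength are assumed example values, and the algorithm's associative flow rule and stress-return steps are not shown.

    import numpy as np

    def von_mises(sigma):
        # Equivalent stress from a 3x3 stress tensor via its deviatoric part
        s = sigma - np.trace(sigma) / 3.0 * np.eye(3)
        return np.sqrt(1.5 * np.tensordot(s, s))

    sigma = np.array([[120.0, 30.0, 0.0],   # assumed stress state, MPa
                      [30.0, 80.0, 0.0],
                      [0.0, 0.0, 40.0]])
    yield_stress = 250.0                    # assumed yield strength, MPa
    eq = von_mises(sigma)
    print(f"sigma_vm = {eq:.1f} MPa:", "plastic" if eq >= yield_stress else "elastic")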
Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi
2016-01-01
Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405
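PAEA itself is geometric (principal angles after dimensionality reduction), but the conventional baseline such methods are benchmarked against is easy to state: score a gene set by the hypergeometric tail probability of its overlap with the differentially expressed genes. A minimal sketch with assumed counts:

    from scipy.stats import hypergeom

    # Assumed counts: N background genes, K in the gene set,
    # n differentially expressed, k of those inside the gene set
    N, K, n, k = 20000, 150, 400, 12
    p = hypergeom.sf(k - 1, N, K, n)   # P(overlap >= k) under random draws
    print(f"over-representation p-value = {p:.3g}")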
2004-01-01
Cognitive Task Analysis. Abstract: As Department of Defense (DoD) leaders rely more on modeling and simulation to provide information on which to base... capabilities and intent. Cognitive Task Analysis (CTA) is an extensive, detailed look at tasks and subtasks performed by a... Domain Analysis and Task Analysis: A Difference That Matters. In Cognitive Task Analysis, edited by J. M. Schraagen, S.
Analysis of Rosen piezoelectric transformers with a varying cross-section.
Xue, H; Yang, J; Hu, Y
2008-07-01
We study the effects of a varying cross-section on the performance of Rosen piezoelectric transformers operating with length extensional modes of rods. A theoretical analysis is performed using an extended version of a one-dimensional model developed in a previous paper. Numerical results based on the theoretical analysis are presented.
Frequency Spectrum Method-Based Stress Analysis for Oil Pipelines in Earthquake Disaster Areas
Wu, Xiaonan; Lu, Hongfang; Huang, Kun; Wu, Shijuan; Qiao, Weibiao
2015-01-01
When a long distance oil pipeline crosses an earthquake disaster area, inertial force and strong ground motion can cause the pipeline stress to exceed the failure limit, resulting in bending and deformation failure. To date, researchers have performed limited safety analyses of oil pipelines in earthquake disaster areas that include stress analysis. Therefore, using the spectrum method and the theory of one-dimensional beam units, CAESAR II is used to perform a dynamic earthquake analysis for an oil pipeline in the XX earthquake disaster area. This software is used to determine whether the displacement and stress of the pipeline meet the standards when subjected to a strong earthquake. After performing the numerical analysis, the primary axial, longitudinal and horizontal displacement directions under seismic action and the critical section of the pipeline can be located. Feasible project enhancement suggestions based on the analysis results are proposed. Designers can use this stress analysis method to perform an ultimate design for an oil pipeline in earthquake disaster areas, thereby improving the safe operation of the pipeline. PMID:25692790
NASA Astrophysics Data System (ADS)
Murrill, Steven R.; Franck, Charmaine C.; Espinola, Richard L.; Petkie, Douglas T.; De Lucia, Frank C.; Jacobs, Eddie L.
2011-11-01
The U.S. Army Research Laboratory (ARL) and the U.S. Army Night Vision and Electronic Sensors Directorate (NVESD) have developed a terahertz-band imaging system performance model/tool for detection and identification of concealed weaponry. The details of the MATLAB-based model, which accounts for the effects of all critical sensor and display components and for the effects of atmospheric attenuation, concealment material attenuation, and active illumination, were reported at the 2005 SPIE Europe Security & Defence Symposium (Brugge). An advanced version of the base model, which accounts for both the dramatic impact that target and background orientation can have on target observability as related to specular and Lambertian reflections captured by an active-illumination-based imaging system, and for the impact of target and background thermal emission, was reported at the 2007 SPIE Defense and Security Symposium (Orlando). This paper will provide a comprehensive review of an enhanced, user-friendly, Windows-executable, terahertz-band imaging system performance analysis and design tool that now includes additional features such as a MODTRAN-based atmospheric attenuation calculator and advanced system architecture configuration inputs that allow for straightforward performance analysis of active or passive systems based on scanning (single- or line-array detector element(s)) or staring (focal-plane-array detector elements) imaging architectures. This newly enhanced THz imaging system design tool is an extension of the advanced THz imaging system performance model that was developed under the Defense Advanced Research Projects Agency's (DARPA) Terahertz Imaging Focal-Plane Technology (TIFT) program. This paper will also provide example system component (active-illumination source and detector) trade-study analyses using the new features of this user-friendly THz imaging system performance analysis and design tool.
Novel Image Encryption Scheme Based on Chebyshev Polynomial and Duffing Map
2014-01-01
We present a novel image encryption algorithm using a Chebyshev polynomial based on permutation and substitution and a Duffing map based on substitution. Comprehensive security analysis has been performed on the designed scheme using key space analysis, visual testing, histogram analysis, information entropy calculation, correlation coefficient analysis, differential analysis, key sensitivity test, and speed test. The study demonstrates that the proposed image encryption algorithm offers a key space of more than 10¹¹³ and a desirable level of security based on the good statistical results and theoretical arguments. PMID:25143970
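Two of the listed security checks are simple to reproduce directly. The sketch below evaluates information entropy (ideally close to 8 bits for an 8-bit cipher image) and horizontal adjacent-pixel correlation (ideally near zero) on a synthetic stand-in image rather than an actual cipher image.

    import numpy as np

    rng = np.random.default_rng(2)
    img = rng.integers(0, 256, (256, 256), dtype=np.uint8)  # stand-in cipher image

    # Information entropy of the pixel histogram
    p = np.bincount(img.ravel(), minlength=256) / img.size
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))

    # Horizontal adjacent-pixel correlation
    x = img[:, :-1].ravel().astype(float)
    y = img[:, 1:].ravel().astype(float)
    corr = np.corrcoef(x, y)[0, 1]
    print(f"entropy = {entropy:.4f} bits, adjacent-pixel corr = {corr:.4f}")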
ERIC Educational Resources Information Center
John H. Hinds Area Vocational School, Elwood, IN.
This book contains a task inventory, a task analysis of 150 tasks from that inventory, and a tool list for performance-based welding courses in the state of Indiana. The task inventory and tool list reflect 28 job titles found in Indiana. In the first part of the guide, tasks are listed by these domains: carbon-arc, electron beam, G.M.A.W., gas…
Advanced Video Analysis Needs for Human Performance Evaluation
NASA Technical Reports Server (NTRS)
Campbell, Paul D.
1994-01-01
Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.
The role of ecological dynamics in analysing performance in team sports.
Vilar, Luís; Araújo, Duarte; Davids, Keith; Button, Chris
2012-01-01
Performance analysis is a subdiscipline of the sports sciences, and one approach, notational analysis, has been used to objectively audit and describe behaviours of performers during different subphases of play, providing additional information for practitioners to improve future sports performance. Recent criticisms of these methods have suggested the need for a sound theoretical rationale to explain performance behaviours, not just describe them. The aim of this article was to show how ecological dynamics provides a valid theoretical explanation of performance in team sports by explaining the formation of successful and unsuccessful patterns of play, based on symmetry-breaking processes emerging from functional interactions between players and the performance environment. We offer the view that ecological dynamics is an upgrade to more operational methods of performance analysis that merely document statistics of competitive performance. In support of our arguments, we refer to exemplar data on competitive performance in team sports that have revealed functional interpersonal interactions between attackers and defenders, based on variations in the spatial positioning of performers relative to each other in critical performance areas, such as the scoring zones. Implications of this perspective are also considered for practice task design and sport development programmes.
ERIC Educational Resources Information Center
Yasar, M. Diyaddin
2017-01-01
This study performed content analysis and meta-analysis on dissertations related to brain-based learning in science education to identify the general trends and tendencies of brain-based learning in science education, and to determine the effect of such studies on the achievement and attitude of learners, with the ultimate aim of raising awareness…
Analysis of the Seismic Performance of Isolated Buildings according to Life-Cycle Cost
Dang, Yu; Han, Jian-ping; Li, Yong-tao
2015-01-01
This paper proposes an indicator of seismic performance based on life-cycle cost of a building. It is expressed as a ratio of lifetime damage loss to life-cycle cost and determines the seismic performance of isolated buildings. Major factors are considered, including uncertainty in hazard demand and structural capacity, initial costs, and expected loss during earthquakes. Thus, a high indicator value indicates poor building seismic performance. Moreover, random vibration analysis is conducted to measure structural reliability and evaluate the expected loss and life-cycle cost of isolated buildings. The expected loss of an actual, seven-story isolated hospital building is only 37% of that of a fixed-base building. Furthermore, the indicator of the structural seismic performance of the isolated building is much lower in value than that of the structural seismic performance of the fixed-base building. Therefore, isolated buildings are safer and less risky than fixed-base buildings. The indicator based on life-cycle cost assists owners and engineers in making investment decisions in consideration of structural design, construction, and expected loss. It also helps optimize the balance between building reliability and building investment. PMID:25653677
2010-12-01
...important factor is that all the offerors have adequate information about the requirements and performance-based strategy. That is why communication with... progress and unsuccessful results. Lack of Skilled Acquisition Workforce: As we know, the success of every system and organization is based on the... term services such as information technology services. A GAO (2008) report found that "implementing a performance-based approach is often more...
ERIC Educational Resources Information Center
Travis, James L., III
2014-01-01
This study investigated how and to what extent the development and use of the OV-5a operational architecture decomposition tree (OADT) from the Department of Defense (DoD) Architecture Framework (DoDAF) affects requirements analysis with respect to complete performance metrics for performance-based services acquisition of ICT under rigid…
An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Olson, E. D.; Mavris, D. N.
2000-01-01
An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.
Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.
Saccenti, Edoardo; Timmerman, Marieke E
2017-03-01
Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
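A compact sketch of Horn's parallel analysis for principal components follows: observed covariance eigenvalues are compared against the 95th percentile of eigenvalues obtained from data simulated under independence, and leading components that exceed the threshold are retained. The data set below is synthetic with two planted components; a Tracy-Widom test would replace the simulation step for the first component.

    import numpy as np

    def parallel_analysis(X, n_sims=200, q=95, seed=0):
        # Keep components whose observed eigenvalues exceed the q-th
        # percentile of eigenvalues from independent normal data
        rng = np.random.default_rng(seed)
        n, p = X.shape
        obs = np.sort(np.linalg.eigvalsh(np.cov(X, rowvar=False)))[::-1]
        null = np.empty((n_sims, p))
        for s in range(n_sims):
            Z = rng.normal(size=(n, p)) * X.std(axis=0, ddof=1)
            null[s] = np.sort(np.linalg.eigvalsh(np.cov(Z, rowvar=False)))[::-1]
        thresh = np.percentile(null, q, axis=0)
        below = np.nonzero(obs <= thresh)[0]
        return int(below[0]) if below.size else p

    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 2)) @ rng.normal(size=(2, 10)) \
        + rng.normal(size=(300, 10))
    print("components retained:", parallel_analysis(X))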
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herrera, Joshua M.
2015-03-01
This report is an analysis of the means of egress and life safety requirements for the laboratory building. The building is located at Sandia National Laboratories (SNL) in Albuquerque, NM. The report includes a prescriptive-based analysis as well as a performance-based analysis. Following the analysis are appendices which contain maps of the laboratory building used throughout the analysis. The top of all the maps is assumed to be north.
NASA Astrophysics Data System (ADS)
Arsad, Roslah; Shaari, Siti Nabilah Mohd; Isa, Zaidi
2017-11-01
Determining stock performance using financial ratios is challenging for many investors and researchers. Financial ratios can indicate the strengths and weaknesses of a company's stock performance. There are five categories of financial ratios, namely liquidity, efficiency, leverage, profitability and market ratios. It is important to interpret the ratios correctly for proper financial decision making. The purpose of this study is to compare the performance of listed companies in Bursa Malaysia using Data Envelopment Analysis (DEA) and DuPont analysis models. The study is conducted in 2015 and involves 116 consumer products companies listed in Bursa Malaysia. The Data Envelopment Analysis estimation method computes efficiency scores and ranks the companies accordingly. The Alirezaee and Afsharian method of analysis, based on the Charnes, Cooper and Rhodes (CCR) model with Constant Returns to Scale (CRS), is employed. The DuPont analysis is a traditional tool for measuring the operating performance of companies. In this study, DuPont analysis is used to evaluate three different aspects, namely profitability, efficiency of asset utilization and financial leverage. Return on Equity (ROE) is also calculated in the DuPont analysis. This study finds that the two analysis models provide different rankings of the selected samples. Hypothesis testing based on Pearson's correlation indicates that there is no correlation between the rankings produced by DEA and DuPont analysis. The DEA ranking model proposed by Alirezaee and Afsharian is unstable; the method cannot provide a complete ranking because the values of the Balance Index are equal and zero.
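The DuPont identity underlying the three aspects above is ROE = profit margin x asset turnover x equity multiplier. A minimal sketch with illustrative figures (not data from the study) follows.

    # DuPont three-factor decomposition with illustrative figures
    net_income, revenue, assets, equity = 120.0, 1500.0, 900.0, 400.0

    profit_margin = net_income / revenue     # profitability
    asset_turnover = revenue / assets        # efficiency of asset utilization
    equity_multiplier = assets / equity      # financial leverage

    roe = profit_margin * asset_turnover * equity_multiplier
    assert abs(roe - net_income / equity) < 1e-9   # identity: ROE = NI / equity
    print(f"ROE = {roe:.1%}")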
COBRA ATD minefield detection model initial performance analysis
NASA Astrophysics Data System (ADS)
Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.
2000-08-01
A statistical performance analysis of the USMC Coastal Battlefield Reconnaissance and Analysis (COBRA) Minefield Detection (MFD) Model has been performed in support of the COBRA ATD Program under execution by the Naval Surface Warfare Center/Dahlgren Division/Coastal Systems Station. This analysis uses the Veridian ERIM International MFD model from the COBRA Sensor Performance Evaluation and Computational Tools for Research Analysis modeling toolbox and a collection of multispectral mine detection algorithm response distributions for mines and minelike clutter objects. These mine detection response distributions were generated from actual COBRA ATD test missions over littoral zone minefields. This analysis serves to validate both the utility and effectiveness of the COBRA MFD Model as a predictive MFD performance tool. COBRA ATD minefield detection model algorithm performance results based on a simulated baseline minefield detection scenario are presented, as well as results of an MFD model algorithm parametric sensitivity study.
Environmental Assessment (EA) for Construct Base Civil Engineering Complex at McConnell AFB
2003-07-14
Performing organization: Engineer Squadron (22 CES/CEVA), 53000 Hutchinson Street, Suite 109, McConnell AFB, KS 67221-3617. ... 1991, in an "Economic Analysis" performed by Wilson and Company, dated 14 October 1993, and a repeat "Economic Analysis" dated 10 February 2000.
Wind Energy Conversion System Analysis Model (WECSAM) computer program documentation
NASA Astrophysics Data System (ADS)
Downey, W. T.; Hendrick, P. L.
1982-07-01
Described is a computer-based wind energy conversion system analysis model (WECSAM) developed to predict the technical and economic performance of wind energy conversion systems (WECS). The model is written in CDC FORTRAN V. The version described accesses a data base containing wind resource data, application loads, WECS performance characteristics, utility rates, state taxes, and state subsidies for a six state region (Minnesota, Michigan, Wisconsin, Illinois, Ohio, and Indiana). The model is designed for analysis at the county level. The computer model includes a technical performance module and an economic evaluation module. The modules can be run separately or together. The model can be run for any single user-selected county within the region or looped automatically through all counties within the region. In addition, the model has a restart capability that allows the user to modify any data-base value written to a scratch file prior to the technical or economic evaluation.
Estimating Driving Performance Based on EEG Spectrum Analysis
NASA Astrophysics Data System (ADS)
Lin, Chin-Teng; Wu, Ruei-Cheng; Jung, Tzyy-Ping; Liang, Sheng-Fu; Huang, Teng-Yi
2005-12-01
The growing number of traffic accidents in recent years has become a serious concern to society. Accidents caused by the driver's drowsiness behind the steering wheel have a high fatality rate because of the marked decline in the driver's perception, recognition, and vehicle control abilities while sleepy. Preventing such accidents is highly desirable but requires techniques for continuously detecting, estimating, and predicting the level of alertness of drivers and delivering effective feedback to maintain their maximum performance. This paper proposes an EEG-based drowsiness estimation system that combines electroencephalogram (EEG) log subband power spectrum, correlation analysis, principal component analysis, and linear regression models to indirectly estimate the driver's drowsiness level in a virtual-reality-based driving simulator. Our results demonstrate that it is feasible to accurately and quantitatively estimate driving performance, expressed as the deviation between the center of the vehicle and the center of the cruising lane, in a realistic driving simulator.
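The estimation chain described (log subband power features, dimensionality reduction, then linear regression onto lane deviation) can be sketched compactly with scikit-learn. The feature matrix and target below are synthetic stand-ins, not EEG recordings.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    # Rows are epochs, columns stand in for log subband power features
    X = rng.normal(size=(500, 40))
    y = X[:, :3] @ np.array([0.8, -0.5, 0.3]) + rng.normal(scale=0.5, size=500)

    model = make_pipeline(StandardScaler(), PCA(n_components=10),
                          LinearRegression())
    model.fit(X[:400], y[:400])
    print(f"held-out R^2 = {model.score(X[400:], y[400:]):.3f}")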
Feng, Sheng; Lotz, Thomas; Chase, J Geoffrey; Hann, Christopher E
2010-01-01
Digital Image Elasto Tomography (DIET) is a non-invasive elastographic breast cancer screening technology based on image-based measurement of surface vibrations induced on a breast by mechanical actuation. Knowledge of the frequency response characteristics of a breast prior to imaging is critical to maximize the imaging signal and the diagnostic capability of the system. A feasibility analysis for a non-invasive image-based modal analysis system is presented that is able to robustly and rapidly identify resonant frequencies in soft tissue. Three images per oscillation cycle are enough to capture the behavior at a given frequency. Thus, a sweep over critical frequency ranges can be performed prior to imaging to determine the critical imaging settings of the DIET system that optimize its tumor detection performance.
Spotlight-8 Image Analysis Software
NASA Technical Reports Server (NTRS)
Klimek, Robert; Wright, Ted
2006-01-01
Spotlight is a cross-platform GUI-based software package designed to perform image analysis on sequences of images generated by combustion and fluid physics experiments run in a microgravity environment. Spotlight can perform analysis on a single image in an interactive mode or perform analysis on a sequence of images in an automated fashion. Image processing operations can be employed to enhance the image before various statistics and measurement operations are performed. An arbitrarily large number of objects can be analyzed simultaneously with independent areas of interest. Spotlight saves results in a text file that can be imported into other programs for graphing or further analysis. Spotlight can be run on Microsoft Windows, Linux, and Apple OS X platforms.
Performance evaluation of existing building structure with pushover analysis
NASA Astrophysics Data System (ADS)
Handana, MAP; Karolina, R.; Steven
2018-02-01
In the management of building infrastructure, buildings commonly sustain damage during their service period for several reasons, earthquakes being a common one. A building is planned to function for a certain service life, but during that service life it remains vulnerable to damage from various causes. Any damage should be detected as early as possible, because damage can spread, triggering and exacerbating further deterioration. The newest concept in earthquake engineering is Performance Based Earthquake Engineering (PBEE). PBEE is divided into two branches, namely Performance Based Seismic Design (PBSD) and Performance Based Seismic Evaluation (PBSE). One evaluation method within PBSE is nonlinear pushover analysis. Pushover analysis is a nonlinear static analysis in which the influence of the design earthquake on the building structure is represented as static loads applied at the center of mass of each floor. These loads are increased gradually until the first yielding (plastic hinge) occurs within the structure, and then further, producing large post-elastic deformations, with successive plastic hinges forming at other locations in the structure.
Sports Stars: Analyzing the Performance of Astronomers at Visualization-based Discovery
NASA Astrophysics Data System (ADS)
Fluke, C. J.; Parrington, L.; Hegarty, S.; MacMahon, C.; Morgan, S.; Hassan, A. H.; Kilborn, V. A.
2017-05-01
In this data-rich era of astronomy, there is a growing reliance on automated techniques to discover new knowledge. The role of the astronomer may change from being a discoverer to being a confirmer. But what do astronomers actually look at when they distinguish between "sources" and "noise?" What are the differences between novice and expert astronomers when it comes to visual-based discovery? Can we identify elite talent or coach astronomers to maximize their potential for discovery? By looking to the field of sports performance analysis, we consider an established, domain-wide approach, where the expertise of the viewer (i.e., a member of the coaching team) plays a crucial role in identifying and determining the subtle features of gameplay that provide a winning advantage. As an initial case study, we investigate whether the SportsCode performance analysis software can be used to understand and document how an experienced H I astronomer makes discoveries in spectral data cubes. We find that the process of timeline-based coding can be applied to spectral cube data by mapping spectral channels to frames within a movie. SportsCode provides a range of easy to use methods for annotation, including feature-based codes and labels, text annotations associated with codes, and image-based drawing. The outputs, including instance movies that are uniquely associated with coded events, provide the basis for a training program or team-based analysis that could be used in unison with discipline-specific analysis software. In this coordinated approach to visualization and analysis, SportsCode can act as a visual notebook, recording the insight and decisions in partnership with established analysis methods. Alternatively, in situ annotation and coding of features would be a valuable addition to existing and future visualization and analysis packages.
NASA Astrophysics Data System (ADS)
Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.
2017-08-01
While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computational cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques for performing probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to use FORM directly in numerical slope stability evaluations, as it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, in which an explicit performance function is developed from the results of numerical simulations for use in FORM. The implementation of the proposed methodology is demonstrated on a large potential rock wedge at the Sumela Monastery, Turkey. The accuracy of the developed performance function in representing the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while the accuracy decreases, with an error of 24%.
Spreadsheet-based engine data analysis tool - user's guide.
DOT National Transportation Integrated Search
2016-07-01
This record refers to both the spreadsheet tool - Fleet Equipment Performance Measurement Preventive Maintenance Model: Spreadsheet-Based Engine Data Analysis Tool, http://ntl.bts.gov/lib/60000/60000/60007/0-6626-P1_Final.xlsm - and its accompanying ...
Continued Evaluation of Gear Condition Indicator Performance on Rotorcraft Fleet
NASA Technical Reports Server (NTRS)
Delgado, Irebert R.; Dempsey, Paula J.; Antolick, Lance J.; Wade, Daniel R.
2013-01-01
This paper details analyses of condition indicator performance for the helicopter nose gearbox within the U.S. Army's Condition-Based Maintenance Program. Ten nose gearbox data sets underwent two specific analyses. A mean condition indicator level analysis was performed, where condition indicator performance was based on a 'batting average' measured before and after part replacement. Two specific condition indicators, Diagnostic Algorithm 1 and Sideband Index, were found to perform well for the data sets studied. A condition indicator versus gear wear analysis was also performed, where gear wear photographs and descriptions from Army tear-down analyses were categorized based on ANSI/AGMA 1010-E95 standards. Seven nose gearbox data sets were analyzed and correlated with the condition indicators Diagnostic Algorithm 1 and Sideband Index. Both were found to be most responsive to gear wear cases of micropitting and spalling. Input pinion nose gearbox condition indicators were found to be more responsive to part replacement during overhaul than the corresponding output gear nose gearbox condition indicators.
Giera, Brian; Bukosky, Scott; Lee, Elaine; ...
2018-01-23
Here, quantitative color analysis is performed on videos of high contrast, low power reversible electrophoretic deposition (EPD)-based displays operated under different applied voltages. This analysis is coded in an open-source software, relies on a color differentiation metric, ΔE*00, derived from digital video, and provides an intuitive relationship between the operating conditions of the devices and their performance. Time-dependent ΔE*00 color analysis reveals color relaxation behavior, recoverability for different voltage sequences, and operating conditions that can lead to optimal performance.
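The kind of time-dependent ΔE*00 trace described above can be sketched in a few lines. The example below is a hedged illustration using scikit-image's CIEDE2000 implementation; the synthetic frames, ROI, and reference-frame choice are assumptions for demonstration, not the authors' open-source analysis code.

```python
# Minimal sketch of time-dependent Delta-E*00 color analysis of video frames.
import numpy as np
from skimage.color import rgb2lab, deltaE_ciede2000

def mean_lab(frame_rgb, roi):
    """Mean CIELAB color of a rectangular region of interest."""
    r0, r1, c0, c1 = roi
    patch = frame_rgb[r0:r1, c0:c1] / 255.0   # skimage expects floats in [0, 1]
    return rgb2lab(patch).reshape(-1, 3).mean(axis=0)

def delta_e_trace(frames, roi):
    """Delta-E*00 of each frame's ROI relative to the first frame."""
    ref = mean_lab(frames[0], roi)
    return np.array([deltaE_ciede2000(ref, mean_lab(f, roi)) for f in frames])

# Usage with synthetic frames standing in for an EPD device video:
frames = np.random.randint(0, 256, size=(100, 120, 160, 3)).astype(float)
trace = delta_e_trace(frames, roi=(40, 80, 60, 100))
print(trace[:5])   # color change vs. time, e.g. to reveal relaxation behavior
```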
HSI top-down requirements analysis for ship manpower reduction
NASA Astrophysics Data System (ADS)
Malone, Thomas B.; Bost, J. R.
2000-11-01
U.S. Navy ship acquisition programs such as DD 21 and CVNX are increasingly relying on top down requirements analysis (TDRA) to define and assess design approaches for workload and manpower reduction, and for ensuring required levels of human performance, reliability, safety, and quality of life at sea. The human systems integration (HSI) approach to TDRA begins with a function analysis which identifies the functions derived from the requirements in the Operational Requirements Document (ORD). The function analysis serves as the function baseline for the ship, and also supports the definition of RDT&E and Total Ownership Cost requirements. A mission analysis is then conducted to identify mission scenarios, again based on requirements in the ORD, and the Design Reference Mission (DRM). This is followed by a mission/function analysis which establishes the function requirements to successfully perform the ship's missions. Function requirements of major importance for HSI are information, performance, decision, and support requirements associated with each function. An allocation of functions defines the roles of humans and automation in performing the functions associated with a mission. Alternate design concepts, based on function allocation strategies, are then described, and task networks associated with the concepts are developed. Task network simulations are conducted to assess workloads and human performance capabilities associated with alternate concepts. An assessment of the affordability and risk associated with alternate concepts is performed, and manning estimates are developed for feasible design concepts.
Skills, rules and knowledge in aircraft maintenance: errors in context
NASA Technical Reports Server (NTRS)
Hobbs, Alan; Williamson, Ann
2002-01-01
Automatic or skill-based behaviour is generally considered to be less prone to error than behaviour directed by conscious control. However, researchers who have applied Rasmussen's skill-rule-knowledge human error framework to accidents and incidents have sometimes found that skill-based errors appear in significant numbers. It is proposed that this is largely a reflection of the opportunities for error which workplaces present and does not indicate that skill-based behaviour is intrinsically unreliable. In the current study, 99 errors reported by 72 aircraft mechanics were examined in the light of a task analysis based on observations of the work of 25 aircraft mechanics. The task analysis identified the opportunities for error presented at various stages of maintenance work packages and by the job as a whole. Once the frequency of each error type was normalized in terms of the opportunities for error, it became apparent that skill-based performance is more reliable than rule-based performance, which is in turn more reliable than knowledge-based performance. The results reinforce the belief that industrial safety interventions designed to reduce errors would best be directed at those aspects of jobs that involve rule- and knowledge-based performance.
Convective Array Cooling for a Solar Powered Aircraft
NASA Technical Reports Server (NTRS)
Colozza, Anthony J.; Dolce, James (Technical Monitor)
2003-01-01
A general characteristic of photovoltaics is that they increase in efficiency as their operating temperature decreases. Based on this principle, the ability to increase a solar aircraft's performance by cooling the solar cells was examined. The solar cells were cooled by channeling some air underneath the cells, providing a convective cooling path to the back side of the array. A full energy balance and flow analysis of the air within the cooling passage was performed. The analysis was first performed at a preliminary level to estimate the benefits of the cooling passage, and established a clear benefit. Based on these results, a more detailed analysis was performed, from which cell temperatures were calculated and array output power throughout a day was determined with and without the cooling passage. The results showed that if the flow through the cooling passage remains laminar, the benefit in increased output power more than offsets the drag induced by the cooling passage.
Grading the Metrics: Performance-Based Funding in the Florida State University System
ERIC Educational Resources Information Center
Cornelius, Luke M.; Cavanaugh, Terence W.
2016-01-01
A policy analysis of Florida's 10-factor Performance-Based Funding system for state universities. The focus of the article is on the system of performance metrics developed by the state Board of Governors and their impact on institutions and their missions. The paper also discusses problems and issues with the metrics, their ongoing evolution, and…
Analysis of the Department of Defense Pre-Award Contracting Process
2014-12-01
Abbreviations: J&A, Justification and Approval; JBSA, Joint Base San Antonio; KPIs, Key Performance Indicators; MAJCOMs, Major Command; MP, Mandatory Commands; NAVIAR...meets desired results. Results-based performance measurement establishes key performance indicators (KPIs) that determine whether procurement...or goals, and underlying business processes (Cullen, 2009, p. 38). Within each quadrant, Cullen provided examples of KPIs that serve to measure
Using a virtual reality temporal bone simulator to assess otolaryngology trainees.
Zirkle, Molly; Roberson, David W; Leuwer, Rudolf; Dubrowski, Adam
2007-02-01
The objective of this study is to determine the feasibility of computerized evaluation of resident performance using hand motion analysis on a virtual reality temporal bone (VR TB) simulator. We hypothesized that both computerized analysis and expert ratings would discriminate the performance of novices from experienced trainees. We also hypothesized that performance on the virtual reality temporal bone simulator (VR TB) would differentiate based on previous drilling experience. The authors conducted a randomized, blind assessment study. Nineteen volunteers from the Otolaryngology-Head and Neck Surgery training program at the University of Toronto drilled both a cadaveric TB and a simulated VR TB. Expert reviewers were asked to assess operative readiness of the trainee based on a blind video review of their performance. Computerized hand motion analysis of each participant's performance was conducted. Expert raters were able to discriminate novices from experienced trainees (P < .05) on cadaveric temporal bones, and there was a trend toward discrimination on VR TB performance. Hand motion analysis showed that experienced trainees had better movement economy than novices (P < .05) on the VR TB. Performance, as measured by hand motion analysis on the VR TB simulator, reflects trainees' previous drilling experience. This study suggests that otolaryngology trainees could accomplish initial temporal bone training on a VR TB simulator, which can provide feedback to the trainee, and may reduce the need for constant faculty supervision and evaluation.
Zhang, Wei; Zhou, Yue; Xu, Xiao-Quan; Kong, Ling-Yan; Xu, Hai; Yu, Tong-Fu; Shi, Hai-Bin; Feng, Qing
2018-01-01
To assess the performance of a whole-tumor histogram analysis of apparent diffusion coefficient (ADC) maps in differentiating thymic carcinoma from lymphoma, and to compare it with that of a commonly used hot-spot region-of-interest (ROI)-based ADC measurement. Diffusion-weighted imaging data of 15 patients with thymic carcinoma and 13 patients with lymphoma were retrospectively collected and processed with a mono-exponential model. ADC measurements were performed using a histogram-based and a hot-spot-ROI-based approach. In the histogram-based approach, the following parameters were generated: mean ADC (ADC_mean), median ADC (ADC_median), 10th and 90th percentiles of ADC (ADC_10 and ADC_90), kurtosis, and skewness. The difference in ADCs between thymic carcinoma and lymphoma was compared using a t test. Receiver operating characteristic (ROC) analyses were conducted to determine and compare the differentiating performance of the ADCs. Lymphoma demonstrated significantly lower ADC_mean, ADC_median, ADC_10, ADC_90, and hot-spot-ROI-based mean ADC than thymic carcinoma (all p values < 0.05). No differences were found in kurtosis (p = 0.412) or skewness (p = 0.273). ADC_10 demonstrated the optimal differentiating performance (cut-off value, 0.403 × 10⁻³ mm²/s; area under the ROC curve [AUC], 0.977; sensitivity, 92.3%; specificity, 93.3%), followed by ADC_mean, ADC_median, ADC_90, and the hot-spot-ROI-based mean ADC. The AUC of ADC_10 was significantly higher than that of the hot-spot-ROI-based ADC (0.977 vs. 0.797, p = 0.036). Compared with the commonly used hot-spot-ROI-based ADC measurement, a histogram analysis of ADC maps can improve the differentiating performance between thymic carcinoma and lymphoma.
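As a rough sketch of the whole-tumor histogram approach described above, the example below computes the same percentile and shape metrics from simulated voxel samples and scores one of them with an ROC AUC; all numbers are placeholders, not the study data.

```python
# Hedged sketch of whole-tumor ADC histogram analysis with simulated lesions.
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.metrics import roc_auc_score

def histogram_features(adc_voxels):
    """Whole-tumor histogram metrics of an ADC map (units: 1e-3 mm^2/s)."""
    return {
        "ADC_mean": adc_voxels.mean(),
        "ADC_median": np.median(adc_voxels),
        "ADC_10": np.percentile(adc_voxels, 10),
        "ADC_90": np.percentile(adc_voxels, 90),
        "skewness": skew(adc_voxels),
        "kurtosis": kurtosis(adc_voxels),
    }

rng = np.random.default_rng(0)
# Simulated lesions: lymphoma (label 0) with lower ADC than thymic carcinoma (label 1).
lesions = [rng.normal(0.8, 0.2, 500) for _ in range(13)] + \
          [rng.normal(1.3, 0.3, 500) for _ in range(15)]
labels = [0] * 13 + [1] * 15
adc10 = [histogram_features(v)["ADC_10"] for v in lesions]
print("AUC of ADC_10:", roc_auc_score(labels, adc10))
```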
Centre of pressure patterns in the golf swing: individual-based analysis.
Ball, Kevin; Best, Russell
2012-06-01
Weight transfer has been identified as important in group-based analyses. The aim of this study was to extend this work by examining the importance of weight transfer in the golf swing on an individual basis. Five professional and amateur golfers performed 50 swings with the driver, hitting a ball into a net. The golfer's centre of pressure position and velocity, parallel with the line of shot, were measured by two force plates at eight swing events that were identified from high-speed video. The relationships between these parameters and club head velocity at ball contact were examined using regression statistics. The results did support the use of group-based analysis, with all golfers returning significant relationships. However, results were also individual-specific, with golfers returning different combinations of significant factors. Furthermore, factors not identified in group-based analysis were significant on an individual basis. The most consistent relationship was a larger weight transfer range associated with a larger club head velocity (p < 0.05). All golfers also returned at least one significant relationship with rate of weight transfer at swing events (p < 0.01). Individual-based analysis should form part of performance-based biomechanical analysis of sporting skills.
A Performance-Based Instructional Theory
ERIC Educational Resources Information Center
Lawson, Tom E.
1974-01-01
The rationale for a performance-based instructional theory has arisen from significant advances during the past several years in instructional psychology. Four major areas of concern are: analysis of subject-matter content in terms of performance competencies, diagnosis of pre-instructional behavior, formulation of an instructional…
Digital processing of mesoscale analysis and space sensor data
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
The mesoscale analysis and space sensor (MASS) data management and analysis system is presented. It was implemented on a research computer system that provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, database management, and display capabilities of the research computer system are described in terms of providing an effective interactive research tool for the digital processing of mesoscale analysis and space sensor data.
An Elementary Algorithm for Autonomous Air Terminal Merging and Interval Management
NASA Technical Reports Server (NTRS)
White, Allan L.
2017-01-01
A central element of air traffic management is the safe merging and spacing of aircraft during the terminal-area flight phase. This paper derives and examines an algorithm for the merging and interval management problem for Standard Terminal Arrival Routes. It describes a factor analysis for performance based on the distribution of arrivals, the operating period of the terminal, and the topology of the arrival routes; it then presents results from a performance analysis and from a safety analysis for a realistic topology based on typical routes for a runway at Phoenix International Airport. The heart of the safety analysis is a statistical derivation of how to conduct a safety analysis for a local simulation when the safety requirement is given for the entire airspace.
DOT National Transportation Integrated Search
2009-12-22
This document presents the University of Michigan Transportation Research Institute's plan to perform analysis of data collected from the light vehicle platform field operational test of the Integrated Vehicle-Based Safety Systems (IVBSS) progr...
DOT National Transportation Integrated Search
2009-11-23
This document presents the University of Michigan Transportation Research Institute's plan to perform analysis of data collected from the heavy truck platform field operational test of the Integrated Vehicle-Based Safety Systems (IVBSS) progra...
A Study of ATLAS Grid Performance for Distributed Analysis
NASA Astrophysics Data System (ADS)
Panitkin, Sergey; Fine, Valery; Wenaus, Torre
2012-12-01
In the past two years the ATLAS Collaboration at the LHC has collected a large volume of data and published a number of groundbreaking papers. The Grid-based ATLAS distributed computing infrastructure played a crucial role in enabling timely analysis of the data. We present a study of the performance and usage of the ATLAS Grid as a platform for physics analysis in 2011. This includes studies of general properties as well as timing properties of user jobs (wait time, run time, etc.). These studies are based on mining of data archived by the PanDA workload management system.
Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.
2016-01-01
A new engine cycle analysis tool, called Pycycle, was built using the OpenMDAO framework. Pycycle provides analytic derivatives allowing for an efficient use of gradient-based optimization methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
Bernal-Martinez, L.; Castelli, M. V.; Rodriguez-Tudela, J. L.; Cuenca-Estrella, M.
2014-01-01
A retrospective analysis of real-time PCR (RT-PCR) results for 151 biopsy samples obtained from 132 patients with proven invasive fungal diseases was performed. PCR-based techniques proved to be fast and sensitive and enabled definitive diagnosis in all cases studied, with detection of a total of 28 fungal species. PMID:24574295
2007-06-15
the base-case, a series analysis can be performed by varying the various inputs to the network to examine the impact of potential changes to improve...successfully interrogated was the primary MOE. • Based solely on the cost-benefit analysis, the RSTG found that the addition of an Unmanned Surface...cargo. The CBP uses a risk-based analysis and intelligence to pre-screen, assess and examine 100% of suspicious containers. The remaining cargo is
NASA Technical Reports Server (NTRS)
Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)
2015-01-01
Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
NASA Technical Reports Server (NTRS)
Bebis, George
2013-01-01
Hand-based biometric analysis systems and techniques provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
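A hedged sketch of the descriptor-and-match step is shown below, using the mahotas library for the Zernike moment computation. Segmentation into palm and finger regions is assumed to have been done already, and Euclidean-distance matching is an illustrative stand-in for the patented fusion and decision logic.

```python
# Sketch of Zernike-descriptor extraction and template matching per hand segment.
import numpy as np
import mahotas

def segment_descriptor(segment_mask, radius=64, degree=8):
    """Zernike moment magnitudes of one hand segment (rotation invariant)."""
    return mahotas.features.zernike_moments(segment_mask.astype(float), radius,
                                            degree=degree)

def identity_distance(probe_segments, template_segments):
    """Fuse per-segment descriptors and compare against an enrollment template."""
    probe = np.concatenate([segment_descriptor(s) for s in probe_segments])
    template = np.concatenate([segment_descriptor(s) for s in template_segments])
    return np.linalg.norm(probe - template)   # smaller distance = better match

# Usage with random masks standing in for a palm plus five finger segments:
rng = np.random.default_rng(1)
probe = [rng.random((128, 128)) > 0.5 for _ in range(6)]
enrolled = [rng.random((128, 128)) > 0.5 for _ in range(6)]
print("distance to template:", identity_distance(probe, enrolled))
```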
Mueller, Evelyn A; Bengel, Juergen; Wirtz, Markus A
2013-12-01
This study aimed to develop a self-description assessment instrument to measure work performance in patients with musculoskeletal diseases. In terms of the International Classification of Functioning, Disability and Health (ICF), work performance is defined as the degree of meeting the work demands (activities) at the actual workplace (environment). To account for the fact that work performance depends on the work demands of the job, we strived to develop item banks that allow a flexible use of item subgroups depending on the specific work demands of the patients' jobs. Item development included the collection of work tasks from literature and content validation through expert surveys and patient interviews. The resulting 122 items were answered by 621 patients with musculoskeletal diseases. Exploratory factor analysis to ascertain dimensionality and Rasch analysis (partial credit model) for each of the resulting dimensions were performed. Exploratory factor analysis resulted in four dimensions, and subsequent Rasch analysis led to the following item banks: 'impaired productivity' (15 items), 'impaired cognitive performance' (18), 'impaired coping with stress' (13) and 'impaired physical performance' (low physical workload 20 items, high physical workload 10 items). The item banks exhibited person separation indices (reliability) between 0.89 and 0.96. The assessment of work performance adds the activities component to the more commonly employed participation component of the ICF-model. The four item banks can be adapted to specific jobs where necessary without losing comparability of person measures, as the item banks are based on Rasch analysis.
Constrained independent component analysis approach to nonobtrusive pulse rate measurements
NASA Astrophysics Data System (ADS)
Tsouri, Gill R.; Kyal, Survi; Dianat, Sohail; Mestha, Lalit K.
2012-07-01
Nonobtrusive pulse rate measurement using a webcam is considered. We demonstrate how state-of-the-art algorithms based on independent component analysis suffer from a sorting problem which hinders their performance, and propose a novel algorithm based on constrained independent component analysis to improve performance. We present how the proposed algorithm extracts a photoplethysmography signal and resolves the sorting problem. In addition, we perform a comparative study between the proposed algorithm and state-of-the-art algorithms over 45 video streams using a finger probe oximeter for reference measurements. The proposed algorithm provides improved accuracy: the root mean square error is decreased from 20.6 and 9.5 beats per minute (bpm) for existing algorithms to 3.5 bpm for the proposed algorithm. An error of 3.5 bpm is within the inaccuracy expected from the reference measurements. This implies that the proposed algorithm provided performance of equal accuracy to the finger probe oximeter.
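A rough illustration of the pipeline is sketched below. Plain FastICA plus a spectral-peak selection heuristic stands in for the authors' constrained ICA (which builds the prior into the decomposition itself), and the RGB traces are synthetic rather than webcam data.

```python
# Illustrative webcam-pulse sketch: ICA on RGB traces, then spectral selection.
import numpy as np
from sklearn.decomposition import FastICA

fs = 30.0                       # webcam frame rate (Hz), assumed
t = np.arange(0, 30, 1 / fs)
pulse = np.sin(2 * np.pi * 1.2 * t)            # ~72 bpm source
rgb = np.c_[pulse + 0.5 * np.random.randn(t.size),
            0.8 * pulse + 0.5 * np.random.randn(t.size),
            np.random.randn(t.size)]           # mixed face-ROI color traces

sources = FastICA(n_components=3, random_state=0).fit_transform(rgb)

# Resolve the ICA ordering ambiguity (the "sorting problem") by picking the
# component with the most spectral power in the plausible pulse band.
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs > 0.75) & (freqs < 4.0)          # 45-240 bpm
power = np.abs(np.fft.rfft(sources, axis=0)) ** 2
best = np.argmax(power[band].max(axis=0))
bpm = 60 * freqs[band][np.argmax(power[band][:, best])]
print(f"estimated pulse rate: {bpm:.1f} bpm")
```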
UV Lidar Receiver Analysis for Tropospheric Sensing of Ozone
NASA Technical Reports Server (NTRS)
Pliutau, Denis; DeYoung, Russell J.
2013-01-01
A simulation of a ground-based Ultra-Violet Differential Absorption Lidar (UV-DIAL) receiver system was performed under realistic daytime conditions to understand how range and lidar performance can be improved for a given UV pulse laser energy. Calculations were also performed for an aerosol channel transmitting at 3 W. The lidar receiver simulation studies were optimized for the purpose of tropospheric ozone measurements. The transmitted lidar UV measurements were from 285 to 295 nm and the aerosol channel was 527 nm. The calculations are based on atmospheric transmission given by the HITRAN database and the Modern Era Retrospective Analysis for Research and Applications (MERRA) meteorological data. The aerosol attenuation is estimated using both the BACKSCAT 4.0 code as well as data collected during the CALIPSO mission. The lidar performance is estimated both for diffuse-irradiance-free cases corresponding to nighttime operation and for the daytime diffuse scattered radiation component, based on previously reported experimental data. This analysis presents calculations of the UV-DIAL receiver ozone and aerosol measurement range as a function of sky irradiance, filter bandwidth, and transmitted laser UV and 527-nm energy.
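For context, the receiver quantities above ultimately feed the standard two-wavelength DIAL retrieval, which is compact enough to sketch; the return powers and differential cross-section below are illustrative placeholders, not values from this study.

```python
# Standard two-wavelength DIAL retrieval of absorber number density.
import numpy as np

def dial_number_density(p_on, p_off, delta_sigma, delta_r):
    """
    Per range cell:
        n = 1 / (2 * delta_sigma * delta_r)
          * ln[ (P_on(r) * P_off(r + dr)) / (P_on(r + dr) * P_off(r)) ]
    delta_sigma: differential absorption cross-section (cm^2)
    delta_r:     range-bin width (cm)
    """
    ratio = (p_on[:-1] * p_off[1:]) / (p_on[1:] * p_off[:-1])
    return np.log(ratio) / (2.0 * delta_sigma * delta_r)

# Illustrative numbers only (not from the receiver study):
p_on = np.array([1.00, 0.70, 0.50, 0.36])    # on-line returns (285-295 nm band)
p_off = np.array([1.00, 0.90, 0.82, 0.75])   # off-line / reference returns
n = dial_number_density(p_on, p_off, delta_sigma=1e-19, delta_r=3.0e4)  # 300 m bins
print(n)   # absorber number density per range cell, molecules/cm^3
```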
Agustini, Deonir; Bergamini, Márcio F; Marcolino-Junior, Luiz Humberto
2017-01-25
Micro flow injection analysis (μFIA) is a powerful technique that uses the principles of traditional flow analysis in a microfluidic device and brings a number of improvements related to the consumption of reagents and samples, speed of analysis, and portability. However, the complexity and cost of manufacturing processes, difficulty in integrating micropumps, and the limited performance of systems employing passive pumps are challenges that must be overcome. Here, we present the characterization and optimization of a low-cost device based on cotton threads as microfluidic channels to perform μFIA with passive pumping, achieving good analytical performance in a simple, easy, and inexpensive way. The transport of solutions is made through cotton threads by capillary force, aided by gravity. After studying and optimizing several features of the device, a flow rate of 2.2 ± 0.1 μL s⁻¹, an analytical frequency of 208 injections per hour, a sample injection volume of 2.0 μL, and a waste volume of approximately 40 μL per analysis were obtained. For chronoamperometric determination of naproxen, a detection limit of 0.29 μmol L⁻¹ was reached, with a relative standard deviation (RSD) of 1.69% between injections and an RSD of 3.79% across five different devices. Thus, based on the performance of the proposed microfluidic device, it is possible to overcome some limitations of μFIA systems based on passive pumps and to expand the use of this technique. Copyright © 2016 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Gresham, Frank M.; Elliott, Stephen N.; Kettler, Ryan J.
2010-01-01
Base rate information is important in clinical assessment because one cannot know how unusual or typical a phenomenon is without first knowing its base rate in the population. This study empirically determined the base rates of social skills acquisition and performance deficits, social skills strengths, and problem behaviors using a nationally…
Extreme learning machine for ranking: generalization analysis and applications.
Chen, Hong; Peng, Jiangtao; Zhou, Yicong; Li, Luoqing; Pan, Zhibin
2014-05-01
The extreme learning machine (ELM) has attracted increasing attention recently with its successful applications in classification and regression. In this paper, we investigate the generalization performance of ELM-based ranking. A new regularized ranking algorithm is proposed based on combinations of activation functions in ELM. The generalization analysis is established for the ELM-based ranking (ELMRank) in terms of the covering numbers of the hypothesis space. Empirical results on benchmark datasets show the competitive performance of the ELMRank over state-of-the-art ranking methods. Copyright © 2014 Elsevier Ltd. All rights reserved.
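To make the ELM construction above concrete, here is a minimal sketch under stated assumptions: a random, untrained hidden layer with a closed-form ridge solution for the output weights, and ranking reduced to score regression for illustration. This is not the paper's exact ELMRank algorithm.

```python
# Minimal ELM: random hidden layer, ridge-regularized least-squares readout.
import numpy as np

class ELM:
    def __init__(self, n_hidden=50, reg=1e-3, seed=0):
        self.n_hidden, self.reg = n_hidden, reg
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        d = X.shape[1]
        self.W = self.rng.normal(size=(d, self.n_hidden))   # random, never trained
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                    # hidden activations
        # Regularized least squares for the output weights (the only fit step).
        self.beta = np.linalg.solve(H.T @ H + self.reg * np.eye(self.n_hidden),
                                    H.T @ y)
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Usage: learn relevance scores, then rank held-out items by predicted score.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = X[:, 0] - 2 * X[:, 1] + 0.1 * rng.normal(size=200)   # synthetic relevance
model = ELM().fit(X[:150], y[:150])
ranking = np.argsort(-model.predict(X[150:]))            # best items first
print(ranking[:10])
```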
Performance and analysis of MAC protocols based on application
NASA Astrophysics Data System (ADS)
Yadav, Ravi; Daniel, A. K.
2018-04-01
Wireless Sensor Networks (WSNs) are one of the most rapidly emerging technologies of recent decades, covering a large application area, both civilian and military. A WSN consists primarily of low-power, low-cost, multifunctional sensor nodes that collaborate and communicate via a wireless medium. The deployment of sensor nodes is ad hoc in nature, so the nodes must organize themselves to communicate with each other. These characteristics make WSNs a challenging research area. This paper gives an overview of the characteristics of WSNs, their architecture, and contention-based MAC protocols, and presents an analysis of various protocols based on performance.
Hall Thruster Technology for NASA Science Missions
NASA Technical Reports Server (NTRS)
Manzella, David; Oh, David; Aadland, Randall
2005-01-01
The performance of a prototype Hall thruster designed for Discovery-class NASA science mission applications was evaluated at input powers ranging from 0.2 to 2.9 kilowatts. These data were used to construct a throttle profile for a projected Hall thruster system based on this prototype thruster. The suitability of such a Hall thruster system to perform robotic exploration missions was evaluated through the analysis of a near Earth asteroid sample return mission. This analysis demonstrated that a propulsion system based on the prototype Hall thruster offers mission benefits compared to a propulsion system based on an existing ion thruster.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustoni, Arnold L.
A laser safety and hazard analysis was performed for the temperature-stabilized Big Sky Laser Technology (BSLT) laser central to the ARES system, based on the 2007 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2005 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. The ARES laser system is a van/truck-based mobile platform, which is used to perform laser interaction experiments and tests at various national test sites.
CFD analysis of heat transfer performance of graphene based hybrid nanofluid in radiators
NASA Astrophysics Data System (ADS)
Bharadwaj, Bharath R.; Sanketh Mogeraya, K.; Manjunath, D. M.; Rao Ponangi, Babu; Rajendra Prasad, K. S.; Krishna, V.
2018-04-01
For improved performance of an automobile engine, the cooling system is one of the critical systems that needs attention. With an increased capacity to carry away large amounts of waste heat, the performance of an engine is increased. Current research on nanofluids suggests that they offer a higher heat transfer rate compared to that of conventional coolants. Hence this project investigates the use of hybrid nanofluids in radiators so as to increase their heat transfer performance. Carboxyl graphene and graphene oxide based nanoparticles were selected due to the very high thermal conductivity of graphene. System analysis of the radiator was performed by considering a small part of the whole automobile radiator, modelled using Siemens NX. CFD analysis was conducted using ANSYS FLUENT® for the defined nanofluid, and the increase in effectiveness was compared to that of conventional coolants. Use of such nanofluids for a fixed cooling requirement can lead to significant downsizing of the radiator in the future.
Clinical Effectiveness of Occupational Therapy in Mental Health: A Meta-Analysis.
Ikiugu, Moses N; Nissen, Ranelle M; Bellar, Cali; Maassen, Alexya; Van Peursem, Katlin
The purpose of this study was to estimate the effectiveness of theory-based occupational therapy interventions in improving occupational performance and well-being among people with a mental health diagnosis. The meta-analysis included 11 randomized controlled trials with a total of 520 adult participants with a mental health diagnosis. Outcomes were occupational performance, well-being, or both. We conducted meta-analyses using Comprehensive Meta-Analysis software (Version 3.0) with occupational performance and well-being as the dependent variables. Results indicated a medium effect of intervention on improving occupational performance (mean Hedges' g = 0.50, Z = 4.05, p < .001) and a small effect on well-being (mean Hedges' g = 0.46, Z = 4.96, p < .001). Theory-based occupational therapy interventions may be effective in improving occupational performance and well-being among people with a mental health diagnosis and should be an integral part of rehabilitation services in mental health. Copyright © 2017 by the American Occupational Therapy Association, Inc.
Research interests: geospatial data analysis using parallel processing; high-performance computing; renewable resource technical potential and supply curve analysis; spatial database utilization; rapid analysis of large geospatial datasets; energy and geospatial analysis products; rapid, web-based renewable resource analysis
On Bi-Grid Local Mode Analysis of Solution Techniques for 3-D Euler and Navier-Stokes Equations
NASA Technical Reports Server (NTRS)
Ibraheem, S. O.; Demuren, A. O.
1994-01-01
A procedure is presented for utilizing a bi-grid stability analysis as a practical tool for predicting multigrid performance in a range of numerical methods for solving the Euler and Navier-Stokes equations. Model problems based on the convection, diffusion, and Burgers' equations are used to illustrate the superiority of the bi-grid analysis as a predictive tool for multigrid performance in comparison to the smoothing factor derived from conventional von Neumann analysis. For the Euler equations, bi-grid analysis is presented for three upwind-difference-based factorizations, namely Spatial, Eigenvalue and Combination splits, and two central-difference-based factorizations, namely LU and ADI methods. In the former, both the Steger-Warming and van Leer flux-vector splitting methods are considered. For the Navier-Stokes equations, only the Beam-Warming (ADI) central difference scheme is considered. In each case, estimates of multigrid convergence rates from the bi-grid analysis are compared to smoothing factors obtained from single-grid stability analysis. Effects of grid aspect ratio and flow skewness are examined. Both predictions are compared with practical multigrid convergence rates for 2-D Euler and Navier-Stokes solutions based on the Beam-Warming central scheme.
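For readers unfamiliar with the conventional smoothing factor that the bi-grid results are compared against, the sketch below computes it for the textbook case of damped Jacobi on the 1-D diffusion model problem; this is standard multigrid material, not the paper's 3-D Euler/Navier-Stokes analysis.

```python
# Conventional von Neumann smoothing factor for damped Jacobi, 1-D diffusion.
import numpy as np

def jacobi_smoothing_factor(omega, n_theta=1000):
    """Max |amplification| over high-frequency modes pi/2 <= |theta| <= pi."""
    theta = np.linspace(np.pi / 2, np.pi, n_theta)
    g = 1.0 - 2.0 * omega * np.sin(theta / 2) ** 2   # Fourier symbol of damped Jacobi
    return np.abs(g).max()

for omega in (1.0, 2.0 / 3.0, 0.5):
    print(f"omega={omega:.3f}  smoothing factor={jacobi_smoothing_factor(omega):.3f}")
# omega = 2/3 gives the classical optimum of 1/3 for this model problem.
```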
Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time‐to‐Event Analysis
Gong, Xiajing; Hu, Meng
2018-01-01
Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. PMID:29536640
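The concordance index used as the yardstick above has a compact definition: the fraction of comparable pairs (those where the earlier time is an observed event) that a risk score orders correctly. A minimal sketch on simulated data, ignoring ties for brevity:

```python
# Concordance index (c-index) for right-censored time-to-event data.
import numpy as np

def concordance_index(time, event, risk):
    """time: follow-up times; event: True if event observed; risk: higher = worse."""
    concordant, comparable = 0, 0
    n = len(time)
    for i in range(n):
        if not event[i]:
            continue                       # a pair is comparable only if the
        for j in range(n):                 # earlier time is an observed event
            if time[i] < time[j]:
                comparable += 1
                concordant += risk[i] > risk[j]
    return concordant / comparable

rng = np.random.default_rng(0)
x = rng.normal(size=300)                   # a covariate driving the hazard
time = rng.exponential(np.exp(-x))         # shorter times for larger x
event = rng.random(300) < 0.7              # ~30% censoring, simulated crudely
print("c-index of true predictor:", round(concordance_index(time, event, x), 3))
```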
NASA Astrophysics Data System (ADS)
Wang, Yupeng; Chang, Kyunghi
In this paper, we analyze the coexistence issues of M-WiMAX TDD and WCDMA FDD systems. Smart antenna techniques are applied to mitigate the performance loss induced by adjacent channel interference (ACI) in the scenarios where performance is heavily degraded. In addition, an ACI model is proposed to capture the effect of transmit beamforming at the M-WiMAX base station. Furthermore, a MCS-based throughput analysis is proposed, to jointly consider the effects of ACI, system packet error rate requirement, and the available modulation and coding schemes, which is not possible by using the conventional Shannon equation based analysis. From the results, we find that the proposed MCS-based analysis method is quite suitable to analyze the system theoretical throughput in a practical manner.
Turbine blade forced response prediction using FREPS
NASA Technical Reports Server (NTRS)
Murthy, Durbha, V.; Morel, Michael R.
1993-01-01
This paper describes a software system called FREPS (Forced REsponse Prediction System) that integrates structural dynamic, steady aerodynamic, and unsteady aerodynamic analyses to efficiently predict the forced-response dynamic stresses in axial-flow turbomachinery blades due to aerodynamic and mechanical excitations. A flutter analysis capability is also incorporated into the system. The FREPS system performs aeroelastic analysis by modeling the motion of the blade in terms of its normal modes. The structural dynamic analysis is performed by a finite element code such as MSC/NASTRAN. The steady aerodynamic analysis is based on nonlinear potential theory, and the unsteady aerodynamic analysis is based on linearization about the nonuniform potential mean flow. The program description and presentation of its capabilities are reported herein. The effectiveness of the FREPS package is demonstrated on the High Pressure Oxygen Turbopump turbine of the Space Shuttle Main Engine. Both flutter and forced response analyses are performed and typical results are illustrated.
Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2008-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.
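Task (3) above amounts to path search in a directed graph. A hedged sketch with networkx, in which every node and edge name is invented for illustration:

```python
# Graph search for paths from hazard sources to vulnerable functions.
import networkx as nx

model = nx.DiGraph()
model.add_edges_from([
    ("battery_overheat", "power_bus"),        # hazard source propagates...
    ("power_bus", "flight_computer"),
    ("flight_computer", "abort_sequencer"),   # ...to a vulnerable function
    ("power_bus", "telemetry"),
    ("sensor_fault", "flight_computer"),
])

for path in nx.all_simple_paths(model, "battery_overheat", "abort_sequencer"):
    print(" -> ".join(path))
# Each printed path is a candidate scenario for simulation and, ultimately,
# for software integration testing (tasks 4 and 5 above).
```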
UNCERTAINTY ANALYSIS OF TCE USING THE DOSE EXPOSURE ESTIMATING MODEL (DEEM) IN ACSL
The ACSL-based Dose Exposure Estimating Model (DEEM) under development by EPA is used to perform an uncertainty analysis of a physiologically based pharmacokinetic (PBPK) model of trichloroethylene (TCE). This model involves several circulating metabolites such as trichloroacet...
Li, Bo; Tang, Jing; Yang, Qingxia; Cui, Xuejiao; Li, Shuang; Chen, Sijie; Cao, Quanxing; Xue, Weiwei; Chen, Na; Zhu, Feng
2016-12-13
In untargeted metabolomics analysis, several factors (e.g., unwanted experimental & biological variations and technical errors) may hamper the identification of differential metabolic features, which requires the data-driven normalization approaches before feature selection. So far, ≥16 normalization methods have been widely applied for processing the LC/MS based metabolomics data. However, the performance and the sample size dependence of those methods have not yet been exhaustively compared and no online tool for comparatively and comprehensively evaluating the performance of all 16 normalization methods has been provided. In this study, a comprehensive comparison on these methods was conducted. As a result, 16 methods were categorized into three groups based on their normalization performances across various sample sizes. The VSN, the Log Transformation and the PQN were identified as methods of the best normalization performance, while the Contrast consistently underperformed across all sub-datasets of different benchmark data. Moreover, an interactive web tool comprehensively evaluating the performance of 16 methods specifically for normalizing LC/MS based metabolomics data was constructed and hosted at http://server.idrb.cqu.edu.cn/MetaPre/. In summary, this study could serve as useful guidance for the selection of suitable normalization methods in analyzing the LC/MS based metabolomics data.
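As one concrete example of the best-performing methods named above, probabilistic quotient normalization (PQN) is short enough to sketch directly; the intensity matrix below is a random placeholder rather than a benchmark dataset.

```python
# Probabilistic quotient normalization for a (samples x features) intensity matrix.
import numpy as np

def pqn(X, eps=1e-12):
    """Divide each sample by the median quotient vs. a reference spectrum."""
    X = X / X.sum(axis=1, keepdims=True)            # integral normalization first
    reference = np.median(X, axis=0)                # median spectrum as reference
    quotients = X / (reference + eps)
    dilution = np.median(quotients, axis=1, keepdims=True)
    return X / dilution

rng = np.random.default_rng(0)
spectra = rng.lognormal(size=(20, 500)) * rng.uniform(0.5, 2.0, size=(20, 1))
normalized = pqn(spectra)
print(normalized.sum(axis=1)[:5])   # per-sample dilution effects largely removed
```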
NASA Technical Reports Server (NTRS)
Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi
1994-01-01
An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.
ERIC Educational Resources Information Center
Flowers, Claudia P.; Raju, Nambury S.; Oshima, T. C.
Current interest in the assessment of measurement equivalence emphasizes two methods of analysis, linear, and nonlinear procedures. This study simulated data using the graded response model to examine the performance of linear (confirmatory factor analysis or CFA) and nonlinear (item-response-theory-based differential item function or IRT-Based…
Downsizing - Intellectual Capital Performance Anorexia or Enhancement?
ERIC Educational Resources Information Center
Williams, S. Mitchell
2004-01-01
The objective of this paper is to investigate if downsizing contributes to, or impedes, a firm's intellectual capital performance (ICE) based on a longitudinal analysis of 56 United States publicly listed companies that significantly downsized their workforce during the mid-1990s. Empirical analysis indicates that for the majority of firms, ICE…
Performance Analysis of GAME: A Generic Automated Marking Environment
ERIC Educational Resources Information Center
Blumenstein, Michael; Green, Steve; Fogelman, Shoshana; Nguyen, Ann; Muthukkumarasamy, Vallipuram
2008-01-01
This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages based on the "structure" of the source code and the correctness…
Scalable Performance Environments for Parallel Systems
NASA Technical Reports Server (NTRS)
Reed, Daniel A.; Olson, Robert D.; Aydt, Ruth A.; Madhyastha, Tara M.; Birkett, Thomas; Jensen, David W.; Nazief, Bobby A. A.; Totty, Brian K.
1991-01-01
As parallel systems expand in size and complexity, the absence of performance tools for these parallel systems exacerbates the already difficult problems of application program and system software performance tuning. Moreover, given the pace of technological change, we can no longer afford to develop ad hoc, one-of-a-kind performance instrumentation software; we need scalable, portable performance analysis tools. We describe an environment prototype based on the lessons learned from two previous generations of performance data analysis software. Our environment prototype contains a set of performance data transformation modules that can be interconnected in user-specified ways. It is the responsibility of the environment infrastructure to hide details of module interconnection and data sharing. The environment is written in C++ with the graphical displays based on X windows and the Motif toolkit. It allows users to interconnect and configure modules graphically to form an acyclic, directed data analysis graph. Performance trace data are represented in a self-documenting stream format that includes internal definitions of data types, sizes, and names. The environment prototype supports the use of head-mounted displays and sonic data presentation in addition to the traditional use of visual techniques.
Safety and Performance Analysis of the Non-Radar Oceanic/Remote Airspace In-Trail Procedure
NASA Technical Reports Server (NTRS)
Carreno, Victor A.; Munoz, Cesar A.
2007-01-01
This document presents a safety and performance analysis of the nominal case for the In-Trail Procedure (ITP) in a non-radar oceanic/remote airspace. The analysis estimates the risk of collision between the aircraft performing the ITP and a reference aircraft. The risk of collision is only estimated for the ITP maneuver and it is based on nominal operating conditions. The analysis does not consider human error, communication error conditions, or the normal risk of flight present in current operations. The hazards associated with human error and communication errors are evaluated in an Operational Hazards Analysis presented elsewhere.
ERIC Educational Resources Information Center
Zhao, Ningning; Valcke, Martin; Desoete, Annemie; Verhaeghe, JeanPierre
2012-01-01
The purpose of the present study is to explore the relationship between family socioeconomic status and mathematics performance on the base of a multi-level analysis involving a large sample of Chinese primary school students. A weak relationship is found between socioeconomic status and performance in the Chinese context. The relationship does…
A value-based medicine cost-utility analysis of idiopathic epiretinal membrane surgery.
Gupta, Omesh P; Brown, Gary C; Brown, Melissa M
2008-05-01
To perform a reference-case cost-utility analysis of epiretinal membrane (ERM) surgery using current literature on outcomes and complications. Computer-based, value-based medicine analysis. Decision analyses were performed under two scenarios: ERM surgery in the better-seeing eye and ERM surgery in the worse-seeing eye. The models applied long-term published data, primarily from the Blue Mountains Eye Study and the Beaver Dam Eye Study. Visual acuity and major complications were derived from 25-gauge pars plana vitrectomy studies. Patient-based, time trade-off utility values, Markov modeling, sensitivity analysis, and net present value adjustments were used in the design and calculation of results. Main outcome measures included the number of discounted quality-adjusted life-years (QALYs) gained and dollars spent per QALY gained. ERM surgery in the better-seeing eye compared with observation resulted in a mean gain of 0.755 discounted QALYs (3% annual rate) per patient treated, yielding $4,680 per QALY for this procedure. In sensitivity analysis, varying the utility values moved the result from $6,245 to $3,746/QALY gained, varying medical costs moved it from $3,510 to $5,850/QALY gained, and a higher ERM recurrence rate increased it to $5,524/QALY. ERM surgery in the worse-seeing eye compared with observation resulted in a mean gain of 0.27 discounted QALYs per patient treated, at $16,146/QALY with a range of $20,183 to $12,110 based on sensitivity analyses; varying utility values moved the result from $21,520 to $12,916/QALY, and a higher ERM recurrence rate increased it to $16,846/QALY. ERM surgery is a very cost-effective procedure when compared with other interventions across medical subspecialties.
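The arithmetic behind a $/QALY figure is simple enough to show directly. The sketch below discounts a constant yearly utility gain at 3% and divides an incremental cost by the discounted QALY total; the yearly gain, horizon, and cost are back-calculated assumptions chosen to reproduce the better-seeing-eye result, not values taken from the authors' models.

```python
# Worked cost-utility arithmetic: discounted QALYs and dollars per QALY.
years = 13                    # assumed remaining life expectancy (not from the paper)
annual_utility_gain = 0.071   # assumed constant yearly utility improvement
rate = 0.03                   # 3% annual discount rate, as in the study

qalys = sum(annual_utility_gain / (1 + rate) ** t for t in range(1, years + 1))
incremental_cost = 3533.0     # dollars; implied by 0.755 QALYs at ~$4,680/QALY
print(f"discounted QALYs gained: {qalys:.3f}")                  # ~0.755
print(f"cost-utility: ${incremental_cost / qalys:,.0f}/QALY")   # ~$4,680/QALY
```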
NASA Astrophysics Data System (ADS)
Keyport, Ren N.; Oommen, Thomas; Martha, Tapas R.; Sajinkumar, K. S.; Gierke, John S.
2018-02-01
A comparative analysis of landslides detected by pixel-based and object-oriented analysis (OOA) methods was performed using very high-resolution (VHR) remotely sensed aerial images for San Juan La Laguna, Guatemala, which witnessed widespread devastation during the 2005 Hurricane Stan. A 3-band orthophoto of 0.5 m spatial resolution together with a field-based inventory of 115 landslides was used for the analysis. A binary reference was assigned, with a value of zero for landslide and unity for non-landslide pixels. The pixel-based analysis was performed using unsupervised classification, which resulted in 11 different trial classes. Detection of landslides using OOA included two-step K-means clustering to eliminate regions based on brightness, followed by elimination of false positives using object properties such as rectangular fit, compactness, length/width ratio, mean difference of objects, and slope angle. Both overall accuracy and F-score for the OOA method outperformed pixel-based unsupervised classification in both landslide and non-landslide classes. The overall accuracy for OOA and pixel-based unsupervised classification was 96.5% and 94.3%, respectively, whereas the best F-scores for landslide identification with OOA and pixel-based unsupervised methods were 84.3% and 77.9%, respectively. Results indicate that OOA is able to identify the majority of landslides with few false positives when compared to pixel-based unsupervised classification.
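The two routes compared above can be caricatured in a short script: k-means on brightness for the pixel-based route, then filtering of labeled objects by shape properties for a simplified OOA route. All thresholds and the synthetic image are placeholders, and the slope-angle criterion is omitted because it requires a DEM.

```python
# Pixel-based clustering vs. a simplified object-oriented filtering pass.
import numpy as np
from sklearn.cluster import KMeans
from skimage.measure import label, regionprops

rng = np.random.default_rng(0)
image = rng.random((200, 200))
image[60:90, 40:120] += 1.0                      # bright, elongated "landslide"

# Pixel-based: unsupervised clustering of brightness into two classes.
classes = KMeans(n_clusters=2, n_init=10, random_state=0) \
    .fit_predict(image.reshape(-1, 1)).reshape(image.shape)
bright = classes == np.argmax([image[classes == c].mean() for c in (0, 1)])

# Object-oriented: keep only objects whose geometry is landslide-like.
candidates = label(bright)
mask = np.zeros_like(bright)
for region in regionprops(candidates):
    elongated = region.major_axis_length / max(region.minor_axis_length, 1) > 1.5
    if region.area > 100 and region.extent > 0.6 and elongated:
        mask[candidates == region.label] = True  # rectangular fit + length/width

print("pixel-based positives:", bright.sum(), " OOA positives:", mask.sum())
```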
Distributed optical fiber vibration sensor based on spectrum analysis of Polarization-OTDR system.
Zhang, Ziyi; Bao, Xiaoyi
2008-07-07
A fully distributed optical fiber vibration sensor is demonstrated based on spectrum analysis of a Polarization-OTDR system. Without performing any data averaging, detection of vibration disturbances up to 5 kHz is successfully demonstrated on a 1 km fiber link with 10 m spatial resolution. The FFT is performed at each spatial resolution cell; the relation between the disturbance at each frequency component and location allows simultaneous detection of multiple events with different or identical frequency components.
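The processing step described above, an FFT along the time axis at every spatial resolution cell followed by a peak search, can be sketched as follows; the trace matrix, pulse rate, and threshold are synthetic stand-ins for real POTDR data.

```python
# Per-cell FFT of POTDR traces to localize vibration events and their frequencies.
import numpy as np

pulse_rate = 10_000.0                    # traces per second (assumed)
n_traces, n_cells = 2048, 100            # time samples x 10 m cells on a 1 km fiber
t = np.arange(n_traces) / pulse_rate

traces = 0.01 * np.random.randn(n_traces, n_cells)
traces[:, 37] += np.sin(2 * np.pi * 800.0 * t)     # 800 Hz vibration at cell 37
traces[:, 62] += np.sin(2 * np.pi * 3000.0 * t)    # 3 kHz vibration at cell 62

spectrum = np.abs(np.fft.rfft(traces, axis=0))     # one spectrum per cell
freqs = np.fft.rfftfreq(n_traces, 1 / pulse_rate)
for cell in np.flatnonzero(spectrum[1:].max(axis=0) > 100):   # skip DC bin
    peak = freqs[1 + np.argmax(spectrum[1:, cell])]
    print(f"event at cell {cell} (~{cell * 10} m): {peak:.0f} Hz")
```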
Tool Efficiency Analysis model research in SEMI industry
NASA Astrophysics Data System (ADS)
Lei, Ma; Nana, Zhang; Zhongqiu, Zhang
2018-06-01
One of the key goals in the SEMI industry is to improve equipment throughput and ensure that equipment production efficiency is maximized. This paper builds on SEMI standards for semiconductor equipment control, defines the transition rules between different tool states, and presents a Tool Efficiency Analysis (TEA) system model that analyzes tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness verified; it produced the parameter values used to measure equipment performance, along with suggestions for improvement.
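A toy version of the finite-state-machine core such a TEA model needs is sketched below: a transition table over SEMI E10-style equipment states with timestamps, so state occupancy (and hence utilization metrics) can be accumulated per tool. The state names and allowed transitions are illustrative assumptions, not the SEMI standard's exact definitions.

```python
# Toy tool-state FSM: legal transitions plus per-state occupancy accounting.
from collections import defaultdict

ALLOWED = {
    "productive":  {"standby", "unscheduled_down", "scheduled_down"},
    "standby":     {"productive", "engineering", "scheduled_down"},
    "engineering": {"standby"},
    "scheduled_down":   {"standby"},
    "unscheduled_down": {"standby"},
}

class ToolStateMachine:
    def __init__(self, state="standby"):
        self.state, self.t, self.occupancy = state, 0.0, defaultdict(float)

    def transition(self, new_state, timestamp):
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.occupancy[self.state] += timestamp - self.t   # time spent in old state
        self.state, self.t = new_state, timestamp

tool = ToolStateMachine()
for state, ts in [("productive", 10.0), ("unscheduled_down", 55.0),
                  ("standby", 70.0), ("productive", 75.0)]:
    tool.transition(state, ts)
print(dict(tool.occupancy))   # time per state -> utilization metrics
```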
Laser Safety and Hazardous Analysis for the ARES (Big Sky) Laser System
DOE Office of Scientific and Technical Information (OSTI.GOV)
AUGUSTONI, ARNOLD L.
A laser safety and hazard analysis was performed for the ARES laser system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. The ARES laser system is a van/truck-based mobile platform, which is used to perform laser interaction experiments and tests at various national test sites.
NASA Technical Reports Server (NTRS)
Lefebvre, D. R.; Sanderson, A. C.
1994-01-01
Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult, but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems, and a performance analysis method, based on scheduling theory, which can handle concurrent hard-real-time response specifications. Use of the method is illustrated by a case of remote teleoperation which assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
NUMERICAL FLOW AND TRANSPORT SIMULATIONS SUPPORTING THE SALTSTONE FACILITY PERFORMANCE ASSESSMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, G.
2009-02-28
The Saltstone Disposal Facility Performance Assessment (PA) is being revised to incorporate requirements of Section 3116 of the Ronald W. Reagan National Defense Authorization Act for Fiscal Year 2005 (NDAA), and updated data and understanding of vault performance since the 1992 PA (Cook and Fowler 1992) and related Special Analyses. A hybrid approach was chosen for modeling contaminant transport from vaults and future disposal cells to exposure points. A higher-resolution, largely deterministic analysis is performed on a best-estimate Base Case scenario using the PORFLOW numerical analysis code. A few additional sensitivity cases are simulated to examine alternative scenarios and parameter settings. Stochastic analysis is performed on a simpler representation of the SDF system using the GoldSim code to estimate uncertainty and sensitivity about the Base Case. This report describes development of PORFLOW models supporting the SDF PA, and presents sample results to illustrate model behaviors and define impacts relative to key facility performance objectives. The SDF PA document, when issued, should be consulted for a comprehensive presentation of results.
Panda, Jibitesh Kumar; Sastry, Gadepalli Ravi Kiran; Rai, Ram Naresh
2018-05-25
The energy situation and concerns about global warming have ignited research interest in non-conventional and alternative fuel resources to decrease emissions and the continued dependency on fossil fuels, particularly in sectors such as power generation, transportation, and agriculture. The present work evaluates the performance, emission, and combustion characteristics of a biodiesel, palm kernel methyl ester (PKME), blended with the diesel additive triacetin. A timed manifold injection (TMI) system was used to examine how the injection durations of several blends influence the emission and performance characteristics compared to normal diesel operation. The experiments show better performance and lower emissions than mineral diesel, indicating that high performance and low emissions are attainable with PKME-triacetin fuel operation. The analysis also applies fuzzy logic-based Taguchi analysis to optimize the emission and performance parameters.
Aircraft Conceptual Design and Risk Analysis Using Physics-Based Noise Prediction
NASA Technical Reports Server (NTRS)
Olson, Erik D.; Mavris, Dimitri N.
2006-01-01
An approach was developed which allows for design studies of commercial aircraft using physics-based noise analysis methods while retaining the ability to perform the rapid trade-off and risk analysis studies needed at the conceptual design stage. A prototype integrated analysis process was created for computing the total aircraft EPNL at the Federal Aviation Regulations Part 36 certification measurement locations using physics-based methods for fan rotor-stator interaction tones and jet mixing noise. The methodology was then used in combination with design of experiments to create response surface equations (RSEs) for the engine and aircraft performance metrics, geometric constraints and take-off and landing noise levels. In addition, Monte Carlo analysis was used to assess the expected variability of the metrics under the influence of uncertainty, and to determine how the variability is affected by the choice of engine cycle. Finally, the RSEs were used to conduct a series of proof-of-concept conceptual-level design studies demonstrating the utility of the approach. The study found that a key advantage to using physics-based analysis during conceptual design lies in the ability to assess the benefits of new technologies as a function of the design to which they are applied. The greatest difficulty in implementing physics-based analysis proved to be the generation of design geometry at a sufficient level of detail for high-fidelity analysis.
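As an illustration of the RSE-plus-Monte-Carlo step described above, the sketch below simply propagates input uncertainty through an invented second-order response surface; the coefficient values, variable meanings, and uncertainty levels are all assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical second-order response surface for a noise metric (EPNL, dB)
# in two normalized design variables; coefficients are invented.
def epnl_rse(x1, x2):
    return 95.0 + 1.8 * x1 - 2.4 * x2 + 0.6 * x1 * x2 + 0.9 * x1**2 + 0.4 * x2**2

# Monte Carlo propagation of design-variable uncertainty through the RSE
x1 = rng.normal(0.0, 0.15, 100_000)   # e.g. fan pressure ratio (normalized)
x2 = rng.normal(0.0, 0.10, 100_000)   # e.g. bypass ratio (normalized)
samples = epnl_rse(x1, x2)
print(f"mean={samples.mean():.2f} dB, std={samples.std():.2f} dB, "
      f"95th percentile={np.percentile(samples, 95):.2f} dB")
```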
Motif-based analysis of large nucleotide data sets using MEME-ChIP
Ma, Wenxiu; Noble, William S; Bailey, Timothy L
2014-01-01
MEME-ChIP is a web-based tool for analyzing motifs in large DNA or RNA data sets. It can analyze peak regions identified by ChIP-seq, cross-linking sites identified by CLIP-seq and related assays, as well as sets of genomic regions selected using other criteria. MEME-ChIP performs de novo motif discovery, motif enrichment analysis, motif location analysis and motif clustering, providing a comprehensive picture of the DNA or RNA motifs that are enriched in the input sequences. MEME-ChIP performs two complementary types of de novo motif discovery: weight matrix-based discovery for high accuracy, and word-based discovery for high sensitivity. Motif enrichment analysis using DNA or RNA motifs from human, mouse, worm, fly and other model organisms provides even greater sensitivity. MEME-ChIP's interactive HTML output groups and aligns significant motifs to ease interpretation. This protocol takes less than 3 h, and it provides motif discovery approaches that are distinct and complementary to other online methods. PMID:24853928
An XML-Based Protocol for Distributed Event Services
NASA Technical Reports Server (NTRS)
Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)
2001-01-01
A recent trend in distributed computing is the construction of high-performance distributed systems called computational grids. One difficulty we have encountered is that there is no standard format for the representation of performance information and no standard protocol for transmitting this information. This limits the types of performance analysis that can be undertaken in complex distributed systems. To address this problem, we present an XML-based protocol for transmitting performance events in distributed systems and evaluate the performance of this protocol.
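The abstract does not reproduce the protocol's schema, so the event message below is purely illustrative; every element and attribute name is an assumption. It shows the general shape such an XML performance event might take and how a consumer would parse it:

```python
import xml.etree.ElementTree as ET

# Hypothetical performance event; element and attribute names are assumptions,
# not the protocol actually defined in the paper.
event_xml = """
<perfEvent source="grid-node-07" type="task.complete">
  <timestamp>2001-03-14T09:26:53Z</timestamp>
  <metric name="wallclock" unit="s">412.7</metric>
  <metric name="bytesTransferred" unit="B">1048576</metric>
</perfEvent>
"""

root = ET.fromstring(event_xml)
print(root.get("source"), root.get("type"))
for m in root.findall("metric"):
    print(m.get("name"), float(m.text), m.get("unit"))
```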
López-Pacheco, María G; Sánchez-Fernández, Luis P; Molina-Lozano, Herón
2014-01-15
Noise levels of common sources such as vehicles, whistles, sirens, car horns and crowd sounds are mixed in urban soundscapes. Nowadays, environmental acoustic analysis is performed on mixture signals recorded by monitoring systems. These mixed signals hinder the analysis of individual sources, which is useful for taking actions to reduce and control environmental noise. This paper aims to separate the individual noise sources from recorded mixtures in order to evaluate the noise level of each estimated source. A method based on blind deconvolution and blind source separation in the wavelet domain is proposed. This approach provides a basis to improve results obtained in monitoring and analysis of common noise sources in urban areas. The method is validated through experiments based on knowledge of the predominant noise sources in urban soundscapes. Actual recordings of common noise sources are used to acquire mixture signals using a microphone array in semi-controlled environments. The developed method has demonstrated substantial performance improvements in the identification, analysis and evaluation of common urban sources. © 2013 Elsevier B.V. All rights reserved.
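As a rough sketch of the separation idea only (not the authors' algorithm, which additionally involves blind deconvolution of convolutive mixtures), the snippet below unmixes an instantaneous two-microphone mixture in the wavelet domain using FastICA; the signals, mixing matrix, and wavelet choice are invented:

```python
import numpy as np
import pywt
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)

# Two synthetic "urban" sources: a swept tone (siren stand-in) and
# broadband noise (traffic stand-in), mixed instantaneously at 2 microphones.
t = np.linspace(0, 1, 8000)
s1 = np.sin(2 * np.pi * (200 + 300 * t) * t)
s2 = 0.5 * rng.standard_normal(t.size)
S = np.c_[s1, s2]
A = np.array([[1.0, 0.6], [0.4, 1.0]])   # assumed mixing matrix
X = S @ A.T

# Separate in the wavelet domain: transform each mixture, unmix the
# coefficients with ICA, then invert the transform.
coeffs = [pywt.wavedec(x, "db4", level=4) for x in X.T]
flat = np.stack([np.concatenate(c) for c in coeffs], axis=1)
unmixed = FastICA(n_components=2, random_state=0).fit_transform(flat)
splits = np.cumsum([len(a) for a in coeffs[0]])[:-1]
estimates = [pywt.waverec(np.split(unmixed[:, k], splits), "db4") for k in range(2)]
print(len(estimates), estimates[0].shape)
```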
NASA Astrophysics Data System (ADS)
Murrill, Steven R.; Jacobs, Eddie L.; Franck, Charmaine C.; Petkie, Douglas T.; De Lucia, Frank C.
2015-10-01
The U.S. Army Research Laboratory (ARL) has continued to develop and enhance a millimeter-wave (MMW) and submillimeter-wave (SMMW)/terahertz (THz)-band imaging system performance prediction and analysis tool for both the detection and identification of concealed weaponry, and for pilotage obstacle avoidance. The details of the MATLAB-based model, which accounts for the effects of all critical sensor and display components, for the effects of atmospheric attenuation, concealment material attenuation, and active illumination, were reported on at the 2005 SPIE Europe Security and Defence Symposium (Brugge). An advanced version of the base model that accounts for both the dramatic impact that target and background orientation can have on target observability as related to specular and Lambertian reflections captured by an active-illumination-based imaging system, and for the impact of target and background thermal emission, was reported on at the 2007 SPIE Defense and Security Symposium (Orlando). Further development of this tool that includes a MODTRAN-based atmospheric attenuation calculator and advanced system architecture configuration inputs that allow for straightforward performance analysis of active or passive systems based on scanning (single- or line-array detector element(s)) or staring (focal-plane-array detector elements) imaging architectures was reported on at the 2011 SPIE Europe Security and Defence Symposium (Prague). This paper provides a comprehensive review of a newly enhanced MMW and SMMW/THz imaging system analysis and design tool that now includes an improved noise sub-model for more accurate and reliable performance predictions, the capability to account for postcapture image contrast enhancement, and the capability to account for concealment material backscatter with active-illumination-based systems. Present plans for additional expansion of the model's predictive capabilities are also outlined.
Uniformity testing: assessment of a centralized web-based uniformity analysis system.
Klempa, Meaghan C
2011-06-01
Uniformity testing is performed daily to ensure adequate camera performance before clinical use. The aim of this study is to assess the reliability of Beth Israel Deaconess Medical Center's locally built, centralized, Web-based uniformity analysis system by examining the differences between manufacturer and Web-based National Electrical Manufacturers Association integral uniformity calculations measured in the useful field of view (FOV) and the central FOV. Manufacturer and Web-based integral uniformity calculations measured in the useful FOV and the central FOV were recorded over a 30-d period for 4 cameras from 3 different manufacturers. These data were then statistically analyzed. The differences between the uniformity calculations were computed, in addition to the means and the SDs of these differences for each head of each camera. There was a correlation between the manufacturer and Web-based integral uniformity calculations in the useful FOV and the central FOV over the 30-d period. The average differences between the manufacturer and Web-based useful FOV calculations ranged from -0.30 to 0.099, with SD ranging from 0.092 to 0.32. For the central FOV calculations, the average differences ranged from -0.163 to 0.055, with SD ranging from 0.074 to 0.24. Most of the uniformity calculations computed by this centralized Web-based uniformity analysis system are comparable to the manufacturers' calculations, suggesting that this system is reasonably reliable and effective. This finding is important because centralized Web-based uniformity analysis systems are advantageous in that they test camera performance in the same manner regardless of the manufacturer.
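For reference, the NEMA integral uniformity metric compared in this study is a simple max/min contrast over the flood-field image; a minimal sketch follows (the full NEMA NU-1 procedure also prescribes pixel sizing and nine-point smoothing, omitted here for brevity):

```python
import numpy as np

def nema_integral_uniformity(flood, central_fraction=None):
    """Integral uniformity (%) = 100 * (max - min) / (max + min) over a
    flood-field image. Pass central_fraction (e.g. 0.75) to evaluate the
    central field of view instead of the useful field of view."""
    img = np.asarray(flood, dtype=float)
    if central_fraction is not None:
        ry, rx = [int(s * (1 - central_fraction) / 2) for s in img.shape]
        img = img[ry:img.shape[0] - ry, rx:img.shape[1] - rx]
    hi, lo = img.max(), img.min()
    return 100.0 * (hi - lo) / (hi + lo)

# Synthetic flood image: Poisson counting noise around a uniform mean
flood = np.random.default_rng(2).poisson(10_000, size=(64, 64))
print(f"UFOV IU = {nema_integral_uniformity(flood):.2f}%")
print(f"CFOV IU = {nema_integral_uniformity(flood, 0.75):.2f}%")
```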
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rechard, Robert P.
This report presents a concise history in tabular form of events leading up to site identification in 1978, site selection in 1987, subsequent characterization, and ongoing analysis through 2008 of the performance of a repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain in southern Nevada. The tabulated events generally occurred in five periods: (1) commitment to mined geologic disposal and identification of sites; (2) site selection and analysis, based on regional geologic characterization through literature and analogous data; (3) feasibility analysis demonstrating calculation procedures and importance of system components, based on rough measures of performance using surface exploration, waste process knowledge, and general laboratory experiments; (4) suitability analysis demonstrating viability of disposal system, based on environment-specific laboratory experiments, in-situ experiments, and underground disposal system characterization; and (5) compliance analysis, based on completed site-specific characterization. Because the relationship is important to understanding the evolution of the Yucca Mountain Project, the tabulation also shows the interaction between four broad categories of political bodies and government agencies/institutions: (a) technical milestones of the implementing institutions, (b) development of the regulatory requirements and related federal policy in laws and court decisions, (c) Presidential and agency directives and decisions, and (d) critiques of the Yucca Mountain Project and pertinent national and world events related to nuclear energy and radioactive waste.
Systems Analysis Of Advanced Coal-Based Power Plants
NASA Technical Reports Server (NTRS)
Ferrall, Joseph F.; Jennings, Charles N.; Pappano, Alfred W.
1988-01-01
Report presents appraisal of integrated coal-gasification/fuel-cell power plants. Based on study comparing fuel-cell technologies with each other and with coal-based alternatives; recommends most promising ones for research and development. Evaluates capital cost, cost of electricity, fuel consumption, and conformance with environmental standards. Analyzes sensitivity of cost of electricity to changes in fuel cost, to economic assumptions, and to level of technology. Recommends further evaluation of integrated coal-gasification/fuel-cell, integrated coal-gasification/combined-cycle, and pulverized-coal-fired plants. Concludes with appendixes detailing plant-performance models, subsystem-performance parameters, performance goals, cost bases, plant-cost data sheets, and plant sensitivity to fuel-cell performance.
Howard Evan Canfield; Vicente L. Lopes
2000-01-01
A process-based, simulation model for evaporation, soil water and streamflow (BROOK903) was used to estimate soil moisture change on a semiarid rangeland watershed in southeastern Arizona. A sensitivity analysis was performed to select parameters affecting ET and soil moisture for calibration. Automatic parameter calibration was performed using a procedure based on a...
Graphs, matrices, and the GraphBLAS: Seven good reasons
Kepner, Jeremy; Bader, David; Buluç, Aydın; ...
2015-01-01
The analysis of graphs has become increasingly important to a wide range of applications. Graph analysis presents a number of unique challenges in the areas of (1) software complexity, (2) data complexity, (3) security, (4) mathematical complexity, (5) theoretical analysis, (6) serial performance, and (7) parallel performance. Implementing graph algorithms using matrix-based approaches provides a number of promising solutions to these challenges. The GraphBLAS standard (istcbigdata.org/GraphBlas) is being developed to bring the potential of matrix-based graph algorithms to the broadest possible audience. The GraphBLAS mathematically defines a core set of matrix-based graph operations that can be used to implement a wide class of graph algorithms in a wide range of programming environments. This paper provides an introduction to the GraphBLAS and describes how the GraphBLAS can be used to address many of the challenges associated with analysis of graphs.
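To make the matrix-graph duality concrete, here is breadth-first search written as repeated matrix-vector products, the core pattern the GraphBLAS standardizes; plain NumPy stands in for a real GraphBLAS binding, with the Boolean (OR, AND) semiring emulated by an integer product:

```python
import numpy as np

def bfs_levels(adj, source):
    """Level-synchronous BFS over a Boolean adjacency matrix."""
    n = adj.shape[0]
    levels = np.full(n, -1, dtype=int)
    frontier = np.zeros(n, dtype=bool)
    frontier[source] = True
    level = 0
    while frontier.any():
        levels[frontier] = level
        # adj.T @ frontier over (OR, AND): vertices reachable from the frontier
        reached = (adj.T.astype(np.int64) @ frontier.astype(np.int64)) > 0
        frontier = reached & (levels == -1)   # mask out visited vertices
        level += 1
    return levels

# Directed graph: 0->1, 0->2, 1->3, 2->3, 3->4
adj = np.zeros((5, 5), dtype=bool)
for u, v in [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]:
    adj[u, v] = True
print(bfs_levels(adj, 0))   # -> [0 1 1 2 3]
```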
Characterization and analysis of motion mechanism of electroactive chitosan-based actuator.
Altınkaya, Emine; Seki, Yoldaş; Çetin, Levent; Gürses, Barış Oğuz; Özdemir, Okan; Sever, Kutlay; Sarıkanat, Mehmet
2018-02-01
In order to analyze the bending mechanism of the electroactive chitosan-based actuator, different amounts of poly(diallyldimethylammonium chloride) (PDAD) were incorporated into chitosan solution. The effects of PDAD concentration on the electromechanical performance of the chitosan actuator were investigated under various excitation voltages. With the incorporation of PDAD into chitosan solution, the crosslinked chitosan film acts as an actuator showing considerable displacement behavior. However, higher PDAD loadings decreased the performance of the actuators. Thermal, viscoelastic, and crystallographic properties of the chitosan films were examined by thermogravimetric analysis, dynamic mechanical analysis, and X-ray diffraction analysis, respectively. The effect of incorporating PDAD into the chitosan-based film on the morphological properties of the film was determined by scanning electron microscopy. It was observed that the films involving PDAD have a larger pore size than the PDAD-free film. Copyright © 2017 Elsevier Ltd. All rights reserved.
System-based strategies for p53 recovery.
Azam, Muhammad Rizwan; Fazal, Sahar; Ullah, Mukhtar; Bhatti, Aamer I
2018-06-01
The authors have proposed a novel systems-theory-based drug design approach for the p53 pathway. The pathway is treated as a dynamic system represented by a mathematical model of ordinary differential equations. Using control engineering practices, system analysis and subsequent controller design are performed for the re-activation of wild-type p53. p53 revival is discussed for both modes of operation, i.e. sustained and oscillatory. To cast the problem in the control system paradigm, the existing mathematical model is modified to incorporate the effect of Nutlin. Attractor point analysis is carried out to select a suitable domain of attraction. A two-loop negative feedback control strategy is devised to drag the system trajectories to the attractor point and to regulate the cellular concentration of Nutlin, respectively. An integrated framework is constituted to incorporate the pharmacokinetic effects of Nutlin in the cancerous cells. Bifurcation analysis is also performed on the p53 model to identify the conditions for p53 oscillation.
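The paper's ODE model is not reproduced in the abstract; the toy system below only illustrates the control-engineering framing, i.e. a proportional controller regulating Nutlin infusion to drive p53 toward a setpoint against Mdm2-mediated degradation. All rate constants, the control law, and the Nutlin action term are invented:

```python
import numpy as np

# Toy stand-in for the p53-Mdm2 negative feedback loop with a proportional
# controller shaping Nutlin infusion. The paper's actual ODE system is larger;
# every rate constant and the control law here are illustrative only.
def simulate(p53_ref=2.0, kp=0.8, dt=0.01, t_end=50.0):
    p53, mdm2, nutlin = 0.5, 0.5, 0.0
    traj = []
    for _ in np.arange(0.0, t_end, dt):
        u = max(0.0, kp * (p53_ref - p53))          # drive p53 toward setpoint
        # Mdm2 degrades p53; Nutlin attenuates Mdm2-mediated degradation
        dp53 = 1.0 - 0.8 * mdm2 * p53 / (1.0 + nutlin) - 0.1 * p53
        dmdm2 = 0.6 * p53 - 0.5 * mdm2              # p53 upregulates Mdm2
        dnutlin = u - 0.3 * nutlin                  # infusion minus clearance
        p53 += dp53 * dt
        mdm2 += dmdm2 * dt
        nutlin += dnutlin * dt
        traj.append(p53)
    return np.array(traj)

traj = simulate()
print(f"final p53 level: {traj[-1]:.3f} (setpoint 2.0)")
```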
Morin, Ruth T; Axelrod, Bradley N
Latent Class Analysis (LCA) was used to classify a heterogeneous sample of neuropsychology data. In particular, we used measures of performance validity, symptom validity, cognition, and emotional functioning to assess and describe latent groups of functioning in these areas. A data set of 680 neuropsychological evaluation protocols was analyzed using LCA. Data were collected from evaluations performed for clinical purposes at an urban medical center. A four-class model emerged as the best-fitting model of latent classes. The resulting classes were distinct based on measures of performance validity and symptom validity. Class A performed poorly on both performance and symptom validity measures. Class B had intact performance validity and heightened symptom reporting. The remaining two classes performed adequately on both performance and symptom validity measures, differing only in cognitive and emotional functioning. In general, performance invalidity was associated with worse cognitive performance, while symptom invalidity was associated with elevated emotional distress. LCA appears useful in identifying groups within a heterogeneous sample with distinct performance patterns. Further, the orthogonal nature of performance and symptom validities is supported.
Han, Chao; Chen, Junhui; Chen, Bo; Lee, Frank Sen-Chun; Wang, Xiaoru
2006-09-01
A simple and reliable high-performance liquid chromatographic (HPLC) method has been developed and validated for the fingerprinting of extracts from the root of Pseudostellaria heterophylla (Miq.) Pax. HPLC with gradient elution was performed on an authentic reference standard of powdered P. heterophylla (Miq.) Pax root and on 11 plant samples of the root collected from different geographic locations. The HPLC chromatograms have been standardized through the selection and identification of reference peaks and the normalization of retention times and peak intensities of all the common peaks. The standardized HPLC fingerprints show high stability and reproducibility, and thus can be used effectively for the screening analysis or quality assessment of the root or its derived products. Similarity index calculations based on cosine angle values or correlation methods have been performed on the HPLC fingerprints. As a group, the fingerprints of the P. heterophylla (Miq.) Pax samples studied are highly correlated, with closely similar fingerprints. Within the group, the samples can be further divided into subgroups based on hierarchical clustering analysis (HCA). Sample grouping based on HCA coincides nicely with grouping based on the geographical origins of the samples. The HPLC fingerprinting techniques thus have high potential in authentication or source-tracing types of applications.
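A minimal sketch of the two chemometric steps named above, cosine-similarity indices and HCA, using toy peak-area vectors in place of real chromatographic fingerprints:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Fingerprints as vectors of normalized areas at the common peaks
# (toy numbers; real fingerprints carry many more peaks).
fingerprints = np.array([
    [0.90, 0.20, 0.50, 0.10],
    [0.88, 0.22, 0.48, 0.12],
    [0.30, 0.70, 0.20, 0.60],
])

def cosine_similarity_matrix(X):
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    return Xn @ Xn.T

print(np.round(cosine_similarity_matrix(fingerprints), 3))

# Hierarchical clustering analysis (HCA) on correlation distance
Z = linkage(fingerprints, method="average", metric="correlation")
print(fcluster(Z, t=2, criterion="maxclust"))   # e.g. [1 1 2]
```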
Quantitative assessment of human motion using video motion analysis
NASA Technical Reports Server (NTRS)
Probe, John D.
1993-01-01
In the study of the dynamics and kinematics of the human body, a wide variety of technologies have been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video-based motion analysis systems to emerge as a cost-effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video-based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT-compatible computer, and the proprietary software.
School-Based Decision Making: A Principal-Agent Perspective.
ERIC Educational Resources Information Center
Ferris, James M.
1992-01-01
A principal-agent framework is used to examine potential gains in educational performance and potential threats to public accountability that school-based decision-making proposals pose. Analysis underscores the need to tailor the design of decentralized decision making to the sources of poor educational performance and threats to school…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, T.F.; Mok, G.C.; Carlson, R.W.
1996-12-01
CASKS is a microcomputer-based computer system developed by LLNL to assist the Nuclear Regulatory Commission in performing confirmatory analyses for licensing review of radioactive-material storage cask designs. The analysis programs of the CASKS computer system consist of four modules: the impact analysis module, the thermal analysis module, the thermally-induced stress analysis module, and the pressure-induced stress analysis module. CASKS uses a series of menus to coordinate input programs, cask analysis programs, output programs, data archive programs and databases, so the user is able to run the system in an interactive environment. This paper outlines the theoretical background of the impact analysis module and the yielding surface formulation. The close agreement between the CASKS analytical predictions and the results obtained from the two storage cask drop tests performed by SNL and by BNFL at Winfrith serves as the validation of the CASKS impact analysis module.
NASA Astrophysics Data System (ADS)
Hasanuddin; Setyawan, A.; Yulianto, B.
2018-03-01
Assessment of road pavement performance is deemed necessary to improve the quality of road maintenance and rehabilitation management. This research evaluates a road both functionally and structurally and recommends the handling to be done. Pavement performance is assessed with functional and structural evaluation. Functional evaluation of the pavement is based on the IRI (International Roughness Index) value, derived here from NAASRA readings, for analysis and recommended road handling. Structural evaluation of the pavement is done by analyzing deflection values from FWD (Falling Weight Deflectometer) data, resulting in SN (Structural Number) values. The analysis yields SNeff (effective Structural Number) and SNf (future Structural Number) values; comparing SNeff to SNf leads to the SCI (Structural Condition Index) value, which implies the recommended pavement handling. The study of the Simpang Tuan-Batas Kota Jambi road was based on functional analysis: the road was split into 12 segments, in which segments 1, 3, 5, 7, 9, and 11 require regular maintenance, segments 2, 4, 8, 10, and 12 periodic maintenance, and segment 6 rehabilitation. The structural analysis resulted in 8 segments, with segments 1 and 2 recommended for regular maintenance, segments 3, 4, 5, and 7 for functional overlay, and segments 6 and 8 for structural overlay.
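A small sketch of the SCI step described above; the SCI ratio follows the abstract, while the decision thresholds are illustrative assumptions rather than the paper's criteria:

```python
def structural_condition_index(sn_eff, sn_f):
    """SCI = SNeff / SNf: ratio of the in-service structural number (from FWD
    deflections) to the structural number required for future traffic."""
    return sn_eff / sn_f

def recommend(sci):
    # Threshold values are illustrative assumptions, not the paper's criteria.
    if sci >= 1.0:
        return "regular maintenance"
    elif sci >= 0.8:
        return "functional overlay"
    return "structural overlay"

# Hypothetical segment data: (SNeff, SNf) pairs
for seg, (sn_eff, sn_f) in enumerate([(4.2, 3.9), (3.1, 3.6), (2.2, 3.5)], start=1):
    sci = structural_condition_index(sn_eff, sn_f)
    print(f"segment {seg}: SCI={sci:.2f} -> {recommend(sci)}")
```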
Probabilistic Finite Element Analysis & Design Optimization for Structural Designs
NASA Astrophysics Data System (ADS)
Deivanayagam, Arumugam
This study focuses on implementing the probabilistic nature of material properties (Kevlar® 49) in the existing deterministic finite element analysis (FEA) of a fabric-based engine containment system through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering design through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis focusing on probabilistic distribution models that characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled from the experimental data and implemented along with an existing spiral modeling scheme (SMS) and user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure patterns and exit velocities of the models, and the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data give a better perspective on the results than a single deterministic simulation. The next part of the research implements the probabilistic material properties in engineering design. The main aim of structural design is to obtain optimal solutions; however, even though the structures produced by deterministic optimization are cost-effective, they become highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented or considered in the solution process. Reliable and optimal solutions can be obtained by performing reliability optimization along with deterministic optimization, which is RBDO. In the RBDO problem formulation, reliability constraints are considered in addition to structural performance constraints. This part of the research starts with an introduction to reliability analysis, such as first-order and second-order reliability analysis, followed by simulation techniques performed to obtain the probability of failure and reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation with sensitivity analysis, which is performed to remove the highly reliable constraints in the RBDO, thereby reducing the computational time and function evaluations. This is followed by implementation of the reliability analysis concepts and RBDO in finite element 2D truss problems and a planar beam problem, which are presented and discussed.
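The simplest building block behind both the MCS study and the reliability constraints in RBDO is a Monte Carlo estimate of the probability of failure for a limit state g = R - S; the sketch below uses invented distribution parameters as a stand-in for the Kevlar 49 strength data:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Limit state g = R - S: failure when strength R falls below demand S.
# Distribution parameters are illustrative, not measured Kevlar 49 data.
n = 1_000_000
R = rng.lognormal(mean=np.log(3.0), sigma=0.08, size=n)   # strength, GPa
S = rng.normal(loc=2.4, scale=0.15, size=n)               # demand, GPa
pf = np.mean(R <= S)
print(f"P(failure) ~ {pf:.2e}, generalized reliability index ~ {-norm.ppf(pf):.2f}")
```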
NASA Technical Reports Server (NTRS)
Welch, Bryan W.
2016-01-01
NASA is participating in the International Committee on Global Navigation Satellite Systems (GNSS) (ICG)'s efforts towards demonstrating the benefits to the space user in the Space Service Volume (SSV) when a multi-GNSS solution space approach is utilized. The ICG Working Group on Enhancement of GNSS Performance, New Services and Capabilities has started a three-phase analysis initiative of increasing complexity and fidelity as an outcome of recommendations at the ICG-10 meeting, in preparation for the ICG-11 meeting. The second phase augments the Phase 1 purely geometrical approach with signal-strength-based limitations to determine whether access is valid. That second phase of analysis has been completed, and the results are documented in this paper.
Integrating Reliability Analysis with a Performance Tool
NASA Technical Reports Server (NTRS)
Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael
1995-01-01
A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.
Solar array electrical performance assessment for Space Station Freedom
NASA Technical Reports Server (NTRS)
Smith, Bryan K.; Brisco, Holly
1993-01-01
Electrical power for Space Station Freedom will be generated by large photovoltaic arrays with a beginning-of-life power requirement of 30.8 kW per array. The solar arrays will operate in a Low Earth Orbit (LEO) over a design life of fifteen years. This paper provides an analysis of the predicted solar array electrical performance over the design life and presents a summary of supporting analysis and test data for the assigned model parameters and performance loss factors. Each model parameter and loss factor is assessed based upon program requirements, component analysis, and test data to date. A description of the LMSC performance model, future test plans, and predicted performance ranges are also given.
Classifying Higher Education Institutions in Korea: A Performance-Based Approach
ERIC Educational Resources Information Center
Shin, Jung Cheol
2009-01-01
The purpose of this study was to classify higher education institutions according to institutional performance rather than predetermined benchmarks. Institutional performance was defined as research performance and classified using Hierarchical Cluster Analysis, a statistical method that classifies objects according to specified classification…
IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. W. Parry; J.A. Forester; V.N. Dang
2013-09-01
This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), and the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.
An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon
NASA Technical Reports Server (NTRS)
Rutherford, Brian
2000-01-01
The ability to make credible system assessments, predictions and design decisions related to engineered systems and other complex phenomena is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomena is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluation of the potential information provided by each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components: one component that could be explained through the additional computational simulation runs and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system. The set of runs indicating the largest potential reduction in uncertainty is then selected and the computational simulations are performed. Examples are provided to demonstrate this approach on small-scale problems. These examples give encouraging results. Directions for further research are indicated.
Sutton, J P; DeJong, G; Song, H; Wilkerson, D
1997-12-01
To operationalize research findings about a medical rehabilitation classification and payment model by building a prototype of a prospective payment system, and to determine whether this prototype model promotes payment equity. This latter objective is accomplished by identifying whether any facility or payment model characteristics are systematically associated with financial performance. This study was conducted in two phases. In Phase 1 the components of a diagnosis-related group (DRG)-like payment system, including a base rate, function-related group (FRG) weights, and adjusters, were identified and estimated using hospital cost functions. Phase 2 consisted of a simulation analysis in which each facility's financial performance was modeled, based on its 1990-1991 case mix. A multivariate regression analysis was conducted to assess the extent to which characteristics of 42 rehabilitation facilities contribute toward determining financial performance under the present Medicare payment system as well as under the hypothetical model developed. Phase 1 (model development) included 61 rehabilitation hospitals. Approximately 59% were rehabilitation units within a general hospital and 48% were teaching facilities. The number of rehabilitation beds averaged 52. Phase 2 of the simulation analysis included 42 rehabilitation facilities, subscribers to UDS in 1990-1991. Of these, 69% were rehabilitation units and 52% were teaching facilities. The number of rehabilitation beds averaged 48. Financial performance was measured by the ratio of reimbursement to average costs. Case-mix index is the primary determinant of financial performance under the present Medicare payment system. None of the facility characteristics included in this analysis were associated with financial performance under the hypothetical FRG payment model. The most notable impact of an FRG-based payment model would be to create a stronger link between resource intensity and level of reimbursement, resulting in greater equity in the reimbursement of inpatient medical rehabilitation hospitals.
ERIC Educational Resources Information Center
Macmann, Gregg M.; Barnett, David W.
1994-01-01
Describes exploratory and confirmatory analyses of verbal-performance procedures to illustrate concepts and procedures for analysis of correlated factors. Argues that, based on convergent and discriminant validity criteria, factors should have higher correlations with variables that they purport to measure than with other variables. Discusses…
An Analysis of Critical Issues in Korean Teacher Evaluation Systems
ERIC Educational Resources Information Center
Choi, Hee Jun; Park, Ji-Hye
2016-01-01
Korea has used three different teacher evaluation systems since the 1960s: teacher performance rating, teacher performance-based pay and teacher evaluation for professional development. A number of studies have focused on an analysis of each evaluation system in terms of its advent, development, advantages and disadvantages, but these studies have…
Luker, Kali R; Sullivan, Maura E; Peyre, Sarah E; Sherman, Randy; Grunwald, Tiffany
2008-01-01
The aim of this study was to compare the surgical knowledge of residents before and after receiving a cognitive task analysis-based multimedia teaching module. Ten plastic surgery residents were evaluated performing flexor tendon repair on 3 occasions. Traditional learning occurred between the first and second trial and served as the control. A teaching module was introduced as an intervention between the second and third trial using cognitive task analysis to illustrate decision-making skills. All residents showed improvement in their decision-making ability when performing flexor tendon repair after each surgical procedure. The group improved through traditional methods as well as exposure to our talk-aloud protocol (P > .01). After being trained using the cognitive task analysis curriculum the group displayed a statistically significant knowledge expansion (P < .01). Residents receiving cognitive task analysis-based multimedia surgical curriculum instruction achieved greater command of problem solving and are better equipped to make correct decisions in flexor tendon repair.
Performance Analysis of Receive Diversity in Wireless Sensor Networks over GBSBE Models
Goel, Shivali; Abawajy, Jemal H.; Kim, Tai-hoon
2010-01-01
Wireless sensor networks have attracted a lot of attention recently. In this paper, we develop a channel model based on the elliptical model for multipath components involving randomly placed scatterers in the scattering region, with sensors deployed on a field. We verify that in a sensor network the use of receive diversity techniques improves the performance of the system. Extensive performance analysis of the system is carried out for both single and multiple antennas with the applied receive diversity techniques. Performance analyses based on variations in receiver height, maximum multipath delay and transmit power have been performed considering different numbers of antenna elements present in the receiver array. Our results show that increasing the number of antenna elements for a wireless sensor network does indeed improve the BER rates that can be obtained. PMID:22163510
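As a generic illustration of why receive diversity helps (this assumes i.i.d. flat Rayleigh branches, not the paper's GBSBE elliptical channel model), the following Monte Carlo estimates the BER of BPSK with maximal-ratio combining for different array sizes:

```python
import numpy as np

rng = np.random.default_rng(4)

def ber_bpsk_mrc(snr_db, n_branches, n_bits=200_000):
    """Monte Carlo BER of BPSK with maximal-ratio combining (MRC) over
    i.i.d. flat Rayleigh branches (generic illustration only)."""
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n_bits)
    s = 2.0 * bits - 1.0
    shape = (n_bits, n_branches)
    h = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)
    noise = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2 * snr)
    r = h * s[:, None] + noise
    decision = np.sum(np.conj(h) * r, axis=1).real   # MRC combining
    return np.mean((decision > 0) != (bits == 1))

for L in (1, 2, 4):
    print(f"{L} antenna(s): BER at 10 dB = {ber_bpsk_mrc(10.0, L):.4f}")
```

Adding branches should show the BER dropping sharply at a fixed SNR, mirroring the trend the paper reports for larger receiver arrays.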
A multifaceted independent performance analysis of facial subspace recognition algorithms.
Bajwa, Usama Ijaz; Taj, Imtiaz Ahmad; Anwar, Muhammad Waqas; Wang, Xuan
2013-01-01
Face recognition has emerged as the fastest growing biometric technology and has expanded considerably in the last few years. Many new algorithms and commercial systems have been proposed and developed. Most of them use Principal Component Analysis (PCA) as a base for their techniques. Different and even conflicting results have been reported by researchers comparing these algorithms. The purpose of this study is to provide an independent comparative analysis, considering both performance and computational complexity, of six appearance-based face recognition algorithms, namely PCA, 2DPCA, A2DPCA, (2D)²PCA, LPP and 2DLPP, under equal working conditions. This study was motivated by the lack of unbiased comprehensive comparative analysis of some recent subspace methods with diverse distance metric combinations. For comparison with other studies, the FERET, ORL and YALE databases have been used, with evaluation criteria as in the FERET evaluations, which closely simulate real-life scenarios. A comparison of results with previous studies is performed and anomalies are reported. An important contribution of this study is that it presents the suitable performance conditions for each of the algorithms under consideration.
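A compact sketch of the baseline these methods share, PCA projection followed by nearest-neighbour matching in the subspace; the random arrays stand in for flattened face images from databases such as ORL:

```python
import numpy as np

def pca_subspace(train, n_components):
    """Eigenface-style PCA: rows of `train` are flattened face images.
    Returns the mean face and the top principal directions."""
    mean = train.mean(axis=0)
    centered = train - mean
    # SVD of the centered data; rows of vt are principal directions
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def project(faces, mean, components):
    return (faces - mean) @ components.T

rng = np.random.default_rng(5)
train = rng.normal(size=(40, 32 * 32))     # stand-in for 40 flattened face images
mean, comps = pca_subspace(train, n_components=10)
gallery = project(train, mean, comps)
probe = project(train[:1] + 0.1 * rng.normal(size=(1, 32 * 32)), mean, comps)
# Nearest neighbour in the subspace (Euclidean; the paper compares several metrics)
match = np.argmin(np.linalg.norm(gallery - probe, axis=1))
print(f"probe matched gallery image {match}")   # expected: 0
```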
Workplace-based assessment: raters' performance theories and constructs.
Govaerts, M J B; Van de Wiel, M W J; Schuwirth, L W T; Van der Vleuten, C P M; Muijtjens, A M M
2013-08-01
Weaknesses in the nature of rater judgments are generally considered to compromise the utility of workplace-based assessment (WBA). In order to gain insight into the underpinnings of rater behaviours, we investigated how raters form impressions of and make judgments on trainee performance. Using theoretical frameworks of social cognition and person perception, we explored raters' implicit performance theories, use of task-specific performance schemas and the formation of person schemas during WBA. We used think-aloud procedures and verbal protocol analysis to investigate schema-based processing by experienced (N = 18) and inexperienced (N = 16) raters (supervisor-raters in general practice residency training). Qualitative data analysis was used to explore schema content and usage. We quantitatively assessed rater idiosyncrasy in the use of performance schemas and we investigated effects of rater expertise on the use of (task-specific) performance schemas. Raters used different schemas in judging trainee performance. We developed a normative performance theory comprising seventeen inter-related performance dimensions. Levels of rater idiosyncrasy were substantial and unrelated to rater expertise. Experienced raters made significantly more use of task-specific performance schemas compared to inexperienced raters, suggesting more differentiated performance schemas in experienced raters. Most raters started to develop person schemas the moment they began to observe trainee performance. The findings further our understanding of processes underpinning judgment and decision making in WBA. Raters make and justify judgments based on personal theories and performance constructs. Raters' information processing seems to be affected by differences in rater expertise. The results of this study can help to improve rater training, the design of assessment instruments and decision making in WBA.
Analysis of Multiple-Impact Ballistic Performance of a Tempered Glass Laminate with a Strike Face Film
Magrini, Michael A.
2014-02-01
Interim technical report (AFCEC-CX-TY-TR-2014-0005), covering 3 JAN 2012 to 2 JAN 2013. The laminate tested consists of a 0.18 in thick polymer interlayer between two layers of 0.5 in tempered silica-based "soda lime" glass, with a 0.08 in shatter-resistant film applied to the strike face.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard, M.A.; Sommer, S.C.
1995-04-01
AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.
Boletti, A; Boffi, P; Martelli, P; Ferrario, M; Martinelli, M
2015-01-26
To face the increased demand for bandwidth, cost-effectiveness and simplicity in future Ethernet data communications, a comparison between two different solutions based on directly-modulated VCSEL sources and Silicon Photonics technologies is carried out. Also exploiting 4-PAM modulation, transmission at 50 Gb/s and beyond per channel is analyzed by means of BER performance. Applications to optical backplanes, very-short-reach links, client-optics networks, and intra- and inter-data-center communications (up to 10 km) are taken into account. A comparative analysis based on power consumption is also proposed.
Exploratory reconstructability analysis of accident TBI data
NASA Astrophysics Data System (ADS)
Zwick, Martin; Carney, Nancy; Nettleton, Rosemary
2018-02-01
This paper describes the use of reconstructability analysis to perform a secondary study of traumatic brain injury data from automobile accidents. Neutral searches were done and their results displayed with a hypergraph. Directed searches, using both variable-based and state-based models, were applied to predict performance on two cognitive tests and one neurological test. Very simple state-based models gave large uncertainty reductions for all three DVs and sizeable improvements in percent correct for the two cognitive test DVs which were equally sampled. Conditional probability distributions for these models are easily visualized with simple decision trees. Confounding variables and counter-intuitive findings are also reported.
Sun, Meng; Yan, Donghui; Yang, Xiaolu; Xue, Xingyang; Zhou, Sujuan; Liang, Shengwang; Wang, Shumei; Meng, Jiang
2017-05-01
Raw Arecae Semen, the seed of Areca catechu L., as well as Arecae Semen Tostum and Arecae semen carbonisata are traditionally processed by stir-baking for subsequent use in a variety of clinical applications. These three Arecae semen types, important Chinese herbal drugs, have been used in China and other Asian countries for thousands of years. In this study, the sensory technologies of a colorimeter and sensitive validated high-performance liquid chromatography with diode array detection were employed to discriminate raw Arecae semen and its processed drugs. The color parameters of the samples were determined by a colorimeter instrument CR-410. Moreover, the fingerprints of the four alkaloids of arecaidine, guvacine, arecoline and guvacoline were surveyed by high-performance liquid chromatography. Subsequently, Student's t test, the analysis of variance, fingerprint similarity analysis, hierarchical cluster analysis, principal component analysis, factor analysis and Pearson's correlation test were performed for final data analysis. The results obtained demonstrated a significant color change characteristic for components in raw Arecae semen and its processed drugs. Crude and processed Arecae semen could be determined based on colorimetry and high-performance liquid chromatography with a diode array detector coupled with chemometrics methods for a comprehensive quality evaluation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges; which include, defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where functionality to complete a mission is disseminated across multiple UAVs (distributed) opposed to being contained in a single UAV (monolithic). The case study based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
Vo, T D; Dwyer, G; Szeto, H H
1986-04-01
A relatively powerful and inexpensive microcomputer-based system for the spectral analysis of the EEG is presented. High resolution and speed are achieved with the use of recently available large-scale integrated circuit technology with enhanced functionality (the Intel 8087 math co-processor), which can perform transcendental functions rapidly. The versatility of the system is achieved with a hardware organization that has distributed data acquisition capability, performed by a microprocessor-based analog-to-digital converter with large resident memory (Cyborg ISAAC-2000). Compiled BASIC programs and assembly language subroutines perform, on-line or off-line, the fast Fourier transform and spectral analysis of the EEG, which is stored as soft as well as hard copy. Some results obtained from test application of the entire system in animal studies are presented.
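A modern one-epoch equivalent of the spectral computation described (the original ran compiled BASIC with assembly FFT routines on an 8087); the band edges are the conventional EEG bands, and the synthetic epoch is invented:

```python
import numpy as np

def eeg_band_powers(signal, fs):
    """Relative power in the classical EEG bands for one epoch, via the
    FFT periodogram of a Hann-windowed signal."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal * np.hanning(signal.size))) ** 2
    bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    total = psd[(freqs >= 0.5) & (freqs < 30)].sum()
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in bands.items()}

fs = 256
t = np.arange(0, 4, 1 / fs)
epoch = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.default_rng(6).standard_normal(t.size)
print({k: round(v, 3) for k, v in eeg_band_powers(epoch, fs).items()})
# alpha (8-13 Hz) should dominate for this synthetic 10 Hz rhythm
```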
Modeling and performance analysis of QoS data
NASA Astrophysics Data System (ADS)
Strzeciwilk, Dariusz; Zuberek, Włodzimierz M.
2016-09-01
The article presents the results of modeling and analysis of data transmission performance in systems that support quality of service. Models are designed and tested taking into account a multiservice network architecture, i.e. one supporting the transmission of data belonging to different traffic classes. Traffic-shaping mechanisms based on priority queuing were studied, with an integrated data source as well as various types of generated data. The basic problems of QoS-supporting architectures and queuing systems are discussed. Models based on Petri nets, supported by temporal logics, were designed and built. Simulation tools were used to verify the traffic-shaping mechanisms with the applied queuing algorithms. It is shown that temporal Petri net models can be effectively used in modeling and analyzing the performance of computer networks.
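Alongside the Petri-net models, the priority-queuing behavior itself can be sketched as a small discrete-event simulation; the two-class non-preemptive M/M/1 below is a generic illustration with invented rates, not the article's model:

```python
import heapq
import random

random.seed(7)

def priority_mm1(n_jobs=100_000, lam=0.9, mu=1.0, p_high=0.3):
    """Non-preemptive two-class priority M/M/1: waiting high-class jobs are
    always served before waiting low-class jobs. Returns mean wait per class."""
    t, arrivals = 0.0, []
    for _ in range(n_jobs):
        t += random.expovariate(lam)
        cls = 0 if random.random() < p_high else 1   # 0 = high priority
        arrivals.append((t, cls))
    waiting, waits = [], {0: [], 1: []}
    free_at, i = 0.0, 0
    while i < len(arrivals) or waiting:
        # admit every job that arrived by the time the server frees up;
        # if the queue is empty, admit the next arrival regardless
        while i < len(arrivals) and (not waiting or arrivals[i][0] <= free_at):
            heapq.heappush(waiting, (arrivals[i][1], arrivals[i][0]))
            i += 1
        cls, arr = heapq.heappop(waiting)
        start = max(free_at, arr)
        waits[cls].append(start - arr)
        free_at = start + random.expovariate(mu)
    return {c: sum(w) / len(w) for c, w in waits.items()}

print(priority_mm1())   # the high class should wait far less than the low class
```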
Study of helicopter roll control effectiveness criteria
NASA Technical Reports Server (NTRS)
Heffley, Robert K.; Bourne, Simon M.; Curtiss, Howard C., Jr.; Hindson, William S.; Hess, Ronald A.
1986-01-01
A study of helicopter roll control effectiveness based on closed-loop task performance measurement and modeling is presented. Roll control criteria are based on task margin, the excess of vehicle task performance capability over the pilot's task performance demand. Appropriate helicopter roll axis dynamic models are defined for use with analytic models for task performance. Both near-earth and up-and-away large-amplitude maneuvering phases are considered. The results of in-flight and moving-base simulation measurements are presented to support the roll control effectiveness criteria offered. This volume contains the theoretical analysis, simulation results and criteria development.
Cost/Benefit Analysis of Competing Patient Education Systems.
1977-10-28
The purpose of this study was to determine the best of three methods of administering patient education based on both cost and benefits. The two objectives were to perform a cost/benefit analysis (CBA) on the various approaches to administering patient education, and to make a recommendation based
NASA Astrophysics Data System (ADS)
Sun, Li; Wang, Deyu
2011-09-01
A new multi-level analysis method that introduces the super-element modeling method, derived from the multi-level analysis method first proposed by O. F. Hughes, is presented in this paper to solve the problem of the high time cost of adopting a rational-based optimal design method in ship structural design. The method is verified by its effective application to the optimization of the mid-ship section of a container ship. A full 3-D FEM model of a ship, subjected to static and quasi-static loads, was used as the analysis object for evaluating the structural performance of the mid-ship module, including static strength and buckling performance. Research results reveal that this new method can substantially reduce the computational cost of the rational-based optimization problem without decreasing its accuracy, which increases the feasibility and economic efficiency of using a rational-based optimal design method in ship structural design.
Distributed intelligent data analysis in diabetic patient management.
Bellazzi, R.; Larizza, C.; Riva, A.; Mira, A.; Fiocchi, S.; Stefanelli, M.
1996-01-01
This paper outlines the methodologies that can be used to perform an intelligent analysis of diabetic patients' data, realized in a distributed management context. We present a decision-support system architecture based on two modules, a Patient Unit and a Medical Unit, connected by telecommunication services. We stress the necessity to resort to temporal abstraction techniques, combined with time series analysis, in order to provide useful advice to patients; finally, we outline how data analysis and interpretation can be cooperatively performed by the two modules. PMID:8947655
Chung, Yun Won
2012-11-22
Location management, which consists of location registration and paging, is essential to provide mobile communication services to mobile stations (MSs). Since MSs riding on a public transportation system (TS) generate significant location registration signaling loads simultaneously when a TS with riding MSs moves between location areas (LAs), group location management was proposed. Under group location management, an MS performs group registration when it gets on a TS and performs group deregistration when it gets off a TS. Then, only the TS updates its current location when it changes LA, on behalf of all riding MSs. In this paper, movement-based group location management using radio frequency identification (RFID) is proposed, where the MS's getting-on and getting-off behaviors are detected using RFID, and a location update of the TS is carried out, on behalf of all riding MSs, only if the number of crossed cells from the last updated cell exceeds a predefined movement threshold. We then develop an analytical model for the performance analysis of the movement-based group location management and analyze the effects of various parameters on the performance. The results show that movement-based group location management has reduced signaling cost compared with movement-based individual location management, and optimal performance can be achieved by choosing appropriate movement threshold values.
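For intuition about the movement-threshold trade-off, here is a toy fluid-flow-style cost model of a movement-based scheme: larger thresholds mean fewer updates but a larger paging area. The cost constants and the update-rate approximation are illustrative assumptions, not the paper's analytical model:

```python
def location_cost(d, cmr, cost_update=10.0, cost_page=1.0):
    """Approximate signaling cost per call under movement threshold d and
    call-to-mobility ratio cmr: ~1/(cmr*d) updates per call, plus paging over
    the hexagonal cells within distance d-1 of the last registered cell,
    i.e. 3*d*d - 3*d + 1 cells."""
    updates = 1.0 / (cmr * d)
    paged_cells = 3 * d * d - 3 * d + 1
    return cost_update * updates + cost_page * paged_cells

cmr = 0.25
best = min(range(1, 11), key=lambda d: location_cost(d, cmr))
for d in (1, best, 10):
    print(f"d={d}: cost={location_cost(d, cmr):.1f}")
print(f"optimal movement threshold for CMR={cmr}: d={best}")
```

Sweeping d exposes the convex cost curve the paper optimizes: update cost falls with d while paging cost grows quadratically, so an interior optimum exists.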
ERIC Educational Resources Information Center
McNair, Robert C.
A Performance-Based Training (PBT) Qualification Guide/Checklist was developed that would enable a trainee to attain the skills, knowledge, and attitude required to operate the High Flux Beam Reactor at Brookhaven National Laboratory. Design of this guide/checklist was based on the Instructional System Design Model. The needs analysis identified…
2011-01-01
Background The computer-aided identification of specific gait patterns is an important issue in the assessment of Parkinson's disease (PD). In this study, a computer vision-based gait analysis approach is developed to assist the clinical assessments of PD with kernel-based principal component analysis (KPCA). Method Twelve PD patients and twelve healthy adults with no neurological history or motor disorders within the past six months were recruited and separated according to their "Non-PD", "Drug-On", and "Drug-Off" states. The participants were asked to wear light-colored clothing and perform three walking trials through a corridor decorated with a navy curtain at their natural pace. The participants' gait performance during the steady-state walking period was captured by a digital camera for gait analysis. The collected walking image frames were then transformed into binary silhouettes for noise reduction and compression. Using the developed KPCA-based method, the features within the binary silhouettes can be extracted to quantitatively determine the gait cycle time, stride length, walking velocity, and cadence. Results and Discussion The KPCA-based method uses a feature-extraction approach, which was verified to be more effective than traditional image area and principal component analysis (PCA) approaches in classifying "Non-PD" controls and "Drug-Off/On" PD patients. Encouragingly, this method has a high accuracy rate, 80.51%, for recognizing different gaits. Quantitative gait parameters are obtained, and the power spectrums of the patients' gaits are analyzed. We show that the slow and irregular actions of PD patients during walking tend to transfer some of the power from the main lobe frequency to a lower frequency band. Our results indicate the feasibility of using gait performance to evaluate the motor function of patients with PD. Conclusion This KPCA-based method requires only a digital camera and a decorated corridor setup. The ease of use and installation of the current method provides clinicians and researchers with a low-cost solution to monitor the progression of PD and its treatment. In summary, the proposed method provides an alternative way to perform gait analysis for patients with PD. PMID:22074315
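The pipeline above (binary silhouettes, KPCA features, classification) can be approximated in a few lines with scikit-learn. The sketch below is a minimal stand-in, not the authors' implementation: the silhouette data, kernel choice, and component count are all assumptions.

```python
# Minimal sketch of a KPCA-based gait feature pipeline on flattened binary
# silhouettes; data, labels, and hyperparameters are illustrative.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = (rng.random((48, 64 * 48)) > 0.5).astype(float)  # stand-in silhouettes, flattened
y = rng.integers(0, 2, 48)                           # stand-in labels: Non-PD vs PD

clf = make_pipeline(KernelPCA(n_components=10, kernel="rbf", gamma=1e-3),
                    SVC(kernel="linear"))
clf.fit(X[:36], y[:36])                              # train on 36 sequences
print("held-out accuracy:", clf.score(X[36:], y[36:]))
```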
A framework supporting the development of a Grid portal for analysis based on ROI.
Ichikawa, K; Date, S; Kaishima, T; Shimojo, S
2005-01-01
In our research on brain function analysis, users require two different simultaneous types of processing: interactive processing of a specific part of the data and high-performance batch processing of an entire dataset. The difference between these two types of processing is whether or not the analysis is for data in the region of interest (ROI). In this study, we propose a Grid portal that has a mechanism to freely assign computing resources to the users on a Grid environment according to the users' two different types of processing requirements. We constructed a Grid portal which integrates interactive processing and batch processing through the following two mechanisms. First, a job steering mechanism controls job execution based on user-tagged priority among organizations with heterogeneous computing resources; interactive jobs are processed in preference to batch jobs by this mechanism. Second, a priority-based result delivery mechanism administers a ranking of data significance. The portal ensures a turn-around time for interactive processing through the priority-based job controlling mechanism, and provides the users with quality of service (QoS) for interactive processing. The users can access the analysis results of interactive jobs in preference to the analysis results of batch jobs. The Grid portal has also achieved high-performance computation of MEG analysis with batch processing on the Grid environment. The priority-based job controlling mechanism makes it possible to freely assign computing resources according to the users' requirements. Furthermore, the achievement of high-performance computation contributes greatly to the overall progress of brain science. The portal has thus made it possible for the users to flexibly apply large computational power to what they want to analyze.
Determination of UAV pre-flight Checklist for flight test purpose using qualitative failure analysis
NASA Astrophysics Data System (ADS)
Hendarko; Indriyanto, T.; Syardianto; Maulana, F. A.
2018-05-01
Safety aspects are of paramount importance in flight, especially in the flight test phase. Before performing any flight test of either manned or unmanned aircraft, one should include pre-flight checklists as a required safety document in the flight test plan. This paper reports on the development of a new approach for the determination of pre-flight checklists for UAV flight tests based on the aircraft's failure analysis. Lapan's LSA (Light Surveillance Aircraft) is used as a study case, assuming this aircraft has been transformed into an unmanned version. Failure analysis is performed on the LSA using the fault tree analysis (FTA) method. The analysis is focused on the propulsion system and the flight control system, failure of which would lead to catastrophic events. The pre-flight checklist of the UAV is then constructed based on the basic causes obtained from the failure analysis.
Exploring JavaScript and ROOT technologies to create Web-based ATLAS analysis and monitoring tools
NASA Astrophysics Data System (ADS)
Sánchez Pineda, A.
2015-12-01
We explore the potential of current web applications to create online interfaces that allow the visualization, interaction and real cut-based physics analysis and monitoring of processes through a web browser. The project consists of the initial development of web-based and cloud computing services to allow students and researchers to perform fast and very useful cut-based analysis in a browser, reading and using real data and official Monte Carlo simulations stored in ATLAS computing facilities. Several tools are considered: ROOT, JavaScript and HTML. Our study case is the current cut-based H → ZZ → llqq analysis of the ATLAS experiment. Preliminary but satisfactory results have been obtained online.
Szulfer, Jarosław; Plenis, Alina; Bączek, Tomasz
2014-06-13
This paper focuses on the application of a column classification system based on the Katholieke Universiteit Leuven (KUL) approach for the characterization of the physicochemical properties of core-shell and ultra-high performance liquid chromatographic stationary phases, followed by verification of the reliability of the obtained column classification in pharmaceutical practice. In the study, 7 stationary phases produced in core-shell technology and 18 ultra-high performance liquid chromatographic columns were chromatographically tested, and ranking lists were built on the FKUL-values calculated against two selected reference columns. In the column performance test, an analysis of alfuzosin in the presence of related substances was carried out using the brands of stationary phases with the highest ranking positions. Next, a system suitability test as described by the European Pharmacopoeia monograph was performed. Moreover, a study was also performed to achieve a purposeful shortening of the analysis time of the compounds of interest using the selected stationary phases. Finally, it was checked whether methods using core-shell and ultra-high performance liquid chromatographic columns can be an interesting alternative to the high-performance liquid chromatographic method for the analysis of alfuzosin in pharmaceutical practice.
Fan fault diagnosis based on symmetrized dot pattern analysis and image matching
NASA Astrophysics Data System (ADS)
Xu, Xiaogang; Liu, Haixiao; Zhu, Hao; Wang, Songling
2016-07-01
To detect the mechanical failure of fans, a new diagnostic method based on the symmetrized dot pattern (SDP) analysis and image matching is proposed. Vibration signals of 13 kinds of running states are acquired on a centrifugal fan test bed and reconstructed by the SDP technique. The SDP pattern templates of each running state are established. An image matching method is performed to diagnose the fault. In order to improve the diagnostic accuracy, the single template, multiple templates and clustering fault templates are used to perform the image matching.
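The SDP transform itself is compact enough to sketch. The following is a minimal illustration assuming the common formulation (normalized amplitude as radius, lagged amplitude as angular offset, six mirrored arms); the lag and gain parameters are placeholders, not the values used in the paper.

```python
# Minimal sketch of the symmetrized dot pattern (SDP) transform: each sample
# maps to a polar point whose angle is offset by the lagged sample, mirrored
# over six symmetry arms. Lag l and gain g are illustrative.
import numpy as np

def sdp(x, l=1, g=np.radians(40)):
    x = np.asarray(x, dtype=float)
    r = (x - x.min()) / (x.max() - x.min())        # normalized radius
    theta = g * np.roll(r, -l)                     # lag-modulated angle offset
    pts = []
    for k in range(6):                             # six-fold mirror symmetry
        base = np.radians(60 * k)
        pts.append(np.column_stack([r, base + theta]))   # upper arm
        pts.append(np.column_stack([r, base - theta]))   # mirrored arm
    return np.vstack(pts)                          # (radius, angle) pairs to plot

signal = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.1 * np.random.randn(1000)
print(sdp(signal).shape)
```

Fault diagnosis then reduces to comparing the rendered pattern of a new signal against the per-state templates, e.g. by image correlation.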
Determining team cognition from delay analysis using cross recurrence plot.
Hajari, Nasim; Cheng, Irene; Bin Zheng; Basu, Anup
2016-08-01
Team cognition is an important factor in evaluating and determining team performance. Forming a team with good shared cognition is even more crucial for laparoscopic surgery applications. In this study, we analyzed the eye tracking data of two surgeons during a laparoscopic simulation operation, then performed Cross Recurrence Analysis (CRA) on the recorded data to study the delay behaviour of good performer and poor performer teams. Dual eye tracking data for twenty-two dyad teams were recorded during a laparoscopic task, and the teams were then divided into good performer and poor performer teams based on task times. Finally, we studied the delay between the two team members for good and poor performer teams. The results indicated that the good performer teams show a smaller delay compared with the poor performer teams. This finding is consistent with gaze overlap analysis between team members and is therefore good evidence of shared cognition between team members.
NASA Astrophysics Data System (ADS)
Zhang, Zhu; Li, Hongbin; Tang, Dengping; Hu, Chen; Jiao, Yang
2017-10-01
Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for key equipment in a smart substation, which needs on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line based on cyber-physics correlation analysis. Exploiting the electrical and physical properties of a substation operating in three-phase symmetry, the principal component analysis method is used to separate the metering deviation caused by primary fluctuation from that caused by an EVT anomaly. The characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing the change in these statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method demonstrates accurate on-line evaluation of the metering performance of an EVT without a standard voltage transformer.
Azadeh, Ali; Sheikhalishahi, Mohammad
2015-06-01
A unique framework for performance optimization of generation companies (GENCOs) based on health, safety, environment, and ergonomics (HSEE) indicators is presented. To rank this sector of industry, a combination of data envelopment analysis (DEA), principal component analysis (PCA), and the Taguchi method is used for all branches of GENCOs. These methods are applied in an integrated manner to measure the performance of GENCOs. The preferred model among DEA, PCA, and Taguchi is selected based on sensitivity analysis and maximum correlation between rankings. To achieve the stated objectives, noise is introduced into the input data. The results show that Taguchi outperforms the other methods. Moreover, a comprehensive experiment is carried out to identify the most influential factor for ranking GENCOs. The approach developed in this study could be used for continuous assessment and improvement of GENCOs' performance in supplying energy with respect to HSEE factors. The results of such studies would help managers to better understand weak and strong points in terms of HSEE factors.
NASA Technical Reports Server (NTRS)
Stoll, John C.
1995-01-01
The performance of an unaided attitude determination system based on GPS interferometry is examined using linear covariance analysis. The modelled system includes four GPS antennae onboard a gravity gradient stabilized spacecraft, specifically the Air Force's RADCAL satellite. The principal error sources are identified and modelled. The optimal system's sensitivities to these error sources are examined through an error budget and by varying system parameters. The effects of two satellite selection algorithms, Geometric and Attitude Dilution of Precision (GDOP and ADOP, respectively) are examined. The attitude performance of two optimal-suboptimal filters is also presented. Based on this analysis, the limiting factors in attitude accuracy are the knowledge of the relative antenna locations, the electrical path lengths from the antennae to the receiver, and the multipath environment. The performance of the system is found to be fairly insensitive to torque errors, orbital inclination, and the two satellite geometry figures-of-merit tested.
2015-09-01
[Acronym-list fragment: ...agreement; PBL = performance-based logistics; PCO = procuring contracting officer; PHS&T = packaging, handling, storage & transportation; PM = program manager] ...from which a program manager must decide, with the assistance of the program's Procurement Contracting Officer (PCO). As one of the key tenets of PBL, a
Inequality in Academic Performance and Juvenile Convictions: An Area-Based Analysis
ERIC Educational Resources Information Center
Sabates, Ricardo; Feinstein, Leon; Shingal, Anirudh
2011-01-01
This paper focuses on the links between inequality in academic performance and juvenile conviction rates for violent crime, stealing from another person, burglary in a dwelling and racially motivated offences. We use area-based aggregate data to model this relationship. Our results show that, above and beyond impacts of absolute access to…
An Analysis of High School Students' Perceptions and Academic Performance in Laboratory Experiences
ERIC Educational Resources Information Center
Mirchin, Robert Douglas
2012-01-01
This research study is an investigation of student-laboratory (i.e., lab) learning based on students' perceptions of experiences using questionnaire data and evidence of their science-laboratory performance based on paper-and-pencil assessments using Maryland-mandated criteria, Montgomery County Public Schools (MCPS) criteria, and published…
Improving the performance of a filling line based on simulation
NASA Astrophysics Data System (ADS)
Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.
2016-08-01
The paper describes a method of improving the performance of a filling line based on simulation. This study concerns a production line located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by a maintenance data acquisition system. Two types of failures were identified in the system and were approximated using continuous statistical distributions. The model was validated taking into consideration line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation. The outcomes of the simulations formed the basis of a financial analysis. NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.
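The financial step described above reduces to a discounted cash-flow calculation. A minimal sketch follows, with entirely hypothetical cash flows, tax (CIT), inflation, and discount figures; it shows only the shape of the NPV/ROI computation, not the study's data.

```python
# Minimal sketch of the NPV/ROI screening of an improvement scenario; all
# figures below are placeholders, not the study's values.

def npv(cashflows, discount_rate):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cashflows))

investment = 120_000.0                       # cost of the improvement scenario
yearly_gain = 45_000.0                       # extra profit from higher line output
cit, inflation = 0.19, 0.02                  # illustrative tax and inflation rates
real_gain = yearly_gain * (1 - cit)
flows = [-investment] + [real_gain] * 5      # 5-year horizon
rate = 0.08 + inflation                      # nominal discount rate

print("NPV:", round(npv(flows, rate), 2))
print("ROI:", round(sum(flows[1:]) / investment, 2))
```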
Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time-to-Event Analysis.
Gong, Xiajing; Hu, Meng; Zhao, Liang
2018-05-01
Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function.
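The evaluation loop described above can be illustrated with the lifelines package: fit a Cox model to simulated proportional-hazards data and score it with the concordance index on a held-out split. The data generation and split are assumptions; the paper's ML competitors are not reproduced here.

```python
# Minimal sketch of the Cox baseline + concordance-index evaluation on
# simulated time-to-event data; the generating model is illustrative.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(1)
n = 500
x1, x2 = rng.normal(size=n), rng.normal(size=n)
hazard = np.exp(0.8 * x1 - 0.5 * x2)
T = rng.exponential(1.0 / hazard)            # event times from a PH model
C = rng.exponential(2.0, size=n)             # independent censoring times
df = pd.DataFrame({"x1": x1, "x2": x2,
                   "T": np.minimum(T, C), "E": (T <= C).astype(int)})

train, test = df.iloc[:350], df.iloc[350:]
cph = CoxPHFitter().fit(train, duration_col="T", event_col="E")
risk = cph.predict_partial_hazard(test)      # higher risk -> shorter survival
print("c-index:", concordance_index(test["T"], -risk, test["E"]))
```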
Performance criteria for emergency medicine residents: a job analysis.
Blouin, Danielle; Dagnone, Jeffrey Damon
2008-11-01
A major role of admission interviews is to assess a candidate's suitability for a residency program. Structured interviews have greater reliability and validity than do unstructured ones. The development of content for a structured interview is typically based on the dimensions of performance that are perceived as important to succeed in a particular line of work. A formal job analysis is normally conducted to determine these dimensions. The dimensions essential to succeed as an emergency medicine (EM) resident have not yet been studied. We aimed to analyze the work of EM residents to determine these essential dimensions. The "critical incident technique" was used to generate scenarios of poor and excellent resident performance. Two reviewers independently read each scenario and labelled the performance dimensions that were reflected in each. All labels assigned to a particular scenario were pooled and reviewed again until a consensus was reached. Five faculty members (25% of our total faculty) comprised the subject experts. Fifty-one incidents were generated and 50 different labels were applied. Eleven dimensions of performance applied to at least 5 incidents. "Professionalism" was the most valued performance dimension, represented in 56% of the incidents, followed by "self-confidence" (22%), "experience" (20%) and "knowledge" (20%). "Professionalism," "self-confidence," "experience" and "knowledge" were identified as the performance dimensions essential to succeed as an EM resident based on our formal job analysis using the critical incident technique. Performing a formal job analysis may assist training program directors with developing admission interviews.
Thermodynamic Analysis of Dual-Mode Scramjet Engine Operation and Performance
NASA Technical Reports Server (NTRS)
Riggins, David; Tacket, Regan; Taylor, Trent; Auslender, Aaron
2006-01-01
Recent analytical advances in understanding the performance continuum (the thermodynamic spectrum) for air-breathing engines based on fundamental second-law considerations have clarified scramjet and ramjet operation, performance, and characteristics. Second-law based analysis is extended specifically in this work to clarify and describe the performance characteristics for dual-mode scramjet operation in the mid-speed range of flight Mach 4 to 7. This is done by a fundamental investigation of the complex but predictable interplay between heat release and irreversibilities in such an engine; results demonstrate the flow and performance character of the dual mode regime and of dual mode transition behavior. Both analytical and computational (multi-dimensional CFD) studies of sample dual-mode flow-fields are performed in order to demonstrate the second-law capability and performance and operability issues. The impact of the dual-mode regime is found to be characterized by decreasing overall irreversibility with increasing heat release, within the operability limits of the system.
Bounthavong, Mark; Pruitt, Larry D; Smolenski, Derek J; Gahm, Gregory A; Bansal, Aasthaa; Hansen, Ryan N
2018-02-01
Introduction Home-based telebehavioural healthcare improves access to mental health care for patients restricted by travel burden. However, there is limited evidence assessing the economic value of home-based telebehavioural health care compared to in-person care. We sought to compare the economic impact of home-based telebehavioural health care and in-person care for depression among current and former US service members. Methods We performed trial-based cost-minimisation and cost-utility analyses to assess the economic impact of home-based telebehavioural health care versus in-person behavioural care for depression. Our analyses focused on the payer perspective (Department of Defense and Department of Veterans Affairs) at three months. We also performed a scenario analysis where all patients possessed video-conferencing technology that was approved by these agencies. The cost-utility analysis evaluated the impact of different depression categories on the incremental cost-effectiveness ratio. One-way and probabilistic sensitivity analyses were performed to test the robustness of the model assumptions. Results In the base case analysis the total direct cost of home-based telebehavioural health care was higher than in-person care (US$71,974 versus US$20,322). Assuming that patients possessed government-approved video-conferencing technology, home-based telebehavioural health care was less costly compared to in-person care (US$19,177 versus US$20,322). In one-way sensitivity analyses, the proportion of patients possessing personal computers was a major driver of direct costs. In the cost-utility analysis, home-based telebehavioural health care was dominant when patients possessed video-conferencing technology. Results from probabilistic sensitivity analyses did not differ substantially from base case results. Discussion Home-based telebehavioural health care is dependent on the cost of supplying video-conferencing technology to patients but offers the opportunity to increase access to care. Health-care policies centred on implementation of home-based telebehavioural health care should ensure that these technologies are able to be successfully deployed on patients' existing technology.
NASA Astrophysics Data System (ADS)
Latif, C.; Negara, V. S. I.; Wongtepa, W.; Thamatkeng, P.; Zainuri, M.; Pratapa, S.
2018-03-01
XANES analysis has been performed with the aim of determining the Fe oxidation state in a synthesized LiFePO4 and its base materials. XANES measurements were performed at SLRI at energies around the Fe K-edge. An XRD analysis has also been performed to determine the phase composition, lattice parameters and crystallite size of the LiFePO4 as well as of the base materials. From the XRD analysis, it was found that the dominant phase in the iron sand sample was Fe3O4 and the only phase found after calcination was LiFePO4. The latter phase exhibited a crystallite size of 100 nm and lattice parameters a = 10.169916 Å, b = 5.919674 Å, c = 4.627893 Å. Qualitative analysis of the XANES data revealed that the oxidation number of Fe in the sample before calcination was greater than that after calcination and that of Fe in the natural iron sand, as indicated by E0 values of 7129.2 eV, 7120.6 eV and 7124.4 eV, respectively.
Kanada, Yoshikiyo; Sakurai, Hiroaki; Sugiura, Yoshito; Arai, Tomoaki; Koyama, Soichiro; Tanabe, Shigeo
2017-11-01
[Purpose] To create a regression formula in order to estimate 1RM for knee extensors, based on the maximal isometric muscle strength measured using a hand-held dynamometer and data regarding the body composition. [Subjects and Methods] Measurement was performed in 21 healthy males in their twenties to thirties. Single regression analysis was performed, with measurement values representing 1RM and the maximal isometric muscle strength as dependent and independent variables, respectively. Furthermore, multiple regression analysis was performed, with data regarding the body composition incorporated as another independent variable, in addition to the maximal isometric muscle strength. [Results] Through single regression analysis with the maximal isometric muscle strength as an independent variable, the following regression formula was created: 1RM (kg)=0.714 + 0.783 × maximal isometric muscle strength (kgf). On multiple regression analysis, only the total muscle mass was extracted. [Conclusion] A highly accurate regression formula to estimate 1RM was created based on both the maximal isometric muscle strength and body composition. Using a hand-held dynamometer and body composition analyzer, it was possible to measure these items in a short time, and obtain clinically useful results.
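The reported single-regression estimate translates directly into code. The helper below simply transcribes the published formula; the sample inputs are hypothetical.

```python
# Direct transcription of the single-regression estimate reported above:
# 1RM (kg) = 0.714 + 0.783 x maximal isometric strength (kgf). The multiple
# regression variant (adding total muscle mass) is not reproduced here.

def estimate_1rm(isometric_kgf: float) -> float:
    """Estimate knee-extensor 1RM (kg) from hand-held dynamometer output."""
    return 0.714 + 0.783 * isometric_kgf

for f in (20.0, 30.0, 40.0):                 # illustrative dynamometer readings
    print(f"{f:.0f} kgf -> {estimate_1rm(f):.1f} kg")
```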
1988-01-19
approach for the analysis of aerial images. In this approach image analysis is performed at three levels of abstraction, namely iconic or low-level... image analysis, symbolic or medium-level image analysis, and semantic or high-level image analysis. Domain-dependent knowledge about prototypical urban
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahrens, James P; Patchett, John M; Lo, Li - Ta
2011-01-24
This report provides documentation for the completion of the Los Alamos portion of the ASC Level II 'Visualization on the Supercomputing Platform' milestone. This ASC Level II milestone is a joint milestone between Sandia National Laboratory and Los Alamos National Laboratory. The milestone text is shown in Figure 1 with the Los Alamos portions highlighted in boldfaced text. Visualization and analysis of petascale data is limited by several factors which must be addressed as ACES delivers the Cielo platform. Two primary difficulties are: (1) Performance of interactive rendering, which is the most computationally intensive portion of the visualization process. For terascale platforms, commodity clusters with graphics processors (GPUs) have been used for interactive rendering. For petascale platforms, visualization and rendering may be able to run efficiently on the supercomputer platform itself. (2) I/O bandwidth, which limits how much information can be written to disk. If we simply analyze the sparse information that is saved to disk we miss the opportunity to analyze the rich information produced every timestep by the simulation. For the first issue, we are pursuing in-situ analysis, in which simulations are coupled directly with analysis libraries at runtime. This milestone will evaluate the visualization and rendering performance of current and next generation supercomputers in contrast to GPU-based visualization clusters, and evaluate the performance of common analysis libraries coupled with the simulation that analyze and write data to disk during a running simulation. This milestone will explore, evaluate and advance the maturity level of these technologies and their applicability to problems of interest to the ASC program. In conclusion, we improved CPU-based rendering performance by a factor of 2-10 times in our tests. In addition, we evaluated CPU- and GPU-based rendering performance. We encourage production visualization experts to consider using CPU-based rendering solutions when appropriate. For example, on remote supercomputers CPU-based rendering can offer a means of viewing data without having to offload the data or geometry onto a GPU-based visualization system. In terms of the comparative performance of the CPU and GPU, we believe that further optimizations of both CPU- and GPU-based rendering are possible. The simulation community is currently confronting this reality as they work to port their simulations to different hardware architectures. What is interesting about CPU rendering of massive datasets is that for the past two decades GPU performance has significantly outperformed CPU-based systems. Based on our advancements, evaluations and explorations we believe that CPU-based rendering has returned as one viable option for the visualization of massive datasets.
Parallel-vector solution of large-scale structural analysis problems on supercomputers
NASA Technical Reports Server (NTRS)
Storaasli, Olaf O.; Nguyen, Duc T.; Agarwal, Tarun K.
1989-01-01
A direct linear equation solution method based on the Choleski factorization procedure is presented which exploits both parallel and vector features of supercomputers. The new equation solver is described, and its performance is evaluated by solving structural analysis problems on three high-performance computers. The method has been implemented using Force, a generic parallel FORTRAN language.
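The core numerical step, a Choleski factorization followed by triangular solves, can be sketched with SciPy in place of the paper's Force/FORTRAN implementation. The stiffness matrix and load vector below are random stand-ins.

```python
# Minimal sketch of a Choleski-factorization solve, the kernel of the method
# above; the SPD "stiffness" matrix and load vector are synthetic stand-ins.
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(0)
M = rng.normal(size=(500, 500))
K = M @ M.T + 500 * np.eye(500)     # stand-in symmetric positive definite matrix
f = rng.normal(size=500)            # stand-in load vector

c, low = cho_factor(K)              # K = L L^T (one-time factorization)
u = cho_solve((c, low), f)          # displacements for this load case
print("residual:", np.linalg.norm(K @ u - f))
```

The factorization is done once; additional load cases reuse it with cheap triangular solves, which is what makes the approach attractive on vector hardware.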
ERIC Educational Resources Information Center
Ibourk, Aomar
2013-01-01
Based on data from international surveys measuring learning (TIMSS), this article focuses on the analysis of the academic performance of Moroccan students. The results of the econometric model show that the students' characteristics, their family environment and school context are key determinants of these performances. The study also shows that the…
The Distressed Brain: A Group Blind Source Separation Analysis on Tinnitus
De Ridder, Dirk; Vanneste, Sven; Congedo, Marco
2011-01-01
Background Tinnitus, the perception of a sound without an external sound source, can lead to variable amounts of distress. Methodology In a group of tinnitus patients with variable amounts of tinnitus-related distress, as measured by the Tinnitus Questionnaire (TQ), electroencephalography (EEG) is performed to evaluate the patients' resting-state electrical brain activity. This resting-state electrical activity is compared with that of a control group and between patients with low (N = 30) and high distress (N = 25). The groups are homogeneous for tinnitus type, tinnitus duration and tinnitus laterality. A group blind source separation (BSS) analysis is performed using a large normative sample (N = 84), generating seven normative components to which high- and low-distress tinnitus patients are compared. A correlation analysis of the obtained normative components' relative power and distress is performed. Furthermore, the functional connectivity, as reflected by lagged phase synchronization, is analyzed between the brain areas defined by the components. Finally, a group BSS analysis on the tinnitus group as a whole is performed. Conclusions Tinnitus can be characterized by at least four BSS components, two of which are posterior cingulate based, one based on the subgenual anterior cingulate and one based on the parahippocampus. Only the subgenual component correlates with distress. When performed on a normative sample, group BSS reveals that distress is characterized by two anterior cingulate based components. Spectral analysis of these components demonstrates that distress in tinnitus is related to alpha and beta changes in a network consisting of the subgenual anterior cingulate cortex extending to the pregenual and dorsal anterior cingulate cortex as well as the ventromedial prefrontal cortex/orbitofrontal cortex, insula, and parahippocampus. This network overlaps partially with brain areas implicated in distress in patients suffering from pain, functional somatic syndromes and posttraumatic stress disorder, and might therefore represent a specific distress network. PMID:21998628
Use of power analysis to develop detectable significance criteria for sea urchin toxicity tests
Carr, R.S.; Biedenbach, J.M.
1999-01-01
When sufficient data are available, the statistical power of a test can be determined using power analysis procedures. The term “detectable significance” has been coined to refer to this criterion based on power analysis and past performance of a test. This power analysis procedure has been performed with sea urchin (Arbacia punctulata) fertilization and embryological development data from sediment porewater toxicity tests. Data from 3100 and 2295 tests for the fertilization and embryological development tests, respectively, were used to calculate the criteria and regression equations describing the power curves. Using Dunnett's test, a minimum significant difference (MSD) (β = 0.05) of 15.5% and 19% for the fertilization test, and 16.4% and 20.6% for the embryological development test, for α ≤ 0.05 and α ≤ 0.01, respectively, were determined. The use of this second criterion reduces type I (false positive) errors and helps to establish a critical level of difference based on the past performance of the test.
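The idea of a power-derived criterion can be sketched with statsmodels, using a plain two-sample t-test in place of Dunnett's test; the sample size and historical standard deviation below are hypothetical, so the resulting MSD is illustrative only.

```python
# Minimal sketch of deriving a "detectable significance" criterion from past
# test performance. Uses a two-sample t-test power calculation as a
# simplification of the paper's Dunnett-based procedure.
from statsmodels.stats.power import TTestIndPower

alpha, power = 0.05, 0.95            # power = 1 - beta, with beta = 0.05
n_per_group, sd = 5, 12.0            # replicates per treatment, historical SD (%)

# Solve for the standardized effect size detectable at this n, alpha, power.
d = TTestIndPower().solve_power(nobs1=n_per_group, alpha=alpha,
                                power=power, ratio=1.0)
msd = d * sd                         # convert effect size to percentage points
print(f"MSD ~= {msd:.1f} percentage points of fertilization")
```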
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maslenikov, O.R.; Mraz, M.J.; Johnson, J.J.
1986-03-01
This report documents the seismic analyses performed by SMA for the MFTF-B Axicell vacuum vessel. In the course of this study we performed response spectrum analyses, CLASSI fixed-base analyses, and SSI analyses that included interaction effects between the vessel and vault. The response spectrum analysis served to benchmark certain modeling differences between the LLNL and SMA versions of the vessel model. The fixed-base analysis benchmarked the differences between analysis techniques. The SSI analyses provided our best estimate of vessel response to the postulated seismic excitation for the MFTF-B facility, and included consideration of uncertainties in soil properties by calculating response for a range of soil shear moduli. Our results are presented in this report as tables of comparisons of specific member forces from our analyses and the analyses performed by LLNL. Also presented are tables of maximum accelerations and relative displacements and plots of response spectra at various selected locations.
Analysis on mechanics response of long-life asphalt pavement at moist hot heavy loading area
NASA Astrophysics Data System (ADS)
Xu, Xinquan; Li, Hao; Wu, Chuanhai; Li, Shanqiang
2018-04-01
Based on a durability test road of semi-rigid base asphalt pavement on the Guangdong Yunluo expressway, the mechanical responses of a modified semi-rigid base, an RCC base and an inverted semi-rigid base are compared in the continuous contact state; a four-unit, five-parameter model is used to evaluate the rut depth of each asphalt pavement structure, and commonly used fatigue life prediction models are used to evaluate the fatigue performance of the three types of asphalt pavement structure. Theoretical calculations and four years of tracking observations of the test road show that the modified semi-rigid base asphalt pavement has the smallest rut depth, the best road performance, and the optimal fatigue performance.
Feasibility analysis of base compaction specification : [project brief].
DOT National Transportation Integrated Search
2013-02-01
Appropriate design and construction of the aggregate base layer has significant influence on : structural stability and performance of pavements. Controlling the construction quality of : the granular base layer is important to achieve long-lasting p...
A framework for graph-based synthesis, analysis, and visualization of HPC cluster job data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayo, Jackson R.; Kegelmeyer, W. Philip, Jr.; Wong, Matthew H.
The monitoring and system analysis of high performance computing (HPC) clusters is of increasing importance to the HPC community. Analysis of HPC job data can be used to characterize system usage and diagnose and examine failure modes and their effects. This analysis is not straightforward, however, due to the complex relationships that exist between jobs. These relationships are based on a number of factors, including shared compute nodes between jobs, proximity of jobs in time, etc. Graph-based techniques represent an approach that is particularly well suited to this problem, and provide an effective technique for discovering important relationships in job queuing and execution data. The efficacy of these techniques is rooted in the use of a semantic graph as a knowledge representation tool. In a semantic graph, job data, represented in a combination of numerical and textual forms, can be flexibly processed into edges, with corresponding weights, expressing relationships between jobs, nodes, users, and other relevant entities. This graph-based representation permits formal manipulation by a number of analysis algorithms. This report presents a methodology and software implementation that leverages semantic graph-based techniques for the system-level monitoring and analysis of HPC clusters based on job queuing and execution data. Ontology development and graph synthesis is discussed with respect to the domain of HPC job data. The framework developed automates the synthesis of graphs from a database of job information. It also provides a front end, enabling visualization of the synthesized graphs. Additionally, an analysis engine is incorporated that provides performance analysis, graph-based clustering, and failure prediction capabilities for HPC systems.
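A minimal version of the graph-synthesis step might look like the following, where edge weights combine shared compute nodes with temporal proximity; the job records and the weighting formula are assumptions, not the framework's ontology.

```python
# Minimal sketch of synthesizing a job graph from queue records: jobs become
# nodes, and edges are weighted by shared compute nodes and time proximity.
import itertools
import networkx as nx

jobs = {                                  # job id -> (compute nodes, start hour)
    "j1": ({"n01", "n02", "n03"}, 0.0),
    "j2": ({"n02", "n03"},        1.5),
    "j3": ({"n07"},               2.0),
}

G = nx.Graph()
G.add_nodes_from(jobs)
for (a, (na, ta)), (b, (nb, tb)) in itertools.combinations(jobs.items(), 2):
    shared = len(na & nb)
    if shared:                            # relate jobs that co-ran on hardware
        G.add_edge(a, b, weight=shared / (1.0 + abs(ta - tb)))

print(G.edges(data=True))                 # input to clustering / failure analysis
```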
Bhaduri, Anirban; Ghosh, Dipak
2016-01-01
The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques, viz. multi-fractal detrended fluctuation analysis and visibility network analysis. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation, from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation, supported by quantitative parameters. The results also provide preliminary evidence that these techniques can be used as a measure of the physiological impact on subjects performing meditation. PMID:26909045
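Of the two techniques, the visibility network construction is simple enough to sketch directly: each heart-rate sample becomes a node, and two samples are linked when the straight line between them clears every intermediate sample. The series below is synthetic; this is the generic natural-visibility algorithm, not the authors' exact pipeline.

```python
# Minimal O(n^2) sketch of the natural visibility graph for a heart-rate
# series; the input series is synthetic.
import numpy as np
import networkx as nx

def visibility_graph(x):
    x = np.asarray(x, dtype=float)
    G = nx.Graph()
    G.add_nodes_from(range(len(x)))
    for i in range(len(x) - 1):
        for j in range(i + 1, len(x)):
            # height of the i-j sight line at every intermediate index
            line = x[j] + (x[i] - x[j]) * (j - np.arange(i + 1, j)) / (j - i)
            if np.all(x[i + 1:j] < line):
                G.add_edge(i, j)
    return G

hr = 60 + 5 * np.sin(np.linspace(0, 6 * np.pi, 200))   # stand-in HR series
G = visibility_graph(hr)
print("mean degree:", 2 * G.number_of_edges() / G.number_of_nodes())
```

Network statistics such as the degree distribution then serve as the quantitative parameters compared across meditation and non-meditation segments.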
An MS-DOS-based program for analyzing plutonium gamma-ray spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruhter, W.D.; Buckley, W.M.
1989-09-07
A plutonium gamma-ray analysis system that operates on MS-DOS-based computers has been developed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra for plutonium isotopics. The program, titled IAEAPU, consists of three separate applications: a data-transfer application for transferring spectral data from a CICERO multichannel analyzer to a binary data file, a data-analysis application to analyze plutonium gamma-ray spectra for plutonium isotopic ratios and weight percents of total plutonium, and a data-quality assurance application to check spectral data for proper data-acquisition setup and performance. Volume 3 contains the software listings for these applications.
Diana, Barbara; Zurloni, Valentino; Elia, Massimiliano; Cavalera, Cesare M; Jonsson, Gudberg K; Anguera, M Teresa
2017-01-01
The influence of game location on performance has been widely examined in sport contexts. Concerning soccer, game location positively affects the secondary and tertiary levels of performance; however, there is less evidence about its effect on game structure (the primary level of performance). This study aimed to detect the effect of game location on the primary level of performance in soccer. In particular, the objective was to reveal the hidden structures underlying the attack actions in both home and away matches played by a top club (Serie A 2012/2013, First Leg). The methodological approach was based on systematic observation, supported by digital recordings and T-pattern analysis. Data were analyzed with the THEME 6.0 software. A quantitative analysis, with the nonparametric Mann-Whitney test and descriptive statistics, was carried out to test the hypotheses. A qualitative analysis of complex patterns was performed to get in-depth information on the game structure. This study showed that game tactics were significantly different, with home matches characterized by a more structured and varied game than away matches. In particular, a higher number of different patterns, with a higher level of complexity and including more unique behaviors, was detected in home matches than in away ones. No significant differences were found in the number of events coded per game between the two conditions. The THEME software, and the corresponding T-pattern detection algorithm, enhance research opportunities by going further than frequency-based analyses, making this method an effective tool for supporting sport performance analysis and training.
Park, Seejeen; Berry, Frances S
2013-09-01
Municipal solid waste (MSW) recycling performance, both nationally and in Florida, USA, has shown little improvement during the past decade. This research examines variations in MSW recycling program performance in Florida counties in an attempt to identify effective recycling programs. After reviewing trends in the MSW management literature, we conducted an empirical analysis using cross-sectional multiple regression analysis. The findings show that the convenience-based hypothesis was supported, with curbside recycling having a positive effect on MSW recycling performance. Financial (cost-saving) incentive-based hypotheses were partially supported, meaning that individual-level incentives can influence recycling performance. Citizen environmental concern was found to positively affect the amount of county recycling, while education and political affiliation yielded no significant results. In conclusion, this article discusses the implications of the findings for both academic research and the practice of MSW recycling programs.
Greghi, F M; Rossi, N T; Souza, G B J; Menegon, L N
2012-01-01
Comfort is an issue that has gained relevance within the aeronautical industry due to the need of manufacturers and airline companies to differentiate themselves in an increasingly competitive market. This study's aim is to analyze the comfort/discomfort of passengers, based on the analysis of the activities performed in aircraft cabins during real flights, in order to create ergonomics requirements and a methodology for comfort analysis. The study was performed during domestic commercial flights, and the adopted data collection techniques were: the application of 219 questionnaires to passengers, 44 recordings of postures and actions through filming, and 12 semi-structured interviews. The method made possible the reconstruction of the user's course of action in performing activities in real flight situations, and the calculation of the area occupied by the passenger during his or her actions. The integrated analysis of the results corroborates data from previous studies in which both the space made available to each passenger and the activity performed interfere with the perception of comfort. From this study it is concluded that the method constitutes an innovative tool within the process of designing aircraft cabins, enabling the calculation of the action space based on the reconstructed course of action.
Comparisons of non-Gaussian statistical models in DNA methylation analysis.
Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun
2014-06-16
As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
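The advantage of bounded-support modeling can be illustrated by comparing a Beta fit with a Gaussian fit on simulated beta-values; the two-mode data below are synthetic, and the paper's mixture-model clustering is not reproduced.

```python
# Minimal sketch of why bounded-support models suit methylation beta-values:
# compare Beta vs Gaussian fits by log-likelihood on simulated data in (0, 1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
m = np.concatenate([rng.beta(2, 18, 400),      # unmethylated-like mode
                    rng.beta(18, 2, 100)])     # methylated-like mode

a, b, loc, scale = stats.beta.fit(m, floc=0, fscale=1)   # bounded model
mu, sd = stats.norm.fit(m)                               # unbounded model

ll_beta = stats.beta.logpdf(m, a, b, loc, scale).sum()
ll_norm = stats.norm.logpdf(m, mu, sd).sum()
print(f"logL beta {ll_beta:.1f} vs normal {ll_norm:.1f}")
```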
Murray, Timothy G; Tornambe, Paul; Dugel, Pravin; Tong, Kuo Bianchini
2011-01-01
Background The purpose of this study is to report the use of activity-based cost analysis to identify areas of practice efficiencies and inefficiencies within a large academic retinal center and a small single-specialty group. This analysis establishes a framework for evaluating rapidly shifting clinical practices (anti-vascular endothelial growth factor therapy, microincisional vitrectomy surgery) and incorporating changing reimbursements for care delivery (intravitreal injections, optical coherence tomography [OCT]) to determine the impact on practice profitability. Pro forma modeling targeted the impact of declining reimbursement for OCT imaging and intravitreal injection using a strategy that incorporates activity-based cost analysis into a direct evaluation schema for clinical operations management. Methods Activity-based costing analyses were performed at two different types of retinal practices in the US, ie, a small single-specialty group practice and an academic hospital-based practice (Bascom Palmer Eye Institute). Retrospective claims data were utilized to identify all procedures performed and billed, submitted charges, allowed charges, and net collections from each of these two practices for the calendar years 2005–2006 and 2007–2008. A pro forma analysis utilizing current reimbursement profiles was performed to determine the impact of altered reimbursement on practice profitability. All analyses were performed by a third party consulting firm. Results The small single-specialty group practice outperformed the academic hospital-based practice on almost all markers of efficiency. In the academic hospital-based practice, only four service lines were profitable, ie, nonlaser surgery, laser surgery, non-OCT diagnostics, and injections. Profit margin varied from 62% for nonlaser surgery to 1% for intravitreal injections. Largest negative profit contributions were associated with office visits and OCT imaging. Conclusion Activity-based cost analysis is a powerful tool to evaluate retinal practice efficiencies. These two distinct practices were able to provide significant increases in clinical care (office visits, ophthalmic imaging, and patient procedures) through maintaining efficiencies of care. Pro forma analysis of 2011 data noted that OCT payments to facilities and physicians continue to decrease dramatically and that this payment decrease further reduced the profitability for the two largest aspects of these retinal practices, ie, intravitreal injections and OCT retinal imaging. Ultimately, all retinal practices are at risk for significant shifts in financial health related to rapidly evolving changes in patterns of care and reimbursement associated with providing outstanding clinical care. PMID:21792278
Validation of voxel-based morphometry (VBM) based on MRI
NASA Astrophysics Data System (ADS)
Yang, Xueyu; Chen, Kewei; Guo, Xiaojuan; Yao, Li
2007-03-01
Voxel-based morphometry (VBM) is an automated and objective image analysis technique for detecting differences in regional concentration or volume of brain tissue composition based on structural magnetic resonance (MR) images. VBM has been used widely to evaluate brain morphometric differences between different populations, but until now there has been no evaluation system for its validation. In this study, a quantitative and objective evaluation system was established in order to assess VBM performance. We recruited twenty normal volunteers (10 males and 10 females, age range 20-26 years, mean age 22.6 years). Firstly, several focal lesions (hippocampus, frontal lobe, anterior cingulate, back of hippocampus, back of anterior cingulate) were simulated in selected brain regions using real MRI data. Secondly, optimized VBM was performed to detect structural differences between groups. Thirdly, one-way ANOVA and post-hoc tests were used to assess the accuracy and sensitivity of the VBM analysis. The results revealed that VBM was a good detection tool in the majority of brain regions, even in controversial regions such as the hippocampus. Generally speaking, the more severe the focal lesion, the better the VBM performance. However, the size of the focal lesion had little effect on the VBM analysis.
NASA Astrophysics Data System (ADS)
Mao, Chao; Chen, Shou
2017-01-01
Because the traditional entropy value method still has low evaluation accuracy when assessing the performance of mining projects, a performance evaluation model for mineral projects based on an improved entropy value method is proposed. First, a new weight assignment model is established based on compatibility matrix analysis of the analytic hierarchy process (AHP) and the entropy value method: when the compatibility matrix analysis achieves the consistency requirements, any differences between the subjective and objective weights are moderately adjusted in proportion, and on this basis a fuzzy evaluation matrix is constructed for performance evaluation. The simulation experiments show that, compared with the traditional entropy value method and the compatibility matrix analysis method, the proposed performance evaluation model of mining projects based on the improved entropy value method has higher assessment accuracy.
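The objective-weight half of the scheme, the entropy value method, is easy to sketch; the decision matrix below is hypothetical, and the AHP compatibility-matrix step and the subjective/objective adjustment are omitted.

```python
# Minimal sketch of the entropy value method's objective-weight step: column
# entropies of the normalized decision matrix determine indicator weights.
import numpy as np

X = np.array([[0.8, 120, 3.2],      # alternatives x indicators (benefit-type)
              [0.6, 150, 2.9],
              [0.9,  90, 3.5]], dtype=float)

P = X / X.sum(axis=0)                      # proportion of each indicator value
k = 1.0 / np.log(X.shape[0])
E = -k * (P * np.log(P)).sum(axis=0)       # entropy per indicator, in [0, 1]
w = (1 - E) / (1 - E).sum()                # low entropy -> high discriminating power
print("objective weights:", np.round(w, 3))
```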
NASA Technical Reports Server (NTRS)
Appleby, M. H.; Golightly, M. J.; Hardy, A. C.
1993-01-01
Major improvements have been completed in the approach to analysis and simulation of spacecraft radiation shielding and exposure. A computer-aided design (CAD)-based system has been developed for determining the amount of shielding provided by a spacecraft and simulating transmission of an incident radiation environment to any point within or external to the vehicle. Shielding analysis is performed using a customized ray-tracing subroutine contained within a standard engineering modeling software package. This improved shielding analysis technique has been used in several vehicle design programs such as a Mars transfer habitat, a pressurized lunar rover, and the redesigned International Space Station. Results of the analysis performed for the Space Station astronaut exposure assessment are provided to demonstrate the applicability and versatility of the system.
A root cause analysis project in a medication safety course.
Schafer, Jason J
2012-08-10
To develop, implement, and evaluate team-based root cause analysis projects as part of a required medication safety course for second-year pharmacy students. Lectures, in-class activities, and out-of-class reading assignments were used to develop students' medication safety skills and introduce them to the culture of medication safety. Students applied these skills within teams by evaluating cases of medication errors using root cause analyses. Teams also developed error prevention strategies and formally presented their findings. Student performance was assessed using a medication errors evaluation rubric. Of the 211 students who completed the course, the majority performed well on root cause analysis assignments and rated them favorably on course evaluations. Medication error evaluation and prevention was successfully introduced in a medication safety course using team-based root cause analysis projects.
Analysis of transient fission gas behaviour in oxide fuel using BISON and TRANSURANUS
NASA Astrophysics Data System (ADS)
Barani, T.; Bruschi, E.; Pizzocri, D.; Pastore, G.; Van Uffelen, P.; Williamson, R. L.; Luzzi, L.
2017-04-01
The modelling of fission gas behaviour is a crucial aspect of nuclear fuel performance analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. In particular, experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of the burst release process in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of conventional diffusion-based models to introduce the burst release effect. The concept and governing equations of the model are presented, and the sensitivity of results to the newly introduced parameters is evaluated through an analytic sensitivity analysis. The model is assessed for application to integral fuel rod analysis by implementation in two structurally different fuel performance codes: BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D code). Model assessment is based on the analysis of 19 light water reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the quantitative predictions of integral fuel rod FGR and the qualitative representation of the FGR kinetics with the transient model relative to the canonical, purely diffusion-based models of the codes. The overall quantitative improvement of the integral FGR predictions in the two codes is comparable. Moreover, calculated radial profiles of xenon concentration after irradiation are investigated and compared to experimental data, illustrating the underlying representation of the physical mechanisms of burst release.
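For readers unfamiliar with the diffusion-based baseline being extended, the following sketch contrasts a Booth-type equivalent-sphere diffusional release with a crude, threshold-triggered burst term. It is a schematic illustration under invented parameters, not the transient model implemented in BISON or TRANSURANUS.

```python
# Schematic sketch only, NOT the BISON/TRANSURANUS model: diffusional
# release plus an ad hoc "burst" that vents a fixed fraction of retained
# gas when a hypothetical micro-cracking temperature is first exceeded.
import numpy as np

def booth_fraction(tau):
    """Short-time Booth approximation of fractional release from an
    equivalent sphere; tau = (D/a^2) * t is dimensionless diffusion time."""
    return min(1.0, 6.0 * np.sqrt(tau / np.pi) - 3.0 * tau)

Dp = 1e-10        # reduced diffusion coefficient D/a^2 [1/s] (invented)
T_burst = 1800.0  # micro-cracking onset temperature [K] (invented)
f_vent = 0.4      # fraction of retained gas vented at burst (invented)

dt, burst = 3600.0, 0.0
for hour in range(1000):
    t = (hour + 1) * dt
    T = 1200.0 + 1.0 * hour               # simple linear temperature ramp [K]
    diffusional = booth_fraction(Dp * t)  # slow diffusional contribution
    if T >= T_burst and burst == 0.0:     # burst on first threshold crossing
        burst = f_vent * (1.0 - diffusional)
    release = min(1.0, diffusional + burst)
print(f"fractional release after ramp: {release:.3f}")
```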
An enhanced performance through agent-based secure approach for mobile ad hoc networks
NASA Astrophysics Data System (ADS)
Bisen, Dhananjay; Sharma, Sanjeev
2018-01-01
This paper proposes an agent-based secure enhanced performance approach (AB-SEP) for mobile ad hoc network. In this approach, agent nodes are selected through optimal node reliability as a factor. This factor is calculated on the basis of node performance features such as degree difference, normalised distance value, energy level, mobility and optimal hello interval of node. After selection of agent nodes, a procedure of malicious behaviour detection is performed using fuzzy-based secure architecture (FBSA). To evaluate the performance of the proposed approach, comparative analysis is done with conventional schemes using performance parameters such as packet delivery ratio, throughput, total packet forwarding, network overhead, end-to-end delay and percentage of malicious detection.
School-Based Mentoring for Adolescents: A Systematic Review and Meta-Analysis
ERIC Educational Resources Information Center
Wood, Sarah; Mayo-Wilson, Evan
2012-01-01
Objectives: To evaluate the impact of school-based mentoring for adolescents (11-18 years) on academic performance, attendance, attitudes, behavior, and self-esteem. Method: A systematic review and meta-analysis. The authors searched 12 databases from 1980 to 2011. Eight studies with 6,072 participants were included, 6 were included in…
3D TRUMP - A GBI launch window tool
NASA Astrophysics Data System (ADS)
Karels, Steven N.; Hancock, John; Matchett, Gary
3D TRUMP is a novel GPS and communications-link software analysis tool developed for the SDIO's Ground-Based Interceptor (GBI) program. It is computationally efficient and provides key GPS-based performance measures for an entire GBI mission's reentry vehicle and interceptor trajectories. Algorithms and sample outputs are presented.
NASA Technical Reports Server (NTRS)
Levison, William H.
1988-01-01
This study explored the application of a closed-loop pilot/simulator model to the analysis of some simulator fidelity issues. The model was applied to two data bases: (1) a NASA ground-based simulation of an air-to-air tracking task in which nonvisual cueing devices were explored, and (2) a ground-based and in-flight study performed by the Calspan Corporation to explore the effects of simulator delay on attitude tracking performance. The model predicted the major performance trends obtained in both studies. A combined analytical and experimental procedure for exploring simulator fidelity issues is outlined.
Weil, Joyce; Hutchinson, Susan R; Traxler, Karen
2014-11-01
Data from the Women's Health and Aging Study were used to test a model of factors explaining depressive symptomology. The primary purpose of the study was to explore the association between performance-based measures of functional ability and depression and to examine the role of self-rated physical difficulties and perceived instrumental support in mediating the relationship between performance-based functioning and depression. The inclusion of performance-based measures allows for the testing of functional ability as a clinical precursor to disability and depression: a critical, but rarely examined, association in the disablement process. Structural equation modeling supported the overall fit of the model and found an indirect relationship between performance-based functioning and depression, with perceived physical difficulties serving as a significant mediator. Our results highlight the complementary nature of performance-based and self-rated measures and the importance of including perception of self-rated physical difficulties when examining depression in older persons. © The Author(s) 2014.
NASA Astrophysics Data System (ADS)
Hsiao, Y. R.; Tsai, C.
2017-12-01
As the WHO Air Quality Guideline indicates, ambient air pollution exposes world populations to the threat of fatal illnesses (e.g. heart disease, lung cancer, asthma), raising concerns about air pollution sources and related factors. This study presents a novel approach to investigating the multiscale variations of PM2.5 in southern Taiwan over the past decade, together with four meteorological influencing factors (temperature, relative humidity, precipitation, and wind speed), based on the Noise-assisted Multivariate Empirical Mode Decomposition (NAMEMD) algorithm, Hilbert Spectral Analysis (HSA), and the Time-dependent Intrinsic Correlation (TDIC) method. The NAMEMD algorithm is a fully data-driven approach designed for nonlinear and nonstationary multivariate signals, and is performed to decompose multivariate signals into a collection of channels of Intrinsic Mode Functions (IMFs). The TDIC method is an EMD-based method that uses a set of sliding window sizes to quantify localized correlation coefficients for multiscale signals. With the alignment property and quasi-dyadic filter bank of the NAMEMD algorithm, one is able to produce the same number of IMFs for all variables and estimate the cross-correlation more accurately. The performance of the spectral representation of the NAMEMD-HSA method is compared with Complementary Empirical Mode Decomposition/Hilbert Spectral Analysis (CEEMD-HSA) and wavelet analysis. The NAMEMD-based TDIC analysis is then compared with CEEMD-based TDIC analysis and traditional correlation analysis.
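The core idea of TDIC, localized correlation between matched IMFs, can be illustrated with a fixed-window version. The sketch below uses synthetic annual-cycle IMFs and a single window size, whereas the actual method adapts the window to the local mean period and operates on NAMEMD output.

```python
# Simplified, fixed-window analogue of time-dependent intrinsic
# correlation; data and window size are illustrative, not the study's.
import numpy as np

def sliding_correlation(imf1, imf2, win):
    half = win // 2
    r = np.full(len(imf1), np.nan)
    for i in range(half, len(imf1) - half):
        a = imf1[i - half:i + half + 1]
        b = imf2[i - half:i + half + 1]
        r[i] = np.corrcoef(a, b)[0, 1]    # local correlation coefficient
    return r

rng = np.random.default_rng(0)
t = np.arange(3650)                        # ~10 years of daily samples
pm25_imf = np.sin(2 * np.pi * t / 365) + 0.3 * rng.normal(size=t.size)
temp_imf = np.sin(2 * np.pi * t / 365 + 0.5) + 0.3 * rng.normal(size=t.size)
r = sliding_correlation(pm25_imf, temp_imf, win=365)
print(np.nanmean(r))
```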
NASA Astrophysics Data System (ADS)
Xu, Jun; Dang, Chao; Kong, Fan
2017-10-01
This paper presents a new method for efficient structural reliability analysis. In this method, a rotational quasi-symmetric point method (RQ-SPM) is proposed for evaluating the fractional moments of the performance function. Then, the derivation of the performance function's probability density function (PDF) is carried out based on the maximum entropy method in which constraints are specified in terms of fractional moments. In this regard, the probability of failure can be obtained by a simple integral over the performance function's PDF. Six examples, including a finite element-based reliability analysis and a dynamic system with strong nonlinearity, are used to illustrate the efficacy of the proposed method. All the computed results are compared with those by Monte Carlo simulation (MCS). It is found that the proposed method can provide very accurate results with low computational effort.
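The Monte Carlo benchmark used for comparison reduces to counting samples where the performance function is non-positive. The sketch below shows that baseline, with a made-up performance function, plus the kind of fractional moment the proposed method would constrain; it is not the RQ-SPM or maximum entropy procedure itself.

```python
# MCS baseline for structural reliability; g is a hypothetical
# performance function, not one of the paper's six examples.
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
x1 = rng.normal(5.0, 0.8, n)             # assumed random inputs
x2 = rng.normal(3.0, 0.5, n)
g = x1 - x2 - 0.6                        # hypothetical performance function

pf = np.mean(g <= 0.0)                   # failure probability estimate
m_half = np.mean(np.abs(g) ** 0.5)       # a fractional moment, E[|g|^0.5]
print(f"P(failure) ~ {pf:.4e}, E[|g|^0.5] = {m_half:.3f}")
```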
ERIC Educational Resources Information Center
Bahadur, Waheed; Bano, Amir; Waheed, Zarina; Wahab, Abdul
2017-01-01
The performance of schools is highly dependent on the leadership of school heads, and, flexible leaders accelerate school performance. The purpose of this qualitative study was to examine leadership behavior in selected boys' secondary schools that are performing well. Based on multiple-case study design, four high-performing schools from Quetta…
"Dip-and-read" paper-based analytical devices using distance-based detection with color screening.
Yamada, Kentaro; Citterio, Daniel; Henry, Charles S
2018-05-15
An improved paper-based analytical device (PAD) using color screening to enhance device performance is described. Current detection methods for PADs relying on the distance-based signalling motif can be slow due to the assay time being limited by capillary flow rates that wick fluid through the detection zone. For traditional distance-based detection motifs, analysis can take up to 45 min for a channel length of 5 cm. By using a color screening method, quantification with a distance-based PAD can be achieved in minutes through a "dip-and-read" approach. A colorimetric indicator line deposited onto a paper substrate using inkjet-printing undergoes a concentration-dependent colorimetric response for a given analyte. This color intensity-based response has been converted to a distance-based signal by overlaying a color filter with a continuous color intensity gradient matching the color of the developed indicator line. As a proof-of-concept, Ni quantification in welding fume was performed as a model assay. The results of multiple independent user testing gave mean absolute percentage error and average relative standard deviations of 10.5% and 11.2% respectively, which were an improvement over analysis based on simple visual color comparison with a read guide (12.2%, 14.9%). In addition to the analytical performance comparison, an interference study and a shelf life investigation were performed to further demonstrate practical utility. The developed system demonstrates an alternative detection approach for distance-based PADs enabling fast (∼10 min), quantitative, and straightforward assays.
NASA Technical Reports Server (NTRS)
Kuhn, A. E.
1975-01-01
A dispersion analysis considering 3 sigma uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for the baseline reference mission (BRM) 1 of the space shuttle orbiter. The dispersion analysis is based on the nominal trajectory for the BRM 1. State vector and performance dispersions (or variations) which result from the indicated 3 sigma uncertainties were studied. The dispersions were determined at major mission events and fixed times from lift-off (time slices) and the results will be used to evaluate the capability of the vehicle to perform the mission within a 3 sigma level of confidence and to determine flight performance reserves. A computer program is given that was used for dynamic flight simulations of the space shuttle orbiter.
Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C
2008-01-01
As functional magnetic resonance imaging (fMRI) becomes widely used, the demands for evaluation of fMRI processing pipelines and validation of fMRI analysis results are increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics; thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. In order to overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data, based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed in which four fMRI processing pipelines with GLM and CVA modules, such as FSL.FEAT and NPAIRS.CVA, were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the ranking of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.
Gimenez, Thais; Braga, Mariana Minatel; Raggio, Daniela Procida; Deery, Chris; Ricketts, David N; Mendes, Fausto Medeiros
2013-01-01
Fluorescence-based methods have been proposed to aid caries lesion detection. Summarizing and analysing the findings of studies about fluorescence-based methods could clarify their real benefits. We aimed to perform a comprehensive systematic review and meta-analysis to evaluate the accuracy of fluorescence-based methods in detecting caries lesions. Two independent reviewers searched PubMed, Embase and Scopus for papers/articles published through June 2012. Other sources were checked to identify non-published literature. The eligibility criteria were studies that: (1) assessed the accuracy of fluorescence-based methods for detecting caries lesions on occlusal, approximal or smooth surfaces, in primary or permanent human teeth, in laboratory or clinical settings; (2) used a reference standard; and (3) reported sufficient data on the sample size and the accuracy of the methods. A diagnostic 2×2 table was extracted from the included studies to calculate the pooled sensitivity, specificity and overall accuracy parameters (diagnostic odds ratio and summary receiver-operating characteristic curve). The analyses were performed separately for each method and for different characteristics of the studies. The quality of the studies and heterogeneity were also evaluated. Seventy-five of the 434 articles initially identified met the inclusion criteria. The search of the grey or non-published literature did not identify any further studies. In general, the analysis demonstrated that fluorescence-based methods tend to have similar accuracy for all types of teeth, dental surfaces or settings. There was a trend of better performance of fluorescence methods in detecting more advanced caries lesions. We also observed moderate to high heterogeneity and evidence of publication bias. Fluorescence-based devices have similar overall performance; however, better accuracy in detecting more advanced caries lesions has been observed.
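The per-study quantities pooled in such a meta-analysis come from diagnostic 2×2 tables. A minimal sketch, with invented counts and naive pooling standing in for a proper random-effects model:

```python
# Per-study sensitivity, specificity, and diagnostic odds ratio from
# 2x2 tables; counts are made up and pooling is deliberately naive.
import numpy as np

studies = np.array([
    # TP, FP, FN, TN
    [40,  5, 10, 45],
    [55, 12,  8, 60],
    [30,  4, 15, 70],
])
TP, FP, FN, TN = studies.T
sens = TP / (TP + FN)
spec = TN / (TN + FP)
dor = (TP * TN) / (FP * FN)              # diagnostic odds ratio per study
pooled_dor = np.exp(np.log(dor).mean())  # geometric mean as a crude pool
print(sens.round(2), spec.round(2), round(pooled_dor, 1))
```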
Cardiovascular imaging environment: will the future be cloud-based?
Kawel-Boehm, Nadine; Bluemke, David A
2017-07-01
In cardiovascular CT and MR imaging, large datasets have to be stored, post-processed, analyzed and distributed. Besides basic assessment of volume and function in cardiac magnetic resonance imaging, for example, more sophisticated quantitative analysis is requested, requiring specific software. Several institutions cannot afford various types of software or provide the expertise to perform sophisticated analysis. Areas covered: Various cloud services exist related to data storage and analysis specifically for cardiovascular CT and MR imaging. Instead of on-site data storage, cloud providers offer flexible storage services on a pay-per-use basis. To avoid the purchase and maintenance of specialized software for cardiovascular image analysis, e.g. to assess myocardial iron overload, MR 4D flow and fractional flow reserve, evaluation can be performed with cloud-based software by the consumer, or the complete analysis is performed by the cloud provider. However, challenges to widespread implementation of cloud services include regulatory issues regarding patient privacy and data security. Expert commentary: If patient privacy and data security are guaranteed, cloud imaging is a valuable option for coping with the storage of large image datasets and for offering sophisticated cardiovascular image analysis to institutions of all sizes.
Analysis and methodology for aeronautical systems technology program planning
NASA Technical Reports Server (NTRS)
White, M. J.; Gershkoff, I.; Lamkin, S.
1983-01-01
A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
A two-step sensitivity analysis for hydrological signatures in Jinhua River Basin, East China
NASA Astrophysics Data System (ADS)
Pan, S.; Fu, G.; Chiang, Y. M.; Xu, Y. P.
2016-12-01
Owing to model complexity and the large number of parameters, calibration and sensitivity analysis are difficult processes for distributed hydrological models. In this study, a two-step sensitivity analysis approach is proposed for analyzing the hydrological signatures in Jinhua River Basin, East China, using the Distributed Hydrology-Soil-Vegetation Model (DHSVM). A rough sensitivity analysis is first conducted to obtain preliminary influential parameters via analysis of variance, greatly reducing the number of parameters from eighty-three to sixteen. Afterwards, the sixteen parameters are further analyzed with a variance-based global sensitivity analysis, i.e., Sobol's method, to achieve robust sensitivity rankings and parameter contributions. Parallel computing is applied to reduce the computational burden of the variance-based sensitivity analysis. The results reveal that only a few model parameters are significantly sensitive, including the rain LAI multiplier, lateral conductivity, porosity, field capacity, wilting point of clay loam, understory monthly LAI, understory minimum resistance, and root zone depths of croplands. Finally, several hydrological signatures are used to investigate the performance of DHSVM. Results show that a high value of the efficiency criteria did not guarantee good performance on the hydrological signatures. For most samples from the Sobol's sensitivity analysis, water yield was simulated very well, while lowest and maximum annual daily runoffs were underestimated and most seven-day minimum runoffs were overestimated; nevertheless, a number of samples still performed well on these three signatures. Analysis of peak flow shows that small and medium floods are simulated well, while large floods are slightly underestimated. This work supports further multi-objective calibration of the DHSVM model and indicates where to improve the reliability and credibility of the model simulation.
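A variance-based screening like the study's second step can be sketched with the SALib package. The parameter names, bounds, and the toy stand-in for a DHSVM run below are placeholders, not the actual model setup.

```python
# Sobol' sensitivity analysis with SALib; the "model" is a cheap
# analytic stand-in for a DHSVM simulation.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["lateral_conductivity", "porosity", "field_capacity"],
    "bounds": [[0.001, 0.1], [0.3, 0.6], [0.1, 0.4]],
}
X = saltelli.sample(problem, 1024)       # N*(2D+2) parameter sets

def model(x):                            # stand-in for one DHSVM run
    return 10.0 * x[0] + x[1] ** 2 + 0.1 * x[2]

Y = np.apply_along_axis(model, 1, X)
Si = sobol.analyze(problem, Y)
print(Si["S1"], Si["ST"])                # first-order and total indices
```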
Smart Sensor-Based Motion Detection System for Hand Movement Training in Open Surgery.
Sun, Xinyao; Byrns, Simon; Cheng, Irene; Zheng, Bin; Basu, Anup
2017-02-01
We introduce a smart sensor-based motion detection technique for objective measurement and assessment of surgical dexterity among users at different experience levels. The goal is to allow trainees to evaluate their performance based on a reference model shared through communication technology, e.g., the Internet, without the physical presence of an evaluating surgeon. While in the current implementation we used a Leap Motion Controller to obtain motion data for analysis, our technique can be applied to motion data captured by other smart sensors, e.g., OptiTrack. To differentiate motions captured from different participants, measurement and assessment in our approach are achieved using two strategies: (1) low level descriptive statistical analysis, and (2) Hidden Markov Model (HMM) classification. Based on our surgical knot tying task experiment, we can conclude that finger motions generated from users with different surgical dexterity, e.g., expert and novice performers, display differences in path length, number of movements and task completion time. In order to validate the discriminatory ability of HMM for classifying different movement patterns, a non-surgical task was included in our analysis. Experimental results demonstrate that our approach had 100 % accuracy in discriminating between expert and novice performances. Our proposed motion analysis technique applied to open surgical procedures is a promising step towards the development of objective computer-assisted assessment and training systems.
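One plausible reading of the HMM classification strategy is to fit one model per skill level and assign a new motion sequence to the class whose model scores it higher, as sketched below with the hmmlearn package. The synthetic motion features, state count, and preprocessing are assumptions, not the authors' pipeline.

```python
# Per-class Gaussian HMMs scored by log-likelihood; feature data are
# random stand-ins for preprocessed fingertip motion sequences.
import numpy as np
from hmmlearn import hmm

def fit_hmm(sequences, n_states=4):
    X = np.vstack(sequences)
    lengths = [len(s) for s in sequences]
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                            n_iter=100, random_state=0)
    model.fit(X, lengths)
    return model

rng = np.random.default_rng(0)
expert_seqs = [rng.normal(0, 0.5, (200, 3)) for _ in range(10)]
novice_seqs = [rng.normal(0, 1.5, (200, 3)) for _ in range(10)]
m_exp, m_nov = fit_hmm(expert_seqs), fit_hmm(novice_seqs)

test = novice_seqs[0]
label = "expert" if m_exp.score(test) > m_nov.score(test) else "novice"
print(label)
```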
Student Learning with Performance-Based, In-Class and Learner-Centered, Online Exams
ERIC Educational Resources Information Center
Greenberg, Katherine; Lester, Jessica N.; Evans, Kathy; Williams, Michele; Hacker, Carolyn; Halic, Olivia
2008-01-01
The purpose of this study was to explore the experience of students with performance-based, in-class and learner-centered, online assessment and the effects of these formats on comprehensive exam scores in an educational psychology course required of participants in a teacher education program. In our quantitative analysis, we investigated the…
ERIC Educational Resources Information Center
Jackson, Dan; Bowden, Jack; Baker, Rose
2015-01-01
Moment-based estimators of the between-study variance are very popular when performing random effects meta-analyses. This type of estimation has many advantages including computational and conceptual simplicity. Furthermore, by using these estimators in large samples, valid meta-analyses can be performed without the assumption that the treatment…
2004-06-01
...factor analysis was performed on the first four questionnaire items (concern for success, effectiveness, friendliness, and sociability). A Kaiser-Meyer-Olkin (KMO) and Bartlett's test yielded a KMO measure of sampling adequacy of .683. Based on a rotated factor matrix, two factors were...
Gender-Based Behavioral Analysis for End-User Development and the "RULES" Attributes
ERIC Educational Resources Information Center
Tzafilkou, Katerina; Protogeros, Nicolaos; Karagiannidis, Charalampos; Koumpis, Adamantios
2017-01-01
This paper addresses the role of gender in End-User Development (EUD) environments and examines whether there are gender differences in performance and in correlations between performance and a set of behavioral attributes. Based on a review of the most prominent EUD-related behavioral Human Computer Interaction (HCI) theories, and the influence…
ERIC Educational Resources Information Center
Buono, Alexia; Gonzalez, Charles H.
2017-01-01
In this article, the authors (then two doctoral students) describe their methodology of engaging in an interdisciplinary, collaborative doctoral arts-based research (ABR) project. Education and the arts were integrated utilizing dance methods of bodily writing and performative inquiry to strengthen the analysis of dissertation findings in the…
Optimal Parameter Design of Coarse Alignment for Fiber Optic Gyro Inertial Navigation System.
Lu, Baofeng; Wang, Qiuying; Yu, Chunmei; Gao, Wei
2015-06-25
Two different coarse alignment algorithms for Fiber Optic Gyro (FOG) Inertial Navigation Systems (INS) based on an inertial reference frame are discussed in this paper. Both are based on gravity vector integration; therefore, the performance of these algorithms is determined by the integration time. In previous works, the integration time was selected by experience. In order to give a criterion for the selection process, and to make the selection of the integration time more accurate, an optimal parameter design of these algorithms for FOG INS is performed in this paper. The design process is accomplished based on an analysis of the error characteristics of the two coarse alignment algorithms. Moreover, this analysis and optimal parameter design allow us to make an adequate selection of the most accurate algorithm for FOG INS according to the actual operational conditions. The analysis and simulation results show that the parameter provided by this work is the optimal value, and indicate that under different operational conditions, different coarse alignment algorithms should be adopted for FOG INS in order to achieve better performance.
Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan
2018-01-01
Exhaust gas recirculation (EGR) is one of the main methods of reducing NOX emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate the performance and optimize rate determination of EGR, which currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization.
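The grey relational coefficient at the heart of grey relation theory has a standard closed form. A minimal sketch with hypothetical normalized EGR scores follows; the paper's full model additionally applies entropy-based target weights rather than the equal weights used here.

```python
# Grey relational grades against an ideal reference sequence; rho and
# the score matrix are illustrative placeholders.
import numpy as np

def grey_relational_grades(X, ref, rho=0.5):
    """X: (m alternatives x n criteria), ref: ideal reference sequence."""
    delta = np.abs(X - ref)                          # deviation sequences
    dmin, dmax = delta.min(), delta.max()
    xi = (dmin + rho * dmax) / (delta + rho * dmax)  # relational coefficients
    return xi.mean(axis=1)                           # equal-weight grades

X = np.array([[0.8, 0.6, 0.9],     # three EGR rates scored on NOx,
              [0.7, 0.8, 0.7],     # fuel economy, and soot criteria
              [0.9, 0.5, 0.8]])    # (hypothetical, already normalized)
print(grey_relational_grades(X, ref=X.max(axis=0)))
```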
Cost of Equity Estimation in Fuel and Energy Sector Companies Based on CAPM
NASA Astrophysics Data System (ADS)
Kozieł, Diana; Pawłowski, Stanisław; Kustra, Arkadiusz
2018-03-01
The article presents cost of equity estimation for capital groups from the fuel and energy sector, listed on the Warsaw Stock Exchange, based on the Capital Asset Pricing Model (CAPM). The objective of the article was to perform a valuation of equity with the application of CAPM, based on actual financial data and stock exchange data, and to carry out a sensitivity analysis of such cost depending on the financing structure of the entity. The objective of the article, formulated in this manner, determined its structure. It focuses on presentation of substantive analyses related to the nature of equity and methods of estimating its cost, with special attention given to the CAPM. In the practical section, cost estimation was performed according to the CAPM methodology, based on the example of leading fuel and energy companies, such as Tauron GE and PGE. Simultaneously, a sensitivity analysis of such cost was performed depending on the structure of financing the company's operations.
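The CAPM estimate itself is a one-line formula, r_e = r_f + beta * (r_m - r_f), with beta taken from the covariance of stock and market returns. The sketch below uses placeholder returns and rates, not the Tauron GE or PGE data.

```python
# CAPM cost-of-equity sketch; all numbers are invented placeholders.
import numpy as np

r_stock = np.array([0.02, -0.01, 0.03, 0.01, -0.02])   # monthly returns
r_market = np.array([0.015, -0.005, 0.02, 0.012, -0.01])

cov = np.cov(r_stock, r_market)
beta = cov[0, 1] / cov[1, 1]             # beta from return covariance

r_f = 0.03    # assumed risk-free rate
mrp = 0.055   # assumed market risk premium (r_m - r_f)
cost_of_equity = r_f + beta * mrp        # CAPM: r_e = r_f + beta*(r_m - r_f)
print(f"beta = {beta:.2f}, cost of equity = {cost_of_equity:.2%}")
```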
NASA Technical Reports Server (NTRS)
Mehta, Manish; Seaford, Mark; Kovarik, Brian; Dufrene, Aaron; Solly, Nathan; Kirchner, Robert; Engel, Carl D.
2014-01-01
The Space Launch System (SLS) base heating test is broken down into two test programs: (1) Pathfinder and (2) Main Test. The Pathfinder Test Program focuses on the design, development, hot-fire test and performance analyses of the 2% sub-scale SLS core-stage and booster element propulsion systems. The core-stage propulsion system is composed of four gaseous oxygen/hydrogen RS-25D model engines and the booster element is composed of two aluminum-based model solid rocket motors (SRMs). The first section of the paper discusses the motivation and test facility specifications for the test program. The second section briefly investigates the internal flow path of the design. The third section briefly shows the performance of the model RS-25D engines and SRMs for the conducted short duration hot-fire tests. Good agreement is observed based on design prediction analysis and test data. This program is a challenging research and development effort that has not been attempted in 40+ years for a NASA vehicle.
Evaluation of the Aurora Application Shade Measurement Accuracy
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-12-01
Aurora is an integrated, Web-based application that helps solar installers perform sales, engineering design, and financial analysis. One of Aurora's key features is its high-resolution remote shading analysis.
2012-01-01
Background: This study illustrates an evidence-based method for the segmentation analysis of patients that could greatly improve the approach to population-based medicine by filling a gap in the empirical analysis of this topic. Segmentation facilitates individual patient care in the context of the culture, health status, and health needs of the entire population to which that patient belongs. Because many health systems are engaged in developing better chronic care management initiatives, patient profiles are critical to understanding whether some patients can move toward effective self-management and can play a central role in determining their own care, which fosters a sense of responsibility for their own health. A review of the literature on patient segmentation provided the background for this research. Method: First, we conducted a literature review on patient satisfaction and segmentation to build a survey. Then, we performed 3,461 surveys of outpatient services users. The key structures on which the subjects' perception of outpatient services was based were extrapolated using principal component factor analysis with varimax rotation. After the factor analysis, segmentation was performed through cluster analysis to better analyze the influence of individual attitudes on the results. Results: Four segments were identified through factor and cluster analysis: the "unpretentious," the "informed and supported," the "experts," and the "advanced" patients. Their policy and managerial implications are outlined. Conclusions: With this research, we provide (1) a method for profiling patients based on common patient satisfaction surveys that is easily replicable in all health systems and contexts, and (2) a proposal for segments based on the results of a broad-based analysis conducted in the Italian National Health System (INHS). Segments represent profiles of patients requiring different strategies for delivering health services; their knowledge and analysis might support an effort to build an effective population-based medicine approach.
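A factor-then-cluster workflow of this kind can be sketched with PCA and k-means standing in for the varimax-rotated factor analysis and the clustering actually used; the survey matrix below is random placeholder data, not the 3,461 real responses.

```python
# Reduce survey items to components, then cluster respondents into
# segments; PCA/k-means are stand-ins for the study's exact methods.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
answers = rng.integers(1, 6, size=(3461, 20)).astype(float)  # Likert items

scores = PCA(n_components=4).fit_transform(
    StandardScaler().fit_transform(answers))
segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(segments))             # size of each patient segment
```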
Ten Eyck, Raymond P; Tews, Matthew; Ballester, John M; Hamilton, Glenn C
2010-06-01
To determine the impact of simulation-based instruction on student performance in the role of emergency department resuscitation team leader. A randomized, single-blinded, controlled study using an intention to treat analysis. Eighty-three fourth-year medical students enrolled in an emergency medicine clerkship were randomly allocated to two groups differing only by instructional format. Each student individually completed an initial simulation case, followed by a standardized curriculum of eight cases in either group simulation or case-based group discussion format before a second individual simulation case. A remote coinvestigator measured eight objective performance end points using digital recordings of all individual simulation cases. McNemar chi2, Pearson correlation, repeated measures multivariate analysis of variance, and follow-up analysis of variance were used for statistical evaluation. Sixty-eight students (82%) completed both initial and follow-up individual simulations. Eight students were lost from the simulation group and seven from the discussion group. The mean postintervention case performance was significantly better for the students allocated to simulation instruction compared with the group discussion students for four outcomes including a decrease in mean time to (1) order an intravenous line; (2) initiate cardiac monitoring; (3) order initial laboratory tests; and (4) initiate blood pressure monitoring. Paired comparisons of each student's initial and follow-up simulations demonstrated significant improvement in the same four areas, in mean time to order an abdominal radiograph and in obtaining an allergy history. A single simulation-based teaching session significantly improved student performance as a team leader. Additional simulation sessions provided further improvement compared with instruction provided in case-based group discussion format.
Quantitative assessment of human motion using video motion analysis
NASA Technical Reports Server (NTRS)
Probe, John D.
1990-01-01
In the study of the dynamics and kinematics of the human body, a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video-based motion analysis systems to emerge as a cost-effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video-based Ariel Performance Analysis System to develop data on shirt-sleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. The system is described.
Analysis of Load Stress for Asphalt Pavement of Lean Concrete Base
NASA Astrophysics Data System (ADS)
Lijun, Suo; Xinwu, Wang
Previous study has revealed that the occurrence of early distresses in asphalt pavement depends largely on the working performance of the base. In the field of asphalt pavement, it is widely accepted that a lean concrete base, compared with the general semi-rigid base, has better working performance, such as high strength and good erosion resistance. The problem of early distresses in asphalt pavement caused by heavy traffic loading can be settled effectively when lean concrete is used in the pavement. Traffic loading is an important parameter in the analysis of new pavement designs; however, few studies have done extensive and intensive research on the load stress of asphalt pavement with a lean concrete base. Because of that, it is necessary to study the load stress of such pavement. In this paper, first of all, a three-dimensional finite element model of the asphalt pavement is created for the purpose of mechanical analysis. Two main objectives are then investigated: analysis of the load stress in the lean concrete base, and analysis of the load stress in the asphalt surface. The results show that the load stress of the lean concrete base decreases with increasing base thickness, decreases with increasing surface thickness, and increases with an increasing ratio of base modulus to foundation modulus. So far as the asphalt surface is concerned, the maximum load-induced shearing stress occurs in the asphalt surface located above the transverse contraction joints of the lean concrete base. This maximum shearing stress decreases with increasing surface modulus, surface thickness, and base thickness, and increases with an increasing ratio of base modulus to foundation modulus.
2015-09-01
the network. Mac: Medium Access Control (MAC) (Ethernet) address observed as the destination for outgoing packets. subsessionid: Zero-based index of... Subject terms: tactical networks, data reduction, high-performance computing, data analysis, big data
1994-03-01
asked whether the planned structure considered (a) all objectives, (b) all functions, (c) all relevant units of analysis such as the plant, the...literature and provides an integrative model of design for high-performing organizations. The model is based on an analysis of current theories of...important midrange theories underlie much of the work on organizational analysis. • Systems Approaches. These approaches emphasize the rational, goal
NASA Astrophysics Data System (ADS)
Chen, Xinjia; Lacy, Fred; Carriere, Patrick
2015-05-01
Sequential test algorithms are playing increasingly important roles in quickly detecting network intrusions such as portscanners. In view of the fact that such algorithms are usually analyzed based on intuitive approximation or asymptotic analysis, we develop an exact computational method for the performance analysis of such algorithms. Our method can be used to calculate the probability of false alarm and the average detection time up to arbitrarily pre-specified accuracy.
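Sequential tests for portscan detection are typically Wald-style sequential probability ratio tests. As a hedged illustration of the class of algorithm being analyzed (not the authors' exact computational method), the sketch below updates a log-likelihood ratio per connection outcome until an alarm or clearance threshold is crossed.

```python
# Wald SPRT sketch for scan detection; probabilities and error targets
# are illustrative.
import numpy as np

p0, p1 = 0.8, 0.2        # P(connection succeeds | benign), (| scanner)
alpha, beta = 0.01, 0.01  # target false-alarm and miss probabilities
A = np.log((1 - beta) / alpha)   # alarm threshold (declare scanner)
B = np.log(beta / (1 - alpha))   # clearance threshold (declare benign)

def sprt(outcomes):
    llr = 0.0
    for n, success in enumerate(outcomes, start=1):
        llr += np.log(p1 / p0) if success else np.log((1 - p1) / (1 - p0))
        if llr >= A:
            return "scanner", n
        if llr <= B:
            return "benign", n
    return "undecided", len(outcomes)

print(sprt([False, False, True, False, False]))
```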
2009-12-01
events. Work associated with aperiodic tasks has the same statistical behavior and the same timing requirements; the timing deadlines are soft. • Sporadic...answers, but it is possible to calculate how precise the estimates are. Simulation-based performance analysis of a model includes a statistical...to evaluate all possible states in a timely manner. This is the principal reason for resorting to simulation and statistical analysis to evaluate
Semi-supervised vibration-based classification and condition monitoring of compressors
NASA Astrophysics Data System (ADS)
Potočnik, Primož; Govekar, Edvard
2017-09-01
Semi-supervised vibration-based classification and condition monitoring of the reciprocating compressors installed in refrigeration appliances is proposed in this paper. The method addresses the problem of industrial condition monitoring where prior class definitions are often not available or difficult to obtain from local experts. The proposed method combines feature extraction, principal component analysis, and statistical analysis for the extraction of initial class representatives, and compares the capability of various classification methods, including discriminant analysis (DA), neural networks (NN), support vector machines (SVM), and extreme learning machines (ELM). The use of the method is demonstrated on a case study which was based on industrially acquired vibration measurements of reciprocating compressors during the production of refrigeration appliances. The paper presents a comparative qualitative analysis of the applied classifiers, confirming the good performance of several nonlinear classifiers. If the model parameters are properly selected, then very good classification performance can be obtained from NN trained by Bayesian regularization, SVM and ELM classifiers. The method can be effectively applied for the industrial condition monitoring of compressors.
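The classifier-comparison stage translates naturally into a cross-validated benchmark. The sketch below compares discriminant analysis, an SVM, and a neural network on stand-in features; ELM has no scikit-learn implementation, so it is omitted, and the random data replace the paper's extracted vibration features.

```python
# Cross-validated comparison of several classifier families on
# placeholder "spectral feature" data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))           # features per compressor sample
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # surrogate class labels

for clf in (LinearDiscriminantAnalysis(), SVC(kernel="rbf"),
            MLPClassifier(max_iter=1000)):
    scores = cross_val_score(clf, X, y, cv=5)
    print(type(clf).__name__, scores.mean().round(3))
```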
Wind study for high altitude platform design
NASA Technical Reports Server (NTRS)
Strganac, T. W.
1979-01-01
An analysis of upper air winds was performed to define the wind environment at potential operating altitudes for high-altitude powered platform concepts. Expected wind conditions of the contiguous United States, Pacific area (Alaska to Sea of Japan), and European area (Norwegian and Mediterranean Seas) were obtained using a representative network of sites selected based upon adequate high-altitude sampling, geographic dispersion, and observed upper wind patterns. A data base of twenty plus years of rawinsonde gathered wind information was used in the analysis. Annual variations from surface to 10 mb (approximately 31 km) pressure altitude were investigated to encompass the practical operating range for the platform concepts. Parametric analysis for the United States and foreign areas was performed to provide a basis for vehicle system design tradeoffs. This analysis of wind magnitudes indicates the feasibility of annual operation at a majority of sites and more selective seasonal operation for the extreme conditions between the pressure altitudes of 100 to 25 mb based upon the assumed design speeds.
Optical design and tolerancing of an ophthalmological system
NASA Astrophysics Data System (ADS)
Sieber, Ingo; Martin, Thomas; Yi, Allen; Li, Likai; Rübenach, Olaf
2014-09-01
Tolerance analysis by means of simulation is an essential step in system integration. It allows for predicting the performance of a system built from real manufactured parts and for estimating the yield with respect to evaluation figures such as performance requirements, system specifications or cost demands. Freeform optics are currently gaining importance in optical system design, and the performance of freeform optics often depends strongly on the manufacturing accuracy of the surfaces. For this reason, a tolerance analysis with respect to fabrication accuracy is of crucial importance. The characterization of form tolerances caused by the manufacturing process is based on the definitions of straightness, flatness, roundness, and cylindricity. In the case of freeform components, however, it is often impossible to describe a form deviation by means of this standard classification; hence, the impact of manufacturing tolerances on optical performance cannot be predicted by a conventional tolerance analysis. To carry out a tolerance analysis of an optical subsystem including freeform optics, metrology data of the fabricated surfaces have to be integrated into the optical model. The focus of this article is on design for manufacturability of freeform optics with integrated alignment structures, and on tolerance analysis of the optical subsystem based on the measured surface data of manufactured optical freeform components with respect to assembly and manufacturing tolerances. This approach is reported here using an ophthalmological system as an example.
ERIC Educational Resources Information Center
Trumpower, David L.
2015-01-01
Making inferences about population differences based on samples of data, that is, performing intuitive analysis of variance (IANOVA), is common in everyday life. However, the intuitive reasoning of individuals when making such inferences (even following statistics instruction), often differs from the normative logic of formal statistics. The…
Critical Factors Explaining the Leadership Performance of High-Performing Principals
ERIC Educational Resources Information Center
Hutton, Disraeli M.
2018-01-01
The study explored critical factors that explain leadership performance of high-performing principals and examined the relationship between these factors based on the ratings of school constituents in the public school system. The principal component analysis with the use of Varimax Rotation revealed that four components explain 51.1% of the…
Adam, Asrul; Ibrahim, Zuwairie; Mokhtar, Norrima; Shapiai, Mohd Ibrahim; Cumming, Paul; Mubin, Marizan
2016-01-01
Various peak models have been introduced to detect and analyze peaks in the time-domain analysis of electroencephalogram (EEG) signals. In general, a peak model in time-domain analysis consists of a set of signal parameters, such as amplitude, width, and slope. Models including those proposed by Dumpala, Acir, Liu, and Dingle are routinely used to detect peaks in EEG signals acquired in clinical studies of epilepsy or eye blink. The optimal peak model is the one that gives the most reliable peak detection performance in a particular application. A fair comparison of the performance of different models requires a common and unbiased platform. In this study, we evaluated the performance of the four peak models using the extreme learning machine (ELM)-based peak detection algorithm. We found that the Dingle model gave the best performance, with 72% accuracy in the analysis of real EEG data. Statistical analysis confirmed that the Dingle model afforded significantly better mean testing accuracy than the Acir and Liu models, which were in the range 37-52%, while the Dingle model showed no significant difference from the Dumpala model.
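As a rough analogue of a parameterized time-domain peak model, scipy's find_peaks exposes amplitude, width, and prominence constraints. The sketch below applies it to a synthetic signal; it illustrates the kind of parameter set such models use, not the ELM-based detection algorithm of the study.

```python
# Parameterized peak detection on a synthetic "EEG-like" signal.
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2560)             # 10 s sampled at 256 Hz
x = np.sin(2 * np.pi * 1.5 * t) + 0.4 * rng.normal(size=t.size)

# height ~ amplitude, width ~ peak width, prominence ~ slope/contrast
peaks, props = find_peaks(x, height=0.8, prominence=0.5, width=5)
print(len(peaks), props["peak_heights"][:3])
```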
Teyhen, Deydre S; Shaffer, Scott W; Butler, Robert J; Goffar, Stephen L; Kiesel, Kyle B; Rhon, Daniel I; Boyles, Robert E; McMillian, Daniel J; Williamson, Jared N; Plisky, Phillip J
2016-10-01
Performance on movement tests helps to predict injury risk in a variety of physically active populations. Understanding baseline measures for normal is an important first step. Determine differences in physical performance assessments and describe normative values for these tests based on military unit type. Assessment of power, balance, mobility, motor control, and performance on the Army Physical Fitness Test were assessed in a cohort of 1,466 soldiers. Analysis of variance was performed to compare the results based on military unit type (Rangers, Combat, Combat Service, and Combat Service Support) and analysis of covariance was performed to determine the influence of age and gender. Rangers performed the best on all performance and fitness measures (p < 0.05). Combat soldiers performed better than Combat Service and Service Support soldiers on several physical performance tests and the Army Physical Fitness Test (p < 0.05). Performance in Combat Service and Service Support soldiers was equivalent on most measures (p < 0.05). Functional performance and level of fitness varied significantly by military unit type. Understanding these differences will provide a foundation for future injury prediction and prevention strategies. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.
Exploratory Analysis in Learning Analytics
ERIC Educational Resources Information Center
Gibson, David; de Freitas, Sara
2016-01-01
This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a games-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex…
NASA Astrophysics Data System (ADS)
Villa, Enrique; Cano, Juan L.; Aja, Beatriz; Terán, J. Vicente; de la Fuente, Luisa; Mediavilla, Ángel; Artal, Eduardo
2018-03-01
This paper describes the analysis, design and characterization of a polarimetric receiver developed for covering the 35 to 47 GHz frequency band in the new instrument aimed at completing the ground-based Q-U-I Joint Tenerife Experiment. This experiment is designed to measure polarization in the Cosmic Microwave Background. The described high frequency instrument is a HEMT-based array composed of 29 pixels. A thorough analysis of the behaviour of the proposed receiver, based on electronic phase switching, is presented for a noise-like linearly polarized input signal, obtaining simultaneously I, Q and U Stokes parameters of the input signal. Wideband subsystems are designed, assembled and characterized for the polarimeter. Their performances are described showing appropriate results within the 35-to-47 GHz frequency band. Functionality tests are performed at room and cryogenic temperatures with adequate results for both temperature conditions, which validate the receiver concept and performance.
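The Stokes parameters the receiver measures follow from averages over the two orthogonal field components. The sketch below verifies the defining relations for a synthetic noise-like linearly polarized signal; it does not model the electronic phase switching itself.

```python
# I, Q, U from orthogonal components of a linearly polarized noise
# signal; the polarization angle is an assumed input.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
psi = np.deg2rad(30)                     # assumed polarization angle
e = rng.normal(size=n)                   # common noise-like amplitude
Ex, Ey = e * np.cos(psi), e * np.sin(psi)

I = np.mean(Ex**2 + Ey**2)
Q = np.mean(Ex**2 - Ey**2)
U = np.mean(2 * Ex * Ey)
print(I, Q, U, 0.5 * np.degrees(np.arctan2(U, Q)))  # recovers psi
```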
OpenMSI: A High-Performance Web-Based Platform for Mass Spectrometry Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubel, Oliver; Greiner, Annette; Cholia, Shreyas
Mass spectrometry imaging (MSI) enables researchers to directly probe endogenous molecules within the architecture of the biological matrix. Unfortunately, efficient access, management, and analysis of the data generated by MSI approaches remain major challenges to this rapidly developing field. Despite the availability of numerous dedicated file formats and software packages, it is a widely held viewpoint that the biggest challenge is simply opening, sharing, and analyzing a file without loss of information. Here we present OpenMSI, a software framework and platform that addresses these challenges via an advanced, high-performance, extensible file format and Web API for remote data access (http://openmsi.nersc.gov). The OpenMSI file format supports storage of raw MSI data, metadata, and derived analyses in a single, self-describing format based on HDF5 and is supported by a large range of analysis software (e.g., Matlab and R) and programming languages (e.g., C++, Fortran, and Python). Careful optimization of the storage layout of MSI data sets using chunking, compression, and data replication accelerates common, selective data access operations while minimizing data storage requirements; these optimizations are critical enablers of rapid data I/O. The OpenMSI file format has been shown to provide a >2000-fold improvement for image access operations, enabling spectrum and image retrieval in less than 0.3 s across the Internet even for 50 GB MSI data sets. To make remote high-performance compute resources accessible for analysis and to facilitate data sharing and collaboration, we describe an easy-to-use yet powerful Web API, enabling fast and convenient access to MSI data, metadata, and derived analysis results stored remotely, to facilitate high-performance data analysis and enable implementation of Web-based data sharing, visualization, and analysis.
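The storage idea can be sketched with h5py: a chunked, compressed HDF5 dataset lets both full spectra and m/z image slices be read selectively. The chunk shape below is a guess for illustration, not the optimized layout reported by the authors.

```python
# Chunked, compressed HDF5 storage of a toy (x, y, m/z) MSI cube.
import numpy as np
import h5py

data = np.random.rand(100, 100, 5000).astype("float32")  # toy MSI cube
with h5py.File("msi_demo.h5", "w") as f:
    dset = f.create_dataset("msi", data=data,
                            chunks=(4, 4, 2048), compression="gzip")
    dset.attrs["description"] = "toy MSI cube, axes (x, y, m/z)"

with h5py.File("msi_demo.h5", "r") as f:
    spectrum = f["msi"][17, 42, :]       # one full spectrum
    image = f["msi"][:, :, 1000]         # one m/z image slice
print(spectrum.shape, image.shape)
```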
The National Shipbuilding Research Program Executive Summary Robotics in Shipbuilding Workshop
1981-01-01
based on technoeconomic analysis and consideration of working environment. (3) The conceptual designs were based on application of commercial...results of our study. We identified shipbuilding tasks that should be performed by industrial robots based on technoeconomic and working-life incentives...is the TV image of the illuminated workplaces. The image is analyzed by the computer. The analysis includes noise rejection and fitting of straight
NASA Technical Reports Server (NTRS)
Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga
2005-01-01
Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a processtemplate library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations- analysis experiments; (5) schedule and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
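Discrete-event process simulation of the kind RPST performs can be sketched with the simpy package. The shared-pad resource and timings below are invented stand-ins for a range-process template, not RPST's actual models.

```python
# Discrete-event sketch: missions queue for a shared facility, then run
# preparation and launch-window steps; all durations are invented.
import simpy

def mission(env, name, pad, prep, launch_window):
    with pad.request() as req:
        yield req                        # queue for the shared facility
        yield env.timeout(prep)          # vehicle preparation
        yield env.timeout(launch_window) # hold for window + launch ops
        print(f"{name} complete at t={env.now}")

env = simpy.Environment()
pad = simpy.Resource(env, capacity=1)
for i, (prep, win) in enumerate([(5, 2), (4, 3), (6, 1)]):
    env.process(mission(env, f"mission-{i}", pad, prep, win))
env.run()
```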
Integrated analysis of large space systems
NASA Technical Reports Server (NTRS)
Young, J. P.
1980-01-01
Based on the belief that actual flight hardware development of large space systems will necessitate a formalized method of integrating the various engineering discipline analyses, an efficient, highly user-oriented software system capable of performing interdisciplinary design analyses with tolerable solution turnaround time is planned. Specific analysis capability goals were set forth, with initial emphasis given to sequential and quasi-static thermal/structural analysis and fully coupled structural/control system analysis. Subsequently, the IAC would be expanded to include fully coupled thermal/structural/control system, electromagnetic radiation, and optical performance analyses.
Performance Improvement of Power Analysis Attacks on AES with Encryption-Related Signals
NASA Astrophysics Data System (ADS)
Lee, You-Seok; Lee, Young-Jun; Han, Dong-Guk; Kim, Ho-Won; Kim, Hyoung-Nam
A power analysis attack is a well-known side-channel attack, but its efficiency is frequently degraded by power components irrelevant to the encryption that are included in the signals used for the attack. To enhance the performance of the power analysis attack, we propose a preprocessing method based on extracting the encryption-related parts from the measured power signals. Experimental results show that attacks on the preprocessed signals detect correct keys with far fewer signals than conventional power analysis attacks.
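A standard correlation power analysis illustrates the attack family whose performance such preprocessing improves. In the sketch below the traces are simulated one-sample leakages and the substitution box is a random stand-in, so it shows the correlation step only, not the paper's preprocessing method or the real AES S-box.

```python
# Correlation power analysis sketch: correlate hypothetical Hamming-
# weight leakage with simulated traces for each key guess.
import numpy as np

SBOX = np.random.permutation(256)        # random stand-in for the AES S-box
HW = np.array([bin(v).count("1") for v in range(256)])

rng = np.random.default_rng(0)
n_traces, true_key = 2000, 0x3C
plaintexts = rng.integers(0, 256, n_traces)
leak = HW[SBOX[plaintexts ^ true_key]]
traces = leak + 2.0 * rng.normal(size=n_traces)  # noisy 1-sample "traces"

corrs = [np.corrcoef(HW[SBOX[plaintexts ^ k]], traces)[0, 1]
         for k in range(256)]
print(hex(int(np.argmax(corrs))))        # should recover 0x3c
```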
Sadegh Amalnick, Mohsen; Zarrin, Mansour
2017-03-13
Purpose: The purpose of this paper is to present an integrated framework for performance evaluation and analysis of human resources (HR) with respect to the factors of the health, safety, environment and ergonomics (HSEE) management system, as well as the criteria of the European Foundation for Quality Management (EFQM), one of the well-known business excellence models. Design/methodology/approach: In this study, an intelligent algorithm based on an adaptive neuro-fuzzy inference system (ANFIS), along with fuzzy data envelopment analysis (FDEA), is developed and employed to assess the performance of the company. Furthermore, the impact of the factors on the company's performance, as well as their strengths and weaknesses, is identified by conducting a sensitivity analysis on the results. Similarly, a design of experiments is performed to prioritize the factors in order of importance. Findings: The results show that the EFQM model has a far greater impact on the company's performance than the HSEE management system. According to the obtained results, it can be argued that the integration of HSEE and EFQM leads to performance improvement in the company. Practical implications: In the current study, the data required for executing the proposed framework are collected via valid questionnaires filled in by the staff of an aviation industry located in Tehran, Iran. Originality/value: Managing HR performance results in improved usability, maintainability and reliability and, finally, in a significant reduction in the commercial aviation accident rate. The study of factors affecting HR performance also helps authorities develop systems that allow operators to better manage human error. This paper for the first time presents an intelligent framework based on ANFIS, FDEA and statistical tests for HR performance assessment and analysis, with the ability to handle the uncertainty and vagueness existing in real-world environments.
Dinh, Michael M; Green, Timothy C; Bein, Kendall J; Lo, Serigne; Jones, Aaron; Johnson, Terence
2015-08-01
The objective was to evaluate the impact of an ED clinical redesign project that involved team-based care and early senior assessment on hospital performance. This was an interrupted time series analysis performed using daily hospital performance data 6 months before and 8 months after implementation of the clinical redesign intervention, which involved Emergency Consultant-led team-based care, redistribution of ED beds and implementation of a senior nursing coordination role in the ED. The primary outcome was daily National Emergency Access Target (NEAT) performance (proportion of total daily ED presentations that were admitted to an inpatient ward or discharged from the ED within 4 h of arrival). Secondary outcomes were daily average length of stay (ALOS) in the ED, inpatient Clinical Emergency Response System (CERS) calls and hospital mortality. Autoregressive integrated moving average analysis was used to model NEAT performance. Hospital mortality was modelled using negative binomial regression. After adjusting for patient volume, inpatient admissions, ambulance arrivals, hospital occupancy, ED Consultant numbers, weekends and underlying trends, there was a 17% improvement in NEAT associated with the post-intervention period (95% CI 12-19%, P < 0.001). There was no change in the number of CERS calls, and the median daily hospital mortality rate reduced from 1.04% to 0.96% (P = 0.025). An ED-focused clinical redesign project was associated with a 17% improvement in NEAT performance, with no evidence of an increase in clinical deterioration on inpatient wards and evidence of an improvement in hospital mortality. © 2015 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
Performance, Agility and Cost of Cloud Computing Services for NASA GES DISC Giovanni Application
NASA Astrophysics Data System (ADS)
Pham, L.; Chen, A.; Wharton, S.; Winter, E. L.; Lynnes, C.
2013-12-01
The NASA Goddard Earth Science Data and Information Services Center (GES DISC) is investigating the performance, agility and cost of Cloud computing for GES DISC applications. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), one of the core applications at the GES DISC for online climate-related Earth science data access, subsetting, analysis, visualization, and downloading, was used to evaluate the feasibility and effort of porting an application to the Amazon Cloud Services platform. The performance and the cost of running Giovanni on the Amazon Cloud were compared to similar parameters for the GES DISC local operational system. A Giovanni time-series analysis of aerosol absorption optical depth (388 nm) from OMI (Ozone Monitoring Instrument)/Aura was selected for these comparisons. All required data were pre-cached in both the Cloud and the local system to avoid data transfer delays. The 3-, 6-, 12-, and 24-month data were used for analysis on the Cloud and the local system, and the processing times for the analysis were used to evaluate system performance. To investigate application agility, Giovanni was installed and tested on multiple Cloud platforms. The cost of using a Cloud computing platform mainly consists of computing, storage, data requests, and data transfer in/out. The Cloud computing cost is calculated based on an hourly rate, and the storage cost is calculated based on a rate per Gigabyte per month. Incoming data transfer is free, and the cost of data transfer out is based on a per-Gigabyte rate. The costs for a local server system consist of hardware/software purchases, system maintenance/updating, and operating costs. The results showed that the Cloud platform had 38% better performance and cost 36% less than the local system. This investigation shows the potential of cloud computing to increase system performance and lower the overall cost of system management.
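The cost model described above is straightforward to express in code. A minimal sketch, with placeholder rates rather than the study's actual Amazon pricing:

```python
# Hedged sketch of the cost model described above; all rates are
# placeholders, not actual Amazon prices from the study.
def monthly_cloud_cost(hours, hourly_rate, storage_gb, gb_month_rate,
                       egress_gb, egress_rate):
    compute = hours * hourly_rate          # instance-hours
    storage = storage_gb * gb_month_rate   # GB-months
    egress = egress_gb * egress_rate       # data transfer out (ingress is free)
    return compute + storage + egress

print(monthly_cloud_cost(hours=720, hourly_rate=0.10,
                         storage_gb=500, gb_month_rate=0.03,
                         egress_gb=100, egress_rate=0.09))
```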
Success Probability Analysis for Shuttle Based Microgravity Experiments
NASA Technical Reports Server (NTRS)
Liou, Ying-Hsin Andrew
1996-01-01
Presented in this report are the results of data analysis of shuttle-based microgravity flight experiments. Potential factors were identified in the previous grant period, and in this period 26 factors were selected for data analysis. In this project, the degree of success was developed and used as the performance measure. 293 of the 391 experiments in the Lewis Research Center Microgravity Database were assigned degrees of success. Frequency analysis and analysis of variance were conducted to determine the significance of the factors that affect experiment success.
Haitsma, Jack J.; Furmli, Suleiman; Masoom, Hussain; Liu, Mingyao; Imai, Yumiko; Slutsky, Arthur S.; Beyene, Joseph; Greenwood, Celia M. T.; dos Santos, Claudia
2012-01-01
Objectives To perform a meta-analysis of gene expression microarray data from animal studies of lung injury, and to identify an injury-specific gene expression signature capable of predicting the development of lung injury in humans. Methods We performed a microarray meta-analysis using 77 microarray chips across six platforms, two species and different animal lung injury models exposed to lung injury, with or without mechanical ventilation. Individual gene chips were classified and grouped based on the strategy used to induce lung injury. Effect size (change in gene expression) was calculated between non-injurious and injurious conditions, comparing two main strategies to pool chips: (1) one-hit and (2) two-hit lung injury models. A random effects model was used to integrate individual effect sizes calculated from each experiment. Classification models were built using the gene expression signatures generated by the meta-analysis to predict the development of lung injury in human lung transplant recipients. Results Two injury-specific lists of differentially expressed genes generated from our meta-analysis of lung injury models were validated using external data sets and prospective data from animal models of ventilator-induced lung injury (VILI). Pathway analysis of gene sets revealed that both new and previously implicated VILI-related pathways are enriched with differentially regulated genes. A classification model based on gene expression signatures identified in animal models of lung injury predicted development of primary graft failure (PGF) in lung transplant recipients with greater than 80% accuracy based upon injury profiles from transplant donors. We also found that better classifier performance can be achieved by using meta-analysis to identify differentially expressed genes than by using single-study-based differential analysis. Conclusion Taken together, our data suggest that microarray analysis of gene expression data allows for the detection of "injury" gene predictors that can classify lung injury samples and identify patients at risk for clinically relevant lung injury complications. PMID:23071521
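One standard way to pool per-experiment effect sizes under a random effects model is the DerSimonian-Laird estimator; the study's precise model is not specified here, so the following is only an illustrative sketch on made-up effect sizes:

```python
import numpy as np

# DerSimonian-Laird random-effects pooling of per-study effect sizes;
# the effects and variances below are invented for illustration.
def dersimonian_laird(effects, variances):
    w = 1.0 / variances                       # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)    # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)             # between-study variance
    w_re = 1.0 / (variances + tau2)
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se

effects = np.array([0.8, 1.1, 0.4, 0.9])      # per-model expression changes
variances = np.array([0.05, 0.08, 0.04, 0.06])
print(dersimonian_laird(effects, variances))
```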
Tokuda, T; Yamada, H; Sasagawa, K; Ohta, J
2009-10-01
This paper proposes and demonstrates a polarization-analyzing CMOS sensor based on image sensor architecture. The sensor was designed targeting applications for chiral analysis in a microchemistry system. The sensor features a monolithically embedded polarizer. Embedded polarizers with different angles were implemented to realize a real-time absolute measurement of the incident polarization angle. Although the pixel-level performance was confirmed to be limited, estimation schemes based on the variation of the polarizer angle provided promising performance for real-time polarization measurements. An estimation scheme using 180 pixels in 1° steps provided an estimation accuracy of 0.04°. Polarimetric measurements of chiral solutions were also successfully performed to demonstrate the applicability of the sensor to optical chiral analysis.
The Development of a Handbook for Astrobee F Performance and Stability Analysis
NASA Technical Reports Server (NTRS)
Wolf, R. S.
1982-01-01
An Astrobee F performance and stability analysis is presented for use by the NASA Sounding Rocket Division. The performance analysis provides information regarding altitude, Mach number, dynamic pressure, and velocity as functions of time since launch. It is found that payload weight has the greatest effect on performance, and performance prediction accuracy was calculated to remain within 1%. In addition, to assure sufficient flight stability, a predicted rigid-body static margin of at least 8% of the total vehicle length is required. Finally, fin cant angle predictions are given in order to achieve a 2.5 cycle-per-second burnout roll rate, based on obtaining 75% of the steady roll rate. It is noted that this method can be used by flight performance engineers to create a similar handbook for any sounding rocket series.
Determination of sex origin of meat and meat products on the DNA basis: a review.
Gokulakrishnan, Palanisamy; Kumar, Rajiv Ranjan; Sharma, Brahm Deo; Mendiratta, Sanjod Kumar; Malav, Omprakash; Sharma, Deepak
2015-01-01
Sex determination of domestic animals' meat is of potential value in meat authentication and quality control studies. Methods aiming at determining the sex origin of meat may be based either on the analysis of hormones or on the analysis of nucleic acids. At the present time, sex determination of meat and meat products based on hormone analysis employs gas chromatography-mass spectrometry (GC-MS), high-performance liquid chromatography-mass spectrometry/mass spectrometry (HPLC-MS/MS), and enzyme-linked immunosorbent assay (ELISA). Most of the hormone-based methods proved to be highly specific and sensitive but are not performed on a regular basis for meat sexing due to technical limitations or the expensive equipment required. On the other hand, the most common methodology to determine the sex of meat is unquestionably traditional polymerase chain reaction (PCR), which involves gel electrophoresis of DNA amplicons. This review is intended to provide an overview of the DNA-based methods for sex determination of meat and meat products.
NASA Astrophysics Data System (ADS)
Poncelet, Carine; Merz, Ralf; Merz, Bruno; Parajka, Juraj; Oudin, Ludovic; Andréassian, Vazken; Perrin, Charles
2017-08-01
Most previous assessments of hydrologic model performance are fragmented, based on a small number of catchments, different methods or time periods, and do not link the results to landscape or climate characteristics. This study uses large-sample hydrology to identify major catchment controls on daily runoff simulations. It is based on a conceptual lumped hydrological model (GR6J), a collection of 29 catchment characteristics, a multinational set of 1103 catchments located in Austria, France, and Germany, and four runoff model efficiency criteria. Two analyses are conducted to assess how features and criteria are linked: (i) a one-dimensional analysis based on the Kruskal-Wallis test and (ii) a multidimensional analysis based on regression trees and investigating the interplay between features. The catchment features most affecting model performance are the flashiness of precipitation and streamflow (computed as the ratio of absolute day-to-day fluctuations to the total amount in a year), the seasonality of evaporation, the catchment area, and the catchment aridity. Nonflashy, nonseasonal, large, and nonarid catchments show the best performance for all the tested criteria. We argue that this higher performance is due to fewer nonlinear responses (higher correlation between precipitation and streamflow) and lower input and output variability for such catchments. Finally, we show that, compared to national sets, multinational sets increase the transferability of results because they explore a wider range of hydroclimatic conditions.
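The flashiness index described above (absolute day-to-day fluctuations relative to the annual total) reduces to a one-line computation. A minimal sketch on a synthetic daily streamflow series:

```python
import numpy as np

# Flashiness as described above: sum of absolute day-to-day changes
# divided by the annual total (a Richards-Baker-style index).
def flashiness(daily):
    daily = np.asarray(daily, dtype=float)
    return np.abs(np.diff(daily)).sum() / daily.sum()

rng = np.random.default_rng(1)
streamflow = rng.gamma(2.0, 3.0, size=365)   # stand-in daily series
print(round(flashiness(streamflow), 3))
```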
Electromagnetic fields from mobile phone base station - variability analysis.
Bienkowski, Pawel; Zubrzak, Bartlomiej
2015-09-01
The article describes the character of the electromagnetic field (EMF) in mobile phone base station (BS) surroundings and its variability in time, with an emphasis on the measurement difficulties related to its pulsed and multi-frequency nature. The work also presents long-term monitoring measurements performed recently in different locations in Poland: a small city with dispersed building development and a dense urban area in a major Polish city. The authors sought to determine trends in the EMF spectrum by analyzing daily changes of the measured EMF levels in those locations. The research was performed using selective electromagnetic meters as well as an EMF meter with spectrum analysis.
Thermal finite-element analysis of space shuttle main engine turbine blade
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali; Tong, Michael T.; Kaufman, Albert
1987-01-01
Finite-element, transient heat transfer analyses were performed for the first-stage blades of the space shuttle main engine (SSME) high-pressure fuel turbopump. The analyses were based on test engine data provided by Rocketdyne. Heat transfer coefficients were predicted by performing a boundary-layer analysis at steady-state conditions with the STAN5 boundary-layer code. Two different peak-temperature overshoots were evaluated for the startup transient. Cutoff transient conditions were also analyzed. A reduced gas temperature profile based on actual thermocouple data was also considered. Transient heat transfer analyses were conducted with the MARC finite-element computer code.
Chittiboina, Prashant; Banerjee, Anirban Deep; Nanda, Anil
2011-01-01
We performed a trauma database analysis to identify the effect of concomitant cranial injuries on outcome in patients with fractures of the axis. We identified patients with axis fractures over a 14-year period. A binary outcome measure was used. Univariate and multiple logistic regression analyses were performed. There were 259 cases with axis fractures. Closed head injury was noted in 57% and skull base trauma in 14%. Death occurred in 17 cases (6%). Seventy-two percent had a good outcome. The presence of abnormal computed tomography head findings, skull base fractures, and visceral injury was significantly associated with poor outcome. Skull base injury in association with fractures of the axis is a significant independent predictor of worse outcomes, irrespective of the severity of the head injury. We propose that the presence of concomitant cranial and upper vertebral injuries requires careful evaluation in view of the associated poor prognosis. PMID:22470268
Orms, Natalie; Rehn, Dirk R; Dreuw, Andreas; Krylov, Anna I
2018-02-13
Density-based wave function analysis enables unambiguous comparisons of the electronic structure computed by different methods and removes the ambiguity of orbital choices. We use this tool to investigate the performance of different spin-flip methods for several prototypical diradicals and triradicals. In contrast to previous calibration studies that focused on energy gaps between high- and low-spin states, we focus on the properties of the underlying wave functions, such as the number of effectively unpaired electrons. Comparison of different density functional and wave function theory results provides insight into the performance of the different methods when applied to strongly correlated systems such as polyradicals. We show that canonical molecular orbitals for species like large copper-containing diradicals fail to correctly represent the underlying electronic structure due to highly non-Koopmans character, while density-based analysis of the same wave function delivers a clear picture of the bonding pattern.
Performance of concrete members subjected to large hydrocarbon pool fires
Zwiers, Renata I.; Morgan, Bruce J.
1989-01-01
The authors discuss an investigation to determine analytically if the performance of concrete beams and columns in a hydrocarbon pool test fire would differ significantly from their performance in a standard test fire. The investigation consisted of a finite element analysis to obtain temperature distributions in typical cross sections, a comparison of the resulting temperature distribution in the cross section, and a strength analysis of a beam based on temperature distribution data. Results of the investigation are reported.
Digital Game-Based Learning for K-12 Mathematics Education: A Meta-Analysis
ERIC Educational Resources Information Center
Byun, JaeHwan; Joung, Eunmi
2018-01-01
Digital games (e.g., video games or computer games) have been reported as an effective educational method that can improve students' motivation and performance in mathematics education. This meta-analysis study (a) investigates the current trend of digital game-based learning (DGBL) by reviewing the research studies on the use of DGBL for…
Code of Federal Regulations, 2010 CFR
2010-07-01
... current employee or prospective employee based solely on the analysis of a polygraph test chart or the refusal to take a polygraph test. (b) Analysis of a polygraph test chart or refusal to take a polygraph..., job performance, etc. may be used as a basis for employment decisions. Employment decisions based on...
ERIC Educational Resources Information Center
Yeo, Seungsoo
2010-01-01
The purpose of this synthesis was to examine the relationship between Curriculum-Based Measurement (CBM) and statewide achievement tests in reading. A multilevel meta-analysis was used to calculate the correlation coefficient of the population for 27 studies that met the inclusion criteria. Results showed an overall large correlation coefficient…
ERIC Educational Resources Information Center
DuPaul, George J.; Eckert, Tanya L.; Vilardo, Brigid
2012-01-01
A meta-analysis evaluating the effects of school-based interventions for students with attention deficit hyperactivity disorder was conducted by examining 60 outcome studies between 1996 and 2010 that yielded 85 effect sizes. Separate analyses were performed for studies employing between-subjects, within-subjects, and single-subject experimental…
Komar, Alyssa; Ashley, Kelsey; Hanna, Kelly; Lavallee, Julia; Woodhouse, Janet; Bernstein, Janet; Andres, Matthew; Reed, Nick
2016-01-01
A pretest-posttest retrospective design was used to evaluate the impact of a group-based modified constraint-induced movement therapy (mCIMT) program on upper extremity function and occupational performance. Twenty children aged 3 to 18 years with hemiplegia following an acquired brain injury participated in a 2-week group mCIMT program. Upper extremity function was measured with the Assisting Hand Assessment (AHA) and subtests from the Quality of Upper Extremity Skills Test (QUEST). Occupational performance and satisfaction were assessed using the Canadian Occupational Performance Measure (COPM). Data were analyzed using a Wilcoxon signed-rank test. Group-based analysis revealed statistically significant improvements in upper extremity function and occupational performance from pre- to postintervention on all outcome measures (AHA: Z = -3.63, p < .001; QUEST Grasps: Z = -3.10, p = .002; QUEST Dissociated Movement: Z = -2.51, p = .012; COPM Performance: Z = -3.64, p < .001; COPM Satisfaction: Z = -3.64, p < .001). Across individuals, clinically significant improvements were found in 65% of participants' AHA scores. 80% of COPM Performance scores and 70% of COPM Satisfaction scores demonstrated clinically significant improvements in at least one identified goal. This study is an initial step in evaluating and providing preliminary evidence supporting the effectiveness of a group-based mCIMT program for children with hemiplegia following an acquired brain injury.
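The paired analysis reported above can be reproduced in outline with SciPy's Wilcoxon signed-rank test; the scores below are invented, not the study's data:

```python
from scipy.stats import wilcoxon

# Paired pre/post comparison with the Wilcoxon signed-rank test on
# invented scores (illustration only, not the study's data).
pre = [45, 52, 38, 60, 41, 55, 47, 50, 43, 58]
post = [50, 58, 41, 66, 47, 60, 52, 57, 49, 63]
stat, p = wilcoxon(pre, post)
print(f"statistic={stat}, p={p:.4f}")
```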
NASA Astrophysics Data System (ADS)
Li, Husheng; Betz, Sharon M.; Poor, H. Vincent
2007-05-01
This paper examines the performance of decision feedback based iterative channel estimation and multiuser detection in channel coded aperiodic DS-CDMA systems operating over multipath fading channels. First, explicit expressions describing the performance of channel estimation and parallel interference cancellation based multiuser detection are developed. These results are then combined to characterize the evolution of the performance of a system that iterates among channel estimation, multiuser detection and channel decoding. Sufficient conditions for convergence of this system to a unique fixed point are developed.
Cognitive performance modeling based on general systems performance theory.
Kondraske, George V
2010-01-01
General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).
Implementation of probe data performance measures.
DOT National Transportation Integrated Search
2017-04-03
This report presents results from a 12-month project in which three arterial analysis tools based on probe vehicle segment speed data were developed for District 6. A case study of five arterials and two incidents was performed.
NASA Technical Reports Server (NTRS)
Welch, Bryan W.
2016-01-01
NASA is participating in the International Committee on Global Navigation Satellite Systems (GNSS) (ICG)'s efforts to demonstrate the benefits to the space user in the Space Service Volume (SSV) when a multi-GNSS solution space approach is utilized. The ICG Working Group on Enhancement of GNSS Performance, New Services and Capabilities has started a three-phase analysis initiative as an outcome of recommendations at the ICG-10 meeting, in preparation for the ICG-11 meeting. The first phase of that initiative, which increases in complexity and fidelity with each phase, is based on a purely geometrically derived access technique. The first phase of analysis has been completed, and the results are documented in this paper.
Removing Grit During Wastewater Treatment: CFD Analysis of HDVS Performance.
Meroney, Robert N; Sheker, Robert E
2016-05-01
Computational Fluid Dynamics (CFD) was used to simulate the grit and sand separation effectiveness of a typical hydrodynamic vortex separator (HDVS) system. The analysis examined the influences on the separator efficiency of: flow rate, fluid viscosities, total suspended solids (TSS), and particle size and distribution. It was found that separator efficiency for a wide range of these independent variables could be consolidated into a few curves based on the particle fall velocity to separator inflow velocity ratio, Ws/Vin. Based on CFD analysis it was also determined that systems of different sizes with length scale ratios ranging from 1 to 10 performed similarly when Ws/Vin and TSS were held constant. The CFD results have also been compared to a limited range of experimental data.
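The collapse onto Ws/Vin can be illustrated by estimating the particle fall velocity from Stokes' law, which holds at the low particle Reynolds numbers typical of fine grit. A sketch with assumed sand and water properties and an assumed inflow velocity:

```python
import math

# Stokes-law fall velocity for a grit particle and the Ws/Vin ratio;
# the grain density, water properties, and inflow velocity are assumed.
def stokes_fall_velocity(d_m, rho_p=2650.0, rho_f=998.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s), valid at low Reynolds number."""
    return g * (rho_p - rho_f) * d_m ** 2 / (18.0 * mu)

ws = stokes_fall_velocity(150e-6)   # 150-micron sand grain
v_in = 0.5                          # assumed separator inflow velocity (m/s)
print(f"Ws = {ws:.4f} m/s, Ws/Vin = {ws / v_in:.4f}")
```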
NASA Astrophysics Data System (ADS)
Dinh, Minh-Chau; Ju, Chang-Hyeon; Kim, Sung-Kyu; Kim, Jin-Geun; Park, Minwon; Yu, In-Keun
2013-01-01
The combination of a high temperature superconducting DC power cable and a voltage source converter based HVDC (VSC-HVDC) system creates a new option for transmitting power with multiple collection and distribution points for long-distance and bulk power transmission. It offers greater advantages compared with HVAC or conventional HVDC transmission systems, and it is well suited for the grid integration of renewable energy sources in existing distribution or transmission systems. For this reason, a superconducting DC transmission system based on HVDC transmission technologies is planned to be set up in the Jeju power system, Korea. Before applying this system to a real power system on Jeju Island, system analysis should be performed through a real-time test. In this paper, a model-sized superconducting VSC-HVDC system, which consists of a small model-sized VSC-HVDC connected to a 2 m YBCO HTS DC model cable, is implemented. The authors applied a real-time simulation method that incorporates the model-sized superconducting VSC-HVDC system into the simulated Jeju power system using a Real Time Digital Simulator (RTDS). The performance of the superconducting VSC-HVDC system was verified on the proposed test platform, and the results are discussed in detail.
Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu
2015-09-01
Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrates, enzymes and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and of experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and by analysis of glucose production levels when the λ and n values change. In conclusion, the constructed Weibull-statistics-based model can accurately predict lignocellulose hydrolysis behavior, and the λ parameter can be used to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment are discussed. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
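A Weibull-type saccharification curve can be fitted with standard nonlinear least squares. A minimal sketch, assuming the common parameterization y(t) = Ymax(1 - exp(-(t/λ)^n)) and invented time-course data; the paper's exact parameterization may differ:

```python
import numpy as np
from scipy.optimize import curve_fit

# Fitting a Weibull-type saccharification curve with nonlinear least
# squares; the parameterization and the time-course data below are
# assumptions for illustration, not the paper's.
def weibull(t, ymax, lam, n):
    """Glucose yield at time t: ymax * (1 - exp(-(t/lam)**n))."""
    return ymax * (1.0 - np.exp(-(t / lam) ** n))

t = np.array([2, 4, 8, 12, 24, 48, 72], dtype=float)    # hours
y = np.array([5.1, 9.8, 16.5, 21.0, 28.7, 33.9, 35.2])  # stand-in g/L
(ymax, lam, n), _ = curve_fit(weibull, t, y, p0=[36.0, 12.0, 1.0],
                              bounds=(0.0, np.inf))
print(f"Ymax={ymax:.1f} g/L, lambda={lam:.1f} h (characteristic time), n={n:.2f}")
```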
Rhee, Peter C; Fischer, Michelle M; Rhee, Laura S; McMillan, Ha; Johnson, Anthony E
2017-03-01
Wide-awake, local anesthesia, no tourniquet (WALANT) hand surgery was developed to improve access to hand surgery care while optimizing medical resources. Hand surgery in the clinic setting may result in substantial cost savings for the United States Military Health Care System (MHS) and provide a safe alternative to performing similar procedures in the operating room. A prospective cohort study was performed on the first 100 consecutive clinic-based WALANT hand surgery procedures performed at a military medical center from January 2014 to September 2015 by a single hand surgeon. Cost savings analysis was performed by using the Medical Expense and Performance Reporting System, the standard cost accounting system for the MHS, to compare procedures performed in the clinic versus the operating room during the study period. A study-specific questionnaire was obtained for 66 procedures to evaluate the patients' experience. For carpal tunnel release (n = 34) and A1 pulley release (n = 33), there were 85% and 70% cost savings, respectively, when the procedures were performed in the clinic under WALANT rather than in the main operating room. During the study period, carpal tunnel release, A1 pulley release, and de Quervain release performed in the clinic instead of the operating room amounted to $393,100 in cost savings for the MHS. There were no adverse events during the WALANT procedures. A clinic-based WALANT hand surgery program at a military medical center results in considerable cost savings for the MHS. Economic/Decision Analysis IV. Copyright © 2017 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
Probabilistic Structural Evaluation of Uncertainties in Radiator Sandwich Panel Design
NASA Technical Reports Server (NTRS)
Kuguoglu, Latife; Ludwiczak, Damian
2006-01-01
The Jupiter Icy Moons Orbiter (JIMO) Space System is part of NASA's Prometheus Program. As part of the JIMO engineering team at NASA Glenn Research Center, the structural design of the JIMO Heat Rejection Subsystem (HRS) was evaluated. An initial goal of this study was to perform sensitivity analyses to determine the relative importance of the input variables on the structural responses of the radiator panel; the desire was to let the sensitivity analysis identify the important parameters. The probabilistic analysis methods illustrated here support this objective. A probabilistic structural performance evaluation of an HRS radiator sandwich panel was performed. The radiator panel structural performance was assessed in the presence of uncertainties in the loading, fabrication process variables, and material properties. The deterministic structural analysis at mean probability was performed, and the resulting stress and displacement contours are presented. It is followed by a probabilistic evaluation to determine the effect of the primitive variables on the radiator panel structural performance. Based on uncertainties in material properties, structural geometry and loading, the results of the displacement and stress analysis are used as an input file for the probabilistic analysis of the panel. The sensitivity of the structural responses, such as the maximum displacement, the maximum tensile and compressive stresses of the facesheet in the x and y directions, and the maximum von Mises stresses of the tube, to the loading and design variables is determined under the boundary condition where all edges of the radiator panel are pinned. Based on this study, design-critical material and geometric parameters of the considered sandwich panel are identified.
Radiomics-based Prognosis Analysis for Non-Small Cell Lung Cancer
NASA Astrophysics Data System (ADS)
Zhang, Yucheng; Oikonomou, Anastasia; Wong, Alexander; Haider, Masoom A.; Khalvati, Farzad
2017-04-01
Radiomics characterizes tumor phenotypes by extracting large numbers of quantitative features from radiological images. Radiomic features have been shown to provide prognostic value in predicting clinical outcomes in several studies. However, several challenges, including feature redundancy, unbalanced data, and small sample sizes, have led to relatively low predictive accuracy. In this study, we explore different strategies for overcoming these challenges and improving the predictive performance of radiomics-based prognosis for non-small cell lung cancer (NSCLC). CT images of 112 patients (mean age 75 years) with NSCLC who underwent stereotactic body radiotherapy were used to predict recurrence, death, and recurrence-free survival using a comprehensive radiomics analysis. Different feature selection and predictive modeling techniques were used to determine the optimal configuration of the prognosis analysis. To address feature redundancy, comprehensive analysis indicated that Random Forest models and Principal Component Analysis were the optimal predictive modeling and feature selection methods, respectively, for achieving high prognosis performance. To address unbalanced data, the Synthetic Minority Over-sampling Technique (SMOTE) was found to significantly increase predictive accuracy. A full analysis of variance showed that data endpoints, feature selection techniques, and classifiers were significant factors affecting predictive accuracy, suggesting that these factors must be investigated when building radiomics-based predictive models for cancer prognosis.
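The reported optimal configuration (Principal Component Analysis for feature reduction, SMOTE for balancing, Random Forest for prediction) maps onto a short pipeline. A sketch on synthetic stand-in features, requiring scikit-learn and imbalanced-learn; none of the data or hyperparameters come from the study:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import make_pipeline

# PCA -> SMOTE -> Random Forest pipeline, mirroring the configuration
# reported above, on synthetic stand-in features (not radiomic data).
rng = np.random.default_rng(42)
X = rng.normal(size=(112, 100))              # 112 patients, 100 features
y = (rng.random(112) < 0.25).astype(int)     # unbalanced outcome labels

model = make_pipeline(PCA(n_components=10),
                      SMOTE(random_state=0),
                      RandomForestClassifier(n_estimators=200, random_state=0))
print(cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean())
```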
Practical Application of Model-based Programming and State-based Architecture to Space Missions
NASA Technical Reports Server (NTRS)
Horvath, Gregory; Ingham, Michel; Chung, Seung; Martin, Oliver; Williams, Brian
2006-01-01
A viewgraph presentation to develop models from systems engineers that accomplish mission objectives and manage the health of the system is shown. The topics include: 1) Overview; 2) Motivation; 3) Objective/Vision; 4) Approach; 5) Background: The Mission Data System; 6) Background: State-based Control Architecture System; 7) Background: State Analysis; 8) Overview of State Analysis; 9) Background: MDS Software Frameworks; 10) Background: Model-based Programming; 11) Background: Titan Model-based Executive; 12) Model-based Execution Architecture; 13) Compatibility Analysis of MDS and Titan Architectures; 14) Integrating Model-based Programming and Execution into the Architecture; 15) State Analysis and Modeling; 16) IMU Subsystem State Effects Diagram; 17) Titan Subsystem Model: IMU Health; 18) Integrating Model-based Programming and Execution into the Software IMU; 19) Testing Program; 20) Computationally Tractable State Estimation & Fault Diagnosis; 21) Diagnostic Algorithm Performance; 22) Integration and Test Issues; 23) Demonstrated Benefits; and 24) Next Steps
Bellec, J; Delaby, N; Jouyaux, F; Perdrieux, M; Bouvier, J; Sorel, S; Henry, O; Lafond, C
2017-07-01
Robotic radiosurgery requires plan delivery quality assurance (DQA), but there has never been a published comprehensive analysis of a patient-specific DQA process in a clinic. We proposed to evaluate 350 consecutive film-based patient-specific DQAs using statistical process control. We evaluated the performance of the process to propose achievable tolerance criteria for DQA validation, and we sought to identify suboptimal DQAs using control charts. DQAs were performed on a CyberKnife-M6 using Gafchromic-EBT3 films. The signal-to-dose conversion was performed using a multichannel correction and a scanning protocol that combined measurement and calibration in a single scan. The DQA analysis comprised a gamma-index analysis at 3%/1.5 mm and a separate evaluation of the spatial and dosimetric accuracy of the plan delivery. Each parameter was plotted on a control chart and control limits were calculated. A capability index (Cpm) was calculated to evaluate the ability of the process to produce results within specifications. The analysis of capability showed that a gamma pass rate of 85% at 3%/1.5 mm was highly achievable as an acceptance criterion for DQA validation using a film-based protocol (Cpm > 1.33). 3.4% of DQAs were outside a control limit of 88% for gamma pass rate. The analysis of the out-of-control DQAs helped identify a dosimetric error in our institute for a specific treatment type. We have defined initial tolerance criteria for DQA validation. We have shown that the implementation of a film-based patient-specific DQA protocol with the use of control charts is an effective method to improve patient treatment safety on CyberKnife. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
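Individuals control charts and capability indices of the kind used above follow standard SPC formulas. A hedged sketch on simulated gamma pass rates, using the moving-range estimate of sigma and a one-sided Cpm-style index; the paper's exact computation may differ:

```python
import numpy as np

# Individuals control chart plus a one-sided Cpm-style index for gamma
# pass rates. Standard SPC formulas on simulated data; the paper's
# exact computation may differ.
rng = np.random.default_rng(3)
pass_rates = np.clip(rng.normal(95.0, 2.5, size=350), 0.0, 100.0)

mean = pass_rates.mean()
mbar = np.abs(np.diff(pass_rates)).mean()   # average moving range
sigma = mbar / 1.128                        # d2 constant for n = 2
lcl = mean - 3.0 * sigma                    # lower control limit

lsl, target = 85.0, 100.0                   # spec limit and target from the text
tau = np.sqrt(pass_rates.var() + (mean - target) ** 2)
cpm_lower = (mean - lsl) / (3.0 * tau)      # one-sided Cpm-style index
print(f"LCL = {lcl:.1f}%, Cpm (lower, one-sided) = {cpm_lower:.2f}")
```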
Quantitative analysis of regional myocardial performance in coronary artery disease
NASA Technical Reports Server (NTRS)
Stewart, D. K.; Dodge, H. T.; Frimer, M.
1975-01-01
Findings are presented from a group of subjects with significant coronary artery stenosis and from a group of controls, obtained using a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures like dams, transport infrastructures, chemical plants and nuclear power plants. For many applications beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods like the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method in comparison with traditional PSHA consist in (1) its flexibility, allowing the use of different probabilistic models for earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals. The method was applied to evaluate the risk of production interruption losses of a nuclear power plant during its residual lifetime.
Information Leakage Analysis by Abstract Interpretation
NASA Astrophysics Data System (ADS)
Zanioli, Matteo; Cortesi, Agostino
Protecting the confidentiality of information stored in a computer system or transmitted over a public network is a relevant problem in computer security. The approach of information flow analysis involves performing a static analysis of the program with the aim of proving that there will not be leaks of sensitive information. In this paper we propose a new domain that combines variable dependency analysis, based on propositional formulas, and variables' value analysis, based on polyhedra. The resulting analysis is strictly more accurate than state-of-the-art abstract-interpretation-based analyses for information leakage detection. Its modular construction makes it possible to deal with the tradeoff between efficiency and accuracy by tuning the granularity of the abstraction and the complexity of the abstract operators.
The Impact of Grading on a Curve: Assessing the Results of Kulick and Wright's Simulation Analysis
ERIC Educational Resources Information Center
Bailey, Gary L.; Steed, Ronald C.
2012-01-01
Kulick and Wright concluded, based on theoretical mathematical simulations of hypothetical student exam scores, that assigning exam grades to students based on the relative position of their exam performance scores within a normal curve may be unfair, given the role that randomness plays in any given student's performance on any given exam.…
ERIC Educational Resources Information Center
Albus, Deb; Thurlow, Martha L.; Lazarus, Sheryl S.
2011-01-01
This report examines publicly reported participation and performance data for the alternate assessment based on modified achievement standards (AA-MAS). The authors' analysis of these data included all states publicly reporting AA-MAS data, regardless of whether they had received approval to use the results for Title I accountability calculations.…
Life-Cycle Inventory Analysis of I-joist Production in the United States
Richard D. Bergman
2015-01-01
Documenting the environmental performance of building products is becoming increasingly common. Creating environmental product declarations (EPDs) based on life-cycle assessment (LCA) data is one approach to provide scientific documentation of the products' environmental performance. Many U.S. structural wood products have LCA-based "eco-labels" developed under the ISO...
Validation Database Based Thermal Analysis of an Advanced RPS Concept
NASA Technical Reports Server (NTRS)
Balint, Tibor S.; Emis, Nickolas D.
2006-01-01
Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.
NASA Astrophysics Data System (ADS)
Sabri, Karim; Colson, Gérard E.; Mbangala, Augustin M.
2008-10-01
Multi-period differences in technical and financial performance are analysed by comparing five North African railways over the period 1990-2004. A first approach is based on the Malmquist DEA TFP index for measuring total factor productivity change, decomposed into technical efficiency change and technological change. A multiple criteria analysis is also performed using the PROMETHEE II method and the software ARGOS. These methods provide complementary detailed information, especially by discriminating technological from management progress (Malmquist) and by distinguishing two often conflicting dimensions of performance (PROMETHEE): service to the community and enterprise performance.
NASA Astrophysics Data System (ADS)
Lai, Chao-Jen; Shaw, Chris C.; Whitman, Gary J.; Yang, Wei T.; Dempsey, Peter J.
2005-04-01
The purpose of this study is to compare the detection performance of three different mammography systems: a screen/film (SF) combination, an a-Si/CsI flat-panel (FP)-based system, and a charge-coupled device (CCD)-based system. A 5-cm thick 50% adipose/50% glandular breast-tissue-equivalent slab phantom was used to provide a uniform background. Calcium carbonate grains of three different size groups were used to simulate microcalcifications (MCs) overlapping with the uniform background: 112-125, 125-140, and 140-150 μm. Calcification images were acquired with the three mammography systems. Digital images were printed on hardcopy films. All film images were displayed on a mammographic viewer and reviewed by 5 mammographers. The visibility of the MCs was rated with a 5-point confidence rating scale for each detection task, including the negative controls. Scores were averaged over all readers for the various detectors and size groups. Receiver operating characteristic (ROC) analysis was performed and the areas under the ROC curves (Az's) were computed for the various imaging conditions. The results show that (1) the FP-based system performed significantly better than the SF and CCD-based systems for individual size groups using ROC analysis, (2) the FP-based system also performed significantly better than the SF and CCD-based systems for individual size groups using the averaged confidence scale, and (3) the results obtained from the Az's were largely correlated with those from the confidence-level scores. However, the correlation varied slightly among different imaging conditions.
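An empirical Az can be computed directly from pooled confidence ratings; note that ROC software typically fits a binormal model, so the empirical value below is only an approximation. A sketch on invented ratings:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Empirical area under the ROC curve from pooled 5-point confidence
# ratings (invented). ROC software usually fits a binormal model for
# Az, so the empirical value is only an approximation.
truth = np.array([1, 1, 1, 0, 0, 1, 0, 0, 1, 0])   # 1 = MC present
rating = np.array([5, 4, 3, 2, 1, 4, 2, 3, 5, 1])  # reader confidence
print(f"empirical Az = {roc_auc_score(truth, rating):.2f}")
```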
A coach's political use of video-based feedback: a case study in elite-level academy soccer.
Booroff, Michael; Nelson, Lee; Potrac, Paul
2016-01-01
This paper examines the video-based pedagogical practices of Terry (pseudonym), a head coach of a professional junior academy squad. Data were collected through 6 in-depth, semi-structured interviews and 10 field observations of Terry's video-based coaching in situ. Three embracing categories were generated from the data. These demonstrated that Terry's video-based coaching was far from apolitical. Rather, Terry strategically used performance analysis technologies to help fulfil various objectives and outcomes that he understood to be expected of him within the club environment. Kelchtermans' micropolitical perspective, Callero's work addressing role and Groom et al.'s grounded theory were primarily utilised to make sense of Terry's perceptions and actions. The findings point to the value of developing contextually grounded understandings of coaches' uses of video-based performance analysis technology. Doing so could better prepare coaches for this aspect of their coaching practice.
Washko, George R; Criner, Gerald J; Mohsenifar, Zab; Sciurba, Frank C; Sharafkhaneh, Amir; Make, Barry J; Hoffman, Eric A; Reilly, John J
2008-06-01
Computed tomography-based indices of emphysematous lung destruction may highlight differences in disease pathogenesis and further enable the classification of subjects with Chronic Obstructive Pulmonary Disease (COPD). While there are multiple techniques that can be utilized for such radiographic analysis, there is very little published information comparing the performance of these methods in a clinical case series. Our objective was to examine several quantitative and semi-quantitative methods for assessing the burden of emphysema apparent on computed tomographic scans and to compare their ability to predict lung mechanics and function. Automated densitometric analysis was performed on 1094 computed tomographic scans collected upon enrollment into the National Emphysema Treatment Trial. Trained radiologists performed an additional visual grading of emphysema on high-resolution CT scans. Full pulmonary function test results were available for correlation, with a subset of subjects having additional measurements of lung static recoil. There was a wide range of emphysematous lung destruction apparent on the CT scans, and univariate correlations to measures of lung function were of modest strength. No single method of CT scan analysis clearly outperformed the rest of the group. Quantification of the burden of emphysematous lung destruction apparent on CT scan is a weak predictor of lung function and mechanics in severe COPD, with no uniformly superior method found for performing this analysis. CT-based quantification of emphysema may augment pulmonary function testing in the characterization of COPD by providing complementary phenotypic information.
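One widely used densitometric index is the percentage of lung voxels below -950 HU (%LAA-950). The trial used its own analysis software, so the following synthetic-volume sketch is only illustrative:

```python
import numpy as np

# Percent of lung voxels below -950 HU (%LAA-950), one common
# densitometric emphysema index; the trial used its own software, so
# this synthetic-volume sketch is only illustrative.
def laa_percent(hu_volume, lung_mask, threshold=-950.0):
    lung = hu_volume[lung_mask]
    return 100.0 * np.count_nonzero(lung < threshold) / lung.size

rng = np.random.default_rng(7)
hu = rng.normal(-870.0, 60.0, size=(64, 64, 64))  # stand-in CT volume (HU)
mask = np.ones(hu.shape, dtype=bool)              # stand-in lung segmentation
print(f"%LAA-950 = {laa_percent(hu, mask):.1f}%")
```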
Performance optimization of CO2 heat pump water heater
Nawaz, Kashif; Shen, Bo; Elatar, Ahmed; ...
2017-10-14
A preliminary analysis was conducted to analyze the performance of a heat pump water heater (HPWH) that uses CO2 as the refrigerant. A model to predict the performance was developed and calibrated based on the experimental data for an existing HPWH using a CO2 refrigerant. The calibrated model was then used to run a parametric analysis in which factors such as water supply temperature, water circulation rate, tank stratification, and condenser configuration were considered. The performance of a commercial CO2 system was compared with that of a similar system using R-134a as the refrigerant. It was found that CO2 HPWH performance was comparable to that of an R-134a HPWH, more so for a separated gas cooler configuration. For comparable performance, the compressor size and the tube-in-tube heat exchanger (condenser/gas cooler) size were compared for CO2- and R-134a-based systems. Finally, the impact of the water circulation rate on the water temperature stratification in the tank, an essential requirement for higher performance in CO2 HPWH systems, was also investigated.
An empirical analysis of thermal protective performance of fabrics used in protective clothing.
Mandal, Sumit; Song, Guowen
2014-10-01
Fabric-based protective clothing is widely used for the occupational safety of firefighters and industrial workers. The aim of this paper is to study the thermal protective performance provided by fabric systems and to propose an effective model for predicting that performance under various thermal exposures. Different fabric systems commonly used to manufacture thermal protective clothing were selected. Laboratory simulations of the various thermal exposures were created to evaluate the protective performance of the selected fabric systems in terms of the time required to generate second-degree burns. Through the characterization of the selected fabric systems under a particular thermal exposure, the various factors affecting performance were statistically analyzed. The key factors for a particular thermal exposure were identified based on t-test analysis. Using these key factors, performance-predictive multiple linear regression and artificial neural network (ANN) models were developed and compared. The identified best-fit ANN models provide a basic tool to study the thermal protective performance of a fabric. © The Author 2014. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
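The two model families compared above can be prototyped quickly. A sketch on synthetic stand-in fabric factors; the chosen predictors, ranges, and coefficients are assumptions, not the study's data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Comparing a multiple linear regression against a small ANN on
# synthetic fabric factors (thickness, weight, air permeability);
# predictors, ranges, and coefficients are assumptions.
rng = np.random.default_rng(5)
X = rng.uniform([0.5, 150.0, 20.0], [3.0, 450.0, 120.0], size=(96, 3))
y = 3.0 * X[:, 0] + 0.02 * X[:, 1] - 0.05 * X[:, 2] + rng.normal(0.0, 1.0, 96)

models = [("linear", LinearRegression()),
          ("ANN", make_pipeline(StandardScaler(),
                                MLPRegressor(hidden_layer_sizes=(16,),
                                             max_iter=5000, random_state=0)))]
for name, model in models:
    print(name, round(cross_val_score(model, X, y, cv=4, scoring="r2").mean(), 3))
```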
Identification of human operator performance models utilizing time series analysis
NASA Technical Reports Server (NTRS)
Holden, F. M.; Shinners, S. M.
1973-01-01
The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.
ERIC Educational Resources Information Center
McArdle, John J.; Paskus, Thomas S.; Boker, Steven M.
2013-01-01
This is an application of contemporary multilevel regression modeling to the prediction of academic performances of 1st-year college students. At a first level of analysis, the data come from N greater than 16,000 students who were college freshman in 1994-1995 and who were also participants in high-level college athletics. At a second level of…
A balanced perspective: using nonfinancial measures to assess financial performance.
Watkins, Ann L
2003-11-01
Assessments of hospitals' financial performance have traditionally been based exclusively on analysis of a concise set of key financial ratios. One study, however, demonstrates that analysis of a hospital's financial condition can be significantly enhanced with the addition of several nonfinancial measures, including case-mix adjusted admissions, case-mix adjusted admissions per full-time equivalent, and case-mix adjusted admissions per beds in service.
Improving Department of Defense Global Distribution Performance Through Network Analysis
2016-06-01
network performance increase. Subject terms: supply chain metrics, distribution networks, requisition shipping time, strategic distribution database. ... "peace and war" (p. 4). USTRANSCOM Metrics and Analysis Branch defines, develops, tracks, and maintains outcomes-based supply chain metrics to ... (2014a, p. 8). The Joint Staff defines a TDD standard as the maximum number of days the supply chain can take to deliver requisitioned materiel
Barua, Shaibal; Begum, Shahina; Ahmed, Mobyen Uddin
2015-01-01
Machine learning algorithms play an important role in computer science research. Recent advancements in sensor data collection in the clinical sciences have led to complex, heterogeneous data processing and analysis for patient diagnosis and prognosis. Diagnosis and treatment of patients based on manual analysis of these sensor data are difficult and time consuming. Therefore, the development of knowledge-based systems to support clinicians in decision-making is important. However, it is necessary to perform experimental work comparing the performance of different machine learning methods to help select the appropriate method for the specific characteristics of a data set. This paper compares the classification performance of three popular machine learning methods, i.e., case-based reasoning, neural networks and support vector machines, for diagnosing the stress of vehicle drivers using finger temperature and heart rate variability. The experimental results show that case-based reasoning outperforms the other two methods in terms of classification accuracy. Case-based reasoning achieved 80% and 86% accuracy in classifying stress using finger temperature and heart rate variability, respectively. By contrast, both the neural network and the support vector machine achieved less than 80% accuracy using both physiological signals.
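The three method families can be compared with a few lines of scikit-learn, using k-nearest neighbors as a simple stand-in for case-based retrieval; all data below are synthetic:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Comparing the three method families on synthetic stand-in features
# (e.g., finger-temperature slope and HRV statistics); k-NN serves as
# a simple stand-in for case-based retrieval.
rng = np.random.default_rng(11)
X = np.vstack([rng.normal(0.0, 1.0, (60, 4)),    # relaxed drivers
               rng.normal(1.2, 1.0, (60, 4))])   # stressed drivers
y = np.array([0] * 60 + [1] * 60)

classifiers = [("case-based (k-NN)", KNeighborsClassifier(5)),
               ("neural network", MLPClassifier((16,), max_iter=3000,
                                                random_state=0)),
               ("SVM", SVC())]
for name, clf in classifiers:
    model = make_pipeline(StandardScaler(), clf)
    print(name, round(cross_val_score(model, X, y, cv=5).mean(), 3))
```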
Azadeh, Ali; Sheikhalishahi, Mohammad
2014-01-01
Background A unique framework for performance optimization of generation companies (GENCOs) based on health, safety, environment, and ergonomics (HSEE) indicators is presented. Methods To rank this sector of industry, a combination of data envelopment analysis (DEA), principal component analysis (PCA), and Taguchi methods is used for all branches of GENCOs. These methods are applied in an integrated manner to measure GENCO performance. The preferred model among DEA, PCA, and Taguchi is selected based on sensitivity analysis and maximum correlation between rankings. To achieve the stated objectives, noise is introduced into the input data. Results The results show that Taguchi outperforms the other methods. Moreover, a comprehensive experiment is carried out to identify the most influential factor for ranking GENCOs. Conclusion The approach developed in this study could be used for continuous assessment and improvement of GENCOs' performance in supplying energy with respect to HSEE factors. The results of such studies would help managers better understand weak and strong points in terms of HSEE factors. PMID:26106505
NASA Astrophysics Data System (ADS)
Li, Xiang; Luo, Ming; Qiu, Ying; Alphones, Arokiaswami; Zhong, Wen-De; Yu, Changyuan; Yang, Qi
2018-02-01
In this paper, channel equalization techniques for coherent optical fiber transmission systems based on independent component analysis (ICA) are reviewed. The principle of ICA for blind source separation is introduced. The ICA based channel equalization after both single-mode fiber and few-mode fiber transmission for single-carrier and orthogonal frequency division multiplexing (OFDM) modulation formats are investigated, respectively. The performance comparisons with conventional channel equalization techniques are discussed.
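A minimal sketch of the blind-source-separation step at the heart of ICA-based equalization, using scikit-learn's FastICA on synthetic mixtures; the mixing matrix plays the role of the unknown channel, and all signals are toy stand-ins rather than optical-field data:

```python
# Hedged sketch: blind source separation with FastICA, the core idea behind
# ICA-based channel equalization. Two synthetic sources are mixed by an
# unknown matrix and then recovered (up to scale and permutation).
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 1, 2000)
s1 = np.sign(np.sin(2 * np.pi * 23 * t))      # BPSK-like source
s2 = np.sin(2 * np.pi * 7 * t)                # interfering tone
S = np.c_[s1, s2]
A = np.array([[1.0, 0.6], [0.4, 1.0]])        # unknown mixing ("channel")
X = S @ A.T                                   # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)                  # separated sources
print("estimated mixing matrix:\n", ica.mixing_)
```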
An Evidence-Based Videotaped Running Biomechanics Analysis.
Souza, Richard B
2016-02-01
Running biomechanics play an important role in the development of injuries. Performing a running biomechanics analysis on injured runners can help to develop treatment strategies. This article provides a framework for a systematic video-based running biomechanics analysis plan based on the current evidence on running injuries, using 2-dimensional (2D) video and readily available tools. Fourteen measurements are proposed in this analysis plan from lateral and posterior video. Identifying simple 2D surrogates for 3D biomechanical variables of interest allows for widespread translation of best practices and offers the best opportunity to impact the highly prevalent problem of the injured runner. Copyright © 2016 Elsevier Inc. All rights reserved.
Ho, Derek; Drake, Tyler K.; Bentley, Rex C.; Valea, Fidel A.; Wax, Adam
2015-01-01
We evaluate a new hybrid algorithm for determining nuclear morphology using angle-resolved low coherence interferometry (a/LCI) measurements in ex vivo cervical tissue. The algorithm combines Mie theory based and continuous wavelet transform inverse light scattering analysis. The hybrid algorithm was validated and compared to traditional Mie theory based analysis using an ex vivo tissue data set. The hybrid algorithm achieved 100% agreement with pathology in distinguishing dysplastic and non-dysplastic biopsy sites in the pilot study. Significantly, the new algorithm performed over four times faster than traditional Mie theory based analysis. PMID:26309741
Oral health status and academic performance among Ohio third-graders, 2009-2010.
Detty, Amber M R; Oza-Frank, Reena
2014-01-01
Although recent literature indicated an association between dental caries and poor academic performance, previous work relied on self-reported measures. This analysis sought to determine the association between academic performance and untreated dental caries (tooth decay) using objective measures, controlling for school-level characteristics. School-level untreated caries prevalence was estimated from a 2009-2010 oral health survey of Ohio third-graders. Prevalence estimates were combined with school-level academic performance and other school characteristics obtained from the Ohio Department of Education. Linear regression models were developed as a result of bivariate testing, and final models were stratified based upon the presence of a school-based dental sealant program (SBSP). Preliminary bivariate analysis indicated a significant relationship between untreated caries and academic performance, which was more pronounced at schools with an SBSP. After controlling for other school characteristics, the prevalence of untreated caries was found to be a significant predictor of academic performance at schools without an SBSP (P=0.001) but not at schools with an SBSP (P=0.833). The results suggest the association between untreated caries and academic performance may be affected by the presence of a school-based oral health program. Further research focused on oral health and academic performance should consider the presence and/or availability of these programs. © 2014 American Association of Public Health Dentistry.
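A hedged sketch of the stratified school-level regression described above, using statsmodels with synthetic data and illustrative variable names (not the Ohio survey data):

```python
# Hedged sketch: regress school-level academic performance on untreated
# caries prevalence, fitted separately by sealant-program (SBSP) status.
# All data and coefficients below are synthetic placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
caries = rng.uniform(0.05, 0.45, n)            # untreated caries prevalence
sbsp = rng.integers(0, 2, n)                   # 1 = school has a sealant program
score = 80 - 25 * caries * (1 - sbsp) + rng.normal(0, 4, n)

for flag, label in [(0, "no SBSP"), (1, "SBSP")]:
    mask = sbsp == flag
    X = sm.add_constant(caries[mask])          # intercept + prevalence
    fit = sm.OLS(score[mask], X).fit()
    print(label, "slope p-value:", round(fit.pvalues[1], 4))
```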
ERIC Educational Resources Information Center
Kubala, James Joseph
A quantitative and qualitative study examined three leadership strategies found in performance-based management (human resource, scientific management and political strategies used in public sector management); a framework by which performance measurement (PM) supports leadership strategies; and how the strategies impact PM. It examined leadership…
An Analysis of a High Performing School District's Culture
ERIC Educational Resources Information Center
Corum, Kenneth D.; Schuetz, Todd B.
2012-01-01
This report describes a problem based learning project focusing on the cultural elements of a high performing school district. Current literature on school district culture provides numerous cultural elements that are present in high performing school districts. With the current climate in education placing pressure on school districts to perform…
New integrated information system for pusan national university hospital.
Kim, Hyung Hoi; Cho, Kyung-Won; Kim, Hye Sook; Kim, Ju-Sim; Kim, Jung Hyun; Han, Sang Pil; Park, Chun Bok; Kim, Seok; Chae, Young Moon
2011-03-01
This study presents the information system for Pusan National University Hospital (PNUH), evaluates its performance qualitatively, and conducts an economic analysis. The information system for PNUH was designed by component-based development and built with internet technologies. An Order Communication System, Electronic Medical Record, and Clinical Decision Support System were newly developed. The performance of the hospital information system was qualitatively evaluated against a performance reference model in order to identify problem areas in the old system. The information economics approach was used to analyze the economic feasibility of the hospital information system and to account for intangible benefits. Average performance scores were 3.16 for the input layer, 3.35 for the process layer, and 3.57 for the business layer. In addition, the cumulative benefit-to-cost (B/C) ratio was 0.50 in 2011, 1.73 in 2012, 1.76 in 2013, 1.71 in 2014, and 1.71 in 2015; the B/C ratios steadily increase as value items are added. While overall performance scores were reasonably high, doctors were less satisfied with the system, perhaps due to its weak clinical functions. The information economics analysis demonstrated the economic profitability of the information system when all intangible benefits were included. A second qualitative evaluation survey and economic analysis were proposed to evaluate changes in the performance of the new system.
NASA Technical Reports Server (NTRS)
Kirlik, Alex; Kossack, Merrick Frank
1993-01-01
This status report consists of a thesis entitled 'Ecological Task Analysis: A Method for Display Enhancements.' Previous use of various analysis processes for display interface design or enhancement has run the risk of failing to improve user performance because the analysis yields only a sequential listing of user tasks. Adopting an ecological approach to the task analysis, however, may produce the modeling of an unpredictable and variable task domain required to improve user performance. Kirlik has proposed an Ecological Task Analysis framework designed for this purpose. The purpose of this research is to measure the framework's effectiveness at enhancing display interfaces in order to improve user performance. Following the proposed framework, an ecological task analysis of experienced users of a complex and dynamic laboratory task, Star Cruiser, was performed. Based on this analysis, display enhancements were proposed and implemented. An experiment was then conducted to compare this new version of Star Cruiser to the original. By measuring user performance on different tasks, it was determined that during early sessions, use of the enhanced display contributed to better user performance than that achieved using the original display. Furthermore, the results indicate that the enhancements proposed as a result of the ecological task analysis affected user performance differently depending on whether they aid in the selection of a possible action or in the performance of an action. Generalizations of these findings to larger, more complex systems were avoided since the analysis was performed on only this one particular system.
Optimizing construction quality management of pavements using mechanistic performance analysis.
DOT National Transportation Integrated Search
2004-08-01
This report presents a statistical-based algorithm that was developed to reconcile the results from several pavement performance models used in the state of practice with systematic process control techniques. These algorithms identify project-specif...
Voxel-Based Morphometry ALE meta-analysis of Bipolar Disorder
NASA Astrophysics Data System (ADS)
Magana, Omar; Laird, Robert
2012-03-01
A meta-analysis was performed independently to view the changes in gray matter (GM) in patients with bipolar disorder (BP). The meta-analysis was conducted in Talairach space using GingerALE to determine the voxels and their permutation. For data acquisition, published experiments and similar research studies were uploaded onto the online voxel-based morphometry (VBM) database, and coordinates of activation locations were extracted from bipolar disorder related journals utilizing Sleuth. Once the coordinates of the given experiments were selected and imported to GingerALE, Gaussian smoothing was applied to all foci to create the concentration map of GM in BP patients. The results included volume reductions and variations of GM between normal healthy controls and patients with bipolar disorder. Significant GM clusters were obtained for normal healthy controls relative to BP patients in the right precentral gyrus, right anterior cingulate, and the left inferior frontal gyrus. In future research, more published journals could be uploaded onto the database and another VBM meta-analysis could be performed including more activation coordinates or a variation of age groups.
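A minimal sketch of the activation likelihood estimation (ALE) computation described above: each reported focus becomes a 3-D Gaussian "modeled activation" map, maps are combined within an experiment and then across experiments by a probabilistic union. Grid size, smoothing width, and foci are toy values, not GingerALE's actual implementation:

```python
# Hedged sketch of the core ALE step on a toy grid.
import numpy as np

shape = (20, 20, 20)                      # toy brain grid
sigma = 2.0                               # derived from a chosen FWHM (assumption)
grid = np.indices(shape).reshape(3, -1).T

def focus_map(focus):
    # 3-D Gaussian centered on one reported activation focus
    d2 = ((grid - focus) ** 2).sum(axis=1)
    return np.exp(-d2 / (2 * sigma**2)).reshape(shape)

experiments = [[(5, 5, 5), (10, 12, 9)], [(6, 5, 5)]]   # foci per study (toy)
ale = np.ones(shape)
for foci in experiments:
    # modeled-activation map: probabilistic union of this study's foci
    ma = 1 - np.prod([1 - focus_map(f) for f in foci], axis=0)
    ale *= 1 - ma
ale = 1 - ale                             # union across experiments
print("peak ALE value:", ale.max().round(3))
```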
NASA Technical Reports Server (NTRS)
Andrews, E. H., Jr.; Mackley, E. A.
1976-01-01
An aerodynamic engine inlet analysis was performed on the experimental results obtained at nominal Mach numbers of 5, 6, and 7 from the NASA Hypersonic Research Engine (HRE) Aerothermodynamic Integration Model (AIM). Incorporation on the AIM of the mixed-compression inlet design represented the final phase of an inlet development program of the HRE Project. The purpose of this analysis was to compare the AIM inlet experimental results with theoretical results. Experimental performance was based on measured surface pressures used in a one-dimensional force-momentum theorem. Results of the analysis indicate that surface static-pressure measurements agree reasonably well with theoretical predictions except in the regions where the theory predicts large pressure discontinuities. Experimental and theoretical results both based on the one-dimensional force-momentum theorem yielded inlet performance parameters as functions of Mach number that exhibited reasonable agreement. Previous predictions of inlet unstart that resulted from pressure disturbances created by fuel injection and combustion appeared to be pessimistic.
NASA Astrophysics Data System (ADS)
Ohlídal, Ivan; Vohánka, Jiří; Čermák, Martin; Franta, Daniel
2017-10-01
The modification of the effective medium approximation for randomly microrough surfaces covered by very thin overlayers, based on inhomogeneous fictitious layers, is formulated. The numerical analysis of this modification is performed using simulated ellipsometric data calculated with the Rayleigh-Rice theory. The system used to perform this numerical analysis consists of a randomly microrough silicon single crystal surface covered with a SiO2 overlayer. A comparison to the effective medium approximation based on homogeneous fictitious layers is carried out within this numerical analysis. For ellipsometry of the system mentioned above, the possibilities and limitations of both effective medium approximation approaches are discussed. The results obtained by means of the numerical analysis are confirmed by the ellipsometric characterization of two randomly microrough silicon single crystal substrates covered with native oxide overlayers. It is shown that the effective medium approximation approaches for this system exhibit strong deficiencies compared to the Rayleigh-Rice theory. The practical consequences implied by these results are presented. The results concerning the random microroughness are verified by means of measurements performed using atomic force microscopy.
Evaluation of Contamination Inspection and Analysis Methods through Modeling System Performance
NASA Technical Reports Server (NTRS)
Seasly, Elaine; Dever, Jason; Stuban, Steven M. F.
2016-01-01
Contamination is usually identified as a risk on the risk register for sensitive space systems hardware. Despite detailed, time-consuming, and costly contamination control efforts during assembly, integration, and test of space systems, contaminants are still found during visual inspections of hardware. Improved methods are needed to gather information during systems integration to catch potential contamination issues earlier and manage contamination risks better. This research explores evaluation of contamination inspection and analysis methods to determine optical system sensitivity to minimum detectable molecular contamination levels based on IEST-STD-CC1246E non-volatile residue (NVR) cleanliness levels. Potential future degradation of the system is modeled given chosen modules representative of optical elements in an optical system, minimum detectable molecular contamination levels for a chosen inspection and analysis method, and determining the effect of contamination on the system. By modeling system performance based on when molecular contamination is detected during systems integration and at what cleanliness level, the decision maker can perform trades amongst different inspection and analysis methods and determine if a planned method is adequate to meet system requirements and manage contamination risk.
Hamada, Motoharu; Doisaki, Sayoko; Okuno, Yusuke; Muramatsu, Hideki; Hama, Asahito; Kawashima, Nozomu; Narita, Atsushi; Nishio, Nobuhiro; Yoshida, Kenichi; Kanno, Hitoshi; Manabe, Atsushi; Taga, Takashi; Takahashi, Yoshiyuki; Miyano, Satoru; Ogawa, Seishi; Kojima, Seiji
2018-06-23
Congenital dyserythropoietic anemia (CDA) is a heterogeneous group of rare congenital disorders characterized by ineffective erythropoiesis and dysplastic changes in erythroblasts. Diagnosis of CDA is based primarily on the morphology of bone marrow erythroblasts; however, genetic tests have recently become more important. Here, we performed genetic analysis of 10 Japanese patients who had been diagnosed with CDA based on laboratory findings and morphological characteristics. We examined 10 CDA patients via central review of bone marrow morphology and genetic analysis for congenital bone marrow failure syndromes. Sanger sequencing for CDAN1, SEC23B, and KLF1 was performed for all patients. We performed whole-exome sequencing in patients without mutation in these genes. Three patients carried pathogenic CDAN1 mutations, whereas no SEC23B mutations were identified in our cohort. WES unexpectedly identified gene mutations known to cause congenital hemolytic anemia in two patients: canonical G6PD p.Val394Leu mutation and SPTA1 p.Arg28His mutation. Comprehensive genetic analysis is warranted for more effective diagnosis of patients with suspected CDA.
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
User's Guide for a Modular Flutter Analysis Software System (Fast Version 1.0)
NASA Technical Reports Server (NTRS)
Desmarais, R. N.; Bennett, R. M.
1978-01-01
The use and operation of a group of computer programs to perform a flutter analysis of a single planar wing are described. This system of programs is called FAST for Flutter Analysis System, and consists of five programs. Each program performs certain portions of a flutter analysis and can be run sequentially as a job step or individually. FAST uses natural vibration modes as input data and performs a conventional V-g type of solution. The unsteady aerodynamics programs in FAST are based on the subsonic kernel function lifting-surface theory although other aerodynamic programs can be used. Application of the programs is illustrated by a sample case of a complete flutter calculation that exercises each program.
Overlapped Partitioning for Ensemble Classifiers of P300-Based Brain-Computer Interfaces
Onishi, Akinari; Natsume, Kiyohisa
2014-01-01
A P300-based brain-computer interface (BCI) enables a wide range of people to control devices that improve their quality of life. Ensemble classifiers with naive partitioning were recently applied to the P300-based BCI and these classification performances were assessed. However, they were usually trained on a large amount of training data (e.g., 15300). In this study, we evaluated ensemble linear discriminant analysis (LDA) classifiers with a newly proposed overlapped partitioning method using 900 training data. In addition, the classification performances of the ensemble classifier with naive partitioning and a single LDA classifier were compared. One of three conditions for dimension reduction was applied: the stepwise method, principal component analysis (PCA), or none. The results show that an ensemble stepwise LDA (SWLDA) classifier with overlapped partitioning achieved a better performance than the commonly used single SWLDA classifier and an ensemble SWLDA classifier with naive partitioning. This result implies that the performance of the SWLDA is improved by overlapped partitioning and the ensemble classifier with overlapped partitioning requires less training data than that with naive partitioning. This study contributes towards reducing the required amount of training data and achieving better classification performance. PMID:24695550
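A hedged sketch of the overlapped-partitioning idea on synthetic data: several LDA classifiers are trained on overlapping slices of a limited training set and their decision scores averaged. The partition count, overlap fraction, and data are illustrative assumptions, not the paper's P300 pipeline:

```python
# Hedged sketch: ensemble LDA with overlapping training partitions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(900, 30))                    # 900 training samples (toy)
y = rng.integers(0, 2, 900)
X[y == 1] += 0.3                                  # weak class separation

n_parts, overlap = 5, 0.5                         # assumed partitioning scheme
size = int(len(X) / (n_parts - (n_parts - 1) * overlap))   # window length
step = int(size * (1 - overlap))                  # shift between windows

ensemble = []
for k in range(n_parts):
    idx = slice(k * step, k * step + size)        # overlapping window k
    ensemble.append(LinearDiscriminantAnalysis().fit(X[idx], y[idx]))

X_test = rng.normal(size=(100, 30))
score = np.mean([m.decision_function(X_test) for m in ensemble], axis=0)
pred = (score > 0).astype(int)                    # vote by averaged score
print(pred[:10])
```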
Preisig, James C
2005-07-01
Equations are derived for analyzing the performance of channel estimate based equalizers. The performance is characterized in terms of the mean squared soft decision error (σ_s²) of each equalizer. This error is decomposed into two components: the minimum achievable error (σ_0²) and the excess error (σ_e²). The former is the soft decision error that would be realized by the equalizer if the filter coefficient calculation were based upon perfect knowledge of the channel impulse response and the statistics of the interfering noise field. The latter is the additional soft decision error that is realized due to errors in the estimates of these channel parameters. These expressions accurately predict the equalizer errors observed in the processing of experimental data by a channel estimate based decision feedback equalizer (DFE) and a passive time-reversal equalizer. Further expressions are presented that allow equalizer performance to be predicted given the scattering function of the acoustic channel. The analysis using these expressions yields insights into the features of surface scattering that most significantly impact equalizer performance in shallow water environments and motivates the implementation of a DFE that is robust with respect to channel estimation errors.
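Assuming the decomposition is additive, as the wording suggests, the error budget can be written compactly; a sketch in LaTeX with the symbols as defined in the abstract:

```latex
% Total soft decision error of a channel-estimate-based equalizer:
% minimum achievable error plus excess error due to channel-parameter
% estimation errors (additivity assumed from the abstract's wording).
\sigma_s^2 = \sigma_0^2 + \sigma_e^2
```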
Watson, Nathanial E; Parsons, Brendon A; Synovec, Robert E
2016-08-12
Performance of tile-based Fisher Ratio (F-ratio) data analysis, recently developed for discovery-based studies using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC×GC-TOFMS), is evaluated with a metabolomics dataset that had been previously analyzed in great detail, but while taking a brute force approach. The previously analyzed data (referred to herein as the benchmark dataset) were intracellular extracts from Saccharomyces cerevisiae (yeast), either metabolizing glucose (repressed) or ethanol (derepressed), which define the two classes in the discovery-based analysis to find metabolites that are statistically different in concentration between the two classes. Beneficially, this previously analyzed dataset provides a concrete means to validate the tile-based F-ratio software. Herein, we demonstrate and validate the significant benefits of applying tile-based F-ratio analysis. The yeast metabolomics data are analyzed more rapidly in about one week versus one year for the prior studies with this dataset. Furthermore, a null distribution analysis is implemented to statistically determine an adequate F-ratio threshold, whereby the variables with F-ratio values below the threshold can be ignored as not class distinguishing, which provides the analyst with confidence when analyzing the hit table. Forty-six of the fifty-four benchmarked changing metabolites were discovered by the new methodology while consistently excluding all but one of the benchmarked nineteen false positive metabolites previously identified. Copyright © 2016 Elsevier B.V. All rights reserved.
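A minimal sketch of the two ideas named above: a per-tile F-ratio (between-class over within-class variance) and a permutation-based null distribution used to set the significance threshold. Shapes, class sizes, and the 99th-percentile cutoff are illustrative assumptions, not the published software's defaults:

```python
# Hedged sketch: tile-based F-ratio analysis with a permutation null.
import numpy as np

rng = np.random.default_rng(0)
tiles = rng.normal(size=(24, 500))          # 24 runs x 500 tile signals (toy)
labels = np.array([0] * 12 + [1] * 12)      # two classes (e.g., repressed/derepressed)

def f_ratio(x, y):
    # between-class variance over mean within-class variance, per tile
    means = np.array([x[y == c].mean(axis=0) for c in (0, 1)])
    between = means.var(axis=0)
    within = np.mean([x[y == c].var(axis=0) for c in (0, 1)], axis=0)
    return between / within

obs = f_ratio(tiles, labels)
null = np.concatenate(                      # null distribution: permuted labels
    [f_ratio(tiles, rng.permutation(labels)) for _ in range(200)]
)
threshold = np.percentile(null, 99)         # cutoff choice is an assumption
print("tiles above null threshold:", int((obs > threshold).sum()))
```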
Fetterman, J. Gregor; Killeen, Peter R.; Hall, Scott
2008-01-01
Four rats and four pigeons were monitored while performing retrospective timing tasks. All animals displayed collateral behaviors which could have mediated their temporal judgements. Statistical analysis made a good case for such mediation in the case of two pigeons performing on a spatially-differentiated response, but not for the two responding on a color-differentiated response. For the rats, all of which performed on a spatially-differentiated task, prediction of their temporal judgements was always better if based on collateral activity than if based on the passage of time. PMID:19701487
ERIC Educational Resources Information Center
Bessemer, David W.; Shrage, Jules H.
Recommendations for an alternative plan, based on typological analysis techniques, for the evaluation of student characteristics related to media, presentation design, and academic performance are presented. Difficulties with present evaluation plans are discussed, and different methods of typological analysis are described. Included are…
NASA Technical Reports Server (NTRS)
Bender, Robert L.; Reardon, John E.; Prendergast, Maurice J.; Schmitz, Craig P.; Brown, John R.
1992-01-01
A preliminary analysis of National Launch System (NLS) ascent plume induced base heating environments has been completed to support the Induced Environments Panel's objective of assisting in maturing the NLS vehicle (1.5 stage and heavy lift launch vehicle) designs. Environments during ascent have been determined from this analysis for a few selected locations on the engine nozzles and base heat shield for both vehicles. The environments reflect early summer 1991 configurations and performance data and conservative methodology. A more complete and thorough analysis is under way to update these environments for the cycle 1 review in January 1992.
Basic gait analysis based on continuous wave radar.
Zhang, Jun
2012-09-01
A gait analysis method based on continuous wave (CW) radar is proposed in this paper. Time-frequency analysis is used to analyze the radar micro-Doppler echo from walking humans, and the relationships between the time-frequency spectrogram and human biological gait are discussed. The methods for extracting gait parameters from the spectrogram are studied in depth, and experiments on more than twenty subjects have been performed to acquire the radar gait data. The gait parameters are calculated and compared. Gait differences between men and women are presented based on the experimental data and extracted features. Gait analysis based on CW radar will provide a new method for clinical diagnosis and therapy. Copyright © 2012 Elsevier B.V. All rights reserved.
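A hedged sketch of the time-frequency step: a synthetic CW-radar return with a torso Doppler line plus a sinusoidally modulated limb component, analyzed with a short-time Fourier spectrogram via SciPy. The carrier shift, cadence, and amplitudes are toy values:

```python
# Hedged sketch: micro-Doppler spectrogram of a synthetic walking-human echo.
import numpy as np
from scipy.signal import spectrogram

fs = 2000.0
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
torso = np.exp(1j * 2 * np.pi * 120 * t)          # bulk body Doppler line
limb = np.exp(1j * 2 * np.pi * (120 * t + 15 * np.sin(2 * np.pi * 1.8 * t)))
noise = 0.05 * (rng.normal(size=t.size) + 1j * rng.normal(size=t.size))
echo = torso + 0.4 * limb + noise                 # synthetic radar return

# two-sided spectrogram, since the complex echo carries signed Doppler
f, frames, Sxx = spectrogram(echo, fs=fs, nperseg=256, noverlap=192,
                             return_onesided=False)
print("dominant Doppler bin per frame:", np.abs(Sxx).argmax(axis=0)[:8])
```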
NASA Astrophysics Data System (ADS)
Li, Jin; Qiu, Zhiling; Hu, Leilei
2018-04-01
Inverter-based regenerative braking power utilization devices can re-use regenerative energy, thereby reducing the energy consumption of urban rail transit. In this paper, the power absorption principle of the inverter-based device is introduced, and the key influencing factors of energy-saving performance are then analyzed based on the absorption model. Field operation data verified that the control DC voltage plays an important role: a lower control DC voltage yields greater energy savings. One year of energy-saving performance data from an inverter-based re-utilization device on the Nanjing S8 line is also provided; more than 1.2 million kWh of energy was recovered over the year of operation.
Roetzheim, Richard G; Freund, Karen M; Corle, Don K; Murray, David M; Snyder, Frederick R; Kronman, Andrea C; Jean-Pierre, Pascal; Raich, Peter C; Holden, Alan Ec; Darnell, Julie S; Warren-Mears, Victoria; Patierno, Steven
2012-04-01
The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, with similar clinical criteria but with different study designs. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from the members of the PNRP Design and Analysis Committee. To review possible methodologies for analyzing combined data arising from heterogeneous study designs. The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. The conclusions were based on simple consensus. The five approaches reviewed included the following: (1) analyzing and reporting each project separately, (2) combining data from all projects and performing an individual-level analysis, (3) pooling data from projects having similar study designs, (4) analyzing pooled data using a prospective meta-analytic technique, and (5) analyzing pooled data utilizing a novel simulated group-randomized design. Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and to accommodate differing project sample sizes. The conclusions reached were based on expert opinion and not derived from actual analyses performed. The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multisite community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become more salient. Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs.
ERIC Educational Resources Information Center
Laracy, Seth D.; Hojnoski, Robin L.; Dever, Bridget V.
2016-01-01
Receiver operating characteristic curve (ROC) analysis was used to investigate the ability of early numeracy curriculum-based measures (EN-CBM) administered in preschool to predict performance below the 25th and 40th percentiles on a quantity discrimination measure in kindergarten. Areas under the curve derived from a sample of 279 students ranged…
Using a bead-based method for multiplexed analysis of community DNA, the dynamics of aquatic microbial communities can be assessed. Capture probes, specific for a genus or species of bacteria, are attached to the surface of uniquely labeled, microscopic polystyrene beads. Primers...
NASA Astrophysics Data System (ADS)
Cheng, Liangliang; Busca, Giorgio; Cigada, Alfredo
2017-07-01
Modal analysis is commonly considered an effective tool for obtaining the intrinsic characteristics of structures, including natural frequencies, modal damping ratios, and mode shapes, which are significant indicators for monitoring the health status of engineering structures. The complex mode indicator function (CMIF) can be regarded as an effective numerical tool for performing modal analysis. In this paper, experimental strain modal analysis based on the CMIF is introduced. Moreover, a distributed fiber-optic sensor, as a dense measuring device, has been applied to acquire strain data along a beam surface. Thanks to the dense spatial resolution of the distributed fiber optics, more detailed mode shapes could be obtained. In order to test the effectiveness of the method, a mass lump, considered as a linear damage component, has been attached to the surface of the beam, and damage detection based on strain mode shapes has been carried out. The results show that strain modal parameters can be estimated effectively by utilizing the CMIF, based on the corresponding simulations and experiments. Furthermore, damage detection based on strain mode shapes benefits from the accuracy of strain mode shape recognition and the excellent performance of the distributed fiber optics.
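A minimal sketch of the CMIF computation, assuming the standard definition (the squared singular values of the frequency response function matrix at each spectral line); the FRF data here are synthetic, not measurements from the fiber-optic sensor:

```python
# Hedged sketch: CMIF curves from a synthetic FRF array.
import numpy as np

n_freq, n_out, n_in = 400, 30, 2        # spectral lines, outputs, references
rng = np.random.default_rng(0)
frf = (rng.normal(size=(n_freq, n_out, n_in))
       + 1j * rng.normal(size=(n_freq, n_out, n_in)))   # placeholder FRFs

# CMIF at each frequency line: squared singular values of the FRF matrix
cmif = np.array([np.linalg.svd(h, compute_uv=False) ** 2 for h in frf])
peaks = cmif[:, 0]                      # first CMIF curve; its peaks indicate modes
print("frequency line with largest first-CMIF value:", int(peaks.argmax()))
```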
A Biosequence-based Approach to Software Characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oehmen, Christopher S.; Peterson, Elena S.; Phillips, Aaron R.
For many applications, it is desirable to have some process for recognizing when software binaries are closely related without relying on them to be identical or have identical segments. Some examples include monitoring utilization of high performance computing centers or service clouds, detecting freeware in licensed code, and enforcing application whitelists. But doing so in a dynamic environment is a nontrivial task because most approaches to software similarity require extensive and time-consuming analysis of a binary, or they fail to recognize executables that are similar but nonidentical. Presented herein is a novel biosequence-based method for quantifying similarity of executable binaries. Using this method, it is shown in an example application on large-scale multi-author codes that 1) the biosequence-based method has a statistical performance in recognizing and distinguishing between a collection of real-world high performance computing applications better than 90% of ideal; and 2) an example of using family tree analysis to tune identification for a code subfamily can achieve better than 99% of ideal performance.
DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis
NASA Astrophysics Data System (ADS)
Pernigotti, D.; Belis, C. A.
2018-05-01
DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in this kind of study: the assignment of a factor to a source in factor analytical models (source identification) and the evaluation of model performance. The source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profile and time series similarity. In this study, a sensitivity analysis of the model performance criteria is accomplished using the results of a synthetic dataset where "a priori" references are available. The consensus-modulated standard deviation p_unc gives the best choice for the model performance evaluation when a conservative approach is adopted.
GOATS Image Projection Component
NASA Technical Reports Server (NTRS)
Haber, Benjamin M.; Green, Joseph J.
2011-01-01
When doing mission analysis and design of an imaging system in orbit around the Earth, answering the fundamental question of imaging performance requires an understanding of the image products that will be produced by the imaging system. GOATS comprises a series of MATLAB functions for geometric image projections. Unique features of the software include function modularity, a standard MATLAB interface, easy-to-understand first-principles-based analysis, and the ability to perform geometric image projections of framing type imaging systems. The software modules are created for maximum analysis utility, and can all be used independently for many varied analysis tasks, or used in conjunction with other orbit analysis tools.
Taiwo, Oluwadamilola O; Finegan, Donal P; Eastwood, David S; Fife, Julie L; Brown, Leon D; Darr, Jawwad A; Lee, Peter D; Brett, Daniel J L; Shearing, Paul R
2016-09-01
Lithium-ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium-ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3-D imaging techniques, quantitative assessment of 3-D microstructures from 2-D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two-dimensional (2-D) data sets. In this study, stereological prediction and three-dimensional (3-D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium-ion battery electrodes were imaged using synchrotron-based X-ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2-D image sections generated from tomographic imaging, whereas direct 3-D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2-D image sections is bound to be associated with ambiguity and that volume-based 3-D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially-dependent parameters, such as tortuosity and pore-phase connectivity. © 2016 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of Royal Microscopical Society.
Error rate information in attention allocation pilot models
NASA Technical Reports Server (NTRS)
Faulkner, W. H.; Onstott, E. D.
1977-01-01
The Northrop urgency decision pilot model was used in a command tracking task to compare the optimized performance of multiaxis attention allocation pilot models whose urgency functions were (1) based on tracking error alone, and (2) based on both tracking error and error rate. A matrix of system dynamics and command inputs was employed, to create both symmetric and asymmetric two axis compensatory tracking tasks. All tasks were single loop on each axis. Analysis showed that a model that allocates control attention through nonlinear urgency functions using only error information could not achieve performance of the full model whose attention shifting algorithm included both error and error rate terms. Subsequent to this analysis, tracking performance predictions for the full model were verified by piloted flight simulation. Complete model and simulation data are presented.
Base fluid in improving heat transfer for EV car battery
NASA Astrophysics Data System (ADS)
Bin-Abdun, Nazih A.; Razlan, Zuradzman M.; Shahriman, A. B.; Wan, Khairunizam; Hazry, D.; Ahmed, S. Faiz; Adnan, Nazrul H.; Heng, R.; Kamarudin, H.; Zunaidi, I.
2015-05-01
This study examined the effect of the base fluid (as coolant) channeled inside the heat exchanger on increasing the thermal conductivity between the EV car battery and the heat exchanger. The analysis showed that a secondary cooling system using water has advantages in improving the heat transfer process and reducing the electric power lost as thermal energy from the batteries. This increases the efficiency of the EV car battery and is thus also positively reflected in the performance of the EV car. In the present work, an analysis is performed to assess the design and use of the heat exchanger in increasing the performance efficiency of the EV car battery. This provides a preface to the use of this design with nanofluids, which further increase and improve heat transfer.
Fang, Xiang; Li, Ning-qiu; Fu, Xiao-zhe; Li, Kai-bin; Lin, Qiang; Liu, Li-hui; Shi, Cun-bin; Wu, Shu-qin
2015-07-01
As a key component of life science, bioinformatics has been widely applied in genomics, transcriptomics, and proteomics. However, the requirement of high-performance computers, rather than common personal computers, for constructing a bioinformatics platform has significantly limited the application of bioinformatics in aquatic science. In this study, we constructed a bioinformatic analysis platform for aquatic pathogens based on the MilkyWay-2 supercomputer. The platform consists of three functional modules, including genomic and transcriptomic sequencing data analysis, protein structure prediction, and molecular dynamics simulations. To validate the practicability of the platform, we performed bioinformatic analysis on aquatic pathogenic organisms. For example, genes of Flavobacterium johnsoniae M168 were identified and annotated via Blast searches, GO and InterPro annotations. Protein structural models for five small segments of grass carp reovirus HZ-08 were constructed by homology modeling. Molecular dynamics simulations were performed on outer membrane protein A of Aeromonas hydrophila, and the changes in system temperature, total energy, root mean square deviation, and conformation of the loops during equilibration were observed. These results show that a bioinformatic analysis platform for aquatic pathogens has been successfully built on the MilkyWay-2 supercomputer. This study will provide insights into the construction of bioinformatic analysis platforms for other subjects.
International Space Station Configuration Analysis and Integration
NASA Technical Reports Server (NTRS)
Anchondo, Rebekah
2016-01-01
Ambitious engineering projects, such as NASA's International Space Station (ISS), require dependable modeling, analysis, visualization, and robotics to ensure that complex mission strategies are carried out cost effectively, sustainably, and safely. Learn how Booz Allen Hamilton's Modeling, Analysis, Visualization, and Robotics Integration Center (MAVRIC) team performs engineering analysis of the ISS Configuration based primarily on the use of 3D CAD models. To support mission planning and execution, the team tracks the configuration of ISS and maintains configuration requirements to ensure operational goals are met. The MAVRIC team performs multi-disciplinary integration and trade studies to ensure future configurations meet stakeholder needs.
NASA Astrophysics Data System (ADS)
Kim, Jeong-Man; Koo, Min-Mo; Jeong, Jae-Hoon; Hong, Keyyong; Cho, Il-Hyoung; Choi, Jang-Young
2017-05-01
This paper reports the design and analysis of a tubular permanent magnet linear generator (TPMLG) for a small-scale wave-energy converter (WEC). The analytical field computation is performed by applying a magnetic vector potential and a 2-D analytical model to determine design parameters. Based on the analytical solutions, parametric analysis is performed to meet the design specifications of the WEC. Then, 2-D finite element analysis (FEA) is employed to validate the analytical method. Finally, experimental results confirm the predictions of the analytical and FEA methods under regular and irregular wave conditions.
Overheating Anomalies during Flight Test Due to the Base Bleeding
NASA Technical Reports Server (NTRS)
Luchinsky, Dmitry; Hafiychuck, Halyna; Osipov, Slava; Ponizhovskaya, Ekaterina; Smelyanskiy, Vadim; Dagostino, Mark; Canabal, Francisco; Mobley, Brandon L.
2012-01-01
In this paper we present the results of analytical and numerical studies of the plume interaction with the base flow in the presence of base out-gassing. The physics-based analysis and CFD modeling of the base heating for a single solid rocket motor performed in this research addressed the following questions: what are the key factors making the base flow so different from that in the Shuttle [1]; why CFD analysis of this problem reveals small plume recirculation; what major factors influence base temperature; and why overheating was initiated at a given time in the flight. To answer these questions, topological analysis of the base flow was performed and Korst theory was used to estimate the relative contributions of radiation, plume recirculation, and chemically reactive out-gassing to the base heating. It was shown that base bleeding and the small base volume are the key factors contributing to the overheating, while plume recirculation is effectively suppressed by the asymmetric configuration of the flow formed earlier in the flight. These findings are further verified using CFD simulations that include a multi-species gas environment both in the plume and in the base. Solid particles in the exhaust plume (Al2O3) and char particles in the base bleeding were also included in the simulations, and their relative contributions to the base temperature rise were estimated. The results of the simulations are in good agreement with the base temperature and pressure measured during the test.
EBprot: Statistical analysis of labeling-based quantitative proteomics data.
Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon
2015-08-01
Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristics and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Li, Der-Chiang; Liu, Chiao-Wen; Hu, Susan C
2011-05-01
Medical data sets are usually small and have very high dimensionality. Too many attributes will make the analysis less efficient and will not necessarily increase accuracy, while too few data will decrease modeling stability. Consequently, the main objective of this study is to extract the optimal subset of features to increase analytical performance when the data set is small. This paper proposes a fuzzy-based non-linear transformation method to extend classification-related information from the original data attribute values for a small data set. Based on the new transformed data set, this study applies principal component analysis (PCA) to extract the optimal subset of features. Finally, we use the transformed data with these optimal features as the input data for a learning tool, a support vector machine (SVM). Six medical data sets (Pima Indians' diabetes, Wisconsin diagnostic breast cancer, Parkinson disease, echocardiogram, the BUPA liver disorders dataset, and bladder cancer cases in Taiwan) are employed to illustrate the approach presented in this paper. This research uses the t-test to evaluate classification accuracy for a single data set and the Friedman test to show that the proposed method is better than other methods over multiple data sets. The experimental results indicate that the proposed method has better classification performance than either PCA or kernel principal component analysis (KPCA) when the data set is small, and suggest creating new purpose-related information to improve analysis performance. This paper has shown that feature extraction is important as a function of feature selection for efficient data analysis. When the data set is small, using the fuzzy-based transformation method presented in this work to increase the information available produces better results than the PCA and KPCA approaches. Copyright © 2011 Elsevier B.V. All rights reserved.
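A hedged sketch of the pipeline's shape: expand attributes with simple triangular membership functions (a stand-in for the authors' fuzzy-based non-linear transformation, whose exact construction is not reproduced here), reduce with PCA, and classify with an SVM:

```python
# Hedged sketch: fuzzy-style feature expansion -> PCA -> SVM on toy data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def fuzzy_expand(X):
    # Triangular memberships anchored at per-column percentiles (assumption,
    # not the paper's exact transformation); originals are kept alongside.
    lo, mid, hi = np.percentile(X, [10, 50, 90], axis=0)
    def tri(x, a, b, c):
        return np.clip(np.minimum((x - a) / (b - a + 1e-9),
                                  (c - x) / (c - b + 1e-9)), 0, 1)
    return np.hstack([tri(X, lo, mid, hi), X])

rng = np.random.default_rng(0)
X = rng.normal(size=(90, 12))                     # small medical-style data set
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 0.3).astype(int)

pipe = make_pipeline(StandardScaler(), PCA(n_components=5), SVC())
print("CV accuracy:",
      cross_val_score(pipe, fuzzy_expand(X), y, cv=5).mean().round(2))
```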
Design Criteria for Adaptive Roadway Lighting
DOT National Transportation Integrated Search
2014-07-01
This report provides the background and analysis used to develop criteria for the implementation of an adaptive lighting system for roadway lighting. Based on the analysis of crashes and lighting performance, a series of criteria and the associated d...
NASA Technical Reports Server (NTRS)
Taylor, Arthur C., III; Hou, Gene W.; Korivi, Vamshi M.
1991-01-01
A gradient-based design optimization strategy for practical aerodynamic design applications is presented, which uses the 2D thin-layer Navier-Stokes equations. The strategy is based on the classic idea of constructing different modules for performing the major tasks such as function evaluation, function approximation and sensitivity analysis, mesh regeneration, and grid sensitivity analysis, all driven and controlled by a general-purpose design optimization program. The accuracy of aerodynamic shape sensitivity derivatives is validated on two viscous test problems: internal flow through a double-throat nozzle and external flow over a NACA 4-digit airfoil. A significant improvement in aerodynamic performance has been achieved in both cases. Particular attention is given to a consistent treatment of the boundary conditions in the calculation of the aerodynamic sensitivity derivatives for the classic problems of external flow over an isolated lifting airfoil on 'C' or 'O' meshes.
Plenis, Alina; Olędzka, Ilona; Bączek, Tomasz
2013-05-05
This paper focuses on a comparative study of the column classification system based on quantitative structure-retention relationships (the QSRR method) and column performance in real biomedical analysis. The assay was carried out for the LC separation of moclobemide and its metabolites in human plasma, using a set of 24 stationary phases. The QSRR models established for the studied stationary phases were compared with the column test performance results using two chemometric techniques: principal component analysis (PCA) and hierarchical clustering analysis (HCA). The study confirmed that the stationary phase classes found closely related by the QSRR approach yielded comparable separation of moclobemide and its metabolites. Therefore, the QSRR method could be considered supportive in the selection of a suitable column for biomedical analysis, offering the selection of similar or dissimilar columns with relatively higher certainty. Copyright © 2013 Elsevier B.V. All rights reserved.
Automated Thermal Sample Acquisition with Applications
NASA Astrophysics Data System (ADS)
Kooshesh, K. A.; Lineberger, D. H.
2012-03-01
We created an Arduino®-based robot to detect samples subject to an experiment, perform measurements once each sample is located, and store the results for further analysis. We then relate the robot’s performance to an experiment on thermal inertia.
Li, Kangning; Ma, Jing; Tan, Liying; Yu, Siyuan; Zhai, Chao
2016-06-10
The performances of fiber-based free-space optical (FSO) communications over gamma-gamma distributed turbulence are studied for multiple aperture receiver systems. The equal gain combining (EGC) technique is considered as a practical scheme to mitigate the atmospheric turbulence. Bit error rate (BER) performances for binary-phase-shift-keying-modulated coherent detection fiber-based free-space optical communications are derived and analyzed for EGC diversity receptions through an approximation method. To show the net diversity gain of a multiple aperture receiver system, BER performances of EGC are compared with a single monolithic aperture receiver system with the same total aperture area (same average total incident optical power on the aperture surface) for fiber-based free-space optical communications. The analytical results are verified by Monte Carlo simulations. System performances are also compared for EGC diversity coherent FSO communications with or without considering fiber-coupling efficiencies.
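A hedged Monte Carlo sketch of the BER analysis described above: unit-mean gamma-gamma branch irradiances are drawn as products of two gamma variates, branch field amplitudes are equal-gain combined, and BPSK errors are counted through the Gaussian Q-function. The α, β, aperture count, and SNR are toy values, and the combining model is a simplification of the paper's analysis:

```python
# Hedged sketch: Monte Carlo BER of coherent BPSK over gamma-gamma
# turbulence with N-aperture equal gain combining (simplified model).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
alpha, beta, N, snr_db, trials = 4.0, 2.0, 4, 10.0, 200_000
snr = 10 ** (snr_db / 10)

# unit-mean gamma-gamma irradiance per branch: product of two gamma variates
I = (rng.gamma(alpha, 1 / alpha, (trials, N)) *
     rng.gamma(beta, 1 / beta, (trials, N)))
amp = np.sqrt(I).sum(axis=1) / N            # EGC of field amplitudes (assumption)
ber = norm.sf(np.sqrt(2 * snr) * amp).mean()  # Q(x) = norm.sf(x)
print(f"EGC({N}) BER ~ {ber:.2e}")
```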
Integrative analysis of environmental sequences using MEGAN4.
Huson, Daniel H; Mitra, Suparna; Ruscheweyh, Hans-Joachim; Weber, Nico; Schuster, Stephan C
2011-09-01
A major challenge in the analysis of environmental sequences is data integration. The question is how to analyze different types of data in a unified approach, addressing both the taxonomic and functional aspects. To facilitate such analyses, we have substantially extended MEGAN, a widely used taxonomic analysis program. The new program, MEGAN4, provides an integrated approach to the taxonomic and functional analysis of metagenomic, metatranscriptomic, metaproteomic, and rRNA data. While taxonomic analysis is performed based on the NCBI taxonomy, functional analysis is performed using the SEED classification of subsystems and functional roles or the KEGG classification of pathways and enzymes. A number of examples illustrate how such analyses can be performed, and show that one can also import and compare classification results obtained using others' tools. MEGAN4 is freely available for academic purposes, and installers for all three major operating systems can be downloaded from www-ab.informatik.uni-tuebingen.de/software/megan.
Waddell, Kimberly J; Lang, Catherine E
2018-03-10
To compare self-reported with sensor-measured upper limb (UL) performance in daily life for individuals with chronic (≥6 mo) UL paresis poststroke. Secondary analysis of participants enrolled in a phase II randomized, parallel, dose-response UL movement trial. This analysis compared the accuracy and consistency between self-reported and sensor-measured UL performance at baseline and immediately after an 8-week intensive UL task-specific intervention. Outpatient rehabilitation. Community-dwelling individuals with chronic (≥6 mo) UL paresis poststroke (N=64). Not applicable. Motor Activity Log amount of use scale and the sensor-derived use ratio from wrist-worn accelerometers. There was a high degree of variability between self-reported UL performance and the sensor-derived use ratio. Using sensor-based values as a reference, 3 distinct categories were identified: accurate reporters (reporting difference within ±0.1), overreporters (difference >0.1), and underreporters (difference <−0.1). Five of 64 participants accurately self-reported UL performance at baseline and postintervention. Over half of participants (52%) switched categories from pre- to postintervention (eg, moved from underreporting preintervention to overreporting postintervention). For the consistent reporters, no participant characteristics were found to influence whether someone over- or underreported performance compared with sensor-based assessment. Participants did not consistently or accurately self-report UL performance when compared with the sensor-derived use ratio. Although self-report and sensor-based assessments are moderately associated and appear similar conceptually, these results suggest that self-reported UL performance is often not consistent with sensor-measured performance and that the measures cannot be used interchangeably. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
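The three reporter categories reduce to a simple thresholding rule on the difference between the self-report score and the use ratio; a minimal sketch (with invented values) follows.

```python
# Classify a participant by the self-report minus use-ratio difference (±0.1).
def classify(self_report: float, use_ratio: float, tol: float = 0.1) -> str:
    diff = self_report - use_ratio
    if diff > tol:
        return "overreporter"
    if diff < -tol:
        return "underreporter"
    return "accurate"

print(classify(0.85, 0.60))  # overreporter
print(classify(0.55, 0.60))  # accurate
```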
A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jack Dongarra; Shirley Moore; Bart Miller, Jeffrey Hollingsworth
2005-03-15
The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale, long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies: the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.
Bossier, Han; Seurinck, Ruth; Kühn, Simone; Banaschewski, Tobias; Barker, Gareth J.; Bokde, Arun L. W.; Martinot, Jean-Luc; Lemaitre, Herve; Paus, Tomáš; Millenet, Sabina; Moerkerke, Beatrijs
2018-01-01
Given the increasing amount of neuroimaging studies, there is a growing need to summarize published results. Coordinate-based meta-analyses use the locations of statistically significant local maxima, possibly together with the associated effect sizes, to aggregate studies. In this paper, we investigate the influence of key characteristics of a coordinate-based meta-analysis on (1) the balance between false and true positives and (2) the activation reliability of the outcome from a coordinate-based meta-analysis. More particularly, we consider the influence of the chosen group level model at the study level [fixed effects, ordinary least squares (OLS), or mixed effects models], the type of coordinate-based meta-analysis [Activation Likelihood Estimation (ALE), which only uses peak locations, versus fixed effects and random effects meta-analysis, which take into account both peak location and height] and the number of studies included in the analysis (from 10 to 35). To do this, we apply a resampling scheme on a large dataset (N = 1,400) to create a test condition and compare this with an independent evaluation condition. The test condition corresponds to subsampling participants into studies and combining these using meta-analyses. The evaluation condition corresponds to a high-powered group analysis. We observe the best performance when using mixed effects models in individual studies combined with a random effects meta-analysis. Moreover, the performance increases with the number of studies included in the meta-analysis. When peak height is not taken into consideration, we show that the popular ALE procedure is a good alternative in terms of the balance between type I and II errors. However, it requires more studies compared to other procedures in terms of activation reliability. Finally, we discuss the differences, interpretations, and limitations of our results. PMID:29403344
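For the meta-analysis variants that use both peak location and height, a random-effects pooling step such as DerSimonian-Laird is typical; the sketch below implements that estimator on invented per-study effects, and is not the authors' exact pipeline.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate from per-study effects and variances."""
    effects, variances = np.asarray(effects), np.asarray(variances)
    w = 1 / variances                          # fixed-effects weights
    theta_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - theta_fe) ** 2)  # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)              # between-study variance
    w_re = 1 / (variances + tau2)
    theta_re = np.sum(w_re * effects) / np.sum(w_re)
    return theta_re, np.sqrt(1 / np.sum(w_re)), tau2

# Invented per-study peak effect sizes and variances, for illustration.
est, se, tau2 = dersimonian_laird([0.42, 0.31, 0.58, 0.25],
                                  [0.02, 0.03, 0.05, 0.04])
print(f"pooled effect = {est:.3f} +/- {se:.3f}, tau^2 = {tau2:.3f}")
```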
ERIC Educational Resources Information Center
Westrick, Paul A.; Le, Huy; Robbins, Steven B.; Radunzel, Justine M. R.; Schmidt, Frank L.
2015-01-01
This meta-analysis examines the strength of the relationships of ACT® Composite scores, high school grades, and socioeconomic status (SES) with academic performance and persistence into the 2nd and 3rd years at 4-year colleges and universities. Based upon a sample of 189,612 students at 50 institutions, ACT Composite scores and high school grade…
Performance analysis of hierarchical group key management integrated with adaptive intrusion detection in mobile ad hoc networks
2016-04-05
Many applications in wireless networks, such as military battlefields, emergency response, mobile commerce, online gaming, and collaborative work, are based on mobile ad hoc networks. (Fragmentary record: accepted 19 September 2010, available online 26 September 2010, www.elsevier.com/locate/peva. Keywords: mobile ad hoc networks; intrusion detection; group communication systems.)
NASA Astrophysics Data System (ADS)
Pembroke, A. D.; Colbert, J. A.
2015-12-01
The Community Coordinated Modeling Center (CCMC) provides hosting for many of the simulations used by the space weather community of scientists, educators, and forecasters. CCMC users may submit model runs through the Runs on Request system, which produces static visualizations of model output in the browser, while further analysis may be performed off-line via Kameleon, CCMC's cross-language access and interpolation library. Off-line analysis may be suitable for power users, but storage and coding requirements present a barrier to entry for non-experts. Moreover, the lack of a consistent framework for analysis hinders reproducibility of scientific findings. To that end, we have developed Kameleon Live, a cloud-based interactive analysis and visualization platform. Kameleon Live allows users to create scientific studies built around selected runs from the Runs on Request database, perform analysis on those runs, collaborate with other users, and disseminate their findings among the space weather community. In addition to showcasing these novel collaborative analysis features, we invite feedback from CCMC users as we seek to advance and improve on the new platform.
Embedded DCT and wavelet methods for fine granular scalable video: analysis and comparison
NASA Astrophysics Data System (ADS)
van der Schaar-Mitrea, Mihaela; Chen, Yingwei; Radha, Hayder
2000-04-01
Video transmission over bandwidth-varying networks is becoming increasingly important due to emerging applications such as streaming of video over the Internet. The fundamental obstacle in designing such systems resides in the varying characteristics of the Internet (i.e., bandwidth variations and packet-loss patterns). In MPEG-4, a new SNR scalability scheme called Fine-Granular-Scalability (FGS), which is able to adapt in real time (i.e., at transmission time) to Internet bandwidth variations, is currently under standardization. The FGS framework consists of a non-scalable motion-predicted base-layer and an intra-coded fine-granular scalable enhancement layer. For example, the base layer can be coded using a DCT-based, MPEG-4 compliant, highly efficient video compression scheme. Subsequently, the difference between the original and decoded base-layer is computed, and the resulting FGS-residual signal is intra-frame coded with an embedded scalable coder. In order to achieve high coding efficiency when compressing the FGS enhancement layer, it is crucial to analyze the nature and characteristics of residual signals common to the SNR scalability framework (including FGS). In this paper, we present a thorough analysis of SNR residual signals by evaluating their statistical properties, compaction efficiency, and frequency characteristics. The signal analysis revealed that the energy compaction of the DCT and wavelet transforms is limited and that the frequency characteristics of SNR residual signals decay rather slowly. Moreover, the blockiness artifacts of the low bit-rate coded base-layer result in artificial high frequencies in the residual signal. Subsequently, a variety of wavelet and embedded DCT coding techniques applicable to the FGS framework are evaluated and their results interpreted based on the identified signal properties. As expected from the theoretical signal analysis, the rate-distortion performances of the embedded wavelet and DCT-based coders are very similar. However, improved results can be obtained for the wavelet coder by deblocking the base-layer prior to the FGS residual computation. Based on the theoretical analysis and our measurements, we can conclude that for an optimal complexity versus coding-efficiency trade-off, only limited wavelet decomposition (e.g., 2 stages) needs to be performed for the FGS-residual signal. Also, it was observed that the good rate-distortion performance of a coding technique for a certain image type (e.g., natural still-images) does not necessarily translate into similarly good performance for signals with different visual characteristics and statistical properties.
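The energy-compaction comparison can be mimicked on a synthetic residual-like signal: the sketch below computes the fraction of energy captured by the largest DCT versus wavelet coefficients, using a 2-stage db4 decomposition as the "limited decomposition" mentioned above. It assumes PyWavelets is available, and the signal is invented.

```python
import numpy as np
from scipy.fft import dct
import pywt  # PyWavelets

rng = np.random.default_rng(2)
# Stand-in for one line of an FGS residual: weak low-frequency content + noise.
n = 256
x = 0.5 * np.sin(2 * np.pi * 3 * np.arange(n) / n) + rng.normal(0, 0.4, n)

def compaction(coeffs, k=32):
    e = np.sort(np.abs(coeffs))[::-1] ** 2
    return e[:k].sum() / e.sum()   # energy captured by the k largest coeffs

c_dct = dct(x, norm="ortho")
c_wav = np.concatenate(pywt.wavedec(x, "db4", level=2))  # limited decomposition
print(f"DCT     top-32 energy fraction: {compaction(c_dct):.3f}")
print(f"Wavelet top-32 energy fraction: {compaction(c_wav):.3f}")
```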
Experimental BCAS Performance Results
DOT National Transportation Integrated Search
1978-07-01
The results of the (Litchford) Beacon-based Collision Avoidance System concept feasibility evaluation are reported. Included are a description of the concept, analysis and flight test results. The system concept is based on the range and bearing meas...
Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO
NASA Technical Reports Server (NTRS)
Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.
2016-01-01
A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio, φ, maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
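The speed advantage of analytic over finite-difference derivatives can be illustrated on a toy problem (not the CEA equations): the same BFGS optimization run with and without a user-supplied gradient, counting function evaluations. The objective and its optimum near φ = 1.05 are invented.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in: maximize a concave "temperature" curve T(phi) by
# minimizing its negative, with and without an analytic gradient.
def neg_T(phi):
    return float((phi[0] - 1.05) ** 2)      # assumed optimum near phi = 1.05

def neg_T_grad(phi):
    return np.array([2.0 * (phi[0] - 1.05)])  # analytic derivative

analytic = minimize(neg_T, x0=[0.5], jac=neg_T_grad, method="BFGS")
fin_diff = minimize(neg_T, x0=[0.5], method="BFGS")  # finite differences
print(f"analytic gradient:  phi*={analytic.x[0]:.3f}, {analytic.nfev} objective calls")
print(f"finite differences: phi*={fin_diff.x[0]:.3f}, {fin_diff.nfev} objective calls")
```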
Model-based engineering for laser weapons systems
NASA Astrophysics Data System (ADS)
Panthaki, Malcolm; Coy, Steve
2011-10-01
The Comet Performance Engineering Workspace is an environment that enables integrated, multidisciplinary modeling and design/simulation process automation. One of the many multi-disciplinary applications of the Comet Workspace is the integrated Structural, Thermal, Optical Performance (STOP) analysis of complex, multi-disciplinary space systems containing Electro-Optical (EO) sensors, such as those designed and developed by and for NASA and the Department of Defense. The Comet™ software is currently able to integrate performance simulation data and processes from a wide range of 3-D CAD and analysis software programs, including CODE V™ from Optical Research Associates and SigFit™ from Sigmadyne Inc., which are used to simulate the optics performance of EO sensor systems in space-borne applications. Over the past year, Comet Solutions has been working with MZA Associates of Albuquerque, NM, under a contract with the Air Force Research Laboratories. This funded "risk reduction" effort is to help determine whether the combination of Comet and WaveTrain™, a wave optics systems engineering analysis environment developed and maintained by MZA Associates and used by the Air Force Research Laboratory, will result in an effective Model-Based Engineering (MBE) environment for the analysis and design of laser weapons systems. This paper will review the results of this effort and future steps.
Clemente, Isabel; Aznar, Margarita; Nerín, Cristina; Bosetti, Osvaldo
2016-01-01
Inks and varnishes used in food packaging multilayer materials can contain different substances that are potential migrants when the packaging is in contact with food. Although printing inks are applied on the external layer, they can migrate due to set-off phenomena. In order to assess food safety, migration tests were performed on two sets of materials: set A based on paper and set B based on PET; both contained inks. Migration was performed into four food simulants (EtOH 50%, isooctane, EtOH 95%, and Tenax®) and the volatile compound profiles were analysed by GC-MS. The effect of the presence/absence of inks and varnishes, and also of their position in the material, was studied. A total of 149 volatile compounds were found in migration from set A and 156 from set B materials, some of which came from inks. Quantitative analysis and a principal component analysis were performed in order to identify patterns among sample groups.
Rocket-Based Combined Cycle Engine Technology Development: Inlet CFD Validation and Application
NASA Technical Reports Server (NTRS)
DeBonis, J. R.; Yungster, S.
1996-01-01
A CFD methodology has been developed for inlet analyses of Rocket-Based Combined Cycle (RBCC) Engines. A full Navier-Stokes analysis code, NPARC, was used in conjunction with pre- and post-processing tools to obtain a complete description of the flow field and integrated inlet performance. This methodology was developed and validated using results from a subscale test of the inlet to a RBCC 'Strut-Jet' engine performed in the NASA Lewis 1 x 1 ft. supersonic wind tunnel. Results obtained from this study include analyses at flight Mach numbers of 5 and 6 for super-critical operating conditions. These results showed excellent agreement with experimental data. The analysis tools were also used to obtain pre-test performance and operability predictions for the RBCC demonstrator engine planned for testing in the NASA Lewis Hypersonic Test Facility. This analysis calculated the baseline fuel-off internal force of the engine which is needed to determine the net thrust with fuel on.
Trajectory-based heating analysis for the European Space Agency/Rosetta Earth Return Vehicle
NASA Technical Reports Server (NTRS)
Henline, William D.; Tauber, Michael E.
1994-01-01
A coupled, trajectory-based flowfield and material thermal-response analysis is presented for the European Space Agency proposed Rosetta comet nucleus sample return vehicle. The probe returns to earth along a hyperbolic trajectory with an entry velocity of 16.5 km/s and requires an ablative heat shield on the forebody. Combined radiative and convective ablating flowfield analyses were performed for the significant heating portion of the shallow ballistic entry trajectory. Both quasisteady ablation and fully transient analyses were performed for a heat shield composed of carbon-phenolic ablative material. Quasisteady analysis was performed using the two-dimensional axisymmetric codes RASLE and BLIMPK. Transient computational results were obtained from the one-dimensional ablation/conduction code CMA. Results are presented for heating, temperature, and ablation rate distributions over the probe forebody for various trajectory points. Comparison of transient and quasisteady results indicates that, for the heating pulse encountered by this probe, the quasisteady approach is conservative from the standpoint of predicted surface recession.
Samsudin, Mohd Dinie Muhaimin; Mat Don, Mashitah
2015-01-01
Oil palm trunk (OPT) sap was utilized for growth and bioethanol production by Saccharomyces cerevisiae with the addition of palm oil mill effluent (POME) as a nutrient supplier. A maximum yield (YP/S) of 0.464 g bioethanol/g glucose was attained in the OPT sap-POME-based media. However, OPT sap and POME are heterogeneous in properties, and fermentation performance might change when the process is repeated. The contribution of parametric uncertainty to bioethanol fermentation performance was therefore assessed using Monte Carlo simulation (stochastic variables) to determine probability distributions due to fluctuation and variation of the kinetic model parameters. Results showed that, based on 100,000 samples tested, the yield (YP/S) ranged from 0.423 to 0.501 g/g. A sensitivity analysis was also done to evaluate the impact of each kinetic parameter on the fermentation performance. It was found that bioethanol fermentation depends strongly on the growth of the tested yeast. Copyright © 2014 Elsevier Ltd. All rights reserved.
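A Monte Carlo propagation of parameter uncertainty through a yield calculation might look like the sketch below; the parameter means, spreads, and the all-substrate-consumed assumption are illustrative stand-ins, not the paper's fitted kinetic model.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical parameter distributions (assumed ~10% relative variation,
# truncated at zero); the means are illustrative, not the fitted values.
y_ps = rng.normal(0.464, 0.046, n).clip(min=0.0)   # g ethanol / g glucose
s0   = rng.normal(50.0, 5.0, n).clip(min=0.0)      # g/L initial glucose

p_final = y_ps * s0                # ethanol titre if all substrate is consumed
lo, hi = np.percentile(y_ps, [2.5, 97.5])
print(f"Y_P/S 95% interval: {lo:.3f}-{hi:.3f} g/g")
print(f"median ethanol titre: {np.median(p_final):.1f} g/L")
```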
High-Performance Liquid Chromatography (HPLC)-Based Detection and Quantitation of Cellular c-di-GMP.
Petrova, Olga E; Sauer, Karin
2017-01-01
The modulation of c-di-GMP levels plays a vital role in the regulation of various processes in a wide array of bacterial species. Thus, investigation of c-di-GMP regulation requires reliable methods for the assessment of c-di-GMP levels and turnover. Reversed-phase high-performance liquid chromatography (RP-HPLC) analysis has become a commonly used approach to accomplish these goals. The following describes the extraction and HPLC-based detection and quantification of c-di-GMP from Pseudomonas aeruginosa samples, a procedure that is amenable to modifications for the analysis of c-di-GMP in other bacterial species.
Design and optimization of liquid core optical ring resonator for refractive index sensing.
Lin, Nai; Jiang, Lan; Wang, Sumei; Xiao, Hai; Lu, Yongfeng; Tsai, Hai-Lung
2011-07-10
This study performs a detailed theoretical analysis of refractive index (RI) sensors based on whispering gallery modes (WGMs) in liquid core optical ring resonators (LCORRs). Both TE- and TM-polarized WGMs of various orders are considered. The analysis shows that WGMs of higher orders need thicker walls to achieve a near-zero thermal drift, but WGMs of different orders exhibit a similar RI sensing performance at the thermostable wall thicknesses. The RI detection limit is very low at the thermostable thickness. The theoretical predictions should provide general guidance in the development of LCORR-based thermostable RI sensors. © 2011 Optical Society of America
[Video-based self-control in surgical teaching. A new tool in a new concept].
Dahmen, U; Sänger, C; Wurst, C; Arlt, J; Wei, W; Dondorf, F; Richter, B; Settmacher, U; Dirsch, O
2013-10-01
Image- and video-based result and process control are essential tools of a new teaching concept for conveying surgical skills. The new teaching concept integrates approved teaching principles and new media. Each exercise performance is videotaped and the result photographically recorded. The quality of both process and result thus becomes accessible to analysis by teacher and learner. The learner is instructed to perform a criteria-based self-analysis of the video and image material. The new learning concept has so far been successfully applied in seven rounds of the newly designed modular class "Intensivkurs Chirurgische Techniken" (intensive training of surgical techniques). Result documentation and analysis via digital images were completed by almost every student. The quality of the results was high. Interestingly, result quality did not correlate with the time needed for the exercise. The training success had a lasting effect. The new, elaborate concept improves the quality of teaching. In the long run, resources for patient care should be saved by training students according to this concept prior to performing tasks in the operating theater; these resources should be allocated to further refining innovative teaching concepts.
Frndak, Seth E; Smerbeck, Audrey M; Irwin, Lauren N; Drake, Allison S; Kordovski, Victoria M; Kunker, Katrina A; Khan, Anjum L; Benedict, Ralph H B
2016-10-01
We endeavored to clarify how distinct co-occurring symptoms relate to the presence of negative work events in employed multiple sclerosis (MS) patients. Latent profile analysis (LPA) was utilized to elucidate common disability patterns by isolating patient subpopulations. Samples of 272 employed MS patients and 209 healthy controls (HC) were administered neuroperformance tests of ambulation, hand dexterity, processing speed, and memory. Regression-based norms were created from the HC sample. LPA identified latent profiles using the regression-based z-scores. Finally, multinomial logistic regression tested for negative work event differences among the latent profiles. Four profiles were identified via LPA: a common profile (55%) characterized by slightly below average performance in all domains, a broadly low-performing profile (18%), a poor motor abilities profile with average cognition (17%), and a generally high-functioning profile (9%). Multinomial regression analysis revealed that the uniformly low-performing profile demonstrated a higher likelihood of reported negative work events. Employed MS patients with co-occurring motor, memory and processing speed impairments were most likely to report a negative work event, classifying them as uniquely at risk for job loss.
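LPA with continuous indicators is closely related to a Gaussian mixture model, so a rough stand-in for the profile-finding step (on simulated regression-based z-scores for the four domains) is sketched below; the group sizes and means only mimic the four reported profiles.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
# Simulated z-scores on four neuroperformance domains
# (ambulation, dexterity, processing speed, memory).
z = np.vstack([
    rng.normal(-0.3, 0.5, (150, 4)),                   # near-average performers
    rng.normal(-1.8, 0.5, (50, 4)),                    # broadly low performers
    rng.normal([-1.5, -1.5, 0.0, 0.0], 0.5, (45, 4)),  # poor motor, avg cognition
    rng.normal(0.8, 0.4, (27, 4)),                     # high functioning
])

# Choose the number of profiles by BIC, as LPA software typically does.
best = min((GaussianMixture(k, random_state=0).fit(z) for k in range(1, 7)),
           key=lambda m: m.bic(z))
print(f"profiles: {best.n_components}")
print(np.round(best.means_, 2))   # profile means per domain
```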
Limitations imposed on fire PRA methods as the result of incomplete and uncertain fire event data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nowlen, Steven Patrick; Hyslop, J. S.
2010-04-01
Fire probabilistic risk assessment (PRA) methods utilize data and insights gained from actual fire events in a variety of ways. For example, fire occurrence frequencies, manual fire fighting effectiveness and timing, and the distribution of fire events by fire source and plant location are all based directly on the historical experience base. Other factors are either derived indirectly or supported qualitatively based on insights from the event data. These factors include the general nature and intensity of plant fires, insights into operator performance, and insights into fire growth and damage behaviors. This paper discusses the methodology improvements that could be realized if more complete fire event reporting information were available. Areas that could benefit from more complete event reporting include fire event frequency analysis; analysis of fire detection and suppression system performance, including incipient detection systems; analysis of manual fire fighting performance; treatment of fire growth from incipient stages to fully-involved fires; operator response to fire events; the impact of smoke on plant operations and equipment; and the impact of fire-induced cable failures on plant electrical circuits.
Performance analysis of Supply Chain Management with Supply Chain Operation reference model
NASA Astrophysics Data System (ADS)
Hasibuan, Abdurrozzaq; Arfah, Mahrani; Parinduri, Luthfi; Hernawati, Tri; Suliawati; Harahap, Bonar; Rahmah Sibuea, Siti; Krianto Sulaiman, Oris; purwadi, Adi
2018-04-01
This research was conducted at PT. Shamrock Manufacturing Corpora, a company required to think creatively and implement its competitive strategy by producing goods/services of higher quality at lower cost. It is therefore necessary to measure Supply Chain Management performance in order to improve competitiveness, and the company must optimize its production output to meet export quality standards. This research begins with the creation of initial dimensions based on the Supply Chain Management processes, i.e., Plan, Source, Make, Deliver, and Return, with a hierarchy based on the Supply Chain Operation Reference attributes: Reliability, Responsiveness, Agility, Cost, and Asset. Key Performance Indicator identification becomes the benchmark in performance measurement, whereas Snorm De Boer normalization serves to equalize Key Performance Indicator values. The Analytical Hierarchy Process is applied to help determine priority criteria. Measurement of Supply Chain Management performance at PT. Shamrock Manufacturing Corpora shows that Responsiveness (0.649) has a higher weight (priority) than the other attributes. The performance analysis using the Supply Chain Operation Reference model indicates that Supply Chain Management performance at PT. Shamrock Manufacturing Corpora is good, since its score lies in the 50-100 range, which is considered good.
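Two computational pieces of this workflow can be sketched compactly: Snorm De Boer normalization, which maps each KPI onto a 0-100 scale, and AHP priority weights taken as the principal eigenvector of a pairwise comparison matrix. The comparison matrix below is invented; it is merely chosen so that Responsiveness comes out heaviest, as in the study.

```python
import numpy as np

# Snorm De Boer normalization: map a KPI onto a 0-100 scale.
def snorm(actual, s_min, s_max, larger_is_better=True):
    if larger_is_better:
        return (actual - s_min) / (s_max - s_min) * 100
    return (s_max - actual) / (s_max - s_min) * 100

# AHP: priority weights = principal eigenvector of a pairwise comparison
# matrix (attributes ordered Reliability, Responsiveness, Agility, Cost, Asset).
A = np.array([[1,   1/3, 2,   2,   3],
              [3,   1,   4,   4,   5],
              [1/2, 1/4, 1,   1,   2],
              [1/2, 1/4, 1,   1,   2],
              [1/3, 1/5, 1/2, 1/2, 1]])
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()
print(np.round(w, 3))          # Responsiveness gets the largest weight here
print(snorm(72, 40, 90))       # e.g. a KPI score of 72 on a 40-90 range
```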
Performance optimisations for distributed analysis in ALICE
NASA Astrophysics Data System (ADS)
Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.
2014-06-01
Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access, and a heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring, and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by an important factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time, and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management, and merging of results to allow for a better performing ALICE analysis.
NASA Astrophysics Data System (ADS)
Rajshekhar, G.; Gorthi, Sai Siva; Rastogi, Pramod
2010-04-01
For phase estimation in digital holographic interferometry, a method based on high-order instantaneous moments (HIM) was recently developed, which relies on piecewise polynomial approximation of the phase and subsequent evaluation of the polynomial coefficients using the HIM operator. A crucial step in the method is mapping polynomial coefficient estimation to single-tone frequency determination, for which various techniques exist. The paper presents a comparative analysis of the performance of the HIM-operator-based method when using different single-tone frequency estimation techniques for phase estimation. The analysis is supplemented by simulation results.
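One common single-tone frequency estimator that such a method could plug in is an FFT peak search refined by parabolic interpolation; a self-contained sketch (not necessarily one of the techniques compared in the paper) follows.

```python
import numpy as np

# Single-tone frequency estimation: FFT peak bin plus a parabolic
# correction on the log-magnitude spectrum.
def single_tone_freq(x, fs):
    X = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    k = int(np.argmax(X[1:-1])) + 1                  # peak bin (skip edges)
    a, b, c = np.log(X[k - 1]), np.log(X[k]), np.log(X[k + 1])
    delta = 0.5 * (a - c) / (a - 2 * b + c)          # parabolic correction
    return (k + delta) * fs / len(x)

fs, f0 = 1000.0, 123.4
t = np.arange(1024) / fs
x = np.cos(2 * np.pi * f0 * t) + 0.05 * np.random.default_rng(5).normal(size=t.size)
print(f"estimated {single_tone_freq(x, fs):.2f} Hz (true {f0} Hz)")
```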
Wert, David M.; Hile, Elizabeth S.; Studenski, Stephanie A.; Brach, Jennifer S.
2011-01-01
Background The incidence of obesity is increasing in older adults, with associated worsening in the burden of disability. Little is known about the impact of body mass index (BMI) on self-report and performance-based balance and mobility measures in older adults. Objective The purposes of this study were (1) to examine the association of BMI with measures of balance and mobility and (2) to explore potential explanatory factors. Design This was a cross-sectional, observational study. Methods Older adults (mean age=77.6 years) who participated in an ongoing observational study (N=120) were classified as normal weight (BMI=18.5–24.9 kg/m²), overweight (BMI=25.0–29.9 kg/m²), moderately obese (BMI=30.0–34.9 kg/m²), or severely obese (BMI≥35 kg/m²). Body mass index data were missing for one individual; thus, data for 119 participants were included in the analysis. Mobility and balance were assessed using self-report and performance-based measures and were compared among weight groups using analysis of variance and chi-square analysis for categorical data. Multiple linear regression analysis was used to examine the association among BMI, mobility, and balance after controlling for potential confounding variables. Results Compared with participants who were of normal weight or overweight, those with moderate or severe obesity were less likely to report their mobility as very good or excellent (52%, 55%, 39%, and 6%, respectively); however, there was no difference in self-report of balance among weight groups. Participants with severe obesity (n=17) had the lowest levels of mobility on the performance-based measures, followed by those who were moderately obese (n=31), overweight (n=42), and of normal weight (n=29). There were no differences on performance-based balance measures among weight groups. After controlling for age, sex, minority status, physical activity level, education level, and comorbid conditions, BMI still significantly contributed to mobility (β=−.02, adjusted R²=.41). Conclusions Although older adults with severe obesity were most impaired, those with less severe obesity also demonstrated significant decrements in mobility. PMID:21680770
NASA Technical Reports Server (NTRS)
Cross, Jon B.; Koontz, Steven L.; Lan, Esther H.
1993-01-01
The effects of atomic oxygen on boron nitride (BN), silicon nitride (Si3N4), Intelsat 6 solar cell interconnects, organic polymers, and MoS2 and WS2 dry lubricants were studied in Low Earth Orbit (LEO) flight experiments and in a ground-based simulation facility. Both the in-flight and ground-based experiments employed in situ electrical resistance measurements to detect penetration of atomic oxygen through materials and Electron Spectroscopy for Chemical Analysis (ESCA) to measure chemical composition changes. Results are given. The ground-based results on the materials studied to date show good qualitative correlation with the LEO flight results, thus validating the simulation fidelity of the ground-based facility in terms of reproducing LEO flight results. In addition, it was demonstrated that ground-based simulation is capable of performing more detailed experiments than orbital exposures presently allow. This permits the development of a fundamental understanding of the mechanisms involved in the LEO environment degradation of materials.
ERIC Educational Resources Information Center
Titus, Elizabeth; Grant, Wallace
The purpose of this project was to perform an analysis of the Rockford Public Library (Illinois) circulation services department and provide recommendations leading to customer service improvement, better space utilization, and improved departmental work flow. Based on an analysis of input from individual interviews with staff, review of…
Player-Driven Video Analysis to Enhance Reflective Soccer Practice in Talent Development
ERIC Educational Resources Information Center
Hjort, Anders; Henriksen, Kristoffer; Elbæk, Lars
2018-01-01
In the present article, we investigate the introduction of a cloud-based video analysis platform called Player Universe (PU). Video analysis is not a new performance-enhancing element in sports, but PU is innovative in how it facilitates reflective learning. Video analysis is executed in the PU platform by involving the players in the analysis…
Fingeret, Abbey L; Martinez, Rebecca H; Hsieh, Christine; Downey, Peter; Nowygrod, Roman
2016-02-01
We aim to determine whether observed operations or internet-based video review predict improved performance in the surgery clerkship. A retrospective review of students' usage of surgical videos, observed operations, evaluations, and examination scores was used to construct an exploratory principal component analysis. Multivariate regression was used to determine factors predictive of clerkship performance. Case log data for 231 students revealed a median of 25 observed cases. Students accessed the web-based video platform a median of 15 times. Principal component analysis yielded 4 factors contributing 74% of the variability, with a Kaiser-Meyer-Olkin coefficient of .83. Multivariate regression showed shelf score (P < .0001), internal clinical skills examination score (P < .0001), subjective evaluations (P < .001), and video website utilization (P < .001), but not observed cases, to be significantly associated with overall performance. Utilization of a web-based operative video platform during a surgical clerkship is independently associated with improved clinical reasoning, fund of knowledge, and overall evaluation. Thus, this modality can serve as a useful adjunct to live observation. Copyright © 2016 Elsevier Inc. All rights reserved.
Ten-year performance of ponderosa pine provenances in the Great Plains of North America
Ralph A. Read
1983-01-01
A cluster and discriminant analysis based on nine of the best plantations partitioned the seed provenance populations into six geographic clusters according to their consistency of performance in the plantations. The Northcentral Nebraska cluster of three provenances performed consistently well above average in all plantations. These easternmost...
DOT National Transportation Integrated Search
2010-10-01
Ultra-high performance concrete (UHPC) is an advanced cementitious composite material which has been developed in recent decades. When compared to more conventional cement-based concrete materials, UHPC tends to exhibit superior properties such as in...
A Study of Performance Support in Higher Education
ERIC Educational Resources Information Center
Lion, Robert W.
2011-01-01
Successful performance improvement efforts are closely tied to the strength and integrity of the performance analysis process. During a time when higher education institutions are facing increasing budget cuts, the ability to recruit and retain students is extremely important. For some institutions, web-based courses have been viewed as a way to…
Cross-industry Performance Modeling: Toward Cooperative Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reece, Wendy Jane; Blackman, Harold Stabler
One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology, and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible databases exist (Gertman & Blackman, 1994; Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.
Biomotor structures in elite female handball players according to performance.
Cavala, Marijana; Rogulj, Nenad; Srhoj, Vatromir; Srhoj, Ljerka; Katić, Ratko
2008-03-01
In order to identify biomotor structures in elite female handball players, factor structures of morphological characteristics and basic motor abilities, and of variables evaluating situation motor abilities of elite female handball players (n = 53), were determined first, followed by determination of differences and relations of the morphological, motor, and specific motor space according to handball performance. Factor analysis of 16 morphological measures produced three morphological factors, i.e. a factor of absolute voluminosity (mesoendomorphy), a factor of longitudinal skeleton dimensionality, and a factor of transverse hand dimensionality. Factor analysis of 15 motor variables yielded five basic motor dimensions, i.e. factors of agility, throwing explosive strength, running explosive strength (sprint), jumping explosive strength, and movement frequency rate. Factor analysis of 5 situation motor variables produced two dimensions: a factor of specific agility with explosiveness and a factor of specific precision with ball manipulation. Analysis of variance yielded the greatest differences relative to handball performance in the factor of specific agility and throwing strength, and in the factor of basic motoricity that integrates the ability of coordination (agility) with upper extremity throwing explosiveness and lower extremity sprint (30-m sprint) and jumping (standing triple jump). Considering morphological factors, the factor of voluminosity, i.e. mesoendomorphy, which is defined by muscle mass rather than adipose tissue, was found to contribute significantly to the players' performance. Results of regression analysis indicated that handball performance is predominantly determined by the general specific motor factor based on specific agility and explosiveness, and by the morphological factor based on body mass and volume, i.e. muscle mass. Concerning basic motor abilities, the factor of movement frequency rate, which is associated with the ability of ball manipulation, was observed to significantly predict the handball players' performance.
Malay sentiment analysis based on combined classification approaches and Senti-lexicon algorithm.
Al-Saffar, Ahmed; Awang, Suryanti; Tao, Hai; Omar, Nazlia; Al-Saiagh, Wafaa; Al-Bared, Mohammed
2018-01-01
Sentiment analysis techniques are increasingly exploited to categorize opinion text into one or more predefined sentiment classes for the creation and automated maintenance of review-aggregation websites. In this paper, a Malay sentiment analysis classification model is proposed to improve classification performance based on semantic orientation and machine learning approaches. First, a total of 2,478 Malay sentiment-lexicon phrases and words are assigned a synonym and stored with the help of more than one Malay native speaker, and the polarity is manually allotted a score. In addition, the supervised machine learning approaches and the lexicon knowledge method are combined for Malay sentiment classification, with thirteen features evaluated. Finally, three individual classifiers and a combined classifier are used to evaluate the classification accuracy. In the experimental evaluation, a wide range of comparative experiments conducted on a Malay Reviews Corpus (MRC) demonstrates that the feature extraction improves the performance of Malay sentiment analysis based on the combined classification. However, the results depend on three factors: the features, the number of features, and the classification approach.
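The combination of lexicon knowledge with a supervised classifier can be sketched by appending a lexicon polarity score to bag-of-words features; the toy lexicon entries and reviews below are invented (the real model uses 2,478 scored Malay entries and thirteen features).

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

lexicon = {"bagus": +2, "hebat": +2, "teruk": -2, "buruk": -1}  # toy entries
docs = ["filem ini bagus", "servis teruk", "hebat dan bagus", "cerita buruk"]
y = [1, 0, 1, 0]   # 1 = positive, 0 = negative

def lex_score(doc):
    return sum(lexicon.get(tok, 0) for tok in doc.split())

bow = CountVectorizer().fit(docs)
X = np.hstack([bow.transform(docs).toarray(),
               np.array([[lex_score(d)] for d in docs])])  # append lexicon feature

clf = LogisticRegression().fit(X, y)
test = "filem hebat"
x_test = np.hstack([bow.transform([test]).toarray(), [[lex_score(test)]]])
print(clf.predict(x_test))   # expected: positive (1)
```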
NASA Astrophysics Data System (ADS)
Hamim, Salah Uddin Ahmed
Nanoindentation involves probing a hard diamond tip into a material while the load and the displacement experienced by the tip are recorded continuously. These load-displacement data are a direct function of the material's intrinsic stress-strain behavior. Thus, it is theoretically possible to extract the mechanical properties of a material through nanoindentation. However, due to the various nonlinearities associated with nanoindentation, the process of interpreting load-displacement data into material properties is difficult. Although simple elastic behavior can be characterized easily, a method to characterize complicated material behavior such as nonlinear viscoelasticity is still lacking. In this study, a nanoindentation-based material characterization technique is developed to characterize soft materials exhibiting nonlinear viscoelasticity. The nanoindentation experiment was modeled in finite element analysis software (ABAQUS), where the nonlinear viscoelastic behavior was incorporated using a user-defined material subroutine (UMAT). The model parameters were calibrated using a process called inverse analysis. In this study, a surrogate model-based approach was used for the inverse analysis. The different factors affecting surrogate model performance are analyzed in order to optimize performance with respect to computational cost.
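A stripped-down version of surrogate-model-based inverse analysis is sketched below: a cheap Gaussian-process surrogate of the misfit between simulated and observed indentation responses replaces the expensive finite element runs, and the material parameter is recovered by minimizing the surrogate. The one-parameter "simulator" is a stand-in for the ABAQUS/UMAT model.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from scipy.optimize import minimize

rng = np.random.default_rng(6)

# Stand-in "simulator": maps a material parameter theta to an indentation
# response curve (in practice this is the expensive ABAQUS/UMAT run).
def simulate(theta, depth=np.linspace(0.1, 1.0, 10)):
    return theta * depth ** 1.5

theta_true = 2.7
observed = simulate(theta_true) + rng.normal(0, 0.01, 10)  # "experiment"

# Build a cheap surrogate of the misfit from a handful of expensive runs.
thetas = np.linspace(1.0, 5.0, 12).reshape(-1, 1)
misfit = [np.sum((simulate(t[0]) - observed) ** 2) for t in thetas]
gp = GaussianProcessRegressor(normalize_y=True).fit(thetas, misfit)

# Inverse analysis: minimize the surrogate instead of the simulator.
res = minimize(lambda t: gp.predict(np.atleast_2d(t))[0], x0=[2.0],
               bounds=[(1.0, 5.0)])
print(f"identified theta ~ {res.x[0]:.2f} (true {theta_true})")
```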
Martinez-Pinna, Roxana; Gonzalez de Peredo, Anne; Monsarrat, Bernard; Burlet-Schiltz, Odile; Martin-Ventura, Jose Luis
2014-08-01
To find potential biomarkers of abdominal aortic aneurysms (AAA), we performed a differential proteomic study based on human plasma-derived microvesicles. Exosomes and microparticles isolated from the plasma of AAA patients and control subjects (n = 10 each group) were analyzed by a label-free quantitative MS-based strategy. Homemade and publicly available software packages were used for MS data analysis. The application of two kinds of bioinformatic tools allowed us to find differential protein profiles in AAA patients. Some of the proteins found by both analysis methods belong to main pathological mechanisms of AAA such as oxidative stress, immune-inflammation, and thrombosis. Data analysis from label-free MS-based experiments requires the use of sophisticated bioinformatic approaches to perform quantitative studies on complex protein mixtures. The application of two of these bioinformatic tools provided us with a preliminary list of differential proteins found in plasma-derived microvesicles not previously associated with AAA, which could help us understand the pathological mechanisms related to this disease. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Lipiäinen, Tiina; Pessi, Jenni; Movahedi, Parisa; Koivistoinen, Juha; Kurki, Lauri; Tenhunen, Mari; Yliruusi, Jouko; Juppo, Anne M; Heikkonen, Jukka; Pahikkala, Tapio; Strachan, Clare J
2018-04-03
Raman spectroscopy is widely used for quantitative pharmaceutical analysis, but a common obstacle to its use is sample fluorescence masking the Raman signal. Time-gating provides an instrument-based method for rejecting fluorescence through temporal resolution of the spectral signal and allows Raman spectra of fluorescent materials to be obtained. An additional practical advantage is that analysis is possible in ambient lighting. This study assesses the efficacy of time-gated Raman spectroscopy for the quantitative measurement of fluorescent pharmaceuticals. Time-gated Raman spectroscopy with a 128 × (2) × 4 CMOS SPAD detector was applied for quantitative analysis of ternary mixtures of solid-state forms of the model drug, piroxicam (PRX). Partial least-squares (PLS) regression allowed quantification, with Raman-active time domain selection (based on visual inspection) improving performance. Model performance was further improved by using kernel-based regularized least-squares (RLS) regression with greedy feature selection, in which the use of the data in both the Raman shift and time dimensions was statistically optimized. Overall, time-gated Raman spectroscopy, especially with optimized data analysis in both the spectral and time dimensions, shows potential for sensitive and relatively routine quantitative analysis of photoluminescent pharmaceuticals during drug development and manufacturing.
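The PLS quantification step might look like the sketch below, which cross-validates a 3-component PLS model on simulated mixture "spectra"; the component count, spectra, and noise level are all invented.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(7)
# Simulated stand-in for time-gated Raman spectra of two-component mixtures:
# pure-component "spectra" mixed in varying fractions plus noise.
pure = rng.normal(size=(2, 300))
frac = rng.uniform(0, 1, (40, 1))
spectra = frac @ pure[:1] + (1 - frac) @ pure[1:] + rng.normal(0, 0.05, (40, 300))

pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, spectra, frac, cv=5)   # cross-validated fractions
rmsecv = np.sqrt(np.mean((pred - frac) ** 2))
print(f"RMSECV = {rmsecv:.3f} (fraction units)")
```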
Stability analysis for a multi-camera photogrammetric system.
Habib, Ayman; Detchev, Ivan; Kwak, Eunju
2014-08-18
Consumer-grade digital cameras suffer from geometrical instability that may cause problems when used in photogrammetric applications. This paper provides a comprehensive review of this issue of interior orientation parameter variation over time, it explains the common ways used for coping with the issue, and describes the existing methods for performing stability analysis for a single camera. The paper then points out the lack of coverage of stability analysis for multi-camera systems, suggests a modification of the collinearity model to be used for the calibration of an entire photogrammetric system, and proposes three methods for system stability analysis. The proposed methods explore the impact of the changes in interior orientation and relative orientation/mounting parameters on the reconstruction process. Rather than relying on ground truth in real datasets to check the system calibration stability, the proposed methods are simulation-based. Experiment results are shown, where a multi-camera photogrammetric system was calibrated three times, and stability analysis was performed on the system calibration parameters from the three sessions. The proposed simulation-based methods provided results that were compatible with a real-data based approach for evaluating the impact of changes in the system calibration parameters on the three-dimensional reconstruction.
Movement analysis of upper limb during resistance training using general purpose robot arm "PA10"
NASA Astrophysics Data System (ADS)
Morita, Yoshifumi; Yamamoto, Takashi; Suzuki, Takahiro; Hirose, Akinori; Ukai, Hiroyuki; Matsui, Nobuyuki
2005-12-01
In this paper we perform movement analysis of the upper limb during resistance training. We selected sanding training, a type of resistance training for the upper limbs widely performed in occupational therapy. Our long-term aims are to quantitatively evaluate the therapeutic effect on upper limb motor function during training and to develop a new rehabilitation training support system. For these purposes, we first perform movement analysis using a conventional training tool, measuring upper limb motion during sanding training to extract its characteristic features. Next we perform movement analysis using a simulated sanding training system. This system is constructed around the general-purpose robot arm "PA10"; it enables us to measure the force/torque exerted by subjects and to easily change the resistance load. The control algorithm is based on impedance control.
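The impedance control idea, rendered in one dimension: the robot presents a virtual mass-damper-spring to the subject, so the resistance load can be retuned in software. The gains, time step, and trajectory below are illustrative.

```python
import numpy as np

# Virtual dynamics M*a + B*v + K*(x - x_ref) = F_ext; with the handle
# released (F_ext = 0), the position decays back toward x_ref.
M, B, K = 1.0, 8.0, 50.0          # virtual mass, damping, stiffness (assumed)
dt = 0.001
x_ref = 0.0                        # rest position of the virtual spring
x, v = 0.05, 0.0                   # subject has pushed the handle 5 cm out

for step in range(3000):           # 3 s of simulated motion, explicit Euler
    a = -(B * v + K * (x - x_ref)) / M
    v += a * dt
    x += v * dt

print(f"handle settles at x = {x*100:.2f} cm")   # decays toward x_ref
```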
Anomalous neural circuit function in schizophrenia during a virtual Morris water task.
Folley, Bradley S; Astur, Robert; Jagannathan, Kanchana; Calhoun, Vince D; Pearlson, Godfrey D
2010-02-15
Previous studies have reported learning and navigation impairments in schizophrenia patients during virtual reality allocentric learning tasks. The neural bases of these deficits have not been explored using functional MRI despite well-explored anatomic characterization of these paradigms in non-human animals. Our objective was to characterize the differential distributed neural circuits involved in virtual Morris water task performance using independent component analysis (ICA) in schizophrenia patients and controls. Additionally, we present behavioral data in order to derive relationships between brain function and performance, and we have included a general linear model-based analysis in order to exemplify the incremental and differential results afforded by ICA. Thirty-four individuals with schizophrenia and twenty-eight healthy controls underwent fMRI scanning during a block design virtual Morris water task using hidden and visible platform conditions. Independent components analysis was used to deconstruct neural contributions to hidden and visible platform conditions for patients and controls. We also examined performance variables, voxel-based morphometry and hippocampal subparcellation, and regional BOLD signal variation. Independent component analysis identified five neural circuits. Mesial temporal lobe regions, including the hippocampus, were consistently task-related across conditions and groups. Frontal, striatal, and parietal circuits were recruited preferentially during the visible condition for patients, while frontal and temporal lobe regions were more saliently recruited by controls during the hidden platform condition. Gray matter concentrations and BOLD signal in hippocampal subregions were associated with task performance in controls but not patients. Patients exhibited impaired performance on the hidden and visible conditions of the task, related to negative symptom severity. While controls showed coupling between neural circuits, regional neuroanatomy, and behavior, patients activated different task-related neural circuits, not associated with appropriate regional neuroanatomy. GLM analysis elucidated several comparable regions, with the exception of the hippocampus. Inefficient allocentric learning and memory in patients may be related to an inability to recruit appropriate task-dependent neural circuits. Copyright 2009 Elsevier Inc. All rights reserved.
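Spatial ICA of the kind used here unmixes the data matrix into spatially independent maps and associated time courses; a toy sketch on simulated data (5 sources, invented dimensions) follows.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(8)
# Stand-in for fMRI data: 200 time points, 500 voxels, built from 5
# spatially independent "circuits" plus noise (all values simulated).
n_t, n_vox, n_src = 200, 500, 5
time_courses = rng.normal(size=(n_t, n_src))
spatial_maps = rng.laplace(size=(n_src, n_vox))       # sparse, super-Gaussian
data = time_courses @ spatial_maps + 0.1 * rng.normal(size=(n_t, n_vox))

# Spatial ICA (as in fMRI): treat voxels as samples, unmix spatial maps.
ica = FastICA(n_components=n_src, random_state=0)
maps_est = ica.fit_transform(data.T).T     # components x voxels
courses_est = ica.mixing_                  # time points x components
print(maps_est.shape, courses_est.shape)   # (5, 500) (200, 5)
```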
Carvajal, Roberto C; Arias, Luis E; Garces, Hugo O; Sbarbaro, Daniel G
2016-04-01
This work presents a non-parametric method based on principal component analysis (PCA) and a parametric one based on artificial neural networks (ANN) for removing continuous baseline features from spectra. The non-parametric method estimates the baseline from a set of sampled basis vectors obtained by applying PCA to a previously composed learning matrix of continuous spectra. The parametric method, by contrast, uses an ANN to filter out the baseline; previous studies have demonstrated that this is one of the most effective methods for baseline removal. The evaluation of both methods was carried out using a synthetic database designed for benchmarking baseline removal algorithms, containing 100 synthetic composed spectra at different signal-to-baseline ratios (SBR), signal-to-noise ratios (SNR), and baseline slopes. In addition, to demonstrate the utility of the proposed methods and to compare them in a real application, a spectral data set measured from a flame radiation process was used. Several performance metrics, such as the correlation coefficient, chi-square value, and goodness-of-fit coefficient, were calculated to quantify and compare both algorithms. Results demonstrate that the PCA-based method outperforms the one based on ANN both in terms of performance and simplicity. © The Author(s) 2016.
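A single-pass sketch of the non-parametric idea: learn a small baseline basis by PCA from a matrix of pure continuous baselines, then project a measured spectrum onto that subspace and subtract the estimate. The spectra below are simulated, and the published method's details (learning matrix composition, iteration) may differ.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(9)
wav = np.linspace(0, 1, 400)

# Learning matrix of smooth continuous baselines (illustrative polynomials).
baselines = np.array([c0 + c1 * wav + c2 * wav ** 2
                      for c0, c1, c2 in rng.uniform(0, 2, (100, 3))])
pca = PCA(n_components=3).fit(baselines)   # sampled basis vectors

# A measured "spectrum": narrow peaks riding on an unknown baseline.
peaks = (np.exp(-0.5 * ((wav - 0.4) / 0.01) ** 2)
         + 0.7 * np.exp(-0.5 * ((wav - 0.7) / 0.01) ** 2))
true_base = 0.5 + 1.2 * wav - 0.8 * wav ** 2
spectrum = peaks + true_base

# Project onto the baseline subspace to estimate, then subtract, the baseline.
coeff = pca.transform(spectrum.reshape(1, -1))
base_est = pca.inverse_transform(coeff).ravel()
corrected = spectrum - base_est
print(f"baseline RMSE: {np.sqrt(np.mean((base_est - true_base) ** 2)):.3f}")
```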
Brucker, Debra L.; Stewart, Maureen
2013-01-01
To explore whether the implementation of performance-based contracting (PBC) within the State of Maine's substance abuse treatment system resulted in improved performance, one descriptive and two empirical analyses were conducted. The first analysis examined utilization and payment structure. The second was designed to examine whether timeliness of access to outpatient (OP) and intensive outpatient (IOP) substance abuse assessments and treatment, measures that only became available after the implementation of PBC, differed between PBC and non-PBC agencies in the year following implementation of PBC. Using treatment admission records from the state treatment data system (N=9,128), logistic regression models run using generalized estimating equation techniques found no significant difference between PBC agencies and other agencies on timeliness of access to assessments or treatment, for both OP and IOP services. The third analysis, conducted using discharge data from the years prior to and after the implementation of performance-based contracting (N=6,740) for those agencies that became a part of the performance-based contracting system, was designed to assess differences in level of participation, retention, and completion of treatment. Regression models suggest that performance on OP client engagement and retention measures was significantly poorer the year after the implementation of PBC, but that temporal effects, rather than PBC effects, were the more significant driver. No differences were found between years for IOP level of participation or completion of treatment measures. PMID:21249461
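The kind of model described can be sketched with the GEE implementation in statsmodels; all variable names and data below are hypothetical, and the exchangeable working correlation is an assumption, not a detail taken from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "timely": rng.integers(0, 2, n),   # 1 = assessment within target window (hypothetical)
    "pbc": rng.integers(0, 2, n),      # 1 = performance-based-contract agency
    "iop": rng.integers(0, 2, n),      # 1 = intensive outpatient, 0 = outpatient
    "agency": rng.integers(0, 25, n),  # cluster identifier
})

# GEE logistic model: admissions clustered within agencies,
# exchangeable working correlation, robust (sandwich) standard errors.
model = smf.gee("timely ~ pbc + iop", groups="agency", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(result.summary())
```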
Jung, Seung H.; Brownlow, Milene L.; Pellegrini, Matteo; Jankord, Ryan
2017-01-01
Individual susceptibility determines the magnitude of stress effects on cognitive function. The hippocampus, a brain region of memory consolidation, is vulnerable to stressful environments, and the impact of stress on the hippocampus may determine individual variability in cognitive performance. Therefore, the purpose of this study was to define the relationship between the divergence in spatial memory performance under chronic unpredictable stress and the associated transcriptomic alteration in the hippocampus, the brain region of spatial memory consolidation. Multiple strains of BXD (B6 × D2) recombinant inbred mice underwent a 4-week chronic variable stress (CVS) paradigm, and the Morris water maze (MWM) test was conducted during the last week of CVS to assess hippocampal-dependent spatial memory performance; animals were grouped into low- and high-performing groups based on their cognitive performance. Using hippocampal whole transcriptome RNA-sequencing data, differential expression, PANTHER analysis, WGCNA, Ingenuity's upstream regulator analysis in the Ingenuity Pathway Analysis® and phenotype association analysis were conducted. Our data identified multiple genes and pathways that were significantly associated with chronic stress-associated cognitive modification and the divergence in hippocampal-dependent memory performance under chronic stress. Biological pathways associated with memory performance following chronic stress included metabolism, neurotransmitter and receptor regulation, immune response and cellular process. The Ingenuity upstream regulator analysis identified 247 upstream transcriptional regulators from 16 different molecule types. Transcripts predictive of cognitive performance under high stress included genes that are associated with a high occurrence of Alzheimer's disease and cognitive impairments (e.g., Ncl, Eno1, Scn9a, Slc19a3, Ncstn, Fos, Eif4h, Copa, etc.). Our results show that the variable effects of chronic stress on the hippocampal transcriptome are related to the ability to complete the MWM task and that the modulation of specific pathways is indicative of hippocampal-dependent memory performance. Thus, the divergence in spatial memory performance following chronic stress is related to a unique pattern of gene expression within the hippocampus. PMID:28912681
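As a toy illustration of the differential-expression step, the sketch below runs a per-gene t-test between low- and high-performing groups with Benjamini-Hochberg correction on simulated log-expression values. Real RNA-seq analysis would use count-based models (e.g., negative binomial); this shows only the basic comparison-and-correction idea.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(3)
n_genes = 2000
low = rng.normal(5.0, 1.0, (n_genes, 8))    # log-expression, 8 low-performing samples
high = rng.normal(5.0, 1.0, (n_genes, 8))   # 8 high-performing samples
high[:50] += 1.0                            # plant 50 truly shifted genes

t, p = stats.ttest_ind(high, low, axis=1)
reject, q, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} genes pass Benjamini-Hochberg FDR < 0.05")
```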
A case study by life cycle assessment
NASA Astrophysics Data System (ADS)
Li, Shuyun
2017-05-01
This article aims to assess the potential environmental impact of an electrical grinder during its life cycle. The Life Cycle Inventory (LCI) analysis was conducted based on the Simplified Life Cycle Assessment (SLCA) drivers calculated from the Valuation of Social Cost and Simplified Life Cycle Assessment Model (VSSM); the detailed LCI results can be found in Appendix II. The Life Cycle Impact Assessment was performed using the Eco-indicator 99 method. A hotspot analysis indicated that a single activity was the major contributor to the environmental impact, accounting for over 60% of the overall SLCA output; of this, 60% of the emissions resulted from the logistics required for maintenance activities. A sensitivity analysis showed that changing the fuel type results in a significant decrease in the environmental footprint. An environmental benefit can also be seen in the negative output values of the recycling activities. By conducting this Life Cycle Assessment, the potential environmental impact of the electrical grinder was investigated.
In situ visualization and data analysis for turbidity currents simulation
NASA Astrophysics Data System (ADS)
Camata, Jose J.; Silva, Vítor; Valduriez, Patrick; Mattoso, Marta; Coutinho, Alvaro L. G. A.
2018-01-01
Turbidity currents are underflows responsible for sediment deposits that generate geological formations of interest for the oil and gas industry. LibMesh-sedimentation is an application built upon the libMesh library to simulate turbidity currents. In this work, we present the integration of libMesh-sedimentation with in situ visualization and in transit data analysis tools. DfAnalyzer is a provenance-based solution that extracts and relates strategic simulation data in transit from multiple data sources for online queries. We integrate libMesh-sedimentation and ParaView Catalyst to perform in situ data analysis and visualization. We present a parallel performance analysis for two turbidity currents simulations showing that the overhead for both in situ visualization and in transit data analysis is negligible. We show that our tools enable monitoring the sediments appearance at runtime and steering the simulation based on the solver convergence and visual information on the sediment deposits, thus enhancing the analytical power of turbidity currents simulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.
The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
Multivariate meta-analysis: a robust approach based on the theory of U-statistic.
Ma, Yan; Mazumdar, Madhu
2011-10-30
Meta-analysis is the methodology for combining findings from similar research studies asking the same question. When the question of interest involves multiple outcomes, multivariate meta-analysis is used to synthesize the outcomes simultaneously, taking into account the correlation between them. Likelihood-based approaches, in particular the restricted maximum likelihood (REML) method, are commonly utilized in this context. REML assumes a multivariate normal distribution for the random-effects model. This assumption is difficult to verify, especially for meta-analyses with a small number of component studies. The use of REML also requires iterative estimation between parameters, requiring moderately high computation time, especially when the dimension of outcomes is large. A multivariate method of moments (MMM) is available and is shown to perform equally well to REML. However, there is a lack of information on the performance of these two methods when the true data distribution is far from normality. In this paper, we propose a new nonparametric and non-iterative method for multivariate meta-analysis on the basis of the theory of U-statistics and compare the properties of these three procedures under both normal and skewed data through simulation studies. It is shown that the effect on estimates from REML because of non-normal data distribution is marginal and that the estimates from MMM and U-statistic-based approaches are very similar. Therefore, we conclude that for performing multivariate meta-analysis, the U-statistic estimation procedure is a viable alternative to REML and MMM. Easy implementation of all three methods is illustrated by their application to data from two published meta-analyses from the fields of hip fracture and periodontal disease. We discuss ideas for future research based on U-statistics for testing significance of between-study heterogeneity and for extending the work to the meta-regression setting. Copyright © 2011 John Wiley & Sons, Ltd.
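For intuition about the non-iterative method-of-moments idea, here is the classic univariate DerSimonian-Laird estimator in a few lines of Python. The paper's MMM and U-statistic procedures are multivariate generalizations; the numbers below are illustrative only.

```python
import numpy as np

def dersimonian_laird(y, v):
    """Non-iterative method-of-moments random-effects meta-analysis.
    y: study effect estimates; v: their within-study variances."""
    w = 1.0 / v                                   # fixed-effect weights
    mu_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fe) ** 2)              # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu, se, tau2

y = np.array([0.30, 0.10, 0.45, 0.25, 0.05])      # illustrative effect sizes
v = np.array([0.01, 0.02, 0.015, 0.01, 0.03])
mu, se, tau2 = dersimonian_laird(y, v)
print(f"pooled effect {mu:.3f} (SE {se:.3f}), tau^2 = {tau2:.4f}")
```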
NASA Astrophysics Data System (ADS)
Lee, Youngjoo; Seo, Joon Beom; Kang, Bokyoung; Kim, Dongil; Lee, June Goo; Kim, Song Soo; Kim, Namkug; Kang, Suk Ho
2007-03-01
The performance of classification algorithms for differentiating among obstructive lung diseases based on features from texture analysis using HRCT (High Resolution Computerized Tomography) images was compared. HRCT can provide accurate information for the detection of various obstructive lung diseases, including centrilobular emphysema, panlobular emphysema and bronchiolitis obliterans. Features on HRCT images can be subtle, however, particularly in the early stages of disease, and image-based diagnosis is subject to inter-observer variation. To automate the diagnosis and improve the accuracy, we compared three types of automated classification systems, the naïve Bayesian classifier, ANN (Artificial Neural Net) and SVM (Support Vector Machine), based on their ability to differentiate among normal lung and three types of obstructive lung diseases. To assess the performance of these three classifiers, five-fold cross-validation with five randomly chosen groups was used; for a more robust result, each validation was repeated 100 times. SVM showed the best performance, with 86.5% overall sensitivity, significantly different from the other classifiers (one-way ANOVA, p<0.01). We address the characteristics of each classifier affecting performance and the issue of which classifier is the most suitable for clinical applications, and propose an appropriate method to choose the best classifier and determine its optimal parameters for disease discrimination. These results can be applied to classifiers for differentiation of other diseases.
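The comparison protocol can be sketched with scikit-learn, assuming stand-in features in place of the HRCT texture features; for speed, this sketch repeats five-fold cross-validation 10 times rather than the study's 100.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in for HRCT texture features: 4 classes (normal + 3 obstructive diseases).
X, y = make_classification(n_samples=400, n_features=20, n_informative=10,
                           n_classes=4, n_clusters_per_class=1, random_state=0)

# Repeated stratified five-fold CV (study used 100 repeats; 10 here for speed).
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=10, random_state=0)
classifiers = {
    "naive Bayes": GaussianNB(),
    "ANN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=cv, n_jobs=-1)
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```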
Zhao, Huawei
2009-01-01
A ZEMAX model was constructed to simulate a clinical trial of intraocular lenses (IOLs) based on a clinically oriented Monte Carlo ensemble analysis using postoperative ocular parameters. The purpose of this model is to test the feasibility of streamlining and optimizing both the design process and the clinical testing of IOLs. This optical ensemble analysis (OEA) is also validated. Simulated pseudophakic eyes were generated by using the tolerancing and programming features of ZEMAX optical design software. OEA methodology was verified by demonstrating that the results of clinical performance simulations were consistent with previously published clinical performance data using the same types of IOLs. From these results we conclude that the OEA method can objectively simulate the potential clinical trial performance of IOLs.
Integrated Modeling of Optical Systems (IMOS): An Assessment and Future Directions
NASA Technical Reports Server (NTRS)
Moore, Gregory; Broduer, Steve (Technical Monitor)
2001-01-01
Integrated Modeling of Optical Systems (IMOS) is a finite element-based code combining structural, thermal, and optical ray-tracing capabilities in a single environment for analysis of space-based optical systems. We present some recent examples of IMOS usage and discuss future development directions. Due to increasing model sizes and a greater emphasis on multidisciplinary analysis and design, much of the anticipated future work will be in the areas of improved architecture, numerics, and overall performance and analysis integration.
An overview of PM-10 base year emissions inventories
DOT National Transportation Integrated Search
1999-01-01
This report provides an overview of the Long Term Pavement Performance (LTPP) program's analysis program. Specifically, it outlines the analysis projects that will be undertaken by the Federal Highway Administration in fiscal years 1999 and 2000 and ...
Highway User Benefit Analysis System Research Project #128
DOT National Transportation Integrated Search
2000-10-01
In this research, a methodology for estimating road user costs of various competing alternatives was developed. Also, software was developed to calculate the road user cost, perform economic analysis and update cost tables. The methodology is based o...
IVHS Institutional Issues and Case Studies, Analysis and Lessons Learned, Final Report
DOT National Transportation Integrated Search
1994-04-01
This 'Analysis and Lessons Learned' report contains observations, conclusions, and recommendations based on the performance of six case studies of Intelligent Vehicle-Highway Systems (IVHS) projects. Information to support the development of the case...
Early warning reporting categories analysis of recall and complaints data.
DOT National Transportation Integrated Search
2001-12-31
This analysis was performed to assist the National Highway Traffic Safety Administration (NHTSA) in identifying components and systems to be included in early warning reporting (EWR) categories that would be based upon historical safety-related recal...
Dynamic mechanical analysis and organization/storage of data for polymeric materials
NASA Technical Reports Server (NTRS)
Rosenberg, M.; Buckley, W.
1982-01-01
Dynamic mechanical analysis was performed on a variety of temperature-resistant polymers and composite resin matrices. Data on glass transition temperatures and degree of cure attained were derived. In addition, a laboratory-based computer system was installed and a database was set up to allow entry of composite data. The laboratory computer, termed TYCHO, is based on a DEC PDP-11/44 CPU with a Datatrieve relational database. The function of TYCHO is the integration of chemical laboratory analytical instrumentation and the storage of chemical structures for modeling of new polymeric structures and compounds.
Pistón, Mariela; Knochen, Moisés
2012-01-01
Two flow methods, based, respectively, on flow-injection analysis (FIA) and on multicommutated flow analysis (MCFA), were compared with regard to their use for the determination of total selenium in infant formulas by hydride-generation atomic absorption spectrometry. The method based on multicommutation provided lower detection and quantification limits (0.08 and 0.27 μg L−1 compared to 0.59 and 1.95 μ L−1, resp.), higher sampling frequency (160 versus. 70 samples per hour), and reduced reagent consumption. Linearity, precision, and accuracy were similar for the two methods compared. It was concluded that, while both methods proved to be appropriate for the purpose, the MCFA-based method exhibited a better performance. PMID:22505923
NASA Astrophysics Data System (ADS)
Aktas, Metin; Maral, Hakan; Akgun, Toygar
2018-02-01
Extinction ratio (ER) is an inherent limiting factor that has a direct effect on the detection performance of phase-OTDR based distributed acoustic sensing systems. In this work we present a model-based analysis of Rayleigh scattering to simulate the effects of extinction ratio on the received signal under varying signal acquisition scenarios and system parameters. These signal acquisition scenarios are constructed to represent typically observed cases, such as multiple vibration sources cluttered around the target vibration source to be detected, continuous-wave light sources with center frequency drift, varying fiber optic cable lengths, and varying ADC bit resolutions. Results show that an insufficient ER can result in a high optical noise floor and effectively hide the effects of elaborate system improvement efforts.
Position Accuracy Analysis of a Robust Vision-Based Navigation
NASA Astrophysics Data System (ADS)
Gaglione, S.; Del Pizzo, S.; Troisi, S.; Angrisano, A.
2018-05-01
Using images to determine camera position and attitude is a consolidated method, widespread for applications like UAV navigation. In harsh environments, where GNSS could be degraded or denied, image-based positioning could represent a possible candidate for an integrated or alternative system. In this paper, such a method is investigated using a system based on a single camera and 3D maps. A robust estimation method is proposed in order to limit the effect of blunders or noisy measurements on the position solution. The proposed approach is tested using images collected in an urban canyon, where GNSS positioning is very inaccurate. A photogrammetric survey was previously performed to build the 3D model of the tested area. A position accuracy analysis is performed and the effect of the proposed robust method is validated.
Relative performance of academic departments using DEA with sensitivity analysis.
Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S P
2009-05-01
The process of liberalization and globalization of the Indian economy has brought new opportunities and challenges in all areas of human endeavor, including education. Educational institutions have to adopt new strategies to make the best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of academic programs based on multiple criteria. Keeping this in view, this paper attempts to evaluate the performance efficiencies of 19 academic departments of IIT Roorkee (India) through the data envelopment analysis (DEA) technique. The technique has been used to assess the performance of academic institutions in a number of countries, such as the USA, UK, and Australia; to the best of our knowledge, however, this is its first application in the Indian context. Applying DEA models, we calculate technical, pure technical and scale efficiencies and identify the reference sets for inefficient departments. Input and output projections are also suggested for inefficient departments to reach the frontier. Overall performance, research performance and teaching performance are assessed separately using sensitivity analysis.
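A minimal sketch of the input-oriented CCR envelopment model behind such technical-efficiency scores, solved as a linear program with SciPy; the department inputs and outputs below are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical department data: 2 inputs (faculty, budget), 2 outputs (papers, graduates).
X = np.array([[20, 1.2], [35, 2.0], [15, 0.8], [40, 3.0]], float)  # inputs, one row per dept
Y = np.array([[60, 30], [80, 55], [50, 20], [70, 60]], float)      # outputs

def ccr_efficiency(o):
    """Input-oriented CCR envelopment LP for department o:
    min theta  s.t.  sum_j lam_j X_j <= theta * X_o,  sum_j lam_j Y_j >= Y_o,  lam >= 0."""
    n = len(X)
    c = np.r_[1.0, np.zeros(n)]                 # decision vars: [theta, lam_1..lam_n]
    A_in = np.c_[-X[o], X.T]                    # inputs:  X^T lam - theta*X_o <= 0
    A_out = np.c_[np.zeros(Y.shape[1]), -Y.T]   # outputs: -Y^T lam <= -Y_o
    A = np.vstack([A_in, A_out])
    b = np.r_[np.zeros(X.shape[1]), -Y[o]]
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (n + 1))
    return res.fun                              # theta* = technical efficiency

for o in range(len(X)):
    print(f"department {o}: technical efficiency = {ccr_efficiency(o):.3f}")
```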
Detrended fluctuation analysis for major depressive disorder.
Mumtaz, Wajid; Malik, Aamir Saeed; Ali, Syed Saad Azhar; Yasin, Mohd Azhar Mohd; Amin, Hafeezullah
2015-01-01
Clinical utility of electroencephalography (EEG)-based diagnostic studies is less clear for major depressive disorder (MDD). In this paper, a novel machine learning (ML) scheme is presented to discriminate MDD patients from healthy controls. The proposed method involves feature extraction, feature selection, classification, and validation. The EEG data acquisition involved eyes-closed (EC) and eyes-open (EO) conditions. At the feature extraction stage, detrended fluctuation analysis (DFA) was performed on the EEG data to obtain scaling exponents; the DFA analyzes the presence or absence of long-range temporal correlations (LRTC) in the recorded EEG data. The scaling exponents were used as input features to the proposed system. At the feature selection stage, 3 different techniques were used for comparison purposes. A logistic regression (LR) classifier was employed, and the method was validated by 10-fold cross-validation. As a result, we observed the effect of 3 different reference montages on the computed features. The results show that the DFA performed better on LE data compared with the IR and AR data, whereas during Wilcoxon ranking the AR performed better than LE and IR. Based on the results, it was concluded that the DFA provides useful information to discriminate MDD patients and, with further validation, can be employed in clinics for the diagnosis of MDD.
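A minimal first-order DFA implementation that produces the scaling exponent used as a feature; the signal below is white noise, for which the exponent should come out near 0.5.

```python
import numpy as np

def dfa(signal, scales):
    """Detrended fluctuation analysis (first-order detrending).
    Returns the scaling exponent alpha from the log-log fit of F(n) vs n."""
    y = np.cumsum(signal - np.mean(signal))   # integrated profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        segs = y[: n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        rms = []
        for seg in segs:                      # linear detrend within each segment
            coef = np.polyfit(t, seg, 1)
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(rms)))       # fluctuation at scale n
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    return alpha

rng = np.random.default_rng(4)
eeg = rng.standard_normal(4096)               # stand-in for one EEG channel
scales = np.unique(np.logspace(2, 9, 12, base=2).astype(int))  # 4..512 samples
print(f"DFA scaling exponent: {dfa(eeg, scales):.2f}  (about 0.5 for white noise)")
```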
Ximenes, Sofia; Silva, Ana; Soares, António; Flores-Colen, Inês; de Brito, Jorge
2016-01-01
Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study’s objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect. PMID:28773460
NASA trend analysis procedures
NASA Technical Reports Server (NTRS)
1993-01-01
This publication is primarily intended for use by NASA personnel engaged in managing or implementing trend analysis programs. 'Trend analysis' refers to the observation of current activity in the context of the past in order to infer the expected level of future activity. NASA trend analysis was divided into 5 categories: problem, performance, supportability, programmatic, and reliability. Problem trend analysis uncovers multiple occurrences of historical hardware or software problems or failures in order to focus future corrective action. Performance trend analysis observes changing levels of real-time or historical flight vehicle performance parameters such as temperatures, pressures, and flow rates as compared to specification or 'safe' limits. Supportability trend analysis assesses the adequacy of the spaceflight logistics system; example indicators are repair-turn-around time and parts stockage levels. Programmatic trend analysis uses quantitative indicators to evaluate the 'health' of NASA programs of all types. Finally, reliability trend analysis attempts to evaluate the growth of system reliability based on a decreasing rate of occurrence of hardware problems over time. Procedures for conducting all five types of trend analysis are provided in this publication, prepared through the joint efforts of the NASA Trend Analysis Working Group.
NASA Astrophysics Data System (ADS)
Geng, Xinli; Xu, Hao; Qin, Xiaowei
2016-10-01
During the last several years, the amount of wireless network traffic has increased rapidly and the related technologies have evolved quickly. In order to improve the performance and Quality of Experience (QoE) of wireless network services, the analysis of field network data and existing delivery mechanisms has become a promising research topic. To achieve this goal, a smartphone-based platform named Monitor and Diagnosis of Mobile Applications (MDMA) was developed to collect field data. Based on this tool, the web browsing service of a High Speed Downlink Packet Access (HSDPA) network was tested. The top 200 popular websites in China were selected and loaded on a smartphone thousands of times automatically. Communication packets between the smartphone and the cell station were captured for various scenarios (e.g., residential areas, urban roads, bus stations) in the selected city. A cross-layer database was constructed to support the off-line analysis. Based on the results of the client-side experiments and analysis, the usability of the proposed portable tool was verified. Preliminary findings and results for the existing web browsing service are also presented.
Li, Zhifei; Qin, Dongliang
2014-01-01
In defense-related programs, the use of capability-based analysis, design, and acquisition has been significant. In order to confront one of the most challenging features of capability-based analysis (CBA), a huge design space, a literature review of design space exploration was first conducted. Then, in the process of design space exploration for an aerospace system of systems, a bilayer mapping method was put forward, based on existing experimental and operating data. Finally, the feasibility of the foregoing approach was demonstrated with an illustrative example. With the data mining techniques of RST (rough set theory) and SOM (self-organizing map), the alternatives to the aerospace system-of-systems architecture were mapped from P-space (performance space) to C-space (configuration space), and then from C-space to D-space (design space), respectively. Ultimately, the performance space was mapped to the design space, which completed the exploration and preliminary reduction of the entire design space. This method provides a computational analysis and implementation scheme for large-scale simulation. PMID:24790572
Virtual reality measures in neuropsychological assessment: a meta-analytic review.
Neguț, Alexandra; Matu, Silviu-Andrei; Sava, Florin Alin; David, Daniel
2016-02-01
Virtual reality-based assessment is a new paradigm for neuropsychological evaluation that might provide a more ecological assessment than paper-and-pencil or computerized neuropsychological testing. Previous research has focused on the use of virtual reality in neuropsychological assessment, but no meta-analysis has focused on the sensitivity of virtual reality-based measures of cognitive processes across various populations. We found eighteen studies that compared cognitive performance between clinical groups and healthy controls on virtual reality measures. Based on a random-effects model, the results indicated a large effect size in favor of healthy controls (g = .95). For executive functions, memory, and visuospatial analysis, subgroup analysis revealed moderate to large effect sizes, with superior performance in the case of healthy controls. Participants' mean age, type of clinical condition, type of exploration within virtual reality environments, and the presence of distractors were significant moderators. Our findings support the sensitivity of virtual reality-based measures in detecting cognitive impairment. They highlight the possibility of using virtual reality measures for neuropsychological assessment in research applications, as well as in clinical practice.
JPL IGS Analysis Center Report, 2001-2003
NASA Technical Reports Server (NTRS)
Heflin, M. B.; Bar-Sever, Y. E.; Jefferson, D. C.; Meyer, R. F.; Newport, B. J.; Vigue-Rodi, Y.; Webb, F. H.; Zumberge, J. F.
2004-01-01
Three GPS orbit and clock products are currently provided by JPL for consideration by the IGS. Each differs in its latency and quality, with later results being more accurate. Results are typically available in both IGS and GIPSY formats via anonymous ftp. Current performance based on comparisons with the IGS final products is summarized. Orbit performance was determined by computing the 3D RMS difference between each JPL product and the IGS final orbits based on 15 minute estimates from the sp3 files. Clock performance was computed as the RMS difference after subtracting a linear trend based on 15 minute estimates from the sp3 files.
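The 3D RMS comparison described above is a one-line computation; a sketch with hypothetical 15-minute ECEF positions:

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical 15-minute satellite positions (96 epochs/day, ECEF, metres).
igs_final = rng.normal(0, 2.0e7, (96, 3))
jpl_rapid = igs_final + rng.normal(0, 0.05, (96, 3))  # ~5 cm level differences

diff = jpl_rapid - igs_final
rms_3d = np.sqrt(np.mean(np.sum(diff ** 2, axis=1)))  # RMS of 3D position differences
print(f"3D RMS orbit difference: {rms_3d * 100:.1f} cm")
```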
A Risk-Based Approach for Aerothermal/TPS Analysis and Testing
2007-07-01
Michael J. Wright and Jay H. Grinstead, NASA Ames. ... of the thermal protection system (TPS) is to protect the payload (crew, cargo, or science) from this entry heating environment. The performance of ... the TPS is determined by the efficiency and reliability of this system, typically measured
Breath analysis based on micropreconcentrator for early cancer diagnosis
NASA Astrophysics Data System (ADS)
Lee, Sang-Seok
2018-02-01
We are developing micropreconcentrators based on micro/nanotechnology to detect trace levels of volatile organic compound (VOC) gases contained in human and canine exhaled breath. The possibility of using exhaled VOC gases as biomarkers for various cancer diagnoses has been previously discussed. For early cancer diagnosis, detection of trace levels of VOC gas is indispensable, and micropreconcentrators based on MEMS technology or nanotechnology are very promising for this purpose. A micropreconcentrator-based breath analysis technique also has advantages in terms of cost and availability for the diagnosis of various cancers. In this paper, we introduce the design, fabrication and evaluation results of our MEMS- and nanotechnology-based micropreconcentrators. In the MEMS-based device, we propose a flower-leaf-type Si microstructure whose shape and configuration are optimized quantitatively by finite element method simulation. The nanotechnology-based micropreconcentrator consists of carbon nanotube (CNT) structures. As a result, we achieve ppb-level VOC gas detection with our micropreconcentrators coupled to a standard gas chromatography system that on its own detects VOCs in gas samples on the order of ppm. In the performance evaluation, we also confirm that the CNT-based micropreconcentrator shows a concentration ratio 115 times better than that of the Si-based micropreconcentrator. Moreover, we discuss a commercialization idea for new cancer diagnosis using breath analysis. Future work and preliminary clinical testing in dogs are also discussed.
Probabilistic Analysis of Solid Oxide Fuel Cell Based Hybrid Gas Turbine System
NASA Technical Reports Server (NTRS)
Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.
2003-01-01
The emergence of fuel cell systems and hybrid fuel cell systems requires the evolution of analysis strategies for evaluating thermodynamic performance. A gas turbine thermodynamic cycle integrated with a fuel cell was computationally simulated and probabilistically evaluated in view of the several uncertainties in the thermodynamic performance parameters. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the uncertainties in the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design and make it cost effective. The analysis leads to the selection of criteria for gas turbine performance.
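A sketch of the probabilistic workflow described, under an illustrative surrogate for cycle efficiency (not the study's cycle code): sample the uncertain parameters, build an empirical distribution of the output, and rank sensitivities by correlation.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 20_000

# Uncertain thermodynamic parameters (illustrative means and spreads).
fc_eff = rng.normal(0.55, 0.02, n)       # fuel cell stack efficiency
turb_eff = rng.normal(0.88, 0.015, n)    # turbine isentropic efficiency
pressure_ratio = rng.normal(8.0, 0.4, n)

# Toy surrogate for overall hybrid-cycle thermal efficiency.
eta = fc_eff + 0.25 * turb_eff * (1 - pressure_ratio ** -0.286) * (1 - fc_eff)

# Empirical cumulative distribution at a few probability levels.
for p in (0.05, 0.5, 0.95):
    print(f"P{int(p * 100):2d} thermal efficiency: {np.quantile(eta, p):.4f}")

# Sensitivity factors: rank inputs by correlation with the output.
for name, x in [("fuel cell eff.", fc_eff), ("turbine eff.", turb_eff),
                ("pressure ratio", pressure_ratio)]:
    print(f"sensitivity ({name}): r = {np.corrcoef(x, eta)[0, 1]:+.2f}")
```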
Panchal, Mitesh B; Upadhyay, Sanjay H
2014-09-01
In this study, the feasibility of single-walled boron nitride nanotube (SWBNNT)-based biosensors for mass-based detection of various bacteria and viruses has been assessed using a continuum modelling-based simulation approach. Various bacteria and viruses were considered as attached at the free end of the cantilevered SWBNNT acting as a biosensor. A resonant frequency shift-based analysis was performed, with the adsorbed bacterium/virus treated as an additional mass on the SWBNNT-based sensor system. A continuum mechanics-based analytical approach considering effective wall thickness was used to validate the finite element method (FEM)-based simulation results, which rely on continuum volume-based modelling of the SWBNNT. The FEM-based simulation results are found to be in excellent agreement with the analytical results, supporting the analysis of SWBNNTs for a wide range of applications such as nanoresonators, biosensors, gas sensors, and transducers. The obtained results suggest that using a smaller SWBNNT enhances the sensitivity of the sensor system, and detection of a bacterium/virus having a mass of 4.28 × 10⁻²⁴ kg can be effectively performed.
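For a rough sense of the continuum estimate, the sketch below treats the SWBNNT as an Euler-Bernoulli cantilever with an attached tip mass and computes the resonant-frequency shift for the 4.28 × 10⁻²⁴ kg mass quoted above. All material and geometric parameters are order-of-magnitude assumptions, not values from the study.

```python
import numpy as np

# Euler-Bernoulli cantilever model of a SWBNNT (illustrative parameters).
E = 1.2e12    # Young's modulus of a BN nanotube, Pa (order-of-magnitude value)
t = 0.34e-9   # effective wall thickness, m
r = 0.5e-9    # mean tube radius, m
L = 50e-9     # tube length, m
rho = 2180.0  # mass density, kg/m^3

A = 2 * np.pi * r * t                 # cross-sectional area (thin shell)
I = np.pi * r ** 3 * t                # second moment of area (thin shell)
k = 3 * E * I / L ** 3                # static tip stiffness
m_eff = 0.2427 * rho * A * L          # effective modal mass, fundamental mode

f0 = np.sqrt(k / m_eff) / (2 * np.pi)          # bare resonant frequency
dm = 4.28e-24                                  # attached mass from the abstract, kg
f1 = np.sqrt(k / (m_eff + dm)) / (2 * np.pi)   # loaded resonant frequency
print(f"f0 = {f0/1e9:.2f} GHz, shift = {(f0 - f1)/1e6:.1f} MHz for 4.28e-24 kg")
```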
NASA Technical Reports Server (NTRS)
Wang, Yeou-Fang; Schrock, Mitchell; Baldwin, John R.; Borden, Charles S.
2010-01-01
The Ground Resource Allocation and Planning Environment (GRAPE 1.0) is a Web-based, collaborative team environment based on the Microsoft SharePoint platform, which provides Deep Space Network (DSN) resource planners with tools and services for sharing information and performing analysis.
Modeling and Analysis of High Torque Density Transverse Flux Machines for Direct-Drive Applications
NASA Astrophysics Data System (ADS)
Hasan, Iftekhar
Commercially available permanent magnet synchronous machines (PMSM) typically use rare-earth-based permanent magnets (PM). However, volatility and uncertainty associated with the supply and cost of rare-earth magnets have caused a push for increased research into the development of non-rare-earth based PM machines and reluctance machines. Compared to other PMSM topologies, the Transverse Flux Machine (TFM) is a promising candidate to get higher torque densities at low speed for direct-drive applications, using non-rare-earth based PMs. The TFMs can be designed with a very small pole pitch which allows them to attain higher force density than conventional radial flux machines (RFM) and axial flux machines (AFM). This dissertation presents the modeling, electromagnetic design, vibration analysis, and prototype development of a novel non-rare-earth based PM-TFM for a direct-drive wind turbine application. The proposed TFM addresses the issues of low power factor, cogging torque, and torque ripple during the electromagnetic design phase. An improved Magnetic Equivalent Circuit (MEC) based analytical model was developed as an alternative to the time-consuming 3D Finite Element Analysis (FEA) for faster electromagnetic analysis of the TFM. The accuracy and reliability of the MEC model were verified, both with 3D-FEA and experimental results. The improved MEC model was integrated with a Particle Swarm Optimization (PSO) algorithm to further enhance the capability of the analytical tool for performing rigorous optimization of performance-sensitive machine design parameters to extract the highest torque density for rated speed. A novel concept of integrating the rotary transformer within the proposed TFM design was explored to completely eliminate the use of magnets from the TFM. While keeping the same machine envelope, and without changing the stator or rotor cores, the primary and secondary of a rotary transformer were embedded into the double-sided TFM. The proposed structure allowed for improved flux-weakening capabilities of the TFM for wide speed operations. The electromagnetic design feature of stator pole shaping was used to address the issue of cogging torque and torque ripple in 3-phase TFM. The slant-pole tooth-face in the stator showed significant improvements in cogging torque and torque ripple performance during the 3-phase FEA analysis of the TFM. A detailed structural analysis for the proposed TFM was done prior to the prototype development to validate the structural integrity of the TFM design at rated and maximum speed operation. Vibration performance of the TFM was investigated to determine the structural performance of the TFM under resonance. The prototype for the proposed TFM was developed at the Alternative Energy Laboratory of the University of Akron. The working prototype is a testament to the feasibility of developing and implementing the novel TFM design proposed in this research. Experiments were performed to validate the 3D-FEA electromagnetic and vibration performance result.
NASA Technical Reports Server (NTRS)
Hopkins, W. D.; Washburn, D. A.; Hyatt, C. W.; Rumbaugh, D. M. (Principal Investigator)
1996-01-01
This study describes video-task acquisition in two nonhuman primate species. The subjects were seven rhesus monkeys (Macaca mulatta) and seven chimpanzees (Pan troglodytes). All subjects were trained to manipulate a joystick which controlled a cursor displayed on a computer monitor. Two criterion levels were used: one based on conceptual knowledge of the task and one based on motor performance. Chimpanzees and rhesus monkeys attained criterion in a comparable number of trials using a conceptually based criterion. However, using a criterion based on motor performance, chimpanzees reached criterion significantly faster than rhesus monkeys. Analysis of error patterns and latency indicated that the rhesus monkeys had a larger asymmetry in response bias and were significantly slower in responding than the chimpanzees. The results are discussed in terms of the relation between object manipulation skills and video-task acquisition.
Gosho, Masahiko; Hirakawa, Akihiro; Noma, Hisashi; Maruo, Kazushi; Sato, Yasunori
2017-10-01
In longitudinal clinical trials, some subjects will drop out before completing the trial, so their measurements towards the end of the trial are not obtained. Mixed-effects models for repeated measures (MMRM) analysis with "unstructured" (UN) covariance structure are increasingly common as a primary analysis for group comparisons in these trials. Furthermore, model-based covariance estimators have been routinely used for testing the group difference and estimating confidence intervals of the difference in the MMRM analysis using the UN covariance. However, using the MMRM analysis with the UN covariance could lead to convergence problems for numerical optimization, especially in trials with a small-sample size. Although the so-called sandwich covariance estimator is robust to misspecification of the covariance structure, its performance deteriorates in settings with small-sample size. We investigated the performance of the sandwich covariance estimator and covariance estimators adjusted for small-sample bias proposed by Kauermann and Carroll (J Am Stat Assoc 2001; 96: 1387-1396) and Mancl and DeRouen (Biometrics 2001; 57: 126-134) fitting simpler covariance structures through a simulation study. In terms of the type 1 error rate and coverage probability of confidence intervals, Mancl and DeRouen's covariance estimator with compound symmetry, first-order autoregressive (AR(1)), heterogeneous AR(1), and antedependence structures performed better than the original sandwich estimator and Kauermann and Carroll's estimator with these structures in the scenarios where the variance increased across visits. The performance based on Mancl and DeRouen's estimator with these structures was nearly equivalent to that based on the Kenward-Roger method for adjusting the standard errors and degrees of freedom with the UN structure. The model-based covariance estimator with the UN structure under unadjustment of the degrees of freedom, which is frequently used in applications, resulted in substantial inflation of the type 1 error rate. We recommend the use of Mancl and DeRouen's estimator in MMRM analysis if the number of subjects completing is (n + 5) or less, where n is the number of planned visits. Otherwise, the use of Kenward and Roger's method with the UN structure is recommended.
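The covariance estimators discussed map onto options exposed by statsmodels' GEE implementation (model-based "naive", robust sandwich, and "bias_reduced" for Mancl and DeRouen's adjustment), which allows a small-sample comparison in the same spirit; the data below are simulated, and the paper's exact MMRM-with-UN analysis would require mixed-model software, so this is only an analogous sketch.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_subj, n_visits = 20, 4                  # small sample, as in the paper's setting
subj = np.repeat(np.arange(n_subj), n_visits)
visit = np.tile(np.arange(n_visits), n_subj)
group = (subj < n_subj // 2).astype(int)
y = (0.3 * group + 0.2 * visit
     + np.repeat(rng.normal(0, 1, n_subj), n_visits)  # subject-level random effect
     + rng.normal(0, 1, n_subj * n_visits))
df = pd.DataFrame(dict(y=y, group=group, visit=visit, subj=subj))

model = smf.gee("y ~ group + visit", groups="subj", data=df,
                cov_struct=sm.cov_struct.Exchangeable())
for cov in ("naive", "robust", "bias_reduced"):  # bias_reduced = Mancl & DeRouen (2001)
    res = model.fit(cov_type=cov)
    print(f"{cov:>12}: SE(group) = {res.bse['group']:.3f}")
```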
Kosins, Aaron M; Scholz, Thomas; Cetinkaya, Mine; Evans, Gregory R D
2013-08-01
The purpose of this study was to determine the evidence-based value of prophylactic drainage of subcutaneous wounds in surgery. An electronic search was performed. Articles comparing subcutaneous prophylactic drainage with no drainage were identified and classified by level of evidence. If sufficient randomized controlled trials were included, a meta-analysis was performed using the random-effects model. Fifty-two randomized controlled trials were included in the meta-analysis, and subgroups were determined by specific surgical procedures or characteristics (cesarean delivery, abdominal wound, breast reduction, breast biopsy, femoral wound, axillary lymph node dissection, hip and knee arthroplasty, obesity, and clean-contaminated wound). Studies were compared for the following endpoints: hematoma, wound healing issues, seroma, abscess, and infection. Fifty-two studies with a total of 6930 operations were identified as suitable for this analysis. There were 3495 operations in the drain group and 3435 in the no-drain group. Prophylactic subcutaneous drainage offered a statistically significant advantage only for (1) prevention of hematomas in breast biopsy procedures and (2) prevention of seromas in axillary node dissections. In all other procedures studied, drainage did not offer an advantage. Many surgical operations can be performed safely without prophylactic drainage. Surgeons can consider omitting drains after cesarean section, breast reduction, abdominal wounds, femoral wounds, and hip and knee joint replacement. Furthermore, surgeons should consider not placing drains prophylactically in obese patients. However, drain placement following a surgical procedure is the surgeon's choice and can be based on multiple factors beyond the type of procedure being performed or the patient's body habitus. Therapeutic, II.
NASA Astrophysics Data System (ADS)
Wang, Huiqin; Wang, Xue; Lynette, Kibe; Cao, Minghua
2018-06-01
The performance of multiple-input multiple-output wireless optical communication systems that adopt Q-ary pulse position modulation over a spatially correlated log-normal fading channel is analyzed in terms of un-coded bit error rate and ergodic channel capacity. The analysis is based on Wilkinson's method, which approximates the distribution of a sum of correlated log-normal random variables by a log-normal random variable. The analytical and simulation results corroborate that increasing correlation coefficients among sub-channels leads to system performance degradation. Moreover, receiver diversity performs better in resisting channel fading caused by spatial correlation.
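Wilkinson's method matches the first two moments of the sum to a single log-normal; a sketch with a Monte Carlo check, using illustrative parameters:

```python
import numpy as np

def wilkinson(mu, sigma, rho):
    """Match first two moments of S = sum_i exp(Y_i), Y ~ N(mu, Sigma),
    to a single lognormal exp(Z), Z ~ N(mu_z, sigma_z^2)."""
    m1 = np.sum(np.exp(mu + 0.5 * sigma ** 2))            # E[S]
    m2 = 0.0                                              # E[S^2]
    for i in range(len(mu)):
        for j in range(len(mu)):
            m2 += np.exp(mu[i] + mu[j]
                         + 0.5 * (sigma[i] ** 2 + sigma[j] ** 2)
                         + rho[i, j] * sigma[i] * sigma[j])
    sigma_z2 = np.log(m2 / m1 ** 2)
    mu_z = np.log(m1) - 0.5 * sigma_z2
    return mu_z, sigma_z2

mu = np.array([0.0, 0.1, -0.1])
sigma = np.array([0.5, 0.4, 0.6])
rho = np.array([[1.0, 0.3, 0.3], [0.3, 1.0, 0.3], [0.3, 0.3, 1.0]])
mu_z, s2 = wilkinson(mu, sigma, rho)

# Monte Carlo check of the approximation (95th percentile of the sum).
rng = np.random.default_rng(8)
cov = rho * np.outer(sigma, sigma)
S = np.exp(rng.multivariate_normal(mu, cov, 200_000)).sum(axis=1)
q95_wilk = np.exp(mu_z + 1.6449 * np.sqrt(s2))
print(f"95th percentile: Wilkinson {q95_wilk:.3f}, Monte Carlo {np.quantile(S, 0.95):.3f}")
```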
Holmquist-Johnson, C. L.
2009-01-01
River-spanning rock structures are being constructed for water delivery as well as to enable fish passage at barriers and provide or improve the aquatic habitat for endangered fish species. Current design methods are based upon anecdotal information applicable to a narrow range of channel conditions. The complex flow patterns and performance of rock weirs are not well understood. Without an accurate understanding of their hydraulics, designers cannot address the failure mechanisms of these structures. Flow characteristics such as jets, near-bed velocities, recirculation, eddies, and plunging flow govern scour pool development. These detailed flow patterns can be replicated using a 3D numerical model. Numerical studies inexpensively simulate a large number of cases, resulting in an increased range of applicability, in order to develop design tools and predictive capability for analysis and design. The analysis and results of the numerical modeling, laboratory modeling, and field data provide a process-based method for understanding how structure geometry affects flow characteristics, scour development, fish passage, water delivery, and overall structure stability. Results of the numerical modeling allow designers to determine the appropriate geometry for generating desirable flow parameters. The end product of this research will develop tools and guidelines for more robust structure design or retrofits based upon predictable engineering and hydraulic performance criteria. © 2009 ASCE.
Analysis and design of algorithm-based fault-tolerant systems
NASA Technical Reports Server (NTRS)
Nair, V. S. Sukumaran
1990-01-01
An important consideration in the design of high performance multiprocessor systems is to ensure the correctness of the results computed in the presence of transient and intermittent failures. Concurrent error detection and correction have been applied to such systems in order to achieve reliability. Algorithm-Based Fault Tolerance (ABFT) was suggested as a cost-effective concurrent error detection scheme. The research was motivated by the complexity involved in the analysis and design of ABFT systems. To that end, a matrix-based model was developed and, based on that, algorithms for both the design and analysis of ABFT systems were formulated. These algorithms are less complex than the existing ones. In order to reduce the complexity further, a hierarchical approach was developed for the analysis of large systems.
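The canonical ABFT example is Huang and Abraham's checksum-encoded matrix multiplication, in which row and column checksums detect and locate a transient error. The sketch below illustrates that scheme; note the abstract describes a matrix-based model for analyzing ABFT systems in general, not this particular encoding.

```python
import numpy as np

rng = np.random.default_rng(9)
A = rng.random((4, 4))
B = rng.random((4, 4))

# Huang-Abraham encoding: append a column-checksum row to A and a
# row-checksum column to B; the product is then fully checksummed.
A_c = np.vstack([A, A.sum(axis=0)])                  # (5, 4)
B_r = np.hstack([B, B.sum(axis=1, keepdims=True)])   # (4, 5)
C_f = A_c @ B_r                                      # (5, 5) full-checksum product

C_f[1, 2] += 0.5                                     # inject a transient fault

# Concurrent error detection: recompute checksums of the data part.
row_err = np.flatnonzero(~np.isclose(C_f[:4, :4].sum(axis=1), C_f[:4, 4]))
col_err = np.flatnonzero(~np.isclose(C_f[:4, :4].sum(axis=0), C_f[4, :4]))
print(f"faulty element located at row {row_err}, column {col_err}")
# A single located error is correctable by subtracting the checksum discrepancy.
```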
Design, Development and Analysis of Centrifugal Blower
NASA Astrophysics Data System (ADS)
Baloni, Beena Devendra; Channiwala, Salim Abbasbhai; Harsha, Sugnanam Naga Ramannath
2018-06-01
Centrifugal blowers are widely used turbomachinery equipment in all kinds of modern and domestic life. The manufacturing of blowers seldom follows an optimum design solution for the individual blower. Although centrifugal blowers have developed into highly efficient machines, design is still based on various empirical and semi-empirical rules proposed by fan designers. Different methodologies are used to design the impeller and other components of blowers. The objective of the present study is to examine explicit design methodologies and trace a unified design that gives better design-point performance. This unified design methodology is based more on fundamental concepts and minimal assumptions. A parametric study is also carried out for the effect of design parameters on pressure ratio and their interdependency in the design. The code is developed based on the unified design using C programming. Numerical analysis is carried out to check the flow parameters inside the blower. Two blowers, one based on the present design and the other on an industrial design, are developed with a standard OEM blower manufacturing unit. A comparison of both designs is made based on experimental performance analysis as per the IS standard. The results suggest better efficiency and a higher flow rate for the same pressure head for the present design compared with the industrial one.
NASA Astrophysics Data System (ADS)
Anggit Maulana, Hiska; Haris, Abdul
2018-05-01
Reservoir and source rock identification has been performed to delineate the reservoir distribution of the Talangakar Formation, South Sumatra Basin. This study is based on integrated geophysical, geological, and petrophysical data. The aims of the study are to determine the characteristics of the reservoir and source rock, to differentiate reservoir and source rock within the same Talangakar Formation, and to find out the distribution of net pay reservoir and source rock layers. The geophysical methods included seismic data interpretation using time and depth structure maps, post-stack inversion, and interval velocity; the geological interpretation included the analysis of structures and faults; and the petrophysical processing interpreted log data from wells penetrating the Talangakar Formation containing hydrocarbons (oil and gas). Based on the seismic interpretation, subsurface mapping was performed on Layer A and Layer I to determine the development of structures in the study area. Based on the geological interpretation, the trap in the study area is an anticline structure trending southwest-northeast, bounded by normal faults to the southwest-southeast of the structure. Based on the petrophysical analysis, the main reservoir in the field of research is a layer at 1,375 m depth with a thickness of 2 to 8.3 m.
Gagné, Mathieu; Moore, Lynne; Beaudoin, Claudia; Batomen Kuimi, Brice Lionel; Sirois, Marie-Josée
2016-03-01
The International Classification of Diseases (ICD) is the main classification system used for population-based injury surveillance activities but does not contain information on injury severity. ICD-based injury severity measures can be empirically derived or mapped, but no single approach has been formally recommended. This study aimed to compare the performance of ICD-based injury severity measures to predict in-hospital mortality among injury-related admissions. A systematic review and a meta-analysis were conducted. MEDLINE, EMBASE, and Global Health databases were searched from their inception through September 2014. Observational studies that assessed the performance of ICD-based injury severity measures to predict in-hospital mortality and reported discriminative ability using the area under a receiver operating characteristic curve (AUC) were included. Metrics of model performance were extracted. Pooled AUC were estimated under random-effects models. Twenty-two eligible studies reported 72 assessments of discrimination on ICD-based injury severity measures. Reported AUC ranged from 0.681 to 0.958. Of the 72 assessments, 46 showed excellent (0.80 ≤ AUC < 0.90) and 6 outstanding (AUC ≥ 0.90) discriminative ability. Pooled AUC for ICD-based Injury Severity Score (ICISS) based on the product of traditional survival proportions was significantly higher than measures based on ICD mapped to Abbreviated Injury Scale (AIS) scores (0.863 vs. 0.825 for ICDMAP-ISS [p = 0.005] and ICDMAP-NISS [p = 0.016]). Similar results were observed when studies were stratified by the type of data used (trauma registry or hospital discharge) or the provenance of survival proportions (internally or externally derived). However, among studies published after 2003 the Trauma Mortality Prediction Model based on ICD-9 codes (TMPM-9) demonstrated superior discriminative ability than ICISS using the product of traditional survival proportions (0.850 vs. 0.802, p = 0.002). Models generally showed poor calibration. ICISS using the product of traditional survival proportions and TMPM-9 predict mortality more accurately than those mapped to AIS codes and should be preferred for describing injury severity when ICD is used to record injury diagnoses. Systematic review and meta-analysis, level III.
Angeler, David G; Viedma, Olga; Moreno, José M
2009-11-01
Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinates of neighbor matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, the Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to result in decreased statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-)communities.
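TLA itself is straightforward to sketch: compute community dissimilarity for every pair of sampling dates and regress it on the square root of the time lag, a significant positive slope indicating directional change. The community matrix below is simulated; the RDA-PCNM alternative requires ordination tooling not shown here.

```python
import numpy as np
from scipy import stats
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(10)
n_dates, n_species = 24, 12
# Simulated community with slow directional drift plus sampling noise.
drift = np.outer(np.arange(n_dates), rng.random(n_species) * 0.2)
community = rng.poisson(5, (n_dates, n_species)) + drift

# All pairwise Bray-Curtis dissimilarities between sampling dates.
D = squareform(pdist(community, metric="braycurtis"))
lags, dissim = [], []
for lag in range(1, n_dates):
    for t in range(n_dates - lag):
        lags.append(lag)
        dissim.append(D[t, t + lag])

# TLA: regress dissimilarity on the square root of the time lag.
slope, intercept, r, p, se = stats.linregress(np.sqrt(lags), dissim)
print(f"TLA slope = {slope:.4f} (p = {p:.3g}); positive slope = directional change")
```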
Li, Li; Nguyen, Kim-Huong; Comans, Tracy; Scuffham, Paul
2018-04-01
Several utility-based instruments have been applied in cost-utility analysis to assess health state values for people with dementia. Nevertheless, concerns and uncertainty regarding their performance for people with dementia have been raised. To assess the performance of available utility-based instruments for people with dementia, we compared their psychometric properties and explored factors that cause variations in the reported health state values generated from those instruments by conducting meta-regression analyses. A literature search was conducted and psychometric properties were synthesized to demonstrate the overall performance of each instrument. When available, health state values and variables such as the type of instrument and cognitive impairment levels were extracted from each article. A meta-regression analysis was undertaken and available covariates were included in the models. A total of 64 studies providing preference-based values were identified and included. The EuroQol five-dimension questionnaire demonstrated the best combination of feasibility, reliability, and validity. Meta-regression analyses suggested that significant differences exist between instruments, types of respondents, and modes of administration, and that the variations in estimated utility values influence incremental quality-adjusted life-year calculations. This review finds that the EuroQol five-dimension questionnaire is the most valid utility-based instrument for people with dementia, but should be replaced by others under certain circumstances. Although no utility estimates were reported in the article, the meta-regression analyses showed that variations in utility estimates produced by different instruments have an impact on cost-utility analysis, potentially altering the decision-making process in some circumstances. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Marisarla, Soujanya; Ghia, Urmila; Ghia, Kirti "Karman"
2002-11-01
Towards a comprehensive aeroelastic analysis of a joined wing, the fluid dynamics and structural analyses are initially performed separately. Steady flow calculations are currently performed using the 3-D compressible Navier-Stokes equations. Flow analysis of the ONERA M6 wing served to validate the software used for the fluid dynamics analysis. The complex flow field of the joined wing is analyzed and the prevailing fluid dynamic forces are computed using the COBALT software. Currently, these forces are being transferred as fluid loads onto the structure. For the structural analysis, several test cases treating the wing as a cantilever beam were run; these served as validation cases. A nonlinear structural analysis of the wing is being performed using the ANSYS software to predict the deflections and stresses in the joined wing. Issues related to modeling and to selecting an appropriate mesh for the structure were addressed by first performing a linear analysis. The frequencies and mode shapes of the deformed wing are obtained from modal analysis. Both static and dynamic analyses are carried out, and the results are examined carefully. Loose coupling between the fluid and structural analyses is currently under investigation.
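Cantilever-beam validation cases of the kind mentioned above can be checked against closed-form Euler-Bernoulli theory. The sketch below computes the first three natural frequencies of a uniform cantilever; the material and section properties are placeholders, not the wing model from the study.

```python
# Analytical natural frequencies of a uniform Euler-Bernoulli cantilever,
# a common sanity check for a structural FE model before modal analysis.
import numpy as np

E = 70e9        # Young's modulus, Pa (aluminum, assumed)
rho = 2700.0    # density, kg/m^3
L = 2.0         # beam length, m
b, h = 0.10, 0.02
A = b * h                 # cross-section area, m^2
I = b * h**3 / 12.0       # second moment of area, m^4

# First roots of the cantilever frequency equation cos(bL)*cosh(bL) = -1
beta_L = np.array([1.87510, 4.69409, 7.85476])

# f_n = (beta_n * L)^2 / (2*pi*L^2) * sqrt(E*I / (rho*A))
f = (beta_L**2 / (2 * np.pi * L**2)) * np.sqrt(E * I / (rho * A))
for n, fn in enumerate(f, start=1):
    print(f"mode {n}: {fn:.1f} Hz")
```

Agreement between such closed-form frequencies and the FE model's modal results is one way to confirm that the mesh and boundary conditions are adequate before moving to the nonlinear analysis.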
Shin, Jaeyoung; Müller, Klaus-R; Hwang, Han-Jeong
2016-01-01
We propose a near-infrared spectroscopy (NIRS)-based brain-computer interface (BCI) that can be operated in the eyes-closed (EC) state. To evaluate the feasibility of NIRS-based EC BCIs, we compared an eyes-open (EO) BCI paradigm and an EC BCI paradigm with respect to hemodynamic response and classification accuracy. To this end, subjects performed either mental arithmetic or imagined vocalization of the English alphabet, the latter serving as a baseline task with very low cognitive load. The performances of two linear classifiers were compared, with shrinkage linear discriminant analysis (LDA) proving advantageous. The classification accuracy of the EC paradigm (75.6 ± 7.3%) was lower than that of the EO paradigm (77.0 ± 9.2%), but the difference was not statistically significant (p = 0.5698). Subjects reported that they found the EC BCI tasks more comfortable (p = 0.057) and easier (p < 0.05) to perform. The difference in task difficulty may account for the slightly lower classification accuracy of the EC data. These results confirm the feasibility of NIRS-based EC BCIs, which offer a BCI option that may ultimately be of use for patients who cannot keep their eyes open consistently.
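As a sketch of the shrinkage-LDA classification step reported above, the example below trains a regularized LDA on synthetic trial features with scikit-learn; the feature construction and data are stand-ins, since the paper's actual NIRS preprocessing is not reproduced here.

```python
# Minimal sketch of shrinkage LDA for two-class BCI trial classification,
# assuming features (e.g., mean hemodynamic changes) were already extracted.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_trials, n_features = 60, 16
X = rng.normal(size=(n_trials, n_features))
y = np.repeat([0, 1], n_trials // 2)   # 0 = baseline, 1 = mental arithmetic
X[y == 1] += 0.4                       # inject a weak class difference

# Ledoit-Wolf shrinkage regularizes the covariance estimate, which helps
# when trials are few relative to the number of features.
clf = LinearDiscriminantAnalysis(solver="eigen", shrinkage="auto")
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Shrinkage is the standard remedy for the small-sample covariance problem that plain LDA suffers from in BCI settings, which is consistent with the advantage the study reports for the shrinkage variant.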