Thiele, Kristina; Quinting, Jana Marie; Stenneken, Prisca
2016-09-01
The investigation of word generation performance is an accepted, widely used, and well-established method for examining cognitive, language, or communication impairment due to brain damage. The performance measure traditionally applied in the investigation of word generation is the number of correct responses. Previous studies, however, have suggested that this measure does not capture all potentially relevant aspects of word generation performance and its underlying processes, so that its analytical and explanatory power might be rather limited. Therefore, additional qualitative or quantitative performance measures have been introduced to gain information that goes beyond the mere presence of a deficit and allows for therapeutic implications. We undertook a systematic review and meta-analysis of original research that focused on the application of additional measures of word generation performance in adult clinical populations with acquired brain injury. Word generation tasks are an integral part of many different tests, but only a few studies analyze word generation performance with measures beyond the number of correct responses. Additional measures that showed increased or similar diagnostic utility relative to the traditional performance measure concerned clustering and switching, error types, and temporal characteristics. The potential of additional performance measures is not yet fully exploited in patients with brain injury. The temporal measure of response latencies in particular is underrepresented, though it may be a reliable measure, especially for identifying subtle impairments. Unfortunately, there is as yet no general consensus on which additional measures are best suited to characterizing word generation performance. Further research is needed to specify the additional parameters that are best qualified for identifying and characterizing impaired word generation performance.
Computed Tomography Inspection and Analysis for Additive Manufacturing Components
NASA Technical Reports Server (NTRS)
Beshears, Ronald D.
2016-01-01
Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2 MeV linear accelerator-based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.
NASA Technical Reports Server (NTRS)
Gleman, Stuart M. (Inventor); Rowe, Geoffrey K. (Inventor)
1999-01-01
An ultrasonic bolt gage is described which uses a cross-correlation algorithm to determine the tension applied to a fastener, such as a bolt. The cross-correlation analysis is preferably performed using a processor operating on a series of captured ultrasonic echo waveforms. The ultrasonic bolt gage is further described as using the captured ultrasonic echo waveforms to perform additional modes of analysis, such as feature recognition. Multiple tension data outputs, therefore, can be obtained from a single data acquisition for increased measurement reliability. In addition, one embodiment of the gage is described as multi-channel, having a multiplexer for performing a tension analysis on one of a plurality of bolts.
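As a rough illustration of the cross-correlation idea described above, the sketch below estimates the time-of-flight shift between a zero-load reference echo and a loaded echo and maps it to tension through a linear calibration. The sampling rate and calibration constant are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of cross-correlation-based tension estimation; FS and K_CAL are
# hypothetical values, not taken from the patented gage.
import numpy as np

FS = 100e6      # digitizer sample rate, Hz (assumed)
K_CAL = 2.5e9   # tension per unit time-of-flight shift, N/s (assumed calibration)

def tension_from_echoes(ref_echo: np.ndarray, loaded_echo: np.ndarray) -> float:
    """Estimate fastener tension from the cross-correlation lag between two echoes."""
    ref = ref_echo - ref_echo.mean()
    cur = loaded_echo - loaded_echo.mean()
    xcorr = np.correlate(cur, ref, mode="full")
    lag = int(np.argmax(xcorr)) - (len(ref) - 1)   # delay in samples, loaded vs. reference
    return K_CAL * (lag / FS)                       # linear calibration from delay to tension
```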
Fu, Yanqing; Zhou, Zhihui; Kong, Hongwei; Lu, Xin; Zhao, Xinjie; Chen, Yihui; Chen, Jia; Wu, Zeming; Xu, Zhiliang; Zhao, Chunxia; Xu, Guowang
2016-09-06
Identification of illegal additives in complex matrixes is important in the food safety field. In this study a nontargeted screening strategy was developed to find illegal additives based on ultrahigh-performance liquid chromatography-high-resolution mass spectrometry (UHPLC-HRMS). First, an analytical method for possible illegal additives in complex matrixes was established including fast sample pretreatment, accurate UHPLC separation, and HRMS detection. Second, efficient data processing and differential analysis workflow were suggested and applied to find potential risk compounds. Third, structure elucidation of risk compounds was performed by (1) searching online databases [Metlin and the Human Metabolome Database (HMDB)] and an in-house database which was established at the above-defined conditions of UHPLC-HRMS analysis and contains information on retention time, mass spectra (MS), and tandem mass spectra (MS/MS) of 475 illegal additives, (2) analyzing fragment ions, and (3) referring to fragmentation rules. Fish was taken as an example to show the usefulness of the nontargeted screening strategy, and six additives were found in suspected fish samples. Quantitative analysis was further carried out to determine the contents of these compounds. The satisfactory application of this strategy in fish samples means that it can also be used in the screening of illegal additives in other kinds of food samples.
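The database-matching step (searching risk compounds against an in-house library by retention time and accurate mass) can be pictured with the hedged sketch below. The column names, file layout, and tolerance windows are illustrative assumptions, not the authors' actual workflow.

```python
# Hedged sketch: annotate suspect features with in-house library hits that fall
# inside assumed retention-time and mass-accuracy windows.
import pandas as pd

RT_TOL_MIN = 0.2    # retention-time window, minutes (assumed)
MZ_TOL_PPM = 5.0    # mass-accuracy window, ppm (assumed)

def match_candidates(features: pd.DataFrame, library: pd.DataFrame) -> pd.DataFrame:
    """features and library both need 'mz' and 'rt' columns; library also needs 'name'."""
    hits = []
    for _, feat in features.iterrows():
        ppm = 1e6 * (library["mz"] - feat["mz"]).abs() / feat["mz"]
        close = library[(ppm <= MZ_TOL_PPM) &
                        ((library["rt"] - feat["rt"]).abs() <= RT_TOL_MIN)]
        for _, hit in close.iterrows():
            hits.append({"feature_mz": feat["mz"], "feature_rt": feat["rt"],
                         "candidate": hit["name"]})
    return pd.DataFrame(hits)
```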
DOT National Transportation Integrated Search
2010-12-01
This project mainly focuses on exit ramp performance analysis for safety and operations. In addition, issues concerning advance guide signs for exit ramps are also addressed. Safety analysis evaluates the safety performance of different exit ramps used in Florid...
NASA Astrophysics Data System (ADS)
Karakas, Ahmet Sertac; Bozkurt, Tarik Serhat; Sayin, Baris; Ortes, Faruk
2017-07-01
Asphalt concrete pavement prepared from hot mix asphalt (HMA) carries the largest share of passenger and freight traffic on roads and is one of the most preferred types of flexible superstructure. During the service life of the road, the pavement must provide the performance it is expected to show. The HMA mix design must deliver high performance and be comfortable, safe, and resistant to degradation. In addition, because the supply of raw materials is limited, it becomes critical to use various additive materials so that roads can serve long term against environmental conditions such as traffic and climate. Styrene Butadiene Styrene (SBS) polymers are widely used among these additives. In this study, numerical analyses of asphalt concrete pavements designed with SBS-modified HMA and prepared with different thicknesses are performed. The stress and deformation values of the three pavement models are then compared and evaluated.
SEP thrust subsystem performance sensitivity analysis
NASA Technical Reports Server (NTRS)
Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.
1973-01-01
This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.
Bass, Ellen J; Baumgart, Leigh A; Shepley, Kathryn Klein
2013-03-01
Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance.
Indicators of suboptimal performance embedded in the Wechsler Memory Scale-Fourth Edition (WMS-IV).
Bouman, Zita; Hendriks, Marc P H; Schmand, Ben A; Kessels, Roy P C; Aldenkamp, Albert P
2016-01-01
Recognition and visual working memory tasks from the Wechsler Memory Scale-Fourth Edition (WMS-IV) have previously been documented as useful indicators for suboptimal performance. The present study examined the clinical utility of the Dutch version of the WMS-IV (WMS-IV-NL) for the identification of suboptimal performance using an analogue study design. The patient group consisted of 59 mixed-etiology patients; the experimental malingerers were 50 healthy individuals who were asked to simulate cognitive impairment as a result of a traumatic brain injury; the last group consisted of 50 healthy controls who were instructed to put forth full effort. Experimental malingerers performed significantly lower on all WMS-IV-NL tasks than did the patients and healthy controls. A binary logistic regression analysis was performed on the experimental malingerers and the patients. The first model contained the visual working memory subtests (Spatial Addition and Symbol Span) and the recognition tasks of the following subtests: Logical Memory, Verbal Paired Associates, Designs, Visual Reproduction. The results showed an overall classification rate of 78.4%, and only Spatial Addition explained a significant amount of variation (p < .001). Subsequent logistic regression analysis and receiver operating characteristic (ROC) analysis supported the discriminatory power of the subtest Spatial Addition. A scaled score cutoff of <4 produced 93% specificity and 52% sensitivity for detection of suboptimal performance. The WMS-IV-NL Spatial Addition subtest may provide clinically useful information for the detection of suboptimal performance.
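The analysis style reported here (binary logistic regression followed by ROC analysis of the Spatial Addition scaled score) can be sketched as below. The data are synthetic and the cutoff search is illustrative; the study's actual scores and its <4 criterion are not reproduced.

```python
# Hedged sketch with synthetic scores: logistic regression separating simulated
# malingerers (label 1) from patients (label 0), then an ROC-derived cutoff
# that keeps specificity at or above 0.90.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(8, 2, 59), rng.normal(4, 2, 50)]).reshape(-1, 1)
group = np.concatenate([np.zeros(59), np.ones(50)])

model = LogisticRegression().fit(scores, group)
auc = roc_auc_score(group, -scores.ravel())            # lower score -> more likely suboptimal
fpr, tpr, thr = roc_curve(group, -scores.ravel())

ok = fpr <= 0.10                                       # specificity >= 0.90
best = int(np.argmax(tpr[ok]))
print(f"slope={model.coef_[0][0]:.2f}, AUC={auc:.2f}, cutoff < {-thr[ok][best]:.0f}, "
      f"sensitivity={tpr[ok][best]:.2f}, specificity={1 - fpr[ok][best]:.2f}")
```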
Solar energy system economic evaluation: Fern Tunkhannock, Tunkhannock, Pennsylvania
NASA Astrophysics Data System (ADS)
1980-09-01
The economic performance of an Operational Test Site (OTS) is described. The long-term economic performance of the system at its installation site, and its extrapolation to four additional selected locations to demonstrate the viability of the design over a broad range of environmental and economic conditions, is reported. Topics discussed are: system description, study approach, economic analysis and system optimization, and technical and economic results of the analysis. Data for the economic analysis are generated through evaluation of the OTS. The simulation is based on the technical results of the seasonal report simulation. In addition, localized and standard economic parameters are used for the economic analysis.
Solar energy system economic evaluation: Fern Tunkhannock, Tunkhannock, Pennsylvania
NASA Technical Reports Server (NTRS)
1980-01-01
The economic performance of an Operational Test Site (OTS) is described. The long-term economic performance of the system at its installation site, and its extrapolation to four additional selected locations to demonstrate the viability of the design over a broad range of environmental and economic conditions, is reported. Topics discussed are: system description, study approach, economic analysis and system optimization, and technical and economic results of the analysis. Data for the economic analysis are generated through evaluation of the OTS. The simulation is based on the technical results of the seasonal report simulation. In addition, localized and standard economic parameters are used for the economic analysis.
Crop Identification Technology Assessment for Remote Sensing (CITARS)
NASA Technical Reports Server (NTRS)
Bauer, M. E.; Cary, T. K.; Davis, B. J.; Swain, P. H.
1975-01-01
The results of classifications and experiments performed for the Crop Identification Technology Assessment for Remote Sensing (CITARS) project are summarized. Fifteen data sets were classified using two analysis procedures. One procedure used class weights while the other assumed equal probabilities of occurrence for all classes. In addition, 20 data sets were classified using training statistics from another segment or date. The results of both the local and non-local classifications in terms of classification and proportion estimation are presented. Several additional experiments are described which were performed to provide additional understanding of the CITARS results. These experiments investigated alternative analysis procedures, training set selection and size, effects of multitemporal registration, the spectral discriminability of corn, soybeans, and other, and analysis of aircraft multispectral data.
NASA Technical Reports Server (NTRS)
Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)
2015-01-01
Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis utilizes re-use of commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
NASA Technical Reports Server (NTRS)
Bebis, George
2013-01-01
Hand-based biometric analysis systems and techniques provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis uses re-use of commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
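A minimal sketch of the descriptor-and-matching idea is given below, using the Zernike moment implementation in the mahotas package. The segmentation step, radius, moment order, and template format are assumptions; the patented system's own segmentation and fusion logic is not reproduced.

```python
# Hedged sketch: Zernike moment descriptors per hand segment, fused by
# concatenation and compared to an enrolled template with Euclidean distance.
import numpy as np
import mahotas

DEGREE = 8     # maximum Zernike order (assumed)
RADIUS = 120   # radius in pixels enclosing one segment (assumed)

def segment_descriptor(segment_mask: np.ndarray) -> np.ndarray:
    """Zernike moment magnitudes for one binary hand segment (palm or finger)."""
    return mahotas.features.zernike_moments(segment_mask, RADIUS, degree=DEGREE)

def identity_distance(probe_segments, template_descriptors) -> float:
    """Lower distance means a closer match to the enrolled identity."""
    probe = np.concatenate([segment_descriptor(s) for s in probe_segments])
    template = np.concatenate(template_descriptors)
    return float(np.linalg.norm(probe - template))
```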
Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein
2014-01-01
Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184
Stability, performance and sensitivity analysis of I.I.D. jump linear systems
NASA Astrophysics Data System (ADS)
Chávez Fuentes, Jorge R.; González, Oscar R.; Gray, W. Steven
2018-06-01
This paper presents a symmetric Kronecker product analysis of independent and identically distributed jump linear systems to develop new, lower-dimensional equations for the stability and performance analysis of this type of system than are currently available. In addition, new closed-form expressions characterising multi-parameter relative sensitivity functions for performance metrics are introduced. The analysis technique is illustrated with a distributed fault-tolerant flight control example where the communication links are allowed to fail randomly.
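For orientation, the classical second-moment condition that such Kronecker-product analyses build on is shown below; the paper's contribution is a lower-dimensional symmetric-Kronecker form of this kind of test, which is not reproduced here.

```latex
% Classical mean-square stability test for an i.i.d. jump linear system.
x_{k+1} = A(\theta_k)\,x_k,\qquad \theta_k\ \text{i.i.d.}
\quad\Longrightarrow\quad
\text{mean-square stable} \iff
\rho\!\bigl(\mathbb{E}\bigl[A(\theta_k)\otimes A(\theta_k)\bigr]\bigr) < 1,
```

where \rho(\cdot) denotes the spectral radius.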
Computed Tomography Inspection and Analysis for Additive Manufacturing Components
NASA Technical Reports Server (NTRS)
Beshears, Ronald D.
2017-01-01
Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws and geometric features were inspected using a 2-megavolt linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed to determine the impact of additive manufacturing on inspectability of objects with complex geometries.
Session 6: Dynamic Modeling and Systems Analysis
NASA Technical Reports Server (NTRS)
Csank, Jeffrey; Chapman, Jeffryes; May, Ryan
2013-01-01
These presentations cover some of the ongoing work in dynamic modeling and dynamic systems analysis. The first presentation discusses dynamic systems analysis and how to integrate dynamic performance information into the systems analysis. The ability to evaluate the dynamic performance of an engine design may allow tradeoffs between the dynamic performance and operability of a design, resulting in a more efficient engine design. The second presentation discusses the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a simulation system with a library containing the basic building blocks that can be used to create dynamic thermodynamic systems. Some of the key features include turbomachinery components, such as turbines and compressors, and basic control system blocks. T-MATS is written in the MATLAB/Simulink environment and is open source software. The third presentation focuses on getting additional performance from the engine by allowing the limit regulators to be active only when a limit is in danger of being violated. Typical aircraft engine control architecture is based on a min-max scheme, which is designed to keep the engine operating within prescribed mechanical/operational safety limits. Using a conditionally active min-max limit regulator scheme, additional performance can be gained by disabling non-relevant limit regulators.
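The min-max arbitration and the conditionally active variant described in the third presentation can be pictured with the sketch below. The function names, signals, and margins are illustrative assumptions, not T-MATS code.

```python
# Hedged sketch of min-max limit protection: max-limit regulators propose
# fuel-flow ceilings (smallest wins), min-limit regulators propose floors
# (largest wins); a conditional scheme enrolls a regulator only when its
# limit is close to being violated.
def select_fuel_flow(setpoint_cmd: float,
                     max_limit_cmds: list[float],
                     min_limit_cmds: list[float]) -> float:
    """Classic min-max arbitration between the setpoint controller and limit regulators."""
    cmd = min([setpoint_cmd] + max_limit_cmds)   # respect upper limits (e.g., turbine temperature)
    return max([cmd] + min_limit_cmds)           # respect lower limits (e.g., minimum burner pressure)

def regulator_is_active(sensed_value: float, limit: float, margin: float) -> bool:
    """Conditionally activate a max-limit regulator only when near its limit."""
    return sensed_value > limit - margin
```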
Mars Exploration Rover Six-Degree-Of-Freedom Entry Trajectory Analysis
NASA Technical Reports Server (NTRS)
Desai, Prasun N.; Schoenenberger, Mark; Cheatwood, F. M.
2003-01-01
The Mars Exploration Rover mission will be the next opportunity for surface exploration of Mars in January 2004. Two rovers will be delivered to the surface of Mars using the same entry, descent, and landing scenario that was developed and successfully implemented by Mars Pathfinder. This investigation describes the trajectory analysis that was performed for the hypersonic portion of the MER entry. In this analysis, a six-degree-of-freedom trajectory simulation of the entry is performed to determine the entry characteristics of the capsules. In addition, a Monte Carlo analysis is performed to statistically assess the robustness of the entry design to off-nominal conditions and to assure that all entry requirements are satisfied. The results show that the attitude at peak heating and at parachute deployment is well within entry limits. In addition, the parachute deployment dynamic pressure and Mach number are also well within the design requirements.
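A greatly simplified Monte Carlo dispersion sketch in the spirit of this analysis is shown below: a planar, constant-flight-path-angle ballistic entry is propagated for randomly perturbed entry states and atmosphere scale factors, and peak dynamic pressure and deployment Mach number are checked against assumed limits. All numbers (vehicle, atmosphere, dispersions, requirements) are illustrative, not MER values, and the actual analysis was six-degree-of-freedom.

```python
# Hedged Monte Carlo dispersion sketch; every constant below is an assumption.
import numpy as np

rng = np.random.default_rng(1)
BETA = 90.0                  # ballistic coefficient m/(Cd*A), kg/m^2 (assumed)
H0, RHO0 = 11.1e3, 0.020     # Mars-like exponential atmosphere (assumed)
G = 3.71                     # Mars gravity, m/s^2
Q_LIMIT, MACH_DEPLOY_MAX = 12_000.0, 2.1   # assumed entry requirements

def entry(v0, gamma_deg, h0, rho_scale, dt=0.2):
    """Propagate a constant-flight-path-angle ballistic entry; return peak dynamic
    pressure and the Mach number at a notional parachute-deployment altitude."""
    v, gamma, h, q_peak = v0, np.radians(gamma_deg), h0, 0.0
    while h > 8e3:
        rho = rho_scale * RHO0 * np.exp(-h / H0)
        q = 0.5 * rho * v * v
        q_peak = max(q_peak, q)
        v += (-q / BETA - G * np.sin(gamma)) * dt   # drag decelerates, gravity accelerates (gamma < 0)
        h += v * np.sin(gamma) * dt
    return q_peak, v / 240.0                        # crude Mars speed of sound, m/s (assumed)

N, ok = 500, 0
for _ in range(N):
    v0 = rng.normal(5500.0, 20.0)       # entry velocity dispersion, m/s
    g0 = rng.normal(-11.5, 0.2)         # entry flight-path angle dispersion, deg
    rs = rng.normal(1.0, 0.1)           # atmospheric density scale factor
    q_peak, mach = entry(v0, g0, 125e3, rs)
    ok += (q_peak < Q_LIMIT) and (mach < MACH_DEPLOY_MAX)
print(f"{100 * ok / N:.1f}% of dispersed samples meet the assumed entry requirements")
```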
Monir, Md. Mamun; Zhu, Jun
2017-01-01
Most of the genome-wide association studies (GWASs) for human complex diseases have ignored dominance, epistasis, and ethnic interactions. We conducted comparative GWASs for total cholesterol using full and additive models, which illustrate the impact of ignoring these genetic variants on analysis results and demonstrate how the genetic effects of multiple loci can differ across ethnic groups. There were 15 quantitative trait loci, comprising 13 individual loci and 3 pairs of epistasis loci, identified by the full model, whereas only 14 loci (9 common loci and 5 different loci) were identified by the multi-loci additive model. Again, 4 loci detected by the full model were not detected using the multi-loci additive model. PLINK analysis identified two loci and GCTA analysis detected only one locus with genome-wide significance. The full model identified three previously reported genes as well as several new genes. Bioinformatics analysis showed that some new genes are related to cholesterol-related chemicals and/or diseases. Analyses of the cholesterol data and simulation studies revealed that the full model performed better than the additive model in terms of detection power and unbiased estimation of the genetic variants of complex traits. PMID:28079101
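The contrast the abstract draws between additive and "full" models can be sketched for a single locus as below: a quantitative trait is regressed on the minor-allele count alone versus on the count plus a dominance term, and the nested fits are compared. The data are synthetic, and the study's actual full model also included epistasis and ethnicity interaction terms.

```python
# Hedged sketch: additive (0/1/2) versus additive-plus-dominance coding of one SNP.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 2000
g = rng.binomial(2, 0.3, n)                    # genotypes coded 0/1/2
dom = (g == 1).astype(float)                   # dominance indicator for heterozygotes
y = 0.2 * g + 0.4 * dom + rng.normal(0, 1, n)  # trait with a dominance effect

additive = sm.OLS(y, sm.add_constant(np.column_stack([g]))).fit()
full = sm.OLS(y, sm.add_constant(np.column_stack([g, dom]))).fit()

# likelihood-ratio style comparison of the two nested models
lr = 2 * (full.llf - additive.llf)
print(f"additive R2={additive.rsquared:.3f}, full R2={full.rsquared:.3f}, LR={lr:.1f}")
```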
Additive Factors Analysis of Inhibitory Processing in the Stop-Signal Paradigm
ERIC Educational Resources Information Center
van den Wildenberg, W.P.M.; van der Molen, M.W.
2004-01-01
This article reports an additive factors analysis of choice reaction and selective stop processes manipulated in a stop-signal paradigm. Three experiments were performed in which stimulus discriminability (SD) and stimulus-response compatibility (SRC) were manipulated in a factorial fashion. In each experiment, the effects of SD and SRC were…
2016-04-04
Terminal Performance of Lead-Free Pistol Bullets in Ballistic Gelatin Using Retarding Force Analysis from High Speed Video
ELIJAH COURTNEY, AMY ...
... quantified using high speed video. The temporary stretch cavities and permanent wound cavities are also characterized. Two factors tend to reduce the ... cavity. In addition, stretching can also ...
Skylab M518 multipurpose furnace convection analysis
NASA Technical Reports Server (NTRS)
Bourgeois, S. V.; Spradley, L. W.
1975-01-01
An analysis was performed of the convection which existed in ground tests and during Skylab processing of two experiments: vapor growth of IV-VI compounds and growth of spherical crystals. A parallel analysis was also performed on the Skylab experiment on indium antimonide crystals because indium antimonide (InSb) was used and a free surface existed in the tellurium-doped Skylab III sample. In addition, brief analyses were performed of the microsegregation in germanium experiment because the Skylab crystals indicated turbulent convection effects. Simple dimensional analysis calculations and a more accurate, but complex, convection computer model were used in the analysis.
Shao, Ying-Hui; Gu, Gao-Feng; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Sornette, Didier
2012-01-01
Notwithstanding the significant efforts to develop estimators of long-range correlations (LRC) and to compare their performance, no clear consensus exists on what is the best method and under which conditions. In addition, synthetic tests suggest that the performance of LRC estimators varies when using different generators of LRC time series. Here, we compare the performances of four estimators [Fluctuation Analysis (FA), Detrended Fluctuation Analysis (DFA), Backward Detrending Moving Average (BDMA), and Centred Detrending Moving Average (CDMA)]. We use three different generators [Fractional Gaussian Noises, and two ways of generating Fractional Brownian Motions]. We find that CDMA has the best performance and DFA is only slightly worse in some situations, while FA performs the worst. In addition, CDMA and DFA are less sensitive to the scaling range than FA. Hence, CDMA and DFA remain “The Methods of Choice” in determining the Hurst index of time series. PMID:23150785
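A compact detrended fluctuation analysis (DFA) sketch for estimating the Hurst index of a time series is given below, in the spirit of the estimators compared above. This is the plain DFA-1 variant with assumed window sizes, not the authors' exact implementation of FA, DFA, BDMA, or CDMA.

```python
# Hedged DFA-1 sketch: integrate the series, detrend fixed-size windows with a
# line, and fit the log-log slope of RMS fluctuation versus window size.
import numpy as np

def dfa_hurst(x: np.ndarray, scales=None) -> float:
    """Return the DFA-1 scaling exponent (Hurst index for fGn-like signals)."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                      # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(1, np.log10(len(x) // 4), 20).astype(int))
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:                             # least-squares detrend each window
            coef = np.polyfit(t, seg, 1)
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

# white noise should give an exponent near 0.5
print(dfa_hurst(np.random.default_rng(3).normal(size=10_000)))
```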
Panda, Jibitesh Kumar; Sastry, Gadepalli Ravi Kiran; Rai, Ram Naresh
2018-05-25
The energy situation and concerns about global warming have ignited research interest in non-conventional and alternative fuel resources to decrease emissions and the continued dependence on fossil fuels, particularly in sectors such as power generation, transportation, and agriculture. In the present work, the research is focused on evaluating the performance, emission characteristics, and combustion of a biodiesel, palm kernel methyl ester (PKME), with the addition of the diesel additive triacetin. A timed manifold injection (TMI) system was used to examine the influence of the injection durations of several blends on the emission and performance characteristics as compared to normal diesel operation. This experimental study shows better performance and lower emissions compared with mineral diesel, indicating that high performance and low emissions are achievable in PKME-triacetin fuel operation. The analysis also describes the application of fuzzy logic-based Taguchi analysis to optimize the emission and performance parameters.
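Standard Taguchi signal-to-noise ratios of the kind used in such optimizations are shown below for reference: larger-is-better for performance responses and smaller-is-better for emission responses. The paper's exact response definitions and its fuzzy aggregation step are not reproduced here.

```latex
% Taguchi signal-to-noise ratios over n replicate observations y_i.
\text{S/N}_{\text{larger-is-better}} = -10\,\log_{10}\!\left(\frac{1}{n}\sum_{i=1}^{n}\frac{1}{y_i^{2}}\right),
\qquad
\text{S/N}_{\text{smaller-is-better}} = -10\,\log_{10}\!\left(\frac{1}{n}\sum_{i=1}^{n} y_i^{2}\right)
```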
Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology
ERIC Educational Resources Information Center
Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot
2004-01-01
The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…
Economic Evaluation of Observatory Solar-Energy System
NASA Technical Reports Server (NTRS)
1982-01-01
Long-term economic performance of a commercial solar-energy system was analyzed and used to predict economic performance at four additional sites. Analysis described in report was done to demonstrate viability of design over a broad range of environmental/economic conditions. Topics covered are system description, study approach, economic analysis and system optimization.
Impact of tailored feedback in assessment of communication skills for medical students.
Uhm, Seilin; Lee, Gui H; Jin, Jeong K; Bak, Yong I; Jeoung, Yeon O; Kim, Chan W
2015-01-01
Finding effective ways of teaching and assessing communication skills remains a challenging part of medical education. This study aims at exploring the usefulness and effectiveness of having additional feedback using qualitative analysis in the assessment of communication skills in undergraduate medical training. We also determined the possibilities of using qualitative analysis in developing tailored strategies for improvement in communication skills training. This study was carried out on medical students (n=87) undergoing their final year clinical performance examination on communication skills using a standardized patient, by video-recording and transcribing their performances. Video-recordings of 26 students were randomly selected for qualitative analysis, and additional feedback was provided. We assessed the level of acceptance of communication skills scores between the study and nonstudy group and within the study group, before and after receiving feedback based on qualitative analysis. There was a statistically significant increase in the level of acceptance of feedback after delivering additional feedback using qualitative analysis, where the percentage of agreement with feedback increased from 15.4 to 80.8% (p<0.001). Incorporating feedback based on qualitative analysis for communication skills assessment gives essential information for medical students to learn and self-reflect, which could potentially lead to improved communication skills. As evident from our study, feedback becomes more meaningful and effective with additional feedback using qualitative analysis.
Impact of tailored feedback in assessment of communication skills for medical students
Uhm, Seilin; Lee, Gui H.; Jin, Jeong K.; Bak, Yong I.; Jeoung, Yeon O.; Kim, Chan W.
2015-01-01
Background Finding effective ways of teaching and assessing communication skills remains a challenging part of medical education. This study aims at exploring the usefulness and effectiveness of having additional feedback using qualitative analysis in the assessment of communication skills in undergraduate medical training. We also determined the possibilities of using qualitative analysis in developing tailored strategies for improvement in communication skills training. Methods This study was carried out on medical students (n=87) undergoing their final year clinical performance examination on communication skills using a standardized patient, by video-recording and transcribing their performances. Video-recordings of 26 students were randomly selected for qualitative analysis, and additional feedback was provided. We assessed the level of acceptance of communication skills scores between the study and nonstudy group and within the study group, before and after receiving feedback based on qualitative analysis. Results There was a statistically significant increase in the level of acceptance of feedback after delivering additional feedback using qualitative analysis, where the percentage of agreement with feedback increased from 15.4 to 80.8% (p<0.001). Conclusions Incorporating feedback based on qualitative analysis for communication skills assessment gives essential information for medical students to learn and self-reflect, which could potentially lead to improved communication skills. As evident from our study, feedback becomes more meaningful and effective with additional feedback using qualitative analysis. PMID:26154864
NASA Astrophysics Data System (ADS)
Ye, Z.; Meng, Q.; Mohamadian, H. P.; Wang, J. T.; Chen, L.; Zhu, L.
2007-06-01
The formation of SI engine combustion deposits is a complex phenomenon which depends on various factors related to the fuel, oil, additives, and engine. The goal of this study is to examine the effects of operating conditions, gasoline, lubricating oil, and additives on deposit formation. Both an experimental investigation and a theoretical analysis are conducted on a single-cylinder engine. As a result, the impact of deposits on engine performance and exhaust emissions (HC, NOx) is indicated. Using samples from a cylinder head and exhaust pipe as well as switching gases via the dual-gas method (N2, O2), the deposit formation mechanism is thoroughly investigated via the thermogravimetric analysis approach, where the roles of the organic, inorganic, and volatile components of fuel, additives, and oil in deposit formation are identified from thermogravimetric curves. A sustainable feedback control design is then proposed for potential emission control and performance optimization.
The influence of cellulose nanocrystal additions on the performance of cement paste
Yizheng Cao; Pablo Zavaterri; Jeff Youngblood; Robert Moon; Jason Weiss
2015-01-01
The influence of cellulose nanocrystals (CNCs) addition on the performance of cement paste was investigated. Our mechanical tests show an increase in the flexural strength of approximately 30% with only 0.2% volume of CNCs with respect to cement. Isothermal calorimetry (IC) and thermogravimetric analysis (TGA) show that the degree of hydration (DOH) of the cement paste...
Box truss analysis and technology development. Task 1: Mesh analysis and control
NASA Technical Reports Server (NTRS)
Bachtell, E. E.; Bettadapur, S. S.; Coyner, J. V.
1985-01-01
An analytical tool was developed to model, analyze, and predict the RF performance of box truss antennas with reflective mesh surfaces. The analysis system is unique in that it integrates custom-written programs for cord-tied mesh surfaces, thereby drastically reducing the cost of analysis. The analysis system is capable of determining the RF performance of antennas under any type of manufacturing or operating environment by integrating the various disciplines of design, finite element analysis, surface best-fit analysis, and RF analysis. The Integrated Mesh Analysis System consists of six separate programs: the Mesh Tie System Model Generator, the Loadcase Generator, the Model Optimizer, the Model Solver, the Surface Topography Solver, and the RF Performance Solver. Additionally, a study using the mesh analysis system was performed to determine the effect of on-orbit calibration, i.e., surface adjustment, on a typical box truss antenna.
Hydrothermal Liquefaction Treatment Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following selected hazardous scenarios received increased attention:
• For scenarios involving a release of hazardous material or energy, controls were identified in the What-If analysis table that prevent the occurrence or mitigate the effects of the release.
• For scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. The set of "critical controls" that prevent the occurrence or mitigate the effects of events with significant consequences was identified for these scenarios (see Section 4).
NASA Technical Reports Server (NTRS)
Wells, C.; Kolkhorst, H. E.
1971-01-01
The consumables analysis was performed for the Skylab 2, 3, and 4 Preliminary Reference Interim Revision Flight Plan. The analysis and the results are based on the mission requirements as specified in the flight plan and on other available data. The results indicate that the consumables requirements for the Skylab missions allow for nominal remaining margins (percent) of oxygen, nitrogen, and water as follows: 83.5, 90.8, and 88.7 for mission SL-2; 57.1, 64.1, and 67.3 for SL-3; and 30.8, 44.3, and 46.5 for SL-4. Performance of experiment M509 as scheduled in the flight plan results in venting of the cluster atmosphere overboard. This is due to the addition of nitrogen for propulsion and to the additional oxygen introduced into the cabin when the experiment is performed with the crewman suited.
Re-Analysis Report: Daylighting in Schools, Additional Analysis. Tasks 2.2.1 through 2.2.5.
ERIC Educational Resources Information Center
Heschong, Lisa; Elzeyadi, Ihab; Knecht, Carey
This study expands and validates previous research that found a statistical correlation between the amount of daylight in elementary school classrooms and the performance of students on standardized math and reading tests. The researchers reanalyzed the 1997-1998 school year student performance data from the Capistrano Unified School District…
Performance of Oil Pumping Rings: An Analytical and Experimental Study
NASA Technical Reports Server (NTRS)
Eusepi, M. W.; Walowit, J. A.; Pinkus, O.; Holmes, P.
1986-01-01
A steady-state design computer program was developed to predict the performance of pumping rings as functions of geometry, applied loading, speed, ring modulus, and fluid viscosity. Additional analyses were developed to predict transient behavior of the ring and the effects of temperature rises occurring in the hydrodynamic film between the ring and shaft. The analysis was initially compared with previous experimental data and then used to design additional rings for further testing. Tests were performed with Rulon, carbon-graphite, and babbitt rings. The design analysis was used to size all of the rings and to select the ranges of clearances, thickness, and loading. Although full quantitative agreement was lacking, relative agreement existed in that rings that were predicted to perform well theoretically generally performed well experimentally. Some causes for discrepancies between theory and experiment are believed to be starvation, leakage past the secondary seal at high pressures, and uncertainties in the small clearances and local inlet temperatures to the pumping ring. A separate preliminary analysis was performed for a pumping Leningrader seal. This analysis can be used to predict the film thickness and flow rate through the seal as a function of pressure, speed, loading, and geometry.
Model-Based Linkage Analysis of a Quantitative Trait.
Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H
2017-01-01
Linkage Analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.
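For reference, the parametric LOD statistic underlying model-based linkage analysis is shown below; it is the quantity reported by programs such as LODLINK, though S.A.G.E.'s exact output format is not reproduced here.

```latex
% Model-based LOD score: pedigree likelihood at recombination fraction theta
% versus the likelihood under no linkage (theta = 1/2).
\mathrm{LOD}(\theta) \;=\; \log_{10}\frac{L(\theta)}{L\!\left(\theta = \tfrac{1}{2}\right)}
```

A LOD score above the conventional threshold of 3 is usually taken as evidence in favor of linkage.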
ERIC Educational Resources Information Center
Burk, Erlan
2012-01-01
Aerospace companies needed additional research on technology-based training to verify expectations when enhancing human capital through online systems analysis training. The research for online systems analysis training provided aerospace companies a means to verify expectations for systems analysis technology-based training on business…
NASA Astrophysics Data System (ADS)
Chen, Shuming; Wang, Dengfeng; Liu, Bo
This paper investigates the optimization of sound package thickness for a passenger automobile. The major performance indexes selected to evaluate the process are the SPL of the exterior noise and the weight of the sound package, and the corresponding parameters of the sound package are the thickness of the glass wool with aluminum foil for the first layer, the thickness of the glass fiber for the second layer, and the thickness of the PE foam for the third layer. Because the process fundamentally involves multiple performance characteristics, grey relational analysis, which uses the grey relational grade as a performance index, is employed to determine the optimal combination of thicknesses of the different layers for the designed sound package. Additionally, in order to evaluate the weighting values corresponding to the various performance characteristics, principal component analysis is used to reflect their relative importance properly and objectively. The results of the confirmation experiments show that grey relational analysis coupled with principal component analysis can successfully be applied to find the optimal combination of thicknesses for each layer of the sound package material. Therefore, the presented method can be an effective tool to improve vehicle exterior noise and lower the weight of the sound package. In addition, it will also be helpful for other applications in the automotive industry, such as the First Automobile Works in China, Changan Automobile in China, etc.
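A hedged sketch of the grey relational analysis step is given below: responses are normalized, grey relational coefficients are computed against the ideal sequence, and a weighted grey relational grade ranks the design combinations. The example responses, weights, and distinguishing coefficient are illustrative; the paper derives its weights from principal component analysis.

```python
# Hedged grey relational analysis sketch with assumed example data.
import numpy as np

def grey_relational_grade(responses: np.ndarray, larger_better: np.ndarray,
                          weights: np.ndarray, zeta: float = 0.5) -> np.ndarray:
    """responses: (n_runs, n_responses); returns one grade per experimental run."""
    x = responses.astype(float).copy()
    for j in range(x.shape[1]):
        lo, hi = x[:, j].min(), x[:, j].max()
        if larger_better[j]:
            x[:, j] = (x[:, j] - lo) / (hi - lo)      # larger-is-better normalization
        else:
            x[:, j] = (hi - x[:, j]) / (hi - lo)      # smaller-is-better normalization
    delta = np.abs(1.0 - x)                            # deviation from the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff @ weights

# e.g. two responses: exterior SPL and package weight, both smaller-is-better
runs = np.array([[72.1, 5.3], [70.4, 6.1], [71.2, 4.8]])
grades = grey_relational_grade(runs, larger_better=np.array([False, False]),
                               weights=np.array([0.6, 0.4]))
print(grades.argmax(), grades)   # index of the best thickness combination
```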
NASA Astrophysics Data System (ADS)
Sena Maia, Bruno
The presented work is focused on the characterization of thermally treated recycled and virgin carbon fibers. Their thermal performance, chemical surface composition, and influence on interfacial adhesion in a PP/PA12 hybrid matrix were compared using TGA, FTIR, and XPS analysis. Additionally, differences in the structural performance of the PP/PA12 hybrid matrix using the surface modifiers PMPPIC and MAPP were investigated. Improvements in final mechanical properties of 8% to 17% were reached by the addition of PMPPIC to the PP/PA12 hybrid matrix. For PP/PA12 matrix reinforcement using virgin and recycled carbon fibers, impact energy was improved by up to 98% compared with the MAPP-modified matrix, leading to a novel composite with good energy absorption. Finally, wettability studies and surface free energy analysis of all materials studied support the effect of the addition of PMPPIC, MAPP, and carbon fibers on the final composite surface thermodynamics, providing an important correlation between interfacial adhesion mechanisms and final composite performance.
User Instructions for the Policy Analysis Modeling System (PAMS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNeil, Michael A.; Letschert, Virginie E.; Van Buskirk, Robert D.
PAMS uses country-specific and product-specific data to calculate estimates of impacts of a Minimum Efficiency Performance Standard (MEPS) program. The analysis tool is self-contained in a Microsoft Excel spreadsheet, and requires no links to external data, or special code additions to run. The analysis can be customized to a particular program without additional user input, through the use of the pull-down menus located on the Summary page. In addition, the spreadsheet contains many areas into which user-generated input data can be entered for increased accuracy of projection. The following is a step-by-step guide for using and customizing the tool.
Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.
NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.
Zhang, Bo; Dai, Ji; Zhang, Tao
2017-11-13
In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided by NeoAnalysis, users can easily obtain publication-quality figures without writing complex codes. NeoAnalysis is a powerful and valuable toolbox for users doing electrophysiological experiments.
NASA Astrophysics Data System (ADS)
Hawkins, Donovan Lee
In this thesis I present a software framework for use on the ATLAS muon CSC readout driver. This C++ framework uses plug-in Decoders incorporating hand-optimized assembly language routines to perform sparsification and data formatting. The software is designed with both flexibility and performance in mind, and runs on a custom 9U VME board using Texas Instruments TMS320C6203 digital signal processors. I describe the requirements of the software, the methods used in its design, and the results of testing the software with simulated data. I also present modifications to a chi-squared analysis of the Standard Model and Four Down Quark Model (FDQM) originally done by Dr. Dennis Silverman. The addition of four new experiments to the analysis has little effect on the Standard Model but provides important new restrictions on the FDQM. The method used to incorporate these new experiments is presented, and the consequences of their addition are reviewed.
Yan, Jun; Shi, Songshan; Wang, Hongwei; Liu, Ruimin; Li, Ning; Chen, Yonglin; Wang, Shunchun
2016-01-20
A novel analytical method for neutral monosaccharide composition analysis of plant-derived oligo- and polysaccharides was developed using hydrophilic interaction liquid chromatography coupled to a charged aerosol detector. The effects of column type, additives, pH, and column temperature on retention and separation were evaluated. Additionally, the method could distinguish potential impurities in samples, including chloride, sulfate, and sodium, from sugars. The results of validation demonstrated that this method had good linearity (R² ≥ 0.9981), high precision (relative standard deviation ≤ 4.43%), and adequate accuracy (94.02-103.37% recovery) and sensitivity (detection limit: 15-40 ng). Finally, the monosaccharide compositions of the polysaccharide from Eclipta prostrata L. and stachyose were successfully profiled with this method. This report represents the first time that all of these common monosaccharides could be well separated and determined simultaneously by high performance liquid chromatography without additional derivatization. This newly developed method is convenient, efficient, and reliable for monosaccharide analysis.
Droplet-Based Segregation and Extraction of Concentrated Samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buie, C R; Buckley, P; Hamilton, J
2007-02-23
Microfluidic analysis often requires sample concentration and separation techniques to isolate and detect analytes of interest. Complex or scarce samples may also require an orthogonal separation and detection method or off-chip analysis to confirm results. To perform these additional steps, the concentrated sample plug must be extracted from the primary microfluidic channel with minimal sample loss and dilution. We investigated two extraction techniques: injection of immiscible fluid droplets into the sample stream ("capping") and injection of the sample into an immiscible fluid stream ("extraction"). From our results we conclude that capping is the more effective partitioning technique. Furthermore, this functionality enables additional off-chip post-processing procedures such as DNA/RNA microarray analysis, real-time polymerase chain reaction (RT-PCR), and culture growth to validate chip performance.
Evaluation of Parallel Analysis Methods for Determining the Number of Factors
ERIC Educational Resources Information Center
Crawford, Aaron V.; Green, Samuel B.; Levy, Roy; Lo, Wen-Juo; Scott, Lietta; Svetina, Dubravka; Thompson, Marilyn S.
2010-01-01
Population and sample simulation approaches were used to compare the performance of parallel analysis using principal component analysis (PA-PCA) and parallel analysis using principal axis factoring (PA-PAF) to identify the number of underlying factors. Additionally, the accuracies of the mean eigenvalue and the 95th percentile eigenvalue criteria…
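A hedged sketch of parallel analysis in its PCA form (PA-PCA) is given below: observed eigenvalues are retained while they exceed the mean (or 95th percentile) eigenvalue of correlation matrices built from random data of the same size. The data here are synthetic; the study also examined PA-PAF and other criteria not reproduced here.

```python
# Hedged PA-PCA sketch: compare observed correlation-matrix eigenvalues with
# eigenvalues from many random data sets of the same dimensions.
import numpy as np

def parallel_analysis(data: np.ndarray, n_iter: int = 200, percentile: float = 95.0):
    n, p = data.shape
    obs_eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rng = np.random.default_rng(0)
    rand_eig = np.empty((n_iter, p))
    for i in range(n_iter):
        r = rng.normal(size=(n, p))
        rand_eig[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    mean_crit = rand_eig.mean(axis=0)
    pct_crit = np.percentile(rand_eig, percentile, axis=0)
    # count leading observed eigenvalues that exceed each criterion
    n_mean = int(np.cumprod(obs_eig > mean_crit).sum())
    n_pct = int(np.cumprod(obs_eig > pct_crit).sum())
    return n_mean, n_pct

# two correlated blocks of variables should yield two retained components
rng = np.random.default_rng(4)
f = rng.normal(size=(500, 2))
X = np.hstack([f[:, [0]] + 0.5 * rng.normal(size=(500, 3)),
               f[:, [1]] + 0.5 * rng.normal(size=(500, 3))])
print(parallel_analysis(X))
```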
40 CFR 265.200 - Waste analysis and trial tests.
Code of Federal Regulations, 2010 CFR
2010-07-01
... operator to place the results from each waste analysis and trial test, or the documented information, in ... § 265.200 Waste analysis and trial tests. In addition to performing the ...
40 CFR 265.200 - Waste analysis and trial tests.
Code of Federal Regulations, 2013 CFR
2013-07-01
... operator to place the results from each waste analysis and trial test, or the documented information, in ... § 265.200 Waste analysis and trial tests. In addition to performing the ...
40 CFR 265.200 - Waste analysis and trial tests.
Code of Federal Regulations, 2011 CFR
2011-07-01
... operator to place the results from each waste analysis and trial test, or the documented information, in ... § 265.200 Waste analysis and trial tests. In addition to performing the ...
40 CFR 265.200 - Waste analysis and trial tests.
Code of Federal Regulations, 2012 CFR
2012-07-01
... operator to place the results from each waste analysis and trial test, or the documented information, in ... § 265.200 Waste analysis and trial tests. In addition to performing the ...
40 CFR 265.200 - Waste analysis and trial tests.
Code of Federal Regulations, 2014 CFR
2014-07-01
... operator to place the results from each waste analysis and trial test, or the documented information, in ... § 265.200 Waste analysis and trial tests. In addition to performing the ...
NDARC NASA Design and Analysis of Rotorcraft. Appendix 5; Theory
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2017-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC: NASA Design and Analysis of Rotorcraft. Appendix 3; Theory
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft - Input, Appendix 2
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tilt-rotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft. Appendix 6; Input
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2017-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne R.
2009-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool intended to support both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility; a hierarchy of models; and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single main-rotor and tail-rotor helicopter; tandem helicopter; coaxial helicopter; and tiltrotors. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC - NASA Design and Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2015-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft Theory Appendix 1
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
Vening, W; Willigendael, E M; Tjeertes, E K M; Hulsewé, K W E; Hoofwijk, A G M
2010-02-01
This study was performed to determine the probability of finding additional pathology, requiring treatment or follow-up, in patients referred with symptoms suggestive of haemorrhoids. Secondly, to determine, at what age a flexible sigmoidoscopy should be performed in these patients. All patients referred for the treatment of haemorrhoids over a period of 5 years were prospectively included in a database. Data included patient characteristics, clinical information, histopathological analysis and the sigmoidoscopy results. Haemorrhoids were present in 961 (95.6%) of 1005 patients. Of these patients, 692 (72.0%) patients were free from any additional pathology, 161 (16%) patients had diverticulosis, in 15 (1.5%) patients the sigmoidoscopy showed signs of colitis, 116 (11.5%) patients had polyps and a malignancy was present in eight (0.8%) patients. In the age group between 30-40 and 40-50, the presence of additional pathology increased significantly (P < 0.05). No malignancies were found under the age of 40. The vast majority of patients referred for the treatment and analysis of haemorrhoids were free from any additional pathology. But, over the age of 40, the incidence of additional pathology increased significantly. Therefore, we suggest that a flexible sigmoidoscopy should be performed in all patients over the age of 40, with clinical signs of haemorrhoids.
When Practice Doesn't Lead to Retrieval: An Analysis of Children's Errors with Simple Addition
ERIC Educational Resources Information Center
de Villiers, Celéste; Hopkins, Sarah
2013-01-01
Counting strategies initially used by young children to perform simple addition are often replaced by more efficient counting strategies, decomposition strategies and rule-based strategies until most answers are encoded in memory and can be directly retrieved. Practice is thought to be the key to developing fluent retrieval of addition facts. This…
Overcoming Barriers to Technology Adoption in Small Manufacturing Enterprises (SMEs)
2003-06-01
automates quote-generation, order-processing workflow management, performance analysis, and accounting functions. Ultimately, it will enable Magdic...that Magdic implement an MES instead. The MES, in addition to solving the problem of document management, would automate quote-generation, order processing, workflow management, performance analysis, and accounting functions. To help Magdic personnel learn about the MES, TIDE personnel provided
A balanced perspective: using nonfinancial measures to assess financial performance.
Watkins, Ann L
2003-11-01
Assessments of hospitals' financial performance have traditionally been based exclusively on analysis of a concise set of key financial ratios. One study, however, demonstrates that analysis of a hospital's financial condition can be significantly enhanced with the addition of several nonfinancial measures, including case-mix adjusted admissions, case-mix adjusted admissions per full-time equivalent, and case-mix adjusted admissions per beds in service.
Herbicide Orange Site Characterization Study Naval Construction Battalion Center
1987-01-01
U.S. Testing Laboratories for analysis. Over 200 additional analyses were performed for a variety of quality assurance criteria. The resultant data...[Table 9: NCBC performance audit sample analysis summary (Series 1), listing TCDD concentrations (ppb), reported detection limits, and relative values by sample number]...limit rather than estimating the variance of the results. The sample results were transformed using the natural logarithm. The Shapiro-Wilk W test
Jacobson, Sheldon H; Yu, Ge; Jokela, Janet A
2016-07-01
This paper provides an alternative policy for Ebola entry screening at airports in the United States. This alternative policy considers a social contact tracing (SCT) risk level, in addition to the current health risk level used by the CDC. The performances of both policies are compared based on the scenarios that occur and the expected cost associated with implementing such policies. Sensitivity analysis is performed to identify conditions under which one policy dominates the other policy. This analysis takes into account that the alternative policy requires additional data collection, which is balanced by a more cost-effective allocation of resources. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Hahne, David E.; Glaab, Louis J.
1999-01-01
An investigation was performed to evaluate leading- and trailing-edge flap deflections for optimal aerodynamic performance of a High-Speed Civil Transport concept during takeoff and approach-to-landing conditions. The configuration used for this study was designed by the Douglas Aircraft Company during the 1970's. A 0.1-scale model of this configuration was tested in the Langley 30- by 60-Foot Tunnel with both the original leading-edge flap system and a new leading-edge flap system, which was designed with modern computational flow analysis and optimization tools. Leading- and trailing-edge flap deflections were generated for the original and modified leading-edge flap systems with the computational flow analysis and optimization tools. Although wind tunnel data indicated improvements in aerodynamic performance for the analytically derived flap deflections for both leading-edge flap systems, perturbations of the analytically derived leading-edge flap deflections yielded significant additional improvements in aerodynamic performance. In addition to the aerodynamic performance optimization testing, stability and control data were also obtained. An evaluation of the crosswind landing capability of the aircraft configuration revealed that insufficient lateral control existed as a result of high levels of lateral stability. Deflection of the leading- and trailing-edge flaps improved the crosswind landing capability of the vehicle considerably; however, additional improvements are required.
Tada, Atsuko; Ishizuki, Kyoko; Sugimoto, Naoki; Yoshimatsu, Kayo; Kawahara, Nobuo; Suematsu, Takako; Arifuku, Kazunori; Fukai, Toshio; Tamura, Yukiyoshi; Ohtsuki, Takashi; Tahara, Maiko; Yamazaki, Takeshi; Akiyama, Hiroshi
2015-01-01
"Licorice oil extract" (LOE) (antioxidant agent) is described in the notice of Japanese food additive regulations as a material obtained from the roots and/or rhizomes of Glycyrrhiza uralensis, G. inflata or G. glabra. In this study, we aimed to identify the original Glycyrrhiza species of eight food additive products using LC/MS. Glabridin, a characteristic compound in G. glabra, was specifically detected in seven products, and licochalcone A, a characteristic compound in G. inflata, was detected in one product. In addition, Principal Component Analysis (PCA) (a kind of multivariate analysis) using the data of LC/MS or (1)H-NMR analysis was performed. The data of thirty-one samples, including LOE products used as food additives, ethanol extracts of various Glycyrrhiza species and commercially available Glycyrrhiza species-derived products were assessed. Based on the PCA results, the majority of LOE products was confirmed to be derived from G. glabra. This study suggests that PCA using (1)H-NMR analysis data is a simple and useful method to identify the plant species of origin of natural food additive products.
A Virtual Rat for Simulating Environmental and Exertional Heat Stress
2014-10-02
unsuitable for accurately determining the spatiotemporal temperature distribution in the animal due to heat stress and for performing mechanistic analysis...possible in the original experiments. Finally, we performed additional simulations using the virtual rat to facilitate comparative analysis of the...capability of the virtual rat to account for the circadian rhythmicity in core temperatures during an increase in the external temperature from 22
Periodically-Scheduled Controller Analysis using Hybrid Systems Reachability and Continuization
2015-12-01
tools to verify specifications for hybrid automata do not perform well on such periodically scheduled models. This is due to a combination of the large...an additive nondeterministic input. Reachability tools for hybrid automata can better handle such systems. We further improve the analysis by...formally as a hybrid automaton. However, reachability tools to verify specifications for hybrid automata do not perform well on such periodically
ERIC Educational Resources Information Center
Kim, Jinok; Chung, Gregory K. W. K.
2012-01-01
In this study we compared the effects of two math game designs on math and game performance, using discrete-time survival analysis (DTSA) to model players' risk of not advancing to the next level in the game. 137 students were randomly assigned to two game conditions. The game covered the concept of a unit and the addition of like-sized fractional…
NASA Astrophysics Data System (ADS)
Bastos, Isadora T. S.; Costa, Fanny N.; Silva, Tiago F.; Barreiro, Eliezer J.; Lima, Lídia M.; Braz, Delson; Lombardo, Giuseppe M.; Punzo, Francesco; Ferreira, Fabio F.; Barroso, Regina C.
2017-10-01
LASSBio-1755 is a new cycloalkyl-N-acylhydrazone parent compound designed for the development of derivatives with antinociceptive and anti-inflammatory activities. Although single crystal X-ray diffraction has been considered the gold standard in structure determination, we successfully used X-ray powder diffraction data in the structural determination of newly synthesized compounds, in order to overcome the bottleneck due to the difficulties experienced in harvesting good-quality single crystals of the compounds. We therefore unequivocally assigned the relative configuration (E) to the imine double bond and an s-cis conformation of the amide function of the N-acylhydrazone compound. These features are confirmed by a computational analysis performed on the basis of molecular dynamics calculations, which was extended not only to the structural characteristics but also to the analysis of the anisotropic atomic displacement parameters, further information that is missed in a typical powder diffraction analysis. The data inferred in this way were used to perform additional cycles of refinement and eventually generate a new CIF file with additional physical information. Furthermore, crystal morphology prediction was performed, which is in agreement with the experimental images acquired by scanning electron microscopy, thus providing useful information on possible alternative paths for better crystallization strategies.
The Development of a Handbook for Astrobee F Performance and Stability Analysis
NASA Technical Reports Server (NTRS)
Wolf, R. S.
1982-01-01
An Astrobee F performance and stability analysis is presented, for use by the NASA Sounding Rocket Division. The performance analysis provides information regarding altitude, mach number, dynamic pressure, and velocity as functions of time since launch. It is found that payload weight has the greatest effect on performance, and performance prediction accuracy was calculated to remain within 1%. In addition, to assure sufficient flight stability, a predicted rigid-body static margin of at least 8% of the total vehicle length is required. Finally, fin cant angle predictions are given in order to achieve a 2.5 cycle per second burnout roll rate, based on obtaining 75% of the steady roll rate. It is noted that this method can be used by flight performance engineers to create a similar handbook for any sounding rocket series.
Siebert, Uwe; Rochau, Ursula; Claxton, Karl
2013-01-01
Decision analysis (DA) and value-of-information (VOI) analysis provide a systematic, quantitative methodological framework that explicitly considers the uncertainty surrounding the currently available evidence to guide healthcare decisions. In medical decision making under uncertainty, there are two fundamental questions: 1) What decision should be made now given the best available evidence (and its uncertainty)?; 2) Subsequent to the current decision and given the magnitude of the remaining uncertainty, should we gather further evidence (i.e., perform additional studies), and if yes, which studies should be undertaken (e.g., efficacy, side effects, quality of life, costs), and what sample sizes are needed? Using the currently best available evidence, VoI analysis focuses on the likelihood of making a wrong decision if the new intervention is adopted. The value of performing further studies and gathering additional evidence is based on the extent to which the additional information will reduce this uncertainty. A quantitative framework allows for the valuation of the additional information that is generated by further research, and considers the decision maker's objectives and resource constraints. Claxton et al. summarise: "Value of information analysis can be used to inform a range of policy questions including whether a new technology should be approved based on existing evidence, whether it should be approved but additional research conducted or whether approval should be withheld until the additional evidence becomes available." [Claxton K. Value of information entry in Encyclopaedia of Health Economics, Elsevier, forthcoming 2014.] The purpose of this tutorial is to introduce the framework of systematic VoI analysis to guide further research. In our tutorial article, we explain the theoretical foundations and practical methods of decision analysis and value-of-information analysis. To illustrate, we use a simple case example of a foot ulcer (e.g., with diabetes) as well as key references from the literature, including examples for the use of the decision-analytic VoI framework by health technology assessment agencies to guide further research. These concepts may guide stakeholders involved or interested in how to determine whether or not and, if so, which additional evidence is needed to make decisions. Copyright © 2013. Published by Elsevier GmbH.
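The value-of-information logic summarized above can be made concrete with a small numerical example. The sketch below computes the expected value of perfect information (EVPI) for a two-option decision under parameter uncertainty by Monte Carlo; the net-benefit model, distributions, and numbers are purely illustrative assumptions and are not taken from the tutorial.

```python
# Hedged sketch: expected value of perfect information (EVPI) by Monte Carlo.
# EVPI = E_theta[ max_d NB(d, theta) ] - max_d E_theta[ NB(d, theta) ]
# All parameters below are made up for illustration (loosely inspired by a foot-ulcer example).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

p_new = rng.beta(8, 4, n)        # uncertain effectiveness of the new intervention
p_std = 0.55                     # standard care, assumed known
wtp = 20_000                     # willingness to pay per healed ulcer
cost_new, cost_std = 4_000, 1_000

nb_new = wtp * p_new - cost_new  # net benefit of adopting, per parameter draw
nb_std = wtp * p_std - cost_std  # net benefit of standard care

ev_current_info = max(nb_new.mean(), nb_std.mean())   # decide now, then average over uncertainty
ev_perfect_info = np.maximum(nb_new, nb_std).mean()   # learn the parameter first, then decide
evpi = ev_perfect_info - ev_current_info
print(f"EVPI per patient: {evpi:.0f}")
```

A positive EVPI bounds how much further research could be worth; if the cost of the proposed study exceeds the population-level EVPI, gathering the additional evidence cannot be worthwhile under this framework.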
Araki, Kenichiro; Shirabe, Ken; Watanabe, Akira; Kubo, Norio; Sasaki, Shigeru; Suzuki, Hideki; Asao, Takayuki; Kuwano, Hiroyuki
2017-01-01
Although single-incision laparoscopic cholecystectomy is now widely performed in patients with cholecystitis, some cases require an additional port to complete the procedure. In this study, we focused on risk factors for requiring an additional port in this surgery. We performed single-incision cholecystectomy in 75 patients with acute cholecystitis or after cholecystitis between 2010 and 2014 at Gunma University Hospital. Surgical indications followed the TG13 guidelines. Our standard procedure for single-incision cholecystectomy routinely uses two needlescopic devices. We used logistic regression analysis to identify the risk factors associated with use of an additional full-size port (5 or 10 mm). Surgical outcome was acceptable without biliary injury. Nine patients (12.0%) required an additional port, and one patient (1.3%) required conversion to open cholecystectomy because of severe adhesions around the cystic duct and common bile duct. In multivariate analysis, high C-reactive protein (CRP) values (>7.0 mg/dl) during cholecystitis attacks were significantly correlated with the need for an additional port (P = 0.009), with a sensitivity of 55.6%, specificity of 98.5%, and accuracy of 93.3%. This study indicated that the severe inflammation indicated by high CRP values during cholecystitis attacks predicts the need for an additional port. J. Med. Invest. 64: 245-249, August, 2017.
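As a side note on the reported accuracy figures, sensitivity and specificity for a dichotomized predictor such as CRP > 7.0 mg/dl follow directly from the 2x2 classification table. A minimal sketch follows; the counts are hypothetical, chosen only so that they reproduce percentages similar to those reported, and are not the study's patient-level data.

```python
# Hedged sketch: diagnostic accuracy of a dichotomized risk factor.
# Counts are illustrative only (9 of 75 patients needing an additional port).
def diagnostic_accuracy(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)            # detected among those needing an extra port
    specificity = tn / (tn + fp)            # correctly excluded among those who did not
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

sens, spec, acc = diagnostic_accuracy(tp=5, fn=4, tn=65, fp=1)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, accuracy {acc:.1%}")
```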
Overview of MSFC AMSD Integrated Modeling and Analysis
NASA Technical Reports Server (NTRS)
Cummings, Ramona; Russell, Kevin (Technical Monitor)
2002-01-01
Structural, thermal, dynamic, and optical models of the NGST AMSD mirror assemblies are being finalized and integrated for predicting cryogenic vacuum test performance of the developing designs. Analyzers in use by the MSFC Modeling and Analysis Team are identified, with overview of approach to integrate simulated effects. Guidelines to verify the individual models and calibration cases for comparison with the vendors' analyses are presented. In addition, baseline and proposed additional scenarios for the cryogenic vacuum testing are briefly described.
29 CFR 1470.40 - Monitoring and reporting program performance.
Code of Federal Regulations, 2010 CFR
2010-07-01
29 Labor 4 (2010-07-01): Monitoring and reporting program performance. Section 1470.40, Labor Regulations Relating to Labor (Continued), FEDERAL MEDIATION AND CONCILIATION SERVICE... not met. (iii) Additional pertinent information including, when appropriate, analysis and explanation...
Factors Associated With Surgery Clerkship Performance and Subsequent USMLE Step Scores.
Dong, Ting; Copeland, Annesley; Gangidine, Matthew; Schreiber-Gregory, Deanna; Ritter, E Matthew; Durning, Steven J
2018-03-12
We conducted an in-depth empirical investigation to achieve a better understanding of the surgery clerkship from multiple perspectives, including the influence of clerkship sequence on performance, the relationship between self-logged work hours and performance, as well as the association between surgery clerkship performance with subsequent USMLE Step exams' scores. The study cohort consisted of medical students graduating between 2015 and 2018 (n = 687). The primary measures of interest were clerkship sequence (internal medicine clerkship before or after surgery clerkship), self-logged work hours during surgery clerkship, surgery NBME subject exam score, surgery clerkship overall grade, and Step 1, Step 2 CK, and Step 3 exam scores. We reported the descriptive statistics and conducted correlation analysis, stepwise linear regression analysis, and variable selection analysis of logistic regression to answer the research questions. Students who completed internal medicine clerkship prior to surgery clerkship had better performance on surgery subject exam. The subject exam score explained an additional 28% of the variance of the Step 2 CK score, and the clerkship overall score accounted for an additional 24% of the variance after the MCAT scores and undergraduate GPA were controlled. Our finding suggests that the clerkship sequence does matter when it comes to performance on the surgery NBME subject exam. Performance on the surgery subject exam is predictive of subsequent performance on future USMLE Step exams. Copyright © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Practical and Efficient Searching in Proteomics: A Cross Engine Comparison
Paulo, Joao A.
2014-01-01
Background: Analysis of large datasets produced by mass spectrometry-based proteomics relies on database search algorithms to sequence peptides and identify proteins. Several such scoring methods are available, each based on different statistical foundations and thereby not producing identical results. Here, the aim is to compare peptide and protein identifications using multiple search engines and examine the additional proteins gained by increasing the number of technical replicate analyses. Methods: A HeLa whole cell lysate was analyzed on an Orbitrap mass spectrometer for 10 technical replicates. The data were combined and searched using Mascot, SEQUEST, and Andromeda. Comparisons were made of peptide and protein identifications among the search engines. In addition, searches using each engine were performed with incrementing number of technical replicates. Results: The number and identity of peptides and proteins differed across search engines. For all three search engines, the differences in proteins identifications were greater than the differences in peptide identifications indicating that the major source of the disparity may be at the protein inference grouping level. The data also revealed that analysis of 2 technical replicates can increase protein identifications by up to 10-15%, while a third replicate results in an additional 4-5%. Conclusions: The data emphasize two practical methods of increasing the robustness of mass spectrometry data analysis. The data show that 1) using multiple search engines can expand the number of identified proteins (union) and validate protein identifications (intersection), and 2) analysis of 2 or 3 technical replicates can substantially expand protein identifications. Moreover, information can be extracted from a dataset by performing database searching with different engines and performing technical repeats, which requires no additional sample preparation and effectively utilizes research time and effort. PMID:25346847
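The union/intersection idea in the conclusions lends itself to a direct illustration with set operations. In the sketch below the engine names are those used in the study, but the protein accessions are hypothetical placeholders rather than results from the HeLa dataset.

```python
# Hedged sketch: combining protein identifications from multiple search engines.
# Accession lists are made up for illustration.
mascot    = {"P04637", "P60709", "P68871", "Q9Y6K9"}
sequest   = {"P04637", "P60709", "P62258"}
andromeda = {"P04637", "P68871", "P62258"}

union = mascot | sequest | andromeda          # expands the number of identified proteins
intersection = mascot & sequest & andromeda   # cross-validated identifications
print(len(union), "proteins in union;", len(intersection), "in intersection:", sorted(intersection))
```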
Practical and Efficient Searching in Proteomics: A Cross Engine Comparison.
Paulo, Joao A
2013-10-01
Analysis of large datasets produced by mass spectrometry-based proteomics relies on database search algorithms to sequence peptides and identify proteins. Several such scoring methods are available, each based on different statistical foundations and thereby not producing identical results. Here, the aim is to compare peptide and protein identifications using multiple search engines and examine the additional proteins gained by increasing the number of technical replicate analyses. A HeLa whole cell lysate was analyzed on an Orbitrap mass spectrometer for 10 technical replicates. The data were combined and searched using Mascot, SEQUEST, and Andromeda. Comparisons were made of peptide and protein identifications among the search engines. In addition, searches using each engine were performed with incrementing number of technical replicates. The number and identity of peptides and proteins differed across search engines. For all three search engines, the differences in proteins identifications were greater than the differences in peptide identifications indicating that the major source of the disparity may be at the protein inference grouping level. The data also revealed that analysis of 2 technical replicates can increase protein identifications by up to 10-15%, while a third replicate results in an additional 4-5%. The data emphasize two practical methods of increasing the robustness of mass spectrometry data analysis. The data show that 1) using multiple search engines can expand the number of identified proteins (union) and validate protein identifications (intersection), and 2) analysis of 2 or 3 technical replicates can substantially expand protein identifications. Moreover, information can be extracted from a dataset by performing database searching with different engines and performing technical repeats, which requires no additional sample preparation and effectively utilizes research time and effort.
ERIC Educational Resources Information Center
Kar, Tugrul
2015-01-01
This study aimed to investigate how the semantic structures of problems posed by sixth-grade middle school students for the addition of fractions affect their problem-posing performance. The students were presented with symbolic operations involving the addition of fractions and asked to pose two different problems related to daily-life situations…
ERIC Educational Resources Information Center
Astrom, Raven L.; Wadsworth, Sally J.; Olson, Richard K.; Willcutt, Erik G.; DeFries, John C.
2012-01-01
Reading performance data from 254 pairs of identical (MZ) and 420 pairs of fraternal (DZ) twins, 8.0 to 20.0 years of age, were subjected to multiple regression analyses. An extension of the DeFries-Fulker (DF) analysis (DeFries & Fulker, 1985, 1988) that facilitated inclusion of data from 303 of their nontwin siblings was employed. In addition to…
Thermodynamic Cycle and CFD Analyses for Hydrogen Fueled Air-breathing Pulse Detonation Engines
NASA Technical Reports Server (NTRS)
Povinelli, Louis A.; Yungster, Shaye
2002-01-01
This paper presents the results of a thermodynamic cycle analysis of a pulse detonation engine (PDE) using a hydrogen-air mixture at static conditions. The cycle performance results, namely the specific thrust, fuel consumption and impulse are compared to a single cycle CFD analysis for a detonation tube which considers finite rate chemistry. The differences in the impulse values were indicative of the additional performance potential attainable in a PDE.
Effects of glycine on motor performance in rats after traumatic spinal cord injury.
Gonzalez-Piña, Rigoberto; Nuño-Licona, Alberto
2007-01-01
It has been reported that glycine improves some functions lost after spinal cord injury (SCI). In order to assess the effects of glycine administration on motor performance after SCI, we used fifteen male Wistar rats distributed into three groups: sham (n = 3), spinal cord injury (n = 6), and spinal cord injury + glycine (n = 6). Motor performance was assessed using the beam-walking paradigm and footprint analysis. Results showed that for all animals with spinal cord injury, scores in the beam-walking increased, which is an indication of increased motor deficit. In addition, footprint analysis showed a decrease in stride length and an increase in stride angle, additional indicators of motor deficit. These effects trended towards recovery after 8 weeks of recording and trended toward improvement by glycine administration; the effect was not significant. These results suggest that glycine replacement alone is not sufficient to improve the motor deficits that occur after SCI.
Multiple performance measures are needed to evaluate triage systems in the emergency department.
Zachariasse, Joany M; Nieboer, Daan; Oostenbrink, Rianne; Moll, Henriëtte A; Steyerberg, Ewout W
2018-02-01
Emergency department triage systems can be considered prediction rules with an ordinal outcome, where different directions of misclassification have different clinical consequences. We evaluated strategies to compare the performance of triage systems and aimed to propose a set of performance measures that should be used in future studies. We identified performance measures based on literature review and expert knowledge. Their properties are illustrated in a case study evaluating two triage modifications in a cohort of 14,485 pediatric emergency department visits. Strengths and weaknesses of the performance measures were systematically appraised. Commonly reported performance measures are measures of statistical association (34/60 studies) and diagnostic accuracy (17/60 studies). The case study illustrates that none of the performance measures fulfills all criteria for triage evaluation. Decision curves are the performance measures with the most attractive features but require dichotomization. In addition, paired diagnostic accuracy measures can be recommended for dichotomized analysis, and the triage-weighted kappa and Nagelkerke's R² for ordinal analyses. Other performance measures provide limited additional information. When comparing modifications of triage systems, decision curves and diagnostic accuracy measures should be used in a dichotomized analysis, and the triage-weighted kappa and Nagelkerke's R² in an ordinal approach. Copyright © 2017 Elsevier Inc. All rights reserved.
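For the ordinal analysis recommended above, a weighted kappa can be computed directly from two vectors of triage levels; the sketch below uses scikit-learn's generic quadratically weighted kappa on toy data. Note this is an assumption-laden stand-in: the study's "triage-weighted kappa" uses clinically motivated weights rather than the generic quadratic weights shown here, and the levels are invented.

```python
# Hedged sketch: weighted kappa for ordinal triage levels (1 = highest urgency).
# Toy data only; the triage-weighted kappa in the paper uses different weights.
from sklearn.metrics import cohen_kappa_score

assigned_level  = [1, 2, 3, 3, 4, 5, 2, 3, 4, 5]   # triage system output
reference_level = [1, 2, 2, 3, 4, 5, 3, 3, 5, 5]   # reference standard
kappa = cohen_kappa_score(assigned_level, reference_level, weights="quadratic")
print(f"quadratically weighted kappa: {kappa:.2f}")
```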
Self-serving bias effects on job analysis ratings.
Cucina, Jeffrey M; Martin, Nicholas R; Vasilopoulos, Nicholas L; Thibodeuax, Henry F
2012-01-01
The purpose of this study was to investigate whether worker-oriented job analysis importance ratings were influenced by subject matter experts' (SME) standing (as measured by self-rated performance) on a competency. This type of relationship (whereby SMEs indicate that the traits they have are important for successful job performance) is an example of the self-serving bias (which is widely described in the social cognition literature and rarely described in the industrial/organizational psychology literature). An archival dataset covering 57 clerical and technical occupations with 26,682 participants was used. Support was found for the relationship between self-rated performance and importance ratings. Significant relationships (typically in the .30s) were observed for all 31 competencies that were studied. Controls were taken to account for common method bias and differences in the competencies required for each of the 57 occupations. Past research has demonstrated the effects of the self-serving bias on personality-based job analysis ratings. This study was the first to extend these findings to traditional job analysis, which covers other competencies in addition to personality. In addition, this study is the first to use operational field data instead of laboratory data.
Morais, E C; Esmerino, E A; Monteiro, R A; Pinheiro, C M; Nunes, C A; Cruz, A G; Bolini, Helena M A
2016-01-01
The addition of prebiotics and sweeteners in chocolate dairy desserts opens up new opportunities to develop dairy desserts that, besides having a lower calorie intake, still have functional properties. In this study, prebiotic low-sugar dairy desserts were evaluated by 120 consumers using a 9-point hedonic scale, in relation to the attributes of appearance, aroma, flavor, texture, and overall liking. An internal preference map using parallel factor analysis (PARAFAC) and principal component analysis (PCA) was constructed from the consumer data. In addition, physical (texture profile) and optical (instrumental color) analyses were also performed. Prebiotic dairy desserts containing sucrose and sucralose were equally liked by the consumers. These samples were characterized by firmness and gumminess, which can be considered drivers of liking by the consumers. Optimization of the prebiotic low-sugar dessert formulation should take into account the choice of ingredients that contribute in a positive manner to these parameters. PARAFAC allowed the extraction of more relevant information than PCA, demonstrating that consumer acceptance analysis can be evaluated by simultaneously considering several attributes. Multiple factor analysis reported an Rv value of 0.964, suggesting excellent concordance between both methods. © 2015 Institute of Food Technologists®
NASA Technical Reports Server (NTRS)
Watts, Michael E.; Dejpour, Shabob R.
1989-01-01
The changes made to the data analysis and management program DATAMAP (Data from Aeromechanics Test and Analytics - Management and Analysis Package) are detailed. These changes are made to Version 3.07 (released February 1981) and are called Version 4.0. Version 4.0 improvements were performed by Sterling Software under contract to NASA Ames Research Center. The increased capabilities instituted in this version include the breakout of the source code into modules for ease of modification, addition of a more accurate curve fit routine, ability to handle higher frequency data, additional data analysis features, and improvements in the functionality of existing features. These modifications will allow DATAMAP to be used on more data sets and will make future modifications and additions easier to implement.
Griffiths, Rian L; Bunch, Josephine
2012-07-15
Matrix-assisted laser desorption/ionization (MALDI) is a powerful technique for the direct analysis of lipids in complex mixtures and thin tissue sections, making it an extremely attractive method for profiling lipids in health and disease. Lipids are readily detected as [M+H](+), [M+Na](+) and [M+K](+) ions in positive ion MALDI mass spectrometry (MS) experiments. This not only decreases sensitivity, but can also lead to overlapping m/z values of the various adducts of different lipids. Additives can be used to promote formation of a particular adduct, improving sensitivity, reducing spectral complexity and enhancing structural characterization in collision-induced dissociation (CID) experiments. Li(+), Na(+), K(+), Cs(+) and NH(4)(+) cations were considered as a range of salt types (acetates, chlorides and nitrates) incorporated into DHB matrix solutions at concentrations between 5 and 80 mM. The study was extended to evaluate the effect of these additives on CID experiments of a lipid standard, after optimization of collision energy parameters. Experiments were performed on a hybrid quadrupole time-of-flight (QqTOF) instrument. The systematic evaluation of new and existing additives in MALDI-MS and MS/MS of lipids demonstrated the importance of additive cation and anion choice and concentration for tailoring spectral results. The recommended choice of additive depends on the desired outcomes of the experiment to be performed (MS or MS/MS). Nitrates are found to be particularly useful additives for lipid analysis. Copyright © 2012 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Brekhna, Brekhna; Mahmood, Arif; Zhou, Yuanfeng; Zhang, Caiming
2017-11-01
Superpixels have gradually become popular in computer vision and image processing applications. However, no comprehensive study has been performed to evaluate the robustness of superpixel algorithms in regard to common forms of noise in natural images. We evaluated the robustness of 11 recently proposed algorithms to different types of noise. The images were corrupted with various degrees of Gaussian blur, additive white Gaussian noise, and impulse noise that either made the object boundaries weak or added extra information to it. We performed a robustness analysis of simple linear iterative clustering (SLIC), Voronoi Cells (VCells), flooding-based superpixel generation (FCCS), bilateral geodesic distance (Bilateral-G), superpixel via geodesic distance (SSS-G), manifold SLIC (M-SLIC), Turbopixels, superpixels extracted via energy-driven sampling (SEEDS), lazy random walk (LRW), real-time superpixel segmentation by DBSCAN clustering, and video supervoxels using partially absorbing random walks (PARW) algorithms. The evaluation process was carried out both qualitatively and quantitatively. For quantitative performance comparison, we used achievable segmentation accuracy (ASA), compactness, under-segmentation error (USE), and boundary recall (BR) on the Berkeley image database. The results demonstrated that all algorithms suffered performance degradation due to noise. For Gaussian blur, Bilateral-G exhibited optimal results for ASA and USE measures, SLIC yielded optimal compactness, whereas FCCS and DBSCAN remained optimal for BR. For the case of additive Gaussian and impulse noises, FCCS exhibited optimal results for ASA, USE, and BR, whereas Bilateral-G remained a close competitor in ASA and USE for Gaussian noise only. Additionally, Turbopixel demonstrated optimal performance for compactness for both types of noise. Thus, no single algorithm was able to yield optimal results for all three types of noise across all performance measures. Conclusively, to solve real-world problems effectively, more robust superpixel algorithms must be developed.
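Of the quantitative measures listed above, achievable segmentation accuracy (ASA) is straightforward to compute from label maps: each superpixel is credited with its largest overlap with any ground-truth segment, and ASA is the credited fraction of all pixels. The following is a minimal sketch, assuming two integer label images of equal shape; it is an illustrative implementation of the general ASA definition, not the evaluation code used in the study.

```python
# Hedged sketch of achievable segmentation accuracy (ASA) from two label maps.
import numpy as np

def asa(superpixels: np.ndarray, ground_truth: np.ndarray) -> float:
    """For each superpixel, count its largest overlap with any ground-truth
    segment; ASA is the summed best overlap divided by the total pixel count."""
    total = 0
    for sp in np.unique(superpixels):
        mask = superpixels == sp
        _, counts = np.unique(ground_truth[mask], return_counts=True)
        total += counts.max()
    return total / superpixels.size

# Toy 4x4 example
sp = np.array([[0, 0, 1, 1],
               [0, 0, 1, 1],
               [2, 2, 3, 3],
               [2, 2, 3, 3]])
gt = np.array([[0, 0, 0, 1],
               [0, 0, 1, 1],
               [2, 2, 2, 2],
               [2, 2, 2, 2]])
print(asa(sp, gt))  # 15/16 = 0.9375
```

Under-segmentation error and boundary recall can be computed in the same spirit from the same pair of label maps, which is why the three measures are usually reported together.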
Do Business Communication Courses Improve Student Performance in Introductory Marketing?
ERIC Educational Resources Information Center
Marcal, Leah E.; Hennessey, Judith E.; Curren, Mary T.; Roberts, William W.
2005-01-01
In this study, the authors investigated whether completion of a business communications course improved student performance in an introductory marketing management course. Regression analysis indicated that students who completed the communications course received higher grades than the otherwise comparable students. In addition, marketing majors…
Step 1: C3 Flight Demo Data Analysis Plan
NASA Technical Reports Server (NTRS)
2005-01-01
The Data Analysis Plan (DAP) describes the data analysis that the C3 Work Package (WP) will perform in support of the Access 5 Step 1 C3 flight demonstration objectives as well as the processes that will be used by the Flight IPT to gather and distribute the data collected to satisfy those objectives. In addition to C3 requirements, this document will encompass some Human Systems Interface (HSI) requirements in performing the C3 flight demonstrations. The C3 DAP will be used as the primary interface requirements document between the C3 Work Package and Flight Test organizations (Flight IPT and Non-Access 5 Flight Programs). In addition to providing data requirements for Access 5 flight test (piggyback technology demonstration flights, dedicated C3 technology demonstration flights, and Airspace Operations Demonstration flights), the C3 DAP will be used to request flight data from Non-Access 5 flight programs for C3-related data products.
1987-09-01
Visual Communication. Although this task is performed several times, the task is performed at different points during the mission. In addition, the...[fragment of a mission task listing: perform visual communication (give thumbs-up signal when ready for takeoff; check lights on pri-fly); perform takeoff, aircraft operating clear of ship...operate FM; operate ICS; perform visual communication; operate IFF transponder; maintain mission and fuel logs; perform checklists; perform AMCM]
Payload Performance Analysis for a Reusable Two-Stage-to-Orbit Vehicle
NASA Technical Reports Server (NTRS)
Tartabini, Paul V.; Beaty, James R.; Lepsch, Roger A.; Gilbert, Michael G.
2015-01-01
This paper investigates a unique approach in the development of a reusable launch vehicle where, instead of designing the vehicle to be reusable from its inception, as was done for the Space Shuttle, an expendable two stage launch vehicle is evolved over time into a reusable launch vehicle. To accomplish this objective, each stage is made reusable by adding the systems necessary to perform functions such as thermal protection and landing, without significantly altering the primary subsystems and outer mold line of the original expendable vehicle. In addition, some of the propellant normally used for ascent is used instead for additional propulsive maneuvers after staging in order to return both stages to the launch site, keep loads within acceptable limits and perform a soft landing. This paper presents a performance analysis that was performed to investigate the feasibility of this approach by quantifying the reduction in payload capability of the original expendable launch vehicle after accounting for the mass additions, trajectory changes and increased propellant requirements necessary for reusability. Results show that it is feasible to return both stages to the launch site with a positive payload capability equal to approximately 50 percent of an equivalent expendable launch vehicle. Further discussion examines the ability to return a crew/cargo capsule to the launch site and presents technical challenges that would have to be overcome.
Efficacy of Virtual Patients in Medical Education: A Meta-Analysis of Randomized Studies
ERIC Educational Resources Information Center
Consorti, Fabrizio; Mancuso, Rosaria; Nocioni, Martina; Piccolo, Annalisa
2012-01-01
A meta-analysis was performed to assess the Effect Size (ES) from randomized studies comparing the effect of educational interventions in which Virtual patients (VPs) were used either as an alternative method or additive to usual curriculum versus interventions based on more traditional methods. Meta-analysis was designed, conducted and reported…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Im, Piljae; Munk, Jeffrey D; Gehl, Anthony C
2015-06-01
A research project “Evaluation of Variable Refrigerant Flow (VRF) Systems Performance and the Enhanced Control Algorithm on Oak Ridge National Laboratory’s (ORNL’s) Flexible Research Platform” was performed to (1) install and validate the performance of Samsung VRF systems compared with the baseline rooftop unit (RTU) variable-air-volume (VAV) system and (2) evaluate the enhanced control algorithm for the VRF system on the two-story flexible research platform (FRP) in Oak Ridge, Tennessee. Based on the VRF system designed by Samsung and ORNL, the system was installed from February 18 through April 15, 2014. The final commissioning and system optimization were completed on June 2, 2014, and the initial test for system operation was started the following day, June 3, 2014. In addition, the enhanced control algorithm was implemented and updated on June 18. After a series of additional commissioning actions, the energy performance data from the RTU and the VRF system were monitored from July 7, 2014, through February 28, 2015. Data monitoring and analysis were performed for the cooling season and heating season separately, and the calibrated simulation model was developed and used to estimate the energy performance of the RTU and VRF systems. This final report includes discussion of the design and installation of the VRF system, the data monitoring and analysis plan, the cooling season and heating season data analysis, and the building energy modeling study.
Shenoy, Shailesh M
2016-07-01
A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software's support, training and documentation. Also, one must consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute their methods to the community using the software and the potential for achieving automation to improve productivity.
Work-family conflicts and work performance.
Roth, Lawrence; David, Emily M
2009-08-01
Prior research indicates that work-family conflict interferes with family far more than it interferes with work. Conservation of resources provides a possible explanation: when shifting resources from family is no longer sufficient to maintain satisfactory work performance, then workers must acquire additional resources or reduce investments in work. One source of such additional resources could be high performance peers in the work group. The performance of workers with resource-rich peers may be less adversely affected by work-family conflict. In this study, 136 employees of a wholesale distribution firm (61% women, 62% minority) working in groups of 7 to 11 in manual labor and low-level administrative jobs rated their own work-to-family conflict. Their supervisors rated workers' performance. Hierarchical regression analysis indicated that work-to-family conflict increasingly adversely affected job performance as work group performance decreased. Hence, work group performance may be an important moderator of the effects of work-family conflict.
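A moderated effect of the kind reported above is typically tested by adding an interaction term to the regression. The sketch below illustrates that approach with statsmodels on simulated data; the variable names, effect sizes, and data are hypothetical assumptions, not the study's dataset or exact analysis.

```python
# Hedged sketch: moderation of work-to-family conflict (WFC) effects on performance
# by work-group performance, tested via an interaction term. Simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 136
df = pd.DataFrame({
    "wfc": rng.normal(0, 1, n),          # self-rated work-to-family conflict
    "group_perf": rng.normal(0, 1, n),   # performance level of the work group
})
# Simulate a weaker negative WFC effect when group performance is high (positive interaction).
df["performance"] = (-0.3 * df["wfc"] + 0.4 * df["group_perf"]
                     + 0.25 * df["wfc"] * df["group_perf"] + rng.normal(0, 1, n))

model = smf.ols("performance ~ wfc * group_perf", data=df).fit()
print(model.params)  # the wfc:group_perf coefficient carries the moderation effect
```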
On an additive partial correlation operator and nonparametric estimation of graphical models.
Lee, Kuang-Yao; Li, Bing; Zhao, Hongyu
2016-09-01
We introduce an additive partial correlation operator as an extension of partial correlation to the nonlinear setting, and use it to develop a new estimator for nonparametric graphical models. Our graphical models are based on additive conditional independence, a statistical relation that captures the spirit of conditional independence without having to resort to high-dimensional kernels for its estimation. The additive partial correlation operator completely characterizes additive conditional independence, and has the additional advantage of putting marginal variation on appropriate scales when evaluating interdependence, which leads to more accurate statistical inference. We establish the consistency of the proposed estimator. Through simulation experiments and analysis of the DREAM4 Challenge dataset, we demonstrate that our method performs better than existing methods in cases where the Gaussian or copula Gaussian assumption does not hold, and that a more appropriate scaling for our method further enhances its performance.
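For orientation, the classical linear partial correlation that this operator generalizes can be written (background notation only, not the paper's operator definition):

$$\rho_{XY\cdot Z} \;=\; \frac{\rho_{XY} - \rho_{XZ}\,\rho_{YZ}}{\sqrt{(1-\rho_{XZ}^{2})(1-\rho_{YZ}^{2})}},$$

equivalently the correlation between the residuals of X and Y after regressing each on Z. Roughly speaking, the additive partial correlation operator extends this residual-based idea to additive (nonparametric) rather than linear dependence on the conditioning variables.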
On an additive partial correlation operator and nonparametric estimation of graphical models
Li, Bing; Zhao, Hongyu
2016-01-01
We introduce an additive partial correlation operator as an extension of partial correlation to the nonlinear setting, and use it to develop a new estimator for nonparametric graphical models. Our graphical models are based on additive conditional independence, a statistical relation that captures the spirit of conditional independence without having to resort to high-dimensional kernels for its estimation. The additive partial correlation operator completely characterizes additive conditional independence, and has the additional advantage of putting marginal variation on appropriate scales when evaluating interdependence, which leads to more accurate statistical inference. We establish the consistency of the proposed estimator. Through simulation experiments and analysis of the DREAM4 Challenge dataset, we demonstrate that our method performs better than existing methods in cases where the Gaussian or copula Gaussian assumption does not hold, and that a more appropriate scaling for our method further enhances its performance. PMID:29422689
Differential thermal analysis of lunar soil simulant
NASA Technical Reports Server (NTRS)
Tucker, D.; Setzer, A.
1991-01-01
Differential thermal analysis of a lunar soil simulant, 'Minnesota Lunar Simulant-1' (MLS-1), was performed. The MLS-1 was tested in as-received form (in glass form) and with another silica. The silica addition was seen to depress nucleation events, which leads to a better glass former.
Peterson, Diana Coomes; Mlynarczyk, Gregory S A
2016-11-01
This study examined whether student learning outcome measures are influenced by the addition of three-dimensional and digital teaching tools to a traditional dissection and lecture learning format curriculum. The study was performed in a semester-long graduate level course that incorporated both gross anatomy and neuroanatomy curricula. Methods compared student examination performance on material taught using lecture and cadaveric dissection teaching tools alone or lecture and cadaveric dissection augmented with computerized three-dimensional teaching tools. Additional analyses were performed to examine potential correlations between question difficulty and format, previous student performance (i.e., undergraduate grade point average), and a student perception survey. The results indicated that students performed better on material for which three-dimensional (3D) technologies were utilized in conjunction with lecture and dissection methodologies. The improvement in performance was observed across the student population, primarily on laboratory examinations. Although student performance increased, students did not perceive that the use of the additional 3D technology significantly influenced their learning. The results indicate that the addition of 3D learning tools can influence long-term retention of gross anatomy material and should be considered as a beneficial supplement for anatomy courses. Anat Sci Educ 9: 529-536. © 2016 American Association of Anatomists.
Development of small scale cluster computer for numerical analysis
NASA Astrophysics Data System (ADS)
Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.
2017-09-01
In this study, two personal computers were successfully networked together to form a small-scale cluster. Each of the processors involved is a multicore processor with four cores, giving the cluster eight processing cores in total. The cluster runs the Ubuntu 14.04 LINUX environment with an MPI implementation (MPICH2). Two main tests were conducted on the cluster: a communication test and a performance test. The communication test was done to make sure that the computers are able to pass the required information without any problem, using a simple MPI Hello program written in the C language. Additionally, a performance test was performed to demonstrate that the cluster's calculation performance is much better than that of a single-CPU computer. In this performance test, the same code was run four times, using a single node, 2 processors, 4 processors, and 8 processors. The results show that with additional processors, the time required to solve the problem decreases; the calculation time is roughly halved each time the number of processors is doubled. To conclude, we successfully developed a small-scale cluster computer using common hardware that is capable of higher computing power than a single-CPU computer, which can be beneficial for research that requires high computing power, especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics analysis.
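As a loose illustration of the two tests described above, the sketch below uses mpi4py, a Python MPI binding, rather than the C program used in the study (an assumption made for illustration only); it prints a hello message from each rank and times a simple collective operation, and launching it with 1, 2, 4 and 8 processes mimics the scaling comparison.

```python
# Run with, e.g.:  mpiexec -n 8 python hello_mpi.py
from mpi4py import MPI

comm = MPI.COMM_WORLD            # communicator spanning all launched processes
rank = comm.Get_rank()           # this process's id
size = comm.Get_size()           # total number of processes
node = MPI.Get_processor_name()

print(f"Hello from rank {rank} of {size} on {node}")

# Crude performance check: time one collective reduction across all ranks.
comm.Barrier()
t0 = MPI.Wtime()
total = comm.allreduce(rank, op=MPI.SUM)
elapsed = MPI.Wtime() - t0
if rank == 0:
    print(f"allreduce result = {total}, elapsed = {elapsed:.6f} s")
```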
NASA Technical Reports Server (NTRS)
Johnston, John D.; Howard, Joseph M.; Mosier, Gary E.; Parrish, Keith A.; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.
2004-01-01
The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical, often referred to as STOP, analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. Temperatures predicted using geometric and thermal math models are mapped to a structural finite element model in order to predict thermally induced deformations. Motions and deformations at optical surfaces are then input to optical models, and optical performance is predicted using either an optical ray trace or a linear optical analysis tool. In addition to baseline performance predictions, a process for performing sensitivity studies to assess modeling uncertainties is described.
NASA Astrophysics Data System (ADS)
Veerasubramani, Ganesh Kumar; Krishnamoorthy, Karthikeyan; Kim, Sang Jae
2016-02-01
Herein, we successfully prepared cobalt molybdate (CoMoO4) grown on nickel foam as a binder-free electrode by a hydrothermal approach for supercapacitors and improved its electrochemical performance using potassium ferricyanide (K3Fe(CN)6) as a redox additive. The formation of CoMoO4 on Ni foam with high crystallinity is confirmed using XRD, Raman, and XPS measurements. Nanoplate arrays (NPAs) of CoMoO4 are uniformly grown on the Ni foam, as confirmed by FE-SEM analysis. The prepared binder-free CoMoO4 NPAs achieved a maximum areal capacity of 227 μAh cm-2 in KOH electrolyte at 2.5 mA cm-2. This areal capacity is further improved by about three times with the addition of K3Fe(CN)6 as a redox additive. The increased electrochemical performance of the CoMoO4 NPAs on Ni foam electrode via the redox additive is discussed in detail and the mechanism has been explored. Moreover, the assembled CoMoO4 NPAs on Ni foam//activated carbon asymmetric supercapacitor device with an extended operating voltage window of 1.5 V exhibits excellent performance, including high energy density and cyclic stability. The overall performance of the binder-free CoMoO4 NPAs on Ni foam with the redox additive suggests their potential use as a positive electrode material for high performance supercapacitors.
Rueda, Oscar M; Diaz-Uriarte, Ramon
2007-10-16
Yu et al. (BMC Bioinformatics 2007,8: 145+) have recently compared the performance of several methods for the detection of genomic amplification and deletion breakpoints using data from high-density single nucleotide polymorphism arrays. One of the methods compared is our non-homogenous Hidden Markov Model approach. Our approach uses Markov Chain Monte Carlo for inference, but Yu et al. ran the sampler for a severely insufficient number of iterations for a Markov Chain Monte Carlo-based method. Moreover, they did not use the appropriate reference level for the non-altered state. We rerun the analysis in Yu et al. using appropriate settings for both the Markov Chain Monte Carlo iterations and the reference level. Additionally, to show how easy it is to obtain answers to additional specific questions, we have added a new analysis targeted specifically to the detection of breakpoints. The reanalysis shows that the performance of our method is comparable to that of the other methods analyzed. In addition, we can provide probabilities of a given spot being a breakpoint, something unique among the methods examined. Markov Chain Monte Carlo methods require using a sufficient number of iterations before they can be assumed to yield samples from the distribution of interest. Running our method with too small a number of iterations cannot be representative of its performance. Moreover, our analysis shows how our original approach can be easily adapted to answer specific additional questions (e.g., identify edges).
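The point about iteration counts can be illustrated with a toy Metropolis sampler (purely synthetic; it is not the authors' non-homogeneous hidden Markov model): with too few iterations the post-burn-in estimates are far from the known mean and standard deviation of the target, while longer runs converge.

```python
# Toy Metropolis sampler targeting a standard normal, started far from the mode.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    return -0.5 * x ** 2          # log-density of N(0, 1), up to a constant

def metropolis(n_iter, step=0.5, x0=5.0):
    x, chain = x0, []
    for _ in range(n_iter):
        proposal = x + rng.normal(scale=step)
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        chain.append(x)
    return np.array(chain)

for n in (100, 1_000, 100_000):
    draws = metropolis(n)[n // 2:]            # discard the first half as burn-in
    print(n, round(draws.mean(), 3), round(draws.std(), 3))  # should approach 0, 1
```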
Comparison of fluorescent tags for analysis of mannose-6-phosphate glycans.
Kang, Ji-Yeon; Kwon, Ohsuk; Gil, Jin Young; Oh, Doo-Byoung
2016-05-15
Mannose-6-phosphate (M-6-P) glycan analysis is important for quality control of therapeutic enzymes for lysosomal storage diseases. Here, we found that the analysis of glycans containing two M-6-Ps was highly affected by the hydrophilicity of the elution solvent used in high-performance liquid chromatography (HPLC). In addition, the performances of three fluorescent tags--2-aminobenzoic acid (2-AA), 2-aminobenzamide (2-AB), and 3-(acetyl-amino)-6-aminoacridine (AA-Ac)--were compared with each other for M-6-P glycan analysis using HPLC and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. The best performance for analyzing M-6-P glycans was shown by 2-AA labeling in both analyses. Copyright © 2016 Elsevier Inc. All rights reserved.
Can Distributed Volunteers Accomplish Massive Data Analysis Tasks?
NASA Technical Reports Server (NTRS)
Kanefsky, B.; Barlow, N. G.; Gulick, V. C.
2001-01-01
We argue that many image analysis tasks can be performed by distributed amateurs. Our pilot study, with crater surveying and classification, has produced encouraging results in terms of both quantity (100,000 crater entries in 2 months) and quality. Additional information is contained in the original extended abstract.
Orderedness and Stratificational "and" Nodes.
ERIC Educational Resources Information Center
Herrick, Earl M.
It is possible to apply Lamb's stratificational theory and analysis to English graphonomy, but additional notation devices must be used to explain particular graphemes and their characteristics. The author presents cases where Lamb's notation is inadequate. In those cases, he devises new means for performing the analysis. The result of this…
NPAC-Nozzle Performance Analysis Code
NASA Technical Reports Server (NTRS)
Barnhart, Paul J.
1997-01-01
A simple and accurate nozzle performance analysis methodology has been developed. The geometry modeling requirements are minimal and very flexible, thus allowing rapid design evaluations. The solution techniques accurately couple continuity, momentum, energy, state, and other relations, which permits fast and accurate calculation of nozzle gross thrust. The control volume and internal flow analyses are capable of accounting for the effects of over/under expansion, flow divergence, wall friction, heat transfer, and mass addition/loss across surfaces. The results from the nozzle performance methodology are shown to be in excellent agreement with experimental data for a variety of nozzle designs over a range of operating conditions.
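For orientation only, the gross thrust that such a control-volume analysis ultimately reports can be written with the standard textbook relation F_g = mdot*V_e + (p_e - p_amb)*A_e (momentum flux plus pressure thrust); the sketch below evaluates it with invented numbers and is not NPAC's internal formulation.

```python
# Textbook control-volume gross thrust: momentum flux plus pressure thrust.
def gross_thrust(mdot, v_exit, p_exit, p_amb, a_exit):
    """mdot [kg/s], v_exit [m/s], pressures [Pa], a_exit [m^2] -> thrust [N]."""
    return mdot * v_exit + (p_exit - p_amb) * a_exit

# Example: a slightly underexpanded nozzle (illustrative values only).
print(gross_thrust(mdot=50.0, v_exit=600.0, p_exit=110e3, p_amb=101.3e3, a_exit=0.3))
```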
NASA Technical Reports Server (NTRS)
Hague, D. S.; Rozendaal, H. L.
1977-01-01
A rapid mission analysis code based on the use of approximate flight path equations of motion is described. Equation form varies with the segment type, for example, accelerations, climbs, cruises, descents, and decelerations. Realistic and detailed vehicle characteristics are specified in tabular form. In addition to its mission performance calculation capabilities, the code also contains extensive flight envelope performance mapping capabilities. Approximate takeoff and landing analyses can be performed. At high speeds, centrifugal lift effects are taken into account. Extensive turbojet and ramjet engine scaling procedures are incorporated in the code.
Kawakatsu, Yoshito; Sugishita, Tomohiko; Tsutsui, Junya; Oruenjo, Kennedy; Wakhule, Stephen; Kibosia, Kennedy; Were, Eric; Honda, Sumihisa
2015-10-01
Several African and South Asian countries are currently investing in new cadres of community health workers (CHWs) as a major part of strategies aimed at reaching the Millennium Development Goals. However, one review concluded that community health workers did not consistently provide services likely to have substantial effects on health and that quality was usually poor. The objective of this research was to assess CHWs' performance in Western Kenya and describe determinants of that performance using a multilevel analysis of two levels, individual and supervisor/community. This study conducted three surveys between August and September 2011 in Nyanza Province, Kenya. The participants of the three surveys were all 1,788 active CHWs, all their supervisors, and 2,560 randomly selected mothers who had children aged 12 to 23 months. CHW performance was measured using three indicators: reporting rate, health knowledge and household coverage. Multilevel analysis was performed to describe the determinants of that performance. The significant factors associated with the CHWs' performance were their marital status, educational level, the size of their household, their work experience, personal sanitation practice, the number of supervisions received and the interaction between their supervisors' better health knowledge and the number of supervisions. High-quality routine supervision is one of the key interventions in sustaining a CHW's performance. In addition, decreasing the dropout rate of CHWs is important both for sustaining their performance and for avoiding the additional cost of replacing them. As for the selection criteria of new CHWs, good educational status, availability of supporters for household chores and good sanitation practices are all important in selecting CHWs who can maintain a high performance level.
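A hedged sketch of the kind of two-level model described above (CHWs nested under supervisors/communities) is shown below using statsmodels; the file and column names are placeholders rather than the survey's actual variables, and only a few of the reported predictors are included.

```python
# Two-level random-intercept model: CHW-level predictors with a supervisor-level
# grouping, including the supervision-by-supervisor-knowledge interaction.
import pandas as pd
import statsmodels.formula.api as smf

chw = pd.read_csv("chw_survey.csv")   # assumed columns used in the formula below

model = smf.mixedlm(
    "performance ~ education + household_size + work_experience"
    " + n_supervisions * supervisor_knowledge",
    data=chw,
    groups=chw["supervisor_id"],      # random intercept per supervisor/community
).fit()
print(model.summary())
```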
ERIC Educational Resources Information Center
Liu, Shujie; Xu, Xianxuan; Grant, Leslie; Strong, James; Fang, Zheng
2017-01-01
This article presents the results of an interpretive policy analysis of China's Ministry of Education Standards (2013) for the professional practice of principals. In addition to revealing the evolution of the evaluation of principals in China and the processes by which this policy is formulated, a comparative analysis was conducted to compare it…
ERIC Educational Resources Information Center
Aragón, Sonia; Lapresa, Daniel; Arana, Javier; Anguera, M. Teresa; Garzón, Belén
2017-01-01
Polar coordinate analysis is a powerful data reduction technique based on the Zsum statistic, which is calculated from adjusted residuals obtained by lag sequential analysis. Its use has been greatly simplified since the addition of a module in the free software program HOISAN for performing the necessary computations and producing…
Brettschneider, Christian; Kohlmann, Sebastian; Gierk, Benjamin; Löwe, Bernd; König, Hans-Helmut
2017-01-01
Although depression is common in patients with heart disease, screening for depression is much debated. DEPSCREEN-INFO showed that patient-targeted feedback in addition to screening results in lower depression levels six months after screening. The purpose of this analysis was to perform a cost-effectiveness analysis of DEPSCREEN-INFO. Patients with coronary heart disease or arterial hypertension were included. Participants in both groups were screened for depression. Participants in the intervention group additionally received patient-targeted feedback of their result and recommended treatment options. A cost-utility analysis using quality-adjusted life years (QALY) based on the EQ-5D was performed. The time horizon was 6 months. Resource utilization was assessed by a telephone interview. Multiple imputation using chained equations was used. Net-benefit regressions controlled for prognostic variables at baseline were performed to construct cost-effectiveness acceptability curves. Different sensitivity analyses were performed. 375 participants (intervention group: 155; control group: 220) were included at baseline. After 6 months, adjusted total costs in the intervention group were lower (-€2,098; SE: €1,717) and more QALYs were gained (0.0067; SD: 0.0133); yet the differences were not statistically significant. The probability of cost-effectiveness was around 80% independent of the willingness-to-pay (range: €0/QALY-€130,000/QALY). The results were robust. Patient-targeted feedback in addition to depression screening in cardiology is cost-effective with a high probability. This supports the use of the patient-targeted feedback and the PHQ-9, which are both freely available and easy to implement in routine care.
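The acceptability-curve logic can be sketched as below, treating the reported incremental cost and QALY figures as if they parameterized normal sampling distributions (a simplifying assumption made purely for illustration; the trial's actual net-benefit regressions adjusted for baseline covariates and used individual-level data).

```python
# Simplified cost-effectiveness acceptability curve from simulated incremental
# cost and QALY differences (intervention minus control); not the trial analysis.
import numpy as np

rng = np.random.default_rng(1)
n_draws = 5000
d_cost = rng.normal(-2098.0, 1717.0, size=n_draws)                 # reported SE as spread
d_qaly = rng.normal(0.0067, 0.0133 / np.sqrt(155), size=n_draws)   # SD scaled to an SE (assumption)

for wtp in range(0, 130_001, 26_000):          # willingness-to-pay grid, EUR/QALY
    nmb = wtp * d_qaly - d_cost                # net monetary benefit per draw
    print(f"WTP {wtp:>7} EUR/QALY -> P(cost-effective) = {(nmb > 0).mean():.2f}")
```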
Transition play in team performance of volleyball: a log-linear analysis.
Eom, H J; Schutz, R W
1992-09-01
The purpose of this study was to develop and test a method to analyze and evaluate sequential skill performances in a team sport. An on-line computerized system was developed to record and summarize the sequential skill performances in volleyball. Seventy-two sample games from the third Federation of International Volleyball Cup men's competition were videotaped and grouped into two categories according to the final team standing and game outcome. Log-linear procedures were used to investigate the nature and degree of the relationship in the first-order (pass-to-set, set-to-spike) and second-order (pass-to-spike) transition plays. Results showed that there was a significant dependency in both the first-order and second-order transition plays, indicating that the outcome of a skill performance is highly influenced by the quality of a preceding skill performance. In addition, the pattern of the transition plays was stable and consistent, regardless of the classification status: Game Outcome, Team Standing, or Transition Process. The methodology and subsequent results provide valuable aids for a thorough understanding of the characteristics of transition plays in volleyball. In addition, the concept of sequential performance analysis may serve as an example for sport scientists in investigating probabilistic patterns of motor performance.
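A minimal example of the log-linear approach is given below: a Poisson GLM is fitted to a made-up pass-quality by set-quality contingency table, and the drop in deviance when the interaction term is added plays the role of the dependency test described above (the counts and category labels are invented, not the tournament data).

```python
# Log-linear (Poisson) models for a 2x2 transition table: independence vs. saturated.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

counts = pd.DataFrame({
    "pass_quality": ["good", "good", "poor", "poor"],
    "set_quality":  ["good", "poor", "good", "poor"],
    "n":            [120, 30, 25, 60],                 # invented frequencies
})

independence = smf.glm("n ~ pass_quality + set_quality", data=counts,
                       family=sm.families.Poisson()).fit()
saturated = smf.glm("n ~ pass_quality * set_quality", data=counts,
                    family=sm.families.Poisson()).fit()

# A large deviance drop when the interaction enters indicates that set quality
# depends on the quality of the preceding pass.
print(independence.deviance, saturated.deviance)
```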
Dynamic Discharge Arc Driver. [computerized simulation
NASA Technical Reports Server (NTRS)
Dannenberg, R. E.; Slapnicar, P. I.
1975-01-01
A computer program using nonlinear RLC circuit analysis was developed to accurately model the electrical discharge performance of the Ames 1-MJ energy storage and arc-driver system. Solutions of circuit parameters are compared with experimental circuit data and related to shock speed measurements. Computer analysis led to the concept of a Dynamic Discharge Arc Driver (DDAD) capable of increasing the range of operation of shock-driven facilities. Utilization of mass addition of the driver gas offers a unique means of improving driver performance. Mass addition acts to increase the arc resistance, which results in better electrical circuit damping with more efficient Joule heating, producing stronger shock waves. Preliminary tests resulted in an increase in shock Mach number from 34 to 39 in air at an initial pressure of 2.5 torr.
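The abstract does not give the nonlinear circuit model itself, but the basic computation can be conveyed with a linear series-RLC capacitor discharge; the sketch below integrates the loop equations with SciPy using invented component values, not the Ames 1-MJ system parameters.

```python
# Series RLC discharge: dq/dt = -i,  di/dt = (q/C - R*i)/L, starting from a
# charged capacitor and zero current (illustrative values only).
from scipy.integrate import solve_ivp

R, L, C = 0.05, 2e-6, 0.01        # ohms, henries, farads
V0 = 10e3                          # initial capacitor voltage, volts

def rlc(t, y):
    q, i = y                       # capacitor charge, loop current
    return [-i, (q / C - R * i) / L]

sol = solve_ivp(rlc, (0.0, 2e-3), [C * V0, 0.0], max_step=1e-6)
print("peak discharge current (A):", abs(sol.y[1]).max())
```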
NASA transmission research and its probable effects on helicopter transmission design
NASA Technical Reports Server (NTRS)
Zaretsky, E. V.; Coy, J. J.; Townsend, D. P.
1983-01-01
Transmissions studied for application to helicopters in addition to the more conventional geared transmissions include hybrid (traction/gear), bearingless planetary, and split torque transmissions. Research is being performed to establish the validity of analysis and computer codes developed to predict the performance, efficiency, life, and reliability of these transmissions. Results of this research should provide the transmission designer with analytical tools to design for minimum weight and noise with maximum life and efficiency. In addition, the advantages and limitations of drive systems as well as the more conventional systems will be defined.
NASA transmission research and its probable effects on helicopter transmission design
NASA Technical Reports Server (NTRS)
Zaretsky, E. V.; Coy, J. J.; Townsend, D. P.
1984-01-01
Transmissions studied for application to helicopters in addition to the more conventional geared transmissions include hybrid (traction/gear), bearingless planetary, and split torque transmissions. Research is being performed to establish the validity of analysis and computer codes developed to predict the performance, efficiency, life, and reliability of these transmissions. Results of this research should provide the transmission designer with analytical tools to design for minimum weight and noise with maximum life and efficiency. In addition, the advantages and limitations of drive systems as well as the more conventional systems will be defined.
CytoSPADE: high-performance analysis and visualization of high-dimensional cytometry data
Linderman, Michael D.; Simonds, Erin F.; Qiu, Peng; Bruggner, Robert V.; Sheode, Ketaki; Meng, Teresa H.; Plevritis, Sylvia K.; Nolan, Garry P.
2012-01-01
Motivation: Recent advances in flow cytometry enable simultaneous single-cell measurement of 30+ surface and intracellular proteins. CytoSPADE is a high-performance implementation of an interface for the Spanning-tree Progression Analysis of Density-normalized Events algorithm for tree-based analysis and visualization of this high-dimensional cytometry data. Availability: Source code and binaries are freely available at http://cytospade.org and via Bioconductor version 2.10 onwards for Linux, OSX and Windows. CytoSPADE is implemented in R, C++ and Java. Contact: michael.linderman@mssm.edu Supplementary Information: Additional documentation available at http://cytospade.org. PMID:22782546
Thermal/structural Tailoring of Engine Blades (T/STAEBL) User's Manual
NASA Technical Reports Server (NTRS)
Brown, K. W.; Clevenger, W. B.; Arel, J. D.
1994-01-01
The Thermal/Structural Tailoring of Engine Blades (T/STAEBL) system is a family of computer programs executed by a control program. The T/STAEBL system performs design optimizations of cooled, hollow turbine blades and vanes. This manual contains an overview of the system, fundamentals of the data block structure, and detailed descriptions of the inputs required by the optimizer. Additionally, the thermal analysis input requirements are described as well as the inputs required to perform a finite element blade vibrations analysis.
ERIC Educational Resources Information Center
Wheldall, Kevin; Arakelian, Sarah
2016-01-01
The aim of this study was to compare the York Assessment of Reading for Comprehension (YARC) with the Neale Analysis of Reading Ability (NARA) and other measures of reading and related skills with a sample of older low-progress readers and to provide additional information regarding the validity of the YARC in Australia. The data from an…
MAGMA: Generalized Gene-Set Analysis of GWAS Data
de Leeuw, Christiaan A.; Mooij, Joris M.; Heskes, Tom; Posthuma, Danielle
2015-01-01
By aggregating data for complex traits in a biologically meaningful way, gene and gene-set analysis constitute a valuable addition to single-marker analysis. However, although various methods for gene and gene-set analysis currently exist, they generally suffer from a number of issues. Statistical power for most methods is strongly affected by linkage disequilibrium between markers, multi-marker associations are often hard to detect, and the reliance on permutation to compute p-values tends to make the analysis computationally very expensive. To address these issues we have developed MAGMA, a novel tool for gene and gene-set analysis. The gene analysis is based on a multiple regression model, to provide better statistical performance. The gene-set analysis is built as a separate layer around the gene analysis for additional flexibility. This gene-set analysis also uses a regression structure to allow generalization to analysis of continuous properties of genes and simultaneous analysis of multiple gene sets and other gene properties. Simulations and an analysis of Crohn’s Disease data are used to evaluate the performance of MAGMA and to compare it to a number of other gene and gene-set analysis tools. The results show that MAGMA has significantly more power than other tools for both the gene and the gene-set analysis, identifying more genes and gene sets associated with Crohn’s Disease while maintaining a correct type 1 error rate. Moreover, the MAGMA analysis of the Crohn’s Disease data was found to be considerably faster as well. PMID:25885710
MAGMA: generalized gene-set analysis of GWAS data.
de Leeuw, Christiaan A; Mooij, Joris M; Heskes, Tom; Posthuma, Danielle
2015-04-01
By aggregating data for complex traits in a biologically meaningful way, gene and gene-set analysis constitute a valuable addition to single-marker analysis. However, although various methods for gene and gene-set analysis currently exist, they generally suffer from a number of issues. Statistical power for most methods is strongly affected by linkage disequilibrium between markers, multi-marker associations are often hard to detect, and the reliance on permutation to compute p-values tends to make the analysis computationally very expensive. To address these issues we have developed MAGMA, a novel tool for gene and gene-set analysis. The gene analysis is based on a multiple regression model, to provide better statistical performance. The gene-set analysis is built as a separate layer around the gene analysis for additional flexibility. This gene-set analysis also uses a regression structure to allow generalization to analysis of continuous properties of genes and simultaneous analysis of multiple gene sets and other gene properties. Simulations and an analysis of Crohn's Disease data are used to evaluate the performance of MAGMA and to compare it to a number of other gene and gene-set analysis tools. The results show that MAGMA has significantly more power than other tools for both the gene and the gene-set analysis, identifying more genes and gene sets associated with Crohn's Disease while maintaining a correct type 1 error rate. Moreover, the MAGMA analysis of the Crohn's Disease data was found to be considerably faster as well.
BFS Simulation and Experimental Analysis of the Effect of Ti Additions on the Structure of NiAl
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Noebe, Ronald D.; Ferrante, John; Garg, Anita; Honecy, Frank S.; Amador, Carlos
1999-01-01
The Bozzolo-Ferrante-Smith (BFS) method for alloy energetics is applied to the study of ternary additions to NiAl. A description of the method and its application to alloy design is given. Two different approaches are used in the analysis of the effect of Ti additions to NiAl. First, a thorough analytical study is performed, where the energy of formation, lattice parameter and bulk modulus are calculated for a large number of possible atomic distributions of Ni, Al and Ti. Substitutional site preference schemes and formation of precipitates are thus predicted and analyzed. The second approach used consists of the determination of temperature effects on the final results, as obtained by performing a number of large scale numerical simulations using the Monte Carlo-Metropolis procedure and BFS for the calculation of the energy at every step in the simulation. The results indicate a sharp preference of Ti for Al sites in Ni-rich NiAl alloys and the formation of ternary Heusler precipitates beyond the predicted solubility limit of 5 at. % Ti. Experimental analysis of three Ni-Al-Ti alloys confirms the theoretical predictions.
Xiu, Junshan; Liu, Shiming; Sun, Meiling; Dong, Lili
2018-01-20
The photoelectric performance of metal ion-doped TiO2 films can be improved by changing the compositions and concentrations of the additive elements. In this work, TiO2 films doped with different Sn concentrations were obtained with the hydrothermal method. Qualitative and quantitative analysis of the Sn element in the TiO2 films was achieved with laser-induced breakdown spectroscopy (LIBS), with calibration curves plotted accordingly. The photoelectric characteristics of TiO2 films doped with different Sn contents were observed with UV-visible absorption spectra and J-V curves. All results showed that Sn doping red-shifts the optical absorption and improves the photoelectric properties of the TiO2 films. When the Sn doping concentration in the TiO2 films was 11.89 mmol/L, as calculated from the LIBS calibration curves, the current density of the film was the largest, indicating the best photoelectric performance. This indicates that LIBS is a feasible measurement method for qualitative and quantitative analysis of additive elements in metal oxide nanometer films.
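The calibration-curve step can be sketched as follows: fit a line of emission-line intensity versus known Sn concentration and invert it for an unknown film. The standards and intensities below are invented placeholders, not the measured LIBS data.

```python
# Linear LIBS calibration curve and inversion for an unknown sample.
import numpy as np

conc = np.array([0.0, 3.0, 6.0, 9.0, 12.0])            # Sn standards, mmol/L (assumed)
intensity = np.array([0.02, 0.31, 0.63, 0.90, 1.24])   # relative Sn line intensity

slope, intercept = np.polyfit(conc, intensity, 1)       # least-squares calibration
unknown_intensity = 1.18
estimated_conc = (unknown_intensity - intercept) / slope
print(f"estimated Sn concentration ~ {estimated_conc:.2f} mmol/L")
```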
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Noebe, Ronald D.; Ferrante, John; Garg, Anita; Amador, Carlos
1997-01-01
The Bozzolo-Ferrante-Smith (BFS) semiempirical method for alloy energetics is applied to the study of ternary additions to NiAl alloys. A detailed description of the method and its application to alloy design is given. Two different approaches are used in the analysis of the effect of Ti additions to NiAl. First, a thorough analytical study is performed, where the energy of formation, lattice parameter and bulk modulus are calculated for hundreds of possible atomic distributions of Ni, Al and Ti. Substitutional site preference schemes and formation of precipitates are thus predicted and analyzed. The second approach used consists of the determination of temperature effects on the final results, as obtained by performing a number of large scale numerical simulations using the Monte Carlo - Metropolis procedure and BFS for the calculation of the energy at every step in the simulation. The results indicate a sharp preference of Ti for Al sites in Ni-rich NiAl alloys and the formation of ternary Heusler precipitates beyond the predicted solubility limit of 5 at. % Ti. Experimental analysis of three NiAl+Ti alloys confirms the theoretical predictions.
This presentation described implementation of the Common Representative Intermediate (CRI) atmospheric chemistry in CMAQ, a short analysis of its performance in CMAQ relative to other mechanisms and an example of the additional detail it gives us for understanding atmospheric che...
CASAS: Cancer Survival Analysis Suite, a web based application
Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne
2017-01-01
We present CASAS, a shiny R based tool for interactive survival analysis and visualization of results. The tool provides a web-based one stop shop to perform the following types of survival analysis: quantile, landmark and competing risks, in addition to standard survival analysis. The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots. Univariate analysis can be performed on one or several user specified variable(s) simultaneously, the results of which are displayed in a single table that includes log rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/. PMID:28928946
CASAS: Cancer Survival Analysis Suite, a web based application.
Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne
2017-01-01
We present CASAS, a shiny R based tool for interactive survival analysis and visualization of results. The tool provides a web-based one stop shop to perform the following types of survival analysis: quantile, landmark and competing risks, in addition to standard survival analysis. The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots. Univariate analysis can be performed on one or several user specified variable(s) simultaneously, the results of which are displayed in a single table that includes log rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/.
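For readers outside R, the core analyses that CASAS wraps can be reproduced in a few lines; the sketch below uses the Python lifelines package (not CASAS's own shiny/R code) with a hypothetical file whose columns time, event and group are assumptions.

```python
# Kaplan-Meier curves for two groups plus a log-rank test.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("survival.csv")              # assumed columns: time, event, group
a = df[df["group"] == "A"]
b = df[df["group"] == "B"]

kmf = KaplanMeierFitter()
ax = kmf.fit(a["time"], event_observed=a["event"], label="A").plot_survival_function()
kmf.fit(b["time"], event_observed=b["event"], label="B").plot_survival_function(ax=ax)

result = logrank_test(a["time"], b["time"], a["event"], b["event"])
print("log-rank p-value:", result.p_value)
```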
Lifetime assessment analysis of Galileo Li/SO2 cells: Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levy, S.C.; Jaeger, C.D.; Bouchard, D.A.
Galileo Li/SO2 cells from five lots and five storage temperatures were studied to establish a database from which the performance of flight modules may be predicted. Nondestructive tests consisting of complex impedance analysis and a 15-s pulse were performed on all cells. Chemical analysis was performed on one cell from each lot/storage group, and the remaining cells were discharged at Galileo mission loads. An additional number of cells were placed on high-temperature accelerated aging storage for 6 months and then discharged. All data were statistically analyzed. Results indicate that the present Galileo design Li/SO2 cell will satisfy electrical requirements for a 10-year mission. 10 figs., 4 tabs.
Enhanced sulfidation xanthate flotation of malachite using ammonium ions as activator.
Wu, Dandan; Ma, Wenhui; Mao, Yingbo; Deng, Jiushuai; Wen, Shuming
2017-05-18
In this study, ammonium ion was used to enhance the sulfidation flotation of malachite. The effect of ammonium ion on the sulfidation flotation of malachite was investigated using microflotation tests, inductively coupled plasma (ICP) analysis, zeta potential measurements, and scanning electron microscopy (SEM). The microflotation test results show that the addition of sodium sulfide and ammonium sulfate resulted in better sulfidation than the addition of sodium sulfide alone. The ICP analysis results indicate that the dissolution of the sulfidation-enhanced malachite surface is significantly decreased. Zeta potential measurements indicate that a smaller isoelectric point value and a large number of copper-sulfide films formed on the malachite surface by enhanced sulfidation resulted in a large amount of sodium butyl xanthate adsorbed onto the sulfidation-enhanced malachite surface. EDS semi-quantitative analysis and XPS analysis show that malachite was easily sulfurized by sodium sulfide with ammonium ion. These results show that the addition of ammonium ion plays a significant role in the sulfidation of malachite and results in improved flotation performance.
2007-05-01
In addition to the body mapping of thermal discomfort, participants also rated thermal comfort acceptability for hot spots, ventilation and overall comfort. Additionally, each participant completed a thermal comfort questionnaire that examined ventilation
Analyzing large-scale spiking neural data with HRLAnalysis™
Thibeault, Corey M.; O'Brien, Michael J.; Srinivasa, Narayan
2014-01-01
The additional capabilities provided by high-performance neural simulation environments and modern computing hardware have allowed for the modeling of increasingly larger spiking neural networks. This is important for exploring more anatomically detailed networks, but the corresponding accumulation of data can make analyzing the results of these simulations difficult. This is further compounded by the fact that many existing analysis packages were not developed with large spiking data sets in mind. Presented here is a software suite developed not only to process the increased amount of spike-train data in a reasonable amount of time, but also to provide a user-friendly Python interface. We describe the design considerations, implementation and features of the HRLAnalysis™ suite. In addition, performance benchmarks demonstrating the speedup of this design compared to a published Python implementation are also presented. The result is a high-performance analysis toolkit that is not only usable and readily extensible, but also straightforward to interface with existing Python modules. PMID:24634655
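A small example of the kind of post-processing such a toolkit performs is shown below: binning synthetic spike times into a population firing-rate trace with NumPy (illustrative only; it does not use the HRLAnalysis API).

```python
# Binned population firing rate from raw spike times (synthetic data).
import numpy as np

rng = np.random.default_rng(42)
n_neurons, duration = 1000, 10.0                       # neurons, seconds
spike_times = rng.uniform(0.0, duration, size=200_000) # synthetic spike times (s)

bin_width = 0.010                                      # 10 ms bins
bins = np.arange(0.0, duration + bin_width, bin_width)
counts, _ = np.histogram(spike_times, bins=bins)

rate_hz = counts / (n_neurons * bin_width)             # mean rate per neuron, per bin
print("mean population rate (Hz):", rate_hz.mean())
```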
Parametric study of supersonic STOVL flight characteristics
NASA Technical Reports Server (NTRS)
Rapp, David C.
1985-01-01
A number of different control devices and techniques are evaluated to determine their suitability for increasing the short takeoff performance of a supersonic short-takeoff/vertical landing (STOVL) aircraft. Analysis was based on a rigid-body mathematical model of the General Dynamics E-7, a single engine configuration that utilizes ejectors and thrust deflection for propulsive lift. Alternatives investigated include increased static pitch, the addition of a close-coupled canard, use of boundary layer control to increase the takeoff lift coefficient, and the addition of a vectorable aft fan air nozzle. Other performance studies included the impact of individual E-7 features, the sensitivity to ejector performance, the effect of removing the afterburners, and a determination of optional takeoff and landing transition methods. The results pertain to both the E-7 and other configurations. Several alternatives were not as well suited to the E-7 characteristics as they would be to an alternative configuration, and vice versa. A large amount of supporting data for each analysis is included.
NASA Technical Reports Server (NTRS)
Sareen, Ashish K.; Sparks, Chad; Mullins, B. R., Jr.; Fasanella, Edwin; Jackson, Karen
2002-01-01
A comparison of the soft soil and hard surface impact performance of a crashworthy composite fuselage concept has been performed. Specifically, comparisons of the peak acceleration values, pulse duration, and onset rate at specific locations on the fuselage were evaluated. In a prior research program, the composite fuselage section was impacted at 25 feet per second onto concrete at the Impact Dynamics Research Facility (IDRF) at NASA Langley Research Center. A soft soil test was conducted at the same impact velocity as a part of the NRTC/RITA Crashworthy and Energy Absorbing Structures project. In addition to comparisons of soft soil and hard surface test results, an MSC. Dytran dynamic finite element model was developed to evaluate the test analysis correlation. In addition, modeling parameters and techniques affecting test analysis correlation are discussed. Once correlated, the analytical methodology will be used in follow-on work to evaluate the specific energy absorption of various subfloor concepts for improved crash protection during hard surface and soft soil impacts.
Fermentation performance optimization in an ectopic fermentation system.
Yang, Xiaotong; Geng, Bing; Zhu, Changxiong; Li, Hongna; He, Buwei; Guo, Hui
2018-07-01
Ectopic fermentation systems (EFSs) were developed for wastewater treatment. Previous studies have investigated the ability of thermophilic bacteria to improve fermentation performance in EFS. Continuing this research, we evaluated EFS performance using principal component analysis and investigated the addition of different proportions of cow dung. Viable bacteria communities were clustered and identified using BOX-AIR-based repetitive extragenic palindromic-PCR and 16S rDNA analysis. The results revealed that the optimal padding was maize straw inoculated with thermophilic bacteria. Adding 20% cow dung yielded the best pH values (6.94-8.56), higher temperatures, increased wastewater absorption, improved litter quality, and greater microbial quantities. The viable bacteria groups were enriched by the addition of the thermophilic consortium, and exogenous strains G21, G14, G4-1, and CR-15 were detected in the fermentation process. The proportion of Bacillus species in the treatment groups reached 70.37% after fermentation, demonstrating that thermophilic bacteria, especially Bacillus, have an important role in EFS, supporting previous predictions. Copyright © 2018 Elsevier Ltd. All rights reserved.
Shape Optimization by Bayesian-Validated Computer-Simulation Surrogates
NASA Technical Reports Server (NTRS)
Patera, Anthony T.
1997-01-01
A nonparametric-validated, surrogate approach to optimization has been applied to the computational optimization of eddy-promoter heat exchangers and to the experimental optimization of a multielement airfoil. In addition to the baseline surrogate framework, a surrogate-Pareto framework has been applied to the two-criteria, eddy-promoter design problem. The Pareto analysis improves the predictability of the surrogate results, preserves generality, and provides a means to rapidly determine design trade-offs. Significant contributions have been made in the geometric description used for the eddy-promoter inclusions as well as to the surrogate framework itself. A level-set based, geometric description has been developed to define the shape of the eddy-promoter inclusions. The level-set technique allows for topology changes (from single-body,eddy-promoter configurations to two-body configurations) without requiring any additional logic. The continuity of the output responses for input variations that cross the boundary between topologies has been demonstrated. Input-output continuity is required for the straightforward application of surrogate techniques in which simplified, interpolative models are fitted through a construction set of data. The surrogate framework developed previously has been extended in a number of ways. First, the formulation for a general, two-output, two-performance metric problem is presented. Surrogates are constructed and validated for the outputs. The performance metrics can be functions of both outputs, as well as explicitly of the inputs, and serve to characterize the design preferences. By segregating the outputs and the performance metrics, an additional level of flexibility is provided to the designer. The validated outputs can be used in future design studies and the error estimates provided by the output validation step still apply, and require no additional appeals to the expensive analysis. Second, a candidate-based a posteriori error analysis capability has been developed which provides probabilistic error estimates on the true performance for a design randomly selected near the surrogate-predicted optimal design.
NASA Technical Reports Server (NTRS)
Johnston, John D.; Parrish, Keith; Howard, Joseph M.; Mosier, Gary E.; McGinnis, Mark; Bluth, Marcel; Kim, Kevin; Ha, Hong Q.
2004-01-01
This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical, often referred to as "STOP", analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. The paper begins with an overview of multi-disciplinary engineering analysis, or integrated modeling, which is a critical element of the JWST mission. The STOP analysis process is then described. This process consists of the following steps: thermal analysis, structural analysis, and optical analysis. Temperatures predicted using geometric and thermal math models are mapped to the structural finite element model in order to predict thermally-induced deformations. Motions and deformations at optical surfaces are input to optical models, and optical performance is predicted using either an optical ray trace or WFE estimation techniques based on prior ray traces or first order optics. Following the discussion of the analysis process, results are presented based on models representing the design at the time of the System Requirements Review. In addition to baseline performance predictions, sensitivity studies are performed to assess modeling uncertainties. Of particular interest is the sensitivity of optical performance to uncertainties in temperature predictions and variations in metal properties. The paper concludes with a discussion of modeling uncertainty as it pertains to STOP analysis.
Structure identification methods for atomistic simulations of crystalline materials
Stukowski, Alexander
2012-05-28
Here, we discuss existing and new computational analysis techniques to classify local atomic arrangements in large-scale atomistic computer simulations of crystalline solids. This article includes a performance comparison of typical analysis algorithms such as common neighbor analysis (CNA), centrosymmetry analysis, bond angle analysis, bond order analysis and Voronoi analysis. In addition we propose a simple extension to the CNA method that makes it suitable for multi-phase systems. Finally, we introduce a new structure identification algorithm, the neighbor distance analysis, which is designed to identify atomic structure units in grain boundaries.
Washko, George R; Criner, Gerald J; Mohsenifar, Zab; Sciurba, Frank C; Sharafkhaneh, Amir; Make, Barry J; Hoffman, Eric A; Reilly, John J
2008-06-01
Computed tomographic based indices of emphysematous lung destruction may highlight differences in disease pathogenesis and further enable the classification of subjects with Chronic Obstructive Pulmonary Disease. While there are multiple techniques that can be utilized for such radiographic analysis, there is very little published information comparing the performance of these methods in a clinical case series. Our objective was to examine several quantitative and semi-quantitative methods for the assessment of the burden of emphysema apparent on computed tomographic scans and compare their ability to predict lung mechanics and function. Automated densitometric analysis was performed on 1094 computed tomographic scans collected upon enrollment into the National Emphysema Treatment Trial. Trained radiologists performed an additional visual grading of emphysema on high resolution CT scans. Full pulmonary function test results were available for correlation, with a subset of subjects having additional measurements of lung static recoil. There was a wide range of emphysematous lung destruction apparent on the CT scans and univariate correlations to measures of lung function were of modest strength. No single method of CT scan analysis clearly outperformed the rest of the group. Quantification of the burden of emphysematous lung destruction apparent on CT scan is a weak predictor of lung function and mechanics in severe COPD with no uniformly superior method found to perform this analysis. The CT based quantification of emphysema may augment pulmonary function testing in the characterization of COPD by providing complementary phenotypic information.
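One widely used densitometric index is the percentage of lung voxels below a fixed attenuation threshold (for example -950 HU, often written %LAA-950); the sketch below computes it on a random stand-in volume and is not the trial's actual analysis pipeline.

```python
# Percent low-attenuation area (%LAA) below -950 HU on a stand-in lung volume.
import numpy as np

rng = np.random.default_rng(0)
lung_hu = rng.normal(loc=-870.0, scale=60.0, size=(128, 128, 64))  # fake segmented lung, HU

threshold = -950.0
laa_percent = 100.0 * np.mean(lung_hu < threshold)
print(f"%LAA below {threshold:.0f} HU: {laa_percent:.1f}%")
```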
The role of ecological dynamics in analysing performance in team sports.
Vilar, Luís; Araújo, Duarte; Davids, Keith; Button, Chris
2012-01-01
Performance analysis is a subdiscipline of sports sciences, and one approach, notational analysis, has been used to objectively audit and describe behaviours of performers during different subphases of play, providing additional information for practitioners to improve future sports performance. Recent criticisms of these methods have suggested the need for a sound theoretical rationale to explain performance behaviours, not just describe them. The aim of this article was to show how ecological dynamics provides a valid theoretical explanation of performance in team sports by explaining the formation of successful and unsuccessful patterns of play, based on symmetry-breaking processes emerging from functional interactions between players and the performance environment. We offer the view that ecological dynamics is an upgrade to more operational methods of performance analysis that merely document statistics of competitive performance. In support of our arguments, we refer to exemplar data on competitive performance in team sports that have revealed functional interpersonal interactions between attackers and defenders, based on variations in the spatial positioning of performers relative to each other in critical performance areas, such as the scoring zones. Implications of this perspective are also considered for practice task design and sport development programmes.
The proprietary hospital industry: a financial analysis 1972-1982.
Michel, A; Shaked, I; Daley, J
1985-01-01
This paper evaluates the performance of both specific firms within the American for-profit hospital industry and the industry as a whole. First, traditional financial analysis is used to evaluate individual publicly traded for-profit chains. Then, industry performance from 1973 to 1982 is evaluated using a set of measures based on Modern Portfolio Theory. The traditional financial analysis indicates that the industry seems increasingly profitable as well as increasingly healthy from the perspective of utilizing its assets and reducing its collection period. However, the industry's rapid growth rate has strained its ability to use additional debt funding and has created a potentially dangerous liquidity position. Measures based on Modern Portfolio Theory indicate that the average return of the industry has improved over the past 5 years. However, its risk has also increased. Nevertheless, the increase in risk is more than offset by the increased average return. In addition, recent legislation designed 'to reward the efficient' has introduced a significant degree of uncertainty into the industry's performance for the coming years. Thus, hospitals' ability to maintain the substantial profitability and rate of growth they have experienced over the past decade will depend on how well they will adapt to the changing environment.
Performing particle image velocimetry using artificial neural networks: a proof-of-concept
NASA Astrophysics Data System (ADS)
Rabault, Jean; Kolaas, Jostein; Jensen, Atle
2017-12-01
Traditional programs based on feature engineering are underperforming on a steadily increasing number of tasks compared with artificial neural networks (ANNs), in particular for image analysis. Image analysis is widely used in fluid mechanics when performing particle image velocimetry (PIV) and particle tracking velocimetry (PTV), and therefore it is natural to test the ability of ANNs to perform such tasks. We report for the first time the use of convolutional neural networks (CNNs) and fully connected neural networks (FCNNs) for performing end-to-end PIV. Realistic synthetic images are used for training the networks, and several synthetic test cases are used to assess the quality of each network's predictions and compare them with state-of-the-art PIV software. In addition, we present tests on real-world data that prove ANNs can be used not only with synthetic images but also with more noisy, imperfect images obtained in a real experimental setup. While the ANNs we present have slightly higher root mean square error than state-of-the-art cross-correlation methods, they perform better near edges and allow for higher spatial resolution than such methods. In addition, it is likely that one could, with further work, develop ANNs that perform better than the proof-of-concept we offer.
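A toy network in the same spirit (not the authors' architecture) is sketched below in PyTorch: it maps a two-channel pair of particle-image frames to a two-component displacement estimate, which is the essence of end-to-end PIV.

```python
# Minimal CNN that regresses a (u, v) displacement from an image pair.
import torch
import torch.nn as nn

class TinyPIVNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),   # channels: frame t, frame t+dt
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2)                      # output: (u, v)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = TinyPIVNet()
pairs = torch.randn(8, 2, 32, 32)     # a batch of 32x32 interrogation windows
print(model(pairs).shape)             # torch.Size([8, 2])
```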
NASA Astrophysics Data System (ADS)
Dzarnisa; Rachmadi, D.; Azhar, A.; Fakhrur Riza, R.; Hidayati, A.
2018-02-01
A study on the effect of the addition of mangosteen (Garcinia mangostana L.) peel flour on the physiological condition and performance of Etawa crossbreed goats was conducted. The purpose was to assess the use of mangosteen peel flour, which is rich in antioxidants and has a variety of health benefits, as a feed additive for cattle. This study used a Complete Randomized Block Design consisting of 4 treatment groups and 4 replications each. Subjects were 16 female Etawa crossbreed goats randomly assigned to treatment groups based on lactation period. Subjects were fed traditional rations (control, A), traditional rations plus 2.5% mangosteen peel flour (B), traditional rations plus 5% mangosteen peel flour (C), and traditional rations plus 7.5% mangosteen peel flour (D). Data on performance (milk production) and physiological condition (respiratory frequency, rectal temperature, and heart rate) were analyzed using analysis of variance (ANOVA). The results showed that the addition of mangosteen peel flour as a feed additive in the rations resulted in variations in milk production, physiological condition (rectal temperature, heart rate and respiration frequency) and performance (daily weight gain, feed consumption, ration conversion and breast volume) of the Etawa crossbreed goats, but a significant effect was only observed for respiration frequency. The addition of 2.5% mangosteen peel flour to the ration produced the best effects on milk production, physiological condition and performance of the Etawa crossbreed goats.
Experimental study on behaviors of dielectric elastomer based on acrylonitrile butadiene rubber
NASA Astrophysics Data System (ADS)
An, Kuangjun; Chuc, Nguyen Huu; Kwon, Hyeok Yong; Phuc, Vuong Hong; Koo, Jachoon; Lee, Youngkwan; Nam, Jaedo; Choi, Hyouk Ryeol
2010-04-01
Previously, a dielectric elastomer based on acrylonitrile butadiene rubber (NBR), called synthetic elastomer, has been reported by our group. It has the advantage that its characteristics can be modified according to performance requirements, and thus it is applicable to a wide variety of applications. In this paper, we address the effects of additives and vulcanization conditions on the overall performance of the synthetic elastomer. In the present work, factors that affect the performance are extracted, e.g., additives such as dioctyl phthalate (DOP) and barium titanate (BaTiO3), and vulcanization conditions such as dicumyl peroxide (DCP) content and cross-linking time. We also describe how the performance can be optimized using the DOE (Design of Experiments) technique, and the experimental results are analyzed by ANOVA (analysis of variance).
NASA Astrophysics Data System (ADS)
Wang, Zi-han; Wang, Chun-mei; Tang, Hua-xin; Zuo, Cheng-ji; Xu, Hong-ming
2009-06-01
Ignition timing control is of great importance in homogeneous charge compression ignition engines. The effect of hydrogen addition on methane combustion was investigated using a CHEMKIN multi-zone model. Results show that hydrogen addition advances ignition timing and enhances peak pressure and temperature. A brief analysis of the chemical kinetics of methane blended with hydrogen is also performed in order to investigate the scope of its application, and the analysis suggests that the OH radical plays an important role in the oxidation. Hydrogen addition increases NOx while decreasing HC and CO emissions. Exhaust gas recirculation (EGR) also advances ignition timing; however, its effects on emissions are generally the opposite. By adjusting the hydrogen addition and EGR rate, the ignition timing can be regulated with a low emission level. Investigation into the zones suggests that NOx is mostly formed in the core zones while HC and CO mostly originate in the crevice and the quench layer.
Revolutionary opportunities for materials and structures study, addendum
NASA Technical Reports Server (NTRS)
Feig, P. D.
1987-01-01
This report is an addendum to the Revolutionary Opportunities for Materials and Structures Study (ROMS), modifying the original by the addition of two tasks. The primary purpose of these tasks was to conduct additional aircraft/engine sizing and mission analysis to obtain contributory aircraft performance data such as fuel burns and direct operating costs for both the subsonic and supersonic engines.
Additional Support for the Information Systems Analyst Exam as a Valid Program Assessment Tool
ERIC Educational Resources Information Center
Carpenter, Donald A.; Snyder, Johnny; Slauson, Gayla Jo; Bridge, Morgan K.
2011-01-01
This paper presents a statistical analysis to support the notion that the Information Systems Analyst (ISA) exam can be used as a program assessment tool in addition to measuring student performance. It compares ISA exam scores earned by students in one particular Computer Information Systems program with scores earned by the same students on the…
NASA Astrophysics Data System (ADS)
Behera, Kishore Kumar; Pal, Snehanshu
2018-03-01
This paper describes a new approach towards optimum utilisation of the ferrochrome added during stainless steel making in an AOD converter. The objective of the optimisation is to enhance the end-blow chromium content of the steel and reduce the ferrochrome addition during refining. By developing a thermodynamics-based mathematical model, a study has been conducted to compute the optimum trade-off between ferrochrome addition and end-blow chromium content of stainless steel using a predator-prey genetic algorithm, trained on a dataset of 100 records covering different input and output variables such as oxygen, argon and nitrogen blowing rates, duration of blowing, initial bath temperature, chromium and carbon content, and the weight of ferrochrome added during refining. Optimisation is performed within constraints imposed on the input parameters, whose values fall within certain ranges. The analysis of the Pareto fronts is observed to generate a set of feasible optimal solutions between the two conflicting objectives, which provides an effective guideline for better ferrochrome utilisation. It is found that, beyond a certain critical range, further addition of ferrochrome does not affect the chromium percentage of the steel. Single-variable response analysis is performed to study the variation and interaction of all individual input parameters on the output variables.
Saliba, E; Abbassi-Ghadi, S; Vowles, R; Camilleri, J; Hooper, S; Camilleri, J
2009-04-01
The aim was to study the effect of the addition of various proportions of bismuth oxide on the compressive strength and radiopacity of Portland cement. The compressive strength of white Portland cement and cement replaced with 10, 15, 20, 25 and 30% bismuth oxide was evaluated by testing cylinders 6 mm in diameter and 12 mm high. Twelve cylinders were tested for each material under study. The radiopacity of the cements tested was evaluated using an aluminium step-wedge and densitometer. The optical density was compared with the relevant thickness of aluminium (Al). Statistical analysis was performed using analysis of variance (ANOVA) at P = 0.05, with the Tukey test for multiple comparisons. Various additions of bismuth oxide had no significant effect on the strength of the material when compared with the unmodified Portland cement (P > 0.05). The radiopacity of the cements tested ranged from 2.02 mm Al for Portland cement to 9.79 mm Al for the highest bismuth replacement. Addition of bismuth oxide did not affect the compressive strength of Portland cement. All the bismuth oxide cement mixtures had radiopacities higher than 3 mm thickness of aluminium.
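For readers unfamiliar with the statistical workflow named above (one-way ANOVA followed by Tukey's multiple comparisons), here is a minimal sketch using placeholder strength data; the group means and spreads are hypothetical, not the study's measurements.

```python
# One-way ANOVA across bismuth oxide replacement levels, then Tukey HSD pairwise tests.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
groups = ["0%", "10%", "15%", "20%", "25%", "30%"]           # bismuth oxide replacement
strength = {g: rng.normal(60, 8, 12) for g in groups}        # 12 cylinders per mix (MPa, hypothetical)

f_stat, p_val = stats.f_oneway(*strength.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.3f}")

values = np.concatenate(list(strength.values()))
labels = np.repeat(groups, 12)
print(pairwise_tukeyhsd(values, labels, alpha=0.05))
```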
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ashby, H.A.; Carlson, T.R.; Husson, L.
1986-01-01
Significant reductions in motor vehicle emissions are possible through the implementation of inspection and maintenance (I/M) programs. However, the potential benefits of I/M are obviously not achieved when specific inspection requirements are ignored or improperly performed. In addition, I/M benefits may be substantially reduced when improper repair procedures are used on vehicles which fail the test. In order for the "theoretical" benefits of I/M to be achieved, certain program design and enforcement procedures are necessary. The use of instrumentation and data analysis methods capable of identifying individuals who are improperly performing inspections and repairs is critical.
NASA Technical Reports Server (NTRS)
Egolf, T. Alan; Anderson, Olof L.; Edwards, David E.; Landgrebe, Anton J.
1988-01-01
A user's manual for the computer program developed for the prediction of propeller-nacelle aerodynamic performance reported in An Analysis for High Speed Propeller-Nacelle Aerodynamic Performance Prediction: Volume 1 -- Theory and Application, is presented. The manual describes the computer program mode of operation, requirements, input structure, input data requirements and the program output. In addition, it provides the user with documentation of the internal program structure and the software used in the computer program as it relates to the theory presented in Volume 1. Sample input data setups are provided along with selected printout of the program output for one of the sample setups.
Enhanced Component Performance Study. Emergency Diesel Generators 1998–2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schroeder, John Alton
2014-11-01
This report presents an enhanced performance evaluation of emergency diesel generators (EDGs) at U.S. commercial nuclear power plants. It evaluates component performance over time using Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES) data from 1998 through 2013 and maintenance unavailability (UA) performance data using Mitigating Systems Performance Index (MSPI) Basis Document data from 2002 through 2013. The objective is to present an analysis of factors that could influence the system and component trends in addition to annual performance trends of failure rates and probabilities. The factors analyzed for the EDG component are the differences in failures between all demands and actual unplanned engineered safety feature (ESF) demands, differences among manufacturers, and differences among EDG ratings. Statistical analyses of these differences are performed, and the results show whether pooling is acceptable across these factors. In addition, engineering analyses were performed with respect to time period and failure mode. The factors analyzed are: sub-component, failure cause, detection method, recovery, manufacturer, and EDG rating.
Analysis of Long-Range Interaction in Lithium-Ion Battery Electrodes
Mistry, Aashutosh; Juarez-Robles, Daniel; Stein, Malcolm; ...
2016-12-01
The lithium-ion battery (LIB) electrode represents a complex porous composite, consisting of multiple phases including active material (AM), conductive additive, and polymeric binder. This study proposes a mesoscale model to probe the effects of the cathode composition, e.g., the ratio of active material, conductive additive, and binder content, on the electrochemical properties and performance. The results reveal a complex nonmonotonic behavior in the effective electrical conductivity as the amount of conductive additive is increased. Insufficient electronic conductivity of the electrode limits the cell operation to lower currents. Once sufficient electron conduction (i.e., percolation) is achieved, the rate performance can be a strong function of ion-blockage effect and pore phase transport resistance. In conclusion, even for the same porosity, different arrangements of the solid phases may lead to notable difference in the cell performance, which highlights the need for accurate microstructural characterization and composite electrode preparation strategies.
A Ten-Week Biochemistry Lab Project Studying Wild-Type and Mutant Bacterial Alkaline Phosphatase
ERIC Educational Resources Information Center
Witherow, D. Scott
2016-01-01
This work describes a 10-week laboratory project studying wild-type and mutant bacterial alkaline phosphatase, in which students purify, quantitate, and perform kinetic assays on wild-type and selected mutants of the enzyme. Students also perform plasmid DNA purification, digestion, and gel analysis. In addition to simply learning important…
ERIC Educational Resources Information Center
Leacock, Rachel E.; Stankus, John J.; Davis, Julian M.
2011-01-01
A high-performance liquid chromatography experiment to determine the concentration of caffeine and vitamin B6 in sports energy drinks has been developed. This laboratory activity, which is appropriate for an upper-level instrumental analysis course, illustrates the standard addition method and simultaneous determination of two species. (Contains 1…
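A minimal sketch of the standard addition calculation mentioned above: fit a line to signal versus spiked concentration and read the unknown from the magnitude of the x-intercept. The spike levels and peak areas below are invented example values, not data from the described experiment.

```python
# Standard addition method: the sample is spiked with known analyte amounts, a line is
# fitted, and the original concentration equals |x-intercept| of that line.
import numpy as np

added = np.array([0.0, 5.0, 10.0, 15.0, 20.0])           # mg/L caffeine added (hypothetical)
signal = np.array([120.0, 175.0, 232.0, 289.0, 341.0])   # HPLC peak area (hypothetical)

slope, intercept = np.polyfit(added, signal, 1)
c_unknown = intercept / slope                             # |x-intercept| = original concentration
print(f"Estimated caffeine concentration: {c_unknown:.1f} mg/L")
```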
Accounting for Slipping and Other False Negatives in Logistic Models of Student Learning
ERIC Educational Resources Information Center
MacLellan, Christopher J.; Liu, Ran; Koedinger, Kenneth R.
2015-01-01
Additive Factors Model (AFM) and Performance Factors Analysis (PFA) are two popular models of student learning that employ logistic regression to estimate parameters and predict performance. This is in contrast to Bayesian Knowledge Tracing (BKT) which uses a Hidden Markov Model formalism. While all three models tend to make similar predictions,…
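As a hedged illustration of the AFM form referenced above, the sketch below computes a predicted probability of a correct response from a student intercept plus per-knowledge-component (KC) difficulty and learning-rate terms; the parameter values are illustrative, not fitted to any dataset.

```python
# Additive Factors Model prediction: log-odds of a correct response is a student ability
# term plus, for each KC the item exercises, a KC difficulty and a learning-rate term
# scaled by the number of prior practice opportunities on that KC.
import math

def afm_p_correct(theta_student, kcs, beta, gamma, opportunities):
    """kcs: KC names tagged to the item; opportunities: prior practice count per KC."""
    logit = theta_student
    for kc in kcs:
        logit += beta[kc] + gamma[kc] * opportunities[kc]
    return 1.0 / (1.0 + math.exp(-logit))

beta = {"fractions": -0.8, "decimals": -0.3}     # KC difficulty (hypothetical)
gamma = {"fractions": 0.15, "decimals": 0.10}    # KC learning rate (hypothetical)
opps = {"fractions": 4, "decimals": 2}

print(f"P(correct) = {afm_p_correct(0.2, ['fractions', 'decimals'], beta, gamma, opps):.2f}")
```

PFA follows the same logistic form but splits the opportunity count into prior successes and failures with separate weights.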
NASA Technical Reports Server (NTRS)
Egebrecht, R. A.; Thorbjornsen, A. R.
1967-01-01
Digital computer programs determine steady-state performance characteristics of active and passive linear circuits. The ac analysis program solves for the basic circuit parameters. The compiler program solves for these circuit parameters and, in addition, provides a more versatile program by allowing the user to perform mathematical and logical operations.
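A rough modern analogue of the steady-state AC analysis described above, assuming nodal analysis with complex admittances (the original programs are not available here); the circuit and component values are arbitrary examples.

```python
# Steady-state AC nodal analysis of a simple RC low-pass filter at one frequency:
# build the complex admittance matrix for the unknown node and solve Y*V = I.
import numpy as np

f = 1e3                       # Hz
w = 2 * np.pi * f
R, C = 1e3, 100e-9            # 1 kOhm series resistor, 100 nF shunt capacitor
Vs = 1.0 + 0j                 # ideal 1 V source driving the node through R

Y = np.array([[1/R + 1j*w*C]], dtype=complex)   # admittances attached to the output node
I = np.array([Vs / R], dtype=complex)           # source current injected through R
Vout = np.linalg.solve(Y, I)[0]

print(f"|Vout| = {abs(Vout):.3f} V, phase = {np.degrees(np.angle(Vout)):.1f} deg")
```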
An economic cost analysis of emergency department key performance indicators in Ireland.
Gannon, Brenda; Jones, Cheryl; McCabe, Aileen; O'Sullivan, Ronan; Wakai, Abel
2017-06-01
High quality data is fundamental to using key performance indicators (KPIs) for performance monitoring. However, the resources required to collect high quality data are often significant and should usually be targeted at high priority areas. As part of a study of 11 emergency department (ED) KPIs in Ireland, the primary objective of this study was to estimate the relative cost of collecting the additional minimum data set (MDS) elements for those 11 KPIs. An economic cost analysis focused on 12 EDs in the Republic of Ireland. The resource use data were obtained using two separate focus group interviews. The number of available MDS elements was obtained from a sample of 100 patient records per KPI per participating ED. Unit costs for all resource use were taken at the midpoint of the relevant staff salary scales. An ED would need to spend an estimated additional €3561 per month on average to capture all the MDS elements relevant to the 11 KPIs investigated. The additional cost ranges from 14.8 to 39.2%; this range is 13.9-32.3% for small EDs, whereas the range for medium EDs is 11.7-40%. Regional EDs have a higher additional estimated cost to capture all the relevant MDS elements (€3907), compared with urban EDs (€3353). The additional cost of data collection, contingent on that already collected, required to capture all the relevant MDS elements for the KPIs examined, ranges from 14.8 to 39.2% per KPI, with variation identified between regional and urban hospitals.
Coupé, Veerle M. H.; Knottnerus, Bart J.; Geerlings, Suzanne E.; Moll van Charante, Eric P.; ter Riet, Gerben
2017-01-01
Background: Uncomplicated Urinary Tract Infections (UTIs) are common in primary care resulting in substantial costs. Since antimicrobial resistance against antibiotics for UTIs is rising, accurate diagnosis is needed in settings with low rates of multidrug-resistant bacteria. Objective: To compare the cost-effectiveness of different strategies to diagnose UTIs in women who contacted their general practitioner (GP) with painful and/or frequent micturition between 2006 and 2008 in and around Amsterdam, The Netherlands. Methods: This is a model-based cost-effectiveness analysis using data from 196 women who underwent four tests: history, urine stick, sediment, dipslide, and the gold standard, a urine culture. Decision trees were constructed reflecting 15 diagnostic strategies comprising different parallel and sequential combinations of the four tests. Using the decision trees, for each strategy the costs and the proportion of women with a correct positive or negative diagnosis were estimated. Probabilistic sensitivity analysis was used to estimate uncertainty surrounding costs and effects. Uncertainty was presented using cost-effectiveness planes and acceptability curves. Results: Most sequential testing strategies resulted in higher proportions of correctly classified women and lower costs than parallel testing strategies. For different willingness to pay thresholds, the most cost-effective strategies were: 1) performing a dipstick after a positive history for thresholds below €10 per additional correctly classified patient, 2) performing both a history and dipstick for thresholds between €10 and €17 per additional correctly classified patient, 3) performing a dipstick if history was negative, followed by a sediment if the dipstick was negative for thresholds between €17 and €118 per additional correctly classified patient, 4) performing a dipstick if history was negative, followed by a dipslide if the dipstick was negative for thresholds above €118 per additional correctly classified patient. Conclusion: Depending on decision makers’ willingness to pay for one additional correctly classified woman, the strategy consisting of performing a history and dipstick simultaneously (ceiling ratios between €10 and €17) or performing a sediment if history and subsequent dipstick are negative (ceiling ratios between €17 and €118) are the most cost-effective strategies to diagnose a UTI. PMID:29186185
Bosmans, Judith E; Coupé, Veerle M H; Knottnerus, Bart J; Geerlings, Suzanne E; Moll van Charante, Eric P; Ter Riet, Gerben
2017-01-01
Uncomplicated Urinary Tract Infections (UTIs) are common in primary care resulting in substantial costs. Since antimicrobial resistance against antibiotics for UTIs is rising, accurate diagnosis is needed in settings with low rates of multidrug-resistant bacteria. To compare the cost-effectiveness of different strategies to diagnose UTIs in women who contacted their general practitioner (GP) with painful and/or frequent micturition between 2006 and 2008 in and around Amsterdam, The Netherlands. This is a model-based cost-effectiveness analysis using data from 196 women who underwent four tests: history, urine stick, sediment, dipslide, and the gold standard, a urine culture. Decision trees were constructed reflecting 15 diagnostic strategies comprising different parallel and sequential combinations of the four tests. Using the decision trees, for each strategy the costs and the proportion of women with a correct positive or negative diagnosis were estimated. Probabilistic sensitivity analysis was used to estimate uncertainty surrounding costs and effects. Uncertainty was presented using cost-effectiveness planes and acceptability curves. Most sequential testing strategies resulted in higher proportions of correctly classified women and lower costs than parallel testing strategies. For different willingness to pay thresholds, the most cost-effective strategies were: 1) performing a dipstick after a positive history for thresholds below €10 per additional correctly classified patient, 2) performing both a history and dipstick for thresholds between €10 and €17 per additional correctly classified patient, 3) performing a dipstick if history was negative, followed by a sediment if the dipstick was negative for thresholds between €17 and €118 per additional correctly classified patient, 4) performing a dipstick if history was negative, followed by a dipslide if the dipstick was negative for thresholds above €118 per additional correctly classified patient. Depending on decision makers' willingness to pay for one additional correctly classified woman, the strategy consisting of performing a history and dipstick simultaneously (ceiling ratios between €10 and €17) or performing a sediment if history and subsequent dipstick are negative (ceiling ratios between €17 and €118) are the most cost-effective strategies to diagnose a UTI.
DOE Office of Scientific and Technical Information (OSTI.GOV)
John Smart
A preliminary analysis of data from The EV Project was performed to begin answering the question: are corridor charging stations used to extend the range of electric vehicles? Data analyzed were collected from Blink brand electric vehicle supply equipment (EVSE) units based in California, Washington, and Oregon. Analysis was performed on data logged between October 1, 2012 and January 1, 2013. It should be noted that as additional AC Level 2 EVSE and DC fast chargers are deployed, and as drivers become more familiar with the use of public charging infrastructure, future analysis may have dissimilar conclusions.
An urban energy performance evaluation system and its computer implementation.
Wang, Lei; Yuan, Guan; Long, Ruyin; Chen, Hong
2017-12-15
To improve the urban environment and effectively reflect and promote urban energy performance, an urban energy performance evaluation system was constructed, thereby strengthening urban environmental management capabilities. From the perspectives of internalization and externalization, and drawing on established theory and previous studies, a framework of evaluation indicators and key factors was proposed to determine urban energy performance and explore the reasons for differences in performance. Using the improved stochastic frontier analysis method, an urban energy performance evaluation and factor analysis model was built that brings performance evaluation and factor analysis into the same stage for study. According to data obtained for the Chinese provincial capitals from 2004 to 2013, the coefficients of the evaluation indicators and key factors were calculated by the urban energy performance evaluation and factor analysis model. These coefficients were then used to compile the program file. The urban energy performance evaluation system developed in this study was designed in three parts: a database, a distributed component server, and a human-machine interface. Its functions were designed as login, addition, edit, input, calculation, analysis, comparison, inquiry, and export. On the basis of these contents, an urban energy performance evaluation system was developed using Microsoft Visual Studio .NET 2015. The system can effectively reflect the status of and any changes in urban energy performance. Beijing was considered as an example to conduct an empirical study, which further verified the applicability and convenience of this evaluation system. Copyright © 2017 Elsevier Ltd. All rights reserved.
Zhang, Lu; Huang, Jinhua; Youssef, Kyrrilos; ...
2014-10-31
A novel electrolyte additive, 3-oxabicyclo[3.1.0]hexane-2,4-dione (OHD), has been discovered and evaluated in Li1.1(Mn1/3Ni1/3Co1/3)0.9O2/graphite cells under elevated temperature. When an appropriate amount of OHD is used, the cell capacity retention is improved from 60% (Gen 2 electrolyte alone) to 82% (Gen 2 electrolyte plus OHD) after 200 cycles with no obvious impedance increase. The amount of OHD added is the key to achieving the superior cell performance. In conclusion, the effect of OHD additive was investigated by means of electrochemical analysis, Fourier transform infrared spectroscopy, scanning electron microscopy, and density functional theory computation.
Kohlmann, Sebastian; Gierk, Benjamin
2017-01-01
Background: Although depression is common in patients with heart disease, screening for depression is much debated. DEPSCREEN-INFO showed that a patient-targeted feedback in addition to screening results in a lower depression level six months after screening. The purpose of this analysis was to perform a cost-effectiveness analysis of DEPSCREEN-INFO. Methods: Patients with coronary heart disease or arterial hypertension were included. Participants in both groups were screened for depression. Participants in the intervention group additionally received a patient-targeted feedback of their result and recommended treatment options. A cost-utility analysis using quality-adjusted life years (QALY) based on the EQ-5D was performed. The time horizon was 6 months. Resource utilization was assessed by a telephone interview. Multiple imputation using chained equations was used. Net-benefit regressions controlled for prognostic variables at baseline were performed to construct cost-effectiveness acceptability curves. Different sensitivity analyses were performed. Results: 375 participants (intervention group: 155; control group: 220) were included at baseline. After 6 months, in the intervention group adjusted total costs were lower (-€2,098; SE: €1,717) and more QALY were gained (0.0067; SD: 0.0133); yet differences were not statistically significant. The probability of cost-effectiveness was around 80% independent of the willingness-to-pay (range: €0/QALY–€130,000/QALY). The results were robust. Conclusions: A patient-targeted feedback in addition to depression screening in cardiology is cost-effective with a high probability. This underpins the use of the patient-targeted feedback and the PHQ-9, which are both freely available and easy to implement in routine care. PMID:28806775
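A minimal sketch of the net-benefit and acceptability-curve machinery named above, using simulated data rather than the DEPSCREEN-INFO trial data: net monetary benefit is regressed on a group indicator and bootstrapped across willingness-to-pay values to approximate the probability of cost-effectiveness.

```python
# Net monetary benefit (NMB = lambda*QALY - cost) regression with a bootstrap CEAC.
# Group effects, costs and QALYs are simulated placeholders, not trial results.
import numpy as np

rng = np.random.default_rng(2)
n = 200
group = rng.integers(0, 2, n)                               # 1 = intervention
qaly = 0.40 + 0.007 * group + rng.normal(0, 0.05, n)
cost = 5000 - 2000 * group + rng.normal(0, 3000, n)

def prob_cost_effective(lam, n_boot=2000):
    hits = 0
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)                         # bootstrap resample
        nmb = lam * qaly[idx] - cost[idx]
        X = np.column_stack([np.ones(n), group[idx]])
        coef = np.linalg.lstsq(X, nmb, rcond=None)[0]       # OLS net-benefit regression
        hits += coef[1] > 0                                 # intervention coefficient positive?
    return hits / n_boot

for lam in (0, 30000, 130000):
    print(f"WTP {lam:>7} EUR/QALY: P(cost-effective) = {prob_cost_effective(lam):.2f}")
```

A fuller analysis would add the baseline prognostic covariates to the regression, as the study describes.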
Constrained independent component analysis approach to nonobtrusive pulse rate measurements
NASA Astrophysics Data System (ADS)
Tsouri, Gill R.; Kyal, Survi; Dianat, Sohail; Mestha, Lalit K.
2012-07-01
Nonobtrusive pulse rate measurement using a webcam is considered. We demonstrate how state-of-the-art algorithms based on independent component analysis suffer from a sorting problem which hinders their performance, and propose a novel algorithm based on constrained independent component analysis to improve performance. We present how the proposed algorithm extracts a photoplethysmography signal and resolves the sorting problem. In addition, we perform a comparative study between the proposed algorithm and state-of-the-art algorithms over 45 video streams using a finger probe oximeter for reference measurements. The proposed algorithm provides improved accuracy: the root mean square error is decreased from 20.6 and 9.5 beats per minute (bpm) for existing algorithms to 3.5 bpm for the proposed algorithm. An error of 3.5 bpm is within the inaccuracy expected from the reference measurements. This implies that the proposed algorithm provided performance of equal accuracy to the finger probe oximeter.
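A simplified sketch of the general pipeline described above, substituting standard FastICA plus a spectral-peak selection step for the paper's constrained ICA (the selection step is one crude way to sidestep the sorting problem); the RGB traces are synthetic, not webcam recordings.

```python
# Separate mean R/G/B traces into independent components and keep the component with the
# strongest spectral peak in the plausible pulse band (0.75-4 Hz, i.e. 45-240 bpm).
import numpy as np
from sklearn.decomposition import FastICA

fs = 30.0                                    # assumed webcam frame rate (Hz)
t = np.arange(0, 30, 1/fs)
rng = np.random.default_rng(3)
pulse = np.sin(2*np.pi*1.2*t)                # 72 bpm synthetic ground truth
rgb = np.column_stack([pulse*a + rng.normal(0, 1.0, t.size) for a in (0.3, 0.8, 0.4)])

sources = FastICA(n_components=3, random_state=0).fit_transform(rgb)

def band_peak(x):
    spec = np.abs(np.fft.rfft(x - x.mean()))
    freqs = np.fft.rfftfreq(x.size, 1/fs)
    band = (freqs >= 0.75) & (freqs <= 4.0)
    k = np.argmax(spec * band)               # strongest in-band spectral line
    return freqs[k], spec[k]

best_freq, _ = max((band_peak(sources[:, i]) for i in range(3)), key=lambda fp: fp[1])
print(f"Estimated pulse rate: {best_freq*60:.0f} bpm")
```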
Spinal schwannomatosis in the absence of neurofibromatosis: A very rare condition
Landi, A.; Dugoni, D.E.; Marotta, N.; Mancarella, C.; Delfini, R.
2010-01-01
Schwannomatosis is defined as an extremely rare tumor syndrome characterized by the presence of multiple schwannomas in the absence of typical signs of NF1 and NF2 syndromes. The genetic and molecular analysis performed on these tumors makes it possible to recognize schwannomatosis as a distinct clinical and genetic syndrome. The treatment in the case of symptomatic lesions is surgical removal; if the lesions are asymptomatic it is better to perform serial MRI studies. Given the high incidence of developing additional lesions in patients with schwannomatosis, it remains imperative to perform serial brain and spinal cord MRI studies during follow-up. The differential diagnosis is important, combining clinical and radiological criteria with molecular genetic analysis of tumor cells and lymphocyte DNA. We report a rare case of spinal schwannomatosis in which genetic analysis performed on surgical samples showed two different mutations in the cells of the two lesions. PMID:22096683
NASA Technical Reports Server (NTRS)
Rockfeller, W C
1939-01-01
Equations have been developed for the analysis of the performance of the ideal airplane, leading to an approximate physical interpretation of the performance problem. The basic sea-level airplane parameters have been generalized to altitude parameters and a new parameter has been introduced and physically interpreted. The performance analysis for actual airplanes has been obtained in terms of the equivalent ideal airplane in order that the charts developed for use in practical calculations will for the most part apply to any type of engine-propeller combination and system of control, the only additional material required being the actual engine and propeller curves for the propulsion unit. Finally, a more exact method for the calculation of the climb characteristics for the constant-speed controllable propeller is presented in the appendix.
Trampush, J W; Yang, M L Z; Yu, J; Knowles, E; Davies, G; Liewald, D C; Starr, J M; Djurovic, S; Melle, I; Sundet, K; Christoforou, A; Reinvang, I; DeRosse, P; Lundervold, A J; Steen, V M; Espeseth, T; Räikkönen, K; Widen, E; Palotie, A; Eriksson, J G; Giegling, I; Konte, B; Roussos, P; Giakoumaki, S; Burdick, K E; Payton, A; Ollier, W; Horan, M; Chiba-Falek, O; Attix, D K; Need, A C; Cirulli, E T; Voineskos, A N; Stefanis, N C; Avramopoulos, D; Hatzimanolis, A; Arking, D E; Smyrnis, N; Bilder, R M; Freimer, N A; Cannon, T D; London, E; Poldrack, R A; Sabb, F W; Congdon, E; Conley, E D; Scult, M A; Dickinson, D; Straub, R E; Donohoe, G; Morris, D; Corvin, A; Gill, M; Hariri, A R; Weinberger, D R; Pendleton, N; Bitsios, P; Rujescu, D; Lahti, J; Le Hellard, S; Keller, M C; Andreassen, O A; Deary, I J; Glahn, D C; Malhotra, A K; Lencz, T
2017-03-01
The complex nature of human cognition has resulted in cognitive genomics lagging behind many other fields in terms of gene discovery using genome-wide association study (GWAS) methods. In an attempt to overcome these barriers, the current study utilized GWAS meta-analysis to examine the association of common genetic variation (~8M single-nucleotide polymorphisms (SNP) with minor allele frequency ⩾1%) to general cognitive function in a sample of 35 298 healthy individuals of European ancestry across 24 cohorts in the Cognitive Genomics Consortium (COGENT). In addition, we utilized individual SNP lookups and polygenic score analyses to identify genetic overlap with other relevant neurobehavioral phenotypes. Our primary GWAS meta-analysis identified two novel SNP loci (top SNPs: rs76114856 in the CENPO gene on chromosome 2 and rs6669072 near LOC105378853 on chromosome 1) associated with cognitive performance at the genome-wide significance level (P < 5 × 10−8). Gene-based analysis identified an additional three Bonferroni-corrected significant loci at chromosomes 17q21.31, 17p13.1 and 1p13.3. Altogether, common variation across the genome resulted in a conservatively estimated SNP heritability of 21.5% (s.e.=0.01%) for general cognitive function. Integration with prior GWAS of cognitive performance and educational attainment yielded several additional significant loci. Finally, we found robust polygenic correlations between cognitive performance and educational attainment, several psychiatric disorders, birth length/weight and smoking behavior, as well as a novel genetic association to the personality trait of openness. These data provide new insight into the genetics of neurocognitive function with relevance to understanding the pathophysiology of neuropsychiatric illness.
Cheng, Yang; Zhu, Yun; Huang, Xiuping; Zhang, Wei; Han, Zelong; Liu, Side
2015-01-01
The associations between toll-like receptor 2 (TLR2) and toll-like receptor 4 (TLR4) polymorphisms and inflammatory bowel disease (IBD) susceptibility remain controversial. A meta-analysis was performed to assess these associations. A systematic search was performed to identify all relevant studies relating TLR2 and TLR4 polymorphisms to IBD susceptibility. Odds ratios (ORs) and 95% confidence intervals (CIs) were calculated. Subgroup analyses were performed by ethnicity and publication quality. Thirty-eight eligible studies, assessing 10,970 cases and 7,061 controls, were included. No TLR2 Arg677Trp polymorphism was found. No significant association was observed between the TLR2 Arg753Gln polymorphism and Crohn's disease (CD) or ulcerative colitis (UC) in any genetic model. Interestingly, the TLR4 Asp299Gly polymorphism was significantly associated with increased risk of CD and UC in all genetic models, except for the additive one in CD. In addition, a statistically significant association between TLR4 Asp299Gly polymorphism and IBD was observed among high quality studies evaluating Caucasians, but not Asians. Associations between TLR4 Thr399Ile polymorphisms and CD risk were found only in the allele and dominant models. The TLR4 Thr399Ile polymorphism was associated with UC risk in pooled results as well as subgroup analysis of high quality publications assessing Caucasians, in allele and dominant models. The meta-analysis provides evidence that TLR2 Arg753Gln is not associated with CD and UC susceptibility in Asians; TLR4 Asp299Gly is associated with CD and UC susceptibility in Caucasians, but not Asians. TLR4 Thr399Ile may be associated with IBD susceptibility in Caucasians only. Additional well-powered studies of Asp299Gly and other TLR4 variants are warranted.
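For context on how pooled odds ratios such as those above are obtained, here is a minimal fixed-effect inverse-variance pooling sketch with placeholder per-study estimates (not the reviewed studies' data).

```python
# Fixed-effect (inverse-variance) pooling of log odds ratios with a 95% CI.
import numpy as np

log_or = np.log(np.array([1.45, 1.20, 1.80, 1.05]))   # hypothetical per-study ORs
se = np.array([0.20, 0.15, 0.30, 0.25])               # hypothetical SE of each log(OR)

w = 1.0 / se**2                                        # inverse-variance weights
pooled = np.sum(w * log_or) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))
ci = np.exp(pooled + np.array([-1.96, 1.96]) * se_pooled)

print(f"Pooled OR = {np.exp(pooled):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```

A random-effects model would additionally estimate between-study heterogeneity and widen the interval accordingly.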
Levy, A R; Perry, J; Nicholls, A R; Larkin, D; Davies, J
2015-01-01
This study explored the mediating role of sport confidence upon (1) the relationship between sources of sport confidence and performance and (2) the relationship between imagery and performance. Participants were 157 competitive athletes who completed state measures of confidence level/sources, imagery type and performance within one hour after competition. Among the current sample, confirmatory factor analysis revealed appropriate support for the nine-factor SSCQ and the five-factor SIQ. Mediational analysis revealed that sport confidence had a mediating influence upon the relationship between the achievement source of confidence and performance. In addition, both cognitive and motivational imagery types were found to be important sources of confidence, as sport confidence mediated the imagery type-performance relationship. Findings indicated that athletes who construe confidence from their own achievements and report multiple images on a more frequent basis are likely to benefit from enhanced levels of state sport confidence and subsequent performance.
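A minimal sketch of a product-of-coefficients mediation test of the kind summarised above, with a bootstrap confidence interval for the indirect effect; the data are simulated and the effect sizes are arbitrary.

```python
# Indirect effect of imagery on performance through confidence: a (imagery -> confidence)
# times b (confidence -> performance, controlling for imagery), with a bootstrap CI.
import numpy as np

rng = np.random.default_rng(4)
n = 157
imagery = rng.normal(0, 1, n)
confidence = 0.5 * imagery + rng.normal(0, 1, n)                       # path a
performance = 0.4 * confidence + 0.1 * imagery + rng.normal(0, 1, n)   # paths b and c'

def indirect(idx):
    X_a = np.column_stack([np.ones(idx.size), imagery[idx]])
    a = np.linalg.lstsq(X_a, confidence[idx], rcond=None)[0][1]
    X_b = np.column_stack([np.ones(idx.size), confidence[idx], imagery[idx]])
    b = np.linalg.lstsq(X_b, performance[idx], rcond=None)[0][1]
    return a * b

boot = np.array([indirect(rng.integers(0, n, n)) for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"Indirect effect a*b = {indirect(np.arange(n)):.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```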
Gamma Ray Observatory (GRO) OBC attitude error analysis
NASA Technical Reports Server (NTRS)
Harman, R. R.
1990-01-01
This analysis involves an in-depth look into the onboard computer (OBC) attitude determination algorithm. A review of TRW error analysis and necessary ground simulations to understand the onboard attitude determination process are performed. In addition, a plan is generated for the in-flight calibration and validation of OBC computed attitudes. Pre-mission expected accuracies are summarized and sensitivity of onboard algorithms to sensor anomalies and filter tuning parameters are addressed.
Processing of Ceramics by Biopolymers. Ultrastructure-Property Relationships in Biocrystals
1991-10-09
"Brick and mortar" microarchitecture with 0.5-µm-thick aragonite platelets and a 20-nm-thick organic matrix between them. An analysis performed by TEM...proteins with the precipitates, (iii) analysis of the calcite lattice in terms of protein occlusion (atomic resolution electron microscopy and...crystallites have definite crystallographic orientation relationships on the "in-layer" and "through-thickness" directions. In addition, analysis of
2007-06-15
the base case, a series analysis can be performed by varying the various inputs to the network to examine the impact of potential changes to improve...successfully interrogated was the primary MOE. • Based solely on the cost-benefit analysis, the RSTG found that the addition of an Unmanned Surface...cargo. The CBP uses a risk-based analysis and intelligence to pre-screen, assess and examine 100% of suspicious containers. The remaining cargo is
Reliability studies of Integrated Modular Engine system designs
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1993-01-01
A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.
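The binomial approximation mentioned above can be illustrated with a short sketch of k-of-n redundancy reliability; the module reliability and cluster sizes are illustrative assumptions, not values from the study.

```python
# Reliability of a redundant cluster that works if at least k of n identical, independent
# modules work, via the binomial expansion.
from math import comb

def k_of_n_reliability(k, n, p):
    """P(at least k of n modules succeed), modules independent with reliability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p_module = 0.98                                   # hypothetical per-module reliability
print(f"Non-redundant (1 of 1):   {k_of_n_reliability(1, 1, p_module):.4f}")
print(f"Networked, 3 of 4 needed: {k_of_n_reliability(3, 4, p_module):.6f}")
```

Markov models extend this by tracking failure and repair transitions over time rather than a single demand.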
High Resolution Neutron Radiography and Tomography of Hydrided Zircaloy-4 Cladding Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Tyler S; Bilheux, Hassina Z; Ray, Holly B
2015-01-01
Neutron radiography for hydrogen analysis was performed with several Zircaloy-4 cladding samples with controlled hydrogen concentrations up to 1100 ppm. Hydrogen charging was performed in a process tube that was heated to facilitate hydrogen absorption by the metal. A correlation between the hydrogen concentration in the hydrided tubes and the neutron intensity was established, by which hydrogen content can be determined precisely in a small area (55 µm x 55 µm). Radiography analysis was also performed to evaluate the heating rate and its correlation with the hydrogen distribution through hydrided materials. In addition to radiography analysis, tomography experiments were performed on Zircaloy-4 tube samples to study the local hydrogen distribution. Through tomography analysis a 3D reconstruction of the tube was evaluated in which an uneven hydrogen distribution in the circumferential direction can be observed.
Component Cost Analysis of Large Scale Systems
NASA Technical Reports Server (NTRS)
Skelton, R. E.; Yousuff, A.
1982-01-01
The idea of cost decomposition is summarized to aid in the determination of the relative cost (or 'price') of each component of a linear dynamic system using quadratic performance criteria. In addition to the insights into system behavior that are afforded by such a component cost analysis (CCA), these ideas naturally lead to a theory for cost-equivalent realizations.
SP-100 lithium thaw design, analysis, and testing
NASA Astrophysics Data System (ADS)
Choe, Hwang; Schrag, Michael R.; Koonce, David R.; Gamble, Robert E.; Halfen, Frank J.; Kirpich, Aaron S.
1993-01-01
The thaw design has been established for the 100 kWe SP-100 Space Reactor Power System. System thaw/startup analysis has confirmed that all system thaw requirements are met, and that rethaw and restart can be easily accomplished with this design. In addition, a series of lithium thaw characterization tests has been performed, confirming key design assumptions.
An Overview of the NIRA Status
NASA Technical Reports Server (NTRS)
Hughes, William
2003-01-01
The NASA Glenn Research Center (GRC) has been tasked by NASA JSC's ISS Payloads Office to perform the NIRA (Non-Isolated Rack Assessment) microgravity prediction analysis task for the International Space Station. Previously, the NIRA analysis task had been performed by Boeing/Houston. Boeing's last NIRA analysis was released in 1999 and was denoted as "NIRA 99." GRC is currently close to completing our first full-NIRA analysis (encompassing the frequency range from 0 to 50 Hz) to be released as "NIRA 2003." This presentation will focus on describing the NIRA analysis, the transition of this analysis task from Boeing to GRC, and the current status and schedule for release of the NIRA 2003 results. Additionally, the results obtained from a mini-NIRA analysis requested by ESA and completed by GRC in the Spring of 2003 will be shown. This mini-analysis focused solely on predicting the microgravity environment at the COF-EPF (Columbus Orbiting Facility - External Payload Facility).
NASA Astrophysics Data System (ADS)
Brooks, R.
2012-05-01
The employment market for graduates is competitive, with employers requiring appropriate work experience in addition to academic qualifications. Sandwich courses, where up to a year is spent in industry, provide an opportunity for structured work experience to be gained alongside studying. Benefits of placements include improved academic performance and the development of transferable skills to increase employability. This paper evaluates the impact of placements on academic performance and graduate employment among management students. Analysis of performance data and graduate destinations data indicates that management students completing a placement are more likely to perform better academically, with improvements in their personal grades between year 2 and the final year. Additionally, a qualitative themed analysis of student experiences indicates placement students feel more confident in engaging with the graduate recruitment process, with a better understanding of their personal skills and an ability to articulate their experience in relation to the workplace.
Performance Optimization of Marine Science and Numerical Modeling on HPC Cluster
Yang, Dongdong; Yang, Hailong; Wang, Luming; Zhou, Yucong; Zhang, Zhiyuan; Wang, Rui; Liu, Yi
2017-01-01
Marine science and numerical modeling (MASNUM) is widely used in forecasting ocean wave movement, through simulating the variation tendency of the ocean wave. Although existing work has devoted efforts to improving the performance of MASNUM from various aspects, considerable room remains for further performance improvement. In this paper, we aim at improving the performance of the propagation solver and data access during the simulation, in addition to the efficiency of output I/O and load balance. Our optimizations include several effective techniques such as algorithm redesign, load distribution optimization, parallel I/O and data access optimization. The experimental results demonstrate that our approach achieves higher performance compared to the state-of-the-art work, about 3.5x speedup without degrading the prediction accuracy. In addition, the parameter sensitivity analysis shows our optimizations are effective under various topography resolutions and output frequencies. PMID:28045972
Performance Analysis of Constrained Loosely Coupled GPS/INS Integration Solutions
Falco, Gianluca; Einicke, Garry A.; Malos, John T.; Dovis, Fabio
2012-01-01
The paper investigates approaches for loosely coupled GPS/INS integration. Error performance is calculated using a reference trajectory. A performance improvement can be obtained by exploiting additional map information (for example, a road boundary). A constrained solution has been developed and its performance compared with an unconstrained one. The case of GPS outages is also investigated showing how a Kalman filter that operates on the last received GPS position and velocity measurements provides a performance benefit. Results are obtained by means of simulation studies and real data. PMID:23202241
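A minimal one-dimensional sketch of the loosely coupled idea described above: an INS-driven prediction with GPS position/velocity updates applied whenever a fix is available, and only the prediction running during an outage. All signals, noise levels and the outage window are synthetic assumptions for illustration.

```python
# Loosely coupled GPS/INS in 1-D: Kalman filter over [position, velocity], propagated with
# INS acceleration and corrected by GPS position/velocity measurements when present.
import numpy as np

dt = 0.1
F = np.array([[1, dt], [0, 1]])                   # state transition for [pos, vel]
B = np.array([[0.5*dt**2], [dt]])                 # control matrix for INS acceleration
Q = 1e-3 * np.eye(2)                              # process noise
H = np.eye(2)                                     # GPS measures position and velocity
R = np.diag([2.0**2, 0.2**2])                     # GPS measurement noise

x = np.zeros((2, 1))
P = np.eye(2)
rng = np.random.default_rng(5)

for step in range(600):                           # 60 s trajectory
    accel = 0.1 + rng.normal(0, 0.02)             # INS-sensed acceleration (with noise)
    x = F @ x + B * accel                         # predict
    P = F @ P @ F.T + Q
    gps_available = not (200 <= step < 300)       # simulate a 10 s GPS outage
    if gps_available:
        t_now = step * dt
        z = np.array([[0.5*0.1*t_now**2 + rng.normal(0, 2.0)],
                      [0.1*t_now + rng.normal(0, 0.2)]])
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (z - H @ x)                   # update
        P = (np.eye(2) - K @ H) @ P

print(f"Final estimate: pos = {x[0,0]:.1f} m, vel = {x[1,0]:.2f} m/s")
```

Map constraints of the kind studied in the paper would enter as additional pseudo-measurements restricting the state to the road boundary.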
Real-time computed tomography dosimetry during ultrasound-guided brachytherapy for prostate cancer.
Kaplan, Irving D; Meskell, Paul; Oldenburg, Nicklas E; Saltzman, Brian; Kearney, Gary P; Holupka, Edward J
2006-01-01
Ultrasound-guided implantation of permanent radioactive seeds is a treatment option for localized prostate cancer. Several techniques have been described for the optimal placement of the seeds in the prostate during this procedure. Postimplantation dosimetric calculations are performed after the implant. Areas of underdosing can only be corrected with either an external beam boost or by performing a second implant. We demonstrate the feasibility of performing computed tomography (CT)-based postplanning during the ultrasound-guided implant and subsequently correcting for underdosed areas. Ultrasound-guided brachytherapy is performed on a modified CT table with general anesthesia. The postplanning CT scan is performed after the implant, while the patient is still under anesthesia. Additional seeds are implanted into "cold spots," and the resultant dosimetry confirmed with CT. Intraoperative postplanning was successfully performed. Dose-volume histograms demonstrated adequate dose coverage during the initial implant, but on detailed analysis, for some patients, areas of underdosing were observed either at the apex or the peripheral zone. Additional seeds were implanted to bring these areas to prescription dose. Intraoperative postplanning is feasible during ultrasound-guided brachytherapy for prostate cancer. Although the postimplant dose-volume histograms for all patients, before the implantation of additional seeds, were adequate according to the American Brachytherapy Society criteria, specific critical areas can be underdosed. Additional seeds can then be implanted to optimize the dosimetry and reduce the risk of underdosing areas of cancer.
Habitat Utilization Assessment - Building in Behaviors
NASA Technical Reports Server (NTRS)
Whitmore, Mihriban; Blume, Jennifer
2004-01-01
Habitability, and the associated architectural and design attributes of an environment, is a powerful performance shaping factor. By identifying how inhabitants use an area, we can draw conclusions about what design or architectural attributes cause what behaviors and systematically design in desired human performance. We are analyzing how a crew uses a long duration habitat and work environment during a four-day underwater mission and identifying certain architectural and design attributes that are related to, and potential enablers of, certain crew behaviors. By identifying how inhabitants use the habitat, we can draw conclusions about what habitability attributes cause what behaviors and systematically design in desired human performance (applicable to NASA's Bioastronautics Human Behavior and Performance Critical Path Roadmap question 6.12). This assessment replicates a methodology reported in a chapter titled "Sociokinetic Analysis as a Tool for Optimization of Environmental Design" by C. Adams. That study collected video imagery of certain areas of a closed habitat during a 91-day test and from that data calculated time spent in different volumes during the mission, and characterized the behaviors occurring in certain habitat volumes thus concluding various rules for design of such habitats. This study assesses the utilization of the Aquarius Habitat, an underwater station, which will support six Aquanauts for a fourteen-day mission during which the crew will perform specific scientific and engineering studies. Video is recorded for long uninterrupted periods of time during the mission and from that data the time spent in each area is calculated. In addition, qualitative and descriptive analysis of the types of behaviors in each area is performed with the purpose of identifying any behaviors that are not typical of a certain area. If a participant uses an area in a way different from expected, a subsequent analysis of the features of that area may result in conclusions of performance shaping factors. With the addition of this study, we can make comparisons between the two different habitats and begin drawing correlation judgments about design features and behavior. Ideally, this methodology should be repeated in additional Aquarius missions and other analog environments because the real information will come from comparisons between habitats.
An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon
NASA Technical Reports Server (NTRS)
Rutherford, Brian
2000-01-01
The ability to make credible system assessments, predictions and design decisions related to engineered systems and other complex phenomenon is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomenon is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluation of the potential information provided for each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components - one component that could be explained through the additional computational simulation runs and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system. The set of runs indicating the largest potential reduction in uncertainty is then selected and the computational simulations are performed. Examples are provided to demonstrate this approach on small scale problems. These examples give encouraging results. Directions for further research are indicated.
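One way to make the selection idea above concrete is to score each candidate set of runs by the predictive variance a surrogate model would leave unexplained at the points where the performance measure is evaluated. The sketch below uses a Gaussian process with fixed hyperparameters as that surrogate; this is an assumption of the illustration, not the paper's specific formulation.

```python
# Score candidate run sets by the average posterior variance remaining at evaluation points.
# For a GP with fixed hyperparameters the posterior variance depends only on input
# locations, so dummy outputs can be used when conditioning on a candidate set.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(6)
X_run = rng.uniform(0, 1, (8, 1))                 # computer runs already performed
y_run = np.sin(3 * X_run).ravel()                 # stand-in response model outputs
X_eval = np.linspace(0, 1, 50).reshape(-1, 1)     # where the performance measure is needed

kernel = RBF(length_scale=0.2)
candidates = [rng.uniform(0, 1, (3, 1)) for _ in range(5)]   # candidate sets of 3 new runs

def remaining_variance(X_new):
    X_all = np.vstack([X_run, X_new])
    y_all = np.concatenate([y_run, np.zeros(len(X_new))])    # outputs irrelevant for variance
    gp = GaussianProcessRegressor(kernel=kernel, optimizer=None).fit(X_all, y_all)
    _, std = gp.predict(X_eval, return_std=True)
    return np.mean(std**2)

scores = [remaining_variance(c) for c in candidates]
best = int(np.argmin(scores))
print(f"Candidate set {best} leaves the least unexplained variance: {scores[best]:.4f}")
```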
Elsawy, Amr S; Eldawlatly, Seif; Taher, Mohamed; Aly, Gamal M
2014-01-01
The current trend to use Brain-Computer Interfaces (BCIs) with mobile devices mandates the development of efficient EEG data processing methods. In this paper, we demonstrate the performance of a Principal Component Analysis (PCA) ensemble classifier for P300-based spellers. We recorded EEG data from multiple subjects using the Emotiv neuroheadset in the context of a classical oddball P300 speller paradigm. We compare the performance of the proposed ensemble classifier to the performance of traditional feature extraction and classifier methods. Our results demonstrate the capability of the PCA ensemble classifier to classify P300 data recorded using the Emotiv neuroheadset with an average accuracy of 86.29% on cross-validation data. In addition, offline testing of the recorded data reveals an average classification accuracy of 73.3% that is significantly higher than that achieved using traditional methods. Finally, we demonstrate the effect of the parameters of the P300 speller paradigm on the performance of the method.
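A simplified sketch of a PCA ensemble classifier in the spirit of the method above: each member is trained on a bootstrap sample, projects epochs onto principal components, and a linear discriminant votes on target versus non-target epochs. The EEG-like data are synthetic, not Emotiv recordings, and the epoch dimensions are assumptions.

```python
# Bootstrap ensemble of PCA + LDA members voting on P300 target vs. non-target epochs.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
n_epochs, n_features = 400, 14 * 32               # 14 channels x 32 samples per epoch (assumed)
y = rng.integers(0, 2, n_epochs)                  # 1 = target (P300 present)
X = rng.normal(0, 1, (n_epochs, n_features))
X[y == 1, :32] += 0.6                             # crude P300-like deflection on one channel

members = []
for seed in range(10):                            # ensemble of 10 PCA+LDA members
    idx = rng.integers(0, n_epochs, n_epochs)     # bootstrap sample
    clf = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
    members.append(clf.fit(X[idx], y[idx]))

votes = np.mean([m.predict(X) for m in members], axis=0)
accuracy = np.mean((votes > 0.5) == y)
print(f"Training-set ensemble accuracy: {accuracy:.2f}")
```

A real evaluation would of course report cross-validated accuracy on held-out epochs, as the study does.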
Supplemental Hazard Analysis and Risk Assessment - Hydrotreater
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.
An advanced probabilistic structural analysis method for implicit performance functions
NASA Technical Reports Server (NTRS)
Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.
1989-01-01
In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
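A rough numerical sketch of the AMV idea for an implicit response follows; the closed-form placeholder function stands in for a finite-element code, the inputs are assumed independent normals, and the quantile correction shown is the general first-order construction rather than the authors' full procedure.

```python
# Advanced mean value sketch: linearize g(X) at the mean (mean-value first order), then for
# each probability level re-evaluate the exact g at the most-probable point of the
# linearized function to correct the estimated distribution of the response.
import numpy as np
from scipy.stats import norm

mu = np.array([2.0, 1.0])
sigma = np.array([0.2, 0.1])

def g(x):                                   # implicit performance function (placeholder)
    return x[0]**2 / (1.0 + x[1])

# gradient at the mean by finite differences (what a wrapped FE code would provide)
eps = 1e-5
grad = np.array([(g(mu + eps*np.eye(2)[i]) - g(mu)) / eps for i in range(2)])
a = grad * sigma                            # sensitivities in standardized space
a_norm = np.linalg.norm(a)

for p in (0.01, 0.5, 0.99):
    beta = norm.ppf(p)
    g_mv = g(mu) + beta * a_norm            # mean-value first-order estimate of the p-quantile
    u_star = beta * a / a_norm              # most probable point of the linearized g at level p
    g_amv = g(mu + u_star * sigma)          # AMV correction: exact g at that point
    print(f"p = {p:4.2f}: MV quantile {g_mv:.3f}, AMV quantile {g_amv:.3f}")
```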
Jäderkvist Fegraeus, K; Hirschberg, I; Árnason, T; Andersson, L; Velie, B D; Andersson, L S; Lindgren, G
2017-12-01
The Icelandic horse is a breed known mainly for its ability to perform the ambling four-beat gait 'tölt' and the lateral two-beat gait pace. The natural ability of the breed to perform these alternative gaits is highly desired by breeders. Therefore, the discovery that a nonsense mutation (C>A) in the DMRT3 gene was the main genetic factor for horses' ability to perform gaits in addition to walk, trot and canter was of great interest. Although several studies have demonstrated that homozygosity for the DMRT3 mutation is important for the ability to pace, only about 70% of the homozygous mutant (AA) Icelandic horses are reported to pace. The aim of the study was to genetically compare four- and five-gaited (i.e. horses with and without the ability to pace) AA Icelandic horses by performing a genome-wide association (GWA) analysis. All horses (n = 55) were genotyped on the 670K Axiom Equine Genotyping Array, and a GWA analysis was performed using the GenABEL package in R. No SNP demonstrated genome-wide significance, implying that the ability to pace goes beyond the presence of a single gene variant. Despite its limitations, the current study provides additional information regarding the genetic complexity of pacing ability in horses. However, to fully understand the genetic differences between four- and five-gaited AA horses, additional studies with larger sample materials and consistent phenotyping are needed. © 2017 Stichting International Foundation for Animal Genetics.
Numerical Analysis of Coolant Flow and Heat Transfer in ITER Diagnostic First Wall
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khodak, A.; Loesser, G.; Zhai, Y.
2015-07-24
We performed numerical simulations of the ITER Diagnostic First Wall (DFW) using ANSYS Workbench. During operation the DFW will include a solid main body as well as liquid coolant. Thermal and hydraulic analysis of the DFW was therefore performed using a conjugate heat transfer approach, in which heat transfer was resolved in both the solid and liquid parts, while fluid dynamics was solved only in the liquid part. This approach includes the interface between the solid and liquid parts of the system. The analysis was performed using ANSYS CFX software, which solves the heat transfer equations in the solid and liquid parts and the flow equations in the liquid part. Coolant flow in the DFW was assumed turbulent and was resolved using Reynolds-averaged Navier-Stokes equations with the Shear Stress Transport turbulence model. Meshing was performed using the CFX method available within ANSYS. The data cloud for thermal loading, consisting of volumetric heating and surface heating, was imported into CFX. The volumetric heating source was generated using the Attila software, and the surface heating was obtained from a radiation heat transfer analysis. Our results allowed us to identify areas of excessive heating. Proposals for cooling channel relocation were made, and additional suggestions were made to improve the hydraulic performance of the cooling system.
Inertial Sensor Technology for Elite Swimming Performance Analysis: A Systematic Review
Mooney, Robert; Corley, Gavin; Godfrey, Alan; Quinlan, Leo R; ÓLaighin, Gearóid
2015-01-01
Technical evaluation of swimming performance is an essential factor of elite athletic preparation. Novel methods of analysis, incorporating body worn inertial sensors (i.e., Microelectromechanical systems, or MEMS, accelerometers and gyroscopes), have received much attention recently from both research and commercial communities as an alternative to video-based approaches. This technology may allow for improved analysis of stroke mechanics, race performance and energy expenditure, as well as real-time feedback to the coach, potentially enabling more efficient, competitive and quantitative coaching. The aim of this paper is to provide a systematic review of the literature related to the use of inertial sensors for the technical analysis of swimming performance. This paper focuses on providing an evaluation of the accuracy of different feature detection algorithms described in the literature for the analysis of different phases of swimming, specifically starts, turns and free-swimming. The consequences associated with different sensor attachment locations are also considered for both single and multiple sensor configurations. Additional information such as this should help practitioners to select the most appropriate systems and methods for extracting the key performance related parameters that are important to them for analysing their swimmers’ performance and may serve to inform both applied and research practices. PMID:26712760
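As a rough illustration of the feature-detection step the review evaluates, the sketch below detects stroke cycles as peaks in a filtered, body-worn accelerometer signal. The synthetic signal, sampling rate, filter cutoff and thresholds are all assumptions for illustration and are not taken from any of the reviewed algorithms.

```python
# Minimal stroke-cycle detection sketch on a synthetic accelerometer trace.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 100.0                                  # assumed sampling rate [Hz]
t = np.arange(0, 30, 1 / fs)
accel = np.sin(2 * np.pi * 0.8 * t) + 0.3 * np.random.default_rng(1).normal(size=t.size)

# Low-pass filter to suppress sensor noise before peak detection.
b, a = butter(4, 3.0 / (fs / 2), btype="low")
smoothed = filtfilt(b, a, accel)

# One peak per stroke cycle; 'distance' enforces a plausible minimum stroke time.
peaks, _ = find_peaks(smoothed, height=0.5, distance=int(0.6 * fs))
stroke_rate = 60.0 * len(peaks) / (t[-1] - t[0])   # strokes per minute
print(f"Detected {len(peaks)} strokes, ~{stroke_rate:.1f} strokes/min")
```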
Smith, Erin; Cusack, Tara; Cunningham, Caitriona; Blake, Catherine
2017-10-01
This review examines the effect of a dual task on the gait parameters of older adults with a mean gait speed of 1.0 m/s or greater, and the effect of type and complexity of task. A systematic review of Web of Science, PubMed, SCOPUS, Embase, and PsycINFO was performed in July 2016. Twenty-three studies (28 data sets) were reviewed and pooled for meta-analysis. The effect size on seven gait parameters was measured as the raw mean difference between single- and dual-task performance. Gait speed significantly reduced with the addition of a dual task, with increasing complexity showing greater decrements. Cadence, stride time, and measures of gait variability were all negatively affected under the dual-task condition. In older adults, the addition of a dual task significantly reduces gait speed and cadence, with possible implications for the assessment of older people, as the addition of a dual task may expose deficits not observed under single-task assessment.
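A minimal sketch of the pooling step described above, assuming a fixed-effect, inverse-variance weighted raw mean difference; the per-study mean differences and standard errors are invented placeholders, not data from the review.

```python
# Fixed-effect pooling of raw mean differences (single- minus dual-task gait speed).
import numpy as np

# (mean difference [m/s], standard error) per study -- hypothetical values
studies = [(-0.12, 0.03), (-0.20, 0.05), (-0.08, 0.04)]
md = np.array([s[0] for s in studies])
se = np.array([s[1] for s in studies])

w = 1.0 / se ** 2                      # inverse-variance weights
pooled = np.sum(w * md) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(f"Pooled mean difference: {pooled:.3f} m/s, 95% CI ({ci[0]:.3f}, {ci[1]:.3f})")
```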
Response of the Alliance 1 Proof-of-Concept Airplane Under Gust Loads
NASA Technical Reports Server (NTRS)
Naser, A. S.; Pototzky, A. S.; Spain, C. V.
2001-01-01
This report presents the work performed by Lockheed Martin's Langley Program Office in support of NASA's Environmental Research Aircraft and Sensor Technology (ERAST) program. The primary purpose of this work was to develop and demonstrate a gust analysis method that accounts for the span-wise variation of gust velocity. This is important because these unmanned aircraft, which have high aspect ratios and low wing loading, are very flexible and fly at low speeds. The main focus of the work was therefore to perform a two-dimensional Power Spectral Density (PSD) analysis of the Alliance 1 Proof-of-Concept Unmanned Aircraft. As of this writing, none of the aircraft described in this report have been constructed; they are concepts represented by analytical models. The process first involved the development of suitable structural and aeroelastic Finite Element Models (FEM). This was followed by development of a one-dimensional PSD gust analysis, and then the two-dimensional PSD analysis of the Alliance 1. For further validation and comparison, two additional analyses were performed: a two-dimensional PSD gust analysis of a simple MSC/NASTRAN example problem, and a one-dimensional discrete gust analysis of the Alliance 1. This report describes this process, shows the relevant comparisons between analytical methods, and discusses the physical meaning of the results.
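A toy version of the one-dimensional PSD calculation mentioned above is sketched below: the RMS response follows from integrating the gust input spectrum through the squared magnitude of a frequency response function. The gust spectrum and single-mode transfer function are generic placeholders, not the Alliance 1 aeroelastic model or a standard von Karman/Dryden form.

```python
# RMS response from a PSD gust analysis: integrate |H(w)|^2 * Phi_gust(w) over frequency.
import numpy as np

omega = np.linspace(0.01, 50.0, 5000)          # frequency grid [rad/s]
sigma_w, L, V = 1.0, 300.0, 30.0               # assumed gust intensity, scale, airspeed

# Simple low-pass-shaped gust spectrum (placeholder only).
phi_gust = sigma_w ** 2 * (2 * L / (np.pi * V)) / (1 + (L * omega / V) ** 2)

# Placeholder structural frequency response: one lightly damped mode.
wn, zeta = 6.0, 0.03
H = 1.0 / (wn ** 2 - omega ** 2 + 2j * zeta * wn * omega)

phi_response = np.abs(H) ** 2 * phi_gust
domega = omega[1] - omega[0]
rms_response = np.sqrt(np.sum(phi_response) * domega)
print(f"RMS response: {rms_response:.4e} (arbitrary units)")
```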
Economic Evaluation of Single-Family-Residence Solar-Energy Installation
NASA Technical Reports Server (NTRS)
1982-01-01
Long-term economic performance of a commercial solar-energy system was analyzed and used to predict economic performance at four additional sites. Analysis described in report was done to demonstrate viability of design over a broad range of environmental/economic conditions. Report contains graphs and tables that present evaluation procedure and results. Also contains appendixes that aid in understanding methods used.
Deconstructing Hub Drag. Part 2. Computational Development and Analysis
2013-09-30
leveraged a Vertical Lift Consortium (VLC)-funded hub drag scaling research effort. To confirm this objective, correlations are performed with the...Technology™ Demonstrator aircraft using an unstructured computational solver. These simpler faired elliptical geometries can prove to be challenging ...possible. However, additional funding was obtained from the Vertical Lift Consortium (VLC) to perform this study. This analysis is documented in
Skrzek, Anna; Přidalová, Miroslava; Sebastjan, Anna; Harásková, Dominika; Fugiel, Jaroslaw; Ignasiak, Zofia; Slawinska, Teresa; Rozek, Krystyna
2015-08-01
The aim of the present study was an in-depth analysis of fine motor skills of the hands in elderly women from different socio-cultural backgrounds. The research also included analysis of the associations of age with the variables assessing right- and left-hand motor skills and its effect on hand performance asymmetry. The study examined 486 women over the age of 60. The study measured dominant and non-dominant hand performance using the motor performance series test battery (aiming, line tracking, inserting pins, tapping) from the Vienna test system. The best results in the tests assessing coordinated hand movements were achieved by the group of elderly women attending a University of the Third Age in Poland. This may be the result of a larger variety of physical activity programs offered at this type of institution. However, due to the cross-sectional design of the study, additional research of a longitudinal nature needs to be performed using the same sample of individuals to draw any definitive conclusions. Additionally, a decrease in the differences between dominant and non-dominant hand function with age was observed.
Wotango, Aselefech Sorsa; Su, Wei-Nien; Haregewoin, Atetegeb Meazah; Chen, Hung-Ming; Cheng, Ju-Hsiang; Lin, Ming-Hsien; Wang, Chia-Hsin; Hwang, Bing-Joe
2018-05-09
The performance of lithium ion batteries falls rapidly at lower temperatures due to the decreasing conductivity of the electrolyte and of the Solid Electrolyte Interphase (SEI) on the graphite anode. This limits the practical use of lithium ion batteries at sub-zero temperatures and also affects the development of lithium ion batteries for widespread applications. The SEI formed on the graphite surface is very influential in determining the performance of the battery. Herein, a new electrolyte additive, 4-Chloromethyl-1,3,2-dioxathiolane-2-oxide (CMDO), is prepared to improve the properties of commonly used electrolyte constituents, ethylene carbonate (EC) and fluoroethylene carbonate (FEC). The formation of an efficient passivation layer in a propylene carbonate (PC)-based electrolyte for the MCMB electrode was investigated. The addition of CMDO resulted in much lower irreversible capacity loss and induced the formation of a thin SEI. However, the combination of the three additives played a key role in enhancing the reversible capacity of the MCMB electrode at low and ambient temperatures. Electrochemical measurements showed that the SEI formed from a mixture of the three additives gave better intercalation-deintercalation of lithium ions.
Systems and methods for circuit lifetime evaluation
NASA Technical Reports Server (NTRS)
Heaps, Timothy L. (Inventor); Sheldon, Douglas J. (Inventor); Bowerman, Paul N. (Inventor); Everline, Chester J. (Inventor); Shalom, Eddy (Inventor); Rasmussen, Robert D. (Inventor)
2013-01-01
Systems and methods for estimating the lifetime of an electrical system in accordance with embodiments of the invention are disclosed. One embodiment of the invention includes iteratively performing Worst Case Analysis (WCA) on a system design with respect to different system lifetimes using a computer to determine the lifetime at which the worst case performance of the system indicates the system will pass with zero margin or fail within a predetermined margin for error given the environment experienced by the system during its lifetime. In addition, performing WCA on a system with respect to a specific system lifetime includes: identifying subcircuits within the system; performing Extreme Value Analysis (EVA) with respect to each subcircuit to determine whether the subcircuit fails EVA for the specific system lifetime; when the subcircuit passes EVA, determining that the subcircuit does not fail WCA for the specified system lifetime; when a subcircuit fails EVA, performing at least one additional WCA process that provides a tighter bound on the WCA than EVA to determine whether the subcircuit fails WCA for the specified system lifetime; determining that the system passes WCA with respect to the specific system lifetime when all subcircuits pass WCA; and determining that the system fails WCA when at least one subcircuit fails WCA.
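The screening logic described in this abstract can be sketched as follows; the predicate functions are placeholders for the real circuit analyses, and the structure (EVA first, tighter WCA only on EVA failures, iterate over candidate lifetimes) mirrors the description above rather than the patented implementation.

```python
# Schematic sketch of the EVA/WCA screening loop described in the abstract.
from typing import Callable, Iterable

def system_passes_wca(
    subcircuits: Iterable[str],
    passes_eva: Callable[[str, float], bool],
    passes_tight_wca: Callable[[str, float], bool],
    lifetime_years: float,
) -> bool:
    """Return True if every subcircuit passes WCA for the given lifetime."""
    for sub in subcircuits:
        if passes_eva(sub, lifetime_years):
            continue                       # EVA pass implies WCA pass
        if not passes_tight_wca(sub, lifetime_years):
            return False                   # one failing subcircuit fails the system
    return True

def estimate_lifetime(subcircuits, passes_eva, passes_tight_wca, candidate_lifetimes):
    """Iterate over candidate lifetimes (ascending) and return the longest
    lifetime for which the whole system still passes WCA."""
    best = 0.0
    for life in sorted(candidate_lifetimes):
        if system_passes_wca(subcircuits, passes_eva, passes_tight_wca, life):
            best = life
        else:
            break
    return best

# Toy usage with placeholder predicates:
subs = ["power", "telemetry", "adc"]
eva_ok = lambda s, yrs: not (s == "adc" and yrs > 10)
wca_ok = lambda s, yrs: yrs <= 12
print(estimate_lifetime(subs, eva_ok, wca_ok, [5, 10, 15, 20]))  # -> 10
```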
Removal of Cesium From Acidic Radioactive Tank Waste Using IONSIV IE-911 (CST)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mann, Nicholas Robert; Todd, Terry Allen
2004-10-01
IONSIV IE-911, or the engineered form of crystalline silicotitanate (CST), manufactured by UOP Molecular Sieves, has been evaluated for the removal of cesium from Idaho National Engineering and Environmental Laboratory (INEEL) acidic radioactive tank waste. A series of batch contacts and column tests were performed by using three separate batches of CST. Batch contacts were performed to evaluate the concentration effects of nitric acid, sodium, and potassium ions on cesium sorption. Additional batch tests were performed to determine if americium, mercury, and plutonium would sorb onto IONSIV IE-911. An equilibrium isotherm was generated by using a concentrated tank waste simulant. Column tests using a 1.5 cm³ column and flow rates of 3, 5, 10, 20, and 30 bed volumes (BV)/hr were performed to elucidate dynamic cesium sorption capacities and sorption kinetics. Additional experiments investigated the effect of CST batch and pretreatment on cesium sorption. The thermal stability of IONSIV IE-911 was evaluated by performing thermal gravimetric analysis/differential thermal analysis. Overall, IONSIV IE-911 was shown to be effective for cesium sorption from complex, highly acidic solutions; however, sorbent stability in these solutions may have a deleterious effect on cesium sorption.
Application of paper spray ionization for explosives analysis.
Tsai, Chia-Wei; Tipple, Christopher A; Yost, Richard A
2017-10-15
A desired feature in the analysis of explosives is to decrease the time of the entire analysis procedure, including sampling. A recently utilized ambient ionization technique, paper spray ionization (PSI), provides the possibility of combining sampling and ionization. However, an interesting phenomenon that occurs in generating negatively charged ions poses some challenges in applying PSI to explosives analysis. The goal of this work is to investigate possible solutions for generating explosives ions in negative mode PSI. The analysis of 2,4,6-trinitrotoluene (TNT), pentaerythritol tetranitrate (PETN), octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine (HMX), and 1,3,5-trinitroperhydro-1,3,5-triazine (RDX) was performed. Several solvent systems with different surface tensions and additives were compared to determine their effect on the ionization of explosives. The solvents tested included tert-butanol, isopropanol, methanol, and acetonitrile. The additives tested were carbon tetrachloride and ammonium nitrate. Of the solvents tested, isopropanol yielded the best results. In addition, adding ammonium nitrate to the isopropanol enhanced the analyte signal. Experimentally determined limits of detection (LODs) as low as 0.06 ng for PETN, on paper, were observed with isopropanol and the addition of 0.4 mM ammonium nitrate as the spray solution. In addition, the explosive components of two plastic explosive samples, Composition 4 and Semtex, were successfully analyzed via surface sampling using the developed method. The analysis of explosives using PSI-MS in negative ion mode was achieved. The addition of ammonium nitrate to isopropanol, in general, enhanced the analyte signal and yielded better ionization stability. Real-world explosive samples were analyzed, which demonstrates one of the potential applications of PSI-MS analysis. Copyright © 2017 John Wiley & Sons, Ltd.
Cautela, Domenico; Laratta, Bruna; Santelli, Francesca; Trifirò, Antonio; Servillo, Luigi; Castaldo, Domenico
2008-07-09
The chemical composition of 30 samples of juice obtained from bergamot (Citrus bergamia Risso and Poit.) fruits is reported and compared to the genuineness parameters adopted by the Association of the Industry of Juice and Nectars (AIJN) for lemon juice. It was found that the compositional differences between the two juices are distinguishable, although with difficulty, and they are not strong enough to detect the fraudulent addition of bergamot juice to lemon juice. Instead, we found that high-performance liquid chromatography (HPLC) analysis of the flavanones naringin, neohesperidin, and neoeriocitrin, which are present in bergamot juice and practically absent in lemon juice, is a convenient way to detect and quantify the fraudulent addition of bergamot juice. The method has been validated by calculating the detection and quantification limits according to Eurachem procedures. Employing neoeriocitrin (detection limit = 0.7 mg/L) and naringin (detection limit = 1 mg/L) as markers, it is possible to detect the addition of bergamot juice to lemon juice at the 1% level. When using neohesperidin as a marker (detection limit = 1 mg/L), the minimal percentage of detectable addition of bergamot juice was about 2%. Finally, it is reported that the flavonoid pattern of bergamot juice is similar to those of chinotto (Citrus myrtifolia Raf) and bitter orange (Citrus aurantium L.) juices and that it is possible to distinguish the three kinds of juices by HPLC analysis.
CONTROLLING SO2 EMISSIONS: A REVIEW OF TECHNOLOGIES
The report describes flue gas desulfurization (FGD) technologies, assesses their applications, and characterizes their performance. Additionally, it describes some of the advancements that have occurred in FGD technologies. Finally, it presents an analysis of the costs associated...
Evaluation of methods for freeway operational analysis.
DOT National Transportation Integrated Search
2001-10-01
The ability to estimate accurately the operational performance of roadway segments has become increasingly critical as we move from a period of new construction into one of operations, maintenance, and, in some cases, reconstruction. In addition to m...
Long-Term Pavement Performance Program
DOT National Transportation Integrated Search
2015-12-01
The LTPP program will yield additional benefits as data are added to the database and as data analysis efforts, some currently planned and some yet to be identified, are completed. Continued monitoring of the test sections that remain in service is...
2011-01-01
field repair technique for enamel-coated steel used in reinforcing concrete structures. In addition to solving real problems, these efforts provide...projects are varied and range from designing and validating repairs, performing residual life analysis, augmenting the current crack growth prediction
2018-02-28
qualification testing to include vibrational, thermal bake and thermal cycling to ensure the experiment would perform as expected during operation on...series of tests for flight qualification. These tests included bake and thermal cycling. In addition, vibrational testing was also accomplished
Dynamic test/analysis correlation using reduced analytical models
NASA Technical Reports Server (NTRS)
Mcgowan, Paul E.; Angelucci, A. Filippo; Javeed, Mehzad
1992-01-01
Test/analysis correlation is an important aspect of the verification of analysis models which are used to predict on-orbit response characteristics of large space structures. This paper presents results of a study using reduced analysis models for performing dynamic test/analysis correlation. The reduced test-analysis model (TAM) has the same number and orientation of DOF as the test measurements. Two reduction methods, static (Guyan) reduction and the Improved Reduced System (IRS) reduction, are applied to the test/analysis correlation of a laboratory truss structure. Simulated test results and modal test data are used to examine the performance of each method. It is shown that selection of DOF to be retained in the TAM is critical when large structural masses are involved. In addition, the use of modal test results may provide difficulties in TAM accuracy even if a large number of DOF are retained in the TAM.
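For reference, a minimal numpy sketch of the static (Guyan) reduction mentioned above: the stiffness and mass matrices are partitioned into retained (measurement) and omitted DOF, and the omitted DOF are condensed out statically. The 4-DOF example system is arbitrary and only illustrates the mechanics of the transformation; it is not the laboratory truss used in the study.

```python
# Static (Guyan) condensation onto a set of retained (measured) DOF.
import numpy as np

def guyan_reduce(K, M, retained):
    """Condense K and M onto the retained DOF via T = [I; -Koo^-1 Koa]."""
    n = K.shape[0]
    retained = list(retained)
    omitted = [i for i in range(n) if i not in retained]

    Koo = K[np.ix_(omitted, omitted)]
    Koa = K[np.ix_(omitted, retained)]
    T = np.vstack([np.eye(len(retained)), -np.linalg.solve(Koo, Koa)])

    order = retained + omitted              # reorder DOF: retained first
    Kp = K[np.ix_(order, order)]
    Mp = M[np.ix_(order, order)]
    return T.T @ Kp @ T, T.T @ Mp @ T

# Example: a 4-DOF spring-mass chain, keeping DOF 0 and 2 as "measured" locations.
K = np.array([[ 2.0, -1.0,  0.0,  0.0],
              [-1.0,  2.0, -1.0,  0.0],
              [ 0.0, -1.0,  2.0, -1.0],
              [ 0.0,  0.0, -1.0,  1.0]])
M = np.eye(4)
Kr, Mr = guyan_reduce(K, M, retained=[0, 2])
print(np.round(Kr, 3))
print(np.round(Mr, 3))
```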
Cazon, Aitor; Kelly, Sarah; Paterson, Abby M; Bibb, Richard J; Campbell, R Ian
2017-09-01
Rheumatoid arthritis is a chronic disease affecting the joints. Treatment can include immobilisation of the affected joint with a custom-fitting splint, which is typically fabricated by hand from low temperature thermoplastic, but the approach poses several limitations. This study focused on the evaluation, by finite element analysis, of additive manufacturing techniques for wrist splints in order to improve upon the typical splinting approach. An additively manufactured/3D printed splint, specifically designed to be built using Objet Connex multi-material technology, and a virtual model of a typical splint, digitised from a real patient-specific splint using three-dimensional scanning, were modelled in computer-aided design software. Forty finite element analysis simulations were performed in flexion-extension and radial-ulnar wrist movements to compare the displacements and the stresses. Simulations have shown that for low severity loads, the additive manufacturing splint has 25%, 76% and 27% less displacement in the main loading direction than the typical splint in flexion, extension and radial directions, respectively, while ulnar values were 75% lower in the traditional splint. For higher severity loads, the flexion and extension movements resulted in deflections that were 24% and 60%, respectively, lower in the additive manufacturing splint. However, for higher severity loading, the radial deflection values were very similar in both splints and ulnar movement deflection was higher in the additive manufacturing splint. A physical prototype of the additive manufacturing splint was also manufactured and was tested under normal conditions to validate the finite element analysis data. Results from static tests showed maximum displacements of 3.46, 0.97, 3.53 and 2.51 mm in the flexion, extension, radial and ulnar directions, respectively. According to these results, the present research argues that, from a technical point of view, the additive manufacturing splint design performs at the same or an even better level in terms of displacements and stress values in comparison to the typical low temperature thermoplastic approach and is therefore a feasible approach to splint design and manufacture.
Preliminary Analysis of the Transient Reactor Test Facility (TREAT) with PROTEUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connaway, H. M.; Lee, C. H.
The neutron transport code PROTEUS has been used to perform preliminary simulations of the Transient Reactor Test Facility (TREAT). TREAT is an experimental reactor designed for the testing of nuclear fuels and other materials under transient conditions. It operated from 1959 to 1994, when it was placed on non-operational standby. The restart of TREAT to support the U.S. Department of Energy’s resumption of transient testing is currently underway. Both single assembly and assembly-homogenized full core models have been evaluated. Simulations were performed using a historic set of WIMS-ANL-generated cross-sections as well as a new set of Serpent-generated cross-sections. To support this work, further analyses were also performed using additional codes in order to investigate particular aspects of TREAT modeling. DIF3D and the Monte-Carlo codes MCNP and Serpent were utilized in these studies. MCNP and Serpent were used to evaluate the effect of geometry homogenization on the simulation results and to support code-to-code comparisons. New meshes for the PROTEUS simulations were created using the CUBIT toolkit, with additional meshes generated via conversion of selected DIF3D models to support code-to-code verifications. All current analyses have focused on code-to-code verifications, with additional verification and validation studies planned. The analysis of TREAT with PROTEUS-SN is an ongoing project. This report documents the studies that have been performed thus far, and highlights key challenges to address in future work.
Data acquisition instruments: Psychopharmacology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartley, D.S. III
This report contains the results of a Direct Assistance Project performed by Lockheed Martin Energy Systems, Inc., for Dr. K. O. Jobson. The purpose of the project was to perform preliminary analysis of the data acquisition instruments used in the field of psychiatry, with the goal of identifying commonalities of data and strategies for handling and using the data in the most advantageous fashion. Data acquisition instruments from 12 sources were provided by Dr. Jobson. Several commonalities were identified and a potentially useful data strategy is reported here. Analysis of the information collected for utility in performing diagnoses is recommended. In addition, further work is recommended to refine the commonalities into a directly useful computer systems structure.
Nakagawa, Hiroko; Yuno, Tomoji; Itho, Kiichi
2009-03-01
Recently, a specific detection method for bacteria, based on flow cytometry with nucleic acid staining, was developed as a function of an automated urine formed-element analyzer for routine urine testing. Here, we performed a basic study of this bacteria analysis method and compared it with the conventional methods used to date: urine sediment analysis, urine Gram staining and quantitative urine culture. The bacteria analysis by flow cytometry with nucleic acid staining showed excellent reproducibility and higher sensitivity than microscopic urinary sediment analysis. Based on an ROC curve analysis with the urine culture method as the reference standard, a cut-off level of 120/microL was defined, with a sensitivity of 85.7% and a specificity of 88.2%. In the scattergram analysis, referenced against urine culture, 80% of the dots for 90% of the rod-positive samples appeared in the area within 30 degrees of the X axis. In addition, one case indicated that flow cytometric bacteria analysis combined with time-series scattergram analysis might help trace the progress of the causative bacteria, information that could be clinically significant. Reporting bacteria information obtained by nucleic acid staining flow cytometry is expected to contribute to rapid diagnosis and treatment of urinary tract infections, as well as to microbiology and clinical chemistry screening, delivering a more efficient approach to urine analysis.
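The cut-off evaluation described above can be illustrated with a small sketch that computes sensitivity and specificity at a chosen bacteria-count threshold against culture as the reference; the counts and culture labels below are invented, and only the 120/microL cut-off mirrors the abstract.

```python
# Sensitivity/specificity of an analyzer count threshold versus culture results.
import numpy as np

counts = np.array([10, 45, 90, 130, 250, 800, 60, 300, 15, 500])  # bacteria/uL (hypothetical)
culture_positive = np.array([0, 0, 0, 1, 1, 1, 0, 1, 0, 1], dtype=bool)

cutoff = 120
predicted_positive = counts >= cutoff

tp = np.sum(predicted_positive & culture_positive)
fn = np.sum(~predicted_positive & culture_positive)
tn = np.sum(~predicted_positive & ~culture_positive)
fp = np.sum(predicted_positive & ~culture_positive)

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```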
An Assessment of the State-of-the-Art in Multidisciplinary Aeromechanical Analyses
2008-01-01
monolithic formulations. In summary, for aerospace structures, partitioned formulations provide fundamental advantages over fully coupled ones, in addition...important frequencies of local analysis directly to global analysis using detailed modeling. Performed ju- diciously, based on a fundamental understanding of...in 2000 has com- prehensively described the problem, and reviewed the status of fundamental understanding, experimental data, and analytical
ApoE Genotype and Alzheimer's Disease in Adults with Down Syndrome: Meta-Analysis.
ERIC Educational Resources Information Center
Prashner, V. P.; Chowdhury, T. A.; Rowe, B. R.; Bain, S. C.
1997-01-01
ApoE gene polymorphism was examined in 100 adults with Down syndrome with and without dementia (Alzheimer's disease) and 346 control subjects. Additionally, a meta-analysis of studies (total N=480 subjects) was performed. Results indicated a similar incidence of the gene across groups, but subjects with the allele tended to an earlier onset of…
NUMERICAL FLOW AND TRANSPORT SIMULATIONS SUPPORTING THE SALTSTONE FACILITY PERFORMANCE ASSESSMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, G.
2009-02-28
The Saltstone Disposal Facility Performance Assessment (PA) is being revised to incorporate requirements of Section 3116 of the Ronald W. Reagan National Defense Authorization Act for Fiscal Year 2005 (NDAA), and updated data and understanding of vault performance since the 1992 PA (Cook and Fowler 1992) and related Special Analyses. A hybrid approach was chosen for modeling contaminant transport from vaults and future disposal cells to exposure points. A higher resolution, largely deterministic, analysis is performed on a best-estimate Base Case scenario using the PORFLOW numerical analysis code. A few additional sensitivity cases are simulated to examine alternative scenarios and parameter settings. Stochastic analysis is performed on a simpler representation of the SDF system using the GoldSim code to estimate uncertainty and sensitivity about the Base Case. This report describes development of PORFLOW models supporting the SDF PA, and presents sample results to illustrate model behaviors and define impacts relative to key facility performance objectives. The SDF PA document, when issued, should be consulted for a comprehensive presentation of results.
Sensor image prediction techniques
NASA Astrophysics Data System (ADS)
Stenger, A. J.; Stone, W. R.; Berry, L.; Murray, T. J.
1981-02-01
The preparation of prediction imagery is a complex, costly, and time consuming process. Image prediction systems which produce a detailed replica of the image area require the extensive Defense Mapping Agency data base. The purpose of this study was to analyze the use of image predictions in order to determine whether a reduced set of more compact image features contains enough information to produce acceptable navigator performance. A job analysis of the navigator's mission tasks was performed. It showed that the cognitive and perceptual tasks he performs during navigation are identical to those performed for the targeting mission function. In addition, the results of the analysis of his performance when using a particular sensor can be extended to the analysis of these mission tasks using any sensor. An experimental approach was used to determine the relationship between navigator performance and the type and amount of information in the prediction image. A number of subjects were given image predictions containing varying levels of scene detail and different image features, and then asked to identify the predicted targets in corresponding dynamic flight sequences over scenes of cultural, terrain, and mixed (both cultural and terrain) content.
Knight, Jo; North, Bernard V; Sham, Pak C; Curtis, David
2003-12-31
This paper presents a method of performing model-free LOD-score based linkage analysis on quantitative traits. It is implemented in the QMFLINK program. The method is used to perform a genome screen on the Framingham Heart Study data. A number of markers that show some support for linkage in our study coincide substantially with those implicated in other linkage studies of hypertension. Although the new method needs further testing on additional real and simulated data sets we can already say that it is straightforward to apply and may offer a useful complementary approach to previously available methods for the linkage analysis of quantitative traits.
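For readers unfamiliar with the quantity being computed, the standard LOD-score definition that underlies LOD-based linkage analysis is the base-10 log likelihood ratio comparing a recombination fraction theta against free recombination; this is textbook background, not notation or code taken from the QMFLINK program.

```latex
% Standard LOD score: likelihood of linkage at recombination fraction theta
% relative to the null of free recombination (theta = 1/2).
\[
  \mathrm{LOD}(\theta) \;=\; \log_{10}\frac{L(\theta)}{L(\theta = \tfrac{1}{2})},
  \qquad 0 \le \theta \le \tfrac{1}{2}
\]
```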
Interdependencies and Causalities in Coupled Financial Networks.
Vodenska, Irena; Aoyama, Hideaki; Fujiwara, Yoshi; Iyetomi, Hiroshi; Arai, Yuta
2016-01-01
We explore the foreign exchange and stock market networks for 48 countries from 1999 to 2012 and propose a model, based on complex Hilbert principal component analysis, for extracting significant lead-lag relationships between these markets. The global set of countries, including large and small countries in Europe, the Americas, Asia, and the Middle East, is contrasted with the limited scopes of targets, e.g., G5, G7 or the emerging Asian countries, adopted by previous works. We construct a coupled synchronization network, perform community analysis, and identify formation of four distinct network communities that are relatively stable over time. In addition to investigating the entire period, we divide the time period into "mild crisis" (1999-2002), "calm" (2003-2006) and "severe crisis" (2007-2012) sub-periods and find that the severe crisis period behavior dominates the dynamics in the foreign exchange-equity synchronization network. We observe that in general the foreign exchange market has predictive power for the global stock market performances. In addition, the United States, German and Mexican markets have forecasting power for the performances of other global equity markets.
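A hedged sketch of the pipeline this abstract describes: return series are mapped to complex analytic signals with a Hilbert transform, a complex correlation matrix is formed, and the phases of its leading eigenvector indicate lead-lag structure. The random input stands in for FX and equity returns; nothing here reproduces the paper's data or exact procedure.

```python
# Complex Hilbert PCA sketch: analytic signals -> complex correlation -> eigenmodes.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(2)
returns = rng.normal(size=(500, 6))           # 500 days x 6 hypothetical markets

# Analytic (complex) signals, standardized per market.
analytic = hilbert(returns, axis=0)
analytic = (analytic - analytic.mean(axis=0)) / analytic.std(axis=0)

# Complex (Hermitian) correlation matrix and its eigen-decomposition.
C = analytic.conj().T @ analytic / analytic.shape[0]
eigvals, eigvecs = np.linalg.eigh(C)

# Phase differences of the leading eigenvector encode lead-lag relationships.
lead_vec = eigvecs[:, -1]
phase_lags = np.angle(lead_vec / lead_vec[0])
print("Leading eigenvalue:", eigvals[-1])
print("Relative phases (radians):", np.round(phase_lags, 3))
```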
Computational study of single-expansion-ramp nozzles with external burning
NASA Astrophysics Data System (ADS)
Yungster, Shaye; Trefny, Charles J.
1992-04-01
A computational investigation of the effects of external burning on the performance of single expansion ramp nozzles (SERN) operating at transonic speeds is presented. The study focuses on the effects of external heat addition and introduces a simplified injection and mixing model based on a control volume analysis. This simplified model permits parametric and scaling studies that would have been impossible to conduct with a detailed CFD analysis. The CFD model is validated by comparing the computed pressure distribution and thrust forces, for several nozzle configurations, with experimental data. Specific impulse calculations are also presented which indicate that external burning performance can be superior to other methods of thrust augmentation at transonic speeds. The effects of injection fuel pressure and nozzle pressure ratio on the performance of SERN nozzles with external burning are described. The results show trends similar to those reported in the experimental study, and provide additional information that complements the experimental data, improving our understanding of external burning flowfields. A study of the effect of scale is also presented. The results indicate that combustion kinetics do not make the flowfield sensitive to scale.
Factor structure and validation of the Attentional Control Scale.
Judah, Matt R; Grant, DeMond M; Mills, Adam C; Lechner, William V
2014-04-01
The Attentional Control Scale (ACS; Derryberry & Reed, 2002) has been used to assess executive control over attention in numerous studies, but no published data have examined the factor structure of the English version. The current studies addressed this need and tested the predictive and convergent validity of the ACS subscales. In Study 1, exploratory factor analysis yielded a two-factor model with Focusing and Shifting subscales. In Study 2, confirmatory factor analysis supported this model and suggested superior fit compared to the factor structure of the Icelandic version (Ólafsson et al., 2011). Study 3 examined correlations between the ACS subscales and measures of working memory, anxiety, and cognitive control. Study 4 examined correlations between the subscales and reaction times on a mixed-antisaccade task, revealing positive correlations for antisaccade performance and prosaccade latency with Focusing scores and between switch trial performance and Shifting scores. Additionally, the findings partially supported unique relationships between Focusing and trait anxiety and between Shifting and depression that have been noted in recent research. Although the results generally support the validity of the ACS, additional research using performance-based tasks is needed.
Scalable Performance Environments for Parallel Systems
NASA Technical Reports Server (NTRS)
Reed, Daniel A.; Olson, Robert D.; Aydt, Ruth A.; Madhyastha, Tara M.; Birkett, Thomas; Jensen, David W.; Nazief, Bobby A. A.; Totty, Brian K.
1991-01-01
As parallel systems expand in size and complexity, the absence of performance tools for these parallel systems exacerbates the already difficult problems of application program and system software performance tuning. Moreover, given the pace of technological change, we can no longer afford to develop ad hoc, one-of-a-kind performance instrumentation software; we need scalable, portable performance analysis tools. We describe an environment prototype based on the lessons learned from two previous generations of performance data analysis software. Our environment prototype contains a set of performance data transformation modules that can be interconnected in user-specified ways. It is the responsibility of the environment infrastructure to hide details of module interconnection and data sharing. The environment is written in C++ with the graphical displays based on X windows and the Motif toolkit. It allows users to interconnect and configure modules graphically to form an acyclic, directed data analysis graph. Performance trace data are represented in a self-documenting stream format that includes internal definitions of data types, sizes, and names. The environment prototype supports the use of head-mounted displays and sonic data presentation in addition to the traditional use of visual techniques.
Alternatives Analysis for the Resumption of Transient Testing Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee Nelson
2013-11-01
An alternatives analysis was performed for the resumption of transient testing. The analysis considered eleven alternatives, including both US and international facilities. A screening process was used to identify two viable alternatives from the original eleven. In addition, the alternatives analysis includes a no-action alternative as required by the National Environmental Policy Act (NEPA). The alternatives considered in this analysis included: (1) restart the Transient Reactor Test Facility (TREAT); (2) modify the Annular Core Research Reactor (ACRR), which includes construction of a new hot cell and installation of a new hodoscope; and (3) no action.
Parametric study of closed wet cooling tower thermal performance
NASA Astrophysics Data System (ADS)
Qasim, S. M.; Hayder, M. J.
2017-08-01
The present study involves experimental and theoretical analysis to evaluate the thermal performance of a modified Closed Wet Cooling Tower (CWCT). The experimental study includes the design, manufacture and testing of a prototype of a modified counter-flow forced-draft CWCT; the modification is based on the addition of packing to the conventional CWCT. A series of experiments was carried out at different operational parameters. In terms of energy analysis, the thermal performance parameters of the tower are: cooling range, tower approach, cooling capacity, thermal efficiency, and heat and mass transfer coefficients. The theoretical study develops Artificial Neural Network (ANN) models to predict the various thermal performance parameters of the tower. Using experimental data for training and testing, the models were trained with a multi-layer back-propagation algorithm while varying all operational parameters stated in the experimental tests.
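A minimal sketch of the kind of ANN model described above, assuming a small multi-layer perceptron trained by back-propagation to map operating conditions to one thermal performance parameter. The feature names, network size and synthetic training data are assumptions for illustration; the study's own data and architecture are not reproduced.

```python
# Back-propagation-trained MLP predicting a cooling-tower performance parameter.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
# Hypothetical inputs: [air flow rate, spray water flow rate, inlet water temp, wet-bulb temp]
X = rng.uniform([0.5, 0.2, 30.0, 18.0], [2.0, 1.0, 45.0, 26.0], size=(200, 4))
# Hypothetical target: cooling range [deg C] from a made-up relation plus noise.
y = 0.15 * X[:, 0] * (X[:, 2] - X[:, 3]) - 0.5 * X[:, 1] + rng.normal(0, 0.1, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
```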
NASA Astrophysics Data System (ADS)
Dagger, Tim; Lürenbaum, Constantin; Schappacher, Falko M.; Winter, Martin
2017-02-01
A modified self-extinguishing time (SET) device which enhances the reproducibility of the results is presented. Pentafluoro(phenoxy)cyclotriphosphazene (FPPN) is investigated as flame retardant electrolyte additive for lithium ion batteries (LIBs) in terms of thermal stability and electrochemical performance. SET measurements and adiabatic reaction calorimetry are applied to determine the flammability and the reactivity of a standard LIB electrolyte containing 5% FPPN. The results reveal that the additive-containing electrolyte is nonflammable for 10 s whereas the commercially available reference electrolyte inflames instantaneously after 1 s of ignition. The onset temperature of the safety enhanced electrolyte is delayed by ≈ 21 °C. Compatibility tests in half cells show that the electrolyte is reductively stable while the cyclic voltammogram indicates oxidative decomposition during the first cycle. Cycling experiments in full cells show improved cycling performance and rate capability, which can be attributed to cathode passivation during the first cycle. Post-mortem analysis of the electrolyte by gas chromatography-mass spectrometry confirms the presence of the additive in high amounts after 501 cycles which ensures enhanced safety of the electrolyte. The investigations present FPPN as stable electrolyte additive that improves the intrinsic safety of the electrolyte and its cycling performance at the same time.
NASA Astrophysics Data System (ADS)
Siewiorek, A.; Malczyk, P.; Sobczak, N.; Sobczak, J. J.; Czulak, A.; Kozera, R.; Gude, M.; Boczkowska, A.; Homa, M.
2016-08-01
To develop an optimised manufacturing method for fly ash-reinforced metal matrix composites, preliminary tests were performed on cenospheres selected from fly ash (FACS) with halloysite nanotube (HNT) additions. Preforms made of FACS with and without the addition of HNTs (5 and 10 wt.%) were infiltrated with pure aluminium (Al) via an adapted gas pressure infiltration process. This paper reveals the influence of HNT addition on the microstructure (analysed by computed tomography and scanning electron microscopy combined with energy-dispersive x-ray spectroscopy), thermal properties (thermal expansion coefficient, thermal conductivity and specific heat) and mechanical properties (hardness and compression test) of the manufactured composites. The analysis of structure-property relationships for the Al/FACS-HNT composites produced shows that the addition of 5 wt.% HNT to the FACS preform yields the best mechanical and structural properties of the investigated composites.
Post Flight Analysis of Optical Specimens from MISSE7
NASA Technical Reports Server (NTRS)
Stewart, Alan F.; Finckenor, Miria
2012-01-01
More than 100 optical specimens were flown on the MISSE7 platform. These included bare substrates in addition to coatings designed to exhibit clearly defined or enhanced sensitivity to the accumulation of contamination. Measurements were performed using spectrophotometers operating from the UV through the IR as well as ellipsometry. Results will be presented in addition to discussion of the best options for design of samples for future exposure experiments.
Agricultural Commodity and Utility Carriers Hours of Service Exemption Analysis.
DOT National Transportation Integrated Search
2010-05-01
The study was conducted in two phases. Phase 1 compares the safety performance of agricultural and non-agricultural carriers for the period 2005 through 2008, and also examines two additional industries: livestock and utility carriers, whose operatio...
Simulation of Attacks for Security in Wireless Sensor Network.
Diaz, Alvaro; Sanchez, Pablo
2016-11-18
The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node's software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work.
Relationships between Contextual and Task Performance and Interrater Agreement: Are There Any?
Díaz-Vilela, Luis F; Delgado Rodríguez, Naira; Isla-Díaz, Rosa; Díaz-Cabrera, Dolores; Hernández-Fernaud, Estefanía; Rosales-Sánchez, Christian
2015-01-01
Work performance is one of the most important dependent variables in Work and Organizational Psychology. The main objective of this paper was to explore the relationships between citizenship performance and task performance measures obtained from different appraisers and their consistency through a seldom-used methodology, intraclass correlation coefficients. Participants were 135 public employees, the total staff in a local government department. Jobs were clustered into job families through a work analysis based on standard questionnaires. A task description technique was used to develop a performance appraisal questionnaire for each job family, with three versions: self-, supervisor-, and peer-evaluation, in addition to a measure of citizenship performance. Only when the self-appraisal bias was controlled did significant correlations appear between task performance ratings. However, intraclass correlation analyses show that only self-rated (contextual and task) performance measures are consistent, while interrater agreement disappears. These results provide some interesting clues about the procedure of appraisal instrument development, the role of appraisers, and the importance of choosing adequate consistency analysis methods.
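Since the abstract's key tool is the intraclass correlation coefficient, a small sketch of one common variant, ICC(2,1) in the Shrout and Fleiss convention (two-way random effects, absolute agreement, single rater), may help; the example ratings are hypothetical and do not reproduce the study's data.

```python
# ICC(2,1) computed from the two-way ANOVA mean squares (Shrout & Fleiss).
import numpy as np

def icc_2_1(ratings):
    """`ratings` is an (n_subjects x k_raters) array of scores."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    subj_means = ratings.mean(axis=1)
    rater_means = ratings.mean(axis=0)

    ss_rows = k * np.sum((subj_means - grand) ** 2)   # between subjects
    ss_cols = n * np.sum((rater_means - grand) ** 2)  # between raters
    ss_total = np.sum((ratings - grand) ** 2)
    ss_err = ss_total - ss_rows - ss_cols             # residual

    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical ratings: 5 employees scored by self, supervisor and a peer.
scores = [[4, 3, 3], [5, 5, 4], [3, 2, 2], [4, 4, 5], [2, 2, 1]]
print(f"ICC(2,1) = {icc_2_1(scores):.3f}")
```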
Performance assessment of an irreversible nano Brayton cycle operating with Maxwell-Boltzmann gas
NASA Astrophysics Data System (ADS)
Açıkkalp, Emin; Caner, Necmettin
2015-05-01
In recent decades, nanotechnology has developed very rapidly, and the thermodynamics of nano-scale cycles should advance at a similar rate. In this paper, a nano-scale irreversible Brayton cycle working with helium is evaluated against different thermodynamic criteria: maximum work output, ecological function, ecological coefficient of performance, exergetic performance criterion and energy efficiency. Thermodynamic analysis was performed for these criteria and the results are presented numerically. In addition, the criteria are compared with each other and the most suitable ones for optimum operating conditions are suggested.
Sensitivity analysis of a ground-water-flow model
Torak, Lynn J.; ,
1991-01-01
A sensitivity analysis was performed on 18 hydrological factors affecting steady-state groundwater flow in the Upper Floridan aquifer near Albany, southwestern Georgia. Computations were based on a calibrated, two-dimensional, finite-element digital model of the stream-aquifer system and the corresponding data inputs. Flow-system sensitivity was analyzed by computing water-level residuals obtained from simulations involving individual changes to each hydrological factor. Hydrological factors to which computed water levels were most sensitive were those that produced the largest change in the sum-of-squares of residuals for the smallest change in factor value. Plots of the sum-of-squares of residuals against multiplier or additive values that effect change in the hydrological factors are used to evaluate the influence of each factor on the simulated flow system. The shapes of these 'sensitivity curves' indicate the importance of each hydrological factor to the flow system. Because the sensitivity analysis can be performed during the preliminary phase of a water-resource investigation, it can be used to identify the types of hydrological data required to accurately characterize the flow system prior to collecting additional data or making management decisions.
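The "sensitivity curve" idea can be sketched in a few lines: scale one hydrological factor by a range of multipliers, re-run the model, and record the sum of squared water-level residuals for each multiplier. The model function below is a trivial stand-in for the finite-element groundwater model, and all numbers are invented for illustration.

```python
# Sensitivity curve: sum of squared residuals versus a factor multiplier.
import numpy as np

observed_heads = np.array([10.2, 9.8, 9.5, 9.1])   # hypothetical observations [m]

def simulate_heads(transmissivity_multiplier):
    # Placeholder for running the calibrated flow model with one factor scaled.
    base = np.array([10.0, 9.9, 9.4, 9.2])
    return base + 0.4 * np.log(transmissivity_multiplier)

multipliers = np.linspace(0.5, 2.0, 16)
ssr = [np.sum((observed_heads - simulate_heads(m)) ** 2) for m in multipliers]

for m, s in zip(multipliers, ssr):
    print(f"multiplier {m:4.2f} -> sum of squared residuals {s:6.3f}")
```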
Steed, Chad A.; Halsey, William; Dehoff, Ryan; ...
2017-02-16
Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.
Space Shuttle Columbia Aging Wiring Failure Analysis
NASA Technical Reports Server (NTRS)
McDaniels, Steven J.
2005-01-01
A Space Shuttle Columbia main engine controller 14 AWG wire short circuited during the launch of STS-93. Post-flight examination divulged that the wire had electrically arced against the head of a nearby bolt. More extensive inspection revealed additional damage to the subject wire, and to other wires as well from the mid-body of Columbia. The shorted wire was to have been constructed from nickel-plated copper conductors surrounded by the polyimide insulation Kapton, top-coated with an aromatic polyimide resin. The wires were analyzed via scanning electron microscope (SEM), energy dispersive X-Ray spectroscopy (EDX), and electron spectroscopy for chemical analysis (ESCA); differential scanning calorimetry (DSC) and thermal gravimetric analysis (TGA) were performed on the polyimide. Exemplar testing under laboratory conditions was performed to replicate the mechanical damage characteristics evident on the failed wires. The exemplar testing included a step test, where, as the name implies, a person stepped on a simulated wire bundle that rested upon a bolt head. Likewise, a shear test that forced a bolt head and a torque tip against a wire was performed to attempt to damage the insulation and conductor. Additionally, a vibration test was performed to determine if a wire bundle would abrade when vibrated against the head of a bolt. Also, an abrasion test was undertaken to determine if the polyimide of the wire could be damaged by rubbing against convolex helical tubing. Finally, an impact test was performed to ascertain if the use of the tubing would protect the wire from the strike of a foreign object.
Meta-analysis of the relationship between TQM and Business Performance
NASA Astrophysics Data System (ADS)
F, Ahmad M.; N, Zakuan; A, Jusoh; Z, Tasir; J, Takala
2013-06-01
A meta-analysis was conducted based on 20 previous works covering 4,040 firms in 16 countries across Asia, Europe and America. Through this meta-analysis, the paper reviews the relationships between TQM and business performance among the regions. The meta-analysis concludes that the average rc is 0.47: Asia (rc=0.54), America (rc=0.43) and Europe (rc=0.38). The analysis also shows that the developed countries of Asia have the greatest impact of TQM (rc=0.56). However, ANOVA and t-test analyses show no significant difference between types of country (developed and developing) or among regions at p=0.05. In addition, the average rc2 is 0.24: Asia (rc2=0.33), America (rc2=0.22) and Europe (rc2=0.15). Meanwhile, rc2 in developing countries (rc2=0.28) is higher than in developed countries (rc2=0.21).
ERIC Educational Resources Information Center
Hong, Hee Kyung
2012-01-01
The purpose of this study was to simultaneously examine relationships between teacher quality and instructional time and mathematics and science achievement of 8th grade cohorts in 18 advanced and developing economies. In addition, the study examined changes in mathematics and science performance across the two groups of economies over time using…
Efficient Caption-Based Retrieval of Multimedia Information
1993-10-09
…in the design of transportable natural language interfaces. Artificial Intelligence, 32 (1987), 173-243. [10] Jones, M. and Eisner, J. A. …systems for multimedia data. They exploit captions on the data and perform natural-language processing of them and English retrieval requests. Some…content analysis of the data is also performed to obtain additional descriptive information. The key to getting this approach to work is sufficiently…
Implementation of the External Quality Assessment Program in Brazil.
Fleury, Marcos Kneip; Menezes, Maria Elizabeth; Correa, José Abol
2017-02-15
The External Quality Assessment (EQA) in Brazil is performed by the National Health Ministry for diseases that are under supervision of Public Health Department. In addition to the government program, the Brazilian Society of Clinical Analysis and the Brazilian Society of Medical Pathology are allowed to provide their programs under the Supervision of National Agency for Sanitary Surveillance (ANVISA) that regulates laboratories to perform EQA programs.
Effect of hydrogen addition on soot formation in an ethylene/air premixed flame
NASA Astrophysics Data System (ADS)
De Iuliis, S.; Maffi, S.; Migliorini, F.; Cignoli, F.; Zizak, G.
2012-03-01
The effect of hydrogen addition to fuel in soot formation and growth mechanisms is investigated in a rich ethylene/air premixed flame. To this purpose, three-angle scattering and extinction measurements are carried out in flames with different hydrogen contents. By applying the Rayleigh-Debye-Gans theory and the fractal-like description, soot concentration and morphology, with the evaluation of radius of gyration, volume-mean diameter and primary particle diameter are retrieved. To derive fractal parameters such as fractal dimension and fractal prefactor to be used for optical measurements, sampling technique and TEM analysis are performed. In addition, data concerning soot morphology obtained from TEM analysis are compared with the optical results. A good agreement in the value of the primary particle diameter between optical and ex-situ measurements is found. Significant effects of hydrogen addition are detected and presented in this work. In particular, hydrogen addition to fuel is responsible for a reduction in soot concentration, radius of gyration and primary particle diameter.
Repair-level analysis for Space Station Freedom
NASA Technical Reports Server (NTRS)
Chadwick, M.; Yaniec, J.
1992-01-01
To assign repair or discard-at-failure designations for orbital replacement units (ORUs) used on Space Station Freedom Electric Power System (SSFEPS), new algorithms and methods were required. Unique parameters, such as upmass costs, extravehicular activity costs and intravehicular activity (IVA) costs specific to Space Station Freedom's maintenance concept were incorporated into the Repair-Level Analysis (RLA). Additional outputs were also required of the SSFEPS RLA that were not required of previous RLAs. These outputs included recommendations for the number of launches that an ORU should be capable of attaining and an economic basis for condemnation rate. These unique parameters were not addressable using existing RLA models: therefore, a new approach was developed. In addition, it was found that preemptive analysis could be performed using spreadsheet-based Boolean expressions to represent the logical condition of the items under analysis.
Analysis and Design of Launch Vehicle Flight Control Systems
NASA Technical Reports Server (NTRS)
Wie, Bong; Du, Wei; Whorton, Mark
2008-01-01
This paper describes the fundamental principles of launch vehicle flight control analysis and design. In particular, the classical concept of "drift-minimum" and "load-minimum" control principles is re-examined and its performance and stability robustness with respect to modeling uncertainties and a gimbal angle constraint is discussed. It is shown that an additional feedback of angle-of-attack or lateral acceleration can significantly improve the overall performance and robustness, especially in the presence of unexpected large wind disturbance. Non-minimum-phase structural filtering of "unstably interacting" bending modes of large flexible launch vehicles is also shown to be effective and robust.
NASA Astrophysics Data System (ADS)
Cushing, Patrick Ryan
This study compared the performance of high school students on laboratory assessments. Thirty-four high school students who were enrolled in the second semester of a regular biology class or had completed the biology course the previous semester participated in this study. They were randomly assigned to examinations of two formats, performance-task and traditional multiple-choice, from two content areas, using a compound light microscope and diffusion. Students were directed to think-aloud as they performed the assessments. Additional verbal data were obtained during interviews following the assessment. The tape-recorded narrative data were analyzed for type and diversity of knowledge and skill categories, and percentage of in-depth processing demonstrated. While overall mean scores on the assessments were low, elicited statements provided additional insight into student cognition. Results indicated that a greater diversity of knowledge and skill categories was elicited by the two microscope assessments and by the two performance-task assessments. In addition, statements demonstrating in-depth processing were coded most frequently in narratives elicited during clinical interviews following the diffusion performance-task assessment. This study calls for individual teachers to design authentic assessment practices and apply them to daily classroom routines. Authentic assessment should be an integral part of the learning process and not merely an end result. In addition, teachers are encouraged to explicitly identify and model, through think-aloud methods, desired cognitive behaviors in the classroom.
The AES total ankle replacement: A mid-term analysis of 93 cases.
Henricson, Anders; Knutson, Kaj; Lindahl, Johan; Rydholm, Urban
2010-06-01
There are few studies concerning specific total ankle arthroplasties. This study reports mid-term survival data for the AES prosthesis. Ninety-three AES ankle arthroplasties were performed by the senior authors. The mean follow-up was 3.5 years. The 5-year survivorship and also the number of simultaneous procedures, reoperations, additional procedures and revisions are analyzed. The 5-year survivorship with revision for any reason as end-point was 90%. Simultaneous procedures were performed in 25 patients, deltoid release and subtalar fusion being the most common. There were seven revisions, one due to loosening, and two due to infection, instability and fractures, respectively. Twenty-seven reoperations or additional procedures were performed in 23 patients with a procedure for malleolar impingement being the most common reoperation, and correction of hindfoot varus being the most common reason for an additional procedure. The AES total ankle replacement seems to be a reasonably safe procedure in experienced hands. Copyright 2009 European Foot and Ankle Society. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Barth, Andrew; Mamich, Harvey; Hoelscher, Brian
2015-01-01
The first test flight of the Orion Multi-Purpose Crew Vehicle presented additional challenges for guidance, navigation and control as compared to a typical re-entry from the International Space Station or other Low Earth Orbit. An elevated re-entry velocity and steeper flight path angle were chosen to achieve aero-thermal flight test objectives. New IMUs, a GPS receiver, and baro altimeters were flight qualified to provide the redundant navigation needed for human space flight. The guidance and control systems must manage the vehicle lift vector in order to deliver the vehicle to a precision coastal water landing, while operating within aerodynamic load, reaction control system, and propellant constraints. Extensive pre-flight six degree-of-freedom analysis was performed that showed mission success for the nominal mission as well as in the presence of sensor and effector failures. Post-flight reconstruction analysis of the test flight is presented in this paper to show whether all performance metrics were met and to establish how well the pre-flight analysis predicted the in-flight performance.
NASA Technical Reports Server (NTRS)
Csank, Jeffrey; Zinnecker, Alicia
2014-01-01
Systems analysis involves steady-state simulations of combined components to evaluate the steady-state performance, weight, and cost of a system; dynamic considerations are not included until later in the design process. The Dynamic Systems Analysis task, under NASA's Fixed Wing project, is developing the capability for assessing dynamic issues at earlier stages during systems analysis. To provide this capability the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) has been developed to design a single flight condition controller (defined as altitude and Mach number) and, ultimately, provide an estimate of the closed-loop performance of the engine model. This tool has been integrated with the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS 40k) engine model to demonstrate the additional information TTECTrA makes available for dynamic systems analysis. This dynamic data can be used to evaluate the trade-off between performance and safety, which could not be done with steady-state systems analysis data. TTECTrA has been designed to integrate with any turbine engine model that is compatible with the MATLAB/Simulink (The MathWorks, Inc.) environment.
NASA Astrophysics Data System (ADS)
Mulyati, S.; Armando, M. A.; Mawardi, H.; Azmi, F. A.; Pratiwi, W. P.; Fadzlina, A.; Akbar, R.; Syawaliah
2018-03-01
This paper reports the effects of rice husk nanosilica addition on the performance of polyethersulfone (PES) membranes. The PES membrane was fabricated using N-methyl-2-pyrrolidone (NMP) as a solvent and rice husk nanosilica as a modifying agent, and the influence of the additive on the characteristics and performance of the membrane was studied. Scanning Electron Microscopy (SEM) analysis confirmed that the manufactured membrane has an asymmetric morphology consisting of two layers: a thin upper layer and a porous bottom layer. The addition of 5% nanosilica produced a PES membrane with larger pores than the pristine PES. The pure water flux of the nanosilica-modified membranes was greater than that of the unmodified PES membrane. The performance of all membranes was evaluated on humic acid removal. The highest selectivity was shown by the pure PES membrane; the introduction of the rice husk nanosilica additive reduced the membrane's selectivity toward humic acid in the feed solution. This is attributed to pore enlargement and the enhanced hydrophilicity of the membrane after modification with rice husk nanosilica.
Prototyping and Characterization of an Adjustable Skew Angle Single Gimbal Control Moment Gyroscope
2015-03-01
performance, and an analysis of the test results is provided. In addition to the standard battery of CMG performance tests that were planned, a...objectives for this new CMG is to provide comparable performance to the Andrews CMGs, the values in Table 1 will be used for output torque comparison...essentially fixed at 53.4°. This specific skew angle value is not the problem, as this is one commonly used CMG skew angle for satellite systems. The real
Parametric study on the performance of automotive MR shock absorbers
NASA Astrophysics Data System (ADS)
Gołdasz, J.; Dzierżek, S.
2016-09-01
The paper contains the results of a parametric study to explore the influence of various quantities on the performance range of semi-active automotive shock absorbers using magnetorheological (MR) fluid under steady-state and transient excitations. The analysis was performed with simulated data and using a standard single-tube shock absorber configuration with a single-gap MR valve. Additionally, the impact of material variables and valve geometry was examined as these parameters were varied and the damper's dynamic range was studied.
NASA Astrophysics Data System (ADS)
Hild, Jutta; Krüger, Wolfgang; Brüstle, Stefan; Trantelle, Patrick; Unmüßig, Gabriel; Voit, Michael; Heinze, Norbert; Peinsipp-Byma, Elisabeth; Beyerer, Jürgen
2017-05-01
Real-time motion video analysis is a challenging and exhausting task for the human observer, particularly in safety and security critical domains. Hence, customized video analysis systems providing functions for the analysis of subtasks like motion detection or target tracking are welcome. While such automated algorithms relieve the human operators from performing basic subtasks, they impose additional interaction duties on them. Prior work shows that, e.g., for interaction with target tracking algorithms, a gaze-enhanced user interface is beneficial. In this contribution, we present an investigation on interaction with an independent motion detection (IDM) algorithm. Besides identifying an appropriate interaction technique for the user interface - again, we compare gaze-based and traditional mouse-based interaction - we focus on the benefit an IDM algorithm might provide for a UAS video analyst. In a pilot study, we exposed ten subjects to the task of moving target detection in UAS video data twice, once performing with automatic support, once performing without it. We compare the two conditions considering performance in terms of effectiveness (correct target selections). Additionally, we report perceived workload (measured using the NASA-TLX questionnaire) and user satisfaction (measured using the ISO 9241-411 questionnaire). The results show that a combination of gaze input and automated IDM algorithm provides valuable support for the human observer, increasing the number of correct target selections up to 62% and reducing workload at the same time.
NASA Astrophysics Data System (ADS)
Muhlen, Luis S. W.; Najafi, Behzad; Rinaldi, Fabio; Marchesi, Renzo
2014-04-01
Solar troughs are amongst the most commonly used technologies for collecting solar thermal energy and any attempt to increase the performance of these systems is welcomed. In the present study, a parabolic solar trough is simulated using a one dimensional finite element model in which the energy balances for the fluid, the absorber and the envelope in each element are performed. The developed model is then validated using the available experimental data. A sensitivity analysis is performed in the next step in order to study the effect of changing the type of the working fluid and the corresponding Reynolds number on the overall performance of the system. The potential improvement due to the addition of a shield on the upper half of the annulus and enhancing the convection coefficient of the heat transfer fluid is also studied.
Humphries, Stephen M; Yagihashi, Kunihiro; Huckleberry, Jason; Rho, Byung-Hak; Schroeder, Joyce D; Strand, Matthew; Schwarz, Marvin I; Flaherty, Kevin R; Kazerooni, Ella A; van Beek, Edwin J R; Lynch, David A
2017-10-01
Purpose To evaluate associations between pulmonary function and both quantitative analysis and visual assessment of thin-section computed tomography (CT) images at baseline and at 15-month follow-up in subjects with idiopathic pulmonary fibrosis (IPF). Materials and Methods This retrospective analysis of preexisting anonymized data, collected prospectively between 2007 and 2013 in a HIPAA-compliant study, was exempt from additional institutional review board approval. The extent of lung fibrosis at baseline inspiratory chest CT in 280 subjects enrolled in the IPF Network was evaluated. Visual analysis was performed by using a semiquantitative scoring system. Computer-based quantitative analysis included CT histogram-based measurements and a data-driven textural analysis (DTA). Follow-up CT images in 72 of these subjects were also analyzed. Univariate comparisons were performed by using Spearman rank correlation. Multivariate and longitudinal analyses were performed by using a linear mixed model approach, in which models were compared by using asymptotic χ² tests. Results At baseline, all CT-derived measures showed moderate significant correlation (P < .001) with pulmonary function. At follow-up CT, changes in DTA scores showed significant correlation with changes in both forced vital capacity percentage predicted (ρ = -0.41, P < .001) and diffusing capacity for carbon monoxide percentage predicted (ρ = -0.40, P < .001). Asymptotic χ² tests showed that inclusion of DTA score significantly improved fit of both baseline and longitudinal linear mixed models in the prediction of pulmonary function (P < .001 for both). Conclusion When compared with semiquantitative visual assessment and CT histogram-based measurements, DTA score provides additional information that can be used to predict diminished function. Automatic quantification of lung fibrosis at CT yields an index of severity that correlates with visual assessment and functional change in subjects with IPF. © RSNA, 2017.
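The following Python sketch, run on simulated data, illustrates the two statistical steps named in the abstract: a Spearman correlation between score changes and FVC changes, and an asymptotic χ² (likelihood-ratio) comparison of nested linear mixed models. Variable names and values are hypothetical, not the study data.

    # Sketch of the two analysis steps on simulated data: (1) Spearman correlation
    # of per-subject changes, (2) likelihood-ratio comparison of nested mixed models.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy import stats

    rng = np.random.default_rng(1)
    n_subj, n_visits = 72, 2
    subj = np.repeat(np.arange(n_subj), n_visits)
    dta = rng.normal(30, 10, n_subj * n_visits)                 # hypothetical DTA score
    fvc = 80 - 0.6 * dta + rng.normal(0, 5, n_subj * n_visits)  # FVC %predicted
    data = pd.DataFrame({"subject": subj, "dta": dta, "fvc": fvc})

    # (1) Spearman rank correlation on per-subject changes
    chg = data.groupby("subject").diff().dropna()
    rho, p = stats.spearmanr(chg["dta"], chg["fvc"])
    print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")

    # (2) Nested mixed models: does adding the texture score improve fit?
    m0 = smf.mixedlm("fvc ~ 1", data, groups=data["subject"]).fit(reml=False)
    m1 = smf.mixedlm("fvc ~ dta", data, groups=data["subject"]).fit(reml=False)
    lr = 2 * (m1.llf - m0.llf)
    print(f"LR chi-square = {lr:.1f}, p = {stats.chi2.sf(lr, 1):.3g}")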
An experimental analysis of a doped lithium fluoride direct absorption solar receiver
NASA Technical Reports Server (NTRS)
Kesseli, James; Pollak, Tom; Lacy, Dovie
1988-01-01
An experimental analysis of two key elements of a direct absorption solar receiver for use with Brayton solar dynamic systems was conducted. Experimental data are presented on LiF crystals doped with dysprosium, samarium, and cobalt fluorides. In addition, a simulation of the cavity/window environment was performed and a posttest inspection was conducted to evaluate chemical reactivity, transmissivity, and condensation rate.
ERIC Educational Resources Information Center
King, Thomas; McKean, Cristina; Rush, Robert; Westrupp, Elizabeth M.; Mensah, Fiona K.; Reilly, Sheena; Law, James
2017-01-01
Maternal education captured at a single time point is commonly employed as a predictor of a child's cognitive development. In this article, we ask what bearing the acquisition of additional qualifications has upon reading performance in middle childhood. This was a secondary analysis of the United Kingdom's Millennium Cohort Study, a cohort of…
NASA Technical Reports Server (NTRS)
Parzen, Benjamin
1992-01-01
The theory of oscillator analysis in the immittance domain should be read in conjunction with the additional theory presented here. The combined theory enables the computer simulation of the steady state oscillator. The simulation makes the calculation of the oscillator total steady state performance practical, including noise at all oscillator locations. Some specific precision oscillators are analyzed.
Aging analysis of high performance FinFET flip-flop under Dynamic NBTI simulation configuration
NASA Astrophysics Data System (ADS)
Zainudin, M. F.; Hussin, H.; Halim, A. K.; Karim, J.
2018-03-01
Negative-bias temperature instability (NBTI) is a mechanism that degrades the main electrical parameters of a circuit, especially its performance. At present, most circuit designs focus only on high performance without considering circuit reliability and robustness. In this paper, the main performance metrics of a high-performance FinFET flip-flop, such as delay time and power, were studied in the presence of NBTI degradation. The aging analysis was verified using a 16 nm High Performance Predictive Technology Model (PTM) with different commands available in Synopsys HSPICE. The results show that longer dynamic NBTI simulation produces the largest increase in gate delay and reduction in average power, from the fresh simulation to the aged stress time under nominal conditions. In addition, the circuit performance under varied stress conditions, such as temperature and negative gate stress bias, was also studied.
Twenty-five years of sport performance research in the Journal of Sports Sciences.
Nevill, Alan; Atkinson, Greg; Hughes, Mike
2008-02-15
In this historical review covering the past 25 years, we reflect on the content of manuscripts relevant to the Sport Performance section of the Journal of Sports Sciences. Due to the wide diversity of sport performance research, the remit of the Sport Performance section has been broad and includes mathematical and statistical evaluation of competitive sports performances, match- and notation-analysis, talent identification, training and selection or team organization. In addition, due to the academic interests of its section editors, they adopted a quality-assurance role for the Sport Performance section, invariably communicated through key editorials that subsequently shaped the editorial policy of the Journal. Key high-impact manuscripts are discussed, providing readers with some insight into what might lead an article to become a citation "classic". Finally, landmark articles in the areas of "science and football" and "notation analysis" are highlighted, providing further insight into how such articles have contributed to the development of sport performance research in general and the Journal of Sports Sciences in particular.
Millard, Heather A Towle; Millard, Ralph P; Constable, Peter D; Freeman, Lyn J
2014-02-01
To determine the relationships among traditional and laparoscopic surgical skills, spatial analysis skills, and video gaming proficiency of third-year veterinary students. Prospective, randomized, controlled study. A convenience sample of 29 third-year veterinary students. The students had completed basic surgical skills training with inanimate objects but had no experience with soft tissue, orthopedic, or laparoscopic surgery; the spatial analysis test; or the video games that were used in the study. Scores for traditional surgical, laparoscopic, spatial analysis, and video gaming skills were determined, and associations among these were analyzed by means of Spearman's rank order correlation coefficient (rs). A significant positive association (rs = 0.40) was detected between summary scores for video game performance and laparoscopic skills, but not between video game performance and traditional surgical skills scores. Spatial analysis scores were positively (rs = 0.30) associated with video game performance scores; however, that result was not significant. Spatial analysis scores were not significantly associated with laparoscopic surgical skills scores. Traditional surgical skills scores were not significantly associated with laparoscopic skills or spatial analysis scores. Results of this study indicated video game performance of third-year veterinary students was predictive of laparoscopic but not traditional surgical skills, suggesting that laparoscopic performance may be improved with video gaming experience. Additional studies would be required to identify methods for improvement of traditional surgical skills.
Berretta, Massimiliano; Micek, Agnieszka; Lafranconi, Alessandra; Rossetti, Sabrina; Di Francia, Raffaele; De Paoli, Paolo; Rossi, Paola; Facchini, Gaetano
2018-04-17
Coffee consumption has been associated with numerous cancers, but evidence on ovarian cancer risk is controversial. Therefore, we performed a meta-analysis on prospective cohort studies in order to review the evidence on coffee consumption and risk of ovarian cancer. Studies were identified through searching the PubMed and MEDLINE databases up to March 2017. Risk estimates were retrieved from the studies, and dose-response analysis was modelled by using restricted cubic splines. Additionally, a stratified analysis by menopausal status was performed. A total of 8 studies were eligible for the dose-response meta-analysis. Studies included in the analysis comprised 787,076 participants and 3,541 ovarian cancer cases. The results showed that coffee intake was not associated with ovarian cancer risk (RR = 1.06, 95% CI: 0.89, 1.26). Stratified and subgroup analyses showed consistent results. This comprehensive meta-analysis did not find evidence of an association between the consumption of coffee and risk of ovarian cancer.
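A bare-bones random-effects (DerSimonian-Laird) pooling of relative risks, the kind of calculation behind a pooled RR with 95% CI, is sketched below in Python; the study estimates are invented, not the eight cohorts analyzed here.

    # Illustrative DerSimonian-Laird pooling of relative risks (invented inputs).
    import numpy as np

    rr = np.array([1.10, 0.95, 1.20, 0.88, 1.05])
    lo = np.array([0.85, 0.70, 0.95, 0.60, 0.80])
    hi = np.array([1.42, 1.29, 1.52, 1.29, 1.38])

    y = np.log(rr)
    se = (np.log(hi) - np.log(lo)) / (2 * 1.96)   # SE of log RR from the 95% CI
    w = 1 / se**2                                  # fixed-effect weights

    # Between-study variance tau^2 (DerSimonian-Laird estimator)
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)
    k = len(y)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_re = 1 / (se**2 + tau2)
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1 / np.sum(w_re))
    print(f"pooled RR = {np.exp(y_re):.2f} "
          f"(95% CI {np.exp(y_re - 1.96*se_re):.2f}, {np.exp(y_re + 1.96*se_re):.2f})")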
Volumetric neuroimage analysis extensions for the MIPAV software package.
Bazin, Pierre-Louis; Cuzzocreo, Jennifer L; Yassa, Michael A; Gandler, William; McAuliffe, Matthew J; Bassett, Susan S; Pham, Dzung L
2007-09-15
We describe a new collection of publicly available software tools for performing quantitative neuroimage analysis. The tools perform semi-automatic brain extraction, tissue classification, Talairach alignment, and atlas-based measurements within a user-friendly graphical environment. They are implemented as plug-ins for MIPAV, a freely available medical image processing software package from the National Institutes of Health. Because the plug-ins and MIPAV are implemented in Java, both can be utilized on nearly any operating system platform. In addition to the software plug-ins, we have also released a digital version of the Talairach atlas that can be used to perform regional volumetric analyses. Several studies are conducted applying the new tools to simulated and real neuroimaging data sets.
NASA Technical Reports Server (NTRS)
Hague, D. S.; Rozendaal, H. L.
1977-01-01
Program NSEG is a rapid mission analysis code based on the use of approximate flight path equations of motion. Equation form varies with the segment type, for example, accelerations, climbs, cruises, descents, and decelerations. Realistic and detailed vehicle characteristics are specified in tabular form. In addition to its mission performance calculation capabilities, the code also contains extensive flight envelope performance mapping capabilities. For example, rate-of-climb, turn rates, and energy maneuverability parameter values may be mapped in the Mach-altitude plane. Approximate take off and landing analyses are also performed. At high speeds, centrifugal lift effects are accounted for. Extensive turbojet and ramjet engine scaling procedures are incorporated in the code.
Airfoil Vibration Dampers program
NASA Technical Reports Server (NTRS)
Cook, Robert M.
1991-01-01
The Airfoil Vibration Damper program has consisted of an analysis phase and a testing phase. During the analysis phase, a state-of-the-art computer code was developed, which can be used to guide designers in the placement and sizing of friction dampers. The use of this computer code was demonstrated by performing representative analyses on turbine blades from the High Pressure Oxidizer Turbopump (HPOTP) and High Pressure Fuel Turbopump (HPFTP) of the Space Shuttle Main Engine (SSME). The testing phase of the program consisted of performing friction damping tests on two different cantilever beams. Data from these tests provided an empirical check on the accuracy of the computer code developed in the analysis phase. Results of the analysis and testing showed that the computer code can accurately predict the performance of friction dampers. In addition, a valuable set of friction damping data was generated, which can be used to aid in the design of friction dampers, as well as provide benchmark test cases for future code developers.
Vojdeman, Fie Juhl; Van't Veer, Mars B; Tjønnfjord, Geir E; Itälä-Remes, Maija; Kimby, Eva; Polliack, Aaron; Wu, Ka L; Doorduijn, Jeanette K; Alemayehu, Wendimagegn G; Wittebol, Shulamiet; Kozak, Tomas; Walewski, Jan; Abrahamse-Testroote, Martine C J; van Oers, Marinus H J; Geisler, Christian Hartmann
2017-03-01
In the HOVON68 CLL trial, patients 65 to 75 years of age had no survival benefit from the addition of low-dose alemtuzumab to fludarabine and cyclophosphamide (FC) in contrast to younger patients. The reasons are explored in this 5-year trial update using both survival analysis and competing risk analysis on non-CLL-related mortality. Elderly FCA patients died more frequently from causes not related to CLL, and more often related to comorbidity (mostly cardiovascular) than to infection. In a Cox multivariate analysis, del(17p), performance status >0, and comorbidity were associated with a higher non-CLL-related mortality in the elderly independent of the treatment modality. Thus, while the 'fit' elderly with no comorbidity or performance status of 0 might potentially benefit from chemo-immunotherapy with FC, caution is warranted, when considering alemtuzumab treatment in elderly patients with cardiovascular comorbidity.
Investigation of Periodic Nuclear Decay Data with Spectral Analysis Techniques
NASA Astrophysics Data System (ADS)
Javorsek, D.; Sturrock, P.; Buncher, J.; Fischbach, E.; Gruenwald, T.; Hoft, A.; Horan, T.; Jenkins, J.; Kerford, J.; Lee, R.; Mattes, J.; Morris, D.; Mudry, R.; Newport, J.; Petrelli, M.; Silver, M.; Stewart, C.; Terry, B.; Willenberg, H.
2009-12-01
We provide the results from a spectral analysis of nuclear decay experiments displaying unexplained periodic fluctuations. The analyzed data was from 56Mn decay reported by the Children's Nutrition Research Center in Houston, 32Si decay reported by an experiment performed at the Brookhaven National Laboratory, and 226Ra decay reported by an experiment performed at the Physikalisch-Technische-Bundesanstalt in Germany. All three data sets possess the same primary frequency mode consisting of an annual period. Additionally a spectral comparison of the local ambient temperature, atmospheric pressure, relative humidity, Earth-Sun distance, and the plasma speed and latitude of the heliospheric current sheet (HCS) was performed. Following analysis of these six possible causal factors, their reciprocals, and their linear combinations, a possible link between nuclear decay rate fluctuations and the linear combination of the HCS latitude and 1/R motivates searching for a possible mechanism with such properties.
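The abstract does not state which spectral estimator was used; the sketch below uses the Lomb-Scargle periodogram, a common choice for unevenly sampled series, to search simulated count-rate data for a weak annual modulation. The data and the ~365-day signal are simulated, not the BNL or PTB measurements.

    # Periodogram search for an annual modulation in unevenly sampled data.
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(2)
    t_days = np.sort(rng.uniform(0, 5 * 365.25, 800))            # irregular sampling
    rate = 1.0 + 1e-3 * np.cos(2 * np.pi * t_days / 365.25) + rng.normal(0, 5e-4, 800)

    periods = np.linspace(30, 800, 2000)                          # trial periods, days
    ang_freqs = 2 * np.pi / periods                               # rad/day
    power = lombscargle(t_days, rate - rate.mean(), ang_freqs)

    best = periods[np.argmax(power)]
    print(f"strongest periodicity near {best:.0f} days")          # expect ~365 d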
A Shot Number Based Approach to Performance Analysis in Table Tennis
Yoshida, Kazuto; Yamada, Koshi
2017-01-01
The current study proposes a novel approach that improves the conventional performance analysis in table tennis by introducing the concept of frequency, or the number of shots, of each shot number. The improvements over the conventional method are as follows: better accuracy of the evaluation of skills and tactics of players, additional insights into scoring and returning skills and ease of understanding the results with a single criterion. The performance analysis of matches played at the 2012 Summer Olympics in London was conducted using the proposed method. The results showed some effects of the shot number and gender differences in table tennis. Furthermore, comparisons were made between Chinese players and players from other countries, which shed light on the skills and tactics of the Chinese players. The present findings demonstrate that the proposed method provides useful information and has some advantages over the conventional method. PMID:28210334
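A toy Python illustration of the core idea is given below: per-shot-number scoring rates are weighted by how often each shot number actually occurs in play. All counts are invented, not data from the 2012 Olympic matches.

    # Toy example: scoring effectiveness per shot number, weighted by frequency.
    shots = {  # shot_number: (points_won_on_this_shot, times_this_shot_was_played)
        1: (5, 120),   # serve
        2: (8, 115),   # receive
        3: (30, 100),
        4: (22, 80),
        5: (18, 55),
    }

    total_shots = sum(freq for _, freq in shots.values())
    weighted_rate = 0.0
    for shot_no, (won, freq) in sorted(shots.items()):
        rate = won / freq                      # scoring rate at this shot number
        weight = freq / total_shots            # how often this shot number occurs
        weighted_rate += rate * weight
        print(f"shot {shot_no}: scoring rate {rate:.2f}, frequency {weight:.2f}")

    print(f"frequency-weighted scoring rate: {weighted_rate:.2f}")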
Wind Energy Conversion System Analysis Model (WECSAM) computer program documentation
NASA Astrophysics Data System (ADS)
Downey, W. T.; Hendrick, P. L.
1982-07-01
Described is a computer-based wind energy conversion system analysis model (WECSAM) developed to predict the technical and economic performance of wind energy conversion systems (WECS). The model is written in CDC FORTRAN V. The version described accesses a data base containing wind resource data, application loads, WECS performance characteristics, utility rates, state taxes, and state subsidies for a six state region (Minnesota, Michigan, Wisconsin, Illinois, Ohio, and Indiana). The model is designed for analysis at the county level. The computer model includes a technical performance module and an economic evaluation module. The modules can be run separately or together. The model can be run for any single user-selected county within the region or looped automatically through all counties within the region. In addition, the model has a restart capability that allows the user to modify any data-base value written to a scratch file prior to the technical or economic evaluation.
NASA Technical Reports Server (NTRS)
Trolinger, J. D.; Lal, R. B.; Batra, A. K.; Mcintosh, D.
1991-01-01
The first International Microgravity Laboratory (IML-1), scheduled for spaceflight in early 1992 includes a crystal-growth-from-solution experiment which is equipped with an array of optical diagnostics instrumentation which includes transmission and reflection holography, tomography, schlieren, and particle image displacement velocimetry. During the course of preparation for this spaceflight experiment we have performed both experimentation and analysis for each of these diagnostics. In this paper we describe the work performed in the development of holographic particle image displacement velocimetry for microgravity application which will be employed primarily to observe and quantify minute convective currents in the Spacelab environment and also to measure the value of g. Additionally, the experiment offers a unique opportunity to examine physical phenomena which are normally negligible and not observable. A preliminary analysis of the motion of particles in fluid was performed and supporting experiments were carried out. The results of the analysis and the experiments are reported.
Initial Risk Analysis and Decision Making Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engel, David W.
2012-02-01
Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.
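A minimal Monte Carlo sketch of the uncertainty-propagation step described above is given below: sample uncertain technical and financial factors, compute a simple measure of expected profitability (here an NPV), and summarize its spread as risk. All distributions and numbers are placeholders, not the prototype's inputs.

    # Monte Carlo propagation of uncertain factors into a profitability measure.
    import numpy as np

    rng = np.random.default_rng(3)
    n_draws = 100_000

    capital_cost = rng.triangular(0.8e9, 1.0e9, 1.4e9, n_draws)      # $
    capture_rate = rng.uniform(0.85, 0.95, n_draws)                  # fraction of CO2 captured
    carbon_price = rng.normal(45, 10, n_draws).clip(min=0)           # $/tonne
    op_cost = rng.normal(60e6, 10e6, n_draws)                        # $/yr
    annual_co2 = 4.0e6                                               # tonnes/yr
    rate, years = 0.08, 20

    annuity = (1 - (1 + rate) ** -years) / rate
    annual_net = capture_rate * annual_co2 * carbon_price - op_cost
    npv = annual_net * annuity - capital_cost

    print(f"mean NPV   : {npv.mean()/1e9:6.2f} B$")
    print(f"5th pct    : {np.percentile(npv, 5)/1e9:6.2f} B$")
    print(f"P(NPV < 0) : {np.mean(npv < 0):.2f}")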
Multilevel microvibration test for performance predictions of a space optical load platform
NASA Astrophysics Data System (ADS)
Li, Shiqi; Zhang, Heng; Liu, Shiping; Wang, Yue
2018-05-01
This paper presents a framework for the multilevel microvibration analysis and test of a space optical load platform. The test framework is conducted on three levels: instrument, subsystem, and system. Disturbance source experimental investigations are performed to evaluate the vibration amplitude and study the vibration mechanism. Transfer characteristics of the space camera are validated by a subsystem test, which allows the calculation of transfer functions from various disturbance sources to optical performance outputs. In order to identify the influence of the source on the spacecraft performance, a system-level microvibration measurement test has been performed on the ground. From the time-domain and spectrum analysis of the multilevel microvibration tests, we concluded that the disturbance source has a significant effect at its installation position. After being transmitted through mechanical links, the residual vibration reduces to a background noise level. In addition, the angular microvibration of the platform jitter is mainly concentrated in rotation about the y-axis. This work is applied to a practical application involving a high-resolution satellite camera system.
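A Python sketch of estimating a disturbance-to-response transfer function from measured time series, H(f) = Pxy/Pxx with Welch-averaged spectra, follows; the signals are simulated (a generic second-order resonance), not the camera test data.

    # Transfer function estimate from simulated disturbance and response signals.
    import numpy as np
    from scipy import signal

    fs = 2000.0                                 # sample rate, Hz
    t = np.arange(0, 30, 1 / fs)
    rng = np.random.default_rng(4)
    x = rng.normal(0, 1, t.size)                # disturbance source (e.g. wheel force)

    # Simple resonant path standing in for the structure (2nd-order system, ~90 Hz)
    wn, zeta = 2 * np.pi * 90, 0.02
    sys = signal.TransferFunction([1.0], [1.0, 2 * zeta * wn, wn ** 2])
    _, y, _ = signal.lsim(sys, U=x, T=t)
    y = y + rng.normal(0, 1e-6, t.size)         # measurement noise

    f, Pxy = signal.csd(x, y, fs=fs, nperseg=4096)
    _, Pxx = signal.welch(x, fs=fs, nperseg=4096)
    H = np.abs(Pxy / Pxx)
    print(f"peak response near {f[np.argmax(H)]:.0f} Hz")   # expect ~90 Hz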
Structural Analysis of Women’s Heptathlon
Gassmann, Freya; Fröhlich, Michael; Emrich, Eike
2016-01-01
The heptathlon comprises the results of seven single disciplines, assuming an equal influence from each discipline, depending on the measured performance. Data analysis was based on the data recorded for the individual performances of the 10 winning heptathletes in the World Athletics Championships from 1987 to 2013 and the Olympic Games from 1988 to 2012. In addition to descriptive analysis methods, correlations, bivariate and multivariate linear regressions, and panel data regressions were used. The transformation of the performances from seconds, centimeters, and meters into points showed that the individual disciplines do not equally affect the overall competition result. The currently valid conversion formula for the run, jump, and throw disciplines prefers the sprint and jump disciplines but penalizes the athletes performing in the 800 m run, javelin throw, and shotput disciplines. Furthermore, 21% to 48% of the variance of the sum of points can be attributed to the performances in the disciplines of long jump, 200 m sprint, 100 m hurdles, and high jump. To balance the effects of the single disciplines in the heptathlon, the formula to calculate points should be reevaluated. PMID:29910260
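The combined-events conversion mentioned above follows power-law formulas: track events use P = a(b - T)^c and jumps/throws use P = a(M - b)^c. The Python sketch below shows the structure; the coefficients are the commonly published women's 200 m and long jump values and should be checked against the current IAAF/World Athletics scoring tables before reuse.

    # Heptathlon point conversion: power-law scoring formulas (coefficients
    # shown are commonly published values; verify against official tables).
    def track_points(a, b, c, time_s):
        return int(a * (b - time_s) ** c) if time_s < b else 0

    def field_points(a, b, c, mark):      # mark in cm for jumps, m for throws
        return int(a * (mark - b) ** c) if mark > b else 0

    p200 = track_points(4.99087, 42.5, 1.81, 23.5)     # a 23.50 s 200 m
    plj  = field_points(0.188807, 210.0, 1.41, 650.0)  # a 6.50 m long jump (650 cm)
    print(f"200 m: {p200} points, long jump: {plj} points")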
Application of Microchip Electrophoresis for Clinical Tests
NASA Astrophysics Data System (ADS)
Yatsushiro, Shouki; Kataoka, Masatoshi
Microchip electrophoresis has recently attracted much attention in the field of nucleic acid analysis due to its high efficiency, ease of operation, low consumption of samples and reagents, and relatively low cost. In addition, its applications have expanded beyond DNA to the analysis of RNA, proteins, sugar chains, and cellular function. In this report, we present high-performance monitoring systems for human blood glucose levels and α-amylase activity in human plasma using microchip electrophoresis.
Niu, Guanghui; Shi, Qi; Xu, Mingjun; Lai, Hongjun; Lin, Qingyu; Liu, Kunping; Duan, Yixiang
2015-10-01
In this article, a novel and alternative method of laser-induced breakdown spectroscopy (LIBS) analysis for liquid sample is proposed, which involves the removal of metal ions from a liquid to a solid substrate using a cost-efficient adsorbent, dehydrated carbon, obtained using a dehydration reaction. Using this new technique, researchers can detect trace metal ions in solutions qualitatively and quantitatively, and the drawbacks of performing liquid analysis using LIBS can be avoided because the analysis is performed on a solid surface. To achieve better performance using this technique, we considered parameters potentially influencing both adsorption performance and LIBS analysis. The calibration curves were evaluated, and the limits of detection obtained for Cu(2+), Pb(2+), and Cr(3+) were 0.77, 0.065, and 0.46 mg/L, respectively, which are better than those in the previous studies. In addition, compared to other absorbents, the adsorbent used in this technique is much cheaper in cost, easier to obtain, and has fewer or no other elements other than C, H, and O that could result in spectral interference during analysis. We also used the recommended method to analyze spiked samples, obtaining satisfactory results. Thus, this new technique is helpful and promising for use in wastewater analysis and management.
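A short Python sketch of building a calibration curve and computing a 3σ limit of detection (LOD = 3·s_blank/slope), the kind of figure of merit reported above, is given below; the concentrations, intensities, and blank noise are invented, not the article's measurements.

    # Linear calibration and 3-sigma limit of detection (illustrative data).
    import numpy as np
    from scipy import stats

    conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])                # mg/L standards
    intensity = np.array([12.0, 55.0, 101.0, 198.0, 490.0, 985.0])  # line intensity (a.u.)

    fit = stats.linregress(conc, intensity)
    s_blank = 3.2                                  # std. dev. of blank replicates (a.u.)
    lod = 3 * s_blank / fit.slope

    print(f"slope = {fit.slope:.1f} a.u. per mg/L, R^2 = {fit.rvalue**2:.4f}")
    print(f"LOD = {lod:.2f} mg/L")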
An Analysis of the Navy Regional Data Automation Center (NARDAC) chargeback System
1986-09-01
addition, operational control is concerned with performing predefined activities whereas management control relates to the organization's goals and...In effect, the management control system monitors the progress of operations and alerts the "appropriate management level" when performance as measured...architecture, the financial control processes, and the audit function (Brandon, 1978; Anderson, 1983). In an operating DP environment, however, non-financial
ERIC Educational Resources Information Center
Howard, Larry L.
2011-01-01
This paper estimates models of the transitional effects of food insecurity experiences on children's non-cognitive performance in school classrooms using a panel of 4710 elementary students enrolled in 1st, 3rd, and 5th grade (1999-2003). In addition to an extensive set of child and household-level characteristics, we use information on U.S.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nathan D. Jerred; Robert C. O'Brien; Steven D. Howe
Recent developments at the Center for Space Nuclear Research (CSNR) on a Martian exploration probe have led to the assembly of a multi-functional variable atmosphere testing facility (VATF). The VATF has been assembled to perform transient blow-down analysis of a radioisotope thermal rocket (RTR) concept that has been proposed for the Mars Hopper, a long-lived, long-ranged mobile platform for the Martian surface. This study discusses the current state of the VATF as well as recent blow-down testing performed on a laboratory-scale prototype of the Mars Hopper. The VATF allows for the simulation of Mars ambient conditions within the pressure vessel as well as the safe performance of blow-down tests through the prototype using CO2 gas, the proposed propellant for the Mars Hopper. Empirical data gathered will lead to a better understanding of CO2 behavior and will provide validation of simulation models. Additionally, the potential of the VATF to test varying propulsion system designs has been recognized. In addition to being able to simulate varying atmospheres and blow-down gases for the RTR, it can be fitted to perform high temperature hydrogen testing of fuel elements for nuclear thermal propulsion.
Mathysen, Danny G P; Aclimandos, Wagih; Roelant, Ella; Wouters, Kristien; Creuzot-Garcher, Catherine; Ringens, Peter J; Hawlina, Marko; Tassignon, Marie-José
2013-11-01
To investigate whether introduction of item-response theory (IRT) analysis, in parallel to the 'traditional' statistical analysis methods available for performance evaluation of multiple T/F items as used in the European Board of Ophthalmology Diploma (EBOD) examination, has proved beneficial, and secondly, to study whether the overall assessment performance of the current written part of EBOD is sufficiently high (KR-20≥ 0.90) to be kept as examination format in future EBOD editions. 'Traditional' analysis methods for individual MCQ item performance comprise P-statistics, Rit-statistics and item discrimination, while overall reliability is evaluated through KR-20 for multiple T/F items. The additional set of statistical analysis methods for the evaluation of EBOD comprises mainly IRT analysis. These analysis techniques are used to monitor whether the introduction of negative marking for incorrect answers (since EBOD 2010) has a positive influence on the statistical performance of EBOD as a whole and its individual test items in particular. Item-response theory analysis demonstrated that item performance parameters should not be evaluated individually, but should be related to one another. Before the introduction of negative marking, the overall EBOD reliability (KR-20) was good though with room for improvement (EBOD 2008: 0.81; EBOD 2009: 0.78). After the introduction of negative marking, the overall reliability of EBOD improved significantly (EBOD 2010: 0.92; EBOD 2011:0.91; EBOD 2012: 0.91). Although many statistical performance parameters are available to evaluate individual items, our study demonstrates that the overall reliability assessment remains the only crucial parameter to be evaluated allowing comparison. While individual item performance analysis is worthwhile to undertake as secondary analysis, drawing final conclusions seems to be more difficult. Performance parameters need to be related, as shown by IRT analysis. Therefore, IRT analysis has proved beneficial for the statistical analysis of EBOD. Introduction of negative marking has led to a significant increase in the reliability (KR-20 > 0.90), indicating that the current examination format can be kept for future EBOD examinations. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
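The KR-20 statistic treated above as the crucial overall reliability parameter can be computed directly from a binary item-response matrix, as in this Python sketch on a small made-up matrix (rows = candidates, columns = items; 1 = correct).

    # KR-20 reliability for dichotomously scored items (illustrative matrix).
    import numpy as np

    responses = np.array([
        [1, 1, 0, 1, 1, 0, 1, 1],
        [1, 0, 0, 1, 1, 1, 1, 0],
        [0, 1, 1, 1, 0, 0, 1, 1],
        [1, 1, 1, 1, 1, 1, 1, 1],
        [0, 0, 0, 1, 0, 1, 0, 0],
        [1, 1, 0, 0, 1, 0, 1, 1],
    ])

    k = responses.shape[1]                          # number of items
    p = responses.mean(axis=0)                      # proportion correct per item
    q = 1 - p
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of candidate total scores

    kr20 = (k / (k - 1)) * (1 - np.sum(p * q) / total_var)
    print(f"KR-20 = {kr20:.2f}")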
Váradi, Csaba; Mittermayr, Stefan; Millán-Martín, Silvia; Bones, Jonathan
2016-12-01
Capillary electrophoresis (CE) offers excellent efficiency and orthogonality to liquid chromatographic (LC) separations for oligosaccharide structural analysis. Combination of CE with high resolution mass spectrometry (MS) for glycan analysis remains a challenging task due to the MS incompatibility of background electrolyte buffers and additives commonly used in offline CE separations. Here, a novel method is presented for the analysis of 2-aminobenzoic acid (2-AA) labelled glycans by capillary electrophoresis coupled to mass spectrometry (CE-MS). To ensure maximum resolution and excellent precision without the requirement for excessive analysis times, CE separation conditions including the concentration and pH of the background electrolyte, the effect of applied pressure on the capillary inlet and the capillary length were evaluated. Using readily available 12/13 C 6 stable isotopologues of 2-AA, the developed method can be applied for quantitative glycan profiling in a twoplex manner based on the generation of extracted ion electropherograms (EIE) for 12 C 6 'light' and 13 C 6 'heavy' 2-AA labelled glycan isotope clusters. The twoplex quantitative CE-MS glycan analysis platform is ideally suited for comparability assessment of biopharmaceuticals, such as monoclonal antibodies, for differential glycomic analysis of clinical material for potential biomarker discovery or for quantitative microheterogeneity analysis of different glycosylation sites within a glycoprotein. Additionally, due to the low injection volume requirements of CE, subsequent LC-MS analysis of the same sample can be performed facilitating the use of orthogonal separation techniques for structural elucidation or verification of quantitative performance.
Günzel, Karsten; Cash, Hannes; Buckendahl, John; Königbauer, Maximilian; Asbach, Patrick; Haas, Matthias; Neymeyer, Jörg; Hinz, Stefan; Miller, Kurt; Kempkensteffen, Carsten
2017-01-13
To explore the diagnostic benefit of an additional image fusion of the sagittal plane in addition to the standard axial image fusion, using a sensor-based MRI/US fusion platform. During July 2013 and September 2015, 251 patients with at least one suspicious lesion on mpMRI (rated by PI-RADS) were included into the analysis. All patients underwent MRI/US targeted biopsy (TB) in combination with a 10 core systematic prostate biopsy (SB). All biopsies were performed on a sensor-based fusion system. Group A included 162 men who received TB by an axial MRI/US image fusion. Group B comprised 89 men in whom the TB was performed with an additional sagittal image fusion. The median age in group A was 67 years (IQR 61-72) and in group B 68 years (IQR 60-71). The median PSA level in group A was 8.10 ng/ml (IQR 6.05-14) and in group B 8.59 ng/ml (IQR 5.65-12.32). In group A the proportion of patients with a suspicious digital rectal examination (DRE) (14 vs. 29%, p = 0.007) and the proportion of primary biopsies (33 vs 46%, p = 0.046) were significantly lower. The rate of PI-RADS 3 lesions were overrepresented in group A compared to group B (19 vs. 9%; p = 0.044). Classified according to PI-RADS 3, 4 and 5, the detection rates of TB were 42, 48, 75% in group A and 25, 74, 90% in group B. The rate of PCa with a Gleason score ≥7 missed by TB was 33% (18 cases) in group A and 9% (5 cases) in group B; p-value 0.072. An explorative multivariate binary logistic regression analysis revealed that PI-RADS, a suspicious DRE and performing an additional sagittal image fusion were significant predictors for PCa detection in TB. 9 PCa were only detected by TB with sagittal fusion (sTB) and sTB identified 10 additional clinically significant PCa (Gleason ≥7). Performing an additional sagittal image fusion besides the standard axial fusion appears to improve the accuracy of the sensor-based MRI/US fusion platform.
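A Python sketch of the kind of exploratory binary logistic regression described, with the three predictors named in the abstract, is shown below; the data frame is simulated and the coefficients are not the study's.

    # Logistic regression of PCa detection on PI-RADS, DRE, and sagittal fusion
    # (simulated data, statsmodels formula interface).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(5)
    n = 251
    df = pd.DataFrame({
        "pirads": rng.integers(3, 6, n),                   # PI-RADS 3-5
        "dre_suspicious": rng.integers(0, 2, n),
        "sagittal_fusion": rng.integers(0, 2, n),
    })
    logit_p = -6 + 1.2 * df["pirads"] + 0.8 * df["dre_suspicious"] + 0.7 * df["sagittal_fusion"]
    df["pca_detected"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    model = smf.logit("pca_detected ~ pirads + dre_suspicious + sagittal_fusion", data=df).fit(disp=0)
    print("odds ratios:\n", np.exp(model.params).round(2))
    print("p-values:\n", model.pvalues.round(4))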
Test data analysis for concentrating photovoltaic arrays
NASA Astrophysics Data System (ADS)
Maish, A. B.; Cannon, J. E.
A test data analysis approach for use with steady-state efficiency measurements taken on concentrating photovoltaic arrays is presented. The analysis procedures can be used to identify biased and erroneous data. The steps involved in analyzing the test data are screening the data, developing coefficients for the performance equation, analyzing statistics to ensure adequacy of the regression fit to the data, and plotting the data. In addition, the sources and magnitudes of the precision and bias errors that affect measurement accuracy are analyzed.
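A Python sketch of the screening-plus-regression workflow follows: discard implausible readings, then fit a simple linear performance equation by least squares. The data and the form of the equation are illustrative only, not the report's model.

    # Screen implausible efficiency readings, then fit a linear performance equation.
    import numpy as np

    rng = np.random.default_rng(6)
    dni = rng.uniform(400, 1000, 200)            # direct normal irradiance, W/m^2
    tamb = rng.uniform(5, 40, 200)               # ambient temperature, deg C
    eff = 0.18 + 2e-5 * (dni - 850) - 8e-4 * (tamb - 25) + rng.normal(0, 0.002, 200)
    eff[::37] = -1.0                             # inject a few bad readings

    # Screening: keep physically plausible efficiencies only
    ok = (eff > 0) & (eff < 0.4)
    X = np.column_stack([np.ones(ok.sum()), dni[ok] - 850, tamb[ok] - 25])
    coef, *_ = np.linalg.lstsq(X, eff[ok], rcond=None)

    print(f"eta = {coef[0]:.4f} + {coef[1]:.2e}*(DNI-850) + {coef[2]:.2e}*(Tamb-25)")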
Analysis of airfoil leading edge separation bubbles
NASA Technical Reports Server (NTRS)
Carter, J. E.; Vatsa, V. N.
1982-01-01
A local inviscid-viscous interaction technique was developed for the analysis of low speed airfoil leading edge transitional separation bubbles. In this analysis an inverse boundary layer finite difference analysis is solved iteratively with a Cauchy integral representation of the inviscid flow which is assumed to be a linear perturbation to a known global viscous airfoil analysis. Favorable comparisons with data indicate the overall validity of the present localized interaction approach. In addition numerical tests were performed to test the sensitivity of the computed results to the mesh size, limits on the Cauchy integral, and the location of the transition region.
A genome-wide association study of corneal astigmatism: The CREAM Consortium.
Shah, Rupal L; Li, Qing; Zhao, Wanting; Tedja, Milly S; Tideman, J Willem L; Khawaja, Anthony P; Fan, Qiao; Yazar, Seyhan; Williams, Katie M; Verhoeven, Virginie J M; Xie, Jing; Wang, Ya Xing; Hess, Moritz; Nickels, Stefan; Lackner, Karl J; Pärssinen, Olavi; Wedenoja, Juho; Biino, Ginevra; Concas, Maria Pina; Uitterlinden, André; Rivadeneira, Fernando; Jaddoe, Vincent W V; Hysi, Pirro G; Sim, Xueling; Tan, Nicholas; Tham, Yih-Chung; Sensaki, Sonoko; Hofman, Albert; Vingerling, Johannes R; Jonas, Jost B; Mitchell, Paul; Hammond, Christopher J; Höhn, René; Baird, Paul N; Wong, Tien-Yin; Cheng, Chinfsg-Yu; Teo, Yik Ying; Mackey, David A; Williams, Cathy; Saw, Seang-Mei; Klaver, Caroline C W; Guggenheim, Jeremy A; Bailey-Wilson, Joan E
2018-01-01
To identify genes and genetic markers associated with corneal astigmatism. A meta-analysis of genome-wide association studies (GWASs) of corneal astigmatism undertaken for 14 European ancestry (n=22,250) and 8 Asian ancestry (n=9,120) cohorts was performed by the Consortium for Refractive Error and Myopia. Cases were defined as having >0.75 diopters of corneal astigmatism. Subsequent gene-based and gene-set analyses of the meta-analyzed results of European ancestry cohorts were performed using VEGAS2 and MAGMA software. Additionally, estimates of single nucleotide polymorphism (SNP)-based heritability for corneal and refractive astigmatism and the spherical equivalent were calculated for Europeans using LD score regression. The meta-analysis of all cohorts identified a genome-wide significant locus near the platelet-derived growth factor receptor alpha ( PDGFRA ) gene: top SNP: rs7673984, odds ratio=1.12 (95% CI:1.08-1.16), p=5.55×10 -9 . No other genome-wide significant loci were identified in the combined analysis or European/Asian ancestry-specific analyses. Gene-based analysis identified three novel candidate genes for corneal astigmatism in Europeans-claudin-7 ( CLDN7 ), acid phosphatase 2, lysosomal ( ACP2 ), and TNF alpha-induced protein 8 like 3 ( TNFAIP8L3 ). In addition to replicating a previously identified genome-wide significant locus for corneal astigmatism near the PDGFRA gene, gene-based analysis identified three novel candidate genes, CLDN7 , ACP2 , and TNFAIP8L3 , that warrant further investigation to understand their role in the pathogenesis of corneal astigmatism. The much lower number of genetic variants and genes demonstrating an association with corneal astigmatism compared to published spherical equivalent GWAS analyses suggest a greater influence of rare genetic variants, non-additive genetic effects, or environmental factors in the development of astigmatism.
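A Python sketch of the fixed-effect inverse-variance meta-analysis that underlies combining per-cohort SNP effect estimates is given below; the betas and standard errors are invented, not CREAM consortium results.

    # Fixed-effect inverse-variance meta-analysis of one SNP across cohorts.
    import numpy as np
    from scipy import stats

    beta = np.array([0.11, 0.09, 0.15, 0.07])   # per-cohort log-odds for the risk allele
    se = np.array([0.04, 0.05, 0.06, 0.03])

    w = 1 / se**2
    beta_meta = np.sum(w * beta) / np.sum(w)
    se_meta = np.sqrt(1 / np.sum(w))
    z = beta_meta / se_meta
    p = 2 * stats.norm.sf(abs(z))

    print(f"OR = {np.exp(beta_meta):.2f}, p = {p:.2e}")
    print("genome-wide significant" if p < 5e-8 else "not genome-wide significant")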
Harkness, Mark; Fisher, Angela; Lee, Michael D; Mack, E Erin; Payne, Jo Ann; Dworatzek, Sandra; Roberts, Jeff; Acheson, Carolyn; Herrmann, Ronald; Possolo, Antonio
2012-04-01
A large, multi-laboratory microcosm study was performed to select amendments for supporting reductive dechlorination of high levels of trichloroethylene (TCE) found at an industrial site in the United Kingdom (UK) containing dense non-aqueous phase liquid (DNAPL) TCE. The study was designed as a fractional factorial experiment involving 177 bottles distributed between four industrial laboratories and was used to assess the impact of six electron donors, bioaugmentation, addition of supplemental nutrients, and two TCE levels (0.57 and 1.90 mM or 75 and 250 mg/L in the aqueous phase) on TCE dechlorination. Performance was assessed based on the concentration changes of TCE and reductive dechlorination degradation products. The chemical data was evaluated using analysis of variance (ANOVA) and survival analysis techniques to determine both main effects and important interactions for all the experimental variables during the 203-day study. The statistically based design and analysis provided powerful tools that aided decision-making for field application of this technology. The analysis showed that emulsified vegetable oil (EVO), lactate, and methanol were the most effective electron donors, promoting rapid and complete dechlorination of TCE to ethene. Bioaugmentation and nutrient addition also had a statistically significant positive impact on TCE dechlorination. In addition, the microbial community was measured using phospholipid fatty acid analysis (PLFA) for quantification of total biomass and characterization of the community structure and quantitative polymerase chain reaction (qPCR) for enumeration of Dehalococcoides organisms (Dhc) and the vinyl chloride reductase (vcrA) gene. The highest increase in levels of total biomass and Dhc was observed in the EVO microcosms, which correlated well with the dechlorination results. Copyright © 2012 Elsevier B.V. All rights reserved.
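A Python sketch of the ANOVA step on a simulated two-factor layout (electron donor × bioaugmentation) using statsmodels is shown below; the simulated bottles are not the 177-bottle study data.

    # Two-factor ANOVA of a dechlorination outcome (simulated factorial data).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(7)
    donors = ["EVO", "lactate", "methanol", "none"]
    rows = []
    for donor in donors:
        for bioaug in (0, 1):
            for rep in range(6):
                base = {"EVO": 80, "lactate": 70, "methanol": 65, "none": 20}[donor]
                rows.append({"donor": donor, "bioaug": bioaug,
                             "ethene_pct": base + 10 * bioaug + rng.normal(0, 8)})
    data = pd.DataFrame(rows)

    model = smf.ols("ethene_pct ~ C(donor) * C(bioaug)", data=data).fit()
    print(anova_lm(model, typ=2))   # main effects and interaction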
Ghogawala, Zoher; Whitmore, Robert G; Watters, William C; Sharan, Alok; Mummaneni, Praveen V; Dailey, Andrew T; Choudhri, Tanvir F; Eck, Jason C; Groff, Michael W; Wang, Jeffrey C; Resnick, Daniel K; Dhall, Sanjay S; Kaiser, Michael G
2014-07-01
A comprehensive economic analysis generally involves the calculation of indirect and direct health costs from a societal perspective as opposed to simply reporting costs from a hospital or payer perspective. Hospital charges for a surgical procedure must be converted to cost data when performing a cost-effectiveness analysis. Once cost data have been calculated, quality-adjusted life year data from a surgical treatment are calculated by using a preference-based health-related quality-of-life instrument such as the EQ-5D. A recent cost-utility analysis from a single study has demonstrated the long-term (over an 8-year time period) benefits of circumferential fusions over stand-alone posterolateral fusions. In addition, economic analysis from a single study has found that lumbar fusion for selected patients with low-back pain can be recommended from an economic perspective. Recent economic analysis, from a single study, finds that femoral ring allograft might be more cost-effective compared with a specific titanium cage when performing an anterior lumbar interbody fusion plus posterolateral fusion.
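The cost-utility arithmetic described above reduces to comparing costs and quality-adjusted life years (QALYs) between strategies. The Python sketch below shows that calculation with hypothetical costs and EQ-5D-derived utility values over an 8-year horizon; the numbers are placeholders, not results from any study.

# Hypothetical inputs: societal costs and yearly utilities over an 8-year horizon.
def qalys(utility_by_year):
    """Sum yearly utility weights (one year lived at utility u contributes u QALYs)."""
    return sum(utility_by_year)

circumferential = {"cost": 52_000.0, "utilities": [0.78] * 8}
posterolateral  = {"cost": 41_000.0, "utilities": [0.72] * 8}

delta_cost = circumferential["cost"] - posterolateral["cost"]
delta_qaly = qalys(circumferential["utilities"]) - qalys(posterolateral["utilities"])

# Incremental cost-effectiveness ratio: extra cost per additional QALY gained.
icer = delta_cost / delta_qaly
print(f"Incremental cost: ${delta_cost:,.0f}, incremental QALYs: {delta_qaly:.2f}")
print(f"ICER: ${icer:,.0f} per QALY")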
NASA Astrophysics Data System (ADS)
Rastogi, Monisha; Vaish, Rahul; Madhar, Niyaz Ahamad; Shaikh, Hamid; Al-Zahrani, S. M.
2015-10-01
The present study deals with the diffusion and phase transition behaviour of paraffin reinforced with carbon nano-additives, namely graphene oxide (GO) and surface-functionalized single-walled carbon nanotubes (SWCNT). Bulk disordered systems of paraffin hydrocarbons impregnated with carbon nano-additives have been generated in realistic equilibrium conformations for potential application as latent heat storage systems. Ab initio molecular dynamics (MD) in conjunction with the COMPASS force field has been implemented using periodic boundary conditions. The proposed scheme allows determination of the optimum nano-additive loading for improving thermo-physical properties through analysis of mass, thermal and transport properties, and assists in determination of composite behaviour and related performance from a microscopic point of view. It was observed that nanocomposites containing 7.8% surface-functionalized SWCNT and 55% GO loading correspond to the best latent heat storage system. The propounded methodology could serve as a by-pass route for economically taxing and iterative experimental procedures required to attain the optimum composition for best performance. The results also hint at the large unexplored potential of ab initio and classical MD techniques for predicting the performance of new nanocomposites for potential phase change material applications.
Cutolo, Maurizio; Vanhaecke, Amber; Ruaro, Barbara; Deschepper, Ellen; Ickinger, Claudia; Melsens, Karin; Piette, Yves; Trombetta, Amelia Chiara; De Keyser, Filip; Smith, Vanessa
2018-06-06
A reliable tool to evaluate flow is paramount in systemic sclerosis (SSc). Herein we describe a systematic literature review on the reliability of laser speckle contrast analysis (LASCA) for measuring peripheral blood perfusion (PBP) in SSc, and additionally perform a pilot study investigating the intra- and inter-rater reliability of LASCA. A systematic search was performed in 3 electronic databases, according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. In the pilot study, 30 SSc patients and 30 healthy subjects (HS) underwent LASCA assessment. Intra-rater reliability was assessed by having a first anchor rater perform the measurements at 2 time-points, and inter-rater reliability by having the anchor rater and a team of second raters perform the measurements in 15 SSc patients and 30 HS. The measurements were repeated with a second anchor rater in the other 15 SSc patients, as external validation. Only 1 of the 14 records of interest identified through the systematic search was included in the final analysis. In the pilot study, the intra-class correlation coefficient (ICC) for intra-rater reliability of the first anchor rater was 0.95 in SSc and 0.93 in HS, and the ICC for inter-rater reliability was 0.97 in SSc and 0.93 in HS. Intra- and inter-rater reliability of the second anchor rater was 0.78 and 0.87, respectively. The identified literature regarding the reliability of LASCA measurements reports good to excellent inter-rater agreement. The present pilot study confirmed the reliability of LASCA measurements, with good to excellent inter-rater agreement, and additionally found good to excellent intra-rater reliability. Furthermore, similar results were found in the external validation. Copyright © 2018. Published by Elsevier B.V.
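For readers unfamiliar with the reliability index used, the Python sketch below computes a two-way random-effects, absolute-agreement, single-measurement intra-class correlation, ICC(2,1), from a subjects x raters matrix. The matrix values are hypothetical, and the exact ICC form used in the study may differ.

import numpy as np

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.
    data: 2-D array, rows = subjects, columns = raters."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)
    col_means = data.mean(axis=0)

    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_error = ((data - grand) ** 2).sum() - ss_rows - ss_cols

    msr = ss_rows / (n - 1)              # between-subjects mean square
    msc = ss_cols / (k - 1)              # between-raters mean square
    mse = ss_error / ((n - 1) * (k - 1)) # residual mean square

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical perfusion readings for 5 subjects measured by 2 raters.
scores = [[110, 112], [95, 98], [130, 126], [80, 84], [150, 147]]
print(f"ICC(2,1) = {icc_2_1(scores):.3f}")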
IN SITU FIELD TESTING OF PROCESSES
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.S.Y. YANG
2004-11-08
The purpose of this scientific analysis report is to update and document the data and subsequent analyses from ambient field-testing activities performed in underground drifts and surface-based boreholes through unsaturated zone (UZ) tuff rock units. In situ testing, monitoring, and associated laboratory studies are conducted to directly assess and evaluate the waste emplacement environment and the natural barriers to radionuclide transport at Yucca Mountain. This scientific analysis report supports and provides data to UZ flow and transport model reports, which in turn contribute to the Total System Performance Assessment (TSPA) of Yucca Mountain, an important document for the license application (LA). The objectives of ambient field-testing activities are described in Section 1.1. This report is the third revision (REV 03), which supersedes REV 02. The scientific analysis of data for inputs to model calibration and validation as documented in REV 02 was developed in accordance with the Technical Work Plan (TWP) ''Technical Work Plan for: Performance Assessment Unsaturated Zone'' (BSC 2004 [DIRS 167969]). This revision was developed in accordance with the ''Technical Work Plan for: Unsaturated Zone Flow Analysis and Model Report Integration'' (BSC 2004 [DIRS 169654], Section 1.2.4) for better integrated, consistent, transparent, traceable, and more complete documentation in this scientific analysis report and associated UZ flow and transport model reports. No additional testing or analyses were performed as part of this revision. The list of relevant acceptance criteria is provided by ''Technical Work Plan for: Unsaturated Zone Flow Analysis and Model Report Integration'' (BSC 2004 [DIRS 169654]), Table 3-1. Additional deviations from the TWP regarding the features, events, and processes (FEPs) list are discussed in Section 1.3. Documentation in this report includes descriptions of how, and under what conditions, the tests were conducted. The descriptions and analyses provide data useful for refining and confirming the understanding of flow, drift seepage, and transport processes in the UZ. The UZ testing activities included measurement of permeability distribution, quantification of the seepage of water into the drifts, evaluation of fracture-matrix interaction, study of flow along faults, testing of flow and transport between drifts, characterization of hydrologic heterogeneity along drifts, estimation of drying effects on the rock surrounding the drifts due to ventilation, monitoring of moisture conditions in open and sealed drifts, and determination of the degree of minimum construction water migration below the drifts. These field tests were conducted in two underground drifts at Yucca Mountain, the Exploratory Studies Facility (ESF) drift and the cross-drift for Enhanced Characterization of the Repository Block (ECRB), as described in Section 1.2. Samples collected in boreholes and underground drifts have been used for additional hydrochemical and isotopic analyses for additional understanding of the UZ setting. The UZ transport tests conducted at the nearby Busted Butte site (see Figure 1-4) are also described in this scientific analysis report.
NASA Technical Reports Server (NTRS)
Wilmington, R. P.; Klute, Glenn K. (Editor); Carroll, Amy E. (Editor); Stuart, Mark A. (Editor); Poliner, Jeff (Editor); Rajulu, Sudhakar (Editor); Stanush, Julie (Editor)
1992-01-01
Kinematics, the study of motion exclusive of the influences of mass and force, is one of the primary methods used for the analysis of human biomechanical systems as well as other types of mechanical systems. The Anthropometry and Biomechanics Laboratory (ABL) in the Crew Interface Analysis section of the Man-Systems Division performs both human body and mechanical system kinematic analyses using the Ariel Performance Analysis System (APAS). The APAS supports both analysis of analog signals (e.g., force plate data collection) and digitization and analysis of video data. The current evaluations address several methodology issues concerning the accuracy of the kinematic data collection and analysis used in the ABL. This document describes a series of evaluations performed to gain quantitative data pertaining to position and constant angular velocity movements under several operating conditions. Two-dimensional as well as three-dimensional data collection and analyses were completed in a controlled laboratory environment using typical hardware setups. In addition, an evaluation was performed to assess the accuracy impact due to a single-axis camera offset. Segment length and positional data exhibited errors within 3 percent when using three-dimensional analysis and yielded errors within 8 percent through two-dimensional analysis (Direct Linear Software). Peak angular velocities displayed errors within 6 percent through three-dimensional analyses and exhibited errors of 12 percent when using two-dimensional analysis (Direct Linear Software). The specific results from this series of evaluations and their impacts on the methodology issues of kinematic data collection and analyses are presented in detail. The accuracy levels observed in these evaluations are also presented.
Li, Yangyang; Xu, Fuqing; Li, Yu; Lu, Jiaxin; Li, Shuyan; Shah, Ajay; Zhang, Xuehua; Zhang, Hongyu; Gong, Xiaoyan; Li, Guoxue
2018-03-01
Anaerobic co-digestion is commonly believed to be beneficial for biogas production. However, addition of co-substrates may require additional energy inputs and thus affect the overall energy efficiency of the system. In this study, reactor performance and energy analysis of solid state anaerobic digestion (SS-AD) of tomato residues with dairy manure and corn stover were investigated. Different fractions of tomato residues (0, 20, 40, 60, 80 and 100%, based on volatile solid weight (VS)) were co-digested with dairy manure and corn stover at 15% total solids. Energy analysis based on experimental data was conducted for three scenarios: SS-AD of 100% dairy manure, SS-AD of a binary mixture (60% dairy manure and 40% corn stover, VS based), and SS-AD of a ternary mixture (36% dairy manure, 24% corn stover, and 40% tomato residues, VS based). For each scenario, the energy requirements for individual process components, including feedstock collection and transportation, feedstock pretreatment, biogas plant operation, and digestate processing and handling, and the energy production were examined. Results showed that the addition of 20 and 40% tomato residues increased methane yield compared to that of the dairy manure and corn stover mixture, indicating that the co-digestion could balance nutrients and improve the performance of solid-state anaerobic digestion. The energy required for heating substrates had the dominant effect on the total energy consumption. The highest volatile solids (VS) reduction (57.0%), methane yield (379.1 L/kg VS feed), and net energy production were achieved with the mixture of 24% corn stover, 36% dairy manure, and 40% tomato residues. Thus, the extra energy input for adding tomato residues for co-digestion could be compensated by the increase in methane yield. Copyright © 2017 Elsevier Ltd. All rights reserved.
Fujiwara, Rance J T; Shih, Allen F; Mehra, Saral
2017-11-01
Objective To characterize the relationship between industry payments and use of paranasal sinus balloon catheter dilations (BCDs) for chronic rhinosinusitis. Study Design Cross-sectional analysis of Medicare B Public Use Files and Open Payments data. Setting Two national databases, 2013 to 2014. Subjects and Methods Physicians with Medicare claims with Current Procedural Terminology codes 31295 to 31297 were identified and cross-referenced with industry payments. Multivariate linear regression controlling for age, race, sex, and comorbidity in a physician's Medicare population was performed to identify associations between use of BCDs and industry payments. The final analysis included 334 physicians performing 31,506 procedures, each of whom performed at least 11 balloon dilation procedures. Results Of 334 physicians, 280 (83.8%) received 4392 industry payments in total. Wide variation in payments to physicians was noted (range, $43.29-$111,685.10). The median payment for food and beverage was $19.26, and that for speaker or consulting fees was $409.45. One payment was associated with an additional 3.05 BCDs (95% confidence interval [CI], 1.65-4.45; P < .001). One payment for food and beverages was associated with 3.81 additional BCDs (95% CI, 2.13-5.49; P < .001), and 1 payment for speaker or consulting fees was associated with 5.49 additional BCDs (95% CI, 0.32-10.63; P = .04). Conclusion Payments by manufacturers of BCD devices were associated with increased use of BCD for chronic rhinosinusitis. On separate analyses, the number of payments for food and beverages as well as that for speaker and consulting fees was associated with increased BCD use. This study was cross-sectional and cannot prove causality, and several factors likely exist for the uptrend in BCD use.
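The association above comes from a multivariate linear regression of per-physician procedure counts on industry payments with demographic covariates. The Python sketch below shows that type of model on simulated physician-level data; the column names and values are illustrative, not the Medicare/Open Payments fields.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 334  # physicians

# Hypothetical physician-level data.
df = pd.DataFrame({
    "n_payments":  rng.poisson(13, n),
    "mean_age":    rng.normal(72, 3, n),
    "pct_female":  rng.uniform(0.4, 0.7, n),
    "pct_white":   rng.uniform(0.6, 0.95, n),
    "comorbidity": rng.normal(1.5, 0.4, n),
})
df["n_bcd"] = 30 + 3.0 * df["n_payments"] + rng.normal(0, 25, n)

# BCD count regressed on payments, controlling for panel demographics and comorbidity.
model = smf.ols("n_bcd ~ n_payments + mean_age + pct_female + pct_white + comorbidity",
                data=df).fit()
print(model.params["n_payments"], model.conf_int().loc["n_payments"].values)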
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, D.G.; Eubanks, L.
1998-03-01
This software assists the engineering designer in characterizing the statistical uncertainty in the performance of complex systems as a result of variations in manufacturing processes, material properties, system geometry or operating environment. The software is composed of a graphical user interface that provides the user with easy access to Cassandra uncertainty analysis routines. Together this interface and the Cassandra routines are referred to as CRAX (CassandRA eXoskeleton). The software is flexible enough that, with minor modification, it is able to interface with large modeling and analysis codes such as heat transfer or finite element analysis software. The current version permits the user to manually input a performance function, the number of random variables, and their associated statistical characteristics: density function, mean, and coefficients of variation. Additional uncertainty analysis modules are continuously being added to the Cassandra core.
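A minimal Python sketch of the kind of uncertainty analysis such a tool automates: distributions on the input random variables are propagated through a user-supplied performance function by Monte Carlo sampling and the output statistics are summarized. The performance function and distribution parameters are hypothetical, and this is not the Cassandra/CRAX implementation.

import numpy as np

rng = np.random.default_rng(42)

def performance(thickness, load, yield_strength):
    """Hypothetical performance function: stress margin of a loaded plate (>0 means OK)."""
    stress = load / thickness
    return yield_strength - stress

n = 100_000
# Input random variables described by mean and coefficient of variation (normal here).
thickness = rng.normal(5.0, 0.05 * 5.0, n)        # mm, CoV 5%
load = rng.normal(1_000.0, 0.10 * 1_000.0, n)     # N/mm, CoV 10%
strength = rng.normal(250.0, 0.08 * 250.0, n)     # MPa, CoV 8%

margin = performance(thickness, load, strength)
print(f"mean margin = {margin.mean():.1f}, std = {margin.std():.1f}")
print(f"probability of failure (margin < 0) = {(margin < 0).mean():.4f}")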
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, David G.
1999-02-20
This software assists the engineering designer in characterizing the statistical uncertainty in the performance of complex systems as a result of variations in manufacturing processes, material properties, system geometry or operating environment. The software is composed of a graphical user interface that provides the user with easy access to Cassandra uncertainty analysis routines. Together this interface and the Cassandra routines are referred to as CRAX (CassandRA eXoskeleton). The software is flexible enough that, with minor modification, it is able to interface with large modeling and analysis codes such as heat transfer or finite element analysis software. The current version permits the user to manually input a performance function, the number of random variables, and their associated statistical characteristics: density function, mean, and coefficients of variation. Additional uncertainty analysis modules are continuously being added to the Cassandra core.
Lin, Hongjun; Wang, Fangyuan; Ding, Linxian; Hong, Huachang; Chen, Jianrong; Lu, Xiaofeng
2011-09-15
The aim of this study was to investigate the feasibility of a powdered activated carbon membrane bioreactor (PAC-MBR) process for treating municipal secondary effluent. Two laboratory-scale submerged MBRs (SMBR), with and without PAC addition, were continuously operated in parallel for secondary effluent treatment. Approximately 63% of TOC, 95% of NH(4)(+)-N and 98% of turbidity in the secondary effluent were removed by the PAC-MBR process. Most organics in the secondary effluent were found to be low molecular weight (MW) substances, which could be retained in the reactor and then removed to some extent by the PAC-MBR process. Parallel experiments showed that the addition of PAC significantly increased organic removal and was responsible for the largest fraction of organic removal. Membrane fouling analysis showed that PAC addition enhanced membrane performance in terms of sustainable operational time and filtration resistance. Based on these results, the PAC-MBR process is considered an attractive option for the reduction of pollutants in secondary effluent. Copyright © 2011 Elsevier B.V. All rights reserved.
A New Antiwear Additive/Surface Pretreatment for PFPE Liquid Lubricants
NASA Technical Reports Server (NTRS)
Morales, Wilfredo; Fusaro, Robert L.; Siebert, Mark; Keith, Theo; Jansen, Ralph; Herrera-Fierro, Pilar
1995-01-01
Pin-on-disk tribology experiments were conducted on a perfluoroalkylether (PFPE) liquid lubricant with and without a new PFPE lubricant antiwear additive material, a silane. It was found that the silane provided moderate improvement in the antiwear performance of the PFPE lubricant when applied to the metallic surface as a surface coating or when added to the PFPE as a dispersion (emulsion). Slightly better results were obtained by using the combination of a surface coating and an emulsion of the silane. The silane emulsions or coatings did not affect the friction properties of the lubricant. Micro-Fourier transform infrared (μFTIR) spectroscopy analysis was performed to study silane transfer films and the degradation of the PFPE. The silane was found to mitigate degradation of the PFPE, which may have been the major reason for the improved antiwear performance observed.
Huang, Xiuping; Zhang, Wei; Han, Zelong; Liu, Side
2015-01-01
Background The associations between toll-like receptor 2 (TLR2) and toll-like receptor 4 (TLR4) polymorphisms and inflammatory bowel disease (IBD) susceptibility remain controversial. A meta-analysis was performed to assess these associations. Methods A systematic search was performed to identify all relevant studies relating TLR2 and TLR4 polymorphisms to IBD susceptibility. Odds ratios (ORs) and 95% confidence intervals (CIs) were calculated. Subgroup analyses were performed by ethnicity and publication quality. Results Thirty-eight eligible studies, assessing 10,970 cases and 7,061 controls, were included. No TLR2 Arg677Trp polymorphism was found. No significant association was observed between the TLR2 Arg753Gln polymorphism and Crohn’s disease (CD) or ulcerative colitis (UC) in any genetic model. Interestingly, the TLR4 Asp299Gly polymorphism was significantly associated with increased risk of CD and UC in all genetic models, except for the additive one in CD. In addition, a statistically significant association between the TLR4 Asp299Gly polymorphism and IBD was observed among high quality studies evaluating Caucasians, but not Asians. Associations between TLR4 Thr399Ile polymorphisms and CD risk were found only in the allele and dominant models. The TLR4 Thr399Ile polymorphism was associated with UC risk in pooled results as well as in subgroup analysis of high quality publications assessing Caucasians, in the allele and dominant models. Conclusions The meta-analysis provides evidence that TLR2 Arg753Gln is not associated with CD and UC susceptibility in Asians; TLR4 Asp299Gly is associated with CD and UC susceptibility in Caucasians, but not Asians. TLR4 Thr399Ile may be associated with IBD susceptibility in Caucasians only. Additional well-powered studies of Asp299Gly and other TLR4 variants are warranted. PMID:26023918
NASA Technical Reports Server (NTRS)
Muraca, R. J.; Stephens, M. V.; Dagenhart, J. R.
1975-01-01
A general analysis capable of predicting performance characteristics of cross-wind axis turbines was developed, including the effects of airfoil geometry, support struts, blade aspect ratio, windmill solidity, blade interference and curved flow. The results were compared with available wind tunnel results for a catenary blade shape. A theoretical performance curve for an aerodynamically efficient straight blade configuration was also presented. In addition, a linearized analytical solution applicable for straight configurations was developed. A listing of the computer program developed for numerical solutions of the general performance equations is included in the appendix.
Energy Efficient Engine: Combustor component performance program
NASA Technical Reports Server (NTRS)
Dubiel, D. J.
1986-01-01
The results of the Combustor Component Performance analysis as developed under the Energy Efficient Engine (EEE) program are presented. This study was conducted to demonstrate the aerothermal and environmental goals established for the EEE program and to identify areas where refinements might be made to meet future combustor requirements. In this study, a full annular combustor test rig was used to establish emission levels and combustor performance for comparison with those indicated by the supporting technology program. In addition, a combustor sector test rig was employed to examine differences in emissions and liner temperatures obtained during the full annular performance and supporting technology tests.
Statistical Analysis of NAS Parallel Benchmarks and LINPACK Results
NASA Technical Reports Server (NTRS)
Meuer, Hans-Werner; Simon, Horst D.; Strohmeier, Erich; Lasinski, T. A. (Technical Monitor)
1994-01-01
In the last three years extensive performance data have been reported for parallel machines, based both on the NAS Parallel Benchmarks and on LINPACK. In this study we have used the reported benchmark results and performed a number of statistical experiments using factor, cluster, and regression analyses. In addition to the performance results of LINPACK and the eight NAS parallel benchmarks, we have also included the peak performance of the machine and the LINPACK n and n(sub 1/2) values. Some of the results and observations can be summarized as follows: 1) All benchmarks are strongly correlated with peak performance. 2) LINPACK and EP each have a unique signature. 3) The remaining NPB can be grouped into three groups as follows: (CG and IS), (LU and SP), and (MG, FT, and BT). Hence three (or four with EP) benchmarks are sufficient to characterize the overall NPB performance. Our poster presentation will follow a standard poster format and will present the data of our statistical analysis in detail.
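A minimal Python sketch of the correlation and clustering steps described above, applied to a hypothetical machines x benchmarks performance matrix (the actual NPB/LINPACK data are not reproduced here):

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(7)
benchmarks = ["LINPACK", "EP", "CG", "IS", "LU", "SP", "MG", "FT", "BT", "peak"]

# Hypothetical performance matrix: rows = machines, columns = benchmarks (Gflop/s).
peak = rng.uniform(1, 100, size=20)
data = np.column_stack([peak * rng.uniform(0.2, 0.9, size=20) for _ in benchmarks[:-1]] + [peak])

# Correlation of every benchmark with peak performance.
corr = np.corrcoef(data, rowvar=False)
for name, r in zip(benchmarks, corr[:, -1]):
    print(f"{name:8s} r(peak) = {r:.2f}")

# Hierarchical clustering of benchmarks using correlation distance (1 - r).
dist = 1.0 - corr
z = linkage(dist[np.triu_indices_from(dist, k=1)], method="average")
print(dict(zip(benchmarks, fcluster(z, t=3, criterion="maxclust"))))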
Simulation of Attacks for Security in Wireless Sensor Network
Diaz, Alvaro; Sanchez, Pablo
2016-01-01
The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node’s software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work. PMID:27869710
Norisue, Yasuhiro; Tokuda, Yasuharu; Juarez, Mayrol; Uchimido, Ryo; Fujitani, Shigeki; Stoeckel, David A
2017-02-07
Cumulative sum (CUSUM) analysis can be used to continuously monitor the performance of an individual or process and detect deviations from a preset or standard level of achievement. However, no previous study has evaluated the utility of CUSUM analysis in facilitating timely environmental assessment and interventions to improve performance of linear-probe endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA). The aim of this study was to evaluate the usefulness of combined CUSUM and chronological environmental analysis as a tool to improve the learning environment for EBUS-TBNA trainees. This study was an observational chart review. To determine if performance was acceptable, CUSUM analysis was used to track procedural outcomes of trainees in EBUS-TBNA. To investigate chronological changes in the learning environment, multivariate logistic regression analysis was used to compare several indices before and after time points when significant changes occurred in proficiency. Presence of an additional attending bronchoscopist was inversely associated with nonproficiency (odds ratio, 0.117; 95% confidence interval, 0-0.749; P = 0.019). Other factors, including presence of an on-site cytopathologist and dose of sedatives used, were not significantly associated with duration of nonproficiency. Combined CUSUM and chronological environmental analysis may be useful in hastening interventions that improve performance of EBUS-TBNA.
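The Python sketch below illustrates a generic Bernoulli (log-likelihood ratio) CUSUM for binary procedural outcomes of the kind tracked above; the acceptable and unacceptable failure rates, decision limit, and outcome sequence are hypothetical and not necessarily the authors' exact scheme.

import math

def bernoulli_cusum(outcomes, p0=0.10, p1=0.30, h=2.0):
    """Track failures (1) vs successes (0) against acceptable (p0) and
    unacceptable (p1) failure rates; signal when the statistic crosses h."""
    w_fail = math.log(p1 / p0)                  # positive increment on failure
    w_success = math.log((1 - p1) / (1 - p0))   # negative increment on success
    s, path = 0.0, []
    for i, failed in enumerate(outcomes, start=1):
        s = max(0.0, s + (w_fail if failed else w_success))
        path.append(s)
        if s >= h:
            print(f"signal at procedure {i}: performance outside acceptable range")
            s = 0.0  # reset after review/intervention
    return path

# Hypothetical sequence of trainee outcomes (1 = unsuccessful sampling).
outcomes = [0, 0, 1, 0, 1, 1, 0, 1, 1, 0, 0, 0, 1, 0, 0]
print([round(v, 2) for v in bernoulli_cusum(outcomes)])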
Shared vision promotes family firm performance.
Neff, John E
2015-01-01
A clear picture of the influential drivers of private family firm performance has proven to be an elusive target. The unique characteristics of private family owned firms necessitate a broader, non-financial approach to reveal firm performance drivers. This research study sought to specify and evaluate the themes that distinguish successful family firms from less successful family firms. In addition, this study explored the possibility that these themes collectively form an effective organizational culture that improves longer-term firm performance. At an organizational level of analysis, research findings identified four significant variables: Shared Vision (PNS), Role Clarity (RCL), Confidence in Management (CON), and Professional Networking (OLN) that positively impacted family firm financial performance. Shared Vision exhibited the strongest positive influence among the significant factors. In addition, Family Functionality (APGAR), the functional integrity of the family itself, exhibited a significant supporting role. Taken together, the variables collectively represent an effective family business culture (EFBC) that positively impacted the long-term financial sustainability of family owned firms. The index of effective family business culture also exhibited potential as a predictive non-financial model of family firm performance.
Shared vision promotes family firm performance
Neff, John E.
2015-01-01
A clear picture of the influential drivers of private family firm performance has proven to be an elusive target. The unique characteristics of private family owned firms necessitate a broader, non-financial approach to reveal firm performance drivers. This research study sought to specify and evaluate the themes that distinguish successful family firms from less successful family firms. In addition, this study explored the possibility that these themes collectively form an effective organizational culture that improves longer-term firm performance. At an organizational level of analysis, research findings identified four significant variables: Shared Vision (PNS), Role Clarity (RCL), Confidence in Management (CON), and Professional Networking (OLN) that positively impacted family firm financial performance. Shared Vision exhibited the strongest positive influence among the significant factors. In addition, Family Functionality (APGAR), the functional integrity of the family itself, exhibited a significant supporting role. Taken together, the variables collectively represent an effective family business culture (EFBC) that positively impacted the long-term financial sustainability of family owned firms. The index of effective family business culture also exhibited potential as a predictive non-financial model of family firm performance. PMID:26042075
Analysis of Metallized Teflon(trademark) Film Materials Performance on Satellites
NASA Technical Reports Server (NTRS)
Pippin, H. Gary; Normand, Eugene; Wolf, Suzanne L. B.; Kamenetzky, Rachel; Kauffman, William J., Jr. (Technical Monitor)
2002-01-01
Laboratory and on-orbit performance data for two common thermal control materials, silver- and aluminum-backed (metallized) fluorinated ethylene propylene (FEP), were collected from a variety of sources and analyzed. This paper demonstrates that the change in solar absorptance, alpha, is a strong function of particulate radiation for these materials. Examination of additional data shows that the atomic oxygen recession rate is a strong function of solar exposure, with an induction period of between 25 and 50 equivalent solar hours. The relationships determined in this analysis were incorporated into an electronic knowledge base, the 'Spacecraft Materials Selector,' under NASA contract NAS8-98213.
Richard, Joshua; Galloway, Jack; Fensin, Michael; ...
2015-04-04
A novel object-oriented modular mapping methodology for externally coupled neutronics–thermal hydraulics multiphysics simulations was developed. The Simulator using MCNP with Integrated Thermal-Hydraulics for Exploratory Reactor Studies (SMITHERS) code performs on-the-fly mapping of material-wise power distribution tallies implemented by MCNP-based neutron transport/depletion solvers for use in estimating coolant temperature and density distributions with a separate thermal-hydraulic solver. The key development of SMITHERS is that it reconstructs the hierarchical geometry structure of the material-wise power generation tallies from the depletion solver automatically, with only a modicum of additional information required from the user. In addition, it performs the basis mapping from the combinatorial geometry of the depletion solver to the required geometry of the thermal-hydraulic solver in a generalizable manner, such that it can transparently accommodate varying levels of thermal-hydraulic solver geometric fidelity, from the nodal geometry of multi-channel analysis solvers to the pin-cell level of discretization for sub-channel analysis solvers.
Light scattering methods to test inorganic PCMs for application in buildings
NASA Astrophysics Data System (ADS)
De Paola, M. G.; Calabrò, V.; De Simone, M.
2017-10-01
Thermal performance and stability over time are key parameters for the characterization and application of PCMs in the building sector. Generally, inorganic PCMs are dispersions of hydrated salts and additives in water that counteract phase segregation phenomena and subcooling. Traditional or in-house methods can be used for evaluating thermal properties, while stability can be estimated over time by using optical techniques. By considering this double approach, in this work thermal and structural analyses of Glauber salt based composite PCMs are conducted by means of non-conventional equipment: the T-history method (thermal analysis) and Turbiscan (stability analysis). Three samples with the same composition (Glauber salt with additives) were prepared using different sonication times, and their thermal performances were compared by testing both the thermal cycling and the thermal properties. The stability of the mixtures was verified by the identification of destabilization phenomena, the evaluation of the migration velocities of particles and the estimation of the variation of particle size.
Aerocapture Systems Analysis for a Titan Mission
NASA Technical Reports Server (NTRS)
Lockwood, Mary K.; Queen, Eric M.; Way, David W.; Powell, Richard W.; Edquist, Karl; Starr, Brett W.; Hollis, Brian R.; Zoby, E. Vincent; Hrinda, Glenn A.; Bailey, Robert W.
2006-01-01
Performance projections for aerocapture show a vehicle mass savings of between 40 and 80%, dependent on destination, for an aerocapture vehicle compared to an all-propulsive chemical vehicle. In addition aerocapture is applicable to multiple planetary exploration destinations of interest to NASA. The 2001 NASA In-Space Propulsion Program (ISP) technology prioritization effort identified aerocapture as one of the top three propulsion technologies for solar system exploration missions. An additional finding was that aerocapture needed a better system definition and that supporting technology gaps needed to be identified. Consequently, the ISP program sponsored an aerocapture systems analysis effort that was completed in 2002. The focus of the effort was on aerocapture at Titan with a rigid aeroshell system. Titan was selected as the initial destination for the study due to potential interest in a follow-on mission to Cassini/Huygens. Aerocapture is feasible, and the performance is adequate, for the Titan mission and it can deliver 2.4 times more mass to Titan than an all-propulsive system for the same launch vehicle.
Di Segni, Mattia; de Soccio, Valeria; Cantisani, Vito; Bonito, Giacomo; Rubini, Antonello; Di Segni, Gabriele; Lamorte, Sveva; Magri, Valentina; De Vito, Corrado; Migliara, Giuseppe; Bartolotta, Tommaso Vincenzo; Metere, Alessio; Giacomelli, Laura; de Felice, Carlo; D'Ambrosio, Ferdinando
2018-06-01
To assess the diagnostic performance and the potential as a teaching tool of S-detect in the assessment of focal breast lesions. 61 patients (age 21-84 years) with benign breast lesions in follow-up or candidates for pathological sampling, or with suspicious lesions who were candidates for biopsy, were enrolled. The study comprised a prospective and a retrospective phase. In the prospective phase, after completion of baseline US by an experienced breast radiologist and S-detect assessment, 5 operators with different experience and dedication to breast radiology performed elastographic exams. In the retrospective phase, the 5 operators performed a retrospective assessment and categorized lesions with the BI-RADS 2013 lexicon. Integration of S-detect into the in-training operators' evaluations was performed by giving priority to the S-detect analysis in case of disagreement. 2 × 2 contingency tables and ROC analysis were used to assess the diagnostic performances; inter-rater agreement was measured with Cohen's k; Bonferroni's test was used to compare performances. A significance threshold of p = 0.05 was adopted. All operators showed sensitivity > 90% and varying specificity (50-75%); S-detect showed sensitivity > 90% and specificity of 70.8%, with inter-rater agreement ranging from moderate to good. Lower specificities were improved by the addition of S-detect. The addition of elastography did not lead to any improvement of the diagnostic performance. S-detect is a feasible tool for the characterization of breast lesions; it has potential as a teaching tool for the less experienced operators.
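The diagnostic indices reported above reduce to simple 2 × 2 arithmetic. The Python sketch below computes sensitivity, specificity, and Cohen's kappa between two raters; all counts are hypothetical, not the study's data.

def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from true/false positives and negatives."""
    return tp / (tp + fn), tn / (tn + fp)

def cohens_kappa(a, b, c, d):
    """2 x 2 agreement table between two raters:
    a = both positive, b = rater1 +/rater2 -, c = rater1 -/rater2 +, d = both negative."""
    n = a + b + c + d
    po = (a + d) / n                                     # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2  # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical reading of lesions against pathology / follow-up.
sensitivity, specificity = sens_spec(tp=33, fn=3, tn=15, fp=6)
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
print(f"kappa = {cohens_kappa(a=30, b=5, c=4, d=22):.2f}")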
1985-09-01
34 " develop a more accurate concept of human behavior. In addition, students learn how to improve their abilities to 22 [ .". ., , . . * lead, follow...contains four volumes with 36 lessons. This block defines the arena where professional Air Force officers operate. In addition, students learn to... learned in unit A, to perform limited position classification casework, and to * -write evaluation reports. Students may either enroll in Unit A only, or in
Bloemen, A; van Dooren, P; Huizinga, B F; Hoofwijk, A G M
2012-02-01
Incisional hernia is a frequent complication of abdominal surgery (incidence 2-20%). Diagnosis by physical examination is sometimes difficult, especially in small incisional hernias or in obese patients. The additional diagnostic value of standardized ultrasonography was evaluated in this prospective study. A total of 456 patients participating in a randomized trial comparing two suture materials for closure of the abdominal fascia underwent physical examination and ultrasonography at 6-month intervals. Wound complaints and treatment of incisional hernia were also noted. Statistical analysis was performed using the Chi-squared and Fisher's exact tests (SPSS). Inter-test variability analysis was performed. During a median follow-up of 31 months, 103 incisional hernias were found. A total of 82 incisional hernias were found by physical examination and an additional 21 with ultrasonography. Six of these additional hernias were symptomatic and only one of the additional hernias received operative treatment. The false-negative rates for physical examination and ultrasonography were 25.3 and 24.4%, respectively. Inter-test variability was low, with a Kappa of 0.697 (P < 0.001). There are no clear diagnostic criteria for incisional hernia available in the literature. Standardized combination of ultrasonography with physical examination during follow-up yields a significant number of, mostly asymptomatic, hernias, which would not be found using physical examination alone. This is especially relevant in research settings.
Development and validation of an integrated DNA walking strategy to detect GMO expressing cry genes.
Fraiture, Marie-Alice; Vandamme, Julie; Herman, Philippe; Roosens, Nancy H C
2018-06-27
Recently, an integrated DNA walking strategy has been proposed to prove the presence of GMOs via the characterisation of sequences of interest, including their transgene flanking regions and the unnatural associations of elements in their transgenic cassettes. To this end, the p35S, tNOS and t35S pCAMBIA elements have been selected as key targets, allowing the coverage of most GMOs, EU authorized or not. In the present study, a bidirectional DNA walking method anchored on the CryAb/c genes is proposed, with the aim of covering additional GMOs and additional sequences of interest. The performance of the proposed bidirectional DNA walking method anchored on the CryAb/c genes was first evaluated for its feasibility using several GM events possessing these CryAb/c genes. Afterwards, its sensitivity was investigated at low target concentrations (as low as 20 HGE). In addition, to illustrate its applicability, the entire workflow was tested on a sample mimicking food/feed matrices analysed in GMO routine analysis. Given the successful assessment of its performance, the present bidirectional DNA walking method anchored on the CryAb/c genes can easily be implemented in GMO routine analysis by enforcement laboratories, completing the entire DNA walking strategy by targeting an additional transgenic element frequently found in GMOs.
NASA Technical Reports Server (NTRS)
Searcy, Brittani
2017-01-01
Using virtual environments to assess complex large scale human tasks provides timely and cost effective results to evaluate designs and to reduce operational risks during assembly and integration of the Space Launch System (SLS). NASA's Marshall Space Flight Center (MSFC) uses a suite of tools to conduct integrated virtual analysis during the design phase of the SLS Program. Siemens Jack is a simulation tool that allows engineers to analyze human interaction with CAD designs by placing a digital human model into the environment to test different scenarios and assess the design's compliance to human factors requirements. Engineers at MSFC are using Jack in conjunction with motion capture and virtual reality systems in MSFC's Virtual Environments Lab (VEL). The VEL provides additional capability beyond standalone Jack to record and analyze a person performing a planned task to assemble the SLS at Kennedy Space Center (KSC). The VEL integrates the Vicon Blade motion capture system, Siemens Jack, Oculus Rift, and other virtual tools to perform human factors assessments. By using motion capture and virtual reality, a more accurate breakdown and understanding of how an operator will perform a task can be gained. Through virtual analysis, engineers are able to determine if a specific task can be safely performed by both a 5th percentile (approx. 5 ft) female and a 95th percentile (approx. 6 ft 1 in) male. In addition, the analysis will help identify any tools or other accommodations that may help complete the task. These assessments are critical for the safety of ground support engineers and for keeping launch operations on schedule. Motion capture allows engineers to save and examine human movements on a frame by frame basis, while virtual reality gives the actor (person performing a task in the VEL) an immersive view of the task environment. This presentation will discuss the need for human factors analysis for SLS and the benefits of analyzing tasks in NASA MSFC's VEL.
Improved score statistics for meta-analysis in single-variant and gene-level association studies.
Yang, Jingjing; Chen, Sai; Abecasis, Gonçalo
2018-06-01
Meta-analysis is now an essential tool for genetic association studies, allowing them to combine large studies and greatly accelerating the pace of genetic discovery. Although the standard meta-analysis methods perform equivalently to the more cumbersome joint analysis under ideal settings, they result in substantial power loss under unbalanced settings with various case-control ratios. Here, we investigate the power loss problem of the standard meta-analysis methods for unbalanced studies, and further propose novel meta-analysis methods performing equivalently to the joint analysis under both balanced and unbalanced settings. We derive improved meta-score-statistics that can accurately approximate the joint-score-statistics with combined individual-level data, for both linear and logistic regression models, with and without covariates. In addition, we propose a novel approach to adjust for population stratification by correcting for known population structures through minor allele frequencies. In the simulated gene-level association studies under unbalanced settings, our method recovered up to 85% of the power loss caused by the standard methods. We further showed the power gain of our methods in gene-level tests with 26 unbalanced studies of age-related macular degeneration. In addition, we took the meta-analysis of three unbalanced studies of type 2 diabetes as an example to discuss the challenges of meta-analyzing multi-ethnic samples. In summary, our improved meta-score-statistics with corrections for population stratification can be used to construct both single-variant and gene-level association studies, providing a useful framework for ensuring well-powered, convenient, cross-study analyses. © 2018 WILEY PERIODICALS, INC.
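As background for the score-based framework discussed above, the Python sketch below combines per-study score statistics U_i and their variances V_i into a single single-variant chi-square test. This is the standard form, not the authors' improved statistics, and the study values are hypothetical.

import math

# Hypothetical per-study score statistics (U) and their variances (V) for one variant.
studies = [(12.4, 55.0), (-3.1, 80.5), (20.7, 64.2)]

u_total = sum(u for u, _ in studies)
v_total = sum(v for _, v in studies)

# Combined score test: T = (sum U)^2 / sum V, compared to chi-square with 1 df.
t = u_total ** 2 / v_total
p = math.erfc(math.sqrt(t / 2.0))  # chi-square(1 df) tail via the normal tail
print(f"T = {t:.2f}, p = {p:.3g}")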
Interdependencies and Causalities in Coupled Financial Networks
Vodenska, Irena; Aoyama, Hideaki; Fujiwara, Yoshi; Iyetomi, Hiroshi; Arai, Yuta
2016-01-01
We explore the foreign exchange and stock market networks for 48 countries from 1999 to 2012 and propose a model, based on complex Hilbert principal component analysis, for extracting significant lead-lag relationships between these markets. The global set of countries, including large and small countries in Europe, the Americas, Asia, and the Middle East, is contrasted with the limited scopes of targets, e.g., G5, G7 or the emerging Asian countries, adopted by previous works. We construct a coupled synchronization network, perform community analysis, and identify formation of four distinct network communities that are relatively stable over time. In addition to investigating the entire period, we divide the time period into “mild crisis” (1999–2002), “calm” (2003–2006) and “severe crisis” (2007–2012) sub-periods and find that the severe crisis period behavior dominates the dynamics in the foreign exchange-equity synchronization network. We observe that in general the foreign exchange market has predictive power for the global stock market performances. In addition, the United States, German and Mexican markets have forecasting power for the performances of other global equity markets. PMID:26977806
2011-01-01
Background Addition of sugar syrups to the basic wort is a popular technique to achieve higher gravity in beer fermentations, but it results in dilution of the free amino nitrogen (FAN) content in the medium. The multicomponent protease enzyme Flavourzyme has a beneficial effect on the brewer's yeast fermentation performance during high gravity fermentations, as it increases the initial FAN value and results in higher FAN uptake, higher specific growth rate, higher ethanol yield and an improved flavour profile. Results In the present study, transcriptome and metabolome analysis were used to elucidate the effect of the addition of the multicomponent protease enzyme Flavourzyme and its influence on the metabolism of the brewer's yeast strain Weihenstephan 34/70. The study underlines the importance of sufficient nitrogen availability during the course of beer fermentation. The applied metabolome and transcriptome analysis allowed mapping the effect of the wort sugar composition on nitrogen uptake. Conclusion Both the transcriptome and the metabolome analysis revealed a significantly higher impact of protease addition for maltose syrup supplemented fermentations, while addition of glucose syrup to increase the gravity in the wort resulted in increased glucose repression that led to inhibition of amino acid uptake and hereby inhibited the effect of the protease addition. PMID:21513553
NASA Astrophysics Data System (ADS)
Grimm, T.; Wiora, G.; Witt, G.
2017-03-01
Good correlations were found between three-dimensional surface analyses of laser-beam-melted parts of nickel alloy HX and their mechanical properties. The surface analyses were performed with a confocal microscope, which offers a more profound surface data basis than conventional two-dimensional tactile profilometry. This new approach yields a wide range of three-dimensional surface parameters, each of which was evaluated with respect to its feasibility for quality control in additive manufacturing. The automated surface analysis process, carried out with the confocal microscope and an industrial six-axis robot, represents an innovative approach to quality control in additive manufacturing.
Two-pole microring weight banks.
Tait, Alexander N; Wu, Allie X; Ferreira de Lima, Thomas; Nahmias, Mitchell A; Shastri, Bhavin J; Prucnal, Paul R
2018-05-15
Weighted addition is an elemental multi-input to single-output operation that can be implemented with high-performance photonic devices. Microring (MRR) weight banks bring programmable weighted addition to silicon photonics. Prior work showed that their channel limits are affected by coherent inter-channel effects that occur uniquely in weight banks. We fabricate two-pole designs that exploit this inter-channel interference in a way that is robust to dynamic tuning and fabrication variation. Scaling analysis predicts a channel count improvement of 3.4-fold, which is substantially greater than predicted by incoherent analysis used in conventional MRR devices. Advances in weight bank design expand the potential of reconfigurable analog photonic networks and multivariate microwave photonics.
A second-order all-digital phase-locked loop
NASA Technical Reports Server (NTRS)
Holmes, J. K.; Tegnelia, C. R.
1974-01-01
A simple second-order digital phase-locked loop has been designed to synchronize itself to a square-wave subcarrier. Analysis and experimental performance are given for both acquisition behavior and steady-state phase error performance. In addition, the damping factor and the noise bandwidth are derived analytically. Although all the data are given for the square-wave subcarrier case, the results are applicable to arbitrary subcarriers that are odd symmetric about their transition region.
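The Python sketch below simulates a generic second-order digital loop of the kind described: a bang-bang phase detector on a square-wave subcarrier drives a proportional-plus-integral loop filter that updates the reference phase. The gains, frequencies, and initial phase are hypothetical, not the parameters of the paper.

import numpy as np

fs = 10_000.0          # sample rate (Hz)
f_sub = 100.0          # nominal subcarrier frequency (Hz)
f_actual = 100.5       # actual incoming frequency (small offset, absorbed by the integrator)
n = 20_000

t = np.arange(n) / fs
incoming = np.sign(np.sin(2 * np.pi * f_actual * t + 0.7))   # square-wave subcarrier

k1, k2 = 0.02, 0.0005  # proportional and integral loop gains (hypothetical)
theta, integ = 0.0, 0.0
detector_log = []

for k in range(n):
    # Phase detector: incoming square wave times quadrature reference square wave.
    ref_quad = np.sign(np.cos(2 * np.pi * f_sub * t[k] + theta))
    e = incoming[k] * ref_quad
    # Second-order (proportional + integral) loop filter updates the reference phase.
    integ += k2 * e
    theta += k1 * e + integ
    detector_log.append(e)

# After acquisition the average detector output approaches zero (loop is locked).
print("mean detector output, last quarter:", np.mean(detector_log[-n // 4:]))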
Advanced Unmanned Search System (AUSS) Performance Analysis
1979-07-15
interference (from thrusters, flow noise, etc.) with sonar data; (4) Sonar range scales can be adjusted, on scene, for viewing the same contacts with... intact. The H-bomb search was performed at 2000 feet, the submarine search at 8400 feet. An additional submarine search was selected at 20,000 feet to... "Sonar Targets," by Stephen Miller, Marine Physical Laboratory, Scripps Institution of Oceanography, January 1977. Table 2. Baseline towed system
Host Genes and Resistance/Sensitivity to Military Priority Pathogens
2012-06-01
must be performed before fruitful linkage analysis can be performed with each of these parameters. We have also begun to measure the concentrations of... resistant to this pathogen). • Using parental strain we have identified at least eleven additional phenotypes that will allow fruitful linkage... affect severity of oviduct infection (Figure 2). 3.3 BXD strains colony for DoD select agents: We maintained more than 400 cages for DoD
Yang, Liqing; Du, Shuai; Sun, Yuefeng
2017-11-01
This study performed a meta-analysis to investigate the impact of additional intravenous acetaminophen on pain management after total joint arthroplasty (TJA). We conducted electronic searches of Medline (1966-2017.07), PubMed (1966-2017.07), Embase (1980-2017.07), ScienceDirect (1985-2017.07) and the Cochrane Library. Randomized controlled trials (RCTs) and non-RCTs were included. The quality assessments were performed according to the Cochrane systematic review method. The primary outcomes were postoperative pain scores and opioid consumption. Meta-analysis was performed using Stata 11.0 software. A total of four studies involving 865 participants were retrieved. The present meta-analysis indicated that there were significant differences between groups in terms of pain scores at POD 1 (WMD = -0.954, 95% CI: -1.204 to -0.703, P = 0.000), POD 2 (WMD = -1.072, 95% CI: -2.072 to -0.073, P = 0.000), and POD 3 (WMD = -0.883, 95% CI: -1.142 to -0.624, P = 0.000). Significant differences were also found regarding opioid consumption at POD 1 (WMD = -3.144, 95% CI: -4.142 to -2.146, P = 0.000), POD 2 (WMD = -5.665, 95% CI: -7.383 to -3.947, P = 0.000), and POD 3 (WMD = -3.563, 95% CI: -6.136 to -0.991, P = 0.007). Additional intravenous acetaminophen added to multimodal analgesia could significantly reduce pain and opioid consumption after total joint arthroplasty, with fewer adverse effects. Higher quality RCTs are required for further research. Copyright © 2017 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Muhsin Ithnin, Ahmad; Jazair Yahya, Wira; Baun Fletcher, Jasmine; Kadir, Hasannuddin Abd
2017-10-01
Water-in-diesel emulsion fuel (W/D) is one of the alternative fuels capable of significantly reducing the exhaust emissions of diesel engines, especially nitrogen oxides (NOx) and particulate matter (PM). However, the use of W/D emulsion fuels contributes to higher CO emissions. Supplementing a metal additive into the fuel is an alternative way to reduce CO emissions and improve performance. The present paper investigates the effect of using W/D blended with an organic-based manganese metal additive on diesel engine performance and exhaust emissions. The tests were carried out by preparing and analysing five different fuels: D2, emulsion fuel (E10: 89% D2, 10% water, 1% surfactant), E10Mn100, E10Mn150, and E10Mn200. Organic-based manganese (100 ppm, 150 ppm, 200 ppm) was used as the additive in the latter three samples. E10Mn200 achieved the maximum reduction of BSFC, up to 13.66%, and had the highest exhaust gas temperature. E10Mn150 achieved the highest reduction of CO, by 14.67%, with slightly increased NOx emissions compared to the other emulsion fuels. Organic-based manganese, which acts as a catalyst, promotes improvement of the emulsion fuel performance and reduces the harmful emissions discharged.
Effect of indium addition in U-Zr metallic fuel on lanthanide migration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Yeon Soo; Wiencek, T.; O'Hare, E.
Advanced fast reactor concepts to achieve ultra-high burnup (~50%) require prevention of fuel-cladding chemical interaction (FCCI). Fission product lanthanide accumulation at high burnup is substantial and significantly contributes to FCCI upon migration to the cladding interface. Diffusion barriers are typically used to prevent interaction of the lanthanides with the cladding. A more active method has been proposed which immobilizes the lanthanides through formation of stable compounds with an additive. Theoretical analysis showed that indium, thallium, and antimony are good candidates. Indium was the strongest candidate because of its low reactivity with iron-based cladding alloys. Characterization of the as-fabricated alloys was performed to determine the effectiveness of the indium addition in forming compounds with lanthanides, represented by cerium. Tests to examine how effectively the dopant prevents lanthanide migration under a thermal gradient were also performed. The results showed that indium effectively prevented cerium migration.
Determination of semi-volatile additives in wines using SPME and GC-MS.
Sagandykova, Gulyaim N; Alimzhanova, Mereke B; Nurzhanova, Yenglik T; Kenessov, Bulat
2017-04-01
Parameters of headspace solid-phase microextraction, such as fiber coating (85 μm CAR/PDMS), extraction time (2 min for white and 3 min for red wines), temperature (85 °C), and pre-incubation time (15 min), were optimized for identification and quantification of semi-volatile additives (propylene glycol, sorbic and benzoic acids) in wines. To overcome problems in their determination, an evaporation of the wine matrix was performed. Using the optimized method, screening of 25 wine samples was performed, and the presence of propylene glycol, sorbic and benzoic acids was found in 22, 20 and 6 samples, respectively. Analysis of different wines using a standard addition approach showed good linearity in the concentration ranges 0-250, 0-125, and 0-250 mg/L for propylene glycol, sorbic and benzoic acids, respectively. The proposed method can be recommended for quality control of wine and disclosing adulterated samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
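For illustration of the standard-addition quantification underlying the reported linearity, the Python sketch below fits spiked level versus instrument response by least squares and recovers the original concentration from the x-intercept; the responses and spike levels are hypothetical.

import numpy as np

# Hypothetical standard-addition series for one analyte (mg/L added vs peak area).
added = np.array([0.0, 25.0, 50.0, 75.0, 125.0])
response = np.array([410.0, 640.0, 880.0, 1105.0, 1570.0])

# Least-squares line; the unknown concentration is the magnitude of the x-intercept.
slope, intercept = np.polyfit(added, response, 1)
c_sample = intercept / slope
r = np.corrcoef(added, response)[0, 1]

print(f"slope = {slope:.2f}, intercept = {intercept:.1f}, r = {r:.4f}")
print(f"estimated concentration in sample = {c_sample:.1f} mg/L")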
Mess, Aylin; Vietzke, Jens-Peter; Rapp, Claudius; Francke, Wittko
2011-10-01
Tackifier resins play an important role as additives in pressure sensitive adhesives (PSAs) to modulate their desired properties. Depending on their origin and processing, tackifier resins can be multicomponent mixtures. Once they have been incorporated in a polymer matrix, conventional chemical analysis of tackifiers usually tends to be challenging because a suitable sample pretreatment and/or separation is necessary and all characteristic components have to be detected for an unequivocal identification of the resin additive. Nevertheless, a reliable analysis of tackifiers is essential for product quality and safety reasons. A promising approach for the examination of tackifier resins in PSAs is the novel direct analysis in real time mass spectrometry (DART-MS) technique, which enables screening analysis without time-consuming sample preparation. In the present work, four key classes of tackifier resins were studied (rosin, terpene phenolic, polyterpene, and hydrocarbon resins). Their corresponding complex mass spectra were interpreted and used as reference spectra for subsequent analyses. These data were used to analyze tackifier additives in synthetic rubber and acrylic adhesive matrixes. To prove the efficiency of the developed method, complete PSA products containing two or three different tackifiers were analyzed. The tackifier resins were successfully identified, while measurement and interpretation took less than 10 min per sample. Determination of resin additives in PSAs can be performed down to 0.1% (w/w, limit of detection) using the three most abundant signals for each tackifier. In summary, DART-MS is a rapid and efficient screening method for the analysis of various tackifiers in PSAs.
Schulte-Uebbing, Lena; de Vries, Wim
2018-02-01
Elevated nitrogen (N) deposition may increase net primary productivity in N-limited terrestrial ecosystems and thus enhance the terrestrial carbon (C) sink. To assess the magnitude of this N-induced C sink, we performed a meta-analysis on data from forest fertilization experiments to estimate N-induced C sequestration in aboveground tree woody biomass, a stable C pool with long turnover times. Our results show that boreal and temperate forests responded strongly to N addition and sequestered on average an additional 14 and 13 kg C per kg N in aboveground woody biomass, respectively. Tropical forests, however, did not respond significantly to N addition. The common hypothesis that tropical forests do not respond to N because they are phosphorus-limited could not be confirmed, as we found no significant response to phosphorus addition in tropical forests. Across climate zones, we found that young forests responded more strongly to N addition, which is important as many previous meta-analyses of N addition experiments rely heavily on data from experiments on seedlings and young trees. Furthermore, the C-N response (defined as additional mass unit of C sequestered per additional mass unit of N addition) was affected by forest productivity, experimental N addition rate, and rate of ambient N deposition. The estimated C-N responses from our meta-analysis were generally lower than those derived with stoichiometric scaling, dynamic global vegetation models, and forest growth inventories along N deposition gradients. We estimated N-induced global C sequestration in tree aboveground woody biomass by multiplying the C-N responses obtained from the meta-analysis with N deposition estimates per biome. We thus derived an N-induced global C sink of about 177 (112-243) Tg C/year in aboveground and belowground woody biomass, which would account for about 12% of the forest biomass C sink (1,400 Tg C/year). © 2017 John Wiley & Sons Ltd.
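The global estimate in this abstract comes from a simple scaling step: each biome's C-N response from the meta-analysis is multiplied by an N-deposition estimate for that biome and the products are summed. The sketch below illustrates only that arithmetic; the values are placeholders, not the study's numbers.

```python
# Scaling sketch: biome C-N response (kg C per kg N) times N deposition (Tg N/yr).
# All values below are assumed for illustration, not taken from the meta-analysis.
cn_response = {"boreal": 14.0, "temperate": 13.0, "tropical": 0.0}
n_deposition = {"boreal": 2.0, "temperate": 6.0, "tropical": 5.0}

c_sink = {b: cn_response[b] * n_deposition[b] for b in cn_response}
print(c_sink, "total ~%.0f Tg C/year" % sum(c_sink.values()))
```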
Authenticity analysis of pear juice employing chromatographic fingerprinting.
Willems, Jamie L; Low, Nicholas H
2014-12-03
Pear juice is predominately composed of carbohydrates/polyols (>95% of the total soluble solids), making it susceptible to adulteration by the addition of less expensive commercial sweeteners. In this research, the major carbohydrate and polyol (fructose, glucose, sucrose, and sorbitol) content of 32 pure pear juices representing five world producing regions and three years of production was determined. Additionally, methods employing oligosaccharide profiling to detect the debasing of these samples with four commercial sweeteners (HFCS 55 and 90, TIS, and HIS) were developed using capillary gas chromatography with flame ionization detection (CGC-FID) and high-performance liquid chromatography with pulsed amperometric detection (HPAE-PAD). Detection limits for the four commercial sweeteners ranged from 0.5 to 5.0% (v/v). In addition, the developed CGC-FID method could be used to (a) detect the addition of pear to apple juice via arbutin detection and (b) determine if a pear juice was produced using enzymatic liquefaction via the presence of O-β-d-glucopyranosyl-(1→4)-d-glucopyranose (cellobiose), all within a single chromatographic analysis.
Yin, Dengyang; Hu, Xunxiu; Liu, Dantong; Du, Wencheng; Wang, Haibo; Guo, Mengzhe; Tang, Daoquan
2017-06-01
Liquid chromatography coupled with mass spectrometry has been widely used in the analysis of biological targets such as amino acids, peptides, and proteins. In this work, eight common monocarboxylic acids and dicarboxylic acids (diacids) with different pKa values were investigated as additives for the analysis of amino acids. The results showed that carboxylic acid additives can improve the signal intensity of acidic amino acids such as Asp and Glu and the chromatographic separation of basic amino acids such as Arg, His, and Lys. In particular, the diacids performed better than the single acids. The proposed mechanism is that the diacids form hydrogen bonds with the amino acids, reducing their polarity/amphiprotic character. In addition, oxalic acid was found, on overall consideration, to give better enhancement than phthalic acid. Therefore, we successfully quantified 15 amino acids in a Sepia bulk pharmaceutical chemical using oxalic acid as the additive.
Initial Data Analysis Results for ATD-2 ISAS HITL Simulation
NASA Technical Reports Server (NTRS)
Lee, Hanbong
2017-01-01
To evaluate the operational procedures and information requirements for the core functional capabilities of the ATD-2 project, such as the tactical surface metering tool, the APREQ-CFR procedure, and data element exchanges between ramp and tower, human-in-the-loop (HITL) simulations were performed in March 2017. This presentation shows the initial data analysis results from the HITL simulations. Various airport performance metrics were analyzed and compared with respect to the different runway configurations and metering values in the tactical surface scheduler. These metrics include gate holding time, taxi-out time, runway throughput, queue size and wait time in queue, and TMI flight compliance. In addition to the metering value, other factors affecting airport performance in the HITL simulation, including run duration, runway changes, and TMI constraints, are also discussed.
Mars Microprobe Entry Analysis
NASA Technical Reports Server (NTRS)
Braun, Robert D.; Mitcheltree, Robert A.; Cheatwood, F. McNeil
1998-01-01
The Mars Microprobe mission will provide the first opportunity for subsurface measurements, including water detection, near the south pole of Mars. In this paper, performance of the Microprobe aeroshell design is evaluated through development of a six-degree-of-freedom (6-DOF) aerodynamic database and flight dynamics simulation. Numerous mission uncertainties are quantified and a Monte-Carlo analysis is performed to statistically assess mission performance. Results from this 6-DOF Monte-Carlo simulation demonstrate that, in a majority of the cases (approximately 2-sigma), the penetrator impact conditions are within current design tolerances. Several trajectories are identified in which the current set of impact requirements are not satisfied. From these cases, critical design parameters are highlighted and additional system requirements are suggested. In particular, a relatively large angle-of-attack range near peak heating is identified.
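The assessment above follows a standard Monte Carlo dispersion pattern: sample the uncertain inputs, propagate each draw through the 6-DOF simulation, and count how many cases satisfy the impact requirements. The sketch below shows only that loop shape with a stand-in one-line model; `simulate_entry`, the parameters, and the dispersion values are hypothetical, not the Microprobe models.

```python
# Monte Carlo dispersion loop sketch; `simulate_entry` is a hypothetical stand-in
# for a 6-DOF trajectory simulation, and all dispersions are assumed.
import numpy as np

rng = np.random.default_rng(42)

def simulate_entry(mass, cd_scale, wind):
    # returns a notional impact flight-path angle (deg) for the sampled inputs
    return -62.0 + 2.0 * (mass - 3.0) - 4.0 * (cd_scale - 1.0) + 0.1 * wind

angles = np.array([
    simulate_entry(rng.normal(3.0, 0.05),    # entry mass, kg
                   rng.normal(1.0, 0.03),    # drag-coefficient multiplier
                   rng.normal(0.0, 5.0))     # wind, m/s
    for _ in range(2000)
])

within = np.mean((angles > -75.0) & (angles < -50.0))   # assumed requirement band
print(f"fraction of cases meeting the impact-angle requirement: {within:.3f}")
```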
Importance-performance analysis as a guide for hospitals in improving their provision of services.
Whynes, D K; Reed, G
1995-11-01
As a result of the 1990 National Health Services Act, hospitals now compete with one another to win service contracts. A high level of service quality represents an important ingredient of a successful competitive strategy, yet, in general, hospitals have little external information on which to base quality decisions. Specifically, in their efforts to win contracts from fundholding general practitioners, hospitals require information on what these purchasers deem important with respect to quality, and on how these purchasers assess the quality of their current service performance. The problem is complicated by the fact that hospital service quality, in itself, is multi-dimensional. In other areas of economic activity, the information problem has been resolved by importance-performance analysis, and this paper reports the findings of such an analysis conducted for hospitals in the Trent region. The importance and performance service quality ratings of fundholders were obtained from a questionnaire survey and used in a particular variant of importance-performance analysis, which possesses certain advantages over more conventional approaches. In addition to providing empirical data on the determinants of service quality, as perceived by the purchasers of hospital services, this paper demonstrates how such information can be successfully employed in a quality enhancement strategy.
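Importance-performance analysis, as used above, typically plots each service attribute by its mean importance and mean performance rating and assigns it to one of four action quadrants. The sketch below shows that basic grid classification with assumed attribute names and ratings; it is not the variant or the survey data used in the paper.

```python
# Basic importance-performance quadrant assignment (attribute names and ratings assumed).
import numpy as np

attrs = {"waiting time": (4.6, 3.1), "communication": (4.4, 3.9),
         "parking": (3.0, 2.5), "catering": (2.8, 4.0)}   # (importance, performance)

imp = np.array([v[0] for v in attrs.values()])
perf = np.array([v[1] for v in attrs.values()])
imp_cut, perf_cut = imp.mean(), perf.mean()   # grand-mean crosshairs

for name, (i, p) in attrs.items():
    quadrant = ("concentrate here" if i >= imp_cut and p < perf_cut else
                "keep up the good work" if i >= imp_cut else
                "low priority" if p < perf_cut else "possible overkill")
    print(f"{name}: {quadrant}")
```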
Design, fabrication and skin-electrode contact analysis of polymer microneedle-based ECG electrodes
NASA Astrophysics Data System (ADS)
O'Mahony, Conor; Grygoryev, Konstantin; Ciarlone, Antonio; Giannoni, Giuseppe; Kenthao, Anan; Galvin, Paul
2016-08-01
Microneedle-based ‘dry’ electrodes have immense potential for use in diagnostic procedures such as electrocardiography (ECG) analysis, as they eliminate several of the drawbacks associated with the conventional ‘wet’ electrodes currently used for physiological signal recording. To be commercially successful in such a competitive market, it is essential that dry electrodes are manufacturable in high volumes and at low cost. In addition, the topographical nature of these emerging devices means that electrode performance is likely to be highly dependent on the quality of the skin-electrode contact. This paper presents a low-cost, wafer-level micromoulding technology for the fabrication of polymeric ECG electrodes that use microneedle structures to make a direct electrical contact to the body. The double-sided moulding process can be used to eliminate post-process via creation and wafer dicing steps. In addition, measurement techniques have been developed to characterize the skin-electrode contact force. We perform the first analysis of signal-to-noise ratio dependency on contact force, and show that although microneedle-based electrodes can outperform conventional gel electrodes, the quality of ECG recordings is significantly dependent on temporal and mechanical aspects of the skin-electrode interface.
Jiang, Ping; Lucy, Charles A
2015-10-15
Electrospray ionization mass spectrometry (ESI-MS) has significantly impacted the analysis of complex biological and petroleum samples. However ESI-MS has limited ionization efficiency for samples in low dielectric and low polarity solvents. Addition of a make-up solvent through a T union or electrospray solvent through continuous flow extractive desorption electrospray ionization (CF-EDESI) enable ionization of analytes in non-ESI friendly solvents. A conventional make-up solvent addition setup was used and a CF-EDESI source was built for ionization of nitrogen-containing standards in hexane or hexane/isopropanol. Factors affecting the performance of both sources have been investigated and optimized. Both the make-up solvent addition and CF-EDESI improve the ionization efficiency for heteroatom compounds in non-ESI friendly solvents. Make-up solvent addition provides higher ionization efficiency than CF-EDESI. Neither the make-up solvent addition nor the CF-EDESI eliminates ionization suppression of nitrogen-containing compounds caused by compounds of the same chemical class. Copyright © 2015 Elsevier B.V. All rights reserved.
Wang, Wen-Cheng; Cho, Wen-Chien; Chen, Yin-Jen
2014-01-01
It is estimated that mainland Chinese tourists travelling to Taiwan can bring annual revenues of 400 billion NTD to the Taiwan economy. Thus, how the Taiwanese Government formulates relevant measures to satisfy both sides is the focus of most concern. Taiwan must improve the facilities and service quality of its tourism industry so as to attract more mainland tourists. This paper conducted a questionnaire survey of mainland tourists and used grey relational analysis in grey mathematics to analyze the satisfaction performance of all satisfaction question items. The first eight satisfaction items were used as independent variables, and the overall satisfaction performance was used as a dependent variable for quantile regression model analysis to discuss the relationship between the dependent variable under different quantiles and independent variables. Finally, this study further discussed the predictive accuracy of the least mean regression model and each quantile regression model, as a reference for research personnel. The analysis results showed that other variables could also affect the overall satisfaction performance of mainland tourists, in addition to occupation and age. The overall predictive accuracy of quantile regression model Q0.25 was higher than that of the other three models. PMID:24574916
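The modelling step described above regresses overall satisfaction on the item-level scores at several quantiles and compares the fits with least squares. A minimal sketch of fitting one such quantile (q = 0.25) with statsmodels is shown below, using simulated ratings rather than the survey data.

```python
# Quantile regression at q = 0.25 vs. OLS on simulated satisfaction ratings.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
X = rng.uniform(1, 5, size=(200, 8))            # eight satisfaction items (synthetic)
y = X.mean(axis=1) + rng.normal(0, 0.3, 200)    # overall satisfaction (synthetic)

X_const = sm.add_constant(X)
res_q25 = sm.QuantReg(y, X_const).fit(q=0.25)
res_ols = sm.OLS(y, X_const).fit()
print("Q0.25 coefficients:", np.round(res_q25.params, 3))
print("OLS R^2:", round(res_ols.rsquared, 3))
```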
Giannenas, Ilias; Bonos, Eleftherios; Skoufos, Ioannis; Tzora, Athina; Stylianaki, Ioanna; Lazari, Diamanto; Tsinas, Anastasios; Christaki, Efterpi; Florou-Paneri, Panagiota
2018-06-06
1. This feeding trial investigated the effects of herbal feed additives on performance of broiler chickens, jejunal and caecal microbiota, jejunal morphology, and meat chemical composition and oxidative stability during refrigerated storage. 2. In a 42-day trial, 320 one-day-old broiler chickens were randomly allocated to four groups with four replicate pens each containing 20 chicks. The control group was fed maize-soybean-based diets. The diets of the other three groups were supplemented with herbal feed additives: HRB1 with Stresomix TM (0.5 g/kg feed); HRB2 with Ayucee TM (1.0 g/kg feed); HRB3 with Salcochek Pro TM (1.0 g/kg feed). The GC/MS analysis of the feed additives showed that the major components of HRB1 were β-caryophyllene (14.4%) and menthol (9.8%); of HRB2 were n-hexadecanoic acid (14.22%) and β-caryophyllene (14.4%); and of HRB3 were menthol (69.6%) and chavicol methyl ether (13.9%). 3. Intestinal samples were taken at 42 d to determine bacterial populations (total aerobe counts, Lactobacilli, and Escherichia coli) and perform gut morphology analysis. Meat samples were analysed for chemical composition and oxidative stability under storage. 4. The HRB1 group had improved (P < 0.05) body weight gain and tended to have improved (0.05 ≤ P < 0.10) feed conversion ratio, compared to the control group. Jejunum lactic acid bacteria counts were increased (P < 0.001) in groups HRB1 and HRB3, compared to the control group, whereas caecal lactic acid bacteria counts tended to increase (0.05 ≤ P < 0.10) in group HRB1, compared to the control group. Breast meat fat content tended to be lower (0.05 ≤ P < 0.10) in group HRB1. Meat oxidative stability was improved (P < 0.001) and jejunum villus height, crypt depth and goblet cell numbers were increased (P < 0.001) in all three herbal supplemented groups, compared to the control. 5. In conclusion, herbal feed additives may be able to improve both growth performance and antioxidant activity of broiler chickens, based on their phenolic compound content.
Performance deterioration based on existing (historical) data; JT9D jet engine diagnostics program
NASA Technical Reports Server (NTRS)
Sallee, G. P.
1978-01-01
The results of the collection and analysis of historical data pertaining to the deterioration of JT9D engine performance are presented. The results of analyses of prerepair and postrepair engine test stand performance data from a number of airlines to establish the individual as well as average losses in engine performance with respect to service use are included. Analysis of the changes in mechanical condition of parts, obtained by inspection of used gas-path parts of varying age, allowed preliminary assessments of component performance deterioration levels and identification of the causative factors. These component performance estimates, refined by data from special engine back-to-back testing related to module performance restoration, permitted the development of preliminary models of engine component/module performance deterioration with respect to usage. The preliminary assessment of the causes of module performance deterioration and the trends with usage are explained, along with the role each module plays in overall engine performance deterioration. Preliminary recommendations with respect to operating and maintenance practices which could be adopted to control the level of performance deterioration are presented. The needs for additional component sensitivity testing as well as outstanding issues are discussed.
NASA Astrophysics Data System (ADS)
Douin, Myriam; Guerlou-Demourgues, Liliane; Goubault, Lionel; Bernard, Patrick; Delmas, Claude
When used as a conductive additive at the positive electrode of Ni-MH batteries, the Na0.6CoO2 phase is converted during the first charge, by oxidation, into a γ-hydrated cobalt oxyhydroxide, which exhibits promising performance. The behavior of these phases was studied in specific deep discharge or low potential storage conditions, through electrochemical short-circuit experiments. The evolution of the electrodes during cycling was followed by X-ray diffraction and SEM analysis. These novel additives appear to be more efficient under these extreme conditions than the CoO or Co(OH)2 additives commonly used in industrial devices.
NASA Astrophysics Data System (ADS)
Lin, Dongguo; Kang, Tae Gon; Han, Jun Sae; Park, Seong Jin; Chung, Seong Taek; Kwon, Young-Sam
2018-02-01
Both experimental and numerical analysis of powder injection molding (PIM) of Ti-6Al-4V alloy were performed to prepare a defect-free high-performance Ti-6Al-4V part with low carbon/oxygen contents. The prepared feedstock was characterized with specific experiments to identify its viscosity, pressure-volume-temperature and thermal properties to simulate its injection molding process. A finite-element-based numerical scheme was employed to simulate the thermomechanical process during the injection molding. In addition, the injection molding, debinding, sintering and hot isostatic pressing processes were performed in sequence to prepare the PIMed parts. With optimized processing conditions, the PIMed Ti-6Al-4V part exhibits excellent physical and mechanical properties, showing a final density of 99.8%, tensile strength of 973 MPa and elongation of 16%.
Amplify-and-forward cooperative diversity for green UWB-based WBSNs.
Shaban, Heba; Abou El-Nasr, Mohamad
2013-01-01
This paper proposes a novel green cooperative diversity technique based on suboptimal template-based ultra-wideband (UWB) wireless body sensor networks (WBSNs) using amplify-and-forward (AF) relays. In addition, it analyzes the bit-error-rate (BER) performance of the proposed nodes. The analysis is based on the moment-generating function (MGF) of the total signal-to-noise ratio (SNR) at the destination. It also provides an approximate value for the total SNR. The analysis studies the performance of equally correlated binary pulse position modulation (EC-BPPM) assuming the sinusoidal and square suboptimal template pulses. Numerical results are provided for the performance evaluation of optimal and suboptimal template-based nodes with and without relay cooperation. Results show that one relay node provides ~23 dB performance enhancement at 1e - 3 BER, which mitigates the effect of the nondesirable non-line-of-sight (NLOS) links in WBSNs.
Sechopoulos, Ioannis
2013-01-01
Many important post-acquisition aspects of breast tomosynthesis imaging can impact its clinical performance. Chief among them is the reconstruction algorithm that generates the representation of the three-dimensional breast volume from the acquired projections. But even after reconstruction, additional processes, such as artifact reduction algorithms, computer aided detection and diagnosis, among others, can also impact the performance of breast tomosynthesis in the clinical realm. In this two-part paper, a review of breast tomosynthesis research is performed, with an emphasis on its medical physics aspects. In the companion paper, the first part of this review, the research performed relevant to the image acquisition process is examined. This second part will review the research on the post-acquisition aspects, including reconstruction, image processing, and analysis, as well as the advanced applications being investigated for breast tomosynthesis. PMID:23298127
Energy-Discriminative Performance of a Spectral Micro-CT System
He, Peng; Yu, Hengyong; Bennett, James; Ronaldson, Paul; Zainon, Rafidah; Butler, Anthony; Butler, Phil; Wei, Biao; Wang, Ge
2013-01-01
Experiments were performed to evaluate the energy-discriminative performance of a spectral (multi-energy) micro-CT system. The system, designed by MARS (Medipix All Resolution System) Bio-Imaging Ltd. (Christchurch, New Zealand), employs a photon-counting energy-discriminative detector technology developed by CERN (European Organization for Nuclear Research). We used the K-edge attenuation characteristic of some known materials to calibrate the detector’s photon energy discrimination. For tomographic analysis, we used the compressed sensing (CS) based ordered-subset simultaneous algebraic reconstruction techniques (OS-SART) to reconstruct sample images, which is effective to reduce noise and suppress artifacts. Unlike conventional CT, the principal component analysis (PCA) method can be applied to extract and quantify additional attenuation information from a spectral CT dataset. Our results show that the spectral CT has a good energy-discriminative performance and provides more attenuation information than the conventional CT. PMID:24004864
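As a pointer to the PCA step mentioned above: each voxel in a spectral CT reconstruction carries one attenuation value per energy bin, and PCA can compress those bins into a few dominant attenuation components. The sketch below only illustrates that decomposition on random placeholder data; it is not the MARS reconstruction pipeline.

```python
# PCA over energy bins of a (placeholder) spectral CT dataset.
import numpy as np
from sklearn.decomposition import PCA

n_voxels, n_bins = 10000, 6
spectral = np.random.default_rng(3).random((n_voxels, n_bins))  # attenuation per bin

pca = PCA(n_components=3)
scores = pca.fit_transform(spectral)        # per-voxel component scores
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
```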
Thermoelectric pump performance analysis computer code
NASA Technical Reports Server (NTRS)
Johnson, J. L.
1973-01-01
A computer program is presented that was used to analyze and design dual-throat electromagnetic dc conduction pumps for the 5-kwe ZrH reactor thermoelectric system. In addition to a listing of the code and corresponding identification of symbols, the bases for this analytical model are provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aubart, M.A.; Chandler, B.D.; Gould, R.A.T.
Platinum- and palladium-gold cluster compounds were evaluated with respect to their ability to catalyze H2-D2 equilibration. In addition, these phosphine-stabilized complexes were structurally characterized. Mechanistic studies of this reaction were performed by kinetic and spectroscopic analysis. The catalytic reaction appears to occur in three steps, which were identified through these analyses.
The X-43A Six Degree of Freedom Monte Carlo Analysis
NASA Technical Reports Server (NTRS)
Baumann, Ethan; Bahm, Catherine; Strovers, Brian; Beck, Roger
2008-01-01
This report provides an overview of the Hyper-X research vehicle Monte Carlo analysis conducted with the six-degree-of-freedom simulation. The methodology and model uncertainties used for the Monte Carlo analysis are presented as permitted. In addition, the process used to select hardware validation test cases from the Monte Carlo data is described. The preflight Monte Carlo analysis indicated that the X-43A control system was robust to the preflight uncertainties and provided the Hyper-X project an important indication that the vehicle would likely be successful in accomplishing the mission objectives. The X-43A inflight performance is compared to the preflight Monte Carlo predictions and shown to exceed the Monte Carlo bounds in several instances. Possible modeling shortfalls are presented that may account for these discrepancies. The flight control laws and guidance algorithms were robust enough as a result of the preflight Monte Carlo analysis that the unexpected in-flight performance did not have undue consequences. Modeling and Monte Carlo analysis lessons learned are presented.
The X-43A Six Degree of Freedom Monte Carlo Analysis
NASA Technical Reports Server (NTRS)
Baumann, Ethan; Bahm, Catherine; Strovers, Brian; Beck, Roger; Richard, Michael
2007-01-01
This report provides an overview of the Hyper-X research vehicle Monte Carlo analysis conducted with the six-degree-of-freedom simulation. The methodology and model uncertainties used for the Monte Carlo analysis are presented as permitted. In addition, the process used to select hardware validation test cases from the Monte Carlo data is described. The preflight Monte Carlo analysis indicated that the X-43A control system was robust to the preflight uncertainties and provided the Hyper-X project an important indication that the vehicle would likely be successful in accomplishing the mission objectives. The X-43A in-flight performance is compared to the preflight Monte Carlo predictions and shown to exceed the Monte Carlo bounds in several instances. Possible modeling shortfalls are presented that may account for these discrepancies. The flight control laws and guidance algorithms were robust enough as a result of the preflight Monte Carlo analysis that the unexpected in-flight performance did not have undue consequences. Modeling and Monte Carlo analysis lessons learned are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, In Joon; Chung, Jin Wook, E-mail: chungjw@snu.ac.kr; Yin, Yong Hu
2015-10-15
Purpose: This study was designed to analyze retrospectively the performance of cone-beam computed tomography (CBCT) hepatic arteriography in depicting tumors and their feeders and to investigate the related determining factors in chemoembolization for hepatocellular carcinoma (HCC). Methods: Eighty-six patients with 142 tumors satisfying the imaging diagnosis criteria of HCC were included in this study. The performance of CBCT hepatic arteriography for chemoembolization per tumor and per patient was evaluated using maximum intensity projection images alone (MIP analysis) or MIP combined with multiplanar reformation images (MIP + MPR analysis) regarding the following three aspects: tumor depiction, confidence of tumor feeder detection, and trackability of tumor feeders. Tumor size, tumor enhancement, tumor location, number of feeders, diaphragmatic motion, portal vein enhancement, and hepatic artery to parenchyma enhancement ratio were regarded as potential determining factors. Results: Tumors were depicted in 125 (88.0 %) and 142 tumors (100 %) on MIP and MIP + MPR analysis, respectively. Imaging performances on MIP and MIP + MPR analysis were good enough to perform subsegmental chemoembolization without additional angiographic investigation in 88 (62.0 %) and 128 tumors (90.1 %) on a per-tumor basis and in 43 (50 %) and 73 (84.9 %) on a per-patient basis, respectively. Significant determining factors for performance in MIP + MPR analysis on a per-tumor basis were tumor size (p = 0.030), tumor enhancement (p = 0.005), tumor location (p = 0.001), and diaphragmatic motion (p < 0.001). Conclusions: CBCT hepatic arteriography provided sufficient information for subsegmental chemoembolization by depicting tumors and their feeders in the vast majority of patients. Combined analysis of MIP and MPR images was essential to enhance the performance of CBCT hepatic arteriography.
Yang, Lingjing; Li, Xixia; Tong, Xiang; Fan, Hong
2015-12-01
Previous studies have shown that glutathione S-transferase P1 (GSTP1) was associated with chronic obstructive pulmonary disease (COPD). However, the association between the GSTP1 Ile (105) Val gene polymorphism and COPD remains controversial. To derive a more precise estimate, we performed a meta-analysis based on published case-control studies. An electronic search of PubMed, EMBASE, Cochrane Library, Web of Science and the China Knowledge Resource Integrated (CNKI) Database for papers on the GSTP1 Ile (105) Val gene polymorphism and COPD risk was performed. The pooled odds ratios (ORs) with 95% confidence intervals (CIs) were used to assess the strength of association in the homozygote model, heterozygote model, dominant model, recessive model and an additive model. Statistical heterogeneity, a test of publication bias and sensitivity analysis were performed. The software STATA (Version 13.0) was used for data analysis. Overall, seventeen studies with 1892 cases and 2012 controls were included in this meta-analysis. The GSTP1 Ile (105) Val polymorphism showed pooled odds ratios for the homozygote comparison (OR = 1.501, 95%CI [0.862, 2.614]), heterozygote comparison (OR = 0.924, 95%CI [0.733, 1.165]), dominant model (OR = 1.003, 95%CI [0.756, 1.331]), recessive model (OR = 1.510, 95%CI [0.934, 2.439]), and an additive model (OR = 1.072, 95%CI [0.822, 1.398]). In conclusion, the current meta-analysis, based on the most updated information, showed no significant association between the GSTP1 Ile (105) Val gene polymorphism and COPD risk in any genetic model. The results of subgroup analysis also showed no significant association between the GSTP1 Ile (105) Val gene polymorphism and COPD risk in Asian and Caucasian populations. Further studies involving large populations and careful control for age, sex, ethnicity, and cigarette smoking are greatly needed.
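For readers unfamiliar with the pooling step, a fixed-effect (inverse-variance) pooled odds ratio can be computed from per-study 2×2 tables as sketched below. The tables are invented for illustration and are not the seventeen included studies.

```python
# Inverse-variance (fixed-effect) pooled odds ratio from invented 2x2 tables.
import numpy as np

# each tuple: (cases exposed, cases unexposed, controls exposed, controls unexposed)
studies = [(30, 70, 25, 75), (45, 55, 40, 60), (20, 80, 15, 85)]

log_or, weights = [], []
for a, b, c, d in studies:
    log_or.append(np.log((a * d) / (b * c)))
    weights.append(1.0 / (1/a + 1/b + 1/c + 1/d))   # 1 / variance of log OR

pooled = np.average(log_or, weights=weights)
se = 1.0 / np.sqrt(np.sum(weights))
lo, hi = np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
print(f"pooled OR = {np.exp(pooled):.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```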
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roussel, G.
Leak-Before-Break (LBB) technology has not been applied in the first design of the seven Pressurized Water Reactors the Belgian utility is currently operating. The design basis of these plants required consideration of the dynamic effects associated with the ruptures to be postulated in the high energy piping. The application of the LBB technology to the existing plants has been recently approved by the Belgian Safety Authorities but with a limitation to the primary coolant loop. LBB analysis has been initiated for the Doel 3 and Tihange 2 plants to allow the withdrawal of some of the reactor coolant pump snubbers at both plants and not reinstall some of the restraints after steam generator replacement at Doel 3. LBB analysis was also found beneficial to demonstrate the acceptability of the primary components and piping to the new conditions resulting from power uprating and stretch-out operation. LBB analysis has been subsequently performed on the primary coolant loop of the Tihange 1 plant and is currently being performed for the Doel 4 plant. Application of the LBB to the primary coolant loop is based in Belgium on the U.S. Nuclear Regulatory Commission requirements. However the Belgian Safety Authorities required some additional analyses and put some restrictions on the benefits of the LBB analysis to maintain the global safety of the plant at a sufficient level. This paper develops the main steps of the safety evaluation performed by the Belgian Safety Authorities for accepting the application of the LBB technology to existing plants and summarizes the requirements asked for in addition to the U.S. Nuclear Regulatory Commission rules.
Counter tube window and X-ray fluorescence analyzer study
NASA Technical Reports Server (NTRS)
Hertel, R.; Holm, M.
1973-01-01
A study was performed to determine the best design tube window and X-ray fluorescence analyzer for quantitative analysis of Venusian dust and condensates. The principal objective of the project was to develop the best counter tube window geometry for the sensing element of the instrument. This included formulation of a mathematical model of the window and optimization of its parameters. The proposed detector and instrument has several important features. The instrument will perform a near real-time analysis of dust in the Venusian atmosphere, and is capable of measuring dust layers less than 1 micron thick. In addition, wide dynamic measurement range will be provided to compensate for extreme variations in count rates. An integral pulse-height analyzer and memory accumulate data and read out spectra for detail computer analysis on the ground.
Design and Analysis of a Turbopump for a Conceptual Expander Cycle Upper-Stage Engine
NASA Technical Reports Server (NTRS)
Dorney, Daniel J.; Rothermel, Jeffry; Griffin, Lisa W.; Thornton, Randall J.; Forbes, John C.; Skelly, Stephen E.; Huber, Frank W.
2006-01-01
As part of the development of technologies for rocket engines that will power spacecraft to the Moon and Mars, a program was initiated to develop a conceptual upper stage engine with wide flow range capability. The resulting expander cycle engine design employs a radial turbine to allow higher pump speeds and efficiencies. In this paper, the design and analysis of the pump section of the engine are discussed. One-dimensional meanline analyses and three-dimensional unsteady computational fluid dynamics simulations were performed for the pump stage. Configurations with both vaneless and vaned diffusers were investigated. Both the meanline analysis and computational predictions show that the pump will meet the performance objectives. Additional details describing the development of a water flow facility test are also presented.
Man-machine interface analysis of the flight design system
NASA Technical Reports Server (NTRS)
Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.
1978-01-01
The objective of the current effort was to perform a broad analysis of the human factors issues involved in the design of the Flight Design System (FDS). The analysis was intended to include characteristics of the system itself, such as: (1) basic structure and functional capabilities of FDS; (2) user backgrounds, capabilities, and possible modes of use; (3) FDS interactive dialogue, problem solving aids; (4) system data management capabilities; and to include, as well, such system related matters as: (1) flight design team structure; (2) roles of technicians; (3) user training; and (4) methods of evaluating system performance. Wherever possible, specific recommendations are made. In other cases, the issues which seem most important are identified. In some cases, additional analyses or experiments which might provide resolution are suggested.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Emery, Keith
The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.
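The GUM-style analysis referenced above ultimately combines individual standard uncertainty components in quadrature and applies a coverage factor to report an expanded uncertainty. The sketch below shows only that combination step with invented component values; it is not NREL's uncertainty budget.

```python
# GUM-style combination sketch: relative standard uncertainties (%) in quadrature,
# expanded with coverage factor k = 2. Component values are invented.
import math

components = {
    "reference cell calibration": 0.5,
    "spectral mismatch": 0.3,
    "temperature control": 0.2,
    "area measurement": 0.1,
}

u_c = math.sqrt(sum(u ** 2 for u in components.values()))
U = 2.0 * u_c
print(f"combined u_c = {u_c:.2f}%, expanded U (k=2) = {U:.2f}%")
```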
Optimum sensitivity derivatives of objective functions in nonlinear programming
NASA Technical Reports Server (NTRS)
Barthelemy, J.-F. M.; Sobieszczanski-Sobieski, J.
1983-01-01
The feasibility of eliminating second derivatives from the input of optimum sensitivity analyses of optimization problems is demonstrated. This elimination restricts the sensitivity analysis to the first-order sensitivity derivatives of the objective function. It is also shown that when a complete first-order sensitivity analysis is performed, second-order sensitivity derivatives of the objective function are available at little additional cost. An expression is derived whose application to linear programming is presented.
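The first-order result the abstract relies on is essentially the envelope theorem for constrained optima: the total derivative of the optimum objective with respect to a problem parameter equals the partial derivative of the Lagrangian, so only first derivatives are needed. A hedged restatement in our own notation (not the report's) is:

```latex
% Envelope-theorem form of the optimum sensitivity derivative (our notation):
% for  f^*(p) = \min_{x} f(x,p) \quad \text{s.t.} \quad g_j(x,p) \le 0,
\frac{\mathrm{d} f^*}{\mathrm{d} p}
  = \left.\frac{\partial f}{\partial p}\right|_{x^*}
  + \sum_j \lambda_j \left.\frac{\partial g_j}{\partial p}\right|_{x^*},
% evaluated at the optimum design x^* with Lagrange multipliers \lambda_j.
```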
Ha, Don-Hyung; Ly, Tiffany; Caron, Joseph M; Zhang, Haitao; Fritz, Kevin E; Robinson, Richard D
2015-11-18
In this work, we demonstrate a general lithium-ion battery electrode fabrication method for colloidal nanoparticles (NPs) using electrophoretic deposition (EPD). Our process is capable of forming robust electrodes from copper sulfide, manganese sulfide, and germanium NPs without the use of additives such as polymeric binders and conductive agents. After EPD, we show two postprocessing treatments ((NH4)2S and inert atmosphere heating) to effectively remove surfactant ligands and create a linked network of particles. The NP films fabricated by this simple process exhibit excellent electrochemical performance as lithium-ion battery electrodes. Additive-free Cu(2-x)S and MnS NP films show well-defined plateaus at ∼1.7 V, demonstrating potential for use as cathode electrodes. Because of the absence of additives in the NP film, this additive-free NP film is an ideal template for ex situ analyses of the particles to track particle morphology changes and deterioration as a result of Li ion cycling. To this end, we perform a size-dependent investigation of Cu(2-x)S NPs and demonstrate that there is no significant relationship between size and capacity when comparing small (3.8 nm), medium (22 nm), and large (75 nm) diameter Cu(2-x)S NPs up to 50 cycles; however, the 75 nm NPs show higher Coulombic efficiency. Ex situ TEM analysis suggests that Cu(2-x)S NPs eventually break into smaller particles (<10 nm), explaining a weak correlation between size and performance. We also report for the first time on additive-free Ge NP films, which show stable capacities for up to 50 cycles at 750 mAh/g.
Influence of the grade on the variability of the mechanical properties of polypropylene waste.
Jmal, Hamdi; Bahlouli, Nadia; Wagner-Kocher, Christiane; Leray, Dimitri; Ruch, Frédéric; Munsch, Jean-Nicolas; Nardin, Michel
2018-05-01
The properties of recycled polypropylene depend on the origin of the waste deposits and their chemical constituents. To obtain specific properties with a predefined melt flow index, polymer suppliers introduce additives and fillers. However, the addition of additives and/or fillers can strongly modify the mechanical behaviour of recycled polypropylene. To understand the impact of the additives and fillers on the quasi-static mechanical behaviour, we consider, in this study, three different recycled polypropylenes with three different melt flow indices obtained from different waste deposits. The chemical constituents of the additives and the filler contents of the recycled polypropylenes are determined through thermo-physico-chemical analysis. Tensile and bending tests performed at different strain rates allow identification of mechanical properties such as the elastic modulus, the yield stress, the maximum stress, and the failure mechanisms. The results obtained are compared with non-recycled polypropylene and with previous studies to explain the combined effect of the additives. Finally, a post-mortem analysis of the samples was carried out to link the obtained mechanical properties to the microstructure. Copyright © 2018 Elsevier Ltd. All rights reserved.
Saida, Tomomi; Fukushima, Wakaba; Ohfuji, Satoko; Kondo, Kyoko; Matsunaga, Ichiro; Hirota, Yoshio
2014-01-01
No previous study has performed multivariate analysis of the risk factors of fatty liver disease (FL), focusing on the effect of weight gain of ≥10 kg since the age of 20, and no analysis model exists that simultaneously evaluates body mass index (BMI) and body fat percentage (BFP) as adjustment variables. To investigate these, we collected anthropometric data from health checkups, and conducted a cross-sectional study (targeting 1851 males and 1259 females aged 30 years or over). Regardless of sex, weight gain of ≥10 kg since the age of 20 was positively associated with FL. Our stratified analysis of BFP into two categories, to evaluate the interaction between BMI and BFP in FL, indicated an approximately fivefold increase in the odds ratio in the male group with high BMI and BFP values compared to those with low BMI and BFP values, with a synergy index of 1.77 > 1. On the other hand, females demonstrated no significant additive interaction, with a synergy index of 0.49 < 1. We revealed that weight gain of ≥10 kg since the age of 20 is significantly associated with FL regardless of sex. In addition, by calculating the synergy index (S), we showed that the additive interaction between BMI and BFP in FL differs according to gender. © 2013 Journal of Gastroenterology and Hepatology Foundation and Wiley Publishing Asia Pty Ltd.
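The synergy index quoted above is the standard additive-interaction measure computed from the odds ratios for the joint exposure and for each exposure alone. The sketch below shows that calculation with placeholder odds ratios, not the study's estimates.

```python
# Rothman's synergy index S from joint and single-exposure odds ratios (placeholder values).
def synergy_index(or_both, or_a_only, or_b_only):
    return (or_both - 1.0) / ((or_a_only - 1.0) + (or_b_only - 1.0))

# hypothetical: high BMI and high BFP jointly vs. each factor alone
print(round(synergy_index(or_both=5.0, or_a_only=2.2, or_b_only=1.8), 2))  # S > 1
```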
NASA Astrophysics Data System (ADS)
Mulyati, S.; Aprilia, S.; Safiah; Syawaliah; Armando, M. A.; Mawardi, H.
2018-05-01
The effect of a polyethylene glycol (PEG) additive on the characteristics and performance of cellulose acetate ultrafiltration membranes for chromium removal has been studied using several additive concentrations in the casting solution. The concentration of the cellulose acetate polymer was 17.5%, whereas the PEG concentration was varied at 0, 2.5, 5, 7.5 and 10% by weight. Dimethyl formamide (DMF) was used as the solvent. Pure water flux, membrane morphology, functional group analysis, and molecular weight cut-off (MWCO) were investigated to characterize the prepared membranes. Membrane performance was tested against Cr(III) removal. The results confirmed that the pure water flux increased with increasing additive concentration. The maximum improvement occurred for the membrane modified with 7.5% PEG, for which the pure water flux rose from 49.5 L/m2.h to 62.2 L/m2.h. The addition of PEG improved the membrane flux because PEG acts as a pore-forming agent. The membrane with 7.5% PEG showed a Cr(III) rejection of 31.89%, which is lower than that of the pure CA membrane, whose Cr(III) rejection was 35.72%.
Tuan, Nguyen Ngoc; Chang, Yi-Chia; Yu, Chang-Ping; Huang, Shir-Ly
2014-01-01
In this study, the first survey of the microbial community in a thermophilic anaerobic digester using swine manure as the sole feedstock was performed using multiple approaches, including denaturing gradient gel electrophoresis (DGGE), clone library and pyrosequencing techniques. The integrated analysis of 21 DGGE bands, 126 clones and 8506 pyrosequencing read sequences revealed that Clostridia from the phylum Firmicutes accounted for the most dominant Bacteria. In addition, our analysis identified additional taxa that were missed by previous studies, including members of the bacterial phyla Synergistetes, Planctomycetes, Armatimonadetes, Chloroflexi and Nitrospira, which might also play a role in the thermophilic anaerobic digester. Most archaeal 16S rRNA sequences could be assigned to the order Methanobacteriales instead of Methanomicrobiales, in contrast to previous studies. In addition, this study is the first to report a member of the genus Methanothermobacter in a thermophilic anaerobic digester. Copyright © 2014 Elsevier GmbH. All rights reserved.
Arukalam, Innocent O; Oguzie, Emeka E; Li, Ying
2016-12-15
Perfluorodecyltrichlorosilane-based poly(dimethylsiloxane)-ZnO (FDTS-based PDMS-ZnO) nanocomposite coating with anti-corrosion and anti-fouling capabilities has been prepared using a one-step fabrication technique. XPS analysis and contact angle measurements showed the fluorine content to increase, while the hydrophobicity of the coatings decreased with addition of FDTS. XRD analysis revealed the existence of ZnO nanoparticles with dimensions ranging from 11.45 to 93.01 nm on the surface of the coatings, with the mean particle size decreasing with FDTS addition, as confirmed by SEM and TEM observations. Interestingly, the anti-corrosion performance and mechanical properties of the coatings increased remarkably on addition of FDTS. Indeed, the observed low adhesion strength, surface energies and the outstanding anti-corrosive properties imply that the obtained coating would be useful in anti-fouling applications. Copyright © 2016 Elsevier Inc. All rights reserved.
Positioning performance analysis of the time sum of arrival algorithm with error features
NASA Astrophysics Data System (ADS)
Gong, Feng-xun; Ma, Yan-qiu
2018-03-01
The theoretical positioning accuracy of multilateration (MLAT) with the time difference of arrival (TDOA) algorithm is very high. However, there are some problems in practical applications. Here we analyze the location performance of the time sum of arrival (TSOA) algorithm in terms of the root mean square error (RMSE) and geometric dilution of precision (GDOP) in an additive white Gaussian noise (AWGN) environment. The TSOA localization model is constructed and used to present the distribution of the location ambiguity region with four base stations. The location performance analysis then starts from the four-base-station case by calculating the RMSE and GDOP variation. Subsequently, the changing patterns of TSOA location performance are shown as the location parameters, such as the number of base stations and the base station layout, are varied. In this way, the TSOA location characteristics and performance are revealed. The trends of the RMSE and GDOP demonstrate the anti-noise performance and robustness of the TSOA localization algorithm, which can be used to reduce the blind zone and the false location rate of MLAT systems.
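GDOP for a TSOA layout can be evaluated from the measurement Jacobian: each sum-of-ranges measurement contributes a row equal to the sum of the unit vectors from the two stations in the pair to the target. The 2-D sketch below shows that computation for an assumed station geometry and pairing; it is not the paper's configuration.

```python
# GDOP of a 2-D TSOA geometry (assumed stations, pairings, and target location).
import numpy as np

stations = np.array([[0., 0.], [10., 0.], [0., 10.], [10., 10.]])  # km
pairs = [(0, 1), (0, 2), (0, 3), (1, 2)]                           # station pairs
target = np.array([3., 5.])

rows = []
for i, j in pairs:
    ui = (target - stations[i]) / np.linalg.norm(target - stations[i])
    uj = (target - stations[j]) / np.linalg.norm(target - stations[j])
    rows.append(ui + uj)          # gradient of the range-sum measurement
H = np.array(rows)

gdop = np.sqrt(np.trace(np.linalg.inv(H.T @ H)))
print(f"GDOP at {target}: {gdop:.2f}")
```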
Physician performance assessment using a composite quality index.
Liu, Kaibo; Jain, Shabnam; Shi, Jianjun
2013-07-10
Assessing physician performance is important for the purposes of measuring and improving quality of service and reducing healthcare delivery costs. In recent years, physician performance scorecards have been used to provide feedback on individual measures; however, one key challenge is how to develop a composite quality index that combines multiple measures for overall physician performance evaluation. A controversy arises over establishing appropriate weights to combine indicators across multiple dimensions, and it cannot be easily resolved. In this study, we proposed a generic unsupervised learning approach to develop a single composite index for physician performance assessment by using non-negative principal component analysis. We developed a new algorithm named iterative quadratic programming to solve the numerical issue in the non-negative principal component analysis approach. We conducted real case studies to demonstrate the performance of the proposed method. We provided interpretations from both statistical and clinical perspectives to evaluate the developed composite ranking score in practice. In addition, we implemented the root cause assessment techniques to explain physician performance for improvement purposes. Copyright © 2012 John Wiley & Sons, Ltd.
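One simple way to realize the idea described above is to compute the leading principal component subject to non-negative weights: maximize w'Sw with ||w|| = 1 and w ≥ 0, then score each physician by Xw. The sketch below does this with a generic constrained optimizer on synthetic data; it is a minimal stand-in, not the authors' iterative quadratic programming algorithm.

```python
# Non-negative first principal component as a composite index (synthetic data;
# generic SLSQP solver, not the authors' iterative quadratic programming).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 6))                  # physicians x quality indicators
X = (X - X.mean(axis=0)) / X.std(axis=0)
S = np.cov(X, rowvar=False)

neg_var = lambda w: -(w @ S @ w)               # maximize explained variance
cons = [{"type": "eq", "fun": lambda w: w @ w - 1.0}]
w0 = np.full(S.shape[1], 1.0 / np.sqrt(S.shape[1]))
res = minimize(neg_var, w0, bounds=[(0.0, None)] * S.shape[1],
               constraints=cons, method="SLSQP")

weights = res.x / np.linalg.norm(res.x)
composite = X @ weights                        # one composite score per physician
print(np.round(weights, 3))
```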
Analysis of line structure in handwritten documents using the Hough transform
NASA Astrophysics Data System (ADS)
Ball, Gregory R.; Kasiviswanathan, Harish; Srihari, Sargur N.; Narayanan, Aswin
2010-01-01
In the analysis of handwriting in documents a central task is that of determining line structure of the text, e.g., number of text lines, location of their starting and end-points, line-width, etc. While simple methods can handle ideal images, real world documents have complexities such as overlapping line structure, variable line spacing, line skew, document skew, noisy or degraded images etc. This paper explores the application of the Hough transform method to handwritten documents with the goal of automatically determining global document line structure in a top-down manner which can then be used in conjunction with a bottom-up method such as connected component analysis. The performance is significantly better than other top-down methods, such as the projection profile method. In addition, we evaluate the performance of skew analysis by the Hough transform on handwritten documents.
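A top-down pass of the kind described above can be prototyped with an off-the-shelf probabilistic Hough transform on a binarized page image. The sketch below uses OpenCV with assumed thresholds and an assumed input file; it only estimates candidate line segments and a rough global skew, not the paper's full pipeline.

```python
# Probabilistic Hough transform on a binarized handwritten page (thresholds assumed).
import cv2
import numpy as np

img = cv2.imread("page.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input image
binary = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                               cv2.THRESH_BINARY_INV, 25, 15)

lines = cv2.HoughLinesP(binary, rho=1, theta=np.pi / 180, threshold=120,
                        minLineLength=img.shape[1] // 3, maxLineGap=20)

if lines is not None:
    angles = [np.degrees(np.arctan2(y2 - y1, x2 - x1))
              for x1, y1, x2, y2 in lines[:, 0]]
    print(f"{len(lines)} candidate segments, median skew ~{np.median(angles):.1f} deg")
```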
Canonical and symplectic analysis for three dimensional gravity without dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Escalante, Alberto, E-mail: aescalan@ifuap.buap.mx; Osmart Ochoa-Gutiérrez, H.
2017-03-15
In this paper a detailed Hamiltonian analysis of three-dimensional gravity without dynamics proposed by V. Hussain is performed. We report the complete structure of the constraints and the Dirac brackets are explicitly computed. In addition, the Faddeev–Jackiw symplectic approach is developed; we report the complete set of Faddeev–Jackiw constraints and the generalized brackets, then we show that the Dirac and the generalized Faddeev–Jackiw brackets coincide with each other. Finally, the similarities and advantages between the Faddeev–Jackiw and Dirac formalisms are briefly discussed. - Highlights: • We report the symplectic analysis for three dimensional gravity without dynamics. • We report the Faddeev–Jackiw constraints. • A pure Dirac’s analysis is performed. • The complete structure of Dirac’s constraints is reported. • We show that symplectic and Dirac’s brackets coincide with each other.
Kosjek, Tina; Negreira, Noelia; Heath, Ester; López de Alda, Miren; Barceló, Damià
2018-01-01
This study aims to identify (bio)transformation products of vincristine, a plant alkaloid chemotherapy drug. A batch biotransformation experiment was set-up using activated sludge at two concentration levels with and without the addition of a carbon source. Sample analysis was performed on an ultra-high performance liquid chromatograph coupled to a high-resolution hybrid quadrupole-Orbitrap tandem mass spectrometer. To identify molecular ions of vincristine transformation products and to propose molecular and chemical structures, we performed data-dependent acquisition experiments combining full-scan mass spectrometry data with product ion spectra. In addition, the use of non-commercial detection and prediction algorithms such as MZmine 2 and EAWAG-BBD Pathway Prediction System, was proven to be proficient for screening for transformation products in complex wastewater matrix total ion chromatograms. In this study eleven vincristine transformation products were detected, nine of which were tentatively identified. Copyright © 2017 Elsevier B.V. All rights reserved.
Digital PCR Improves Mutation Analysis in Pancreas Fine Needle Aspiration Biopsy Specimens.
Sho, Shonan; Court, Colin M; Kim, Stephen; Braxton, David R; Hou, Shuang; Muthusamy, V Raman; Watson, Rabindra R; Sedarat, Alireza; Tseng, Hsian-Rong; Tomlinson, James S
2017-01-01
Applications of precision oncology strategies rely on accurate tumor genotyping from clinically available specimens. Fine needle aspirations (FNA) are frequently obtained in cancer management and often represent the only source of tumor tissues for patients with metastatic or locally advanced diseases. However, FNAs obtained from pancreas ductal adenocarcinoma (PDAC) are often limited in cellularity and/or tumor cell purity, precluding accurate tumor genotyping in many cases. Digital PCR (dPCR) is a technology with exceptional sensitivity and low DNA template requirement, characteristics that are necessary for analyzing PDAC FNA samples. In the current study, we sought to evaluate dPCR as a mutation analysis tool for pancreas FNA specimens. To this end, we analyzed alterations in the KRAS gene in pancreas FNAs using dPCR. The sensitivity of dPCR mutation analysis was first determined using serial dilution cell spiking studies. Single-cell laser-microdissection (LMD) was then utilized to identify the minimal number of tumor cells needed for mutation detection. Lastly, dPCR mutation analysis was performed on 44 pancreas FNAs (34 formalin-fixed paraffin-embedded (FFPE) and 10 fresh (non-fixed)), including samples highly limited in cellularity (100 cells) and tumor cell purity (1%). We found dPCR to detect mutations with allele frequencies as low as 0.17%. Additionally, a single tumor cell could be detected within an abundance of normal cells. Using clinical FNA samples, dPCR mutation analysis was successful in all preoperative FNA biopsies tested, and its accuracy was confirmed via comparison with resected tumor specimens. Moreover, dPCR revealed additional KRAS mutations representing minor subclones within a tumor that were not detected by the current clinical gold standard method of Sanger sequencing. In conclusion, dPCR performs sensitive and accurate mutation analysis in pancreas FNAs, detecting not only the dominant mutation subtype, but also the additional rare mutation subtypes representing tumor heterogeneity.
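The allele-frequency figures above come from the usual digital PCR arithmetic: positive-partition counts are converted to mean copies per partition with a Poisson correction, and the mutant fraction is the ratio of mutant to total copies. The sketch below shows that calculation with invented partition counts.

```python
# Digital PCR mutant allele fraction from partition counts (invented counts).
import math

def copies_per_partition(positive, total):
    # Poisson correction: mean copies per partition from the fraction of negatives
    return -math.log(1.0 - positive / total)

total_partitions = 15000
mutant_positive, wildtype_positive = 50, 12600   # hypothetical droplet counts

lam_mut = copies_per_partition(mutant_positive, total_partitions)
lam_wt = copies_per_partition(wildtype_positive, total_partitions)
maf = lam_mut / (lam_mut + lam_wt)
print(f"mutant allele fraction ~{100 * maf:.2f}%")
```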
Multilayer Pressure Vessel Materials Testing and Analysis Phase 2
NASA Technical Reports Server (NTRS)
Popelar, Carl F.; Cardinal, Joseph W.
2014-01-01
To provide NASA with a suite of materials strength, fracture toughness and crack growth rate test results for use in remaining life calculations for the vessels described above, Southwest Research Institute® (SwRI®) was contracted in two phases to obtain relevant material property data from a representative vessel. An initial characterization of the strength, fracture and fatigue crack growth properties was performed in Phase 1. Based on the results and recommendations of Phase 1, a more extensive material property characterization effort was developed in this Phase 2 effort. This Phase 2 characterization included additional strength, fracture and fatigue crack growth testing of the multilayer vessel and head materials. In addition, some more limited characterization of the welds and heat affected zones (HAZs) was performed. This report
Determining characteristics of artificial near-Earth objects using observability analysis
NASA Astrophysics Data System (ADS)
Friedman, Alex M.; Frueh, Carolin
2018-03-01
Observability analysis is a method for determining whether a chosen state of a system can be determined from the output or measurements. Knowledge of state information availability resulting from observability analysis leads to improved sensor tasking for observation of orbital debris and better control of active spacecraft. This research performs numerical observability analysis of artificial near-Earth objects. Analysis of linearization methods and state transition matrices is performed to determine the viability of applying linear observability methods to the nonlinear orbit problem. Furthermore, pre-whitening is implemented to reformulate classical observability analysis. In addition, the state in observability analysis is typically composed of position and velocity; however, including object characteristics beyond position and velocity can be crucial for precise orbit propagation. For example, solar radiation pressure has a significant impact on the orbit of high area-to-mass ratio objects in geosynchronous orbit. Therefore, determining the time required for solar radiation pressure parameters to become observable is important for understanding debris objects. In order to compare observability analysis results with and without measurement noise and an extended state, quantitative measures of observability are investigated and implemented.
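To make the linear-observability machinery concrete, the sketch below accumulates an observability Gramian from state transition matrices and measurement Jacobians and uses its rank and conditioning as a quantitative measure. The simple position/velocity dynamics and position-only measurement model are placeholders, not the paper's formulation for near-Earth objects.

```python
# Minimal linear-observability sketch under assumed dynamics: build the
# observability Gramian W = sum(Phi_k^T H_k^T H_k Phi_k) and inspect its
# rank and condition number as quantitative observability measures.
import numpy as np

def observability_gramian(stms, meas_jacobians):
    n = stms[0].shape[0]
    W = np.zeros((n, n))
    for Phi, H in zip(stms, meas_jacobians):
        W += Phi.T @ H.T @ H @ Phi
    return W

# Assumed example: 4-state system (2D position and velocity), position-only measurements.
dt = 10.0
A = np.block([[np.eye(2), dt * np.eye(2)],
              [np.zeros((2, 2)), np.eye(2)]])
H = np.hstack([np.eye(2), np.zeros((2, 2))])   # measure position components only

stms = [np.linalg.matrix_power(A, k) for k in range(1, 20)]
jacobians = [H] * len(stms)

W = observability_gramian(stms, jacobians)
print("rank:", np.linalg.matrix_rank(W), "| condition number:", np.linalg.cond(W))
```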
Zhao, Yong-Gang; Chen, Xiao-Hong; Yao, Shan-Shan; Pan, Sheng-Dong; Li, Xiao-Ping; Jin, Mi-Cong
2012-01-01
A reversed-phase high-performance liquid chromatography (RP-HPLC) method was developed for the simultaneous determination of nine food additives, i.e., acesulfame, saccharin, caffeine, aspartame, benzoic acid, sorbic acid, stevioside, dehydroacetic acid and neotame in red wine. The effects of ion-suppressors, i.e., trifluoroacetic acid (TFA) and ammonium acetate (AmAc) on retention behavior of nine food additives in RP-HPLC separation were discussed in detail. The relationships between retention factors of solutes and volume percent of ion-suppressors in the mobile-phase systems of acetonitrile-TFA aqueous solution and acetonitrile-TFA-AmAc aqueous solution were quantitatively established, respectively. The results showed that the ion suppressors had not only an ion suppression effect, but also an organic modification effect on the acidic analytes. The baseline separation of nine food additives was completed by a gradient elution with acetonitrile-TFA(0.01%, v/v)-AmAc(2.5 mmol L(-1)) aqueous solution as the mobile phase. The recoveries were between 80.2 - 99.5% for all analytes with RSDs in the range of 1.5 - 8.9%. The linearities were in the range of 0.2 - 100.0 mg L(-1) with determination coefficients (r(2)) higher than 0.9991 for all analytes. The limits of quantification (LOQs) were between 0.53 - 0.99 mg L(-1). The applicability of the proposed method to detect and quantify food additives has been demonstrated in the analysis of 30 real samples.
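For reference, the retention factor referred to above has the standard chromatographic definition, and relationships of the kind established in the study are often expressed as a fitted dependence of ln k on the ion-suppressor content. The linear form and the symbols a, b, and φ below are an illustrative assumption, not the equations reported in the paper.

```latex
% Standard retention-factor definition (t_R: analyte retention time, t_0: column
% dead time) and an assumed linear dependence of ln k on ion-suppressor content
% \varphi with fitted constants a and b, shown only as an illustration.
k = \frac{t_R - t_0}{t_0}, \qquad \ln k = a + b\,\varphi
```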
Mendes, Paula; Nunes, Luis Miguel; Teixeira, Margarida Ribau
2014-09-01
This article demonstrates how decision-makers can be guided in the process of defining performance target values in the balanced scorecard system. We apply a method based on sensitivity analysis with Monte Carlo simulation to the municipal solid waste management system in Loulé Municipality (Portugal). The method includes two steps: sensitivity analysis of performance indicators to identify those performance indicators with the highest impact on the balanced scorecard model outcomes; and sensitivity analysis of the target values for the previously identified performance indicators. Sensitivity analysis shows that four strategic objectives (IPP1: Comply with the national waste strategy; IPP4: Reduce nonrenewable resources and greenhouse gases; IPP5: Optimize the life-cycle of waste; and FP1: Meet and optimize the budget) alone contribute 99.7% of the variability in overall balanced scorecard value. Thus, these strategic objectives had a much stronger impact on the estimated balanced scorecard outcome than did others, with the IPP1 and the IPP4 accounting for over 55% and 22% of the variance in overall balanced scorecard value, respectively. The remaining performance indicators contribute only marginally. In addition, a change in the value of a single indicator's target value made the overall balanced scorecard value change by as much as 18%. This may lead to involuntarily biased decisions by organizations regarding performance target-setting, if not prevented with the help of methods such as that proposed and applied in this study. © The Author(s) 2014.
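To illustrate the kind of variance attribution reported above, the sketch below runs a small Monte Carlo sensitivity analysis on a weighted-sum scorecard score, approximating each indicator's contribution to the variance of the overall value from squared correlations. The weights, score distributions, and indicator names used as keys are assumptions for illustration, not the study's model.

```python
# Minimal sketch (assumptions throughout): Monte Carlo sensitivity analysis of a
# weighted-sum balanced scorecard score. Indicator scores are sampled from assumed
# distributions, and each indicator's contribution to overall-score variance is
# approximated by its squared correlation with the overall score.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
weights = {"IPP1": 0.35, "IPP4": 0.25, "IPP5": 0.20, "FP1": 0.20}   # assumed weights

samples = {name: rng.triangular(0.4, 0.7, 1.0, n) for name in weights}  # assumed scores
overall = sum(w * samples[name] for name, w in weights.items())

for name in weights:
    r = np.corrcoef(samples[name], overall)[0, 1]
    print(f"{name}: approx. contribution to variance = {r**2:.1%}")
```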
Approach: Per suggestion made by CASAC AMMS members during the April 3, 2014 conference call on the Review of Federal Reference Method for Ozone: Nitric Oxide-Chemiluminescence, ORD has performed additional data analysis activities to explain and mitigate scatter observed in the co...
Encoding Orientation and the Remembering of Schizophrenic Young Adults
ERIC Educational Resources Information Center
Koh, Soon D.; Peterson, Rolf A.
1978-01-01
This research examines different types of encoding strategies, in addition to semantic and organizational encodings, and their effects on schizophrenics' remembering. Based on Craik and Lockhart (1972), i.e., memory performance is a function of depth of encoding processing, this analysis compares schizophrenics' encoding processing with that of…
DOT National Transportation Integrated Search
1999-10-01
The objective of this four-year research effort is to develop and test a methodology to estimate the economic impacts of median design. This report summarizes the activities performed in the third year of this project. The primary task in the third y...
An evaluation was performed of the International Waste Technologies (IWT) HWT-20 additive and the Geo-Con, Inc. deep-soil-mixing equipment for an in situ stabilization/solidification process and its applicability as an on-site treatment method for waste site cleanup. The analysis...
Impact of sentinel lymph node biopsy on immediate breast reconstruction after mastectomy.
Wood, Benjamin C; David, Lisa R; Defranzo, Anthony J; Stewart, John H; Shen, Perry; Geisinger, Kim R; Marks, Malcolm W; Levine, Edward A
2009-07-01
Traditionally, sentinel lymph node biopsy (SLNB) is performed at the time of mastectomy and reconstruction. However, several groups have advocated SLNB as a separate outpatient procedure before mastectomy, when immediate reconstruction is planned, to allow for complete pathologic evaluation. The purpose of this study was to determine the impact of intraoperative analysis of SLNB on the reconstructive plan when performed at the same time as definitive surgery. A retrospective review was conducted of all mastectomy cases performed at a single institution between September 1998 and November 2007. Of the 747 mastectomy cases reviewed, SLNB was conducted in 344 cases, and there was immediate breast reconstruction in 193 of those cases. There were 27 (7.8%) false negative and three (0.9%) false positive intraoperative analysis of SLNB. Touch preparation analysis from the SLNB changed the reconstructive plan in four (2.1%) cases. In our experience, SLNB can be performed at the time of mastectomy with minimal impact on the reconstructive plan. A staged approach incurs significant additional expense, increases the delay in initiation of systemic therapy and the propensity of procedure-related morbidity; therefore, SLNB should not be performed as a separate procedure before definitive surgery with immediate breast reconstruction.
Ortega, Juan Ignacio; Evangelio, Carlos; Clemente, Filipe Manuel; Martins, Fernando Manuel Lourenço; González-Víllora, Sixto
2016-06-16
The main objective was to analyze a friendly match of youth elite soccer players, identifying the variance of tactical and physiological response parameters during the game. In addition, it aimed to detect the impact of both halves on player performance. For the purposes of this study twenty-two U19 players were analyzed playing 11v11. Activity profile and heart rate (HR and HRmax), grouped in five different zones, were analyzed via Bluetooth technology, technical performance was analyzed by the Team Sport Assessment Procedure (TSAP), and tactical performance was measured by Social Network Analysis. A comparison of heart rate responses showed significant main effects in the halves (p = 0.001; ηp2 = 0.623). A comparison between tactical position and technical performance had significant main effects (p = 0.001; ηp2 = 0.390). Tactical position showed statistically significant effects on tactical prominence (p = 0.002; ηp2 = 0.296). Therefore, fatigue is a component distinguished in technical/tactical parameters, such as volume of play and efficiency index. Results suggest that fatigue effects may constrain technical performance and, for that reason, the use of instruments to monitor the fatigue effect during matches may be suggested.
Ortega, Juan Ignacio; Evangelio, Carlos; Clemente, Filipe Manuel; Martins, Fernando Manuel Lourenço; González-Víllora, Sixto
2016-01-01
The main objective was to analyze a friendly match of youth elite soccer players, identifying the variance of tactical and physiological response parameters during the game. In addition, it aimed to detect the impact of both halves on player performance. For the purposes of this study twenty-two U19 players were analyzed playing 11v11. Activity profile and heart rate (HR and HRmax), grouped in five different zones, were analyzed via Bluetooth technology, technical performance was analyzed by the Team Sport Assessment Procedure (TSAP), and tactical performance was measured by Social Network Analysis. A comparison of heart rate responses showed significant main effects in the halves (p = 0.001; ηp2 = 0.623). A comparison between tactical position and technical performance had significant main effects (p = 0.001; ηp2 = 0.390). Tactical position showed statistically significant effects on tactical prominence (p = 0.002; ηp2 = 0.296). Therefore, fatigue is a component distinguished in technical/tactical parameters, such as volume of play and efficiency index. Results suggest that fatigue effects may constrain technical performance and, for that reason, the use of instruments to monitor the fatigue effect during matches may be suggested. PMID:29910283
Information theoretic analysis of linear shift-invariant edge-detection operators
NASA Astrophysics Data System (ADS)
Jiang, Bo; Rahman, Zia-ur
2012-06-01
Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the influences by the image gathering process. However, experiments show that the image gathering process has a profound impact on the performance of digital image processing and the quality of the resulting images. Huck et al. proposed one definitive theoretic analysis of visual communication channels, where the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. We perform an end-to-end information theory based system analysis to assess linear shift-invariant edge-detection algorithms. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and the parameters, such as sampling, additive noise etc., that define the image gathering system. The edge-detection algorithm is regarded as having high performance only if the information rate from the scene to the edge image approaches its maximum possible. This goal can be achieved only by jointly optimizing all processes. Our information-theoretic assessment provides a new tool that allows us to compare different linear shift-invariant edge detectors in a common environment.
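For orientation, end-to-end assessments of this kind typically maximize a Shannon-type information rate of the form sketched below; the specific symbols (an assumed system transfer function and assumed scene-radiance and noise power spectral densities) are for illustration and are not the authors' exact expression.

```latex
% Hedged sketch of a Shannon-type information rate for the end-to-end visual
% channel; \hat{\tau}(\nu) denotes an assumed system transfer function and
% \Phi_L(\nu), \Phi_N(\nu) assumed scene-radiance and noise power spectral
% densities.
\mathcal{H} = \frac{1}{2}\int \log_2\!\left[1 + \frac{|\hat{\tau}(\nu)|^{2}\,\Phi_L(\nu)}{\Phi_N(\nu)}\right]\mathrm{d}\nu
```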
Experience with HEP analysis on mounted filesystems.
NASA Astrophysics Data System (ADS)
Fuhrmann, Patrick; Gasthuber, Martin; Kemp, Yves; Ozerov, Dmitry
2012-12-01
We present results on different approaches to mounted filesystems in use or under investigation at DESY. dCache, long established as a storage system for physics data, has implemented the NFS v4.1/pNFS protocol. New performance results will be shown with the most current version of the dCache server. In addition to the native usage of the mounted filesystem in a LAN environment, results are given for the performance of dCache NFS v4.1/pNFS in the WAN case. Several commercial vendors are currently in the alpha or beta phase of adding the NFS v4.1/pNFS protocol to their storage appliances. We will test some of these vendor solutions for their readiness for HEP analysis. DESY has recently purchased an IBM Sonas system. We will present the result of a thorough performance evaluation using the native protocols NFS (v3 or v4) and GPFS. As the emphasis is on the usability for end user analysis, we will use the latest ROOT versions and current end user analysis code for benchmark scenarios.
TDRSS telecommunications study. Phase 1: Final report
NASA Technical Reports Server (NTRS)
Cahn, C. R.; Cnossen, R. S.
1974-01-01
A parametric analysis of the telecommunications support capability of the Tracking and Data Relay Satellite System (TDRSS) was performed. Emphasis was placed on maximizing support capability provided to the user while minimizing impact on the user spacecraft. This study evaluates the present TDRSS configuration as presented in the TDRSS Definition Phase Study Report, December 1973 to determine potential changes for improving the overall performance. In addition, it provides specifications of the user transponder equipment to be used in the TDRSS.
2013-01-01
pretest and posttests (p G .05). An additional analysis was conducted to determine if there were differences in outcomes based on whether participants...would be predicted based on [Figure 1: Pretest and posttest mean performance scores for all 26 objectives of the anaphylaxis scenario. Figure 2: Pretest] ...clinical practice. Another limitation of the study is the use of a pretest/posttest design without a control group for comparison of results. Finally
Structural analysis of high-rpm composite propfan blades for a cruise missile wind tunnel model
NASA Technical Reports Server (NTRS)
Carek, David A.
1993-01-01
Analyses were performed on a high-speed composite blade set for the Department of Defense Propfan Missile Interactions Project. The final design iteration, which resulted in the CM2D-2 blade design, is described in this report. Mode shapes, integral order excitation, and stress margins were examined. In addition, geometric corrections were performed to compensate for blade deflection under operating conditions with respect to the aerodynamic design shape.
Performance of Radiant Heating Systems of Low-Energy Buildings
NASA Astrophysics Data System (ADS)
Sarbu, Ioan; Mirza, Matei; Crasmareanu, Emanuel
2017-10-01
After the introduction of plastic piping, the application of water-based radiant heating with pipes embedded in room surfaces (i.e., floors, walls, and ceilings) has significantly increased worldwide. Additionally, interest and growth in radiant heating and cooling systems have increased in recent years because they have been demonstrated to be energy efficient in comparison to all-air distribution systems. This paper briefly describes the heat distribution systems in buildings, focusing on the radiant panels (floor, wall, ceiling, and floor-ceiling). The main objective of this study is to investigate the performance of different types of low-temperature heating systems using different methods. Additionally, a comparative analysis of the energy, environmental, and economic performances of floor, wall, ceiling, and floor-ceiling heating using numerical simulation with Transient Systems Simulation (TRNSYS) software is performed. This study showed that the floor-ceiling heating system has the best performance in terms of the lowest energy consumption, operation cost, CO2 emission, and the nominal boiler power. The comparison of the room operative air temperatures and the set-point operative air temperature also indicates that all radiant panel systems provide satisfactory results without significant deviations.
Structured functional additive regression in reproducing kernel Hilbert spaces.
Zhu, Hongxiao; Yao, Fang; Zhang, Hao Helen
2014-06-01
Functional additive models (FAMs) provide a flexible yet simple framework for regressions involving functional predictors. The utilization of data-driven basis in an additive rather than linear structure naturally extends the classical functional linear model. However, the critical issue of selecting nonlinear additive components has been less studied. In this work, we propose a new regularization framework for the structure estimation in the context of Reproducing Kernel Hilbert Spaces. The proposed approach takes advantage of the functional principal components which greatly facilitates the implementation and the theoretical analysis. The selection and estimation are achieved by penalized least squares using a penalty which encourages the sparse structure of the additive components. Theoretical properties such as the rate of convergence are investigated. The empirical performance is demonstrated through simulation studies and a real data application.
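For reference, the general form of a functional additive model and a penalized least-squares criterion of the kind described above can be written as follows; the symbols are illustrative (functional principal component scores, additive components, and an assumed sparsity-encouraging penalty) rather than the paper's exact notation.

```latex
% Y_i: response; \xi_{ik}: functional principal component scores; f_k: additive
% components; \lambda J(f_k): an assumed sparsity-encouraging penalty.
Y_i = \mu + \sum_{k=1}^{K} f_k(\xi_{ik}) + \varepsilon_i,
\qquad
\min_{f_1,\dots,f_K} \sum_{i=1}^{n}\Bigl(Y_i - \mu - \sum_{k=1}^{K} f_k(\xi_{ik})\Bigr)^{2} + \lambda \sum_{k=1}^{K} J(f_k)
```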
Abdullah, Asadatun; Rehbein, Hartmut
2016-01-30
In spite of the many studies performed over the years, there are still problems in the authentication of closely related tuna species, not only for canned fish but also for raw products. With the aim of providing screening methods to identify different tuna species and related scombrids, segments of mitochondrial cytochrome b (cyt b) and nuclear parvalbumin genes were amplified and sequenced or subjected to single-strand conformation polymorphism (SSCP) and restriction fragment length polymorphism (RFLP) analyses. The nucleotide diagnostic sites in the cyt b gene of five tuna species from Indonesia were determined in this study and used to construct a phylogenetic tree. In addition, the suitability of the nuclear gene that encodes parvalbumin for the differentiation of tuna species was determined by SSCP and RFLP analyses of an intron segment. RFLP differentiated Thunnus albacares from T. obesus, and fish species in the Thunnus genus could be distinguished from bullet tuna (Auxis rochei) by SSCP. Parvalbumin-based polymerase chain reaction systems could serve as an additional tool in the detection and identification of tuna and other Scombridae fish species for routine seafood control. This reaction can be performed in addition to the cyt b analysis as previously described. © 2015 Society of Chemical Industry.
Properties and Applications of High Emissivity Composite Films Based on Far-Infrared Ceramic Powder
Xiong, Yabo; Huang, Shaoyun; Wang, Wenqi; Liu, Xinghai; Li, Houbin
2017-01-01
Polymer matrix composite materials that can emit radiation in the far-infrared region of the spectrum are receiving increasing attention due to their ability to significantly influence biological processes. This study reports on the far-infrared emissivity property of composite films based on far-infrared ceramic powder. X-ray fluorescence spectrometry, Fourier transform infrared spectroscopy, thermogravimetric analysis, and X-ray powder diffractometry were used to evaluate the physical properties of the ceramic powder. The ceramic powder was found to be rich in aluminum oxide, titanium oxide, and silicon oxide, which demonstrate high far-infrared emissivity. In addition, the micromorphology, mechanical performance, dynamic mechanical properties, and far-infrared emissivity of the composite were analyzed to evaluate their suitability for strawberry storage. The mechanical properties of the far-infrared radiation ceramic (cFIR) composite films were not significantly influenced (p ≥ 0.05) by the addition of the ceramic powder. However, the dynamic mechanical analysis (DMA) properties of the cFIR composite films, including a reduction in damping and shock absorption performance, were significantly influenced by the addition of the ceramic powder. Moreover, the cFIR composite films showed high far-infrared emissivity, which has the capability of prolonging the storage life of strawberries. This research demonstrates that cFIR composite films are promising for future applications. PMID:29186047
Properties and Applications of High Emissivity Composite Films Based on Far-Infrared Ceramic Powder.
Xiong, Yabo; Huang, Shaoyun; Wang, Wenqi; Liu, Xinghai; Li, Houbin
2017-11-29
Polymer matrix composite materials that can emit radiation in the far-infrared region of the spectrum are receiving increasing attention due to their ability to significantly influence biological processes. This study reports on the far-infrared emissivity property of composite films based on far-infrared ceramic powder. X-ray fluorescence spectrometry, Fourier transform infrared spectroscopy, thermogravimetric analysis, and X-ray powder diffractometry were used to evaluate the physical properties of the ceramic powder. The ceramic powder was found to be rich in aluminum oxide, titanium oxide, and silicon oxide, which demonstrate high far-infrared emissivity. In addition, the micromorphology, mechanical performance, dynamic mechanical properties, and far-infrared emissivity of the composite were analyzed to evaluate their suitability for strawberry storage. The mechanical properties of the far-infrared radiation ceramic (cFIR) composite films were not significantly influenced (p ≥ 0.05) by the addition of the ceramic powder. However, the dynamic mechanical analysis (DMA) properties of the cFIR composite films, including a reduction in damping and shock absorption performance, were significantly influenced by the addition of the ceramic powder. Moreover, the cFIR composite films showed high far-infrared emissivity, which has the capability of prolonging the storage life of strawberries. This research demonstrates that cFIR composite films are promising for future applications.
A CAD Approach to Integrating NDE With Finite Element
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali; Downey, James; Ghosn, Louis J.; Baaklini, George Y.
2004-01-01
Nondestructive evaluation (NDE) is one of several technologies applied at NASA Glenn Research Center to determine atypical deformities, cracks, and other anomalies experienced by structural components. NDE consists of applying high-quality imaging techniques (such as x-ray imaging and computed tomography (CT)) to discover hidden manufactured flaws in a structure. Efforts are in progress to integrate NDE with the finite element (FE) computational method to perform detailed structural analysis of a given component. This report presents the core outlines for an in-house technical procedure that incorporates this combined NDE-FE interrelation. An example is presented to demonstrate the applicability of this analytical procedure. FE analysis of a test specimen is performed, and the resulting von Mises stresses and the stress concentrations near the anomalies are observed, which indicates the fidelity of the procedure. Additional information elaborating on the steps needed to perform such an analysis is clearly presented in the form of mini step-by-step guidelines.
Setting Standards for Reporting and Quantification in Fluorescence-Guided Surgery.
Hoogstins, Charlotte; Burggraaf, Jan Jaap; Koller, Marjory; Handgraaf, Henricus; Boogerd, Leonora; van Dam, Gooitzen; Vahrmeijer, Alexander; Burggraaf, Jacobus
2018-05-29
Intraoperative fluorescence imaging (FI) is a promising technique that could potentially guide oncologic surgeons toward more radical resections and thus improve clinical outcome. Despite the increase in the number of clinical trials, fluorescent agents and imaging systems for intraoperative FI, a standardized approach for imaging system performance assessment and post-acquisition image analysis is currently unavailable. We conducted a systematic, controlled comparison between two commercially available imaging systems using a novel calibration device for FI systems and various fluorescent agents. In addition, we analyzed fluorescence images from previous studies to evaluate signal-to-background ratio (SBR) and determinants of SBR. Using the calibration device, imaging system performance could be quantified and compared, exposing relevant differences in sensitivity. Image analysis demonstrated a profound influence of background noise and the selection of the background on SBR. In this article, we suggest clear approaches for the quantification of imaging system performance assessment and post-acquisition image analysis, attempting to set new standards in the field of FI.
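For reference, the signal-to-background ratio discussed above is typically computed as the mean intensity of a tumor region of interest divided by that of a background region. The sketch below uses a synthetic image and arbitrarily placed regions purely as an illustration; real analyses would also need the background-selection safeguards the authors describe.

```python
# Minimal sketch of a signal-to-background ratio (SBR) computation on a
# synthetic fluorescence image; the image values and ROI placements are
# hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)
image = rng.normal(100, 10, size=(256, 256))   # assumed background counts
image[100:140, 100:140] += 400                 # assumed fluorescent lesion

tumor_roi = image[100:140, 100:140]
background_roi = image[10:50, 10:50]
sbr = tumor_roi.mean() / background_roi.mean()
print(f"SBR = {sbr:.2f}")
```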
Samsudin, Mohd Dinie Muhaimin; Mat Don, Mashitah
2015-01-01
Oil palm trunk (OPT) sap was utilized for growth and bioethanol production by Saccharomyces cerevisiae with the addition of palm oil mill effluent (POME) as a nutrient supplier. Maximum yield (YP/S) was attained at 0.464 g bioethanol/g glucose present in the OPT sap-POME-based media. However, OPT sap and POME are heterogeneous in properties and fermentation performance might change if the process is repeated. The contribution of parametric uncertainty to bioethanol fermentation performance was then assessed using Monte Carlo simulation (stochastic variable) to determine probability distributions due to fluctuation and variation of kinetic model parameters. Results showed that based on 100,000 samples tested, the yield (YP/S) ranged from 0.423 to 0.501 g/g. Sensitivity analysis was also done to evaluate the impact of each kinetic parameter on the fermentation performance. It was found that bioethanol fermentation depends highly on the growth of the tested yeast. Copyright © 2014 Elsevier Ltd. All rights reserved.
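To make the idea of parametric uncertainty propagation concrete, the sketch below pushes assumed kinetic-parameter distributions through a generic Monod-type batch fermentation model by Monte Carlo sampling and summarizes the resulting yield distribution. The model form, parameter distributions, and numbers are illustrative assumptions, not the authors' kinetic model or data.

```python
# Minimal sketch: Monte Carlo propagation of kinetic-parameter uncertainty through
# a simple Monod-type batch model (biomass X, substrate S, product P), then a
# summary of the resulting ethanol-yield distribution. All values are assumed.
import numpy as np
from scipy.integrate import odeint

def batch_model(y, t, mu_max, Ks, Yxs, Yps):
    X, S, P = y
    mu = mu_max * S / (Ks + S)      # Monod growth rate
    dX = mu * X
    dS = -dX / Yxs                  # substrate consumed for growth
    dP = -Yps * dS                  # growth-associated product formation
    return [dX, dS, dP]

rng = np.random.default_rng(1)
t = np.linspace(0, 48, 200)
S0 = 100.0                          # assumed initial glucose, g/L
yields = []
for _ in range(1000):
    mu_max = rng.normal(0.3, 0.03)  # assumed parameter distributions
    Ks = rng.normal(1.5, 0.2)
    Yxs = rng.normal(0.1, 0.01)
    Yps = rng.normal(0.45, 0.03)
    X, S, P = odeint(batch_model, [0.1, S0, 0.0], t,
                     args=(mu_max, Ks, Yxs, Yps)).T
    yields.append(P[-1] / (S0 - S[-1]))   # g ethanol per g glucose consumed

print(f"Yield 95% range: {np.percentile(yields, 2.5):.3f}-{np.percentile(yields, 97.5):.3f} g/g")
```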
A Self-Directed Method for Cell-Type Identification and Separation of Gene Expression Microarrays
Zuckerman, Neta S.; Noam, Yair; Goldsmith, Andrea J.; Lee, Peter P.
2013-01-01
Gene expression analysis is generally performed on heterogeneous tissue samples consisting of multiple cell types. Current methods developed to separate heterogeneous gene expression rely on prior knowledge of the cell-type composition and/or signatures - these are not available in most public datasets. We present a novel method to identify the cell-type composition, signatures and proportions per sample without need for a-priori information. The method was successfully tested on controlled and semi-controlled datasets and performed as accurately as current methods that do require additional information. As such, this method enables the analysis of cell-type specific gene expression using existing large pools of publically available microarray datasets. PMID:23990767
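As a rough illustration of signature-free separation (not the authors' algorithm), the sketch below factors a synthetic mixed-expression matrix into per-sample mixing weights and cell-type signatures using non-negative matrix factorization; the simulated data and the choice of NMF are assumptions.

```python
# Minimal sketch of unsupervised cell-type separation: factor a (samples x genes)
# expression matrix into cell-type proportions and signatures without prior
# knowledge, here via NMF as a stand-in for dedicated deconvolution methods.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
n_samples, n_genes, n_cell_types = 50, 500, 3

true_signatures = rng.gamma(2.0, 1.0, size=(n_cell_types, n_genes))
true_proportions = rng.dirichlet(np.ones(n_cell_types), size=n_samples)
noise = rng.normal(0, 0.05, (n_samples, n_genes)).clip(min=0)
expression = true_proportions @ true_signatures + noise

model = NMF(n_components=n_cell_types, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(expression)              # estimated mixing weights per sample
H = model.components_                            # estimated cell-type signatures
proportions = W / W.sum(axis=1, keepdims=True)   # normalize weights to proportions
print(proportions[:3].round(2))
```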
Downey, Mark O; Rochfort, Simone
2008-08-01
A limitation of large-scale viticultural trials is the time and cost of comprehensive compositional analysis of the fruit by high-performance liquid chromatography (HPLC). In addition, separate methods have generally been required to identify and quantify different classes of metabolites. To address these shortcomings a reversed-phase HPLC method was developed to simultaneously separate the anthocyanins and flavonols present in grape skins. The method employs a methanol and water gradient acidified with 10% formic acid with a run-time of 48 min including re-equilibration. Identity of anthocyanins and flavonols in Shiraz (Vitis vinifera L.) skin was confirmed by mass spectral analysis.
GREENE, CASEY S.; TAN, JIE; UNG, MATTHEW; MOORE, JASON H.; CHENG, CHAO
2017-01-01
Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the “big data” era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both “machine learning” algorithms as well as “unsupervised” and “supervised” examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. PMID:27908398
GREENE, CASEY S.; TAN, JIE; UNG, MATTHEW; MOORE, JASON H.; CHENG, CHAO
2017-01-01
Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the “big data” era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both “machine learning” algorithms as well as “unsupervised” and “supervised” examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. PMID:24799088
Upgrades to the NESS (Nuclear Engine System Simulation) Code
NASA Technical Reports Server (NTRS)
Fittje, James E.
2007-01-01
In support of the President's Vision for Space Exploration, the Nuclear Thermal Rocket (NTR) concept is being evaluated as a potential propulsion technology for human expeditions to the moon and Mars. The need for exceptional propulsion system performance in these missions has been documented in numerous studies, and was the primary focus of a considerable effort undertaken during the 1960's and 1970's. The NASA Glenn Research Center is leveraging this past NTR investment in their vehicle concepts and mission analysis studies with the aid of the Nuclear Engine System Simulation (NESS) code. This paper presents the additional capabilities and upgrades made to this code in order to perform higher fidelity NTR propulsion system analysis and design.
Aeroelastic Optimization of Generalized Tube and Wing Aircraft Concepts Using HCDstruct Version 2.0
NASA Technical Reports Server (NTRS)
Quinlan, Jesse R.; Gern, Frank H.
2017-01-01
Major enhancements were made to the Higher-fidelity Conceptual Design and structural optimization (HCDstruct) tool developed at NASA Langley Research Center (LaRC). Whereas previous versions were limited to hybrid wing body (HWB) configurations, the current version of HCDstruct now supports the analysis of generalized tube and wing (TW) aircraft concepts. Along with significantly enhanced user input options for all air- craft configurations, these enhancements represent HCDstruct version 2.0. Validation was performed using a Boeing 737-200 aircraft model, for which primary structure weight estimates agreed well with available data. Additionally, preliminary analysis of the NASA D8 (ND8) aircraft concept was performed, highlighting several new features of the tool.
Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao
2014-12-01
Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both "machine learning" algorithms as well as "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
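The review above points to R packages and web servers; purely as an illustration of the supervised/unsupervised distinction it discusses, the sketch below shows both on a small synthetic dataset in Python (not one of the reviewed tools).

```python
# Illustrative sketch: unsupervised clustering (no labels) versus supervised
# classification (labeled training data) on synthetic data.
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_blobs(n_samples=300, centers=3, random_state=0)

# Unsupervised: cluster structure is inferred without using labels.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Supervised: a labeled training set is used to predict held-out labels.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("clusters found:", len(set(clusters)), "| test accuracy:", clf.score(X_te, y_te))
```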
Jackowetz, J N; Mira de Orduña, R
2013-08-15
Sulphur dioxide (SO2) is essential for the preservation of wines. The presence of SO2 binding compounds in musts and wines may limit sulphite efficacy leading to higher total SO2 additions, which may exceed SO2 limits permitted by law and pose health risks for sensitive individuals. An improved method for the quantification of significant wine SO2 binding compounds is presented that applies a novel sample treatment approach and rapid UHPLC separation. Glucose, galacturonic acid, alpha-ketoglutarate, pyruvate, acetoin and acetaldehyde were derivatised with 2,4-dinitrophenylhydrazine and separated using a solid core C18 phase by ultra high performance liquid chromatography. Addition of EDTA to samples prevented de novo acetaldehyde formation from ethanol oxidation. Optimised derivatisation duration enhanced reproducibility and allowed for glucose and galacturonic acid quantification. High glucose residues were found to interfere with the recovery of other SO2 binders, but practical SO2 concentrations and red wine pigments did not affect derivatisation efficiency. The calibration range, method accuracy, precision and limits of detection were found to be satisfactory for routine analysis of SO2 binders in wines. The current method represents a significant improvement in the comprehensive analysis of SO2 binding wine carbonyls. It allows for the quantification of major SO2 binders at practical analyte concentrations, and uses a simple sample treatment method that prevents treatment artifacts. Equipment utilisation could be reduced by rapid LC separation while maintaining analytical performance parameters. The improved method will be a valuable addition for the analysis of total SO2 binder pools in oenological samples. Published by Elsevier Ltd.
Cost-effectiveness analysis of implants versus autologous perforator flaps using the BREAST-Q.
Matros, Evan; Albornoz, Claudia R; Razdan, Shantanu N; Mehrara, Babak J; Macadam, Sheina A; Ro, Teresa; McCarthy, Colleen M; Disa, Joseph J; Cordeiro, Peter G; Pusic, Andrea L
2015-04-01
Reimbursement has been recognized as a physician barrier to autologous reconstruction. Autologous reconstructions are more expensive than prosthetic reconstructions, but provide greater health-related quality of life. The authors' hypothesis is that autologous tissue reconstructions are cost-effective compared with prosthetic techniques when considering health-related quality of life and patient satisfaction. A cost-effectiveness analysis from the payer perspective, including patient input, was performed for unilateral and bilateral reconstructions with deep inferior epigastric perforator (DIEP) flaps and implants. The effectiveness measure was derived using the BREAST-Q and interpreted as the cost for obtaining 1 year of perfect breast health-related quality-adjusted life-year. Costs were obtained from the 2010 Nationwide Inpatient Sample. The incremental cost-effectiveness ratio was generated. A sensitivity analysis for age and stage at diagnosis was performed. BREAST-Q scores from 309 patients with implants and 217 DIEP flap reconstructions were included. The additional cost for obtaining 1 year of perfect breast-related health for a unilateral DIEP flap compared with implant reconstruction was $11,941. For bilateral DIEP flaps compared with implant reconstructions, the cost for an additional breast health-related quality-adjusted life-year was $28,017. The sensitivity analysis demonstrated that the cost for an additional breast health-related quality-adjusted life-year for DIEP flaps compared with implants was less for younger patients and earlier stage breast cancer. DIEP flaps are cost-effective compared with implants, especially for unilateral reconstructions. Cost-effectiveness of autologous techniques is maximized in women with longer life expectancy. Patient-reported outcomes findings can be incorporated into cost-effectiveness analyses to demonstrate the relative value of reconstructive procedures.
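The analysis rests on the incremental cost-effectiveness ratio; for reference, its standard form is written out below, with costs and effectiveness denoted symbolically and the unilateral figure quoted from the abstract rather than recomputed.

```latex
% Incremental cost-effectiveness ratio (ICER): C denotes cost and E the BREAST-Q-
% derived effectiveness (breast health-related quality-adjusted life-years).
% The reported unilateral result corresponds to roughly $11,941 per such year.
\mathrm{ICER} = \frac{C_{\mathrm{DIEP}} - C_{\mathrm{implant}}}{E_{\mathrm{DIEP}} - E_{\mathrm{implant}}}
```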
Arithmetic on Your Phone: A Large Scale Investigation of Simple Additions and Multiplications.
Zimmerman, Federico; Shalom, Diego; Gonzalez, Pablo A; Garrido, Juan Manuel; Alvarez Heduan, Facundo; Dehaene, Stanislas; Sigman, Mariano; Rieznik, Andres
2016-01-01
We present the results of a gamified mobile device arithmetic application which allowed us to collect a vast amount of data on simple arithmetic operations. Our results confirm and replicate, on a large sample, six of the main principles derived in a long tradition of investigation: size effect, tie effect, size-tie interaction effect, five-effect, RTs and error rates correlation effect, and most common error effect. Our dataset allowed us to perform a robust analysis of order effects for each individual problem, for which there is controversy both in experimental findings and in the predictions of theoretical models. For addition problems, the order effect was dominated by a max-then-min structure (i.e., 7+4 is easier than 4+7). This result is predicted by models in which additions are performed as a translation starting from the first addend, with a distance given by the second addend. In multiplication, we observed a dominance of two effects: (1) a max-then-min pattern that can be accounted for by the fact that it is easier to perform fewer additions of the largest number (i.e., 8x3 is easier to compute as 8+8+8 than as 3+3+…+3) and (2) a phonological effect by which problems for which there is a rhyme (i.e., "seis por cuatro es veinticuatro") are performed faster. Above and beyond these results, our study bears an important practical conclusion, as proof of concept, that participants can be motivated to perform substantial arithmetic training simply by presenting it in a gamified format.
Arithmetic on Your Phone: A Large Scale Investigation of Simple Additions and Multiplications
Zimmerman, Federico; Shalom, Diego; Gonzalez, Pablo A.; Garrido, Juan Manuel; Alvarez Heduan, Facundo; Dehaene, Stanislas; Sigman, Mariano; Rieznik, Andres
2016-01-01
We present the results of a gamified mobile device arithmetic application which allowed us to collect a vast amount of data on simple arithmetic operations. Our results confirm and replicate, on a large sample, six of the main principles derived in a long tradition of investigation: size effect, tie effect, size-tie interaction effect, five-effect, RTs and error rates correlation effect, and most common error effect. Our dataset allowed us to perform a robust analysis of order effects for each individual problem, for which there is controversy both in experimental findings and in the predictions of theoretical models. For addition problems, the order effect was dominated by a max-then-min structure (i.e., 7+4 is easier than 4+7). This result is predicted by models in which additions are performed as a translation starting from the first addend, with a distance given by the second addend. In multiplication, we observed a dominance of two effects: (1) a max-then-min pattern that can be accounted for by the fact that it is easier to perform fewer additions of the largest number (i.e., 8x3 is easier to compute as 8+8+8 than as 3+3+…+3) and (2) a phonological effect by which problems for which there is a rhyme (i.e., "seis por cuatro es veinticuatro") are performed faster. Above and beyond these results, our study bears an important practical conclusion, as proof of concept, that participants can be motivated to perform substantial arithmetic training simply by presenting it in a gamified format. PMID:28033357
Lazaris, Charalampos; Kelly, Stephen; Ntziachristos, Panagiotis; Aifantis, Iannis; Tsirigos, Aristotelis
2017-01-05
Chromatin conformation capture techniques have evolved rapidly over the last few years and have provided new insights into genome organization at an unprecedented resolution. Analysis of Hi-C data is complex and computationally intensive, involving multiple tasks and requiring robust quality assessment. This has led to the development of several tools and methods for processing Hi-C data. However, most of the existing tools do not cover all aspects of the analysis and offer only a few quality assessment options. Additionally, the availability of a multitude of tools makes scientists wonder how these tools and associated parameters can be optimally used, and how potential discrepancies can be interpreted and resolved. Most importantly, investigators need to be assured that slight changes in parameters and/or methods do not affect the conclusions of their studies. To address these issues (compare, explore and reproduce), we introduce HiC-bench, a configurable computational platform for comprehensive and reproducible analysis of Hi-C sequencing data. HiC-bench performs all common Hi-C analysis tasks, such as alignment, filtering, contact matrix generation and normalization, identification of topological domains, scoring and annotation of specific interactions using both published tools and our own. We have also embedded various tasks that perform quality assessment and visualization. HiC-bench is implemented as a data flow platform with an emphasis on analysis reproducibility. Additionally, the user can readily perform parameter exploration and comparison of different tools in a combinatorial manner that takes into account all desired parameter settings in each pipeline task. This unique feature facilitates the design and execution of complex benchmark studies that may involve combinations of multiple tool/parameter choices in each step of the analysis. To demonstrate the usefulness of our platform, we performed a comprehensive benchmark of existing and new TAD callers exploring different matrix correction methods, parameter settings and sequencing depths. Users can extend our pipeline by adding more tools as they become available. HiC-bench is an easy-to-use and extensible platform for comprehensive analysis of Hi-C datasets. We expect that it will facilitate current analyses and help scientists formulate and test new hypotheses in the field of three-dimensional genome organization.
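As a generic illustration of one core step such pipelines perform (matrix balancing, also called iterative correction), and not HiC-bench's own code, parameters, or file formats, the sketch below balances a synthetic symmetric contact matrix so that all bins have comparable coverage.

```python
# Generic sketch of iterative correction (matrix balancing) of a raw Hi-C contact
# matrix; the random symmetric matrix stands in for real contact counts.
import numpy as np

def iterative_correction(counts, iterations=50):
    """Balance a symmetric contact matrix so rows/columns have equal coverage."""
    mat = counts.astype(float).copy()
    bias = np.ones(mat.shape[0])
    for _ in range(iterations):
        coverage = mat.sum(axis=1)
        coverage /= coverage[coverage > 0].mean()
        coverage[coverage == 0] = 1.0
        mat /= np.outer(coverage, coverage)
        bias *= coverage
    return mat, bias

rng = np.random.default_rng(0)
raw = rng.poisson(5, size=(100, 100))
raw = np.triu(raw) + np.triu(raw, 1).T          # make the counts symmetric
balanced, bias = iterative_correction(raw)
print("row-sum CV before:", raw.sum(1).std() / raw.sum(1).mean(),
      "| after:", balanced.sum(1).std() / balanced.sum(1).mean())
```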
Surface-specific additive manufacturing test artefacts
NASA Astrophysics Data System (ADS)
Townsend, Andrew; Racasan, Radu; Blunt, Liam
2018-06-01
Many test artefact designs have been proposed for use with additive manufacturing (AM) systems. These test artefacts have primarily been designed for the evaluation of AM form and dimensional performance. A series of surface-specific measurement test artefacts designed for use in the verification of AM manufacturing processes are proposed here. Surface-specific test artefacts can be made more compact because they do not require the large dimensions needed for accurate dimensional and form measurements. The series of three test artefacts are designed to provide comprehensive information pertaining to the manufactured surface. Measurement possibilities include deviation analysis, surface texture parameter data generation, sub-surface analysis, layer step analysis and build resolution comparison. The test artefacts are designed to provide easy access for measurement using conventional surface measurement techniques, for example, focus variation microscopy, stylus profilometry, confocal microscopy and scanning electron microscopy. Additionally, the test artefacts may be simply visually inspected as a comparative tool, giving a fast indication of process variation between builds. The three test artefacts are small enough to be included in every build and include built-in manufacturing traceability information, making them a convenient physical record of the build.
New integrated information system for pusan national university hospital.
Kim, Hyung Hoi; Cho, Kyung-Won; Kim, Hye Sook; Kim, Ju-Sim; Kim, Jung Hyun; Han, Sang Pil; Park, Chun Bok; Kim, Seok; Chae, Young Moon
2011-03-01
This study presents the information system for Pusan National University Hospital (PNUH), evaluates its performance qualitatively, and conducts economic analysis. Information system for PNUH was designed by component-based development and developed by internet technologies. Order Communication System, Electronic Medical Record, and Clinical Decision Support System were newly developed. The performance of the hospital information system was qualitatively evaluated based on the performance reference model in order to identify problem areas for the old system. The Information Economics approach was used to analyze the economic feasibility of hospital information system in order to account for the intangible benefits. Average performance scores were 3.16 for input layer, 3.35 for process layer, and 3.57 for business layer. In addition, the cumulative benefit to cost ratio was 0.50 in 2011, 1.73 in 2012, 1.76 in 2013, 1.71 in 2014, and 1.71 in 2015. The B/C ratios steadily increase as value items are added. While overall performance scores were reasonably high, doctors were less satisfied with the system, perhaps due to the weak clinical function in the systems. The information economics analysis demonstrated the economic profitability of the information systems if all intangible benefits were included. The second qualitative evaluation survey and economic analysis were proposed to evaluate the changes in performance of the new system.
NASA Astrophysics Data System (ADS)
Mikhailovskaya, A. V.; Golovin, I. S.; Zaitseva, A. A.; Portnoi, V. K.; Dröttboom, P.; Cifre, J.
2013-03-01
Methods of microstructural analysis, measurements of hardness, and temperature and time dependences of internal friction (TDIF and TDIF(iso), respectively) have been used to study recrystallization in cold-rolled alloys and grain-boundary relaxation in annealed alloys. A complex analysis of the effect of additions of transition metals (Mn, Cr) on the magnitude of the activation energy of the background of the internal friction in deformed and annealed states and on the activation parameters of grain-boundary relaxation has been performed. Methods of amplitude dependences of internal friction (ADIF) have been used to determine the critical amplitude that corresponds to the beginning of microplastic deformation in the alloys at different temperatures.
Piepho, H P
1995-03-01
The additive main effects multiplicative interaction model is frequently used in the analysis of multilocation trials. In the analysis of such data it is of interest to decide how many of the multiplicative interaction terms are significant. Several tests for this task are available, all of which assume that errors are normally distributed with a common variance. This paper investigates the robustness of several tests (Gollob, FGH1, FGH2, FR) to departures from these assumptions. It is concluded that, because of its better robustness, the FR test is preferable. If the other tests are to be used, preliminary tests for the validity of assumptions should be performed.
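For reference, the additive main effects and multiplicative interaction (AMMI) model referred to above is usually written as follows; the number of retained multiplicative terms K is what the FR and related tests assess.

```latex
% y_ij: mean of genotype i in environment j; g_i, e_j: genotype and environment
% main effects; lambda_k, alpha_ik, gamma_jk: multiplicative interaction terms.
y_{ij} = \mu + g_i + e_j + \sum_{k=1}^{K} \lambda_k\,\alpha_{ik}\,\gamma_{jk} + \varepsilon_{ij}
```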
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castanera, Raul; Lopez-Varas, Leticia; Borgognone, Alessandra
Transposable elements (TEs) are exceptional contributors to eukaryotic genome diversity. Their ubiquitous presence impacts the genomes of nearly all species and mediates genome evolution by causing mutations and chromosomal rearrangements and by modulating gene expression. We performed an exhaustive analysis of the TE content in 18 fungal genomes, including strains of the same species and species of the same genera. Our results depicted a scenario of exceptional variability, with species having 0.02 to 29.8% of their genome consisting of transposable elements. A detailed analysis performed on two strains of Pleurotus ostreatus uncovered a genome that is populated mainly by Class I elements, especially LTR-retrotransposons amplified in recent bursts from 0 to 2 million years (My) ago. The preferential accumulation of TEs in clusters led to the presence of genomic regions that lacked intra- and inter-specific conservation. In addition, we investigated the effect of TE insertions on the expression of their nearby upstream and downstream genes. Our results showed that an important number of genes under TE influence are significantly repressed, with stronger repression when genes are localized within transposon clusters. Our transcriptional analysis performed in four additional fungal models revealed that this TE-mediated silencing was present only in species with active cytosine methylation machinery. We hypothesize that this phenomenon is related to epigenetic defense mechanisms that are aimed to suppress TE expression and control their proliferation.
Castanera, Raul; Lopez-Varas, Leticia; Borgognone, Alessandra; ...
2016-06-13
Transposable elements (TEs) are exceptional contributors to eukaryotic genome diversity. Their ubiquitous presence impacts the genomes of nearly all species and mediates genome evolution by causing mutations and chromosomal rearrangements and by modulating gene expression. We performed an exhaustive analysis of the TE content in 18 fungal genomes, including strains of the same species and species of the same genera. Our results depicted a scenario of exceptional variability, with species having 0.02 to 29.8% of their genome consisting of transposable elements. A detailed analysis performed on two strains of Pleurotus ostreatus uncovered a genome that is populated mainly by Class I elements, especially LTR-retrotransposons amplified in recent bursts from 0 to 2 million years (My) ago. The preferential accumulation of TEs in clusters led to the presence of genomic regions that lacked intra- and inter-specific conservation. In addition, we investigated the effect of TE insertions on the expression of their nearby upstream and downstream genes. Our results showed that an important number of genes under TE influence are significantly repressed, with stronger repression when genes are localized within transposon clusters. Our transcriptional analysis performed in four additional fungal models revealed that this TE-mediated silencing was present only in species with active cytosine methylation machinery. We hypothesize that this phenomenon is related to epigenetic defense mechanisms that are aimed to suppress TE expression and control their proliferation.
García-Pinillos, Felipe; Molina-Molina, Alejandro; Latorre-Román, Pedro Á
2016-06-01
This study aimed to determine whether kinematic data during countermovement jump (CMJ) might explain the post-activation potentiation (PAP) phenomenon after an exhausting running test. Thirty-three trained endurance runners performed the Léger test, an incremental test that consists of continuous running between two lines 20 m apart. CMJ performance was determined before (pre-test) and immediately after the protocol (post-test). Sagittal-plane video of CMJs was recorded and kinematic data were obtained through two-dimensional analysis. In addition to the duration of eccentric and concentric phases of CMJ, hip, knee and ankle angles were measured at four key points during CMJ: the lowest position of the squat, take-off, landing, and at the lowest position after landing. Additionally, heart rate was monitored, and rate of perceived exertion was recorded at post-test. Analysis of variance revealed a significant improvement in CMJ (p = 0.002) at post-test. Cluster analysis grouped participants according to whether PAP was experienced (responders group: RG, n = 25) or not (non-responders group: NRG, n = 8) relative to CMJ change from rest to post-test. RG significantly improved (p < 0.001) the performance in CMJ, whereas NRG remained unchanged. Kinematic data did not show significant differences between RG and NRG. Thus, the data suggest that jumping kinematics do not provide the necessary information to explain the PAP phenomenon after intensive running exercises in endurance athletes.
Impact of Operating Rules on Planning Capacity Expansion of Urban Water Supply Systems
NASA Astrophysics Data System (ADS)
de Neufville, R.; Galelli, S.; Tian, X.
2017-12-01
This study addresses the impact of operating rules on capacity planning of urban water supply systems. The continuous growth of metropolitan areas represents a major challenge for water utilities, which often rely on industrial water supply (e.g., desalination, reclaimed water) to complement natural resources (e.g., reservoirs). These additional sources increase the reliability of supply, equipping operators with additional means to hedge against droughts. How do their rules for using industrial water supply impact the performance of water supply system? How might it affect long-term plans for capacity expansion? Possibly significantly, as demonstrated by the analysis of the operations and planning of a water supply system inspired by Singapore. Our analysis explores the system dynamics under multiple inflow and management scenarios to understand the extent to which alternative operating rules for the use of industrial water supply affect system performance. Results first show that these operating rules can have significant impact on the variability in system performance (e.g., reliability, energy use) comparable to that of hydro-climatological conditions. Further analyses of several capacity expansion exercises—based on our original hydrological and management scenarios—show that operating rules significantly affect the timing and magnitude of critical decisions, such as the construction of new desalination plants. These results have two implications: Capacity expansion analysis should consider the effect of a priori uncertainty about operating rules; and operators should consider how their flexibility in operating rules can affect their perceived need for capacity.
Self-regulated learning and academic performance in medical education.
Lucieer, Susanna M; Jonker, Laura; Visscher, Chris; Rikers, Remy M J P; Themmen, Axel P N
2016-06-01
Medical schools aim to graduate medical doctors who are able to self-regulate their learning. It is therefore important to investigate whether medical students' self-regulated learning skills change during medical school. In addition, since these skills are expected to help students learn more effectively, it is of interest to investigate whether these skills are related to academic performance. In a cross-sectional design, the Self-Regulation of Learning Self-Report Scale (SRL-SRS) was used to investigate the change in students' self-regulated learning skills. First- and third-year students' (N = 949, 81.7%) SRL-SRS scores were compared with ANOVA. The relation with academic performance was investigated with multinomial regression analysis. Only one of the six skills, reflection, significantly, but positively, changed during medical school. In addition, a small, but positive relation of monitoring, reflection, and effort with first-year GPA was found, while only effort was related to third-year GPA. The change in self-regulated learning skills is minor as only the level of reflection differs between the first and third year. In addition, the relation between self-regulated learning skills and academic performance is limited. Medical schools are therefore encouraged to re-examine the curriculum and methods they use to enhance their students' self-regulated learning skills. Future research is required to understand the limited impact on performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.
2010-02-28
Small signal stability problems are one of the major threats to grid stability and reliability. Prony analysis has been successfully applied to ringdown data to monitor electromechanical modes of a power system using phasor measurement unit (PMU) data. To facilitate an on-line application of mode estimation, this paper develops a recursive algorithm for implementing Prony analysis and proposes an oscillation detection method to detect ringdown data in real time. By automatically detecting ringdown data, the proposed method helps guarantee that Prony analysis is applied properly and in a timely manner to the ringdown data. Thus, mode estimation can be performed reliably and in a timely manner. The proposed method is tested using Monte Carlo simulations based on a 17-machine model and is shown to be able to properly identify the oscillation data for on-line application of Prony analysis. In addition, the proposed method is applied to field measurement data from WECC to show the performance of the proposed algorithm.
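For context, the sketch below implements the classical batch form of Prony analysis on a synthetic ringdown signal (linear prediction, pole extraction, then frequency and damping per mode). The recursive, online variant and the oscillation detector the paper develops are not reproduced here; the test signal, model order, and sampling rate are assumptions.

```python
# Minimal sketch of batch Prony analysis on a synthetic two-mode ringdown signal.
import numpy as np

def prony(y, order, dt):
    # 1) Linear prediction: fit coefficients a so that y[n] = a1*y[n-1] + ... + ap*y[n-p].
    N = len(y)
    A = np.column_stack([y[order - k - 1:N - k - 1] for k in range(order)])
    b = y[order:]
    a = np.linalg.lstsq(A, b, rcond=None)[0]
    # 2) Poles are the roots of the characteristic polynomial z^p - a1*z^(p-1) - ... - ap.
    poles = np.roots(np.concatenate(([1.0], -a)))
    freqs = np.angle(poles) / (2 * np.pi * dt)      # Hz
    damping = np.log(np.abs(poles)) / dt            # 1/s (negative = decaying)
    return freqs, damping

dt = 0.05                                           # assumed 20 samples/s PMU rate
t = np.arange(0, 10, dt)
ringdown = (np.exp(-0.05 * t) * np.cos(2 * np.pi * 0.3 * t)      # assumed inter-area mode
            + 0.5 * np.exp(-0.2 * t) * np.cos(2 * np.pi * 0.8 * t))  # assumed local mode
freqs, damping = prony(ringdown, order=4, dt=dt)
keep = freqs > 0
print("modes (Hz):", freqs[keep].round(3), "| damping (1/s):", damping[keep].round(3))
```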
Parametric and experimental analysis using a power flow approach
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1988-01-01
Having defined and developed a structural power flow approach for the analysis of structure-borne transmission of structural vibrations, the technique is used to perform an analysis of the influence of structural parameters on the transmitted energy. As a base for comparison, the parametric analysis is first performed using a Statistical Energy Analysis approach and the results compared with those obtained using the power flow approach. The advantages of using structural power flow are thus demonstrated by comparing the type of results obtained by the two methods. Additionally, to demonstrate the advantages of using the power flow method and to show that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental investigation of structural power flow is also presented. Results are presented for an L-shaped beam for which an analytical solution has already been obtained. Furthermore, the various methods available to measure vibrational power flow are compared to investigate the advantages and disadvantages of each method.
Leasure, A Renee; Stirlen, Joan; Lu, Shu Hua
2012-01-01
Ventilator-associated pneumonia (VAP) is a subset of hospital-acquired pneumonias and is a serious, sometimes fatal, complication in patients who need mechanical ventilation. In addition, pay-for-performance initiatives have placed increased emphasis on preventing nosocomial infections, including VAP. Facilities may not be reimbursed for costs associated with preventable infections. This article presents a review and meta-analysis of the prevention of VAP through the aspiration of subglottic secretions.
Simulator Evaluation of Lineup Visual Landing Aids for Night Carrier Landing.
1987-03-10
recognized that the system is less than optimum (2,3). Because the information from the meatball is of zero order (displacement only), there are ... gives the analysis-of-variance summaries of glideslope performance across the flight segments for TOT glideslope ± 0.3 degrees (± 1.0 meatball), RMS ... accepted as reliable. In addition, analysis-of-variance of percent TOT glideslope ± 0.45 degrees (± 1.5 meatball) did not reveal any statistical
NASA Astrophysics Data System (ADS)
Pembroke, A. D.; Colbert, J. A.
2015-12-01
The Community Coordinated Modeling Center (CCMC) provides hosting for many of the simulations used by the space weather community of scientists, educators, and forecasters. CCMC users may submit model runs through the Runs on Request system, which produces static visualizations of model output in the browser, while further analysis may be performed off-line via Kameleon, CCMC's cross-language access and interpolation library. Off-line analysis may be suitable for power-users, but storage and coding requirements present a barrier to entry for non-experts. Moreover, a lack of a consistent framework for analysis hinders reproducibility of scientific findings. To that end, we have developed Kameleon Live, a cloud based interactive analysis and visualization platform. Kameleon Live allows users to create scientific studies built around selected runs from the Runs on Request database, perform analysis on those runs, collaborate with other users, and disseminate their findings among the space weather community. In addition to showcasing these novel collaborative analysis features, we invite feedback from CCMC users as we seek to advance and improve on the new platform.
Kardum-Skelin, Ika
2011-09-01
Clinical cytology is an interdisciplinary medical diagnostic profession that integrates clinical, laboratory and analytical fields along with the final cytologist's expert opinion. Cytology involves nonaggressive, minimally invasive, and simple-to-use procedures that are fully acceptable to the patient. Cytology offers rapid orientation, while in combination with additional technologies applied to cytologic smear analysis (cytochemistry, immunocytochemistry for cell marker analysis, computer image analysis) or sophisticated methods on cytologic samples (flow cytometry, molecular and cytogenetic analysis) it plays a major role in the diagnosis, subtyping and prognosis of malignant tumors. Ten rules for successful performance in cytology are as follows: 1) due knowledge of overall cytology (general cytologist); 2) inclusion in all stages of cytologic sample manipulation from sampling through reporting; 3) due knowledge of additional technologies to provide appropriate interpretation and/or rational advice in dubious cases; 4) to preserve dignity of the profession because every profession has its advantages, shortcomings and limitations; 5) to insist on quality control of the performance, individual cytologists and cytology team; 6) knowledge transfer to young professionals; 7) assisting fellow professionals in dubious cases irrespective of the time needed and fee because it implies helping the patient and the profession itself; 8) experience exchange with other related professionals to upgrade mutual understanding; 9) to prefer the interest of the profession over one's own interest; and 10) to love cytology.
Luszczki, Jarogniew J; Zagaja, Mirosław; Miziak, Barbara; Kondrat-Wrobel, Maria W; Zaluska, Katarzyna; Wroblewska-Luczka, Paula; Adamczuk, Piotr; Czuczwar, Stanislaw J; Florek-Luszczki, Magdalena
2018-01-01
To isobolographically determine the types of interactions that occur between retigabine and lacosamide (LCM; two third-generation antiepileptic drugs) with respect to their anticonvulsant activity and acute adverse effects (sedation) in the maximal electroshock-induced seizures (MES) and chimney test (motor performance) in adult male Swiss mice. Type I isobolographic analysis for nonparallel dose-response effects for the combination of retigabine with LCM (at a fixed ratio of 1:1) in both the MES and chimney test in mice was performed. Brain concentrations of retigabine and LCM were measured by high-pressure liquid chromatography (HPLC) to characterize any pharmacokinetic interactions occurring when combining these drugs. Linear regression analysis revealed that retigabine had its dose-response effect line nonparallel to that of LCM in both the MES and chimney tests. The type I isobolographic analysis illustrated that retigabine combined with LCM (fixed ratio of 1:1) exerted an additive interaction in the mouse MES model and sub-additivity (antagonism) in the chimney test. With HPLC, retigabine and LCM did not mutually change their total brain concentrations, thereby confirming the pharmacodynamic nature of the interaction. LCM combined with retigabine possesses a beneficial preclinical profile (benefit index ranged from 2.07 to 2.50) and this 2-drug combination is worth recommending as a treatment plan for patients with pharmacoresistant epilepsy. © 2017 S. Karger AG, Basel.
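As background for the type I isobolographic analysis described above, the underlying Loewe-additivity criterion for a fixed-ratio mixture can be stated as an interaction index (a generic formulation; the paper's nonparallel-line treatment adds further corrections):

\[
\gamma \;=\; \frac{a}{\mathrm{ED}_{50}^{\mathrm{RTG}}} \;+\; \frac{b}{\mathrm{ED}_{50}^{\mathrm{LCM}}},
\]

where \(a\) and \(b\) are the doses of retigabine and lacosamide in the mixture that together produce the 50% effect, and the denominators are the single-drug ED50 values. \(\gamma = 1\) indicates additivity (as found in the MES model), \(\gamma > 1\) sub-additivity or antagonism (as found in the chimney test), and \(\gamma < 1\) supra-additivity (synergy).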
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shahnam, Mehrdad; Gel, Aytekin; Subramaniyan, Arun K.
Adequate assessment of the uncertainties in modeling and simulation is becoming an integral part of simulation-based engineering design. The goal of this study is to demonstrate the application of non-intrusive Bayesian uncertainty quantification (UQ) methodology in multiphase (gas-solid) flows with experimental and simulation data, as part of our research efforts to determine the best-suited approach for UQ of a bench-scale fluidized bed gasifier. UQ analysis was first performed on the available experimental data. Global sensitivity analysis performed as part of the UQ analysis shows that among the three operating factors, the steam-to-oxygen ratio has the most influence on syngas composition in the bench-scale gasifier experiments. An analysis for forward propagation of uncertainties was performed, and results show that an increase in steam-to-oxygen ratio leads to an increase in H2 mole fraction and a decrease in CO mole fraction. These findings are in agreement with the ANOVA analysis performed in the reference experimental study. Another contribution, in addition to the UQ analysis, is an optimization-based approach to identify the next best set of additional experimental samples, should the possibility arise for additional experiments. Hence, the surrogate models constructed as part of the UQ analysis are employed to improve the information gain and make incremental recommendations, should the possibility to add more experiments arise. In the second step, a series of simulations was carried out with the open-source computational fluid dynamics software MFiX to reproduce the experimental conditions, where three operating factors, i.e., coal flow rate, coal particle diameter, and steam-to-oxygen ratio, were systematically varied to understand their effect on the syngas composition. Bayesian UQ analysis was performed on the numerical results. As part of the Bayesian UQ analysis, a global sensitivity analysis was performed based on the simulation results, which shows that the predicted syngas composition is strongly affected not only by the steam-to-oxygen ratio (which was observed in experiments as well) but also by variation in the coal flow rate and particle diameter (which was not observed in experiments). The carbon monoxide mole fraction is underpredicted at lower steam-to-oxygen ratios and overpredicted at higher steam-to-oxygen ratios. The opposite trend is observed for the carbon dioxide mole fraction. These discrepancies are attributed either to excessive segregation of the phases, which leads to fuel-rich or fuel-lean regions, or to the selection of reaction models, where different reaction models and kinetics can lead to different syngas compositions throughout the gasifier. To improve the quality of the numerical models used, the effects that uncertainties in reaction models for gasification, char oxidation, carbon monoxide oxidation, and water-gas shift have on the syngas composition were investigated at different grid resolutions, along with bed temperature. The global sensitivity analysis showed that among the various reaction models employed for water-gas shift, gasification, and char oxidation, the choice of reaction model for water-gas shift has the greatest influence on syngas composition, with the gasification reaction model second. Syngas composition also shows a small sensitivity to the bed temperature. The hydrodynamic behavior of the bed did not change beyond a grid spacing of 18 times the particle diameter.
However, the syngas concentration continued to be affected by the grid resolution at spacings as low as 9 times the particle diameter. This is due to better resolution of the phase interface between the gas and solids, which leads to stronger heterogeneous reactions. This report is a compilation of three manuscripts published in peer-reviewed journals for the series of studies mentioned above.
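As an illustration of the variance-based global sensitivity analysis referred to above, a minimal Monte Carlo estimator of first-order Sobol indices applied to a cheap surrogate is sketched below; the surrogate coefficients and factor bounds are invented for illustration and are not the report's models.

```python
import numpy as np

def first_order_sobol(model, bounds, n=4096, seed=0):
    """Monte Carlo estimate of first-order Sobol indices (Saltelli-type
    estimator) for an inexpensive surrogate `model`.  A sketch of the kind
    of variance-based sensitivity analysis described above, not the
    report's actual Bayesian UQ workflow."""
    rng = np.random.default_rng(seed)
    d = len(bounds)
    lo, hi = np.array(bounds).T
    A = rng.uniform(lo, hi, size=(n, d))
    B = rng.uniform(lo, hi, size=(n, d))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # swap one factor at a time
        S[i] = np.mean(fB * (model(ABi) - fA)) / var
    return S

# Hypothetical surrogate for H2 mole fraction vs. coal flow rate,
# particle diameter, and steam-to-oxygen ratio (illustrative only).
surrogate = lambda x: (0.05 * x[:, 0] + 0.0005 * x[:, 1]
                       + 0.30 * x[:, 2] - 0.04 * x[:, 2] ** 2)
bounds = [(0.5, 1.5), (50.0, 300.0), (0.5, 2.0)]
print(first_order_sobol(surrogate, bounds))
```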
Combining ability for yield and fruit quality in the pepper Capsicum annuum.
do Nascimento, N F F; do Rêgo, E R; Nascimento, M F; Bruckner, C H; Finger, F L; do Rêgo, M M
2014-04-29
The objective of this study was to determine the effects of the general and specific combining abilities (GCA and SCA, respectively) for 15 characteristics and to evaluate the most promising crosses and the reciprocal effect between the hybrids of six parents of the Capsicum annuum species. Six parents, belonging to the Horticultural Germplasm Bank of Centro de Ciências Agrárias of Universidade Federal da Paraíba, were crossed in a complete diallel design. The 30 hybrids generated and the parents were then analyzed in a completely randomized design with three replicates. The data were submitted to analysis of variance at 1% probability, and the means were grouped by the Scott-Knott test at 1% probability. The diallel analysis was performed according to Griffing's Method I, with a fixed model. Both additive and non-additive effects influenced the hybrids' performance, as indicated by the GCA/SCA ratio. The non-additive effects, epistasis and/or dominance, played a more important role than the additive effects in pedicel length, pericarp thickness, fresh matter, dry matter content, seed yield per fruit, fruit yield per plant, days to fructification, and total soluble solids. The GCA effects were more important than the SCA effects in the fruit weight, fruit length and diameter, placenta length, yield, vitamin C, and titratable acidity characteristics. The results found here clearly show that ornamental pepper varieties can be developed through hybridization in breeding programs with C. annuum.
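For context, Griffing's Method I fixed-effects diallel model partitions each cross mean into general combining ability (GCA), specific combining ability (SCA), and reciprocal effects; a standard statement of the model (notation assumed here, not taken from the paper) is

\[
Y_{ij} = \mu + g_i + g_j + s_{ij} + r_{ij} + \bar{e}_{ij},
\]

where \(g_i\) and \(g_j\) are the GCA effects of the parents, \(s_{ij}=s_{ji}\) is the SCA effect, \(r_{ij}=-r_{ji}\) is the reciprocal effect, and \(\bar{e}_{ij}\) is the mean error. The GCA/SCA mean-square ratio discussed above indicates the relative weight of additive versus non-additive gene action.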
Yang, H H Wendy
2017-01-01
A new practical and time-saving ultra-high performance liquid chromatography (UHPLC) method has been developed for determining the organic impurities in the anthraquinone color additives D&C Violet No. 2 and D&C Green No. 6. The impurities determined are p-toluidine, 1-hydroxyanthraquinone, 1,4-dihydroxyanthraquinone, and two subsidiary colors. The newly developed UHPLC method uses a 1.7-μm particle size C-18 column, 0.1 M ammonium acetate and acetonitrile as eluents, and photodiode array detection. For the quantification of the impurities, six-point calibration curves were used with correlation coefficients that ranged from 0.9974 to 0.9998. Recoveries of impurities ranged from 99 to 104%. Relative standard deviations ranged from 0.81 to 4.29%. The limits of detection for the impurities ranged from 0.0067% to 0.216%. Samples from sixteen batches of each color additive were analyzed, and the results compared favorably with those obtained by gravity-elution column chromatography, thin-layer chromatography, and isooctane extraction. Unlike with those other methods, use of the UHPLC method permits all of the impurities to be determined in a single analysis, while also reducing the amount of organic waste and saving time and labor. The method is expected to be implemented by the U.S. Food and Drug Administration for analysis of color additive samples submitted for batch certification.
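As an illustration of the calibration and limit-of-detection arithmetic behind figures like these, a minimal sketch with hypothetical data (not the paper's measurements) follows.

```python
import numpy as np

# Illustrative six-point calibration for one impurity: linear fit,
# correlation coefficient, and the common LOD estimate 3.3 * s_y / slope.
conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80, 1.60])      # e.g. ug/mL
area = np.array([1.02, 2.05, 4.12, 8.08, 16.3, 32.1])      # peak areas

slope, intercept = np.polyfit(conc, area, 1)
r = np.corrcoef(conc, area)[0, 1]
resid = area - (slope * conc + intercept)
s_y = np.sqrt(np.sum(resid ** 2) / (len(conc) - 2))         # residual std error
lod = 3.3 * s_y / slope
print(f"r = {r:.4f}, slope = {slope:.3f}, LOD ~ {lod:.3f}")
```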
Yoshida, Catherine E; Kruczkiewicz, Peter; Laing, Chad R; Lingohr, Erika J; Gannon, Victor P J; Nash, John H E; Taboada, Eduardo N
2016-01-01
For nearly 100 years serotyping has been the gold standard for the identification of Salmonella serovars. Despite the increasing adoption of DNA-based subtyping approaches, serotype information remains a cornerstone in food safety and public health activities aimed at reducing the burden of salmonellosis. At the same time, recent advances in whole-genome sequencing (WGS) promise to revolutionize our ability to perform advanced pathogen characterization in support of improved source attribution and outbreak analysis. We present the Salmonella In Silico Typing Resource (SISTR), a bioinformatics platform for rapidly performing simultaneous in silico analyses for several leading subtyping methods on draft Salmonella genome assemblies. In addition to performing serovar prediction by genoserotyping, this resource integrates sequence-based typing analyses for: Multi-Locus Sequence Typing (MLST), ribosomal MLST (rMLST), and core genome MLST (cgMLST). We show how phylogenetic context from cgMLST analysis can supplement the genoserotyping analysis and increase the accuracy of in silico serovar prediction to over 94.6% on a dataset comprised of 4,188 finished genomes and WGS draft assemblies. In addition to allowing analysis of user-uploaded whole-genome assemblies, the SISTR platform incorporates a database comprising over 4,000 publicly available genomes, allowing users to place their isolates in a broader phylogenetic and epidemiological context. The resource incorporates several metadata driven visualizations to examine the phylogenetic, geospatial and temporal distribution of genome-sequenced isolates. As sequencing of Salmonella isolates at public health laboratories around the world becomes increasingly common, rapid in silico analysis of minimally processed draft genome assemblies provides a powerful approach for molecular epidemiology in support of public health investigations. Moreover, this type of integrated analysis using multiple sequence-based methods of sub-typing allows for continuity with historical serotyping data as we transition towards the increasing adoption of genomic analyses in epidemiology. The SISTR platform is freely available on the web at https://lfz.corefacility.ca/sistr-app/.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muth, Thomas R.; Peter, William H.
The team performed a literature review, conducted residual stress measurements, performed failure analysis, and demonstrated a solid state additive manufacturing repair technique on samples removed from a scrapped propeller hub. The team evaluated multiple options for hub repair that included existing metal buildup technologies that the Federal Aviation Administration (FAA) has already embraced, such as cold spray, high velocity oxy-fuel deposition (HVOF), and plasma spray. In addition, the team helped Piedmont Propulsion Systems, LLC (PPS) evaluate three potential solutions that could be deployed at different stages in the life cycle of aluminum alloy hubs, in addition to the conventional spray coating method for repair. For new hubs, a machining practice to prevent fretting with the steel drive shaft was recommended. For hubs that were refurbished with some material remaining above the minimal material condition (MMC), a silver interface applied by an electromagnetic pulse additive manufacturing method was recommended. For hubs that were at or below the MMC, a solid state additive manufacturing technique using ultrasonic welding (UW) of thin layers of 7075 aluminum to the hub interface was recommended. A cladding demonstration using the UW technique achieved mechanical bonding of the layers, showing promise as a viable repair method.
NASA Astrophysics Data System (ADS)
Safitri, Nina; Mubarok, M. Zaki; Winarko, Ronny; Tanlega, Zela
2018-05-01
In the present study, precipitation of nickel and cobalt as mixed hydroxide precipitate (MHP) from pregnant leach solution of nickel limonite ore from Soroako after the iron removal stage was carried out. A series of MHP precipitation experiments was conducted by using MgO slurry as neutralizing agent, and the effects of pH, temperature, precipitation duration, and the addition of MHP seed on the precipitation behavior of nickel and cobalt, as well as iron and manganese, were studied. Characterization of the MHP product was performed by particle size analysis (PSA) as well as X-ray fluorescence (XRF), X-ray diffraction (XRD) and scanning electron microscopy (SEM). Kinetic analysis was performed using the differential-integral method for homogeneous reaction rates. Precipitation at pH 7 and 50°C for 30 minutes without seed addition resulted in nickel and cobalt recoveries of 82.8% and 92%, respectively, with co-precipitated iron and manganese of 70% and 24.2%, respectively. Seed addition increased nickel and cobalt precipitation significantly, to 99.9% and 99.1%, respectively. However, seed addition also led to a significant increase in manganese co-precipitation, from 24.2% without seed to 39.5% with the addition of 1 g of seed per 200 mL of PLS. Kinetic analysis revealed that Ni precipitation to form MHP follows second-order reaction kinetics with an activation energy of 94.6 kJ/mol.
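A worked statement of the rate law implied by the reported kinetics (assuming the rate is expressed in terms of dissolved nickel concentration) is:

\[
-\frac{d[\mathrm{Ni}]}{dt} = k\,[\mathrm{Ni}]^{2},
\qquad
\frac{1}{[\mathrm{Ni}]} - \frac{1}{[\mathrm{Ni}]_{0}} = k\,t,
\qquad
k = A \exp\!\left(-\frac{E_a}{RT}\right),\quad E_a \approx 94.6\ \mathrm{kJ\,mol^{-1}},
\]

so plotting \(1/[\mathrm{Ni}]\) against time at each temperature yields \(k\), and an Arrhenius plot of \(\ln k\) versus \(1/T\) gives the activation energy.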
Bloch, Carter; Schneider, Jesper W.; Sinkjær, Thomas
2016-01-01
The present paper examines the relation between size, accumulation, and performance for research grants, focusing on the relation between grant size for Centres of Excellence (CoE) funded by the Danish National Research Foundation (DNRF) and various ex post research performance measures, including impact and shares of highly cited articles. We examine both the relation between size and performance and how performance for CoEs evolves over the course of grant periods. In terms of dynamics, it appears that performance over the grant period (i.e. 10 years) is falling for the largest CoEs, while it is increasing for those among the smallest half. Overall, multivariate econometric analysis finds evidence that performance is increasing in grant size and over time. In both cases, the relation appears to be non-linear, suggesting that there is a point at which performance peaks. The CoEs have also been very successful in securing additional funding, which can be viewed as a ‘cumulative effect’ of center grants. In terms of new personnel, the vast majority of additional funding is spent on early-career researchers; hence, this accumulation appears to have a ‘generational’ dimension, allowing scientific expertise to be passed on to an increasing number of younger researchers. PMID:26862907
Visibility analysis of fire lookout towers in the Boyabat State Forest Enterprise in Turkey.
Kucuk, Omer; Topaloglu, Ozer; Altunel, Arif Oguz; Cetin, Mehmet
2017-07-01
For successful fire suppression, it is essential to detect and respond to forest fires as early as possible. Fire lookout towers are crucial assets in detecting forest fires, in addition to other technological advancements. In this study, we performed a visibility analysis on a network of fire lookout towers currently operating in a relatively fire-prone area of Turkey's Western Black Sea region. Some of these towers had not been functioning properly; it was proposed that these be taken out of the grid and replaced with new ones. The percentage of visible areas under the current network of fire lookout towers was 73%; it could rise to 81% with the addition of newly proposed towers. This study is the first to conduct a visibility analysis of current and newly proposed fire lookout towers in the Western Black Sea region and to focus on its forest fire problem.
Warmerdam, G J J; Vullings, R; Van Laar, J O E H; Van der Hout-Van der Jagt, M B; Bergmans, J W M; Schmitt, L; Oei, S G
2016-08-01
Cardiotocography (CTG) is currently the most often used technique for detection of fetal distress. Unfortunately, CTG has a poor specificity. Recent studies suggest that, in addition to CTG, information on fetal distress can be obtained from analysis of fetal heart rate variability (HRV). However, uterine contractions can strongly influence fetal HRV. The aim of this study is therefore to investigate whether HRV analysis for detection of fetal distress can be improved by distinguishing contractions from rest periods. Our results from feature selection indicate that HRV features calculated separately during contractions or during rest periods are more informative on fetal distress than HRV features that are calculated over the entire fetal heart rate. Furthermore, classification performance improved from a geometric mean of 69.0% to 79.6% when including the contraction-dependent HRV features, in addition to HRV features calculated over the entire fetal heart rate.
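For reference, the geometric-mean performance figure quoted above is commonly computed from the class-wise detection rates (an assumption about the exact definition used here):

\[
g = \sqrt{\text{sensitivity} \times \text{specificity}},
\]

so a rise from 69.0% to 79.6% reflects a balanced improvement in classifying both distressed and non-distressed cases rather than a gain in one class at the expense of the other.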
Introduction on performance analysis and profiling methodologies for KVM on ARM virtualization
NASA Astrophysics Data System (ADS)
Motakis, Antonios; Spyridakis, Alexander; Raho, Daniel
2013-05-01
The introduction of hardware virtualization extensions on ARM Cortex-A15 processors has enabled the implementation of full virtualization solutions for this architecture, such as KVM on ARM. This trend motivates the need to quantify and understand the performance impact that emerges from the application of this technology. In this work we start looking into some interesting performance metrics on KVM for ARM processors, which can provide us with useful insight that may lead to potential improvements in the future. This includes measurements such as interrupt latency and guest exit cost, performed on ARM Versatile Express and Samsung Exynos 5250 hardware platforms. Furthermore, we discuss additional methodologies that can provide a deeper understanding of the performance footprint of KVM in the future. We identify some of the most interesting approaches in this field, and perform a tentative analysis on how these may be implemented in the KVM on ARM port. These take into consideration hardware- and software-based counters for profiling, and issues related to the limitations of the simulators that are often used, such as the ARM Fast Models platform.
Hardesty, Samantha L; Hagopian, Louis P; McIvor, Melissa M; Wagner, Leaora L; Sigurdsson, Sigurdur O; Bowman, Lynn G
2014-09-01
The present study isolated the effects of frequently used staff training intervention components to increase communication between direct care staff and clinicians working on an inpatient behavioral unit. Written "protocol review" quizzes developed by clinicians were designed to assess knowledge about a patient's behavioral protocols. Direct care staff completed these at the beginning of each day and evening shift. Clinicians were required to score and discuss these protocol reviews with direct care staff for at least 75% of shifts over a 2-week period. During baseline, only 21% of clinicians met this requirement. Completing and scoring of protocol reviews did not improve following additional in-service training (M = 15%) or following an intervention aimed at decreasing response effort combined with prompting (M = 28%). After implementing an intervention involving a specified performance criterion and performance feedback, 86% of clinicians reached the established goal. Results of a component analysis suggested that the presentation of both the specified performance criterion and supporting contingencies was necessary to maintain acceptable levels of performance. © The Author(s) 2014.
TVD, Linnehan collects data during LMS-1 Spacelab mission
1996-07-09
STS078-430-009 (20 June-7 July 1996) --- Astronaut Richard M. Linnehan, mission specialist, performs a test on his leg using the Torque Velocity Dynamometer (TVD). Dr. Thirsk was measuring changes in muscle forces of the leg in this particular view. The TVD hardware is also used to measure arm muscle forces and velocity at the bicep and tricep areas. Crewmembers for the mission performed all experiment protocols prior to flight to develop a baseline and will also perform post-flight tests to complete the analysis. Additionally, muscle biopsies were taken before the flight and will be conducted after the flight.
NASA Astrophysics Data System (ADS)
Tsai, Li-Fen; Shaw, Jing-Chi; Wang, Pei-Wen; Shih, Meng-Long; Su, Yi-Jing
2011-10-01
This study aims to probe into customers' online word-of-mouth regarding cultural heritage applications and performance facilities in Cultural and Creative Industries. Findings demonstrate that, regarding online word-of-mouth for art museums, museums, and art villages, items valued by customers are design aesthetics of displays and collections, educational functions, and environments and landscapes. The percentages are 10.102%, 11.208% and 11.44%, respectively. In addition, cultural heritage applications and performance facility industries in Taiwan are highly valued in online word-of-mouth.
2011-01-01
Background Cytogenetic evaluation is a key component of the diagnosis and prognosis of chronic lymphocytic leukemia (CLL). We performed oligonucleotide-based comparative genomic hybridization microarray analysis on 34 samples with CLL and known abnormal karyotypes previously determined by cytogenetics and/or fluorescence in situ hybridization (FISH). Results Using a custom designed microarray that targets >1800 genes involved in hematologic disease and other malignancies, we identified additional cryptic aberrations and novel findings in 59% of cases. These included gains and losses of genes associated with cell cycle regulation, apoptosis and susceptibility loci on 3p21.31, 5q35.2q35.3, 10q23.31q23.33, 11q22.3, and 22q11.23. Conclusions Our results show that microarray analysis will detect known aberrations, including microscopic and cryptic alterations. In addition, novel genomic changes will be uncovered that may become important prognostic predictors or treatment targets for CLL in the future. PMID:22087757
Goerlitz, D.F.; Franks, B.J.
1989-01-01
Appraisal of ground water contaminated by organic substances raises problems of difficult sample collection and timely chemical analysis. High-performance liquid chromatography was evaluated for on-site determination of specific organic contaminants in ground water samples and was used at three study sites. Organic solutes were determined directly in water samples, with little or no preparation, and usually in less than an hour after collection. This information improved sampling efficiency and was useful in screening for subsequent laboratory analysis. On two occasions, on-site analysis revealed that samples were undergoing rapid change, with major solutes being degraded and alteration products being formed. In addition to sample stability, this technique proved valuable for monitoring other sampling factors such as compositional changes with respect to pumping, filtration, and cross contamination. -Authors
Sherman, Lawrence; Clement, Peter T; Cherian, Meena N; Ndayimirije, Nestor; Noel, Luc; Dahn, Bernice; Gwenigale, Walter T; Kushner, Adam L
2011-01-01
To document infrastructure, personnel, procedures performed, and supplies and equipment available at all county hospitals in Liberia using the World Health Organization Tool for Situational Analysis of Emergency and Essential Surgical Care. Survey of county hospitals using the World Health Organization Tool for Situational Analysis of Emergency and Essential Surgical Care. Sixteen county hospitals in Liberia. Infrastructure, personnel, procedures performed, and supplies and equipment available. Uniformly, gross deficiencies in infrastructure, personnel, and supplies and equipment were identified. The World Health Organization Tool for Situational Analysis of Emergency and Essential Surgical Care was useful in identifying baseline emergency and surgical conditions for evidenced-based planning. To achieve the Poverty Reduction Strategy and delivery of the Basic Package of Health and Social Welfare Services, additional resources and manpower are needed to improve surgical and anesthetic care.
NASA Astrophysics Data System (ADS)
Wang, Yupeng; Chang, Kyunghi
In this paper, we analyze the coexistence issues of M-WiMAX TDD and WCDMA FDD systems. Smart antenna techniques are applied to mitigate the performance loss induced by adjacent channel interference (ACI) in the scenarios where performance is heavily degraded. In addition, an ACI model is proposed to capture the effect of transmit beamforming at the M-WiMAX base station. Furthermore, an MCS-based throughput analysis is proposed to jointly consider the effects of ACI, the system packet error rate requirement, and the available modulation and coding schemes, which is not possible with the conventional Shannon-equation-based analysis. From the results, we find that the proposed MCS-based analysis method is well suited to analyzing the theoretical system throughput in a practical manner.
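A generic way to contrast the two throughput views referenced above (an illustrative formulation, not the paper's exact expressions): a Shannon-based analysis maps the ACI-degraded SINR directly to capacity, whereas an MCS-based analysis credits only the rate of the best modulation and coding scheme whose packet error rate is acceptable at that SINR:

\[
T_{\text{Shannon}} = B \log_{2}\!\left(1 + \mathrm{SINR}\right),
\qquad
T_{\text{MCS}} = \max_{m}\ \bigl(1 - \mathrm{PER}_{m}(\mathrm{SINR})\bigr)\, R_{m},
\]

where \(R_{m}\) and \(\mathrm{PER}_{m}\) are the data rate and packet error rate of scheme \(m\).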
Data processing, multi-omic pathway mapping, and metabolite activity analysis using XCMS Online
Forsberg, Erica M; Huan, Tao; Rinehart, Duane; Benton, H Paul; Warth, Benedikt; Hilmers, Brian; Siuzdak, Gary
2018-01-01
Systems biology is the study of complex living organisms, and as such, analysis on a systems-wide scale involves the collection of information-dense data sets that are representative of an entire phenotype. To uncover dynamic biological mechanisms, bioinformatics tools have become essential to facilitating data interpretation in large-scale analyses. Global metabolomics is one such method for performing systems biology, as metabolites represent the downstream functional products of ongoing biological processes. We have developed XCMS Online, a platform that enables online metabolomics data processing and interpretation. A systems biology workflow recently implemented within XCMS Online enables rapid metabolic pathway mapping using raw metabolomics data for investigating dysregulated metabolic processes. In addition, this platform supports integration of multi-omic (such as genomic and proteomic) data to garner further systems-wide mechanistic insight. Here, we provide an in-depth procedure showing how to effectively navigate and use the systems biology workflow within XCMS Online without a priori knowledge of the platform, including uploading liquid chromatography (LC)–mass spectrometry (MS) data from metabolite-extracted biological samples, defining the job parameters to identify features, correcting for retention time deviations, conducting statistical analysis of features between sample classes and performing predictive metabolic pathway analysis. Additional multi-omics data can be uploaded and overlaid with previously identified pathways to enhance systems-wide analysis of the observed dysregulations. We also describe unique visualization tools to assist in elucidation of statistically significant dysregulated metabolic pathways. Parameter input takes 5–10 min, depending on user experience; data processing typically takes 1–3 h, and data analysis takes ~30 min. PMID:29494574
Smoking increases the risk of diabetic foot amputation: A meta-analysis.
Liu, Min; Zhang, Wei; Yan, Zhaoli; Yuan, Xiangzhen
2018-02-01
Accumulating evidence suggests that smoking is associated with diabetic foot amputation. However, the currently available results are inconsistent and controversial. Therefore, the present study performed a meta-analysis to systematically review the association between smoking and diabetic foot amputation and to investigate the risk factors of diabetic foot amputation. Public databases, including PubMed and Embase, were searched prior to 29 February 2016. Heterogeneity was assessed using Cochran's Q statistic and the I² statistic, and odds ratios (OR) and 95% confidence intervals (CI) were calculated and pooled appropriately. Sensitivity analysis was performed to evaluate the stability of the results. In addition, Egger's test was applied to assess any potential publication bias. A total of eight studies, including five cohort studies and three case-control studies, were included. The data indicated that smoking significantly increased the risk of diabetic foot amputation (OR=1.65; 95% CI, 1.09-2.50; P<0.0001) compared with non-smoking. Sensitivity analysis demonstrated that the pooled analysis did not vary substantially following the exclusion of any one study. Additionally, there was no evidence of publication bias (Egger's test, t=0.1378; P=0.8958). Furthermore, no significant difference was observed between the minor and major amputation groups in patients who smoked (OR=0.79; 95% CI, 0.24-2.58). The results of the present meta-analysis suggested that smoking is a notable risk factor for diabetic foot amputation. Smoking cessation appears to reduce the risk of diabetic foot amputation.
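For readers who want to reproduce the pooling arithmetic, a minimal sketch of inverse-variance fixed-effect pooling with Cochran's Q and I² is shown below; the three study odds ratios and confidence intervals are placeholders, not the eight studies analysed here.

```python
import numpy as np

def fixed_effect_pooled_or(or_values, ci_low, ci_high):
    """Inverse-variance fixed-effect pooling of odds ratios on the log
    scale, with Cochran's Q and I^2 for heterogeneity.  Generic
    meta-analysis arithmetic with placeholder inputs."""
    log_or = np.log(or_values)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE from 95% CI
    w = 1.0 / se ** 2
    pooled = np.sum(w * log_or) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    q = np.sum(w * (log_or - pooled) ** 2)
    i2 = max(0.0, (q - (len(or_values) - 1)) / q) * 100 if q > 0 else 0.0
    ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
    return np.exp(pooled), ci, i2

print(fixed_effect_pooled_or(np.array([1.4, 1.9, 1.2]),
                             np.array([0.9, 1.1, 0.7]),
                             np.array([2.2, 3.3, 2.1])))
```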
The influence of anthropometrics on physical employment standard performance.
Reilly, T; Spivock, M; Prayal-Brown, A; Stockbrugger, B; Blacklock, R
2016-10-01
The Canadian Armed Forces (CAF) recently implemented the Fitness for Operational Requirements of CAF Employment (FORCE), a new physical employment standard (PES). Data collection throughout development included anthropometric profiles of the CAF. To determine if anthropometric measurements and demographic information would predict the performance outcomes of the FORCE and/or Common Military Task Fitness Evaluation (CMTFE). We conducted a secondary analysis of data from FORCE research. We obtained bioelectrical impedance and segmental analysis data. Statistical analysis included correlation and linear regression analyses. Among the 668 study subjects, as predicted, any task requiring lifting, pulling or moving of an object was significantly and positively correlated (r > 0.67) to lean body mass (LBM) measurements. LBM correlated with stretcher carry (r = 0.78) and with lifting actions such as sand bag drag (r = 0.77), vehicle extrication (r = 0.71), sand bag fortification (r = 0.68) and sand bag lift time (r = -0.67). The difference between the correlation of dead mass (DM) with task performance compared with LBM was not statistically significant. DM and LBM can be used in a PES to predict success on military tasks such as casualty evacuation and manual material handling. However, there is no minimum LBM required to perform these tasks successfully. These data direct future research on how we should diversify research participants by anthropometrics, in addition to the traditional demographic variables of gender and age, to highlight potentially important adverse impacts in PES design. In addition, the results can be used to develop better training regimens to facilitate passing a PES. © All rights reserved. ‘The Influence of Anthropometrics on Physical Employment Standard Performance’ has been reproduced with the permission of DND, 2016.
Cost Analysis of Channeled, Distal Chip Laryngoscope for In-office Laryngopharyngeal Biopsies.
Marcus, Sonya; Timen, Micah; Dion, Gregory R; Fritz, Mark A; Branski, Ryan C; Amin, Milan R
2018-02-19
Given that financial considerations play an increasingly prominent role in clinical decision-making, we sought (1) to determine the cost-effectiveness of in-office biopsy for the patient, the provider, and the health-care system, and (2) to determine the diagnostic accuracy of in-office biopsy. Retrospective, financial analyses were performed. Patients who underwent in-office (Current Procedural Terminology Code 31576) or operative biopsy (CPT Code 31535) for laryngopharyngeal lesions were included. Two financial analyses were performed: (1) the average cost of operating room (OR) versus in-office biopsy was calculated, and (2) a break-even analysis was calculated to determine the cost-effectiveness of in-office biopsy for the provider. In addition, the diagnostic accuracy of in-office biopsies and need for additional biopsies or procedures was recorded. Of the 48 patients included in the current study, 28 underwent in-office biopsy. A pathologic sample was obtained in 26 of 28 (92.9%) biopsies performed in the office. Of these patients, 16 avoided subsequent OR procedures. The average per patient cost was $7000 and $11,000 for in-office and OR biopsy, respectively. Break-even analysis demonstrated that the provider could achieve a profit 2 years after purchase of the necessary equipment. In-office laryngopharyngeal biopsies are accurate and, overall, more cost-effective than OR biopsies. Purchase of the channeled, distal chip laryngoscope and biopsy forceps to perform in-office biopsies can be profitable for a provider with a videolaryngoscopy tower. In-office biopsy should be considered the initial diagnostic tool for suspected laryngopharyngeal malignancies noted on videolaryngoscopy. Copyright © 2018 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
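A back-of-the-envelope illustration of the kind of break-even arithmetic described above; every dollar figure and case volume below is a hypothetical placeholder, not a value from the study.

```python
# Break-even calculation for in-office biopsy equipment (all numbers are
# illustrative placeholders, not the paper's financial data).
equipment_cost = 40_000            # laryngoscope + biopsy forceps purchase
margin_per_case = 1_700            # net reimbursement minus per-case cost
cases_per_year = 14                # expected in-office biopsies per year

cases_to_break_even = equipment_cost / margin_per_case
years_to_break_even = cases_to_break_even / cases_per_year
print(f"{cases_to_break_even:.1f} cases, about {years_to_break_even:.1f} years")
```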
Design and application of electromechanical actuators for deep space missions
NASA Technical Reports Server (NTRS)
Haskew, Tim A.; Wander, John
1993-01-01
During the period 8/16/92 through 2/15/93, work has been focused on three major topics: (1) screw modeling and testing; (2) motor selection; and (3) health monitoring and fault diagnosis. Detailed theoretical analysis has been performed to specify a full dynamic model for the roller screw. A test stand has been designed for model parameter estimation and screw testing. In addition, the test stand is expected to be used to perform a study on transverse screw loading.
Vibration waveform effects on dynamic stabilization of ablative Rayleigh-Taylor instability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piriz, A. R.; Lucchio, L. Di; Rodriguez Prieto, G.
2011-08-15
An analysis of dynamic stabilization of Rayleigh-Taylor instability in an ablation front is performed by considering a general square wave for modulating the vertical acceleration of the front. Such a modulation clarifies the role of thermal conduction in the mechanism of dynamic stabilization. In addition, the study of the effect of different modulations by varying the duration and amplitude of the square wave in each half-period provides insight into the optimum performance of dynamic stabilization.
Evaluating evaluation forms form.
Smith, Roger P
2004-02-01
To provide a tool for evaluating evaluation forms. A new form has been developed and tested on itself and a sample of evaluation forms obtained from the graduate medical education offices of several local universities. Additional forms from hospital administration were also subjected to analysis. The new form performed well when applied to itself. The form performed equally well when applied to the other (subject) forms, although their scores were embarrassingly poor. A new form for evaluating evaluation forms is needed, useful, and now available.
Endometritis: Diagnostic Tools for Infectious Endometritis.
Ferris, Ryan A
2016-12-01
Infectious endometritis is among the leading causes of subfertility in the mare. However, the best way to reliably diagnose these cases of infectious endometritis can be confusing to the veterinary practitioner. The goal of this article is to describe how to perform various sample collection techniques, what analyses can be performed on these samples, and how to interpret the results of these analyses. Additionally, future technologies will be presented that are not currently used in equine reproduction practice. Published by Elsevier Inc.
Zonta, F; Stancher, B
1985-07-19
A high-performance liquid chromatographic method for determining phylloquinone (vitamin K1) in soy bean oils is described. Resolution of vitamin K1 from interfering peaks of the matrix was obtained after enzymatic digestion, extraction and liquid-solid chromatography on alumina. Isocratic reversed-phase chromatography with UV detection was used in the final stage. The quantitation was carried out by the standard addition method, and the recovery of the whole procedure was 88.2%.
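A minimal sketch of the standard-addition quantitation step, with illustrative numbers rather than the paper's data: the sample extract is spiked with known amounts of analyte, response is regressed on added concentration, and the magnitude of the x-intercept gives the unspiked concentration.

```python
import numpy as np

added = np.array([0.0, 0.5, 1.0, 1.5, 2.0])        # ug/mL vitamin K1 added
signal = np.array([0.42, 0.83, 1.26, 1.65, 2.08])  # detector response

slope, intercept = np.polyfit(added, signal, 1)
c_sample = intercept / slope                       # |x-intercept| of the fit
print(f"estimated concentration in extract: {c_sample:.2f} ug/mL")
```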
NASA Astrophysics Data System (ADS)
Yong, Kilyuk; Jo, Sujang; Bang, Hyochoong
This paper presents a modified Rodrigues parameter (MRP)-based nonlinear observer design to estimate bias, scale factor and misalignment of gyroscope measurements. A Lyapunov stability analysis is carried out for the nonlinear observer. Simulation is performed and results are presented illustrating the performance of the proposed nonlinear observer under the condition of persistent excitation maneuver. In addition, a comparison between the nonlinear observer and alignment Kalman filter (AKF) is made to highlight favorable features of the nonlinear observer.
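For readers unfamiliar with the notation, the standard MRP attitude kinematics and a commonly used gyro error model of the kind such an observer estimates are shown below (the paper's exact parameterization may differ):

\[
\dot{\boldsymbol{\sigma}} = \frac{1}{4}\Bigl[\bigl(1 - \boldsymbol{\sigma}^{\mathsf{T}}\boldsymbol{\sigma}\bigr)\mathbf{I}_{3} + 2[\boldsymbol{\sigma}\times] + 2\,\boldsymbol{\sigma}\boldsymbol{\sigma}^{\mathsf{T}}\Bigr]\boldsymbol{\omega},
\qquad
\tilde{\boldsymbol{\omega}} = \bigl(\mathbf{I}_{3} + \mathbf{S} + \boldsymbol{\Gamma}\bigr)\boldsymbol{\omega} + \mathbf{b} + \boldsymbol{\eta},
\]

where \(\boldsymbol{\sigma}\) is the MRP vector, \(\boldsymbol{\omega}\) the true angular rate, \(\mathbf{b}\) the gyro bias, \(\mathbf{S}\) a diagonal scale-factor matrix, \(\boldsymbol{\Gamma}\) an off-diagonal misalignment matrix, and \(\boldsymbol{\eta}\) measurement noise. Persistent excitation of \(\boldsymbol{\omega}\), as noted above, is what makes the bias, scale-factor, and misalignment terms separately observable.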
An analysis of aerodynamic requirements for coordinated bank-to-turn autopilots
NASA Technical Reports Server (NTRS)
Arrow, A.
1982-01-01
Two planar missile airframes, both having the potential for improved bank-to-turn control but having different aerodynamic properties, were compared. The comparison was made with advanced-level autopilots using both linear and nonlinear 3-D aerodynamic models to obtain realistic missile body angular rates and control surface incidence. Critical cross-coupling effects are identified, and desirable aerodynamics are recommended for improved coordinated bank-to-turn (CBTT) performance. In addition, recommendations are made for autopilot control law analyses and design techniques for improving CBTT performance.
THz optical design considerations and optimization for medical imaging applications
NASA Astrophysics Data System (ADS)
Sung, Shijun; Garritano, James; Bajwa, Neha; Nowroozi, Bryan; Llombart, Nuria; Grundfest, Warren; Taylor, Zachary D.
2014-09-01
THz imaging system design will play an important role in making possible the imaging of targets with arbitrary properties and geometries. This study discusses design considerations and imaging performance optimization techniques for THz quasioptical imaging system optics. Analysis of field and polarization distortion by off-axis parabolic (OAP) mirrors in THz imaging optics shows how distortions are carried through a series of mirrors while guiding the THz beam. While distortions of the beam profile by individual mirrors are not significant, these effects are compounded by a series of mirrors in antisymmetric orientation. It is shown that symmetric orientation of the OAP mirrors effectively cancels this distortion to recover the original beam profile. Additionally, symmetric orientation can correct for some geometrical off-focusing due to misalignment. We also demonstrate an alternative method to test for overall system optics alignment by investigating the imaging performance of a tilted target plane. An asymmetric signal profile as a function of the target plane's tilt angle indicates that one or more imaging components are misaligned, giving a preferred tilt direction. Such analysis can offer additional insight into often elusive source-device misalignment in an integrated system. Imaging plane tilting characteristics are representative of a 3-D modulation transfer function of the imaging system. A symmetric tilted plane is preferred to optimize imaging performance.
Hybrid, experimental and computational, investigation of mechanical components
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1996-07-01
Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.
Methods of learning in statistical education: Design and analysis of a randomized trial
NASA Astrophysics Data System (ADS)
Boyd, Felicity Turner
Background. Recent psychological and technological advances suggest that active learning may enhance understanding and retention of statistical principles. A randomized trial was designed to evaluate the addition of innovative instructional methods within didactic biostatistics courses for public health professionals. Aims. The primary objectives were to evaluate and compare the addition of two active learning methods (cooperative and internet) on students' performance; assess their impact on performance after adjusting for differences in students' learning style; and examine the influence of learning style on trial participation. Methods. Consenting students enrolled in a graduate introductory biostatistics course were randomized to cooperative learning, internet learning, or control after completing a pretest survey. The cooperative learning group participated in eight small group active learning sessions on key statistical concepts, while the internet learning group accessed interactive mini-applications on the same concepts. Controls received no intervention. Students completed evaluations after each session and a post-test survey. Study outcome was performance quantified by examination scores. Intervention effects were analyzed by generalized linear models using intent-to-treat analysis and marginal structural models accounting for reported participation. Results. Of 376 enrolled students, 265 (70%) consented to randomization; 69, 100, and 96 students were randomized to the cooperative, internet, and control groups, respectively. Intent-to-treat analysis showed no differences between study groups; however, 51% of students in the intervention groups had dropped out after the second session. After accounting for reported participation, expected examination scores were 2.6 points higher (of 100 points) after completing one cooperative learning session (95% CI: 0.3, 4.9) and 2.4 points higher after one internet learning session (95% CI: 0.0, 4.7), versus nonparticipants or controls, adjusting for other performance predictors. Students who preferred learning by reflective observation and active experimentation experienced improved performance through internet learning (5.9 points, 95% CI: 1.2, 10.6) and cooperative learning (2.9 points, 95% CI: 0.6, 5.2), respectively. Learning style did not influence study participation. Conclusions. No performance differences by group were observed by intent-to-treat analysis. Participation in active learning appears to improve student performance in an introductory biostatistics course and provides opportunities for enhancing understanding beyond that attained in traditional didactic classrooms.
Wagner, Mathilde; Corcuera-Solano, Idoia; Lo, Grace; Esses, Steven; Liao, Joseph; Besa, Cecilia; Chen, Nelson; Abraham, Ginu; Fung, Maggie; Babb, James S; Ehman, Richard L; Taouli, Bachir
2017-08-01
Purpose To assess the determinants of technical failure of magnetic resonance (MR) elastography of the liver in a large single-center study. Materials and Methods This retrospective study was approved by the institutional review board. Seven hundred eighty-one MR elastography examinations performed in 691 consecutive patients (mean age, 58 years; male patients, 434 [62.8%]) in a single center between June 2013 and August 2014 were retrospectively evaluated. MR elastography was performed at 3.0 T (n = 443) or 1.5 T (n = 338) by using a gradient-recalled-echo pulse sequence. MR elastography and anatomic image analysis were performed by two observers. Additional observers measured liver T2* and fat fraction. Technical failure was defined as no pixel value with a confidence index higher than 95% and/or no apparent shear waves imaged. Logistic regression analysis was performed to assess potential predictive factors of technical failure of MR elastography. Results The technical failure rate of MR elastography at 1.5 T was 3.5% (12 of 338), while it was higher, 15.3% (68 of 443), at 3.0 T. On the basis of univariate analysis, body mass index, liver iron deposition, massive ascites, use of 3.0 T, presence of cirrhosis, and alcoholic liver disease were all significantly associated with failure of MR elastography (P < .004); but on the basis of multivariable analysis, only body mass index, liver iron deposition, massive ascites, and use of 3.0 T were significantly associated with failure of MR elastography (P < .004). Conclusion The technical failure rate of MR elastography with a gradient-recalled-echo pulse sequence was low at 1.5 T but substantially higher at 3.0 T. Massive ascites, iron deposition, and high body mass index were additional independent factors associated with failure of MR elastography of the liver with a two-dimensional gradient-recalled-echo pulse sequence. © RSNA, 2017.
NASA Technical Reports Server (NTRS)
Ranaudo, R. J.; Batterson, J. G.; Reehorst, A. L.; Bond, T. H.; Omara, T. M.
1989-01-01
A flight test was performed with the NASA Lewis Research Center's DH-6 icing research aircraft. The purpose was to employ a flight test procedure and data analysis method to determine the accuracy with which the effects of ice on aircraft stability and control could be measured. For simplicity, flight testing was restricted to the short-period longitudinal mode. Two flights were flown in a clean (baseline) configuration, and two flights were flown with simulated horizontal tail ice. Forty-five repeat doublet maneuvers were performed in each of four test configurations, at a given trim speed, to determine the ensemble variation of the estimated stability and control derivatives. Additional maneuvers were also performed in each configuration to determine the variation in the longitudinal derivative estimates over a wide range of trim speeds. Stability and control derivatives were estimated by a Modified Stepwise Regression (MSR) technique. A measure of the confidence in the derivative estimates was obtained by comparing the standard error for the ensemble of repeat maneuvers to the average of the estimated standard errors predicted by the MSR program. A multiplicative relationship was determined between the ensemble standard error and the averaged program standard errors. In addition, a 95 percent confidence interval analysis was performed for the elevator effectiveness estimates, $C_{m_{\delta_e}}$. This analysis identified the speed range where changes in $C_{m_{\delta_e}}$ could be attributed to icing effects. The magnitude of icing effects on the derivative estimates was strongly dependent on flight speed and aircraft wing flap configuration. With wing flaps up, the estimated derivatives were degraded most at lower speeds corresponding to that configuration. With wing flaps extended to 10 degrees, the estimated derivatives were degraded most at the higher corresponding speeds. The effects of icing on the changes in longitudinal stability and control derivatives were adequately determined by the flight test procedure and the MSR analysis method discussed herein.
QC-ART: A tool for real-time quality control assessment of mass spectrometry-based proteomics data.
Stanfill, Bryan A; Nakayasu, Ernesto S; Bramer, Lisa M; Thompson, Allison M; Ansong, Charles K; Clauss, Therese; Gritsenko, Marina A; Monroe, Matthew E; Moore, Ronald J; Orton, Daniel J; Piehowski, Paul D; Schepmoes, Athena A; Smith, Richard D; Webb-Robertson, Bobbie-Jo; Metz, Thomas O
2018-04-17
Liquid chromatography-mass spectrometry (LC-MS)-based proteomics studies of large sample cohorts can easily require from months to years to complete. Acquiring consistent, high-quality data in such large-scale studies is challenging because of normal variations in instrumentation performance over time, as well as artifacts introduced by the samples themselves, such as those due to collection, storage and processing. Existing quality control methods for proteomics data primarily focus on post-hoc analysis to remove low-quality data that would degrade downstream statistics; they are not designed to evaluate the data in near real-time, which would allow for interventions as soon as deviations in data quality are detected. In addition to flagging analyses that demonstrate outlier behavior, evaluating how the data structure changes over time can aid in understanding typical instrument performance or identify issues such as a degradation in data quality due to the need for instrument cleaning and/or re-calibration. To address this gap for proteomics, we developed Quality Control Analysis in Real-Time (QC-ART), a tool for evaluating data as they are acquired in order to dynamically flag potential issues with instrument performance or sample quality. QC-ART has accuracy similar to that of standard post-hoc analysis methods, with the additional benefit of real-time analysis. We demonstrate the utility and performance of QC-ART in identifying deviations in data quality due to both instrument and sample issues in near real-time for LC-MS-based plasma proteomics analyses of a sample subset of The Environmental Determinants of Diabetes in the Young cohort. We also present a case where QC-ART facilitated the identification of oxidative modifications, which are often underappreciated in proteomic experiments. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
Parsley, Nancy L; Harris, Ilene B
2012-01-01
The teaching and assessment of professionalism have become central areas of research and practice in medicine and in allopathic and osteopathic undergraduate and graduate medical education generally. In contrast, discussion of professionalism as it relates to podiatric medical education is nearly nonexistent in the literature. A study of podiatric medical students' perceptions of professionalism-related issues in the clinical setting was performed using a qualitative analysis. A written survey was sent to 88 students who had recently completed their clinical training experiences. The survey was completed anonymously, and all identifying information was redacted before analysis of the data, which was performed using thematic content analysis with constant comparative analysis. In addition, basic demographic information was acquired as part of the data collection process. Sixty-six students (75%) responded and agreed to participate in the survey. Students provided written reports of lapses in professional behavior that they had witnessed, heard about, or been personally involved in performing. The study confirmed that podiatric medical students had experienced various types of professional lapses in behavior, and six predominant themes were identified. This study, which was performed with a selected group of individuals at a single institution, serves as an initial assessment of the needs of podiatric medical students and will be useful for developing professionalism-related instructional activities that could benefit students in the future.
Angeler, David G; Viedma, Olga; Moreno, José M
2009-11-01
Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinate of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to result in decreased statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-) communities.
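For readers unfamiliar with TLA, the following is a minimal sketch of the core computation as generally described in the literature: community dissimilarity regressed on the square root of the time lag separating samples. The Bray-Curtis choice, toy data, and simple least-squares fit are assumptions of the example, not details taken from this study.

```python
# Minimal sketch of time lag analysis (TLA): a significant positive slope of
# dissimilarity versus (square root of) time lag indicates directional change.
import numpy as np
from itertools import combinations

def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two non-negative abundance vectors."""
    return np.abs(x - y).sum() / (x + y).sum()

def time_lag_analysis(abundances, times):
    """abundances: (n_samples, n_species); times: (n_samples,). Returns slope, intercept."""
    lags, dissims = [], []
    for i, j in combinations(range(len(times)), 2):
        lags.append(np.sqrt(abs(times[j] - times[i])))
        dissims.append(bray_curtis(abundances[i], abundances[j]))
    slope, intercept = np.polyfit(lags, dissims, 1)
    return slope, intercept

# Usage with a toy community drifting over 10 sampling dates.
rng = np.random.default_rng(1)
t = np.arange(10)
comm = np.cumsum(rng.poisson(2, size=(10, 5)), axis=0).astype(float) + 1
print(time_lag_analysis(comm, t))
```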
Andersson, Johanna; Wikström, Ewa
2014-01-01
The purpose of this paper is to analyse how accounts of collaboration practice were made and used to construct accountability in the empirical context of coordination associations, a Swedish form of collaboration between four authorities in health and social care. They feature pooled budgets, joint leadership and joint reporting systems, intended to facilitate both collaboration and (shared) accountability. Empirical data were collected in field observations in local, regional and national settings. In addition, the study is based on analysis of local association documents such as evaluations and annual reports, and analysis of national agency reports. Accountability is constructed hierarchically with a narrow focus on performance, and horizontal (shared) accountability as well as outcomes are de-emphasised. Through this narrow construction of accountability the coordination associations are re-created as hierarchical and accountability is delegated rather than shared. Features such as pooled budgets, joint leadership and joint reporting systems can support collaboration but do not necessarily translate into shared accountability if accountability is interpreted and constructed hierarchically. When practice conforms to what is counted and accounted for, using the hierarchical and narrow construction of accountability, the result may be that the associations become an additional authority. That would increase rather than decrease fragmentation in the field. This research derives from first-hand observations of actor-to-actor episodes complemented with the analysis of documents and reports. It provides critical analysis of the construction and evaluation of accounts and accountability related to practice and performance in collaboration. The main contribution is the finding that despite the conditions intended to facilitate inter-organisational collaboration and horizontal accountability, the hierarchical accountability persisted.
Performance of Blind Source Separation Algorithms for FMRI Analysis using a Group ICA Method
Correa, Nicolle; Adali, Tülay; Calhoun, Vince D.
2007-01-01
Independent component analysis (ICA) is a popular blind source separation (BSS) technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely information maximization, maximization of non-Gaussianity, joint diagonalization of cross-cumulant matrices, and second-order correlation-based methods, when they are applied to fMRI data from subjects performing a visuo-motor task. We use a group ICA method to study the variability among different ICA algorithms and propose several analysis techniques to evaluate their performance. We compare how different ICA algorithms estimate activations in expected neuronal areas. The results demonstrate that the ICA algorithms using higher-order statistical information prove to be quite consistent for fMRI data analysis. Infomax, FastICA, and JADE all yield reliable results, each having its strengths in specific areas. EVD, an algorithm using second-order statistics, does not perform reliably for fMRI data. Additionally, for the iterative ICA algorithms, it is important to investigate the variability of the estimates from different runs. We test the consistency of the iterative algorithms, Infomax and FastICA, by running the algorithm a number of times with different initializations and note that they yield consistent results over these multiple runs. Our results greatly improve our confidence in the consistency of ICA for fMRI data analysis. PMID:17540281
Performance of blind source separation algorithms for fMRI analysis using a group ICA method.
Correa, Nicolle; Adali, Tülay; Calhoun, Vince D
2007-06-01
Independent component analysis (ICA) is a popular blind source separation technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely, information maximization, maximization of non-Gaussianity, joint diagonalization of cross-cumulant matrices and second-order correlation-based methods, when they are applied to fMRI data from subjects performing a visuo-motor task. We use a group ICA method to study variability among different ICA algorithms, and we propose several analysis techniques to evaluate their performance. We compare how different ICA algorithms estimate activations in expected neuronal areas. The results demonstrate that the ICA algorithms using higher-order statistical information prove to be quite consistent for fMRI data analysis. Infomax, FastICA and joint approximate diagonalization of eigenmatrices (JADE) all yield reliable results, with each having its strengths in specific areas. Eigenvalue decomposition (EVD), an algorithm using second-order statistics, does not perform reliably for fMRI data. Additionally, for iterative ICA algorithms, it is important to investigate the variability of estimates from different runs. We test the consistency of the iterative algorithms Infomax and FastICA by running the algorithm a number of times with different initializations, and we note that they yield consistent results over these multiple runs. Our results greatly improve our confidence in the consistency of ICA for fMRI data analysis.
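A minimal, hedged illustration of one of the compared algorithm classes: scikit-learn's FastICA applied to synthetic mixtures. This is not the paper's group ICA pipeline or its evaluation procedure; the data shapes, sources, and noise level are invented for the example.

```python
# Minimal sketch: recover independent "spatial" sources from linear mixtures
# with FastICA, one of the higher-order-statistics algorithms compared above.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
n_voxels, n_timepoints = 500, 100

# Two synthetic spatial sources and random time courses that mix them.
sources = np.vstack([rng.laplace(size=n_voxels), rng.laplace(size=n_voxels)])
mixing = rng.normal(size=(n_timepoints, 2))
data = mixing @ sources + 0.1 * rng.normal(size=(n_timepoints, n_voxels))

ica = FastICA(n_components=2, random_state=0)
time_courses = ica.fit_transform(data)   # (n_timepoints, 2) estimated mixing
spatial_maps = ica.components_           # (2, n_voxels) estimated sources
print(time_courses.shape, spatial_maps.shape)
```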
Ionic Liquids as Novel Lubricants and /or Lubricant Additives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qu, J.; Viola, M. B.
2013-10-31
This ORNL-GM CRADA developed ionic liquids (ILs) as novel lubricants or oil additives for engine lubrication. A new group of oil-miscible ILs has been designed and synthesized with high thermal stability, non-corrosiveness, excellent wettability, and, most importantly, effective anti-scuffing/anti-wear and friction reduction characteristics. Mechanistic analysis attributes the superior lubricating performance of IL additives to their physical and chemical interactions with metallic surfaces. Working with a leading lubricant formulation company, the team has successfully developed a prototype low-viscosity engine oil using a phosphonium-phosphate IL as an anti-wear additive. Tribological bench tests of the IL-additized formulated oil showed 20-33% lower friction in mixed and elastohydrodynamic lubrication and 38-92% lower wear in boundary lubrication when compared with commercial Mobil 1 and Mobil Clean 5W-30 engine oils. High-temperature, high-load (HTHL) full-size engine tests confirmed the excellent anti-wear performance of the IL-additized engine oil. Sequence VID engine dynamometer tests demonstrated a fuel economy improvement of >2% for this IL-additized engine oil benchmarked against the Mobil 1 5W-30 oil. In addition, accelerated catalyst aging tests suggest that the IL additive may potentially have less adverse impact on three-way catalysts compared to the conventional ZDDP. Follow-on research is needed for further development and optimization of IL chemistry and oil formulation to fully meet ILSAC GF-5 specifications and further enhance automotive engine efficiency and durability.
Code of Federal Regulations, 2011 CFR
2011-07-01
...; (ii) Reestablish eligibility and certification as a private nonprofit, private for-profit, or public... approval of an additional educational program or a location under § 600.10(c). (2) Increase its level of... any wage analysis the institution may have performed, including any consideration of Bureau of Labor...
ERIC Educational Resources Information Center
Peterson, Diana Coomes; Mlynarczyk, Gregory S.A.
2016-01-01
This study examined whether student learning outcome measures are influenced by the addition of three-dimensional and digital teaching tools to a traditional dissection and lecture learning format curriculum. The study was performed in a semester-long graduate-level course that incorporated both gross anatomy and neuroanatomy curricula. Methods…
What are you trying to learn? Study designs and the appropriate analysis for your research question
USDA-ARS?s Scientific Manuscript database
One fundamental necessity in the entire process of a well-performed study is the experimental design. A well-designed study can help researchers understand and have confidence in their results and analyses and, additionally, in the agreement or disagreement with the stated hypothesis. This well-designed...
ERIC Educational Resources Information Center
Jackson, Carla Wood; Schatschneider, Christopher; Leacox, Lindsey
2014-01-01
Purpose: The authors of this study described developmental trajectories and predicted kindergarten performance of Spanish and English receptive vocabulary acquisition of young Latino/a English language learners (ELLs) from socioeconomically disadvantaged migrant families. In addition, the authors examined the extent to which gender and individual…
Some Additional Lessons from the Wechsler Scales: A Rejoinder to Kaufman and Keith.
ERIC Educational Resources Information Center
Macmann, Gregg M.; Barnett, David W.
1994-01-01
Reacts to previous arguments regarding verbal and performance constructs of Wechsler Scales. Contends that general factor model is more plausible representation of data for these scales. Suggests issue is moot when considered in regards to practical applications. Supports analysis of needed skills and instructional environments in educational…
A New Variable Weighting and Selection Procedure for K-Means Cluster Analysis
ERIC Educational Resources Information Center
Steinley, Douglas; Brusco, Michael J.
2008-01-01
A variance-to-range ratio variable weighting procedure is proposed. We show how this weighting method is theoretically grounded in the inherent variability found in data exhibiting cluster structure. In addition, a variable selection procedure is proposed to operate in conjunction with the variable weighting technique. The performances of these…
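Because the abstract is truncated, the following is only one plausible reading of a variance-to-range style weighting applied before k-means; the exact procedure proposed by the authors may differ, and the weight formula and toy data here are assumptions of the sketch.

```python
# Illustrative sketch only: weight each variable by a variance-to-range style
# ratio before k-means, so variables with cluster structure count more than
# uniform "masking" variables. Not necessarily the cited paper's procedure.
import numpy as np
from sklearn.cluster import KMeans

def variance_to_range_weights(X):
    """One possible weighting: variance divided by squared range, per variable."""
    rng_ = X.max(axis=0) - X.min(axis=0)
    return X.var(axis=0) / (rng_ ** 2 + 1e-12)

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(4, 1, (50, 3))])
X[:, 2] = rng.normal(0, 1, 100)   # a noise variable with no cluster structure

w = variance_to_range_weights(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X * np.sqrt(w))
print(np.round(w, 3), labels[:5])
```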
Genome-Wide Association Study of Intelligence: Additive Effects of Novel Brain Expressed Genes
ERIC Educational Resources Information Center
Loo, Sandra K.; Shtir, Corina; Doyle, Alysa E.; Mick, Eric; McGough, James J.; McCracken, James; Biederman, Joseph; Smalley, Susan L.; Cantor, Rita M.; Faraone, Stephen V.; Nelson, Stanley F.
2012-01-01
Objective: The purpose of the present study was to identify common genetic variants that are associated with human intelligence or general cognitive ability. Method: We performed a genome-wide association analysis with a dense set of 1 million single-nucleotide polymorphisms (SNPs) and quantitative intelligence scores within an ancestrally…
Kemerdere, Rahsan; Ahmedov, Merdin Lyutviev; Alizada, Orkhan; Yeni, Seher Naz; Oz, Buge; Tanriverdi, Taner
2018-05-23
Temporal lobe epilepsy (TLE) is the most common form of focal epilepsy. Focal cortical dysplasia is the most common dual pathology found in association with the hippocampal sclerosis. In this study, the effect of dual pathology on freedom from seizure was sought in patients with TLE. This study performed a retrospective analysis of patients with TLE who underwent surgery between 2010 and 2017. Histopathologic analysis was performed on patients with and without dual pathology in the temporal neocortex. Seizure outcomes were compared. A total of 54 patients with TLE were included. The rate of overall favorable seizure outcome was found to be 96.3%. In 53.7%, dual pathology was present in the temporal cortices in addition to the hippocampal sclerosis. Patients without dual pathology showed significantly greater freedom from seizure (P = 0.02). Patients without dual pathology had a significantly higher seizure-free rate after anterior temporal resection than patients with dual pathology. Resection of the temporal cortex in addition to mesial temporal structures seems to be reasonable for better seizure outcome.
Global performance enhancements via pedestal optimisation on ASDEX Upgrade
NASA Astrophysics Data System (ADS)
Dunne, M. G.; Frassinetti, L.; Beurskens, M. N. A.; Cavedon, M.; Fietz, S.; Fischer, R.; Giannone, L.; Huijsmans, G. T. A.; Kurzan, B.; Laggner, F.; McCarthy, P. J.; McDermott, R. M.; Tardini, G.; Viezzer, E.; Willensdorfer, M.; Wolfrum, E.; The EUROfusion MST1 Team; The ASDEX Upgrade Team
2017-02-01
Results of experimental scans of heating power, plasma shape, and nitrogen content are presented, with a focus on global performance and pedestal alteration. In detailed scans at low triangularity, it is shown that the increase in stored energy due to nitrogen seeding stems from the pedestal. It is also shown that the confinement increase is driven through the temperature pedestal at the three heating power levels studied. In a triangularity scan, an orthogonal effect of shaping and seeding is observed, where increased plasma triangularity increases the pedestal density, while impurity seeding (carbon and nitrogen) increases the pedestal temperature in addition to this effect. Modelling of these effects was also undertaken, with interpretive and predictive models being employed. The interpretive analysis shows a general agreement of the experimental pedestals in separate power, shaping, and seeding scans with peeling-ballooning theory. Predictive analysis was used to isolate the individual effects, showing that the trends of additional heating power and increased triangularity can be recovered. However, a simple change of the effective charge in the plasma cannot explain the observed levels of confinement improvement in the present models.
3D printed fluidics with embedded analytic functionality for automated reaction optimisation
Capel, Andrew J; Wright, Andrew; Harding, Matthew J; Weaver, George W; Li, Yuqi; Harris, Russell A; Edmondson, Steve; Goodridge, Ruth D
2017-01-01
Additive manufacturing or ‘3D printing’ is being developed as a novel manufacturing process for the production of bespoke micro- and milliscale fluidic devices. When coupled with online monitoring and optimisation software, this offers an advanced, customised method for performing automated chemical synthesis. This paper reports the use of two additive manufacturing processes, stereolithography and selective laser melting, to create multifunctional fluidic devices with embedded reaction monitoring capability. The selectively laser melted parts are the first published examples of multifunctional 3D printed metal fluidic devices. These devices allow high temperature and pressure chemistry to be performed in solvent systems destructive to the majority of devices manufactured via stereolithography, polymer jetting and fused deposition modelling processes previously utilised for this application. These devices were integrated with commercially available flow chemistry, chromatographic and spectroscopic analysis equipment, allowing automated online and inline optimisation of the reaction medium. This set-up allowed the optimisation of two reactions, a ketone functional group interconversion and a fused polycyclic heterocycle formation, via spectroscopic and chromatographic analysis. PMID:28228852
Hey, Matthias; Hocke, Thomas; Mauger, Stefan; Müller-Deile, Joachim
2016-11-01
Individual speech intelligibility was measured in quiet and noise for cochlear implant recipients upgrading from the Freedom to the CP900 series sound processor. The postlingually deafened participants (n = 23) used either a Nucleus CI24RE or a CI512 cochlear implant, and currently wore a Freedom sound processor. A significant group mean improvement in speech intelligibility was found in quiet (Freiburg monosyllabic words at 50 dB SPL) and in noise (adaptive Oldenburger sentences in noise) for the two CP900 series SmartSound programs compared to the Freedom program. Further analysis was carried out on individuals' speech intelligibility outcomes in quiet and in noise. Results showed a significant improvement or decrement for some recipients when upgrading to the new programs. To further increase speech intelligibility outcomes when upgrading, an enhanced upgrade procedure is proposed that includes additional testing with different signal-processing schemes. Implications of this research are that future automated scene analysis and switching technologies could provide additional performance improvements by introducing individualized scene-dependent settings.
Therapy of bovine endometritis with prostaglandin F2α: a meta-analysis.
Haimerl, P; Heuwieser, W; Arlt, S
2013-05-01
The objective of the conducted meta-analysis was to assess the efficacy of the treatment of bovine endometritis with PGF(2α) by statistical means. Postpartum uterine infections have a high prevalence and a very negative effect on reproductive performance in dairy cattle. Because of a wide discordance between research results, a meta-analysis of the efficacy of the treatment of bovine endometritis with PGF(2α) was conducted. A comprehensive literature search was performed using online databases to reveal a total of 2,307 references. In addition, 5 articles were retrieved by reviewing citations. After applying specific exclusion criteria and evaluating specific evidence parameters, 5 publications, comprising 6 trials, were eligible for meta-analysis. Data for each trial were extracted and analyzed using the meta-analysis software Review Manager (version 5.1; The Nordic Cochrane Centre, Copenhagen, Denmark). Estimated effect sizes of PGF(2α) were calculated on calving to first service and calving to conception interval. Prostaglandin F(2α) treatment of cows with chronic endometritis had a negative effect on both reproductive performance parameters. Heterogeneity was substantial for calving to first service and calving to conception interval [I(2) (measure of variation beyond chance)=100 and 87%, respectively]; therefore, random-effects models were used. Sensitivity analysis as well as subgroup analysis showed that the performance of randomization was influential in modifying the effect size of PGF(2α) treatment. The funnel plot illustrated a publication bias toward smaller studies that reported a prolonged calving to conception interval after PGF(2α) treatment. We conclude that the investigation of this subject by means of meta-analysis did not reveal an improvement of reproductive performance of cows with endometritis after treatment with PGF(2α). Furthermore, there is a shortage of comparable high-quality studies investigating reproductive performance after PGF(2α) treatment of cows with chronic endometritis.
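For context, a random-effects meta-analysis of the kind reported above can be sketched with the DerSimonian-Laird estimator and an I^2 heterogeneity statistic; the effect sizes below are made-up numbers, not data from the review.

```python
# Illustrative DerSimonian-Laird random-effects meta-analysis with I^2,
# fitted here to invented trial-level effect sizes (not the study's data).
import numpy as np
from scipy.stats import norm

def dersimonian_laird(effects, ses):
    y, se = np.asarray(effects, float), np.asarray(ses, float)
    w = 1.0 / se ** 2
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)          # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)               # between-study variance
    w_star = 1.0 / (se ** 2 + tau2)
    y_re = np.sum(w_star * y) / np.sum(w_star)  # pooled random-effects estimate
    se_re = np.sqrt(1.0 / np.sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    p = 2 * norm.sf(abs(y_re / se_re))
    return y_re, se_re, i2, p

# Toy data: mean differences in days to first service across 4 trials.
print(dersimonian_laird([8.0, 15.0, -2.0, 12.0], [3.0, 4.0, 5.0, 3.5]))
```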
Open access for ALICE analysis based on virtualization technology
NASA Astrophysics Data System (ADS)
Buncic, P.; Gheata, M.; Schutz, Y.
2015-12-01
Open access is an important lever for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime, it is crucial that third-party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE-specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system.
NASA Astrophysics Data System (ADS)
Fuadiah, N. F.; Suryadi, D.; Turmudi
2018-05-01
This study focuses on the design of a didactical situation in addition and subtraction involving negative integers at the pilot experiment phase. As we know, negative numbers are an obstacle for students in solving problems related to them. This study aims to create a didactical design that can assist students in understanding addition and subtraction. Another expected result is that students are introduced to the characteristics of addition and subtraction of integers. The design was implemented on 32 seventh grade students in one of the classes in a junior secondary school as the pilot experiment. Learning activities were observed thoroughly, including the students' responses that emerged during the learning activities. The written documentation of the students was also used to support the analysis of the learning activities. The results of the analysis showed that this method could help the students perform a large number of integer operations that could not be done with a number line. The teacher's support, as a potential didactical contract, was still needed to encourage institutionalization processes. The results of the design analysis, used as the basis of the revision, are expected to be implemented by the teacher in the teaching experiment.
van der Harst, Pim; Verweij, Niek
2018-02-02
Coronary artery disease (CAD) is a complex phenotype driven by genetic and environmental factors. Ninety-seven genetic risk loci have been identified to date, but the identification of additional susceptibility loci might be important to enhance our understanding of the genetic architecture of CAD. Our objective was to expand the number of genome-wide significant loci, catalog functional insights, and enhance our understanding of the genetic architecture of CAD. We performed a genome-wide association study in 34 541 CAD cases and 261 984 controls of the UK Biobank resource, followed by replication in 88 192 cases and 162 544 controls from CARDIoGRAMplusC4D. We identified 75 loci that replicated and were genome-wide significant (P<5×10−8) in meta-analysis, 13 of which had not been reported previously. Next, to further identify novel loci, we identified all promising (P<0.0001) loci in the CARDIoGRAMplusC4D data and performed reciprocal replication and meta-analyses with UK Biobank. This led to the identification of 21 additional novel loci reaching genome-wide significance (P<5×10−8) in meta-analysis. Finally, we performed a genome-wide meta-analysis of all available data, revealing 30 additional novel loci (P<5×10−8) without further replication. The increase in sample size provided by UK Biobank raised the proportion of reconstituted gene sets implicated in CAD from 4.2% to 13.9% of all gene sets. For the 64 novel loci, 155 candidate causal genes were prioritized, many without an obvious connection to CAD. Fine mapping of the 161 CAD loci generated lists of credible sets of single causal variants and genes for functional follow-up. Genetic risk variants of CAD were linked to development of atrial fibrillation, heart failure, and death. We identified 64 novel genetic risk loci for CAD and performed fine mapping of all 161 risk loci to obtain a credible set of causal variants. The large expansion of reconstituted gene sets argues in favor of an expanded omnigenic model view on the genetic architecture of CAD.
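A minimal sketch of the inverse-variance fixed-effects meta-analysis that underlies discovery-plus-replication GWAS combinations such as this one; the per-study effect sizes and standard errors are illustrative placeholders, not values from the study.

```python
# Fixed-effects inverse-variance meta-analysis of one SNP across two studies
# (e.g., a discovery and a replication cohort). Numbers are made up.
import numpy as np
from scipy.stats import norm

def inverse_variance_meta(betas, ses):
    betas, ses = np.asarray(betas, float), np.asarray(ses, float)
    w = 1.0 / ses ** 2
    beta_meta = np.sum(w * betas) / np.sum(w)
    se_meta = np.sqrt(1.0 / np.sum(w))
    z = beta_meta / se_meta
    p = 2 * norm.sf(abs(z))
    return beta_meta, se_meta, p

beta, se, p = inverse_variance_meta(betas=[0.08, 0.06], ses=[0.015, 0.012])
print(f"meta beta={beta:.3f}, se={se:.3f}, p={p:.2e}")
```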
Improvements in analysis techniques for segmented mirror arrays
NASA Astrophysics Data System (ADS)
Michels, Gregory J.; Genberg, Victor L.; Bisson, Gary R.
2016-08-01
The employment of actively controlled segmented mirror architectures has become increasingly common in the development of current astronomical telescopes. Optomechanical analysis of such hardware presents unique issues compared to that of monolithic mirror designs. The work presented here is a review of current capabilities and improvements in the methodology of the analysis of mechanically induced surface deformation of such systems. The recent improvements include the capability to differentiate surface deformation at the array and segment level. This differentiation, which allows surface deformation analysis at the individual segment level, offers useful insight into the mechanical behavior of the segments that is unavailable by analysis solely at the parent array level. In addition, the capability to characterize the full displacement vector deformation of collections of points allows analysis of mechanical disturbance predictions of assembly interfaces relative to other assembly interfaces. This capability, called racking analysis, allows engineers to develop designs for segment-to-segment phasing performance in assembly integration, 0g release, and thermal stability of operation. The performance predicted by racking analysis has the advantage of being comparable to the measurements used in assembly of hardware. Approaches to all of the above issues are presented and demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
Development of Lithium Dimethyl Phosphate as an Electrolyte Additive for Lithium Ion Batteries
Milien, Mickdy S.; Tottempudi, Usha; Son, Miyoung; ...
2016-04-27
The novel electrolyte additive lithium dimethyl phosphate (LiDMP) has been synthesized and characterized. Incorporation of LiDMP (0.1 wt%) into LiPF6 in ethylene carbonate (EC) / ethyl methyl carbonate (EMC) (3:7 wt) results in improved rate performance and reduced impedance for graphite / LiNi1/3Mn1/3Co1/3O2 cells. Ex-situ surface analysis of the electrodes suggests that incorporation of LiDMP results in a modification of the solid electrolyte interphase (SEI) on the anode. A decrease in the concentration of lithium alkyl carbonates and an increase in the concentration of lithium fluoro phosphates are observed. The change in the anode SEI structure is responsible for the increased rate performance and decreased cell impedance.
A Systems Framework for Assessing Plumbing Products-Related Water Conservation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Alison; Dunham Whitehead, Camilla; Lutz, James
2011-12-02
Reducing the water use of plumbing products—toilets, urinals, faucets, and showerheads—has been a popular conservation measure. Improved technologies have created opportunities for additional conservation in this area. However, plumbing products do not operate in a vacuum. This paper reviews the literature related to plumbing products to determine a systems framework for evaluating future conservation measures using these products. The main framework comprises the following categories: water use efficiency, product components, product performance, source water, energy, and plumbing/sewer infrastructure. This framework for analysis provides a starting point for professionals considering future water conservation measures to evaluate the need for additional research, collaboration with other standards or codes committees, and attachment of additional metrics to water use efficiency (such as performance).
Patient satisfaction in Dental Healthcare Centers.
Ali, Dena A
2016-01-01
This study aimed to (1) measure the degree of patient satisfaction among the clinical and nonclinical dental services offered at specialty dental centers and (2) investigate the factors associated with the degree of overall satisfaction. Four hundred and ninety-seven participants from five dental centers were recruited for this study. Each participant completed a self-administered questionnaire to measure patient satisfaction with clinical and nonclinical dental services. Analysis of variance, t-tests, a general linear model, and stepwise regression analysis were applied. The respondents were generally satisfied, but internal differences were observed. They exhibited the highest satisfaction with the dentists' performance, followed by the dental assistants' services, and the lowest satisfaction with the center's physical appearance and accessibility. Females, participants with less than a bachelor's degree, and younger individuals were more satisfied with the clinical and nonclinical dental services. The stepwise regression analysis revealed that the coefficient of determination (R2) was 40.4%. The patient satisfaction with the performance of the dentists explained 42.6% of the overall satisfaction, whereas their satisfaction with the clinical setting explained 31.5% of the overall satisfaction. Additional improvements with regard to the accessibility and physical appearance of the dental centers are needed. In addition, interventions regarding accessibility, particularly when booking an appointment, are required.
Bicycle: a bioinformatics pipeline to analyze bisulfite sequencing data.
Graña, Osvaldo; López-Fernández, Hugo; Fdez-Riverola, Florentino; González Pisano, David; Glez-Peña, Daniel
2018-04-15
High-throughput sequencing of bisulfite-converted DNA is a technique used to measure DNA methylation levels. Although a considerable number of computational pipelines have been developed to analyze such data, none of them tackles all the peculiarities of the analysis together, revealing limitations that can force the user to manually perform additional steps needed for a complete processing of the data. This article presents bicycle, an integrated, flexible analysis pipeline for bisulfite sequencing data. Bicycle analyzes whole genome bisulfite sequencing data, targeted bisulfite sequencing data and hydroxymethylation data. To show how bicycle surpasses other available pipelines, we compared them on a defined number of features that are summarized in a table. We also tested bicycle with both simulated and real datasets, to show its level of performance, and compared it to different state-of-the-art methylation analysis pipelines. Bicycle is publicly available under the GNU LGPL v3.0 license at http://www.sing-group.org/bicycle. Users can also download a customized Ubuntu LiveCD including bicycle and other bisulfite sequencing data pipelines compared here. In addition, a docker image with bicycle and its dependencies, which allows a straightforward use of bicycle on any platform (e.g. Linux, OS X or Windows), is also available.
CRAX/Cassandra Reliability Analysis Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, D.
1999-02-10
Over the past few years Sandia National Laboratories has been moving toward an increased dependence on model- or physics-based analyses as a means to assess the impact of long-term storage on the nuclear weapons stockpile. These deterministic models have also been used to evaluate replacements for aging systems, often involving commercial off-the-shelf (COTS) components. In addition, the models have been used to assess the performance of replacement components manufactured via unique, small-lot production runs. In either case, the limited amount of available test data dictates that the only logical course of action to characterize the reliability of these components is to specifically consider the uncertainties in material properties, operating environment, etc. within the physics-based (deterministic) model. This not only provides the ability to statistically characterize the expected performance of the component or system, but also provides direction regarding the benefits of additional testing on specific components within the system. An effort was therefore initiated to evaluate the capabilities of existing probabilistic methods and, if required, to develop new analysis methods to support the inclusion of uncertainty in the classical design tools used by analysts and design engineers at Sandia. The primary result of this effort is the CRAX (Cassandra Exoskeleton) reliability analysis software.
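A hedged illustration of the general physics-based probabilistic approach described above (not CRAX/Cassandra itself): sample uncertain inputs, push them through a deterministic performance model, and estimate a failure probability. The distributions, units, and limit state are invented for the example.

```python
# Illustrative Monte Carlo uncertainty propagation through a simple
# deterministic limit-state model to estimate a probability of failure.
import numpy as np

def limit_state(strength, load):
    """Deterministic performance function: positive means the part survives."""
    return strength - load

rng = np.random.default_rng(4)
n = 100_000
strength = rng.normal(loc=520.0, scale=30.0, size=n)            # assumed MPa
load = rng.lognormal(mean=np.log(400.0), sigma=0.08, size=n)     # assumed MPa

g = limit_state(strength, load)
p_fail = np.mean(g <= 0)
print(f"Estimated probability of failure: {p_fail:.4f}")
```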
Patient satisfaction in Dental Healthcare Centers
Ali, Dena A.
2016-01-01
Objectives: This study aimed to (1) measure the degree of patient satisfaction among the clinical and nonclinical dental services offered at specialty dental centers and (2) investigate the factors associated with the degree of overall satisfaction. Materials and Methods: Four hundred and ninety-seven participants from five dental centers were recruited for this study. Each participant completed a self-administered questionnaire to measure patient satisfaction with clinical and nonclinical dental services. Analysis of variance, t-tests, a general linear model, and stepwise regression analysis were applied. Results: The respondents were generally satisfied, but internal differences were observed. They exhibited the highest satisfaction with the dentists' performance, followed by the dental assistants' services, and the lowest satisfaction with the center's physical appearance and accessibility. Females, participants with less than a bachelor's degree, and younger individuals were more satisfied with the clinical and nonclinical dental services. The stepwise regression analysis revealed that the coefficient of determination (R2) was 40.4%. The patient satisfaction with the performance of the dentists explained 42.6% of the overall satisfaction, whereas their satisfaction with the clinical setting explained 31.5% of the overall satisfaction. Conclusion: Additional improvements with regard to the accessibility and physical appearance of the dental centers are needed. In addition, interventions regarding accessibility, particularly when booking an appointment, are required. PMID:27403045
Discrimination Enhancement with Transient Feature Analysis of a Graphene Chemical Sensor.
Nallon, Eric C; Schnee, Vincent P; Bright, Collin J; Polcha, Michael P; Li, Qiliang
2016-01-19
A graphene chemical sensor is subjected to a set of structurally and chemically similar hydrocarbon compounds consisting of toluene, o-xylene, p-xylene, and mesitylene. The fractional change in resistance of the sensor upon exposure to these compounds exhibits a similar response magnitude among compounds, whereas large variation is observed within repetitions for each compound, causing a response overlap. Therefore, traditional features depending on maximum response change will cause confusion during further discrimination and classification analysis. More robust features that are less sensitive to concentration, sampling, and drift variability would provide higher-quality information. In this work, we have explored the advantage of using transient-based exponential fitting coefficients to enhance the discrimination of similar compounds. The advantages of such feature analysis to discriminate each compound are evaluated using principal component analysis (PCA). In addition, machine learning-based classification algorithms were used to compare the prediction accuracies when using fitting coefficients as features. The additional features greatly enhanced the discrimination between compounds while performing PCA and also improved the prediction accuracy by 34% when using linear discriminant analysis.
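A minimal sketch of the transient-feature idea: fit an assumed saturating-exponential model to each response curve, use the fitted coefficients as features, and project them with PCA. The response model, parameters, and toy data are assumptions of the example rather than the paper's exact pipeline.

```python
# Fit exponential transients, use the coefficients as features, then apply PCA.
import numpy as np
from scipy.optimize import curve_fit
from sklearn.decomposition import PCA

def transient(t, a, tau, c):
    """Assumed saturating-exponential response model."""
    return a * (1.0 - np.exp(-t / tau)) + c

def fit_features(t, responses):
    """Return an (n_samples, 3) array of fitted (a, tau, c) per transient."""
    feats = []
    for y in responses:
        popt, _ = curve_fit(transient, t, y, p0=[y.max(), 1.0, y[0]], maxfev=5000)
        feats.append(popt)
    return np.array(feats)

# Toy data: two "compounds" differing mainly in their time constants.
rng = np.random.default_rng(5)
t = np.linspace(0, 10, 200)
resp_a = [transient(t, 1.0, 0.8, 0.0) + 0.02 * rng.normal(size=t.size) for _ in range(10)]
resp_b = [transient(t, 1.0, 2.5, 0.0) + 0.02 * rng.normal(size=t.size) for _ in range(10)]

X = fit_features(t, resp_a + resp_b)
scores = PCA(n_components=2).fit_transform(X)
print(scores[:3])
```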
A genome-wide association study of corneal astigmatism: The CREAM Consortium
Shah, Rupal L.; Li, Qing; Zhao, Wanting; Tedja, Milly S.; Tideman, J. Willem L.; Khawaja, Anthony P.; Fan, Qiao; Yazar, Seyhan; Williams, Katie M.; Verhoeven, Virginie J.M.; Xie, Jing; Wang, Ya Xing; Hess, Moritz; Nickels, Stefan; Lackner, Karl J.; Pärssinen, Olavi; Wedenoja, Juho; Biino, Ginevra; Concas, Maria Pina; Uitterlinden, André; Rivadeneira, Fernando; Jaddoe, Vincent W.V.; Hysi, Pirro G.; Sim, Xueling; Tan, Nicholas; Tham, Yih-Chung; Sensaki, Sonoko; Hofman, Albert; Vingerling, Johannes R.; Jonas, Jost B.; Mitchell, Paul; Hammond, Christopher J.; Höhn, René; Baird, Paul N.; Wong, Tien-Yin; Cheng, Ching-Yu; Teo, Yik Ying; Mackey, David A.; Williams, Cathy; Saw, Seang-Mei; Klaver, Caroline C.W.; Bailey-Wilson, Joan E.
2018-01-01
Purpose: To identify genes and genetic markers associated with corneal astigmatism. Methods: A meta-analysis of genome-wide association studies (GWASs) of corneal astigmatism undertaken for 14 European ancestry (n=22,250) and 8 Asian ancestry (n=9,120) cohorts was performed by the Consortium for Refractive Error and Myopia. Cases were defined as having >0.75 diopters of corneal astigmatism. Subsequent gene-based and gene-set analyses of the meta-analyzed results of European ancestry cohorts were performed using VEGAS2 and MAGMA software. Additionally, estimates of single nucleotide polymorphism (SNP)-based heritability for corneal and refractive astigmatism and the spherical equivalent were calculated for Europeans using LD score regression. Results: The meta-analysis of all cohorts identified a genome-wide significant locus near the platelet-derived growth factor receptor alpha (PDGFRA) gene: top SNP: rs7673984, odds ratio=1.12 (95% CI:1.08–1.16), p=5.55×10−9. No other genome-wide significant loci were identified in the combined analysis or European/Asian ancestry-specific analyses. Gene-based analysis identified three novel candidate genes for corneal astigmatism in Europeans—claudin-7 (CLDN7), acid phosphatase 2, lysosomal (ACP2), and TNF alpha-induced protein 8 like 3 (TNFAIP8L3). Conclusions: In addition to replicating a previously identified genome-wide significant locus for corneal astigmatism near the PDGFRA gene, gene-based analysis identified three novel candidate genes, CLDN7, ACP2, and TNFAIP8L3, that warrant further investigation to understand their role in the pathogenesis of corneal astigmatism. The much lower number of genetic variants and genes demonstrating an association with corneal astigmatism compared to published spherical equivalent GWAS analyses suggests a greater influence of rare genetic variants, non-additive genetic effects, or environmental factors in the development of astigmatism. PMID:29422769
Henden, Lyndal; Lee, Stuart; Mueller, Ivo; Barry, Alyssa; Bahlo, Melanie
2018-05-01
Identification of genomic regions that are identical by descent (IBD) has proven useful for human genetic studies where analyses have led to the discovery of familial relatedness and fine-mapping of disease critical regions. Unfortunately however, IBD analyses have been underutilized in analysis of other organisms, including human pathogens. This is in part due to the lack of statistical methodologies for non-diploid genomes in addition to the added complexity of multiclonal infections. As such, we have developed an IBD methodology, called isoRelate, for analysis of haploid recombining microorganisms in the presence of multiclonal infections. Using the inferred IBD status at genomic locations, we have also developed a novel statistic for identifying loci under positive selection and propose relatedness networks as a means of exploring shared haplotypes within populations. We evaluate the performance of our methodologies for detecting IBD and selection, including comparisons with existing tools, then perform an exploratory analysis of whole genome sequencing data from a global Plasmodium falciparum dataset of more than 2500 genomes. This analysis identifies Southeast Asia as having many highly related isolates, possibly as a result of both reduced transmission from intensified control efforts and population bottlenecks following the emergence of antimalarial drug resistance. Many signals of selection are also identified, most of which overlap genes that are known to be associated with drug resistance, in addition to two novel signals observed in multiple countries that have yet to be explored in detail. Additionally, we investigate relatedness networks over the selected loci and determine that one of these sweeps has spread between continents while the other has arisen independently in different countries. IBD analysis of microorganisms using isoRelate can be used for exploring population structure, positive selection and haplotype distributions, and will be a valuable tool for monitoring disease control and elimination efforts of many diseases.
Grisham, Rachel N.; Sylvester, Brooke E.; Won, Helen; McDermott, Gregory; DeLair, Deborah; Ramirez, Ricardo; Yao, Zhan; Shen, Ronglai; Dao, Fanny; Bogomolniy, Faina; Makker, Vicky; Sala, Evis; Soumerai, Tara E.; Hyman, David M.; Socci, Nicholas D.; Viale, Agnes; Gershenson, David M.; Farley, John; Levine, Douglas A.; Rosen, Neal; Berger, Michael F.; Spriggs, David R.; Aghajanian, Carol A.; Solit, David B.; Iyer, Gopa
2015-01-01
Purpose: No effective systemic therapy exists for patients with metastatic low-grade serous (LGS) ovarian cancers. BRAF and KRAS mutations are common in serous borderline (SB) and LGS ovarian cancers, and MEK inhibition has been shown to induce tumor regression in a minority of patients; however, no correlation has been observed between mutation status and clinical response. With the goal of identifying biomarkers of sensitivity to MEK inhibitor treatment, we performed an outlier analysis of a patient who experienced a complete, durable, and ongoing (> 5 years) response to selumetinib, a non-ATP competitive MEK inhibitor. Patients and Methods: Next-generation sequencing was used to analyze this patient's tumor as well as an additional 28 SB/LGS tumors. Functional characterization of an identified novel alteration of interest was performed. Results: Analysis of the extraordinary responder's tumor identified a 15-nucleotide deletion in the negative regulatory helix of the MAP2K1 gene encoding for MEK1. Functional characterization demonstrated that this mutant induced extracellular signal-regulated kinase pathway activation, promoted anchorage-independent growth and tumor formation in mice, and retained sensitivity to selumetinib. Analysis of additional LGS/SB tumors identified mutations predicted to induce extracellular signal-regulated kinase pathway activation in 82% (23 of 28), including two patients with BRAF fusions, one of whom achieved an ongoing complete response to MEK inhibitor–based combination therapy. Conclusion: Alterations affecting the mitogen-activated protein kinase pathway are present in the majority of patients with LGS ovarian cancer. Next-generation sequencing analysis revealed deletions and fusions that are not detected by older sequencing approaches. These findings, coupled with the observation that a subset of patients with recurrent LGS ovarian cancer experienced dramatic and durable responses to MEK inhibitor therapy, support additional clinical studies of MEK inhibitors in this disease. PMID:26324360
Structured functional additive regression in reproducing kernel Hilbert spaces
Zhu, Hongxiao; Yao, Fang; Zhang, Hao Helen
2013-01-01
Summary: Functional additive models (FAMs) provide a flexible yet simple framework for regressions involving functional predictors. The utilization of a data-driven basis in an additive rather than linear structure naturally extends the classical functional linear model. However, the critical issue of selecting nonlinear additive components has been less studied. In this work, we propose a new regularization framework for structure estimation in the context of Reproducing Kernel Hilbert Spaces. The proposed approach takes advantage of the functional principal components, which greatly facilitates the implementation and the theoretical analysis. The selection and estimation are achieved by penalized least squares using a penalty which encourages the sparse structure of the additive components. Theoretical properties such as the rate of convergence are investigated. The empirical performance is demonstrated through simulation studies and a real data application. PMID:25013362
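An illustrative, simplified analogue of the approach (not the authors' RKHS estimator): extract functional principal component scores, expand each score in a spline basis, and fit a sparsity-inducing regression. A group penalty over each component's basis functions would be closer to the paper's component selection; all data and settings below are assumptions of the sketch.

```python
# Simplified functional-additive-style fit: PCA-based FPC scores, spline
# expansion of each score, then a lasso so unneeded terms shrink to zero.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import SplineTransformer
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(6)
n, n_grid = 200, 50
t = np.linspace(0, 1, n_grid)

# Synthetic functional predictors: random combinations of two smooth shapes.
scores_true = rng.normal(size=(n, 2))
curves = scores_true @ np.vstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
curves += 0.1 * rng.normal(size=(n, n_grid))

# Nonlinear additive signal in the first functional principal component only.
fpc = PCA(n_components=4).fit_transform(curves)
y = np.sin(fpc[:, 0]) + 0.2 * rng.normal(size=n)

# Spline expansion of each FPC score, then a sparse linear fit.
basis = SplineTransformer(n_knots=6, degree=3).fit_transform(fpc)
model = LassoCV(cv=5).fit(basis, y)
print("nonzero basis coefficients:", np.sum(model.coef_ != 0))
```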
Cascaded systems analysis of photon counting detectors
Xu, J.; Zbijewski, W.; Gang, G.; Stayman, J. W.; Taguchi, K.; Lundqvist, M.; Fredenberg, E.; Carrino, J. A.; Siewerdsen, J. H.
2014-01-01
Purpose: Photon counting detectors (PCDs) are an emerging technology with applications in spectral and low-dose radiographic and tomographic imaging. This paper develops an analytical model of PCD imaging performance, including the system gain, modulation transfer function (MTF), noise-power spectrum (NPS), and detective quantum efficiency (DQE). Methods: A cascaded systems analysis model describing the propagation of quanta through the imaging chain was developed. The model was validated in comparison to the physical performance of a silicon-strip PCD implemented on an experimental imaging bench. The signal response, MTF, and NPS were measured and compared to theory as a function of exposure conditions (70 kVp, 1–7 mA), detector threshold, and readout mode (i.e., the option for coincidence detection). The model sheds new light on the dependence of spatial resolution, charge sharing, and additive noise effects on threshold selection and was used to investigate the factors governing PCD performance, including the fundamental advantages and limitations of PCDs in comparison to energy-integrating detectors (EIDs) in the linear regime for which pulse pileup can be ignored. Results: The detector exhibited highly linear mean signal response across the system operating range and agreed well with theoretical prediction, as did the system MTF and NPS. The DQE analyzed as a function of kilovolt (peak), exposure, detector threshold, and readout mode revealed important considerations for system optimization. The model also demonstrated the important implications of false counts from both additive electronic noise and charge sharing and highlighted the system design and operational parameters that most affect detector performance in the presence of such factors: for example, increasing the detector threshold from 0 to 100 (arbitrary units of pulse height threshold roughly equivalent to 0.5 and 6 keV energy threshold, respectively), increased the f50 (spatial-frequency at which the MTF falls to a value of 0.50) by ∼30% with corresponding improvement in DQE. The range in exposure and additive noise for which PCDs yield intrinsically higher DQE was quantified, showing performance advantages under conditions of very low-dose, high additive noise, and high fidelity rejection of coincident photons. Conclusions: The model for PCD signal and noise performance agreed with measurements of detector signal, MTF, and NPS and provided a useful basis for understanding complex dependencies in PCD imaging performance and the potential advantages (and disadvantages) in comparison to EIDs as well as an important guide to task-based optimization in developing new PCD imaging systems. PMID:25281959
Cascaded systems analysis of photon counting detectors.
Xu, J; Zbijewski, W; Gang, G; Stayman, J W; Taguchi, K; Lundqvist, M; Fredenberg, E; Carrino, J A; Siewerdsen, J H
2014-10-01
Photon counting detectors (PCDs) are an emerging technology with applications in spectral and low-dose radiographic and tomographic imaging. This paper develops an analytical model of PCD imaging performance, including the system gain, modulation transfer function (MTF), noise-power spectrum (NPS), and detective quantum efficiency (DQE). A cascaded systems analysis model describing the propagation of quanta through the imaging chain was developed. The model was validated in comparison to the physical performance of a silicon-strip PCD implemented on an experimental imaging bench. The signal response, MTF, and NPS were measured and compared to theory as a function of exposure conditions (70 kVp, 1-7 mA), detector threshold, and readout mode (i.e., the option for coincidence detection). The model sheds new light on the dependence of spatial resolution, charge sharing, and additive noise effects on threshold selection and was used to investigate the factors governing PCD performance, including the fundamental advantages and limitations of PCDs in comparison to energy-integrating detectors (EIDs) in the linear regime for which pulse pileup can be ignored. The detector exhibited highly linear mean signal response across the system operating range and agreed well with theoretical prediction, as did the system MTF and NPS. The DQE analyzed as a function of kilovolt (peak), exposure, detector threshold, and readout mode revealed important considerations for system optimization. The model also demonstrated the important implications of false counts from both additive electronic noise and charge sharing and highlighted the system design and operational parameters that most affect detector performance in the presence of such factors: for example, increasing the detector threshold from 0 to 100 (arbitrary units of pulse height threshold roughly equivalent to 0.5 and 6 keV energy threshold, respectively), increased the f50 (spatial-frequency at which the MTF falls to a value of 0.50) by ∼30% with corresponding improvement in DQE. The range in exposure and additive noise for which PCDs yield intrinsically higher DQE was quantified, showing performance advantages under conditions of very low-dose, high additive noise, and high fidelity rejection of coincident photons. The model for PCD signal and noise performance agreed with measurements of detector signal, MTF, and NPS and provided a useful basis for understanding complex dependencies in PCD imaging performance and the potential advantages (and disadvantages) in comparison to EIDs as well as an important guide to task-based optimization in developing new PCD imaging systems.
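For orientation, the sketch below evaluates the standard Fourier-metric relation DQE(f) = d^2 MTF(f)^2 / (q NPS(f)) that cascaded systems analyses generally build on, with toy values chosen so that an ideal counter recovers its assumed detection efficiency and an added white noise floor degrades DQE; it is not the paper's full cascaded model, and all numbers are illustrative.

```python
# Toy DQE(f) calculation from mean signal, MTF, NPS, and incident fluence.
import numpy as np

def dqe(mtf, nps, mean_signal, fluence):
    """DQE(f) = mean_signal**2 * MTF(f)**2 / (fluence * NPS(f))."""
    return mean_signal ** 2 * mtf ** 2 / (fluence * nps)

freq = np.linspace(0.0, 5.0, 101)            # spatial frequency, cycles/mm
eta, q = 0.7, 1.0e5                          # assumed detection efficiency, photons/mm^2
d_bar = eta * q                              # mean counts per mm^2
mtf = np.abs(np.sinc(0.1 * freq))            # toy aperture MTF (0.1 mm pixel)
nps = d_bar * np.ones_like(freq)             # white NPS of an ideal counter
nps_noisy = nps + 0.3 * d_bar                # extra white floor from false counts

print(np.round(dqe(mtf, nps, d_bar, q)[[0, 50, 100]], 3))        # ~eta * MTF^2
print(np.round(dqe(mtf, nps_noisy, d_bar, q)[[0, 50, 100]], 3))  # degraded by additive noise
```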
A novel bi-level meta-analysis approach: applied to biological pathway analysis.
Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin
2016-02-01
The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present a real challenge in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases, acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors.
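A minimal sketch of a CLT-based additive combination of independent P-values, in the spirit of the additive method the abstract refers to; it omits the bi-level structure, bias handling, and intra-experiment analysis of the full framework, and the input P-values are invented.

```python
# Additive P-value combination: under the null, independent P-values are
# Uniform(0,1), so their mean is approximately Normal(1/2, 1/(12k)) by the CLT;
# a small mean yields a small combined (left-tail) P-value.
import numpy as np
from scipy.stats import norm

def additive_combine(pvalues):
    p = np.asarray(pvalues, dtype=float)
    k = p.size
    z = (p.mean() - 0.5) / np.sqrt(1.0 / (12.0 * k))
    return norm.cdf(z)

print(additive_combine([0.01, 0.04, 0.20, 0.03]))
```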
Yiadom, Maame Yaa A B; Scheulen, James; McWade, Conor M; Augustine, James J
2016-07-01
The objective was to obtain a commitment to adopt a common set of definitions for emergency department (ED) demographic, clinical process, and performance metrics among the ED Benchmarking Alliance (EDBA), ED Operations Study Group (EDOSG), and Academy of Academic Administrators of Emergency Medicine (AAAEM) by 2017. A retrospective cross-sectional analysis of available data from three ED operations benchmarking organizations supported a negotiation to use a set of common metrics with identical definitions. During a 1.5-day meeting, structured according to social change theories of information exchange, self-interest, and interdependence, common definitions were identified and negotiated using the EDBA's published definitions as a start for discussion. Methods of process analysis theory were used in the 8 weeks following the meeting to achieve official consensus on definitions. These two lists were submitted to the organizations' leadership for implementation approval. A total of 374 unique measures were identified, of which 57 (15%) were shared by at least two organizations. Fourteen (4%) were common to all three organizations. In addition to agreement on definitions for the 14 measures used by all three organizations, agreement was reached on universal definitions for 17 of the 57 measures shared by at least two organizations. The negotiation outcome was a list of 31 measures with universal definitions to be adopted by each organization by 2017. The use of negotiation, social change, and process analysis theories achieved the adoption of universal definitions among the EDBA, EDOSG, and AAAEM. This will impact performance benchmarking for nearly half of US EDs. It initiates a formal commitment to utilize standardized metrics, and it transitions consistency in reporting ED operations metrics from consensus to implementation. This work advances our ability to more accurately characterize variation in ED care delivery models, resource utilization, and performance. In addition, it permits future aggregation of these three data sets, thus facilitating the creation of more robust ED operations research data sets unified by a universal language. Negotiation, social change, and process analysis principles can be used to advance the adoption of additional definitions.